# LLMAccess.jl
LLMAccess.jl provides a lightweight, composable interface and CLI to interact with multiple LLM providers (OpenAI, Anthropic, Google, Mistral, OpenRouter, DeepSeek, Ollama, Ollama Cloud) from Julia.
- Flexible provider abstraction with typed methods
- Sensible defaults via environment variables
- Small CLI helpers for quick usage in shells and scripts
See the README for end-to-end examples and environment setup.
## Installation
This package is a standard Julia project. From the package REPL:
```julia-repl
pkg> dev /path/to/llmaccess.jl
```

Instantiate dependencies:

```shell
julia --project -e 'using Pkg; Pkg.instantiate()'
```

## Quick Start
Programmatic usage:
```julia
using LLMAccess

# Call by provider name (keywords for options)
text = LLMAccess.call_llm(
    "google",
    "You are helpful",
    "Hello!";
    model = LLMAccess.get_default_model("google"),
    temperature = LLMAccess.get_default_temperature(),
)
println(text)
```

Notes:
- For attachments (e.g., images), use the typed form, which accepts an attachment path positionally: `call_llm(GoogleLLM(), system, input, model, temperature, attach_file)`. The name-based helper does not take attachments.
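As a sketch of the typed form with an attachment (the file path, prompts, and use of the default-model helpers are illustrative; the positional order follows the signature above):

```julia
using LLMAccess

# Typed-provider call with an attachment.
# "/path/to/image.png" is a placeholder; substitute a real file.
text = call_llm(
    GoogleLLM(),
    "You are helpful",          # system prompt
    "Describe this image.",     # user input
    LLMAccess.get_default_model("google"),
    LLMAccess.get_default_temperature(),
    "/path/to/image.png",       # attach_file
)
println(text)
```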
Output normalization:
By default the dispatcher normalizes certain punctuation in responses (dashes and smart quotes). To opt out in the name-based helper, pass `normalize_output=false`:
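To illustrate the kind of mapping this normalization performs (this substitution table is assumed for illustration; it is not the package's actual implementation):

```julia
# Illustrative only: map smart quotes and long dashes to ASCII equivalents.
# LLMAccess's real normalization table may differ.
function normalize_punct(s::AbstractString)
    replacements = Dict(
        '“' => '"', '”' => '"',
        '‘' => '\'', '’' => '\'',
        '—' => '-', '–' => '-',
    )
    # map over a string applies the function character by character
    return map(c -> get(replacements, c, c), s)
end

normalize_punct("“Quotes” and — dashes –")  # "\"Quotes\" and - dashes -"
```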
```julia
text = LLMAccess.call_llm("google", "", "“Quotes” and — dashes –"; normalize_output=false)
```

CLI examples (see also the CLI page):
```shell
julia --project script/ask.jl --llm google "Hello"
julia --project script/cmd.jl --llm openai "list files changed today"
julia --project script/cmd.jl -f ./script/example.sh --llm openai "Review {{F|:ext=md}}"
julia --project script/ask.jl --alias
julia --project script/ask.jl --llm-alias
julia --project script/ask.jl --llm deepseek --model r1 "Outline a reasoning trace"
julia --project script/ask.jl --llm ollama --model gemma3-4b-ollama "Summarize this file"
julia --project script/ask.jl --llm ollama_cloud --model gpt-oss:120b "Explain this snippet"
```
Note: `script/cmd.jl` copies the generated command to your clipboard by default.

- Use `--no-copy` to disable copying for that script, or `--cmd 'your command'` to bypass the LLM and still use the copy/execute flow.
- Prompts can reference `{{FILE}}`/`{{F}}` (and `{{FILE|cmd}}`/`{{F|cmd}}`) to embed the `-f/--file` path or pipe it through a short shell snippet, with the value available on STDIN and as `$FILE_PLACEHOLDER`.
- To avoid external helpers for common tweaks, prefix helpers with `:`, such as `{{F|:ext=pdf}}`, `{{F|:basename}}`, `{{F|:dirname}}`, `{{F|:stem}}`, `{{F|:ext}}`, or `{{F|:remove-ext}}`. The legacy `path:` prefix is still supported for older prompts.

## Configuration
Configure API keys and defaults via environment variables (examples):
- API keys: `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `MISTRAL_API_KEY`, `OPENROUTER_API_KEY`, `DEEPSEEK_API_KEY`, `OLLAMA_API_KEY` (cloud only; the local daemon does not require a key)
- Defaults: `DEFAULT_LLM`, `DEFAULT_OPENAI_MODEL`, `DEFAULT_GOOGLE_MODEL`, `DEFAULT_OLLAMA_MODEL`, `DEFAULT_OLLAMA_CLOUD_MODEL`, etc., and `DEFAULT_TEMPERATURE`
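These variables can also be set from within a Julia session via `ENV` before making calls (a sketch; whether LLMAccess reads them at call time or at load time is an assumption, so setting them before `using LLMAccess` is the safer order):

```julia
# Set defaults for the current Julia process only.
# Values are illustrative; use your own key and preferences.
ENV["DEFAULT_LLM"] = "google"
ENV["DEFAULT_TEMPERATURE"] = "0.2"

using LLMAccess

# The default-model/temperature helpers shown in Quick Start should now
# pick these values up.
LLMAccess.get_default_temperature()
```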
Refer to the README for full details.