
Creates an OllamaConfig object that can be passed to create_agent().

Usage

config_Ollama(
  model_name,
  temperature = TEMPERATURE_DEFAULT,
  base_url = OLLAMA_URL_DEFAULT,
  think = NULL
)

Arguments

model_name

Character: The name of the LLM model to use. Must be an Ollama model.

temperature

Numeric: Sampling temperature for the model. Lower values make output more deterministic; higher values make it more varied.

base_url

Character: Base URL of the Ollama server.

think

Optional Logical or Character ("low", "medium", "high"): Default thinking mode for this config. Logical values (TRUE/FALSE) target models such as deepseek or qwen3; character values target gpt-oss. Can be overridden per call.

Value

An OllamaConfig object.

Author

EDG

Examples

# Requires running Ollama server and gemma4:e4b model
if (FALSE) { # \dontrun{
  config_Ollama(
    model_name = "gemma4:e4b",
    temperature = 0.2
  )
} # }
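
# The returned config is typically passed on to create_agent(). A minimal
# sketch, assuming a running Ollama server; the model name "qwen3:8b" and
# create_agent() taking the config as its argument are illustrative
# assumptions, not confirmed API details.
if (FALSE) { # \dontrun{
  config <- config_Ollama(
    model_name = "qwen3:8b",
    temperature = 0.2,
    think = TRUE  # logical thinking mode, per the `think` argument docs
  )
  agent <- create_agent(config)
} # }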