llmapply is the lapply-style entry point for running a single prompt against an LLM repeatedly over a vector of inputs. Pass either a model name (in which case an LLM is built on the fly from backend, system_prompt, and output_schema) or a pre-built LLM object.

Usage

llmapply(
  x,
  model_or_llm,
  backend = c("ollama", "openai", "anthropic"),
  system_prompt = SYSTEM_PROMPT_DEFAULT,
  output_schema = NULL,
  verbosity = 1L,
  extract_responses = TRUE,
  ...
)

Arguments

x

Character or list: Values to iterate over. Each element forms the user prompt for one call to the LLM.

model_or_llm

Character or LLM: Either the name of a model (a string) or a pre-built LLM object (for example from create_Ollama, create_OpenAI, or create_Anthropic).

backend

Character {"ollama", "openai", "anthropic"}: Backend to use when model_or_llm is a string. Ignored when model_or_llm is an LLM object.

system_prompt

Character: System prompt to use when building the LLM from a model name. Ignored when model_or_llm is an LLM object.

output_schema

Optional Schema: Output schema to enforce, created with schema. When model_or_llm is a string, the schema is baked into the built LLM. When model_or_llm is a pre-built LLM, supplying a schema here conflicts with that object's configuration and throws an error.

verbosity

Integer [0, Inf): Verbosity level. Each underlying call to generate receives verbosity - 1L, so nested calls report one level less.

extract_responses

Logical: If TRUE, return a character vector of assistant responses (with NA_character_ for missing assistant content). If FALSE, return the raw list of Message objects from each call.

...

Additional per-call arguments forwarded to generate (e.g. temperature, top_p, max_tokens, stop, think, top_k, seed).

Value

If extract_responses = TRUE, a character vector the same length as x. Otherwise, a list of Message objects.

Details

Per-call overrides such as temperature, top_p, max_tokens, stop, think, plus backend-specific options like top_k or seed, are forwarded via ... to generate. Vectors passed via ... are not yet recycled across x — they are forwarded as-is to each call.
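As a sketch of the forwarding behaviour (the model name "llama3.1" is an assumption, not taken from this page): a scalar override applies identically to every element of x, since arguments in ... are not recycled.

```r
# temperature is passed unchanged to each of the three generate() calls;
# only x is iterated over.
llmapply(c("one", "two", "three"), "llama3.1", temperature = 0, seed = 42)
```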

Author

EDG

Examples
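The sketches below are illustrative only: they assume a running Ollama server and the model name "llama3.1", neither of which is guaranteed by this page. Substitute your own backend and model.

```r
# Build the LLM on the fly from a model name
responses <- llmapply(
  c("Capital of France?", "Capital of Japan?"),
  model_or_llm = "llama3.1",
  backend = "ollama",
  system_prompt = "Answer in one word.",
  temperature = 0
)
# responses is a character vector of length 2

# Reuse a pre-built LLM object (assuming create_Ollama takes a model
# argument) and keep the raw Message objects instead of extracting text
llm <- create_Ollama(model = "llama3.1")
msgs <- llmapply(c("hello", "goodbye"), llm, extract_responses = FALSE)
# msgs is a list of Message objects, one element per input
```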