Announcing ellmer: A package for interacting with Large Language Models in R

Written by Posit Team
2025-02-25
Ellmer package for AI in R

We are delighted to announce the release of ellmer 0.1.1, an R package designed to simplify interacting with large language models (LLMs) in R.

Install ellmer from CRAN:

install.packages("ellmer")

An LLM is a type of machine-learning model capable of a wide variety of tasks, including natural language, image, audio, and video processing. ellmer acts as a bridge, integrating the power of LLMs into your R workflows.

The comprehensive “Get Started” documentation provides a guide to setting up and using ellmer.

Choosing your LLM provider

ellmer supports a wide range of model providers for both individual users and enterprises. For personal projects, you can connect to providers like OpenAI (using chat_openai()), Anthropic (using chat_anthropic()), or Gemini (using chat_gemini()). If you’re just getting started, we recommend Gemini because it offers powerful models with a generous free tier.

For organizations with specific security or compliance requirements, ellmer integrates with major cloud providers such as Azure (chat_azure()), AWS Bedrock (chat_bedrock()), Databricks (chat_databricks()), or Snowflake (chat_snowflake()). If there’s something your organization needs that we don’t currently support, please let us know!

Create a new chat object like so:

library(ellmer)

chat <- chat_gemini(
  system_prompt = "You are a friendly but terse assistant."
)
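
Once you have a chat object, you send it prompts with its $chat() method. A minimal sketch, assuming a Gemini key is configured (the prompt shown is illustrative):

```r
library(ellmer)

# Create a chat object backed by Gemini (requires a configured API key).
chat <- chat_gemini(
  system_prompt = "You are a friendly but terse assistant."
)

# Send a prompt; the response prints to the console and is
# also returned as a string for use in your code.
chat$chat("Summarize what the ellmer package does in one sentence.")
```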

Authentication

You can authenticate your session based on the requirements of your provider. For providers like OpenAI and Anthropic, you’ll need an API key. The recommended approach is to store your API key in your .Renviron file, which you can open with usethis::edit_r_environ(). ellmer then reads the matching environment variable automatically; for example, chat_gemini() looks for GOOGLE_API_KEY. You can confirm the key is set in your R session with:

Sys.getenv("GOOGLE_API_KEY")

ellmer also automatically detects OAuth or IAM-based credentials for supported providers, including those managed by Posit Workbench and Posit Connect. If your Posit administrator has configured authentication, calls like chat_bedrock(), chat_snowflake(), and chat_databricks() will work immediately.
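
A key-based setup might look like the following sketch; the key value is a placeholder, and the environment variable name depends on your provider:

```r
# Open ~/.Renviron for editing:
usethis::edit_r_environ()

# Add a line like this to the file, then restart R:
# GOOGLE_API_KEY=your-key-here

# chat_gemini() picks the key up automatically:
chat <- ellmer::chat_gemini()
```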

Interacting with LLMs: Chatting, Tool Calling, and More

The ellmer package has rich documentation that helps you get started with LLMs regardless of your prior experience.

  • Chatting. ellmer offers several ways to chat with LLMs:
    • Interactive chat console: Use live_console(chat) or live_browser(chat) for a real-time chat experience within your R console or browser. This is a great way to explore and experiment with different prompts.
    • Interactive method call: The chat$chat() method lets you interact with the LLM programmatically, so that you can integrate LLM interactions into your R scripts and functions.
    • Programmatic chat: Create a chat object inside a function and return the result as a string.

  • Tool/function calling. ellmer’s built-in tool calling feature extends the capabilities of LLMs by connecting them to external R functions. For example, you can create a function to get the current time and then register it with the chat object using register_tool(). The LLM can then use this tool to answer questions that require access to real-time information.

  • Streaming and async APIs. For real-time processing of LLM responses, use the chat$stream() method. This is useful for building interactive applications, such as chatbots in Shiny apps. For non-streaming, asynchronous operations, use chat$chat_async(), which returns a promise.

  • Structured data extraction. To extract structured data reliably from free-form text, and avoid brittle manual parsing of responses, use the $extract_data() method together with a type specification that defines the desired structure.
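
The tool-calling workflow above might be sketched like this; the function, descriptions, and time-zone argument are illustrative, so check ?register_tool for the exact signature in your ellmer version:

```r
library(ellmer)

# An ordinary R function the model may call.
get_current_time <- function(tz = "UTC") {
  format(Sys.time(), tz = tz, usetz = TRUE)
}

chat <- chat_gemini()

# Describe the function so the model knows when and how to call it.
chat$register_tool(tool(
  get_current_time,
  "Returns the current time in a given time zone.",
  tz = type_string("A time zone name, e.g. 'Europe/Paris'.", required = FALSE)
))

# The model can now answer questions that need real-time information.
chat$chat("What time is it in Paris right now?")
```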
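
Structured data extraction might look like the following sketch, where the type specification is built with ellmer's type_*() helpers and the input text is made up:

```r
library(ellmer)

chat <- chat_gemini()

# Define the shape of the data we want back.
person <- type_object(
  name = type_string("The person's name."),
  age = type_integer("The person's age in years.")
)

# Returns an R list matching the specification.
chat$extract_data(
  "My name is Alex and I am 30 years old.",
  type = person
)
```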

What’s next for ellmer?

We’re actively developing ellmer and have exciting plans for the future.

  • We’re implementing caching, batching, and parallel requests to optimize performance and reduce costs when processing large volumes of queries. This will enable mall-like workflows with any supported provider.
  • We’re working on a package to provide retrieval augmented generation (RAG). This lets you incorporate private or domain-specific data into your LLM prompts by chunking documents and adding relevant context to queries.

We encourage you to try ellmer and share your feedback!