# Assumes $OPENAI_API_KEY is set
chat <- elmer::chat_openai(
  model = "gpt-4o",
  system_prompt = "Be terse but professional."
)
chat$chat("When was the R language created?")
The R language was created in 1993.
AI Tools for R, Shiny, and Pharma
2024-10-29
{pal} by Simon Couch
https://simonpcouch.github.io/pal/
{chattr} by Edgar Ruiz
https://mlverse.github.io/chattr/
{gptstudio}/{gpttools} by Michel Nivard, James Wade, and Samuel Calderon
https://michelnivard.github.io/gptstudio/
https://jameshwade.github.io/gpttools/
{elmer} by Hadley Wickham
https://hadley.github.io/elmer/
(Prior art: {openai}, {tidyllm}, {gptr}, {rgpt3}, {askgpt}, {chatgpt}, …)
chat$chat("What is 2+2?")
chat$stream("What is 2+2?")
And async versions for scalable Shiny apps:
chat$chat_async("What is 2+2?")
chat$stream_async("What is 2+2?")
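A minimal sketch of consuming these, assuming the coro and promises packages (stream() yields a generator you iterate; chat_async() returns a promise):

# Print tokens as they arrive from the streaming generator
stream <- chat$stream("What is 2+2?")
coro::loop(for (chunk in stream) cat(chunk))

# chat_async() returns a promise, so the R session (or a Shiny app) isn't blocked
chat$chat_async("What is 2+2?") |>
  promises::then(function(answer) cat(answer, "\n"))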
https://hadley.github.io/elmer/articles/tool-calling.html
Step 1: Create an R function (or "tool") for the chatbot to call
library(openmeteo)
#' Get current weather data from Open-Meteo using the openmeteo package
#'
#' @param lat The latitude of the location.
#' @param lon The longitude of the location.
#' @return A JSON string of current weather information including temperature (F), wind speed (mph), and precipitation (inches).
get_current_weather <- function(lat, lon) {
  openmeteo::weather_now(
    c(lat, lon),
    response_units = list(
      temperature_unit = "fahrenheit",
      windspeed_unit = "mph",
      precipitation_unit = "inch"
    )
  ) |> jsonlite::toJSON(auto_unbox = TRUE)
}
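You can sanity-check the function on its own before handing it to the chatbot (the coordinates below are Fenway Park's, reused in Step 3):

# Returns current conditions at Fenway Park as a JSON string
get_current_weather(lat = 42.3467, lon = -71.0972)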
Step 2: Register the function/tool with the chatbot
library(elmer)
chat <- chat_openai(model = "gpt-4o")
chat$register_tool(tool(
  get_current_weather,
  "Get current weather data from Open-Meteo using the openmeteo package. Returns a JSON string of current weather information including temperature (F), wind speed (mph), and precipitation (inches).",
  lat = type_number("The latitude of the location."),
  lon = type_number("The longitude of the location.")
))
You don't have to write this code by hand; elmer::create_tool_def(get_current_weather) generated it.
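That is, a call along these lines (a sketch: per the elmer docs, create_tool_def() drafts the tool registration code from the function so you can paste it into your script):

# Generates the tool() definition shown above from get_current_weather()
elmer::create_tool_def(get_current_weather)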
Step 3: Ask the chatbot a question that requires the tool
<Chat turns=4 tokens=284/51>
── user ───────────────────────────────────────────────────────────
What's the weather at Fenway Park?
── assistant ──────────────────────────────────────────────────────
[tool request (call_DKylIC1Tz2qxz9Zy0Nn2Qw85)]:
get_current_weather(lat = 42.3467, lon = -71.0972)
── user ───────────────────────────────────────────────────────────
[tool result (call_DKylIC1Tz2qxz9Zy0Nn2Qw85)]:
[{"datetime":"2024-10-29 09:15:00","interval":900,"temperature":45.9,"windspeed":5.7,"winddirection":132,"is_day":1,"weathercode":1}]
── assistant ──────────────────────────────────────────────────────
The current weather at Fenway Park is 45.9°F with a wind speed of 5.7 mph.
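Note that all four turns come from a single chat$chat() call: elmer sees the tool request, runs get_current_weather() locally, sends the result back, and returns the model's final answer (see the tool-calling article linked above).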
https://github.com/jcheng5/shinychat
library(shiny)
library(shinychat)

ui <- bslib::page_fluid(
  chat_ui("chat")
)

server <- function(input, output, session) {
  chat <- elmer::chat_openai(
    system_prompt = "You're a trickster who answers in riddles"
  )

  observeEvent(input$chat_user_input, {
    stream <- chat$stream_async(input$chat_user_input)
    chat_append("chat", stream)
  })
}

shinyApp(ui, server)
Many companies are loath to send queries (laden with potentially proprietary code and other trade secrets) to OpenAI and Anthropic.
"Open" models like Llama are safer, but aren't as smart (yet); see the local-model sketch below.
AWS-hosted Anthropic models and Azure-hosted OpenAI models may be helpful.
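For example, a minimal sketch of keeping everything in-house by pointing elmer at a Llama model served locally by Ollama (assumes elmer's chat_ollama() and that the model has already been pulled with Ollama):

# Same chat interface, but requests never leave your machine
chat <- elmer::chat_ollama(model = "llama3.1")
chat$chat("When was the R language created?")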
https://jcheng5.github.io/pharma-ai-2024/