Package: tidyllm 0.2.6

Eduard Brüll

tidyllm: Tidy Integration of Large Language Models

A tidy interface for integrating large language model (LLM) APIs such as 'Claude', 'OpenAI', 'Groq', 'Mistral' and local models via 'Ollama' into R workflows. The package supports text and media-based interactions, interactive message history, batch request APIs, and a tidy, pipeline-oriented interface for streamlined integration into data workflows. Web services are available at <https://www.anthropic.com>, <https://openai.com>, <https://groq.com>, <https://mistral.ai/> and <https://ollama.com>.

Authors: Eduard Brüll [aut, cre]

tidyllm_0.2.6.tar.gz
tidyllm_0.2.6.zip (r-4.5) | tidyllm_0.2.6.zip (r-4.4) | tidyllm_0.2.6.zip (r-4.3)
tidyllm_0.2.6.tgz (r-4.4-any) | tidyllm_0.2.6.tgz (r-4.3-any)
tidyllm_0.2.6.tar.gz (r-4.5-noble) | tidyllm_0.2.6.tar.gz (r-4.4-noble)
tidyllm_0.2.6.tgz (r-4.4-emscripten) | tidyllm_0.2.6.tgz (r-4.3-emscripten)
tidyllm.pdf | tidyllm.html
tidyllm/json (API)
NEWS

# Install 'tidyllm' in R:
install.packages('tidyllm', repos = c('https://edubruell.r-universe.dev', 'https://cloud.r-project.org'))
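
# Minimal usage sketch of the pipeline interface, built from the exported
# llm_message(), chat(), provider functions, and get_reply(). It assumes the
# relevant API key (or a running Ollama server) is configured and that the
# provider's default model is used; the prompt text is purely illustrative.
library(tidyllm)
conversation <- llm_message("Describe the tidyverse in one sentence.") |>
  chat(openai())   # or claude(), mistral(), groq(), ollama() for local models
get_reply(conversation)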

Bug tracker: https://github.com/edubruell/tidyllm/issues

6.85 score | 32 stars | 17 scripts | 502 downloads | 55 exports | 33 dependencies

Last updated 8 hours ago from: 43d878f5cd. Checks: OK: 1, ERROR: 6. Indexed: yes.

Target           Result   Date
Doc / Vignettes  OK       Nov 22 2024
R-4.5-win        ERROR    Nov 22 2024
R-4.5-linux      ERROR    Nov 22 2024
R-4.4-win        ERROR    Nov 22 2024
R-4.4-mac        ERROR    Nov 22 2024
R-4.3-win        ERROR    Nov 22 2024
R-4.3-mac        ERROR    Nov 22 2024

Exports: azure_openai, azure_openai_chat, azure_openai_embedding, chat, chatgpt, check_batch, check_claude_batch, check_openai_batch, claude, claude_chat, df_llm_message, embed, fetch_batch, fetch_claude_batch, fetch_openai_batch, gemini, gemini_chat, gemini_delete_file, gemini_embedding, gemini_file_metadata, gemini_list_files, gemini_upload_file, get_metadata, get_reply, get_reply_data, get_user_message, groq, groq_chat, groq_transcribe, last_metadata, last_reply, last_reply_data, last_user_message, list_batches, list_claude_batches, list_openai_batches, llm_message, LLMMessage, mistral, mistral_chat, mistral_embedding, ollama, ollama_chat, ollama_download_model, ollama_embedding, ollama_list_models, openai, openai_chat, openai_embedding, pdf_page_batch, rate_limit_info, send_batch, send_claude_batch, send_openai_batch, tidyllm_schema
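
The export list shows that batch processing follows a send/check/fetch cycle, with provider-agnostic verbs alongside provider-specific ones. The sketch below illustrates that cycle; the argument order of send_batch() and the prompts are assumptions, not taken from the package documentation.

# Batch workflow sketch (assumes a configured Anthropic API key)
library(tidyllm)
msgs <- list(
  llm_message("Summarise the history of R in one sentence."),
  llm_message("Summarise the history of Python in one sentence.")
)
batch <- send_batch(msgs, claude())    # queue the requests
check_batch(batch)                     # poll the processing status later
results <- fetch_batch(batch)          # retrieve the replies once processing is done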

Dependencies: askpass, base64enc, cli, cpp11, curl, fansi, generics, glue, httr2, jsonlite, lifecycle, lubridate, magrittr, openssl, pdftools, pillar, pkgconfig, png, purrr, qpdf, R6, rappdirs, Rcpp, rlang, S7, stringi, stringr, sys, tibble, timechange, utf8, vctrs, withr

Get Started

Rendered from tidyllm.Rmd using knitr::rmarkdown on Nov 22 2024.

Last update: 2024-11-22
Started: 2024-10-16

Readme and manuals

Help Manual

Help page | Topics
Azure-OpenAI Endpoint Provider Function | azure_openai
Send LLM Messages to an OpenAI Chat Completions endpoint on Azure | azure_openai_chat
Generate Embeddings Using OpenAI API on Azure | azure_openai_embedding
Chat with a Language Model | chat
Alias for the OpenAI Provider Function | chatgpt
Check Batch Processing Status | check_batch
Check Batch Processing Status for Claude API | check_claude_batch
Check Batch Processing Status for OpenAI Batch API | check_openai_batch
Provider Function for Claude models on the Anthropic API | claude
Interact with Claude AI models via the Anthropic API | claude_chat
Convert a Data Frame to an LLMMessage Object | df_llm_message
Generate text embeddings | embed
Fetch Results from a Batch API | fetch_batch
Fetch Results for a Claude Batch | fetch_claude_batch
Fetch Results for an OpenAI Batch | fetch_openai_batch
Google Gemini Provider Function | gemini
Send LLMMessage to Gemini API | gemini_chat
Delete a File from Gemini API | gemini_delete_file
Generate Embeddings Using the Google Gemini API | gemini_embedding
Retrieve Metadata for a File from Gemini API | gemini_file_metadata
List Files in Gemini API | gemini_list_files
Upload a File to Gemini API | gemini_upload_file
Retrieve Metadata from Assistant Replies | get_metadata, last_metadata
Retrieve Assistant Reply as Text | get_reply, last_reply
Retrieve Assistant Reply as Structured Data | get_reply_data, last_reply_data
Retrieve a User Message by Index | get_user_message, last_user_message
Groq API Provider Function | groq
Send LLM Messages to the Groq Chat API | groq_chat
Transcribe an Audio File Using Groq transcription API | groq_transcribe
List all Batch Requests on a Batch API | list_batches
List Claude Batch Requests | list_claude_batches
List OpenAI Batch Requests | list_openai_batches
Create or Update Large Language Model Message Object | llm_message
Large Language Model Message Class | LLMMessage
Mistral Provider Function | mistral
Send LLMMessage to Mistral API | mistral_chat
Generate Embeddings Using Mistral API | mistral_embedding
Ollama API Provider Function | ollama
Interact with local AI models via the Ollama API | ollama_chat
Download a model from the Ollama API | ollama_download_model
Generate Embeddings Using Ollama API | ollama_embedding
Retrieve and return model information from the Ollama API | ollama_list_models
OpenAI Provider Function | openai
Send LLM Messages to the OpenAI Chat Completions API | openai_chat
Generate Embeddings Using OpenAI API | openai_embedding
Batch Process PDF into LLM Messages | pdf_page_batch
Get the current rate limit information for all or a specific API | rate_limit_info
Send a batch of messages to a batch API | send_batch
Send a Batch of Messages to Claude API | send_claude_batch
Send a Batch of Messages to OpenAI Batch API | send_openai_batch
Create a JSON schema for structured outputs | tidyllm_schema
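
tidyllm_schema() pairs with get_reply_data() for structured outputs. The sketch below assumes fields are declared as simple type strings and that the schema is passed to the chat call via a .json_schema argument; the field names, type strings, and that argument name are illustrative assumptions rather than documented usage.

# Structured output sketch (argument names are assumptions)
library(tidyllm)
address_schema <- tidyllm_schema(
  name   = "address",
  street = "character",
  city   = "character",
  zip    = "character"
)
reply <- llm_message("Extract the address from: 'Hauptstrasse 1, 68159 Mannheim'") |>
  chat(openai(), .json_schema = address_schema)
get_reply_data(reply)   # parses the assistant reply into structured data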