chatLLM: A Flexible Interface for 'LLM' API Interactions
Provides a flexible interface for interacting with Large Language Model ('LLM')
providers, including 'OpenAI', 'Azure OpenAI', 'Azure AI Foundry', 'Groq',
'Anthropic', 'DeepSeek', 'DashScope', 'Gemini', 'Grok', 'GitHub Models', and
'AWS Bedrock'. Supports both synchronous and asynchronous chat-completion APIs,
with features such as retry logic, dynamic model selection, customizable
parameters, and multi-message conversation handling. Designed to streamline
integration with state-of-the-art LLM services across multiple platforms.
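To illustrate the kind of workflow the description outlines, here is a minimal sketch in R. The function name `call_llm()` and every argument shown (`provider`, `model`, `prompt`, `messages`, `temperature`, `max_retries`) are assumptions for illustration, inferred from the feature list above rather than taken from the package's documented API; running it would also require the relevant provider API key to be set in the environment.

```r
# Hedged sketch only: call_llm() and its arguments are assumed, not confirmed.
library(chatLLM)

# Single-prompt completion with a provider and model chosen at call time,
# plus a retry count (retry logic is a stated feature; the argument name
# `max_retries` is an assumption).
response <- call_llm(
  prompt      = "Summarize the R language in two sentences.",
  provider    = "openai",       # e.g. "groq", "anthropic", "deepseek", ...
  model       = "gpt-4o-mini",  # assumed model identifier
  temperature = 0.7,
  max_retries = 3
)

# Multi-message conversation handling: roles follow the usual
# chat-completions convention (system / user / assistant).
messages <- list(
  list(role = "system", content = "You are a concise assistant."),
  list(role = "user",   content = "What does the httr package do?")
)
reply <- call_llm(messages = messages, provider = "openai")
```

Consult the package README linked below for the actual exported functions and argument names before relying on this sketch.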
Version: 0.1.4
Imports: httr (≥ 1.4.0), jsonlite (≥ 1.7.2), stats
Suggests: aws.signature, future, promises, later, testthat, roxygen2
Published: 2026-02-15
DOI: 10.32614/CRAN.package.chatLLM
Author: Kwadwo Daddy Nyame Owusu Boakye [aut, cre]
Maintainer: Kwadwo Daddy Nyame Owusu Boakye <kwadwo.owusuboakye at outlook.com>
BugReports: https://github.com/knowusuboaky/chatLLM/issues
License: MIT + file LICENSE
URL: https://github.com/knowusuboaky/chatLLM, https://knowusuboaky.github.io/chatLLM/
NeedsCompilation: no
Materials: README, NEWS
CRAN checks: chatLLM results
Linking:
Please use the canonical form https://CRAN.R-project.org/package=chatLLM to link to this page.