Configure and use multiple Language Model providers in R2R
R2R supports multiple third-party Language Model (LLM) providers, offering flexibility in choosing and switching between different models based on your specific requirements. This guide provides an in-depth look at configuring and using various LLM providers within the R2R framework.
The LiteLLMProvider offers a unified interface for multiple LLM services.
Key features:
The OpenAILLM class provides direct integration with OpenAI's models.
Key features:
To configure your LLM, update the completions section in your r2r.toml file:
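The excerpt below is an illustrative sketch of such a section; the exact key names (provider, generation_config, and the individual generation parameters) are assumptions and should be checked against the reference configuration shipped with your R2R version.

```toml
# Hypothetical r2r.toml excerpt; key names are assumptions, verify against your R2R release.
[completions]
provider = "litellm"

[completions.generation_config]
model = "openai/gpt-4o"
temperature = 0.1
top_p = 1.0
max_tokens_to_sample = 1024
stream = false
```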
The provided generation_config is used to establish the default generation parameters for your deployment. These settings can be overridden at runtime, offering flexibility in your application. For example, you can adjust parameters for an individual request by passing a custom generation configuration to the rag or get_completion methods.
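As a rough illustration of a per-request override (the GenerationConfig class and the rag_generation_config parameter name are assumptions about the client API; consult the R2R client reference for the exact signature):

```python
# Hypothetical sketch of overriding generation settings for a single request.
# Parameter and class names are assumptions, not verified against a specific R2R release.
from r2r import R2RClient, GenerationConfig

client = R2RClient("http://localhost:7272")

response = client.rag(
    query="What providers does R2R support?",
    # Override the deployment-wide defaults for this one call.
    rag_generation_config=GenerationConfig(model="openai/gpt-4o", temperature=0.7),
)
print(response)
```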
Under the hood, the LLM abstraction in R2R is built around two components:
- LLMProvider: An abstract base class that defines the interface for all LLM providers.
- LLMConfig: A configuration class for LLM providers.

The LLMConfig class is used to configure LLM providers:
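The following is a minimal sketch of what such a configuration class might look like; the field and method names (provider, generation_config, supported_providers, validate_config) are assumptions rather than the exact R2R source.

```python
# Hypothetical sketch of an LLM configuration class; names are assumptions.
from typing import Optional
from pydantic import BaseModel


class GenerationConfig(BaseModel):
    model: str = "openai/gpt-4o"
    temperature: float = 0.1
    top_p: float = 1.0
    max_tokens_to_sample: int = 1024
    stream: bool = False


class LLMConfig(BaseModel):
    provider: Optional[str] = None
    generation_config: Optional[GenerationConfig] = None

    @property
    def supported_providers(self) -> list[str]:
        return ["litellm", "openai"]

    def validate_config(self) -> None:
        # Reject unknown providers early, before any completion is attempted.
        if self.provider and self.provider not in self.supported_providers:
            raise ValueError(f"Provider '{self.provider}' is not supported.")
```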
The LLMProvider is an abstract base class that defines the common interface for all LLM providers:
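Here is a rough sketch of that interface, built around the two required methods described below (get_completion and get_completion_stream); the exact signatures are assumptions.

```python
# Hypothetical sketch of the abstract provider interface; signatures are assumptions.
from abc import ABC, abstractmethod
from typing import Any, Generator


class LLMProvider(ABC):
    def __init__(self, config: "LLMConfig") -> None:
        self.config = config

    @abstractmethod
    def get_completion(self, messages: list[dict], generation_config: Any, **kwargs) -> Any:
        """Return a single completion for the given chat messages."""
        ...

    @abstractmethod
    def get_completion_stream(self, messages: list[dict], generation_config: Any, **kwargs) -> Generator[Any, None, None]:
        """Yield completion chunks as they are produced."""
        ...
```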
To implement a custom LLM provider, create a new class that inherits from LLMProvider and implement the required methods: get_completion and get_completion_stream.
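For illustration, a minimal provider that builds on the LLMProvider sketch above might look like this; it simply echoes the last user message, which is enough to exercise the interface.

```python
# Hypothetical example provider, building on the LLMProvider sketch above.
from typing import Any, Generator


class EchoLLMProvider(LLMProvider):
    def get_completion(self, messages: list[dict], generation_config: Any, **kwargs) -> str:
        # A real provider would call its model SDK here instead of echoing.
        return messages[-1]["content"]

    def get_completion_stream(self, messages: list[dict], generation_config: Any, **kwargs) -> Generator[str, None, None]:
        # Stream the reply word by word to mimic token streaming.
        for word in messages[-1]["content"].split():
            yield word
```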
Then update the LLMConfig class to include your custom provider:
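Continuing the LLMConfig sketch from above, registering the new provider might amount to adding its name to the supported list; how registration actually works in your R2R version is an assumption here.

```python
# Hypothetical edit to the LLMConfig sketch above: the validation list now
# accepts the custom provider name as well.
class LLMConfig(BaseModel):
    provider: Optional[str] = None
    generation_config: Optional[GenerationConfig] = None

    @property
    def supported_providers(self) -> list[str]:
        return ["litellm", "openai", "echo"]  # "echo" is the custom provider added here
```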
The R2RConfig class handles the configuration of various components, including LLMs. Here's a simplified version:
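The sketch below is only a hypothetical stand-in for that simplified version, reusing the LLMConfig sketch from earlier; the section names and loading logic are assumptions, not the actual R2R source.

```python
# Hypothetical, heavily simplified sketch of an R2RConfig-style class; the real
# implementation handles many more sections and validation steps.
import tomllib  # Python 3.11+; use a third-party TOML parser on older versions


class R2RConfig:
    def __init__(self, config_data: dict) -> None:
        # Each top-level section of r2r.toml configures one component.
        self.completions = LLMConfig(**config_data.get("completions", {}))
        # ... other component sections would be parsed here

    @classmethod
    def from_toml(cls, path: str = "r2r.toml") -> "R2RConfig":
        with open(path, "rb") as f:
            return cls(tomllib.load(f))
```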