aiflows.backends package¶
Submodules¶
aiflows.backends.api_info module¶
- class aiflows.backends.api_info.ApiInfo(backend_used: str = None, api_version: str | None = None, api_key: str = None, api_base: str | None = None)¶
Bases: object
This class contains the information about a single API key; a usage sketch follows this entry.
- Parameters:
backend_used (str, optional) – The backend this key is used for
api_version (str, optional) – The version of the API
api_key (str, optional) – The API key
api_base (str, optional) – The base URL of the API
- api_base: str | None = None¶
- api_key: str = None¶
- api_version: str | None = None¶
- backend_used: str = None¶
- validate()¶
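For orientation, a minimal usage sketch. The backend identifier, the environment variable, and the behavior of validate() described in the comment are assumptions, not taken from this page:

```python
import os

from aiflows.backends.api_info import ApiInfo

# Describe a single API key; here the key is read from an environment variable
# rather than hard-coded.
openai_key = ApiInfo(
    backend_used="openai",                     # assumed backend identifier
    api_key=os.environ.get("OPENAI_API_KEY"),  # placeholder source for the key
)

# Assumed to check that the required fields (e.g. the key itself) are present.
openai_key.validate()
```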
aiflows.backends.llm_lite module¶
- class aiflows.backends.llm_lite.LiteLLMBackend(api_infos, model_name, **kwargs)¶
Bases: object
This class is a wrapper around the litellm library. It allows using multiple API keys and switches between them automatically when one is exhausted. A usage sketch follows the parameters below.
- Parameters:
api_infos (List[ApiInfo]) – A list of ApiInfo objects, each containing the information about one API key
model_name (Union[str, Dict[str, str]]) – The name of the model to use. Can be a string or a dictionary from API to model name
wait_time_per_key (int) – The minimum time to wait between two calls on the same API key
embeddings_call (bool) – Whether to use the embedding API or the completion API
kwargs (Any) – Additional parameters to pass to the litellm library
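A hedged instantiation sketch, assuming OpenAI-style keys and a plain string model name. The keyword names match the parameters listed above; the commented-out call at the end only illustrates an assumed calling convention, which is not documented on this page:

```python
import os

from aiflows.backends.api_info import ApiInfo
from aiflows.backends.llm_lite import LiteLLMBackend

# Several keys for the same backend; the backend rotates between them
# automatically when one is exhausted.
api_infos = [
    ApiInfo(backend_used="openai", api_key=os.environ["OPENAI_API_KEY"]),
    ApiInfo(backend_used="openai", api_key=os.environ["OPENAI_API_KEY_BACKUP"]),
]

backend = LiteLLMBackend(
    api_infos=api_infos,
    model_name="gpt-3.5-turbo",  # or a dict mapping backend name -> model name
    wait_time_per_key=6,         # minimum wait between two calls on the same key
    n=1,                         # extra kwargs are forwarded to litellm
    max_tokens=256,
)

# Assumed calling convention (not documented on this page):
# response = backend(messages=[{"role": "user", "content": "Hello!"}])
```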
- aiflows.backends.llm_lite.merge_delta_to_stream(merged_stream, delta)¶
Merges a delta into an already merged stream. It is used to merge the deltas from the streamed response of the litellm library.
- Parameters:
merged_stream (Dict[str, Any]) – The already merged stream
delta (Dict[str, Any]) – The delta to merge into the merged_stream
- Returns:
The merged stream
- Return type:
Dict[str, Any]
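For intuition, a simplified, hypothetical re-implementation of delta accumulation for OpenAI-style streaming chunks. It illustrates the technique only and is not the library's actual logic:

```python
from typing import Any, Dict


def merge_delta_sketch(merged_stream: Dict[str, Any], delta: Dict[str, Any]) -> Dict[str, Any]:
    """Hypothetical: append string fields to the running message, keep other fields as first seen."""
    for key, value in delta.items():
        if value is None:
            continue
        if isinstance(value, str) and isinstance(merged_stream.get(key), str):
            merged_stream[key] += value  # e.g. the "content" text grows chunk by chunk
        else:
            merged_stream.setdefault(key, value)
    return merged_stream


merged = {"role": "assistant", "content": ""}
for delta in ({"content": "Hello"}, {"content": ", world"}):
    merged = merge_delta_sketch(merged, delta)
assert merged["content"] == "Hello, world"
```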
- aiflows.backends.llm_lite.merge_streams(streamed_response, n_chat_completion_choices)¶
Merges the chunks of a streamed response returned by the litellm library into complete messages. It is used when the stream parameter is set to True.
- Parameters:
streamed_response (List[Dict[str, Any]]) – The streamed response returned from the litellm library
n_chat_completion_choices (int) – The number of chat completion choices (n parameter in the completion function)
- Returns:
The merged streams, one per chat completion choice
- Return type:
List[Dict[str, Any]]
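And a hypothetical end-to-end illustration of how a list of streamed chunks could be folded into one message per completion choice. The field names follow the OpenAI-style response format and are assumptions, not this function's documented input schema:

```python
from typing import Any, Dict, List


def merge_streams_sketch(
    streamed_response: List[Dict[str, Any]],
    n_chat_completion_choices: int,
) -> List[Dict[str, str]]:
    """Hypothetical: assemble one complete message per chat completion choice."""
    merged = [{"role": "assistant", "content": ""} for _ in range(n_chat_completion_choices)]
    for chunk in streamed_response:
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            merged[choice.get("index", 0)]["content"] += delta.get("content") or ""
    return merged


# Two chunks belonging to a single completion choice (n=1).
chunks = [
    {"choices": [{"index": 0, "delta": {"role": "assistant", "content": "Hi"}}]},
    {"choices": [{"index": 0, "delta": {"content": " there"}}]},
]
print(merge_streams_sketch(chunks, 1))  # [{'role': 'assistant', 'content': 'Hi there'}]
```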