openai
OpenAI specific module.
Hint
Use pip to install the necessary dependencies for this module:
pip install mltb2[openai]
- class mltb2.openai.OpenAiAzureChat(azure_endpoint: str, model: str, api_key: str | None = None, base_url: str | None = None, api_version: str | None = None, azure_ad_token: str | None = None, azure_ad_token_provider: Callable[[], str] | Literal['auto', 'auto_on_behalf_of'] | None = None)[source]
Bases: OpenAiChat, _OpenAiAzureChatBase
Tool to interact with Azure OpenAI chat models.
This can also be constructed with from_yaml().
See also
OpenAI API reference: Create chat completion
Quickstart: Get started generating text using Azure OpenAI Service
AzureADTokenProvider example
AzureMLOnBehalfOfCredential doc
- Parameters:
api_key (str | None) – The OpenAI API key.
model (str) – The OpenAI model name.
api_version (str | None) – The OpenAI API version. A common value for this is 2023-05-15.
azure_ad_token (str | None) – The Azure Active Directory token.
azure_ad_token_provider (Callable[[], str] | Literal['auto', 'auto_on_behalf_of'] | None) – A function that returns an Azure Active Directory token, which will be invoked on every request. Or set to "auto" to use default Azure credentials, or "auto_on_behalf_of" to use Azure on-behalf-of credentials.
azure_endpoint (str) – The Azure endpoint.
base_url (str | None)
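Construction sketch (a minimal example, not taken from this documentation; the endpoint, deployment name, and API version are placeholders):
from mltb2.openai import OpenAiAzureChat

chat = OpenAiAzureChat(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    model="my-gpt-4o-deployment",  # placeholder Azure deployment name
    api_version="2023-05-15",
    azure_ad_token_provider="auto",  # use default Azure credentials instead of an API key
)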
- classmethod from_yaml(yaml_file, api_key: str | None = None, azure_ad_token: str | None = None, azure_ad_token_provider: Callable[[], str] | Literal['auto', 'auto_on_behalf_of'] | None = None, **kwargs)[source]
Construct this class from a yaml file.
If the api_key is not set in the yaml file, it will be loaded from the environment variable OPENAI_API_KEY.
- Parameters:
yaml_file – The yaml file.
api_key (str | None) – The OpenAI API key.
azure_ad_token (str | None) – The Azure Active Directory token.
azure_ad_token_provider (Callable[[], str] | Literal['auto', 'auto_on_behalf_of'] | None) – A function that returns an Azure Active Directory token, which will be invoked on every request. Or set to "auto" to use default Azure credentials, or "auto_on_behalf_of" to use Azure on-behalf-of credentials.
kwargs – extra kwargs to override parameters
- Returns:
The constructed class.
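Usage sketch, assuming the yaml keys mirror the initializer argument names (the file name and its contents below are hypothetical):
# azure_chat.yaml (hypothetical):
#   azure_endpoint: https://my-resource.openai.azure.com
#   model: my-gpt-4o-deployment
#   api_version: "2023-05-15"

from mltb2.openai import OpenAiAzureChat

chat = OpenAiAzureChat.from_yaml(
    "azure_chat.yaml",
    azure_ad_token_provider="auto",  # credentials stay out of the yaml file
)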
- class mltb2.openai.OpenAiChat(model: str, api_key: str | None = None, base_url: str | None = None)[source]
Bases: object
Tool to interact with OpenAI chat models.
This can also be constructed with from_yaml().
See also
OpenAI API reference: Create chat completion
- Parameters:
model (str) – The OpenAI model name.
api_key (str | None) – The OpenAI API key.
base_url (str | None)
- create_completions(prompt: str | list[dict[str, str]], completion_kwargs: dict[str, Any] | None = None, clean_openai_tokens: bool = False) OpenAiChatResult[source]
Create a model response for the given prompt (chat conversation).
- Parameters:
prompt (str | list[dict[str, str]]) – The prompt for the model.
completion_kwargs (dict[str, Any] | None) –
Keyword arguments for the OpenAI completion.
model can not be set via completion_kwargs! Please set the model in the initializer.
messages can not be set via completion_kwargs! Please set the prompt argument.
Also see:
openai.resources.chat.completions.Completions.create()
OpenAI API reference: Create chat completion
clean_openai_tokens (bool) – Remove OpenAI special tokens from the prompt.
- Returns:
The result of the OpenAI completion.
- Return type:
OpenAiChatResult
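Usage sketch (the model name and key are placeholders; the role/content message dicts are assumed to follow the OpenAI chat messages format referenced above):
from mltb2.openai import OpenAiChat

chat = OpenAiChat(model="gpt-4o-mini", api_key="sk-...")  # placeholder model name and key

# a plain string prompt ...
result = chat.create_completions("Say hello in French.")
print(result.content)

# ... or a full chat conversation (assumed OpenAI messages format)
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Say hello in French."},
]
result = chat.create_completions(messages, completion_kwargs={"temperature": 0.0})
print(result.content, result.total_tokens)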
- async create_completions_async(prompt: str | list[dict[str, str]], completion_kwargs: dict[str, Any] | None = None, clean_openai_tokens: bool = False) OpenAiChatResult[source]
Create a model response for the given prompt (chat conversation).
- Parameters:
prompt (str | list[dict[str, str]]) – The prompt for the model.
completion_kwargs (dict[str, Any] | None) –
Keyword arguments for the OpenAI completion.
model can not be set via completion_kwargs! Please set the model in the initializer.
messages can not be set via completion_kwargs! Please set the prompt argument.
Also see:
openai.resources.chat.completions.Completions.create()
OpenAI API reference: Create chat completion
clean_openai_tokens (bool) – Remove OpenAI special tokens from the prompt.
- Returns:
The result of the OpenAI completion.
- Return type:
OpenAiChatResult
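Async usage sketch (same placeholder model name and key as in the synchronous example above):
import asyncio

from mltb2.openai import OpenAiChat


async def main():
    chat = OpenAiChat(model="gpt-4o-mini", api_key="sk-...")  # placeholders
    result = await chat.create_completions_async("Say hello in French.")
    print(result.content)


asyncio.run(main())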
- classmethod from_yaml(yaml_file, api_key: str | None = None, **kwargs)[source]
Construct this class from a yaml file.
If the api_key is not set in the yaml file, it will be loaded from the environment variable OPENAI_API_KEY.
- Parameters:
yaml_file – The yaml file.
api_key (str | None) – The OpenAI API key.
kwargs – extra kwargs to override parameters
- Returns:
The constructed class.
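A minimal sketch, assuming the yaml keys mirror the initializer arguments and the API key comes from the OPENAI_API_KEY environment variable (file name and contents are hypothetical):
# chat.yaml (hypothetical):
#   model: gpt-4o-mini

from mltb2.openai import OpenAiChat

# OPENAI_API_KEY is expected to be set in the environment
chat = OpenAiChat.from_yaml("chat.yaml")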
- class mltb2.openai.OpenAiChatResult(content: str | None = None, model: str | None = None, prompt_tokens: int | None = None, completion_tokens: int | None = None, total_tokens: int | None = None, finish_reason: str | None = None, completion_args: dict[str, Any] | None = None)[source]
Bases: object
Result of an OpenAI chat completion.
If you want to convert this to a dict, use asdict(open_ai_chat_result) from the dataclasses module.
See also
OpenAI API reference: The chat completion object
- Parameters:
content (str | None) – the result of the OpenAI completion
model (str | None) – model name which has been used
prompt_tokens (int | None) – number of tokens of the prompt
completion_tokens (int | None) – number of tokens of the completion (content)
total_tokens (int | None) – number of total tokens (prompt_tokens + completion_tokens)
finish_reason (str | None) –
The reason why the completion stopped.
stop: Means the API returned the full completion without running into any token limit.
length: Means the API stopped the completion because of running into a token limit.
content_filter: When content was omitted due to a flag from the OpenAI content filters.
tool_calls: When the model called a tool.
function_call (deprecated): When the model called a function.
completion_args (dict[str, Any] | None) –
The arguments which have been used for the completion. Examples:
model: always set
max_tokens: only set if completion_kwargs contained max_tokens
temperature: only set if completion_kwargs contained temperature
top_p: only set if completion_kwargs contained top_p
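For example, converting a result to a plain dict as described above (result is an OpenAiChatResult, e.g. returned by create_completions()):
from dataclasses import asdict

result_dict = asdict(result)
print(result_dict["content"], result_dict["finish_reason"])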
- class mltb2.openai.OpenAiTokenCounter(model_name: str, show_progress_bar: bool = False)[source]
Bases: object
Count OpenAI tokens.
- Parameters:
model_name (str) – The OpenAI model name.
show_progress_bar (bool) – Show a progress bar while counting tokens.
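Construction sketch (the model name is a placeholder; how counting is invoked is not shown in this excerpt, so the call below is only an assumption and left commented out):
from mltb2.openai import OpenAiTokenCounter

token_counter = OpenAiTokenCounter(model_name="gpt-4o-mini")  # placeholder model name
# assumed callable interface, not documented in this excerpt:
# n_tokens = token_counter("How many tokens does this sentence have?")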
- class mltb2.openai._OpenAiAzureChatBase(azure_endpoint: str)[source]
Bases: object
- Parameters:
azure_endpoint (str)