openai

OpenAI specific module.

Hint

Use pip to install the necessary dependencies for this module: pip install mltb2[openai]

class mltb2.openai.OpenAiAzureChat(azure_endpoint: str, model: str, api_key: str | None = None, api_version: str | None = None, azure_ad_token: str | None = None)[source]

Bases: OpenAiChat, _OpenAiAzureChatBase

Tool to interact with Azure OpenAI chat models.

This can also be constructed with from_yaml().

Parameters:
  • api_key (str | None) – The OpenAI API key.

  • model (str) – The OpenAI model name.

  • api_version (str | None) – The OpenAI API version. A common value for this is 2023-05-15.

  • azure_endpoint (str) – The Azure endpoint.

  • azure_ad_token (str | None) – The Azure Active Directory token.
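
A minimal usage sketch; the endpoint, deployment/model name, and key below are placeholders, not values from this documentation:

  from mltb2.openai import OpenAiAzureChat

  # All values below are placeholders for your own Azure OpenAI resource.
  chat = OpenAiAzureChat(
      azure_endpoint="https://my-resource.openai.azure.com",
      model="gpt-4",
      api_key="my-azure-api-key",
      api_version="2023-05-15",
  )
  result = chat.create_completions("Say hello in German.")  # inherited from OpenAiChat
  print(result.content)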

classmethod from_yaml(yaml_file, api_key: str | None = None, azure_ad_token: str | None = None, **kwargs)[source]

Construct this class from a yaml file.

If the api_key is not set in the yaml file, it will be loaded from the environment variable OPENAI_API_KEY.

Parameters:
  • yaml_file – The yaml file.

  • api_key (str | None) – The OpenAI API key.

  • azure_ad_token (str | None) – The Azure Active Directory token.

  • kwargs – Extra keyword arguments that override parameters from the yaml file.

Returns:

The constructed class.
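
A sketch of the from_yaml() workflow. The file name is hypothetical and the yaml keys are assumed to mirror the constructor parameters:

  # azure_chat.yaml (hypothetical file):
  #   azure_endpoint: https://my-resource.openai.azure.com
  #   model: gpt-4
  #   api_version: 2023-05-15

  from mltb2.openai import OpenAiAzureChat

  # api_key is neither passed nor in the yaml file, so it is read from OPENAI_API_KEY.
  chat = OpenAiAzureChat.from_yaml("azure_chat.yaml")

  # Extra kwargs override parameters from the yaml file, e.g. a different deployment name.
  chat = OpenAiAzureChat.from_yaml("azure_chat.yaml", model="gpt-35-turbo")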

class mltb2.openai.OpenAiChat(model: str, api_key: str | None = None)[source]

Bases: object

Tool to interact with OpenAI chat models.

This can also be constructed with from_yaml().

See also

OpenAI API reference: Create chat completion

Parameters:
  • api_key (str | None) – The OpenAI API key.

  • model (str) – The OpenAI model name.

create_completions(prompt: str | List[Dict[str, str]], completion_kwargs: Dict[str, Any] | None = None, clean_openai_tokens: bool = False) → OpenAiChatResult[source]

Create a model response for the given prompt (chat conversation).

Parameters:
  • prompt (str | List[Dict[str, str]]) – The prompt for the model.

  • completion_kwargs (Dict[str, Any] | None) –

    Keyword arguments for the OpenAI completion.

    • model cannot be set via completion_kwargs! Please set the model in the initializer.

    • messages cannot be set via completion_kwargs! Please set the prompt argument.

    Also see the OpenAI API reference: Create chat completion.

  • clean_openai_tokens (bool) – Remove OpenAI special tokens from the prompt.

Returns:

The result of the OpenAI completion.

Return type:

OpenAiChatResult
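
A usage sketch; the model name, key, and sampling values are illustrative:

  from mltb2.openai import OpenAiChat

  chat = OpenAiChat(model="gpt-4", api_key="my-api-key")

  # The prompt can be a plain string ...
  result = chat.create_completions("What is the capital of France?")

  # ... or a list of chat messages in the usual OpenAI format.
  messages = [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is the capital of France?"},
  ]
  result = chat.create_completions(
      messages,
      completion_kwargs={"temperature": 0.0, "max_tokens": 100},  # but not model or messages
  )
  print(result.content, result.total_tokens, result.finish_reason)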

async create_completions_async(prompt: str | List[Dict[str, str]], completion_kwargs: Dict[str, Any] | None = None, clean_openai_tokens: bool = False) → OpenAiChatResult[source]

Create a model response for the given prompt (chat conversation).

Parameters:
  • prompt (str | List[Dict[str, str]]) – The prompt for the model.

  • completion_kwargs (Dict[str, Any] | None) –

    Keyword arguments for the OpenAI completion.

    • model cannot be set via completion_kwargs! Please set the model in the initializer.

    • messages cannot be set via completion_kwargs! Please set the prompt argument.

    Also see the OpenAI API reference: Create chat completion.

  • clean_openai_tokens (bool) – Remove OpenAI special tokens from the prompt.

Returns:

The result of the OpenAI completion.

Return type:

OpenAiChatResult
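
A sketch of the asynchronous variant; it takes the same arguments as create_completions() (model name and key are placeholders):

  import asyncio

  from mltb2.openai import OpenAiChat

  chat = OpenAiChat(model="gpt-4", api_key="my-api-key")

  async def main() -> None:
      # Same arguments and result type as create_completions(), but awaitable.
      result = await chat.create_completions_async("Write a haiku about the sea.")
      print(result.content)

  asyncio.run(main())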

classmethod from_yaml(yaml_file, api_key: str | None = None, **kwargs)[source]

Construct this class from a yaml file.

If the api_key is not set in the yaml file, it will be loaded from the environment variable OPENAI_API_KEY.

Parameters:
  • yaml_file – The yaml file.

  • api_key (str | None) – The OpenAI API key.

  • kwargs – Extra keyword arguments that override parameters from the yaml file.

Returns:

The constructed class.
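
A sketch; the file name is hypothetical and the yaml keys are assumed to mirror the constructor parameters:

  # chat.yaml (hypothetical file):
  #   model: gpt-4

  from mltb2.openai import OpenAiChat

  # api_key is not in the yaml file, so it is read from OPENAI_API_KEY.
  chat = OpenAiChat.from_yaml("chat.yaml")

  # Extra kwargs override parameters from the yaml file.
  chat = OpenAiChat.from_yaml("chat.yaml", model="gpt-3.5-turbo")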

class mltb2.openai.OpenAiChatResult(content: str | None = None, model: str | None = None, prompt_tokens: int | None = None, completion_tokens: int | None = None, total_tokens: int | None = None, finish_reason: str | None = None, completion_args: Dict[str, Any] | None = None)[source]

Bases: object

Result of an OpenAI chat completion.

If you want to convert this to a dict, use asdict(open_ai_chat_result) from the dataclasses module.

See also

OpenAI API reference: The chat completion object

Parameters:
  • content (str | None) – the result of the OpenAI completion

  • model (str | None) – model name which has been used

  • prompt_tokens (int | None) – number of tokens of the prompt

  • completion_tokens (int | None) – number of tokens of the completion (content)

  • total_tokens (int | None) – number of total tokens (prompt_tokens + completion_tokens)

  • finish_reason (str | None) –

    The reason why the completion stopped.

    • stop: Means the API returned the full completion without running into any token limit.

    • length: Means the API stopped the completion because of running into a token limit.

    • content_filter: When content was omitted due to a flag from the OpenAI content filters.

    • tool_calls: When the model called a tool.

    • function_call (deprecated): When the model called a function.

  • completion_args (Dict[str, Any] | None) –

    The arguments which have been used for the completion. Examples:

    • model: always set

    • max_tokens: only set if completion_kwargs contained max_tokens

    • temperature: only set if completion_kwargs contained temperature

    • top_p: only set if completion_kwargs contained top_p
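
A sketch of reading the result fields and the asdict() conversion mentioned above; model name and key are placeholders:

  from dataclasses import asdict

  from mltb2.openai import OpenAiChat

  chat = OpenAiChat(model="gpt-4", api_key="my-api-key")
  result = chat.create_completions("Hello!")

  print(result.content)        # the completion text
  print(result.total_tokens)   # prompt_tokens + completion_tokens
  print(result.finish_reason)  # e.g. "stop" or "length"

  # Convert the dataclass to a plain dict, e.g. for logging or serialization.
  result_dict = asdict(result)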

classmethod from_chat_completion(chat_completion: ChatCompletion, completion_kwargs: Dict[str, Any] | None = None)[source]

Construct this class from an OpenAI ChatCompletion object.

Parameters:
  • chat_completion (ChatCompletion) – The OpenAI ChatCompletion object.

  • completion_kwargs (Dict[str, Any] | None) – The arguments which have been used for the completion.

Returns:

The constructed class.
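
A sketch that wraps a raw ChatCompletion obtained with the official openai client; the client call and its arguments are illustrative and not part of mltb2:

  from openai import OpenAI

  from mltb2.openai import OpenAiChatResult

  client = OpenAI(api_key="my-api-key")  # placeholder key
  completion_kwargs = {"model": "gpt-4", "temperature": 0.0}
  chat_completion = client.chat.completions.create(
      messages=[{"role": "user", "content": "Hello!"}],
      **completion_kwargs,
  )

  # Pass the arguments that were used for the completion along with the raw object.
  result = OpenAiChatResult.from_chat_completion(
      chat_completion, completion_kwargs=completion_kwargs
  )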

class mltb2.openai.OpenAiTokenCounter(model_name: str, show_progress_bar: bool = False)[source]

Bases: object

Count OpenAI tokens.

Parameters:
  • model_name (str) –

    The OpenAI model name. Some examples:

    • gpt-4

    • gpt-3.5-turbo

    • text-davinci-003

    • text-embedding-ada-002

  • show_progress_bar (bool) – Show a progressbar during processing.

__call__(text: str | Iterable) → int | List[int][source]

Count tokens for text.

Parameters:

text (str | Iterable) – The text for which the tokens are to be counted.

Returns:

The number of tokens if text is a str. If text is an Iterable, a list with the number of tokens for each element.

Return type:

int | List[int]
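
A usage sketch with one of the model names listed above:

  from mltb2.openai import OpenAiTokenCounter

  counter = OpenAiTokenCounter(model_name="gpt-4")

  # A single string returns a single count ...
  num_tokens = counter("How many tokens does this sentence have?")

  # ... an iterable of strings returns a list of counts.
  token_counts = counter(["First sentence.", "Second sentence."])

  print(num_tokens, token_counts)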

class mltb2.openai._OpenAiAzureChatBase(azure_endpoint: str)[source]

Bases: object

Parameters:

azure_endpoint (str) –

mltb2.openai.remove_openai_tokens(messages: List[Dict[str, str]]) → List[Dict[str, str]][source]

Remove OpenAI special tokens from the messages.

These tokens are <|im_start|> and <|im_end|> and they can cause problems when passed to the OpenAI API.

Parameters:

messages (List[Dict[str, str]]) – The OpenAI messages.

Returns:

The messages without OpenAI special tokens.

Return type:

List[Dict[str, str]]
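
A usage sketch; as an alternative, create_completions() can apply the same cleanup via its clean_openai_tokens argument:

  from mltb2.openai import remove_openai_tokens

  messages = [
      {"role": "user", "content": "What do <|im_start|> and <|im_end|> mean?"},
  ]

  # Returns the messages with the <|im_start|> and <|im_end|> special tokens removed.
  cleaned_messages = remove_openai_tokens(messages)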