openai_interface

Interface to use an LLM served by OpenAI.

Classes

OpenAILLMInterface

Initializes the interface for an OpenAI-served LLM.

Module Contents

class openai_interface.OpenAILLMInterface(configuration_path: str, use_chat_api: bool = False, default_response: str = None)

Bases: usersimcrs.simulator.llm.interfaces.llm_interface.LLMInterface

Initializes the interface for an OpenAI-served LLM.

Parameters:
  • configuration_path – Path to the configuration file.

  • use_chat_api – Whether to use the chat or completion API. Defaults to False (i.e., completion API).

  • default_response – Default response to be used if the LLM fails to generate a response.

Raises:

FileNotFoundError – If the configuration file is not found.
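The constructor's fail-fast behavior on a missing configuration file can be sketched as follows. This is a minimal, self-contained illustration of the pattern, not the class's actual implementation; the JSON format and the `load_configuration` helper name are assumptions for the example.

```python
import json
from pathlib import Path


def load_configuration(configuration_path: str) -> dict:
    # Mirror the constructor's contract: raise FileNotFoundError
    # immediately if the configuration file does not exist.
    path = Path(configuration_path)
    if not path.is_file():
        raise FileNotFoundError(
            f"Configuration file not found: {configuration_path}"
        )
    with path.open() as f:
        return json.load(f)
```

Validating the path up front surfaces a misconfigured simulator at construction time rather than on the first API call.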

generate_utterance(prompt: usersimcrs.simulator.llm.prompt.utterance_generation_prompt.UtteranceGenerationPrompt) → dialoguekit.core.Utterance

Generates a user utterance given a prompt.

Parameters:

prompt – Prompt for generating the utterance.

Returns:

Utterance in natural language.
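The `default_response` fallback described for this interface can be sketched as below. The helper name `generate_with_fallback` and the exact failure conditions are assumptions; the real class delegates generation to the OpenAI client.

```python
from typing import Callable, Optional


def generate_with_fallback(
    generate_fn: Callable[[str], Optional[str]],
    prompt: str,
    default_response: Optional[str],
) -> str:
    # Try the LLM first; if the call raises or yields an empty
    # response, fall back to the configured default response.
    try:
        response = generate_fn(prompt)
    except Exception:
        response = None
    return response if response else (default_response or "")
```

This keeps the simulator robust: a transient API failure produces the configured default utterance instead of crashing the dialogue.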

get_llm_api_response(prompt: str, initial_prompt: str = None) → str

Gets the raw response from the LLM API.

This method should be used to interact directly with the LLM API, i.e., for everything that is not related to the generation of an utterance.

Parameters:
  • prompt – Prompt for the LLM.

  • initial_prompt – Initial prompt for the chat API. Defaults to None.

Returns:

Response from the LLM API without any post-processing.
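How `use_chat_api` and `initial_prompt` might shape the outgoing request can be sketched as follows. The message fields mirror the OpenAI chat API (`role`/`content`), but the `build_request` helper and the exact wiring inside the class are assumptions for illustration.

```python
from typing import Optional


def build_request(
    prompt: str,
    use_chat_api: bool = False,
    initial_prompt: Optional[str] = None,
) -> dict:
    # With the chat API, the initial prompt becomes a system message
    # preceding the user turn; the completion API takes a single
    # prompt string, so initial_prompt is not used.
    if use_chat_api:
        messages = []
        if initial_prompt is not None:
            messages.append({"role": "system", "content": initial_prompt})
        messages.append({"role": "user", "content": prompt})
        return {"messages": messages}
    return {"prompt": prompt}
```

This also shows why `initial_prompt` defaults to `None`: it is only meaningful when the chat API is selected.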