openai_interface¶
Interface to use an LLM served by OpenAI.
Classes¶
OpenAILLMInterface – Initializes interface for an OpenAI-served LLM.
Module Contents¶
- class openai_interface.OpenAILLMInterface(configuration_path: str, use_chat_api: bool = False, default_response: str = None)¶
Bases:
usersimcrs.simulator.llm.interfaces.llm_interface.LLMInterface
Initializes interface for an OpenAI-served LLM.
- Parameters:
configuration_path – Path to the configuration file.
use_chat_api – Whether to use the chat API instead of the completion API. Defaults to False (i.e., completion API).
default_response – Default response to return if the LLM fails to generate a response.
- Raises:
FileNotFoundError – If the configuration file is not found.
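The use_chat_api flag selects between two request shapes against the OpenAI API. As a rough sketch of that distinction (an assumption about the internals, not the library's actual code), the two modes correspond to different payloads:

```python
def build_request(prompt: str, use_chat_api: bool = False) -> dict:
    """Sketch of how a use_chat_api flag might switch payload shapes.

    Hypothetical helper for illustration only; the real interface
    delegates to the OpenAI client library instead.
    """
    if use_chat_api:
        # Chat API expects a list of role-tagged messages.
        return {"messages": [{"role": "user", "content": prompt}]}
    # Completion API expects a plain prompt string.
    return {"prompt": prompt}
```

The chat payload wraps the prompt in a single user message, while the completion payload passes it through unchanged.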
- generate_response(prompt: usersimcrs.simulator.llm.prompt.Prompt) → dialoguekit.core.Utterance¶
Generates a user utterance given a prompt.
- Parameters:
prompt – Prompt for generating the utterance.
- Returns:
The generated utterance.
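The default_response parameter implies a fallback pattern: if the call to the LLM fails, the configured default is returned instead. A minimal self-contained sketch of that pattern (simplified stand-in types, not the library's implementation) might look like:

```python
from typing import Optional


class FallbackLLM:
    """Sketch of the default_response fallback described above.

    Hypothetical class for illustration; the real interface wraps
    the OpenAI client and returns dialoguekit Utterance objects.
    """

    def __init__(self, default_response: Optional[str] = None) -> None:
        self.default_response = default_response

    def generate_response(self, prompt: str) -> str:
        try:
            return self._call_llm(prompt)
        except Exception:
            # Fall back to the configured default if generation fails.
            if self.default_response is not None:
                return self.default_response
            raise

    def _call_llm(self, prompt: str) -> str:
        # Stand-in for the real API call; always fails here so the
        # fallback path is exercised.
        raise RuntimeError("LLM failed to generate a response")
```

With `FallbackLLM(default_response="Sorry, I did not understand.")`, a failed generation returns the default string rather than propagating the error.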