ollama_interface

Interface for using an LLM served by Ollama.

Classes

OllamaLLMInterface

Initializes the interface for an Ollama-served LLM.

Module Contents

class ollama_interface.OllamaLLMInterface(configuration_path: str, default_response: str = None)

Bases: usersimcrs.simulator.llm.interfaces.llm_interface.LLMInterface

Initializes the interface for an Ollama-served LLM.

Parameters:
  • configuration_path – Path to the configuration file.

  • default_response – Default response to be used if the LLM fails to generate a response.

Raises:
  • FileNotFoundError – If the configuration file is not found.

  • ValueError – If the model or host is not specified in the configuration file.
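
Example (a minimal instantiation sketch; the configuration path and its contents are hypothetical, and the import path is inferred from the module location above):

   from usersimcrs.simulator.llm.interfaces.ollama_interface import (
       OllamaLLMInterface,
   )

   # The configuration file must specify at least the Ollama model and
   # host; otherwise a ValueError is raised (see above).
   interface = OllamaLLMInterface(
       configuration_path="config/ollama_config.yaml",  # hypothetical path
       default_response="Sorry, I did not understand that.",
   )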

generate_utterance(prompt: usersimcrs.simulator.llm.prompt.utterance_generation_prompt.UtteranceGenerationPrompt) → dialoguekit.core.Utterance

Generates a user utterance given a prompt.

Parameters:
  • prompt – Prompt for generating the utterance.

Returns:
  Utterance in natural language.
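
Example (sketch; assumes an UtteranceGenerationPrompt instance has already been constructed, as its constructor arguments are not documented here, and that the returned DialogueKit Utterance exposes its text via the "text" attribute):

   # prompt: a previously built UtteranceGenerationPrompt instance.
   utterance = interface.generate_utterance(prompt)
   print(utterance.text)  # Generated utterance in natural language.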

get_llm_api_response(prompt: str, **kwargs) → str

Gets the raw response from the LLM API.

This method should be used to interact directly with the LLM API, i.e., for anything that is not related to generating an utterance.

Parameters:
  • prompt – Prompt for the LLM.

  • **kwargs – Additional arguments to be passed to the API call.

Returns:
  Response from the LLM API without any post-processing.
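
Example (sketch of a direct API call for a task other than utterance generation; the prompt string is illustrative):

   # Returns the raw model output as a string, with no post-processing.
   raw_response = interface.get_llm_api_response(
       "Summarize the user's preferences in one sentence."
   )
   print(raw_response)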