ollama_interface¶
Interface to use an LLM served by Ollama.
Classes¶
OllamaLLMInterface — Initializes interface for an Ollama-served LLM.
Module Contents¶
- class ollama_interface.OllamaLLMInterface(configuration_path: str, default_response: str = None)¶
Bases:
usersimcrs.simulator.llm.interfaces.llm_interface.LLMInterface
Initializes interface for an Ollama-served LLM.
- Parameters:
configuration_path – Path to the configuration file.
default_response – Default response to be used if the LLM fails to generate a response.
- Raises:
FileNotFoundError – If the configuration file is not found.
ValueError – If the model or host is not specified in the config.
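The ValueError above implies the configuration file must name both a model and a host. The exact file format is not documented here; a minimal sketch, assuming a YAML configuration where `model` and `host` are top-level keys (the key names are taken from the error description, and the host value is Ollama's default local endpoint):

```yaml
# Hypothetical configuration file for OllamaLLMInterface.
# Both keys below are required; omitting either raises ValueError.
model: llama3        # name of a model pulled into the local Ollama instance
host: http://localhost:11434   # default Ollama server address
```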
- generate_response(prompt: usersimcrs.simulator.llm.prompt.Prompt) → dialoguekit.core.Utterance¶
Generates a user utterance given a prompt.
- Parameters:
prompt – Prompt for generating the utterance.
- Returns:
The generated utterance.
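The constructor's `default_response` parameter suggests a fallback pattern: if the LLM call fails, the interface returns the default response instead of propagating the error. A minimal, self-contained sketch of that pattern, using a hypothetical stand-in class (this is not the real `OllamaLLMInterface`, and `_call_llm` is a placeholder for the actual HTTP request to the Ollama host):

```python
class FallbackLLMInterface:
    """Illustrative stand-in showing the default-response fallback."""

    def __init__(self, default_response: str = None) -> None:
        self.default_response = default_response

    def _call_llm(self, prompt: str) -> str:
        # Placeholder for the real request to the Ollama server;
        # here it always fails to demonstrate the fallback path.
        raise ConnectionError("Ollama host unreachable")

    def generate_response(self, prompt: str) -> str:
        try:
            return self._call_llm(prompt)
        except Exception:
            # Fall back to the configured default if one was given.
            if self.default_response is not None:
                return self.default_response
            raise


iface = FallbackLLMInterface(default_response="Sorry, I did not understand.")
print(iface.generate_response("Recommend a movie."))
```

With no `default_response` configured, the underlying error would propagate to the caller instead.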