yaze 0.3.2
Link to the Past ROM Editor
 
yaze::cli::OllamaConfig Struct Reference

#include <ollama_ai_service.h>

Public Attributes

std::string base_url = "http://localhost:11434"
 
std::string model = "qwen2.5-coder:7b"
 
float temperature = 0.1
 
int max_tokens = 2048
 
std::string system_prompt
 
bool use_enhanced_prompting = true
 

Detailed Description

Definition at line 16 of file ollama_ai_service.h.
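OllamaConfig groups the connection and generation settings used by the CLI's Ollama-backed AI service. A minimal usage sketch follows; it assumes the service constructor accepts an OllamaConfig, which is only implied by the member references below, and the values shown simply mirror the attribute initializers above:

    #include <ollama_ai_service.h>

    int main() {
      yaze::cli::OllamaConfig config;
      config.base_url = "http://localhost:11434";  // local Ollama server (the default)
      config.model = "qwen2.5-coder:7b";           // any model tag pulled via `ollama pull`
      config.temperature = 0.1f;                   // low randomness for code-editing tasks
      config.max_tokens = 2048;                    // upper bound on generated tokens

      // Assumption: the constructor takes the config; this page only documents
      // that OllamaAIService::OllamaAIService() reads use_enhanced_prompting.
      yaze::cli::OllamaAIService service(config);
      return 0;
    }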

Member Data Documentation

◆ base_url

std::string yaze::cli::OllamaConfig::base_url = "http://localhost:11434"

◆ model

std::string yaze::cli::OllamaConfig::model = "qwen2.5-coder:7b"

◆ temperature

float yaze::cli::OllamaConfig::temperature = 0.1

Definition at line 19 of file ollama_ai_service.h.

Referenced by yaze::cli::OllamaAIService::GenerateResponse().

◆ max_tokens

int yaze::cli::OllamaConfig::max_tokens = 2048

Definition at line 20 of file ollama_ai_service.h.

Referenced by yaze::cli::OllamaAIService::GenerateResponse().
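Both temperature and max_tokens are consumed by GenerateResponse(). In Ollama's public HTTP API, such settings are carried in the request's "options" object, where the token limit is named "num_predict"; the payload below is an illustrative sketch of that mapping, not the actual serialization performed by GenerateResponse():

    #include <string>

    // Illustrative POST body for {base_url}/api/generate. Mapping max_tokens
    // to "num_predict" follows Ollama's documented API, not yaze's code.
    const std::string kRequestBody = R"({
      "model": "qwen2.5-coder:7b",
      "prompt": "Summarize the overworld map format.",
      "stream": false,
      "options": { "temperature": 0.1, "num_predict": 2048 }
    })";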

◆ system_prompt

std::string yaze::cli::OllamaConfig::system_prompt

◆ use_enhanced_prompting

bool yaze::cli::OllamaConfig::use_enhanced_prompting = true

Definition at line 22 of file ollama_ai_service.h.

Referenced by yaze::cli::OllamaAIService::OllamaAIService().
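Because the constructor reads this flag, it must be set before the service object is created. A hedged sketch is shown below; the helper name MakeBasicService is hypothetical, the constructor shape is assumed as in the earlier example, and how the flag and system_prompt interact is internal to OllamaAIService:

    void MakeBasicService() {
      yaze::cli::OllamaConfig config;
      config.use_enhanced_prompting = false;  // opt out of the enhanced prompting path
      config.system_prompt = "You answer questions about Link to the Past ROM editing.";
      yaze::cli::OllamaAIService service(config);  // assumed config-taking constructor
    }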


The documentation for this struct was generated from the following file: ollama_ai_service.h