Docker support and Ollama support (#47)
- Added support for running the CLI and the Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER (see the sketch below)
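The conditional launch mentioned in the last bullet can be sketched roughly as follows. This is a minimal illustration under assumptions, not the entrypoint shipped in this commit: the provider value "ollama" and the `ollama serve` invocation are guesses at how the LLM_PROVIDER check might be wired up.

```python
import os
import subprocess

def maybe_start_ollama():
    """Launch a local Ollama server only when LLM_PROVIDER selects it.

    LLM_PROVIDER comes from the commit message; the expected value
    "ollama" and the use of `ollama serve` are assumptions here.
    """
    if os.environ.get("LLM_PROVIDER", "").lower() != "ollama":
        return None  # another provider is configured; nothing to launch
    # `ollama serve` blocks, so start it as a background process and let
    # the CLI reach it on the default local port.
    return subprocess.Popen(["ollama", "serve"])

if __name__ == "__main__":
    proc = maybe_start_ollama()
    print("Ollama server started" if proc else "Ollama not required")
```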
.gitignore (vendored) | 7 +++++++
@@ -6,3 +6,10 @@ src/
 eval_results/
 eval_data/
 *.egg-info/
+.ollama/
+ollama_data/
+.local/
+.cache/
+.pytest_cache/
+.devcontainer/
+.env