- Added support for running the CLI and Ollama server via Docker
- Introduced tests for the local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
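A conditional launch like the one described is typically handled in the container's entrypoint script. The sketch below is a hypothetical illustration, not this repo's actual script: it assumes `LLM_PROVIDER=ollama` is the value that selects the local server, and that the `ollama` binary is on the image's PATH.

```shell
#!/bin/sh
# Hypothetical Docker entrypoint: start the Ollama server only when
# LLM_PROVIDER selects it, then hand off to the CLI command.
# The variable value "ollama" is an assumption for illustration.
if [ "${LLM_PROVIDER:-}" = "ollama" ]; then
    ollama serve &    # run the local Ollama server in the background
fi
exec "$@"             # run whatever command was passed to the container
```

With this pattern, setting `LLM_PROVIDER` to any other value (e.g. a hosted API provider) skips the server startup entirely, so the same image works for both local and remote inference.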
16 lines
149 B
Plaintext
env/
__pycache__/
.DS_Store
*.csv
src/
eval_results/
eval_data/
*.egg-info/
.ollama/
ollama_data/
.local/
.cache/
.pytest_cache/
.devcontainer/
.env