Docker support and Ollama support (#47)

- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
Geeta Chauhan
2025-06-25 20:57:05 -07:00
committed by GitHub
parent 7abff0f354
commit 78ea029a0b
23 changed files with 2141 additions and 19 deletions


@@ -192,6 +192,10 @@ print(decision)
You can view the full list of configurations in `tradingagents/default_config.py`.
## Docker Usage and Local Ollama Tests
See [Docker Readme](./Docker-readme.md) for details.
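The commit description mentions gating the Ollama server launch on the `LLM_PROVIDER` environment variable. A minimal entrypoint sketch of that idea is below; `LLM_PROVIDER` comes from the commit, but the function name, messages, and the commented-out `ollama serve` call are illustrative assumptions, not the repository's actual script.

```shell
#!/bin/sh
# Sketch: start the local Ollama server only when LLM_PROVIDER selects it.
# (Helper name and launch command are assumptions for illustration.)

should_start_ollama() {
    # The launch is conditional on this environment variable.
    [ "${LLM_PROVIDER:-}" = "ollama" ]
}

if should_start_ollama; then
    echo "LLM_PROVIDER=ollama: starting local Ollama server"
    # ollama serve &   # assumed launch command; run in the background
else
    echo "LLM_PROVIDER=${LLM_PROVIDER:-unset}: skipping Ollama server"
fi
```

With `LLM_PROVIDER=openai` (or unset), the container would skip the local server and rely on the remote provider instead.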
## Contributing
We welcome contributions from the community! Whether it's fixing a bug, improving documentation, or suggesting a new feature, your input helps make this project better. If you are interested in this line of research, please consider joining our open-source financial AI research community [Tauric Research](https://tauric.ai/).