Docker support and Ollama support (#47)
- Added support for running the CLI and the Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
@@ -192,6 +192,10 @@ print(decision)
You can view the full list of configurations in `tradingagents/default_config.py`.
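A minimal sketch of overriding those defaults to run against a local Ollama server (the `llm_provider` and `backend_url` keys follow `default_config.py`; the model names and URL are illustrative assumptions, adjust them to your setup):

```python
from tradingagents.graph.trading_graph import TradingAgentsGraph
from tradingagents.default_config import DEFAULT_CONFIG

# Copy the defaults and point the LLM backend at a local Ollama
# server; key names assume those defined in default_config.py.
config = DEFAULT_CONFIG.copy()
config["llm_provider"] = "ollama"                    # assumed key/value
config["backend_url"] = "http://localhost:11434/v1"  # default Ollama port, OpenAI-compatible path
config["deep_think_llm"] = "llama3.1"                # illustrative local model
config["quick_think_llm"] = "llama3.1"

ta = TradingAgentsGraph(debug=True, config=config)
_, decision = ta.propagate("NVDA", "2024-05-10")
print(decision)
```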
## Docker Usage and Local Ollama Tests
See [Docker Readme](./Docker-readme.md) for details.
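The commit message notes that the Ollama server is launched conditionally via `LLM_PROVIDER`. A hedged sketch of that gating logic (the env var name comes from the commit message; where the check lives and the launch step itself are assumptions, presumably part of the Docker entrypoint):

```python
import os
import subprocess

# Sketch: only start a local Ollama server when LLM_PROVIDER
# selects it; otherwise the CLI talks to a remote provider.
if os.environ.get("LLM_PROVIDER", "").lower() == "ollama":
    subprocess.Popen(["ollama", "serve"])  # standard Ollama CLI command
```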
## Contributing
We welcome contributions from the community! Whether it's fixing a bug, improving documentation, or suggesting a new feature, your input helps make this project better. If you are interested in this line of research, please consider joining our open-source financial AI research community [Tauric Research](https://tauric.ai/).