Docker support and Ollama support (#47)
- Added support for running the CLI and the Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER (see the sketch below)
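The conditional launch could be wired up roughly as follows. This is a minimal sketch, assuming an entrypoint helper named `maybe_start_ollama` and the standard `ollama serve` command; it is not the code shipped in this commit.

```python
# Hypothetical sketch of a conditional Ollama launch gated by LLM_PROVIDER.
import os
import subprocess
from typing import Optional


def maybe_start_ollama() -> Optional[subprocess.Popen]:
    """Start a local Ollama server only when LLM_PROVIDER selects it."""
    provider = os.environ.get("LLM_PROVIDER", "openai").lower()
    if provider != "ollama":
        # e.g. "openai": rely on the remote API, no local server needed.
        return None
    # "ollama serve" exposes the local API (port 11434 by default).
    return subprocess.Popen(["ollama", "serve"])


if __name__ == "__main__":
    server = maybe_start_ollama()
    try:
        print("Ollama server started" if server else "Using remote LLM provider")
    finally:
        if server:
            server.terminate()
```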
10 tests/__init__.py Normal file
@@ -0,0 +1,10 @@
+"""
+TradingAgents Test Suite
+
+This package contains all test scripts for the TradingAgents application:
+- test_openai_connection.py: OpenAI API connectivity tests
+- test_ollama_connection.py: Ollama connectivity tests
+- test_setup.py: General setup and configuration tests
+"""
+
+__version__ = "1.0.0"