- Added support for running the CLI and the Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
30 lines · 308 B · Plaintext
typing-extensions
langchain-openai
langchain-experimental
langchain_anthropic
langchain_google_genai
pandas
yfinance
praw
feedparser
stockstats
eodhd
langgraph
chromadb
setuptools
backtrader
akshare
tushare
finnhub-python
parsel
requests
tqdm
pytz
redis
chainlit
rich
questionary
ollama
pytest
python-dotenv
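The commit message above mentions gating the Ollama server launch on the LLM_PROVIDER environment variable. A minimal sketch of that check might look like the following; the variable name comes from the commit message, while the helper name, its default, and the case-insensitive comparison are assumptions for illustration:

```python
import os


def ollama_enabled(env=None):
    """Return True when LLM_PROVIDER selects the local Ollama backend.

    The env mapping defaults to os.environ; passing a dict makes the
    check easy to test. Treating a missing variable as "disabled" and
    comparing case-insensitively are assumptions, not confirmed behavior.
    """
    if env is None:
        env = os.environ
    return env.get("LLM_PROVIDER", "").strip().lower() == "ollama"
```

An entrypoint script (or Python launcher) could call such a helper to decide whether to start `ollama serve` before the CLI, leaving other providers untouched.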