Docker support and Ollama support (#47)
- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
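The conditional Ollama launch described above could be sketched as a Docker entrypoint helper like the following. This is an illustration only: the function name, the exact LLM_PROVIDER comparison, and the echoed messages are assumptions, not taken from this commit.

```shell
#!/bin/sh
# Hypothetical entrypoint helper: start the Ollama server only when the
# configured provider is Ollama. Anything else skips the launch so the
# CLI can talk to a remote provider instead.
start_ollama_if_needed() {
    if [ "$LLM_PROVIDER" = "ollama" ]; then
        # Run the local Ollama server in the background.
        ollama serve &
        echo "ollama started"
    else
        echo "ollama skipped"
    fi
}
```

An entrypoint script would call this before exec-ing the CLI command passed to the container.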
@@ -1,6 +1,8 @@
 typing-extensions
 langchain-openai
 langchain-experimental
+langchain_anthropic
+langchain_google_genai
 pandas
 yfinance
 praw
@@ -22,5 +24,6 @@ redis
 chainlit
 rich
 questionary
-langchain_anthropic
-langchain-google-genai
+ollama
+pytest
+python-dotenv