- Added support for running the CLI and the Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via `LLM_PROVIDER`
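As a rough illustration, the conditional launch could be wired into a container entrypoint along these lines. This is a minimal sketch, not the PR's actual code: the script name, the two-second wait, and the assumption that `LLM_PROVIDER=ollama` is the trigger value are all illustrative.

```sh
#!/usr/bin/env sh
# entrypoint.sh (illustrative sketch): start the Ollama server only when
# the configured provider is Ollama, then hand off to the CLI.
set -e

if [ "$LLM_PROVIDER" = "ollama" ]; then
  # Launch the Ollama server in the background and give it a moment
  # to start accepting connections before the CLI runs.
  ollama serve &
  sleep 2
fi

# Run the CLI with whatever arguments the container was given.
exec "$@"
```

With other providers (e.g. OpenRouter), the `if` branch is skipped and the container runs only the CLI, so no local server process is started.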
PULL REQUEST: Add support for other backends, such as OpenRouter and Ollama. The original PR was missing two requirements; this change adds them.