- Add top-k and cosine-threshold parameters to the API server
- Update .env and CLI parameter handling for the new parameters (see the first sketch after this list)
- Improve splash screen display
- Update the base and storage classes to read the new parameters from the .env file
- Remove the model parameter from azure_openai_complete (all LLM complete functions must share the same parameter structure)
- Use the LLM_MODEL env var in the Azure OpenAI completion function (see the second sketch after this list)
- Comment out the Lollms example in .env.example (duplicates the Ollama example)
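
A minimal sketch of how the new retrieval parameters could be wired through .env and the CLI. The env var names (`TOP_K`, `COSINE_THRESHOLD`), flag names, and default values here are illustrative assumptions, not the exact identifiers used in the codebase; it assumes `python-dotenv` is available for loading the .env file.

```python
# Illustrative sketch: TOP_K / COSINE_THRESHOLD env vars and matching CLI flags
# are assumed names, with .env values acting as defaults for the CLI.
import argparse
import os

from dotenv import load_dotenv

load_dotenv()  # populate os.environ from the .env file


def get_env_value(key, default, cast=str):
    """Read a value from the environment, casting it, with a fallback default."""
    value = os.environ.get(key)
    return cast(value) if value is not None else default


parser = argparse.ArgumentParser(description="RAG API server")
parser.add_argument(
    "--top-k",
    type=int,
    default=get_env_value("TOP_K", 60, int),  # assumed default
    help="Number of top results to retrieve per query",
)
parser.add_argument(
    "--cosine-threshold",
    type=float,
    default=get_env_value("COSINE_THRESHOLD", 0.2, float),  # assumed default
    help="Minimum cosine similarity for vector-store matches",
)
args = parser.parse_args()
```

CLI flags take precedence here simply because the .env values are only used as argparse defaults; the storage classes can then receive `args.top_k` and `args.cosine_threshold` (or read the same env vars directly).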
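
A hedged sketch of the azure_openai_complete change: the function no longer accepts a model argument, and the deployment/model name is taken from the `LLM_MODEL` env var instead, so its signature lines up with the other complete functions. This assumes the openai Python SDK (>= 1.0) and the usual `AZURE_OPENAI_*` environment variables; it is not the project's exact implementation.

```python
# Sketch only: model name comes from the LLM_MODEL env var, not a parameter,
# keeping the signature identical to the other LLM complete functions.
import os

from openai import AsyncAzureOpenAI


async def azure_openai_complete(
    prompt, system_prompt=None, history_messages=None, **kwargs
) -> str:
    client = AsyncAzureOpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version=os.environ["AZURE_OPENAI_API_VERSION"],
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    )

    # Build the chat history: optional system prompt, prior turns, then the user prompt.
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.extend(history_messages or [])
    messages.append({"role": "user", "content": prompt})

    response = await client.chat.completions.create(
        model=os.environ.get("LLM_MODEL", "gpt-4o-mini"),  # env-driven; fallback is an assumption
        messages=messages,
        **kwargs,
    )
    return response.choices[0].message.content
```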