Commit Graph

1838 Commits

Author SHA1 Message Date
yangdx
ef73ed4291 Install Lightrag as a Linux Service (sample files and installation guide) 2025-01-19 12:53:13 +08:00
yangdx
a7b37652cf Add document scan API notes in API README.md 2025-01-19 12:24:46 +08:00
yangdx
3a227701b2 pre-commit run --all-files 2025-01-19 10:44:46 +08:00
yangdx
fb9a645f5e Update README with Ollama API and Open WebUI details
- Add section on query mode selection
- Separate Ollama API and Open WebUI details
- Clarify query prefix usage
2025-01-19 10:38:01 +08:00
yangdx
a78be2ab17 pre-commit run --all-files 2025-01-19 08:07:26 +08:00
yangdx
ea88981146 Update README.md for LightRAG Server 2025-01-19 06:45:32 +08:00
yangdx
853a9d2064 Updated API version to 1.0.3
- Bumped API version to 1.0.3
- Fixed version reference in server code
2025-01-19 06:06:17 +08:00
yangdx
387be31f09 Refactor embedding function initialization and remove start-server.sh
- Simplified RAG initialization logic by deduplicating embedding function
- Removed start-server.sh script which is not needed
- No functional changes to the application
2025-01-19 05:19:02 +08:00
yangdx
8ea179a98b Migrate Ollama API to lightrag_server.py 2025-01-19 04:44:30 +08:00
Nick French
df69d386c5 Fixes #596 - Hardcoded model deployment name in azure_openai_complete
Fixes #596

Update `azure_openai_complete` function to accept a model parameter with a default value of 'gpt-4o-mini'.

* Modify the function signature of `azure_openai_complete` to include a `model` parameter with a default value of 'gpt-4o-mini'.
* Pass the `model` parameter to the `azure_openai_complete_if_cache` function instead of the hardcoded model name 'conversation-4o-mini'.

---

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/HKUDS/LightRAG/issues/596?shareId=XXXX-XXXX-XXXX-XXXX).
2025-01-17 12:10:26 -05:00
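As context for the change described above, a minimal sketch of the updated wrapper, assuming the surrounding parameters (`prompt`, `system_prompt`, `history_messages`) follow the usual LightRAG completion-function shape; only the `model` parameter, its default, and the call to `azure_openai_complete_if_cache` come from the commit text:

```python
# Sketch only; parameter names other than `model` are assumed, not taken from the diff.
async def azure_openai_complete(
    prompt,
    system_prompt=None,
    history_messages=[],
    model="gpt-4o-mini",  # new parameter; replaces the hardcoded 'conversation-4o-mini'
    **kwargs,
) -> str:
    # Forward the caller-supplied deployment name instead of a hardcoded one.
    return await azure_openai_complete_if_cache(
        model,
        prompt,
        system_prompt=system_prompt,
        history_messages=history_messages,
        **kwargs,
    )
```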
Saifeddine ALOUI
35f04b51e6 Update lightrag_server.py 2025-01-17 11:18:45 +01:00
zrguo
28a84b2aa2 Merge pull request #592 from danielaskdd/yangdx
Add Ollama compatible API server
2025-01-17 14:29:31 +08:00
yangdx
fde0aa32c7 pre-commit run --all-files 2025-01-17 14:28:24 +08:00
yangdx
a561879040 Translate comments to English 2025-01-17 14:27:27 +08:00
yangdx
fa9765ecd9 pre-commit run --all-files 2025-01-17 14:20:55 +08:00
yangdx
939e399dd4 Translate comment to English 2025-01-17 13:36:31 +08:00
zrguo
7e4ba8d14d Merge pull request #591 from luohuanhuan2019/main
add readme_zh
2025-01-17 12:07:27 +08:00
yangdx
3138ae7599 Add support for the mix query mode 2025-01-17 11:04:36 +08:00
Saifeddine ALOUI
6813742a86 fixed some linting issues 2025-01-17 02:34:29 +01:00
Saifeddine ALOUI
52ca5ea6aa removed repeated dependency 2025-01-17 01:37:12 +01:00
Saifeddine ALOUI
5fe28d31e9 Fixed linting 2025-01-17 01:36:16 +01:00
Saifeddine ALOUI
84f7f15046 Added optional Azure configuration 2025-01-17 00:54:24 +01:00
Saifeddine ALOUI
65a44a4644 Added API version and configuration details at startup, as well as more useful information 2025-01-17 00:53:49 +01:00
Saifeddine ALOUI
b8c0631e99 Enhanced documentation 2025-01-17 00:49:17 +01:00
Saifeddine ALOUI
d8309c81d5 Fixed typing error 2025-01-16 23:22:57 +01:00
Saifeddine ALOUI
ea566d815d Added environment-variable control of all LightRAG server parameters in preparation for Docker usage 2025-01-16 23:21:50 +01:00
Saifeddine ALOUI
2c3ff234e9 Moved extended API documentation to a new doc folder 2025-01-16 22:14:16 +01:00
yangdx
847963d19a Fix error in the /query and /query/stream endpoints when handling stream mode 2025-01-17 03:35:03 +08:00
yangdx
34d6b85adb Fix whitespace not being stripped correctly when cleaning the query prefix 2025-01-17 01:50:07 +08:00
luohuanhuan2019
98f5d7c596 Keep prompt wording consistent 2025-01-16 21:50:43 +08:00
luohuanhuan2019
36c7abf358 Remove garbled characters from the system prompt 2025-01-16 21:35:37 +08:00
yangdx
ac11a7192e Revert changes made by mistake 2025-01-16 21:04:45 +08:00
yangdx
d15753d51a Merge branch 'main' into yangdx 2025-01-16 20:20:09 +08:00
yangdx
95ff048a9e Add performance statistics to the Ollama API
- Add a token estimation function
- Record streaming response time
- Count input and output tokens
- Track response generation time
- Return detailed performance metrics
2025-01-16 19:42:34 +08:00
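The token counts mentioned above are estimates rather than exact tokenizer output; a hypothetical sketch of how such an estimate and the timing statistics could be gathered (function and variable names here are illustrative, not taken from the repository, while the response fields follow Ollama's documented stats keys):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: roughly one token per ~4 characters of text.
    return max(1, len(text) // 4)


def build_stats(prompt: str, response: str, start_ns: int, first_chunk_ns: int, end_ns: int) -> dict:
    # Collect Ollama-style timing/usage fields for the final streamed chunk (durations in nanoseconds).
    return {
        "prompt_eval_count": estimate_tokens(prompt),
        "eval_count": estimate_tokens(response),
        "prompt_eval_duration": first_chunk_ns - start_ns,  # time to first chunk
        "eval_duration": end_ns - first_chunk_ns,           # generation time
        "total_duration": end_ns - start_ns,
    }
```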
zrguo
d7cfe029eb Update __init__.py 2025-01-16 14:24:29 +08:00
zrguo
b84aab5cd0 Merge pull request #590 from jin38324/main
Enhance Robustness of insert Method with Pipeline Processing and Caching Mechanisms
2025-01-16 14:20:08 +08:00
Gurjot Singh
0265c2359c Merge branch 'HKUDS:main' into feature-implementation 2025-01-16 10:53:01 +05:30
jin
6ae8647285 support pipeline mode 2025-01-16 12:58:15 +08:00
jin
d5ae6669ea support pipeline mode 2025-01-16 12:52:37 +08:00
jin
17a2ec2bc4 Merge branch 'HKUDS:main' into main 2025-01-16 09:59:27 +08:00
yangdx
5e4c9dd4d7 Remove the API server's dependency on lightrag-hku (fixes the API service failing to start when not installed in editable/debug mode) 2025-01-16 03:26:47 +08:00
yangdx
ae9e37a120 Merge remote-tracking branch 'origin/main' into yangdx 2025-01-16 01:50:46 +08:00
yangdx
ea22d62c25 Remove debug log printing code 2025-01-15 23:11:15 +08:00
yangdx
9632a8f0dc Fix streaming responses not following the Ollama spec when a query hits the cache
- When the RAG result is a string, the response is sent in two parts
- The first part sends the query content
- The second part sends the statistics
2025-01-15 23:09:50 +08:00
yangdx
ca2caf47bc Change the streaming response output format from event-stream to x-ndjson 2025-01-15 22:14:57 +08:00
yangdx
6d44178f63 Fix stream-end detection in test cases 2025-01-15 21:26:20 +08:00
yangdx
af9ac188f0 Enhance debugging and performance statistics for the chat endpoint
- Add raw request logging
- Change the response structure to include performance statistics
- Update test cases to display performance data
- Refine the response format into a dictionary structure
- Add request body decoding
2025-01-15 21:15:12 +08:00
yangdx
8ef1248c76 Change the default value of OllamaChatRequest's stream parameter to True 2025-01-15 20:54:22 +08:00
yangdx
f81b1cdf0a Add an images field and performance statistics to Ollama API responses
- Add an images field to OllamaMessage
- Include the images field in response messages
- Add performance statistics to the completion marker
- Update test cases to handle performance statistics
- Remove the /naive prefix from test cases
2025-01-15 20:46:45 +08:00
yangdx
23f838ec94 Improve streaming response handling and add test cases
- Fix the completion-marker logic in streaming responses
- Add a non-streaming call test
- Add a streaming call test
- Improve JSON serialization to support non-ASCII characters
- Ensure the generator ends immediately after the completion marker
2025-01-15 20:18:17 +08:00
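Several of the commits above concern the Ollama-compatible streaming format: responses are sent as newline-delimited JSON (`application/x-ndjson`), non-ASCII text is serialized as-is, and the stream ends with a completion marker carrying the statistics. A minimal sketch of that pattern with FastAPI (the payload fields and hardcoded chunks are illustrative, not the exact upstream code):

```python
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


@app.post("/api/chat")
async def chat():
    async def stream():
        # Content chunks: one JSON object per line, done=False until the end.
        for chunk in ["Hello", ", 世界"]:
            payload = {"message": {"role": "assistant", "content": chunk}, "done": False}
            yield json.dumps(payload, ensure_ascii=False) + "\n"
        # Completion marker: the final line carries done=True plus performance statistics.
        yield json.dumps({"done": True, "total_duration": 123456789}, ensure_ascii=False) + "\n"
        # The generator ends immediately after the completion marker.

    return StreamingResponse(stream(), media_type="application/x-ndjson")
```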