Commit Graph

8 Commits

Author SHA1 Message Date
vinland100 9ec07a6594 feat: Centralize Git tokens to system environment variables and add Gitea branch verification. 2026-01-05 17:12:47 +08:00
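A minimal sketch of what this commit describes, assuming the token lives in a GITEA_TOKEN system environment variable and the instance exposes the standard Gitea v1 API; the variable name and function shape are assumptions, not taken from the repository:

    # Hypothetical sketch: read a Gitea token from a system environment variable
    # and verify that a branch exists via the Gitea v1 API.
    import os
    import requests

    def branch_exists(base_url: str, owner: str, repo: str, branch: str) -> bool:
        token = os.environ["GITEA_TOKEN"]  # assumed variable name; raises KeyError if unset
        resp = requests.get(
            f"{base_url}/api/v1/repos/{owner}/{repo}/branches/{branch}",
            headers={"Authorization": f"token {token}"},
            timeout=10,
        )
        return resp.status_code == 200

    # e.g. branch_exists("https://gitea.example.com", "team", "repo", "main")
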
vinland100 2b0c7f5c2a feat: Lock LLM and embedding configurations to system environment variables, mask API keys, and refactor frontend logout. 2026-01-05 14:45:00 +08:00
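As a hedged illustration of the key-masking half of this commit (the variable names LLM_PROVIDER, LLM_MODEL, and LLM_API_KEY are assumptions, not the project's actual names):

    # Sketch only: read LLM settings from system environment variables and mask
    # the API key before it is returned to the frontend.
    import os

    def get_llm_config_for_display() -> dict:
        api_key = os.environ.get("LLM_API_KEY", "")  # assumed variable name
        masked = f"{api_key[:4]}****{api_key[-4:]}" if len(api_key) > 8 else "****"
        return {
            "provider": os.environ.get("LLM_PROVIDER", ""),  # assumed variable name
            "model": os.environ.get("LLM_MODEL", ""),        # assumed variable name
            "api_key": masked,  # the raw key never leaves the backend
        }
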
lintsinghua 8fe96a83cf feat(agent): use user-configured LLM parameters instead of hardcoded values
Refactor all Agent and LLM services to remove hardcoded temperature and max_tokens parameters
Add a get_analysis_config function to handle analysis configuration in one place
Show the user's saved configuration parameters in the LLM test endpoint
Show detailed LLM test information in the frontend debug panel by default
2025-12-19 16:08:26 +08:00
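A small sketch of the get_analysis_config idea from this commit; the field names and default values below are assumptions rather than the project's actual settings:

    # Illustrative sketch: build one shared analysis config from user
    # configuration instead of hardcoding temperature / max_tokens per agent.
    from dataclasses import dataclass

    @dataclass
    class AnalysisConfig:
        temperature: float
        max_tokens: int

    def get_analysis_config(user_config: dict) -> AnalysisConfig:
        """Return the analysis configuration used by all agents and LLM services."""
        return AnalysisConfig(
            temperature=float(user_config.get("temperature", 0.2)),  # assumed default
            max_tokens=int(user_config.get("max_tokens", 2048)),     # assumed default
        )

    # Agents then pass cfg.temperature / cfg.max_tokens to the LLM call instead
    # of literal values scattered through the codebase.
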
lintsinghua 2e11f3e1a3 feat(llm): enhance LLM error handling and debug information display
Add an api_response field to the LLMError exception class to store the raw error information
Implement an _extract_api_response method to extract the API response from exceptions
Add a debug information panel to the frontend that shows detailed error diagnostics
Return full debug information from the backend test endpoint, including elapsed time, error type, etc.
2025-12-19 11:41:06 +08:00
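A hedged sketch of the pattern this commit describes; the api_response field and _extract_api_response name follow the commit message, but the method bodies are assumptions:

    # Sketch only: keep the raw provider response on the exception so the
    # frontend debug panel can show it.
    from typing import Any, Optional

    class LLMError(Exception):
        def __init__(self, message: str, api_response: Optional[Any] = None):
            super().__init__(message)
            # Raw provider response (or error payload) preserved for debugging.
            self.api_response = api_response

    def _extract_api_response(exc: Exception) -> Optional[Any]:
        """Pull a provider response body out of an exception, if one is attached."""
        resp = getattr(exc, "response", None)
        if resp is None:
            return None
        try:
            return resp.json()
        except Exception:
            return getattr(resp, "text", str(resp))
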
lintsinghua f640bfbaba feat: add encrypted storage for sensitive information
- Add an encryption.py encryption service using Fernet symmetric encryption
- Store sensitive fields such as API keys and tokens encrypted in the database
- Decrypt automatically on read, remaining compatible with older unencrypted data
- Automatically refresh the frontend state after configuration is saved
2025-11-28 17:51:17 +08:00
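A minimal sketch of Fernet-based field encryption with a plaintext fallback for legacy rows, assuming the key is supplied via an ENCRYPTION_KEY environment variable (the variable and function names are assumptions):

    # Sketch only: encrypt sensitive fields before storage and fall back to the
    # stored value when decryption fails on older, unencrypted rows.
    import os
    from cryptography.fernet import Fernet, InvalidToken

    _fernet = Fernet(os.environ["ENCRYPTION_KEY"])  # must be a valid Fernet key

    def encrypt_value(plaintext: str) -> str:
        return _fernet.encrypt(plaintext.encode()).decode()

    def decrypt_value(stored: str) -> str:
        """Decrypt a stored field, returning legacy plaintext values unchanged."""
        try:
            return _fernet.decrypt(stored.encode()).decode()
        except InvalidToken:
            return stored  # older, unencrypted data
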
lintsinghua 7091f891d1 feat(llm): enhance LLM connection testing with improved error handling and adapter instantiation
- Bypass LLMFactory cache during connection tests to ensure fresh API calls with latest configuration
- Directly instantiate native adapters (Baidu, Minimax, Doubao) and LiteLLM adapter based on provider type
- Add comprehensive error handling in LiteLLM adapter with specific exception catching for authentication, rate limiting, and connection errors
- Implement user-friendly error messages for common failure scenarios (invalid API key, authentication failure, timeout, connection issues)
- Add response validation to detect and report empty API responses
- Disable LiteLLM internal caching to guarantee actual API calls during testing
- Update available models list with 2025 latest models across all providers (Gemini, OpenAI, Claude, Qwen, DeepSeek, etc.)
- Improve error message clarity and debugging information in config endpoint
2025-11-28 16:53:01 +08:00
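A sketch of the error-mapping idea from this commit, using LiteLLM's exception types; the message wording and function shape are assumptions, not the project's test endpoint:

    # Sketch only: run a tiny completion and translate common failures into
    # user-friendly messages, including the empty-response check.
    import litellm

    def test_llm_connection(model: str, api_key: str, timeout: int = 30) -> dict:
        try:
            resp = litellm.completion(
                model=model,
                messages=[{"role": "user", "content": "ping"}],
                api_key=api_key,
                max_tokens=8,
                timeout=timeout,
            )
            content = resp.choices[0].message.content
            if not content:
                return {"ok": False, "error": "Provider returned an empty response"}
            return {"ok": True, "reply": content}
        except litellm.exceptions.AuthenticationError:
            return {"ok": False, "error": "Authentication failed: check the API key"}
        except litellm.exceptions.RateLimitError:
            return {"ok": False, "error": "Rate limit exceeded: retry later"}
        except litellm.exceptions.Timeout:
            return {"ok": False, "error": f"Request timed out after {timeout}s"}
        except litellm.exceptions.APIConnectionError:
            return {"ok": False, "error": "Could not reach the provider endpoint"}
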
lintsinghua 22c528acf1 refactor(llm): consolidate LLM adapters with LiteLLM unified layer
- Replace individual adapter implementations (OpenAI, Claude, Gemini, DeepSeek, Qwen, Zhipu, Moonshot, Ollama) with unified LiteLLM adapter
- Keep native adapters for providers with special API formats (Baidu, MiniMax, Doubao)
- Update LLM factory to route requests through LiteLLM for supported providers
- Add test-llm endpoint to validate LLM connections with configurable timeout and token limits
- Add get-llm-providers endpoint to retrieve supported providers and their configurations
- Update config.py to ignore extra environment variables (VITE_* frontend variables)
- Refactor Baidu adapter to use new complete() method signature and improve error handling
- Update pyproject.toml dependencies to include litellm package
- Update env.example with new configuration options
- Simplify adapter initialization and reduce code duplication across multiple provider implementations
2025-11-28 16:41:39 +08:00
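A rough sketch of the routing this commit describes: providers with standard APIs share one LiteLLM-backed adapter, while Baidu, MiniMax, and Doubao keep native adapters; the class and function names here are illustrative assumptions:

    # Sketch only: factory routing between the unified LiteLLM adapter and
    # native adapters for providers with special API formats.
    NATIVE_PROVIDERS = {"baidu", "minimax", "doubao"}

    class LiteLLMAdapter:
        def __init__(self, provider: str, model: str, api_key: str):
            self.provider, self.model, self.api_key = provider, model, api_key

        def complete(self, prompt: str) -> str:
            import litellm
            resp = litellm.completion(
                model=self.model,
                messages=[{"role": "user", "content": prompt}],
                api_key=self.api_key,
            )
            return resp.choices[0].message.content or ""

    def create_adapter(provider: str, model: str, api_key: str):
        if provider in NATIVE_PROVIDERS:
            raise NotImplementedError(f"native adapter for {provider} not shown here")
        return LiteLLMAdapter(provider, model, api_key)
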
lintsinghua 6ce5b3c6c1 refactor: restructure the project by separating frontend and backend code into independent directories
- Move the frontend code into the frontend/ directory
- Move the backend code into the backend/ directory
- Update .gitignore to cover Python and frontend build artifacts
- Fix LLM JSON parsing issues and strengthen error handling
- Fix frontend configuration defaults so they are fetched from the backend
- Remove the database information and statistics cards from AdminDashboard
- Improve system configuration management with support for fetching default configuration from the backend
2025-11-26 21:11:12 +08:00
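The JSON-parsing fix mentioned in this commit is not detailed in the log; as a hedged sketch, tolerant parsing of an LLM reply often looks roughly like this (the function name and fallback strategy are assumptions):

    # Sketch only: parse JSON from an LLM reply, tolerating Markdown code fences
    # and surrounding prose.
    import json
    import re

    def parse_llm_json(text: str) -> dict:
        cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", text.strip())
        try:
            return json.loads(cleaned)
        except json.JSONDecodeError as exc:
            # Fall back to the first {...} block in the reply, if any.
            match = re.search(r"\{.*\}", cleaned, flags=re.DOTALL)
            if match:
                return json.loads(match.group(0))
            raise ValueError(f"LLM reply is not valid JSON: {exc}") from exc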