Add a system management feature: configuration can now be changed from the web UI, with no need to edit config files by hand or rebuild the Docker image

This commit is contained in:
lintsinghua 2025-10-26 13:38:17 +08:00
parent c13f266557
commit 165372cfd6
6 changed files with 879 additions and 73 deletions

README.md
@@ -37,8 +37,9 @@
- **AI-Driven Deep Analysis**: Goes beyond traditional static analysis to understand code intent and uncover deep logical issues.
- **Multi-dimensional, Comprehensive Assessment**: From **security**, **performance**, and **maintainability** to **code style**, a 360-degree quality evaluation with no blind spots.
- **Clear, Actionable Fix Suggestions**: An original **What-Why-How** approach that not only tells you "what" the problem is, but also explains "why" and shows "how to fix" it with concrete code examples.
- **Multi-Platform LLM / Local LLM Support**: API integration for 10+ mainstream platforms (Gemini, OpenAI, Claude, Qwen, DeepSeek, Zhipu AI, Kimi, ERNIE, MiniMax, Doubao, Ollama local models), with free configuration and switching.
- **Visual Runtime Configuration**: Configure all LLM parameters and API Keys directly in the browser without rebuilding the image. Supports API relay services; configuration is saved in the local browser, secure and convenient.
- **Modern, Polished User Interface**: Built with React + TypeScript for a smooth, intuitive experience.
## 🎬 Project Demo
@@ -72,19 +73,23 @@
git clone https://github.com/lintsinghua/XCodeReviewer.git
cd XCodeReviewer
# 2. Configure environment variables
cp .env.example .env
# Edit the .env file and configure at least:
# VITE_LLM_PROVIDER=gemini
# VITE_LLM_API_KEY=your_api_key_here
# 3. Build and start
# 2. Build and start (no pre-configuration needed)
docker-compose up -d
# 4. Access the application
# 3. Access the application
# Open http://localhost:5174 in your browser
```
**✨ Runtime Configuration**
After Docker deployment, you can configure everything directly in the browser without rebuilding the image:
1. Visit `http://localhost:5174/admin` (System Management page)
2. Configure LLM API Keys and other parameters in the "System Configuration" tab
3. Click save and refresh the page
> 📖 **For detailed configuration instructions, see the System Configuration guide below.**
### 💻 Local Development Deployment
Suitable for development or custom modifications.
@@ -164,7 +169,16 @@ VITE_LLM_GAP_MS=1000 # Increase the request interval
<details>
<summary><b>How to quickly switch LLM platforms?</b></summary>
Simply modify `VITE_LLM_PROVIDER` and the corresponding API Key in `.env`:
**Method 1: Browser Configuration (Recommended)**
1. Visit the `http://localhost:5174/admin` System Management page
2. Select a different LLM provider in the "System Configuration" tab
3. Enter the corresponding API Key
4. Save and refresh the page
**Method 2: Environment Variable Configuration**
Modify the configuration in `.env`:
```env
# Switch to OpenAI
@@ -235,11 +249,29 @@ VITE_BAIDU_API_KEY=your_api_key:your_secret_key
Get it from: https://console.bce.baidu.com/qianfan/
</details>
<details>
<summary><b>How to use API relay services?</b></summary>
Many users access LLMs through API relay services (more stable and cheaper). To configure one:
1. Visit the System Management page (`/admin`)
2. In the "System Configuration" tab:
- Select an LLM provider (e.g., OpenAI)
- **API Base URL**: enter the relay address (e.g., `https://your-proxy.com/v1`)
- **API Key**: enter the key provided by the relay (not the official key)
3. Save and refresh the page
**Notes**:
- Relay URLs usually end with `/v1` (OpenAI-compatible format)
- Use the relay's API Key, not the official one
- Confirm the relay supports your chosen AI model
</details>
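For readers curious what an OpenAI-compatible relay call looks like on the wire, here is a small sketch; the proxy URL, key, and model below are placeholders, not endpoints the project ships with:

```typescript
// Sketch: build an OpenAI-compatible chat request for a relay service.
// Relays expose the same /chat/completions route as the official OpenAI API.
interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildRelayChatRequest(
  baseUrl: string, // e.g. "https://your-proxy.com/v1" (placeholder)
  apiKey: string,  // the relay's key, not the official one
  model: string,
  prompt: string,
): ChatRequest {
  // Normalize trailing slashes so ".../v1" and ".../v1/" behave the same.
  const root = baseUrl.replace(/\/+$/, "");
  return {
    url: `${root}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  };
}
```

The request can then be sent with `fetch(req.url, { method: 'POST', headers: req.headers, body: req.body })`.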
<details>
<summary><b>How to back up the local database?</b></summary>
Local data is stored in the browser's IndexedDB:
- Export it as a JSON file from the "Database Management" page
- Export it as a JSON file from the "System Management" page
- Restore data by importing a JSON file
- Note: clearing browser data deletes all local data
</details>
@@ -356,18 +388,28 @@ VITE_OPENAI_API_KEY=your_openai_key
</details>
<details>
<summary><b>💾 Local Database Management</b></summary>
<summary><b>⚙️ System Management</b></summary>
- **Three Database Modes**:
- 🏠 **Local Mode**: browser IndexedDB; data stays fully local and private
- ☁️ **Cloud Mode**: Supabase, with multi-device synchronization
- 🎭 **Demo Mode**: no configuration needed, quick feature preview
- **Data Management Features**:
Visit the `/admin` page for complete system configuration and data management:
- **🔧 Visual Configuration Management** (runtime configuration):
- 🎯 **LLM Configuration**: configure API Keys, models, timeouts, and other parameters directly in the browser
- 🔑 **Platform Keys**: manage API Keys for 10+ LLM platforms with quick switching
- ⚡ **Analysis Parameters**: adjust concurrency, request interval, max file count, etc.
- 🌐 **API Relay Support**: easily configure third-party API proxy services
- 💾 **Configuration Priority**: runtime config > build-time config, no image rebuild needed
- **💾 Database Management**:
- 🏠 **Three Modes**: local IndexedDB / Supabase cloud / demo mode
- 📤 **Export Backup**: export data as JSON files
- 📥 **Import Recovery**: restore data from backup files
- 🗑️ **Clear Data**: one-click cleanup of all local data
- 📊 **Storage Monitoring**: real-time view of storage usage
- **Smart Statistics**: complete statistics and visualization of projects, tasks, and issues
- **📈 Data Overview**:
- Complete statistics for projects, tasks, and issues
- Visual charts showing quality trends
- Storage usage analysis
</details>
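The "runtime config > build-time config" priority described above boils down to a simple fallback chain. A minimal sketch, with illustrative names rather than the project's actual code:

```typescript
// Sketch: resolve one setting with runtime > build-time > default priority.
// `runtime` stands for a value read from localStorage, `buildTime` for a
// VITE_* environment variable; either may be absent.
function resolveSetting(
  runtime: string | null | undefined,
  buildTime: string | null | undefined,
  fallback: string,
): string {
  // Empty strings count as "not configured", matching `||` semantics.
  return runtime || buildTime || fallback;
}
```

This is why saving a value in the admin page takes effect after a refresh without touching `.env` or the Docker image.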
## 🛠️ Tech Stack
@@ -397,6 +439,7 @@ XCodeReviewer/
│ ├── components/ # React components
│ │ ├── layout/ # Layout components (Header, Footer, PageMeta)
│ │ ├── ui/ # UI component library (based on Radix UI)
│ │ ├── system/ # System configuration components
│ │ ├── database/ # Database management components
│ │ └── debug/ # Debug components
│ ├── pages/ # Page components
@@ -404,7 +447,7 @@ XCodeReviewer/
│ │ ├── Projects.tsx # Project management
│ │ ├── InstantAnalysis.tsx # Instant analysis
│ │ ├── AuditTasks.tsx # Audit tasks
│ │ └── AdminDashboard.tsx # Database management
│ │ └── AdminDashboard.tsx # System management
│ ├── features/ # Feature modules
│ │ ├── analysis/ # Analysis-related services
│ │ │ └── services/ # AI code analysis engine
@@ -432,6 +475,26 @@ XCodeReviewer/
## 🎯 Usage Guide
### System Configuration (First-Time Setup)
Visit the `/admin` System Management page and configure the following in the "System Configuration" tab:
#### 1. **Configure the LLM Provider**
- Select the LLM platform you want to use (Gemini, OpenAI, Claude, etc.)
- Enter an API Key (either the universal Key or a platform-specific Key)
- Optionally configure the model name and API base URL (for relay services)
#### 2. **Configure an API Relay Service** (if used)
- Enter the relay address in "API Base URL" (e.g., `https://your-proxy.com/v1`)
- Enter the API Key provided by the relay
- Save the configuration
#### 3. **Adjust Analysis Parameters** (optional)
- Max files to analyze, concurrent requests, request interval
- Output language (Chinese/English)
**After configuring, click "Save All Changes" and refresh the page.**
### Instant Code Analysis
1. Visit the `/instant-analysis` page
2. Select a programming language (10+ languages supported)
@@ -579,8 +642,9 @@ pnpm lint
- **✅ Multi-Platform LLM Support**: Implemented API integration for 10+ mainstream platforms (Gemini, OpenAI, Claude, Qwen, DeepSeek, Zhipu AI, Kimi, ERNIE, MiniMax, Doubao, Ollama local models), with flexible configuration and switching
- **✅ Local Model Support**: Added Ollama local model integration to meet data privacy requirements
- **Multi-Agent Collaboration**: Introduce a multi-agent architecture with `Agent + human dialogue` feedback, including multi-round dialogue visualization and human intervention, for a clearer, more transparent, supervised audit process and higher audit quality
- **✅ Visual Configuration Management**: Implemented a runtime configuration system; all LLM parameters and API Keys can be configured in the browser, API relays are supported, and no image rebuild is needed
- **✅ Professional Report Generation**: Generate professional audit reports in the formats different needs require, with customizable report formats
- **Multi-Agent Collaboration**: Introduce a multi-agent architecture with `Agent + human dialogue` feedback, including multi-round dialogue visualization and human intervention, for a clearer, more transparent, supervised audit process and higher audit quality
- **Custom Audit Standards**: Different teams have their own coding conventions and different projects have specific security requirements; this is exactly what we want to build next. The current version is still a "semi-black-box": the project steers the analysis direction and defines audit standards through prompt engineering, while the actual results depend on the built-in knowledge of powerful pre-trained AI models. Future work will combine reinforcement learning and supervised fine-tuning to support custom rule configuration (team-specific rules defined in YAML or JSON), provide best-practice templates for common frameworks, and more, to produce audit results that better match your requirements and standards
---


@@ -39,6 +39,7 @@ In the fast-paced world of software development, ensuring code quality is crucia
- **🎯 Multi-dimensional, Comprehensive Assessment**: From **security**, **performance**, **maintainability** to **code style**, providing 360-degree quality evaluation.
- **💡 Clear, Actionable Fix Suggestions**: Innovative **What-Why-How** approach that not only tells you "what" the problem is, but also explains "why" and provides "how to fix" with specific code examples.
- **✅ Multi-Platform LLM/Local Model Support**: Implemented API calling functionality for 10+ mainstream platforms (Gemini, OpenAI, Claude, Qwen, DeepSeek, Zhipu AI, Kimi, ERNIE, MiniMax, Doubao, Ollama Local Models), with flexible configuration and switching
- **⚙️ Visual Runtime Configuration**: Configure all LLM parameters and API Keys directly in the browser without rebuilding images. Supports API relay services, with configurations saved locally in the browser for security and convenience.
- **✨ Modern, Beautiful User Interface**: Built with React + TypeScript, providing a smooth and intuitive user experience.
## 🎬 Project Demo
@@ -72,19 +73,29 @@ One-click deployment with Docker, no Node.js environment required:
git clone https://github.com/lintsinghua/XCodeReviewer.git
cd XCodeReviewer
# 2. Configure environment variables
cp .env.example .env
# Edit .env file, configure at least:
# VITE_LLM_PROVIDER=gemini
# VITE_LLM_API_KEY=your_api_key_here
# 3. Build and start
# 2. Build and start (no pre-configuration needed)
docker-compose up -d
# 4. Access the application
# 3. Access the application
# Open http://localhost:5174 in your browser
```
**✨ New Feature: Runtime Configuration**
After Docker deployment, you can configure all settings directly in the browser without rebuilding the image:
1. Visit `http://localhost:5174/admin` (System Management page)
2. Configure LLM API Keys and other parameters in the "System Configuration" tab
3. Click save and refresh the page
Benefits:
- ✅ No need to hardcode API Keys in Docker images (more secure)
- ✅ Modify configuration anytime without rebuilding
- ✅ Support for API relay services
- ✅ Quickly switch between different LLM platforms
> 📖 **For detailed configuration instructions, see**: [System Configuration Guide](#system-configuration-first-time-setup)
### 💻 Local Development Deployment
Suitable for development or custom modifications.
@@ -164,7 +175,16 @@ VITE_LLM_GAP_MS=1000 # Increase request interval
<details>
<summary><b>How to quickly switch LLM platforms?</b></summary>
Simply modify `VITE_LLM_PROVIDER` and the corresponding API Key in `.env`:
**Method 1: Browser Configuration (Recommended)**
1. Visit `http://localhost:5174/admin` System Management page
2. Select different LLM provider in the "System Configuration" tab
3. Enter the corresponding API Key
4. Save and refresh the page
**Method 2: Environment Variable Configuration**
Modify configuration in `.env`:
```env
# Switch to OpenAI
@@ -235,11 +255,29 @@ VITE_BAIDU_API_KEY=your_api_key:your_secret_key
Get from: https://console.bce.baidu.com/qianfan/
</details>
<details>
<summary><b>How to use API relay services?</b></summary>
Many users use API relay services to access LLMs (more stable and cheaper). Configuration method:
1. Visit System Management page (`/admin`)
2. In the "System Configuration" tab:
- Select LLM provider (e.g., OpenAI)
- **API Base URL**: Enter relay service address (e.g., `https://your-proxy.com/v1`)
- **API Key**: Enter the key provided by the relay service (not the official key)
3. Save and refresh the page
**Notes**:
- Relay service URLs usually end with `/v1` (OpenAI-compatible format)
- Use the relay service's API Key, not the official one
- Confirm that the relay service supports your chosen AI model
</details>
<details>
<summary><b>How to back up the local database?</b></summary>
Local data is stored in browser IndexedDB:
- Export as JSON file from "Database Management" page
- Export as JSON file from "System Management" page
- Import JSON file to restore data
- Note: Clearing browser data will delete all local data
</details>
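The export/import round trip above can be sketched with two pure functions; the backup record shape here is hypothetical, since the real schema lives in the app's database layer:

```typescript
// Sketch: serialize in-memory records to a JSON backup and restore them.
// The record shape is illustrative, not the app's actual schema.
interface BackupFile {
  exportedAt: string;
  records: Record<string, unknown>[];
}

function exportBackup(records: Record<string, unknown>[]): string {
  const backup: BackupFile = { exportedAt: new Date().toISOString(), records };
  return JSON.stringify(backup, null, 2);
}

function importBackup(json: string): Record<string, unknown>[] {
  const parsed = JSON.parse(json) as BackupFile;
  if (!Array.isArray(parsed.records)) {
    throw new Error("Invalid backup file: missing records array");
  }
  return parsed.records;
}
```

In the browser, the exported JSON string would be offered as a download (for example via a Blob URL); only the pure serialization is shown here.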
@@ -357,18 +395,28 @@ For cloud data sync:
</details>
<details>
<summary><b>💾 Local Database Management</b></summary>
<summary><b>⚙️ System Management</b></summary>
- **Three Database Modes**:
- 🏠 **Local Mode**: Uses browser IndexedDB, data is completely localized, privacy-secure
- ☁️ **Cloud Mode**: Uses Supabase, supports multi-device synchronization
- 🎭 **Demo Mode**: No configuration needed, quick feature preview
- **Data Management Features**:
Visit `/admin` page for complete system configuration and data management features:
- **🔧 Visual Configuration Management** (Runtime Configuration):
- 🎯 **LLM Configuration**: Configure API Keys, models, timeout, and other parameters directly in the browser
- 🔑 **Platform Keys**: Manage API Keys for 10+ LLM platforms with quick switching support
- ⚡ **Analysis Parameters**: Adjust concurrency, interval time, max files, etc.
- 🌐 **API Relay Support**: Easily configure third-party API relay services
- 💾 **Configuration Priority**: Runtime config > Build-time config, no need to rebuild images
- **💾 Database Management**:
- 🏠 **Three Modes**: Local IndexedDB / Supabase Cloud / Demo Mode
- 📤 **Export Backup**: Export data as JSON files
- 📥 **Import Recovery**: Restore data from backup files
- 🗑️ **Clear Data**: One-click cleanup of all local data
- 📊 **Storage Monitoring**: Real-time view of storage space usage
- **Smart Statistics**: Complete statistics and visualization of projects, tasks, and issues
- **📈 Data Overview**:
- Complete statistics for projects, tasks, and issues
- Visual charts showing quality trends
- Storage usage analysis
</details>
## 🛠️ Tech Stack
@@ -399,6 +447,7 @@ XCodeReviewer/
│ ├── components/ # React components
│ │ ├── layout/ # Layout components (Header, Footer, PageMeta)
│ │ ├── ui/ # UI component library (based on Radix UI)
│ │ ├── system/ # System configuration components
│ │ ├── database/ # Database management components
│ │ └── debug/ # Debug components
│ ├── pages/ # Page components
@@ -406,7 +455,7 @@ XCodeReviewer/
│ │ ├── Projects.tsx # Project management
│ │ ├── InstantAnalysis.tsx # Instant analysis
│ │ ├── AuditTasks.tsx # Audit tasks
│ │ └── AdminDashboard.tsx # Database management
│ │ └── AdminDashboard.tsx # System management
│ ├── features/ # Feature modules
│ │ ├── analysis/ # Analysis related services
│ │ │ └── services/ # AI code analysis engine
@@ -434,6 +483,26 @@ XCodeReviewer/
## 🎯 Usage Guide
### System Configuration (First-Time Setup)
Visit `/admin` System Management page and configure in the "System Configuration" tab:
#### 1. **Configure LLM Provider**
- Select the LLM platform you want to use (Gemini, OpenAI, Claude, etc.)
- Enter API Key (supports universal Key or platform-specific Key)
- Optional: Configure model name, API base URL (for relay services)
#### 2. **Configure API Relay Service** (if using)
- Enter relay service address in "API Base URL" (e.g., `https://your-proxy.com/v1`)
- Enter the API Key provided by the relay service
- Save configuration
#### 3. **Adjust Analysis Parameters** (optional)
- Max analyze files, concurrent requests, request interval
- Output language (Chinese/English)
**After configuration, click "Save All Changes" and refresh the page.**
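Since these analysis parameters come from free-form inputs, some sanity-clamping before use is a natural pattern. A minimal sketch; the defaults mirror the documented ones (40 files, 2 concurrent requests, 500 ms interval), while the bounds are illustrative assumptions:

```typescript
// Sketch: clamp user-entered analysis parameters to sane ranges.
interface AnalysisParams {
  maxAnalyzeFiles: number;
  llmConcurrency: number;
  llmGapMs: number;
}

function clamp(value: number, min: number, max: number, fallback: number): number {
  // Non-numeric or missing input falls back to the documented default.
  if (!Number.isFinite(value)) return fallback;
  return Math.min(max, Math.max(min, value));
}

function sanitizeParams(raw: Partial<AnalysisParams>): AnalysisParams {
  return {
    maxAnalyzeFiles: clamp(raw.maxAnalyzeFiles ?? NaN, 1, 500, 40),
    llmConcurrency: clamp(raw.llmConcurrency ?? NaN, 1, 10, 2),
    llmGapMs: clamp(raw.llmGapMs ?? NaN, 0, 60000, 500),
  };
}
```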
### Instant Code Analysis
1. Visit the `/instant-analysis` page
2. Select programming language (supports 10+ languages)
@@ -580,10 +649,10 @@ Currently, XCodeReviewer is in rapid prototype validation stage. Based on projec
- ✅ **Multi-Platform LLM Support**: Implemented API integration for 10+ mainstream platforms (Gemini, OpenAI, Claude, Qwen, DeepSeek, Zhipu AI, Kimi, ERNIE, MiniMax, Doubao, Ollama), with flexible configuration and switching
- ✅ **Local Model Support**: Added Ollama local model integration to meet data privacy requirements
- ✅ **Local Database Support**: Implemented IndexedDB-based local database for fully localized data storage and privacy protection
- **Multi-Agent Collaboration**: Introduce multi-agent architecture with `Agent + Human Dialogue` feedback, including multi-round dialogue visualization and human intervention for clearer, transparent, and supervised audit processes
- ✅ **Visual Configuration Management**: Implemented runtime configuration system supporting browser-based configuration of all LLM parameters and API Keys, API relay service support, no need to rebuild images
- ✅ **Professional Report Generation**: Generate professional audit reports in various formats based on different needs, with customizable templates and format configurations
- **Custom Audit Standards**: Support custom audit rule configuration via YAML/JSON, provide best practice templates for common frameworks, and leverage reinforcement learning and supervised fi
- **Multi-Agent Collaboration**: Introduce multi-agent architecture with `Agent + Human Dialogue` feedback, including multi-round dialogue visualization and human intervention for clearer, transparent, and supervised audit processes
- **Custom Audit Standards**: Support custom audit rule configuration via YAML/JSON, provide best practice templates for common frameworks, and leverage reinforcement learning and supervised fine-tuning for more targeted and standards-compliant audit results
---


@@ -52,7 +52,7 @@ const routes: RouteConfig[] = [
visible: false,
},
{
name: "Database Management",
name: "System Management",
path: "/admin",
element: <AdminDashboard />,
visible: true,


@@ -0,0 +1,652 @@
import { useState, useEffect } from "react";
import { Card, CardContent, CardHeader, CardTitle, CardDescription } from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { Alert, AlertDescription } from "@/components/ui/alert";
import { Badge } from "@/components/ui/badge";
import {
Settings,
Save,
RotateCcw,
Eye,
EyeOff,
CheckCircle2,
AlertCircle,
Info,
Key,
Zap,
Globe,
Database
} from "lucide-react";
import { toast } from "sonner";
// LLM provider options
const LLM_PROVIDERS = [
{ value: 'gemini', label: 'Google Gemini', icon: '🔵', category: 'international' },
{ value: 'openai', label: 'OpenAI GPT', icon: '🟢', category: 'international' },
{ value: 'claude', label: 'Anthropic Claude', icon: '🟣', category: 'international' },
{ value: 'deepseek', label: 'DeepSeek', icon: '🔷', category: 'international' },
{ value: 'qwen', label: 'Alibaba Qwen', icon: '🟠', category: 'domestic' },
{ value: 'zhipu', label: 'Zhipu AI (GLM)', icon: '🔴', category: 'domestic' },
{ value: 'moonshot', label: 'Moonshot (Kimi)', icon: '🌙', category: 'domestic' },
{ value: 'baidu', label: 'Baidu ERNIE', icon: '🔵', category: 'domestic' },
{ value: 'minimax', label: 'MiniMax', icon: '⚡', category: 'domestic' },
{ value: 'doubao', label: 'ByteDance Doubao', icon: '🎯', category: 'domestic' },
{ value: 'ollama', label: 'Ollama (Local Models)', icon: '🖥️', category: 'local' },
];
// Default model for each provider
const DEFAULT_MODELS = {
gemini: 'gemini-2.5-flash',
openai: 'gpt-4o-mini',
claude: 'claude-3-5-sonnet-20241022',
qwen: 'qwen-turbo',
deepseek: 'deepseek-chat',
zhipu: 'glm-4-flash',
moonshot: 'moonshot-v1-8k',
baidu: 'ERNIE-3.5-8K',
minimax: 'abab6.5-chat',
doubao: 'doubao-pro-32k',
ollama: 'llama3',
};
interface SystemConfigData {
// LLM configuration
llmProvider: string;
llmApiKey: string;
llmModel: string;
llmBaseUrl: string;
llmTimeout: number;
llmTemperature: number;
llmMaxTokens: number;
// Platform-specific keys
geminiApiKey: string;
openaiApiKey: string;
claudeApiKey: string;
qwenApiKey: string;
deepseekApiKey: string;
zhipuApiKey: string;
moonshotApiKey: string;
baiduApiKey: string;
minimaxApiKey: string;
doubaoApiKey: string;
ollamaBaseUrl: string;
// GitHub configuration
githubToken: string;
// Analysis configuration
maxAnalyzeFiles: number;
llmConcurrency: number;
llmGapMs: number;
outputLanguage: string;
}
const STORAGE_KEY = 'xcodereviewer_runtime_config';
export function SystemConfig() {
const [config, setConfig] = useState<SystemConfigData>({
llmProvider: 'gemini',
llmApiKey: '',
llmModel: '',
llmBaseUrl: '',
llmTimeout: 150000,
llmTemperature: 0.2,
llmMaxTokens: 4096,
geminiApiKey: '',
openaiApiKey: '',
claudeApiKey: '',
qwenApiKey: '',
deepseekApiKey: '',
zhipuApiKey: '',
moonshotApiKey: '',
baiduApiKey: '',
minimaxApiKey: '',
doubaoApiKey: '',
ollamaBaseUrl: 'http://localhost:11434/v1',
githubToken: '',
maxAnalyzeFiles: 40,
llmConcurrency: 2,
llmGapMs: 500,
outputLanguage: 'zh-CN',
});
const [showApiKeys, setShowApiKeys] = useState<Record<string, boolean>>({});
const [hasChanges, setHasChanges] = useState(false);
const [configSource, setConfigSource] = useState<'runtime' | 'build'>('build');
// Load configuration on mount
useEffect(() => {
loadConfig();
}, []);
const loadConfig = () => {
try {
// Try to load runtime config from localStorage
const savedConfig = localStorage.getItem(STORAGE_KEY);
if (savedConfig) {
const parsedConfig = JSON.parse(savedConfig);
setConfig(parsedConfig);
setConfigSource('runtime');
toast.success("Loaded runtime configuration");
} else {
// Fall back to build-time config
loadFromEnv();
setConfigSource('build');
}
} catch (error) {
console.error('Failed to load config:', error);
loadFromEnv();
}
};
const loadFromEnv = () => {
// Load build-time config from environment variables
const envConfig: SystemConfigData = {
llmProvider: import.meta.env.VITE_LLM_PROVIDER || 'gemini',
llmApiKey: import.meta.env.VITE_LLM_API_KEY || '',
llmModel: import.meta.env.VITE_LLM_MODEL || '',
llmBaseUrl: import.meta.env.VITE_LLM_BASE_URL || '',
llmTimeout: Number(import.meta.env.VITE_LLM_TIMEOUT) || 150000,
llmTemperature: Number(import.meta.env.VITE_LLM_TEMPERATURE) || 0.2,
llmMaxTokens: Number(import.meta.env.VITE_LLM_MAX_TOKENS) || 4096,
geminiApiKey: import.meta.env.VITE_GEMINI_API_KEY || '',
openaiApiKey: import.meta.env.VITE_OPENAI_API_KEY || '',
claudeApiKey: import.meta.env.VITE_CLAUDE_API_KEY || '',
qwenApiKey: import.meta.env.VITE_QWEN_API_KEY || '',
deepseekApiKey: import.meta.env.VITE_DEEPSEEK_API_KEY || '',
zhipuApiKey: import.meta.env.VITE_ZHIPU_API_KEY || '',
moonshotApiKey: import.meta.env.VITE_MOONSHOT_API_KEY || '',
baiduApiKey: import.meta.env.VITE_BAIDU_API_KEY || '',
minimaxApiKey: import.meta.env.VITE_MINIMAX_API_KEY || '',
doubaoApiKey: import.meta.env.VITE_DOUBAO_API_KEY || '',
ollamaBaseUrl: import.meta.env.VITE_OLLAMA_BASE_URL || 'http://localhost:11434/v1',
githubToken: import.meta.env.VITE_GITHUB_TOKEN || '',
maxAnalyzeFiles: Number(import.meta.env.VITE_MAX_ANALYZE_FILES) || 40,
llmConcurrency: Number(import.meta.env.VITE_LLM_CONCURRENCY) || 2,
llmGapMs: Number(import.meta.env.VITE_LLM_GAP_MS) || 500,
outputLanguage: import.meta.env.VITE_OUTPUT_LANGUAGE || 'zh-CN',
};
setConfig(envConfig);
};
const saveConfig = () => {
try {
localStorage.setItem(STORAGE_KEY, JSON.stringify(config));
setHasChanges(false);
setConfigSource('runtime');
toast.success("Configuration saved. Refresh the page for it to take effect");
// Prompt the user to refresh
setTimeout(() => {
if (window.confirm("Configuration saved. Refresh the page now to apply it?")) {
window.location.reload();
}
}, 1000);
} catch (error) {
console.error('Failed to save config:', error);
toast.error("Failed to save configuration");
}
};
const resetConfig = () => {
if (window.confirm("Reset to the build-time configuration? This will clear all runtime settings.")) {
localStorage.removeItem(STORAGE_KEY);
loadFromEnv();
setHasChanges(false);
setConfigSource('build');
toast.success("Reset to build-time configuration");
}
};
const updateConfig = (key: keyof SystemConfigData, value: any) => {
setConfig(prev => ({ ...prev, [key]: value }));
setHasChanges(true);
};
const toggleShowApiKey = (field: string) => {
setShowApiKeys(prev => ({ ...prev, [field]: !prev[field] }));
};
const getCurrentApiKey = () => {
const provider = config.llmProvider.toLowerCase();
const keyMap: Record<string, string> = {
gemini: config.geminiApiKey,
openai: config.openaiApiKey,
claude: config.claudeApiKey,
qwen: config.qwenApiKey,
deepseek: config.deepseekApiKey,
zhipu: config.zhipuApiKey,
moonshot: config.moonshotApiKey,
baidu: config.baiduApiKey,
minimax: config.minimaxApiKey,
doubao: config.doubaoApiKey,
ollama: 'ollama',
};
return config.llmApiKey || keyMap[provider] || '';
};
const isConfigured = getCurrentApiKey() !== '';
return (
<div className="space-y-6">
{/* Configuration status banner */}
<Alert>
<Info className="h-4 w-4" />
<AlertDescription className="flex items-center justify-between">
<div>
<strong>Config source:</strong>
{configSource === 'runtime' ? (
<Badge variant="default" className="ml-2">Runtime</Badge>
) : (
<Badge variant="outline" className="ml-2">Build-time</Badge>
)}
<span className="ml-4 text-sm">
{isConfigured ? (
<span className="text-green-600 flex items-center gap-1">
<CheckCircle2 className="h-3 w-3" /> LLM configured
</span>
) : (
<span className="text-orange-600 flex items-center gap-1">
<AlertCircle className="h-3 w-3" /> LLM not configured
</span>
)}
</span>
</span>
</div>
<div className="flex gap-2">
{hasChanges && (
<Button onClick={saveConfig} size="sm">
<Save className="w-4 h-4 mr-2" />
Save
</Button>
)}
{configSource === 'runtime' && (
<Button onClick={resetConfig} variant="outline" size="sm">
<RotateCcw className="w-4 h-4 mr-2" />
Reset
</Button>
)}
</div>
</AlertDescription>
</Alert>
<Tabs defaultValue="llm" className="w-full">
<TabsList className="grid w-full grid-cols-4">
<TabsTrigger value="llm">
<Zap className="w-4 h-4 mr-2" />
LLM Settings
</TabsTrigger>
<TabsTrigger value="platforms">
<Key className="w-4 h-4 mr-2" />
Platform Keys
</TabsTrigger>
<TabsTrigger value="analysis">
<Settings className="w-4 h-4 mr-2" />
Analysis Parameters
</TabsTrigger>
<TabsTrigger value="other">
<Globe className="w-4 h-4 mr-2" />
Other Settings
</TabsTrigger>
</TabsList>
{/* LLM base configuration */}
<TabsContent value="llm" className="space-y-6">
<Card>
<CardHeader>
<CardTitle>LLM Base Configuration</CardTitle>
<CardDescription>Select a provider and set the global LLM parameters</CardDescription>
</CardHeader>
<CardContent className="space-y-4">
<div className="space-y-2">
<Label>Active LLM Provider</Label>
<Select
value={config.llmProvider}
onValueChange={(value) => updateConfig('llmProvider', value)}
>
<SelectTrigger>
<SelectValue />
</SelectTrigger>
<SelectContent>
<div className="px-2 py-1.5 text-sm font-semibold text-muted-foreground">International Platforms</div>
{LLM_PROVIDERS.filter(p => p.category === 'international').map(provider => (
<SelectItem key={provider.value} value={provider.value}>
{provider.icon} {provider.label}
</SelectItem>
))}
<div className="px-2 py-1.5 text-sm font-semibold text-muted-foreground mt-2">Chinese Platforms</div>
{LLM_PROVIDERS.filter(p => p.category === 'domestic').map(provider => (
<SelectItem key={provider.value} value={provider.value}>
{provider.icon} {provider.label}
</SelectItem>
))}
<div className="px-2 py-1.5 text-sm font-semibold text-muted-foreground mt-2">Local Models</div>
{LLM_PROVIDERS.filter(p => p.category === 'local').map(provider => (
<SelectItem key={provider.value} value={provider.value}>
{provider.icon} {provider.label}
</SelectItem>
))}
</SelectContent>
</Select>
</div>
<div className="space-y-2">
<Label>Universal API Key</Label>
<div className="flex gap-2">
<Input
type={showApiKeys['llm'] ? 'text' : 'password'}
value={config.llmApiKey}
onChange={(e) => updateConfig('llmApiKey', e.target.value)}
placeholder="Leave blank to use the platform-specific API Key"
/>
<Button
variant="outline"
size="icon"
onClick={() => toggleShowApiKey('llm')}
>
{showApiKeys['llm'] ? <EyeOff className="h-4 w-4" /> : <Eye className="h-4 w-4" />}
</Button>
</div>
<p className="text-xs text-muted-foreground">
If set, this universal API Key takes precedence; otherwise the platform-specific API Key for the selected provider is used
</p>
</div>
<div className="space-y-2">
<Label>Model Name</Label>
<Input
value={config.llmModel}
onChange={(e) => updateConfig('llmModel', e.target.value)}
placeholder={`Default: ${DEFAULT_MODELS[config.llmProvider as keyof typeof DEFAULT_MODELS] || 'auto'}`}
/>
<p className="text-xs text-muted-foreground">
Leave blank to use the default model for the selected provider
</p>
</div>
<div className="space-y-2">
<Label>API Base URL</Label>
<Input
value={config.llmBaseUrl}
onChange={(e) => updateConfig('llmBaseUrl', e.target.value)}
placeholder="e.g. https://api.example.com/v1"
/>
<div className="text-xs text-muted-foreground space-y-1">
<p>💡 <strong>Set this when using an API relay service</strong></p>
<details className="cursor-pointer">
<summary className="text-primary hover:underline">Common API relay URL formats</summary>
<div className="mt-2 p-3 bg-muted rounded space-y-1 text-xs">
<p><strong>OpenAI-compatible format:</strong></p>
<p>https://your-proxy.com/v1</p>
<p>https://api.openai-proxy.org/v1</p>
<p className="pt-2"><strong>Other formats:</strong></p>
<p>https://your-api-gateway.com/openai</p>
<p>https://custom-endpoint.com/api</p>
<p className="pt-2 text-orange-600">Make sure the relay supports the LLM provider you selected</p>
</div>
</details>
</div>
</div>
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
<div className="space-y-2">
<Label>Timeout (ms)</Label>
<Input
type="number"
value={config.llmTimeout}
onChange={(e) => updateConfig('llmTimeout', Number(e.target.value))}
/>
</div>
<div className="space-y-2">
<Label>Temperature (0-2)</Label>
<Input
type="number"
step="0.1"
min="0"
max="2"
value={config.llmTemperature}
onChange={(e) => updateConfig('llmTemperature', Number(e.target.value))}
/>
</div>
<div className="space-y-2">
<Label>Max Tokens</Label>
<Input
type="number"
value={config.llmMaxTokens}
onChange={(e) => updateConfig('llmMaxTokens', Number(e.target.value))}
/>
</div>
</div>
</CardContent>
</Card>
</TabsContent>
{/* Platform-specific keys */}
<TabsContent value="platforms" className="space-y-6">
<Alert>
<Key className="h-4 w-4" />
<AlertDescription>
<div className="space-y-1">
<p>Configure a dedicated API Key per platform so you can switch providers without re-entering keys; the universal API Key, if set, takes precedence.</p>
<p className="text-xs text-muted-foreground pt-1">
💡 <strong>Using an API relay?</strong> Enter the <strong>relay's Key</strong> here instead of the official one,
and set the relay address in the API Base URL field on the LLM Settings tab.
</p>
</div>
</AlertDescription>
</Alert>
{[
{ key: 'geminiApiKey', label: 'Google Gemini API Key', icon: '🔵', hint: 'Official: https://makersuite.google.com/app/apikey | or use a relay Key' },
{ key: 'openaiApiKey', label: 'OpenAI API Key', icon: '🟢', hint: 'Official: https://platform.openai.com/api-keys | or use a relay Key' },
{ key: 'claudeApiKey', label: 'Claude API Key', icon: '🟣', hint: 'Official: https://console.anthropic.com/ | or use a relay Key' },
{ key: 'qwenApiKey', label: 'Qwen API Key', icon: '🟠', hint: 'Official: https://dashscope.console.aliyun.com/ | or use a relay Key' },
{ key: 'deepseekApiKey', label: 'DeepSeek API Key', icon: '🔷', hint: 'Official: https://platform.deepseek.com/ | or use a relay Key' },
{ key: 'zhipuApiKey', label: 'Zhipu AI API Key', icon: '🔴', hint: 'Official: https://open.bigmodel.cn/ | or use a relay Key' },
{ key: 'moonshotApiKey', label: 'Moonshot API Key', icon: '🌙', hint: 'Official: https://platform.moonshot.cn/ | or use a relay Key' },
{ key: 'baiduApiKey', label: 'Baidu ERNIE API Key', icon: '🔵', hint: 'Official format: API_KEY:SECRET_KEY | or use a relay Key' },
{ key: 'minimaxApiKey', label: 'MiniMax API Key', icon: '⚡', hint: 'Official: https://www.minimaxi.com/ | or use a relay Key' },
{ key: 'doubaoApiKey', label: 'ByteDance Doubao API Key', icon: '🎯', hint: 'Official: https://console.volcengine.com/ark | or use a relay Key' },
].map(({ key, label, icon, hint }) => (
<Card key={key}>
<CardHeader>
<CardTitle className="text-base flex items-center gap-2">
<span>{icon}</span>
{label}
</CardTitle>
<CardDescription className="text-xs">{hint}</CardDescription>
</CardHeader>
<CardContent>
<div className="flex gap-2">
<Input
type={showApiKeys[key] ? 'text' : 'password'}
value={config[key as keyof SystemConfigData] as string}
onChange={(e) => updateConfig(key as keyof SystemConfigData, e.target.value)}
placeholder={`Enter ${label}`}
/>
<Button
variant="outline"
size="icon"
onClick={() => toggleShowApiKey(key)}
>
{showApiKeys[key] ? <EyeOff className="h-4 w-4" /> : <Eye className="h-4 w-4" />}
</Button>
</div>
</CardContent>
</Card>
))}
<Card>
<CardHeader>
<CardTitle className="text-base flex items-center gap-2">
<span>🖥</span>
Ollama Base URL
</CardTitle>
<CardDescription className="text-xs">Base URL of the local Ollama API service</CardDescription>
</CardHeader>
<CardContent>
<Input
value={config.ollamaBaseUrl}
onChange={(e) => updateConfig('ollamaBaseUrl', e.target.value)}
placeholder="http://localhost:11434/v1"
/>
</CardContent>
</Card>
</TabsContent>
{/* Analysis parameter configuration */}
<TabsContent value="analysis" className="space-y-6">
<Card>
<CardHeader>
<CardTitle>Analysis Parameters</CardTitle>
<CardDescription>Control analysis behavior and performance</CardDescription>
</CardHeader>
<CardContent className="space-y-4">
<div className="space-y-2">
<Label>Max Files to Analyze</Label>
<Input
type="number"
value={config.maxAnalyzeFiles}
onChange={(e) => updateConfig('maxAnalyzeFiles', Number(e.target.value))}
/>
<p className="text-xs text-muted-foreground">
Maximum number of files analyzed in a single task
</p>
</div>
<div className="space-y-2">
<Label>LLM Concurrency</Label>
<Input
type="number"
value={config.llmConcurrency}
onChange={(e) => updateConfig('llmConcurrency', Number(e.target.value))}
/>
<p className="text-xs text-muted-foreground">
Number of concurrent LLM requests; lower it if you hit rate limits
</p>
</div>
<div className="space-y-2">
<Label>Request Interval (ms)</Label>
<Input
type="number"
value={config.llmGapMs}
onChange={(e) => updateConfig('llmGapMs', Number(e.target.value))}
/>
<p className="text-xs text-muted-foreground">
Delay between consecutive LLM requests, to avoid rate limiting
</p>
</div>
<div className="space-y-2">
<Label>Output Language</Label>
<Select
value={config.outputLanguage}
onValueChange={(value) => updateConfig('outputLanguage', value)}
>
<SelectTrigger>
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="zh-CN">🇨🇳 Chinese (Simplified)</SelectItem>
<SelectItem value="en-US">🇺🇸 English</SelectItem>
</SelectContent>
</Select>
</div>
</CardContent>
</Card>
</TabsContent>
{/* Other settings */}
<TabsContent value="other" className="space-y-6">
<Card>
<CardHeader>
<CardTitle>GitHub Configuration</CardTitle>
<CardDescription>Configure a GitHub Personal Access Token for repository access</CardDescription>
</CardHeader>
<CardContent>
<div className="space-y-2">
<Label>GitHub Token</Label>
<div className="flex gap-2">
<Input
type={showApiKeys['github'] ? 'text' : 'password'}
value={config.githubToken}
onChange={(e) => updateConfig('githubToken', e.target.value)}
placeholder="ghp_xxxxxxxxxxxx"
/>
<Button
variant="outline"
size="icon"
onClick={() => toggleShowApiKey('github')}
>
{showApiKeys['github'] ? <EyeOff className="h-4 w-4" /> : <Eye className="h-4 w-4" />}
</Button>
</div>
<p className="text-xs text-muted-foreground">
Create one at https://github.com/settings/tokens
</p>
</div>
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle>About This Configuration</CardTitle>
</CardHeader>
<CardContent className="space-y-3 text-sm text-muted-foreground">
<div className="flex items-start gap-3 p-3 bg-muted rounded-lg">
<Database className="h-5 w-5 text-primary mt-0.5" />
<div>
<p className="font-medium text-foreground">Storage location</p>
<p>
Runtime configuration is saved in the browser's localStorage,
so it survives page reloads and Docker container restarts without an image rebuild
</p>
</div>
</div>
<div className="flex items-start gap-3 p-3 bg-muted rounded-lg">
<Settings className="h-5 w-5 text-green-600 mt-0.5" />
<div>
<p className="font-medium text-foreground">Priority</p>
<p>
Runtime configuration &gt; build-time environment variables
</p>
</div>
</div>
<div className="flex items-start gap-3 p-3 bg-muted rounded-lg">
<Key className="h-5 w-5 text-orange-600 mt-0.5" />
<div>
<p className="font-medium text-foreground">Security</p>
<p>
API Keys are stored only in this browser and are accessible to anyone using this browser profile
</p>
</div>
</div>
</CardContent>
</Card>
</TabsContent>
</Tabs>
{/* Bottom action buttons */}
{hasChanges && (
<div className="fixed bottom-6 right-6 flex gap-3 bg-background border rounded-lg shadow-lg p-4">
<Button onClick={saveConfig} size="lg">
<Save className="w-4 h-4 mr-2" />
Save All Changes
</Button>
<Button onClick={loadConfig} variant="outline" size="lg">
Cancel
</Button>
</div>
)}
</div>
);
}


@@ -16,11 +16,13 @@ import {
Clock,
AlertTriangle,
TrendingUp,
Package
Package,
Settings
} from "lucide-react";
import { api, dbMode, isLocalMode } from "@/shared/config/database";
import { DatabaseManager } from "@/components/database/DatabaseManager";
import { DatabaseStatusDetail } from "@/components/database/DatabaseStatus";
import { SystemConfig } from "@/components/system/SystemConfig";
import { toast } from "sonner";
export default function AdminDashboard() {
@@ -112,11 +114,11 @@ export default function AdminDashboard() {
<div className="flex items-center justify-between">
<div>
<h1 className="text-3xl font-bold text-gray-900 flex items-center gap-3">
<Database className="h-8 w-8 text-primary" />
<Settings className="h-8 w-8 text-primary" />
System Administration
</h1>
<p className="text-gray-600 mt-2">
Manage data storage, view usage statistics, and adjust system settings
Configure LLM providers and system settings directly in the browser, no config-file edits required
</p>
</div>
<Button variant="outline" onClick={loadStats}>
@@ -212,14 +214,20 @@
</div>
{/* Main content tabs */}
<Tabs defaultValue="overview" className="w-full">
<TabsList className="grid w-full grid-cols-4">
<Tabs defaultValue="config" className="w-full">
<TabsList className="grid w-full grid-cols-5">
<TabsTrigger value="config">System Config</TabsTrigger>
<TabsTrigger value="overview">Data Overview</TabsTrigger>
<TabsTrigger value="storage">Storage</TabsTrigger>
<TabsTrigger value="operations">Operations</TabsTrigger>
<TabsTrigger value="settings">Settings</TabsTrigger>
<TabsTrigger value="settings">Settings</TabsTrigger>
</TabsList>
{/* System config */}
<TabsContent value="config" className="space-y-6">
<SystemConfig />
</TabsContent>
{/* Data overview */}
<TabsContent value="overview" className="space-y-6">
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">


@@ -1,85 +1,98 @@
// Environment variable configuration
// Read runtime configuration from localStorage
const STORAGE_KEY = 'xcodereviewer_runtime_config';
const getRuntimeConfig = () => {
try {
const saved = localStorage.getItem(STORAGE_KEY);
return saved ? JSON.parse(saved) : null;
} catch {
return null;
}
};
const runtimeConfig = getRuntimeConfig();
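The fallback chain used throughout the new `env` object — runtime value, then build-time env var, then hard-coded default — can be captured in one helper. This is an illustrative sketch, not part of the codebase (`resolveSetting` is a hypothetical name); it uses `??` so that a legitimate falsy runtime value, such as a temperature of `0`, is not skipped the way `||` would skip it — which is exactly why `LLM_TEMPERATURE` below needs its explicit `!== undefined` check:

```typescript
// Resolve one setting with runtime-over-env-over-default precedence.
// ?? falls through only on null/undefined, so falsy-but-valid values
// (0, '', false) from the runtime config still win.
function resolveSetting<T>(
  runtimeValue: T | undefined,
  envValue: T | undefined,
  defaultValue: T,
): T {
  return runtimeValue ?? envValue ?? defaultValue;
}

// A temperature of 0 set in the admin UI must override everything.
const temperature = resolveSetting<number>(0, 0.7, 0.2); // → 0
```

With `||` instead of `??`, the same call would wrongly return `0.7`, silently discarding the user's setting.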
// Environment variable configuration (runtime config can override env vars)
export const env = {
// ==================== LLM Common Configuration ====================
// Current LLM provider (gemini|openai|claude|qwen|deepseek|zhipu|moonshot|baidu|minimax|doubao|ollama)
LLM_PROVIDER: import.meta.env.VITE_LLM_PROVIDER || 'gemini',
LLM_PROVIDER: runtimeConfig?.llmProvider || import.meta.env.VITE_LLM_PROVIDER || 'gemini',
// LLM API Key
LLM_API_KEY: import.meta.env.VITE_LLM_API_KEY || '',
LLM_API_KEY: runtimeConfig?.llmApiKey || import.meta.env.VITE_LLM_API_KEY || '',
// LLM model name
LLM_MODEL: import.meta.env.VITE_LLM_MODEL || '',
LLM_MODEL: runtimeConfig?.llmModel || import.meta.env.VITE_LLM_MODEL || '',
// LLM API base URL (optional; for custom endpoints or proxies)
LLM_BASE_URL: import.meta.env.VITE_LLM_BASE_URL || '',
LLM_BASE_URL: runtimeConfig?.llmBaseUrl || import.meta.env.VITE_LLM_BASE_URL || '',
// LLM request timeout (ms)
LLM_TIMEOUT: Number(import.meta.env.VITE_LLM_TIMEOUT) || 150000,
LLM_TIMEOUT: runtimeConfig?.llmTimeout || Number(import.meta.env.VITE_LLM_TIMEOUT) || 150000,
// LLM temperature (0.0-2.0)
LLM_TEMPERATURE: Number(import.meta.env.VITE_LLM_TEMPERATURE) || 0.2,
LLM_TEMPERATURE: runtimeConfig?.llmTemperature !== undefined ? runtimeConfig.llmTemperature : (Number(import.meta.env.VITE_LLM_TEMPERATURE) || 0.2),
// LLM max tokens
LLM_MAX_TOKENS: Number(import.meta.env.VITE_LLM_MAX_TOKENS) || 4096,
LLM_MAX_TOKENS: runtimeConfig?.llmMaxTokens || Number(import.meta.env.VITE_LLM_MAX_TOKENS) || 4096,
// ==================== Gemini AI Configuration (legacy compatibility) ====================
GEMINI_API_KEY: import.meta.env.VITE_GEMINI_API_KEY || '',
GEMINI_API_KEY: runtimeConfig?.geminiApiKey || import.meta.env.VITE_GEMINI_API_KEY || '',
GEMINI_MODEL: import.meta.env.VITE_GEMINI_MODEL || 'gemini-2.5-flash',
GEMINI_TIMEOUT_MS: Number(import.meta.env.VITE_GEMINI_TIMEOUT_MS) || 25000,
// ==================== OpenAI Configuration ====================
OPENAI_API_KEY: import.meta.env.VITE_OPENAI_API_KEY || '',
OPENAI_API_KEY: runtimeConfig?.openaiApiKey || import.meta.env.VITE_OPENAI_API_KEY || '',
OPENAI_MODEL: import.meta.env.VITE_OPENAI_MODEL || 'gpt-4o-mini',
OPENAI_BASE_URL: import.meta.env.VITE_OPENAI_BASE_URL || '',
// ==================== Claude Configuration ====================
CLAUDE_API_KEY: import.meta.env.VITE_CLAUDE_API_KEY || '',
CLAUDE_API_KEY: runtimeConfig?.claudeApiKey || import.meta.env.VITE_CLAUDE_API_KEY || '',
CLAUDE_MODEL: import.meta.env.VITE_CLAUDE_MODEL || 'claude-3-5-sonnet-20241022',
// ==================== Tongyi Qianwen (Qwen) Configuration ====================
QWEN_API_KEY: import.meta.env.VITE_QWEN_API_KEY || '',
QWEN_API_KEY: runtimeConfig?.qwenApiKey || import.meta.env.VITE_QWEN_API_KEY || '',
QWEN_MODEL: import.meta.env.VITE_QWEN_MODEL || 'qwen-turbo',
// ==================== DeepSeek Configuration ====================
DEEPSEEK_API_KEY: import.meta.env.VITE_DEEPSEEK_API_KEY || '',
DEEPSEEK_API_KEY: runtimeConfig?.deepseekApiKey || import.meta.env.VITE_DEEPSEEK_API_KEY || '',
DEEPSEEK_MODEL: import.meta.env.VITE_DEEPSEEK_MODEL || 'deepseek-chat',
// ==================== Zhipu AI Configuration ====================
ZHIPU_API_KEY: import.meta.env.VITE_ZHIPU_API_KEY || '',
ZHIPU_API_KEY: runtimeConfig?.zhipuApiKey || import.meta.env.VITE_ZHIPU_API_KEY || '',
ZHIPU_MODEL: import.meta.env.VITE_ZHIPU_MODEL || 'glm-4-flash',
// ==================== Moonshot Configuration ====================
MOONSHOT_API_KEY: import.meta.env.VITE_MOONSHOT_API_KEY || '',
MOONSHOT_API_KEY: runtimeConfig?.moonshotApiKey || import.meta.env.VITE_MOONSHOT_API_KEY || '',
MOONSHOT_MODEL: import.meta.env.VITE_MOONSHOT_MODEL || 'moonshot-v1-8k',
// ==================== Baidu ERNIE Bot Configuration ====================
BAIDU_API_KEY: import.meta.env.VITE_BAIDU_API_KEY || '',
BAIDU_API_KEY: runtimeConfig?.baiduApiKey || import.meta.env.VITE_BAIDU_API_KEY || '',
BAIDU_MODEL: import.meta.env.VITE_BAIDU_MODEL || 'ERNIE-3.5-8K',
// ==================== MiniMax Configuration ====================
MINIMAX_API_KEY: import.meta.env.VITE_MINIMAX_API_KEY || '',
MINIMAX_API_KEY: runtimeConfig?.minimaxApiKey || import.meta.env.VITE_MINIMAX_API_KEY || '',
MINIMAX_MODEL: import.meta.env.VITE_MINIMAX_MODEL || 'abab6.5-chat',
// ==================== Doubao Configuration ====================
DOUBAO_API_KEY: import.meta.env.VITE_DOUBAO_API_KEY || '',
DOUBAO_API_KEY: runtimeConfig?.doubaoApiKey || import.meta.env.VITE_DOUBAO_API_KEY || '',
DOUBAO_MODEL: import.meta.env.VITE_DOUBAO_MODEL || 'doubao-pro-32k',
// ==================== Ollama Local Model Configuration ====================
OLLAMA_API_KEY: import.meta.env.VITE_OLLAMA_API_KEY || 'ollama',
OLLAMA_MODEL: import.meta.env.VITE_OLLAMA_MODEL || 'llama3',
OLLAMA_BASE_URL: import.meta.env.VITE_OLLAMA_BASE_URL || 'http://localhost:11434/v1',
OLLAMA_BASE_URL: runtimeConfig?.ollamaBaseUrl || import.meta.env.VITE_OLLAMA_BASE_URL || 'http://localhost:11434/v1',
// ==================== Supabase Configuration ====================
SUPABASE_URL: import.meta.env.VITE_SUPABASE_URL || '',
SUPABASE_ANON_KEY: import.meta.env.VITE_SUPABASE_ANON_KEY || '',
// ==================== GitHub Configuration ====================
GITHUB_TOKEN: import.meta.env.VITE_GITHUB_TOKEN || '',
GITHUB_TOKEN: runtimeConfig?.githubToken || import.meta.env.VITE_GITHUB_TOKEN || '',
// ==================== Application Configuration ====================
APP_ID: import.meta.env.VITE_APP_ID || 'xcodereviewer',
// ==================== Analysis Configuration ====================
MAX_ANALYZE_FILES: Number(import.meta.env.VITE_MAX_ANALYZE_FILES) || 40,
LLM_CONCURRENCY: Number(import.meta.env.VITE_LLM_CONCURRENCY) || 2,
LLM_GAP_MS: Number(import.meta.env.VITE_LLM_GAP_MS) || 500,
MAX_ANALYZE_FILES: runtimeConfig?.maxAnalyzeFiles || Number(import.meta.env.VITE_MAX_ANALYZE_FILES) || 40,
LLM_CONCURRENCY: runtimeConfig?.llmConcurrency || Number(import.meta.env.VITE_LLM_CONCURRENCY) || 2,
LLM_GAP_MS: runtimeConfig?.llmGapMs || Number(import.meta.env.VITE_LLM_GAP_MS) || 500,
// ==================== Language Configuration ====================
OUTPUT_LANGUAGE: import.meta.env.VITE_OUTPUT_LANGUAGE || 'zh-CN', // zh-CN | en-US
OUTPUT_LANGUAGE: runtimeConfig?.outputLanguage || import.meta.env.VITE_OUTPUT_LANGUAGE || 'zh-CN', // zh-CN | en-US
// ==================== Development Environment Flag ====================
isDev: import.meta.env.DEV,
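For completeness, the write side of this mechanism — what a settings page like the one above would do on save — can be sketched as follows. `saveRuntimeConfig` and `StringStore` are hypothetical names, not part of the codebase; the storage object is injected so the merge logic can be exercised outside a browser (pass `window.localStorage` in the app), and the key mirrors the `STORAGE_KEY` read by `getRuntimeConfig`:

```typescript
const STORAGE_KEY = 'xcodereviewer_runtime_config';

// Minimal subset of the Web Storage API used here.
interface StringStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Merge partial updates into the saved runtime config and persist them.
// A page refresh then lets getRuntimeConfig() pick up the new values.
function saveRuntimeConfig(
  store: StringStore,
  updates: Record<string, unknown>,
): Record<string, unknown> {
  let current: Record<string, unknown> = {};
  try {
    const saved = store.getItem(STORAGE_KEY);
    if (saved) current = JSON.parse(saved);
  } catch {
    // Corrupted JSON: start from an empty config rather than crash.
  }
  const merged = { ...current, ...updates };
  store.setItem(STORAGE_KEY, JSON.stringify(merged));
  return merged;
}
```

After `setItem`, a refresh re-runs `getRuntimeConfig()` and the saved values take precedence over the build-time `VITE_*` variables, which is what makes reconfiguration possible without rebuilding the Docker image.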