Merge branch 'songquanpeng:main' into test

Ghostz 2024-04-02 00:50:08 +08:00 committed by GitHub
commit 71325d5be6
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
64 changed files with 673 additions and 314 deletions


@@ -241,17 +241,19 @@ If the channel ID is not provided, load balancing will be used to distribute the
   + Example: `SESSION_SECRET=random_string`
3. `SQL_DSN`: When set, the specified database will be used instead of SQLite. Please use MySQL version 8.0.
   + Example: `SQL_DSN=root:123456@tcp(localhost:3306)/oneapi`
4. `LOG_SQL_DSN`: When set, a separate database will be used for the `logs` table; please use MySQL or PostgreSQL.
   + Example: `LOG_SQL_DSN=root:123456@tcp(localhost:3306)/oneapi-logs`
5. `FRONTEND_BASE_URL`: When set, the specified frontend address will be used instead of the backend address.
   + Example: `FRONTEND_BASE_URL=https://openai.justsong.cn`
6. `SYNC_FREQUENCY`: When set, the system will periodically sync configurations from the database, with the unit in seconds. If not set, no sync will happen.
   + Example: `SYNC_FREQUENCY=60`
7. `NODE_TYPE`: When set, specifies the node type. Valid values are `master` and `slave`. If not set, it defaults to `master`.
   + Example: `NODE_TYPE=slave`
8. `CHANNEL_UPDATE_FREQUENCY`: When set, it periodically updates the channel balances, with the unit in minutes. If not set, no update will happen.
   + Example: `CHANNEL_UPDATE_FREQUENCY=1440`
9. `CHANNEL_TEST_FREQUENCY`: When set, it periodically tests the channels, with the unit in minutes. If not set, no test will happen.
   + Example: `CHANNEL_TEST_FREQUENCY=1440`
10. `POLLING_INTERVAL`: The time interval (in seconds) between requests when updating channel balances and testing channel availability. Default is no interval.
    + Example: `POLLING_INTERVAL=5`
### Command Line Parameters


@@ -242,17 +242,18 @@ graph LR
   + 例: `SESSION_SECRET=random_string`
3. `SQL_DSN`: 設定すると、SQLite の代わりに指定したデータベースが使用されます。MySQL バージョン 8.0 を使用してください。
   + 例: `SQL_DSN=root:123456@tcp(localhost:3306)/oneapi`
4. `LOG_SQL_DSN`: 設定すると、`logs` テーブルには独立したデータベースが使用されます。MySQL または PostgreSQL を使用してください。
5. `FRONTEND_BASE_URL`: 設定されると、バックエンドアドレスではなく、指定されたフロントエンドアドレスが使われる。
   + 例: `FRONTEND_BASE_URL=https://openai.justsong.cn`
6. `SYNC_FREQUENCY`: 設定された場合、システムは定期的にデータベースからコンフィグを秒単位で同期する。設定されていない場合、同期は行われません。
   + 例: `SYNC_FREQUENCY=60`
7. `NODE_TYPE`: 設定すると、ノードのタイプを指定する。有効な値は `master` と `slave` である。設定されていない場合、デフォルトは `master`
   + 例: `NODE_TYPE=slave`
8. `CHANNEL_UPDATE_FREQUENCY`: 設定すると、チャンネル残高を分単位で定期的に更新する。設定されていない場合、更新は行われません。
   + 例: `CHANNEL_UPDATE_FREQUENCY=1440`
9. `CHANNEL_TEST_FREQUENCY`: 設定すると、チャンネルを定期的にテストする。設定されていない場合、テストは行われません。
   + 例: `CHANNEL_TEST_FREQUENCY=1440`
10. `POLLING_INTERVAL`: チャネル残高の更新とチャネルの可用性をテストするときのリクエスト間の時間間隔 (秒)。デフォルトは間隔なし。
    + 例: `POLLING_INTERVAL=5`
### コマンドラインパラメータ


@@ -87,7 +87,7 @@ _✨ 通过标准的 OpenAI API 格式访问所有的大模型,开箱即用
5. 支持**多机部署**,[详见此处](#多机部署)。
6. 支持**令牌管理**,设置令牌的过期时间和额度。
7. 支持**兑换码管理**,支持批量生成和导出兑换码,可使用兑换码为账户进行充值。
8. 支持**渠道管理**,批量创建渠道。
9. 支持**用户分组**以及**渠道分组**,支持为不同分组设置不同的倍率。
10. 支持渠道**设置模型列表**。
11. 支持**查看额度明细**。
@@ -349,38 +349,40 @@ graph LR
   + `SQL_MAX_OPEN_CONNS`:最大打开连接数,默认为 `1000`
     + 如果报错 `Error 1040: Too many connections`,请适当减小该值。
   + `SQL_CONN_MAX_LIFETIME`:连接的最大生命周期,默认为 `60`,单位分钟。
4. `LOG_SQL_DSN`:设置之后将为 `logs` 表使用独立的数据库,请使用 MySQL 或 PostgreSQL。
5. `FRONTEND_BASE_URL`:设置之后将重定向页面请求到指定的地址,仅限从服务器设置。
   + 例子:`FRONTEND_BASE_URL=https://openai.justsong.cn`
6. `MEMORY_CACHE_ENABLED`:启用内存缓存,会导致用户额度的更新存在一定的延迟,可选值为 `true` 和 `false`,未设置则默认为 `false`
   + 例子:`MEMORY_CACHE_ENABLED=true`
7. `SYNC_FREQUENCY`:在启用缓存的情况下与数据库同步配置的频率,单位为秒,默认为 `600` 秒。
   + 例子:`SYNC_FREQUENCY=60`
8. `NODE_TYPE`:设置之后将指定节点类型,可选值为 `master` 和 `slave`,未设置则默认为 `master`
   + 例子:`NODE_TYPE=slave`
9. `CHANNEL_UPDATE_FREQUENCY`:设置之后将定期更新渠道余额,单位为分钟,未设置则不进行更新。
   + 例子:`CHANNEL_UPDATE_FREQUENCY=1440`
10. `CHANNEL_TEST_FREQUENCY`:设置之后将定期检查渠道,单位为分钟,未设置则不进行检查。
    + 例子:`CHANNEL_TEST_FREQUENCY=1440`
11. `POLLING_INTERVAL`:批量更新渠道余额以及测试可用性时的请求间隔,单位为秒,默认无间隔。
    + 例子:`POLLING_INTERVAL=5`
12. `BATCH_UPDATE_ENABLED`:启用数据库批量更新聚合,会导致用户额度的更新存在一定的延迟,可选值为 `true` 和 `false`,未设置则默认为 `false`
    + 例子:`BATCH_UPDATE_ENABLED=true`
    + 如果你遇到了数据库连接数过多的问题,可以尝试启用该选项。
13. `BATCH_UPDATE_INTERVAL=5`:批量更新聚合的时间间隔,单位为秒,默认为 `5`
    + 例子:`BATCH_UPDATE_INTERVAL=5`
14. 请求频率限制:
    + `GLOBAL_API_RATE_LIMIT`:全局 API 速率限制(除中继请求外),单 ip 三分钟内的最大请求数,默认为 `180`
    + `GLOBAL_WEB_RATE_LIMIT`:全局 Web 速率限制,单 ip 三分钟内的最大请求数,默认为 `60`
15. 编码器缓存设置:
    + `TIKTOKEN_CACHE_DIR`:默认程序启动时会联网下载一些通用的词元的编码,如:`gpt-3.5-turbo`,在一些网络环境不稳定,或者离线情况,可能会导致启动有问题,可以配置此目录缓存数据,可迁移到离线环境。
    + `DATA_GYM_CACHE_DIR`:目前该配置作用与 `TIKTOKEN_CACHE_DIR` 一致,但是优先级没有它高。
16. `RELAY_TIMEOUT`:中继超时设置,单位为秒,默认不设置超时时间。
17. `SQLITE_BUSY_TIMEOUT`:SQLite 锁等待超时设置,单位为毫秒,默认 `3000`
18. `GEMINI_SAFETY_SETTING`:Gemini 的安全设置,默认 `BLOCK_NONE`
19. `THEME`:系统的主题设置,默认为 `default`,具体可选值参考[此处](./web/README.md)。
20. `ENABLE_METRIC`:是否根据请求成功率禁用渠道,默认不开启,可选值为 `true` 和 `false`
21. `METRIC_QUEUE_SIZE`:请求成功率统计队列大小,默认为 `10`
22. `METRIC_SUCCESS_RATE_THRESHOLD`:请求成功率阈值,默认为 `0.8`
23. `INITIAL_ROOT_TOKEN`:如果设置了该值,则在系统首次启动时会自动创建一个值为该环境变量值的 root 用户令牌。
### 命令行参数
1. `--port <port_number>`: 指定服务器监听的端口号,默认为 `3000`
@@ -419,7 +421,7 @@ https://openai.justsong.cn
   + 检查你的接口地址和 API Key 有没有填对。
   + 检查是否启用了 HTTPS,浏览器会拦截 HTTPS 域名下的 HTTP 请求。
6. 报错:`当前分组负载已饱和,请稍后再试`
   + 上游通道 429 了。
7. 升级之后我的数据会丢失吗?
   + 如果使用 MySQL,不会。
   + 如果使用 SQLite,需要按照我所给的部署命令挂载 volume 持久化 one-api.db 数据库文件,否则容器重启后数据会丢失。
@@ -427,8 +429,8 @@ https://openai.justsong.cn
   + 一般情况下不需要,系统将在初始化的时候自动调整。
   + 如果需要的话,我会在更新日志中说明,并给出脚本。
9. 手动修改数据库后报错:`数据库一致性已被破坏,请联系管理员`
   + 这是检测到 ability 表里有些记录的通道 id 是不存在的,这大概率是因为你删了 channel 表里的记录但是没有同步在 ability 表里清理无效的通道。
   + 对于每一个渠道,其所支持的模型都需要有一个专门的 ability 表的记录,表示该渠道支持该模型。
## 相关项目
* [FastGPT](https://github.com/labring/FastGPT): 基于 LLM 大语言模型的知识库问答系统


@@ -136,3 +136,5 @@ var MetricQueueSize = env.Int("METRIC_QUEUE_SIZE", 10)
var MetricSuccessRateThreshold = env.Float64("METRIC_SUCCESS_RATE_THRESHOLD", 0.8)
var MetricSuccessChanSize = env.Int("METRIC_SUCCESS_CHAN_SIZE", 1024)
var MetricFailChanSize = env.Int("METRIC_FAIL_CHAN_SIZE", 128)
var InitialRootToken = os.Getenv("INITIAL_ROOT_TOKEN")

common/conv/any.go Normal file

@@ -0,0 +1,6 @@
package conv

func AsString(v any) string {
	str, _ := v.(string)
	return str
}
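The new helper works because Go's two-value ("comma-ok") type assertion never panics: when `v` is not a `string`, `str` is simply the zero value `""`. A small self-contained usage sketch:

```go
package main

import "fmt"

// AsString mirrors common/conv.AsString: the comma-ok type assertion
// yields the zero value ("") instead of panicking on a mismatch.
func AsString(v any) string {
	str, _ := v.(string)
	return str
}

func main() {
	fmt.Printf("%q\n", AsString("hello")) // "hello"
	fmt.Printf("%q\n", AsString(42))      // "" — not a string, no panic
}
```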


@@ -72,17 +72,21 @@ var ModelRatio = map[string]float64{
	"claude-3-sonnet-20240229": 3.0 / 1000 * USD,
	"claude-3-opus-20240229":   15.0 / 1000 * USD,
	// https://cloud.baidu.com/doc/WENXINWORKSHOP/s/hlrk4akp7
	"ERNIE-Bot":       0.8572,     // ¥0.012 / 1k tokens
	"ERNIE-Bot-turbo": 0.5715,     // ¥0.008 / 1k tokens
	"ERNIE-Bot-4":     0.12 * RMB, // ¥0.12 / 1k tokens
	"ERNIE-Bot-8K":    0.024 * RMB,
	"Embedding-V1":    0.1429, // ¥0.002 / 1k tokens
	"bge-large-zh":    0.002 * RMB,
	"bge-large-en":    0.002 * RMB,
	"bge-large-8k":    0.002 * RMB,
	// https://ai.google.dev/pricing
	"PaLM-2":                    1,
	"gemini-pro":                1, // $0.00025 / 1k characters -> $0.001 / 1k tokens
	"gemini-pro-vision":         1, // $0.00025 / 1k characters -> $0.001 / 1k tokens
	"gemini-1.0-pro-vision-001": 1,
	"gemini-1.0-pro-001":        1,
	"gemini-1.5-pro":            1,
	// https://open.bigmodel.cn/pricing
	"glm-4":  0.1 * RMB,
	"glm-4v": 0.1 * RMB,
@@ -248,6 +252,9 @@ func GetCompletionRatio(name string) float64 {
	if strings.HasPrefix(name, "mistral-") {
		return 3
	}
	if strings.HasPrefix(name, "gemini-") {
		return 3
	}
	switch name {
	case "llama2-70b-4096":
		return 0.8 / 0.7
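The ratio returned here multiplies the per-model prompt price when billing completion tokens, so the new `gemini-` branch charges completion tokens at three times the prompt rate. A trimmed-down sketch of the dispatch (the real `GetCompletionRatio` has many more cases):

```go
package main

import (
	"fmt"
	"strings"
)

// getCompletionRatio sketches the updated logic: completion tokens
// for mistral-* and gemini-* models cost 3x the prompt-token rate;
// models not listed here default to 1x in this sketch.
func getCompletionRatio(name string) float64 {
	if strings.HasPrefix(name, "mistral-") {
		return 3
	}
	if strings.HasPrefix(name, "gemini-") {
		return 3
	}
	switch name {
	case "llama2-70b-4096":
		return 0.8 / 0.7
	}
	return 1
}

func main() {
	fmt.Println(getCompletionRatio("gemini-1.5-pro")) // 3
	fmt.Println(getCompletionRatio("glm-4"))          // 1
}
```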


@@ -197,7 +197,7 @@ func testChannels(notify bool, scope string) error {
	testAllChannelsRunning = false
	testAllChannelsLock.Unlock()
	if notify {
		err := message.Notify(message.ByAll, "渠道测试完成", "", "渠道测试完成,如果没有收到禁用通知,说明所有渠道都正常")
		if err != nil {
			logger.SysError(fmt.Sprintf("failed to send email: %s", err.Error()))
		}
	}


@@ -16,7 +16,10 @@ func GetAllTokens(c *gin.Context) {
	if p < 0 {
		p = 0
	}

	order := c.Query("order")
	tokens, err := model.GetAllUserTokens(userId, p*config.ItemsPerPage, config.ItemsPerPage, order)

	if err != nil {
		c.JSON(http.StatusOK, gin.H{
			"success": false,
@@ -139,6 +142,7 @@ func AddToken(c *gin.Context) {
	c.JSON(http.StatusOK, gin.H{
		"success": true,
		"message": "",
		"data":    cleanToken,
	})
	return
}


@@ -180,24 +180,27 @@ func Register(c *gin.Context) {
}

func GetAllUsers(c *gin.Context) {
	p, _ := strconv.Atoi(c.Query("p"))
	if p < 0 {
		p = 0
	}

	order := c.DefaultQuery("order", "")
	users, err := model.GetAllUsers(p*config.ItemsPerPage, config.ItemsPerPage, order)

	if err != nil {
		c.JSON(http.StatusOK, gin.H{
			"success": false,
			"message": err.Error(),
		})
		return
	}

	c.JSON(http.StatusOK, gin.H{
		"success": true,
		"message": "",
		"data":    users,
	})
}

func SearchUsers(c *gin.Context) {


@@ -8,12 +8,12 @@
  "确认删除": "Confirm Delete",
  "确认绑定": "Confirm Binding",
  "您正在删除自己的帐户,将清空所有数据且不可恢复": "You are deleting your account, all data will be cleared and unrecoverable.",
  "\"通道「%s」(#%d)已被禁用\"": "\"Channel %s (#%d) has been disabled\"",
  "通道「%s」(#%d)已被禁用,原因:%s": "Channel %s (#%d) has been disabled, reason: %s",
  "测试已在运行中": "Test is already running",
  "响应时间 %.2fs 超过阈值 %.2fs": "Response time %.2fs exceeds threshold %.2fs",
  "通道测试完成": "Channel test completed",
  "渠道测试完成,如果没有收到禁用通知,说明所有渠道都正常": "Channel test completed, if you have not received the disable notification, it means that all channels are normal",
  "无法连接至 GitHub 服务器,请稍后重试!": "Unable to connect to GitHub server, please try again later!",
  "返回值非法,用户字段为空,请稍后重试!": "The return value is illegal, the user field is empty, please try again later!",
  "管理员未开启通过 GitHub 登录以及注册": "The administrator did not turn on login and registration via GitHub",
@@ -119,11 +119,11 @@
  " 个月 ": " M ",
  " 年 ": " y ",
  "未测试": "Not tested",
  "通道 ${name} 测试成功,耗时 ${time.toFixed(2)} 秒。": "Channel ${name} test succeeded, time consumed ${time.toFixed(2)} s.",
  "已成功开始测试所有通道,请刷新页面查看结果。": "All channels have been successfully tested, please refresh the page to view the results.",
  "已成功开始测试所有已启用通道,请刷新页面查看结果。": "All enabled channels have been successfully tested, please refresh the page to view the results.",
  "通道 ${name} 余额更新成功!": "Channel ${name} balance updated successfully!",
  "已更新完毕所有已启用通道余额!": "The balance of all enabled channels has been updated!",
  "搜索渠道的 ID,名称和密钥 ...": "Search for channel ID, name and key ...",
  "名称": "Name",
  "分组": "Group",
@@ -141,9 +141,9 @@
  "启用": "Enable",
  "编辑": "Edit",
  "添加新的渠道": "Add a new channel",
  "测试所有通道": "Test all channels",
  "测试所有已启用通道": "Test all enabled channels",
  "更新所有已启用通道余额": "Update the balance of all enabled channels",
  "刷新": "Refresh",
  "处理中...": "Processing...",
  "绑定成功!": "Binding succeeded!",
@@ -207,11 +207,11 @@
  "监控设置": "Monitoring Settings",
  "最长响应时间": "Longest Response Time",
  "单位秒": "Unit in seconds",
  "当运行通道全部测试时": "When all operating channels are tested",
  "超过此时间将自动禁用通道": "Channels will be automatically disabled if this time is exceeded",
  "额度提醒阈值": "Quota reminder threshold",
  "低于此额度时将发送邮件提醒用户": "Email will be sent to remind users when the quota is below this",
  "失败时自动禁用通道": "Automatically disable the channel when it fails",
  "保存监控设置": "Save Monitoring Settings",
  "额度设置": "Quota Settings",
  "新用户初始额度": "Initial quota for new users",
@@ -405,7 +405,7 @@
  "镜像": "Mirror",
  "请输入镜像站地址,格式为:https://domain.com,可不填,不填则使用渠道默认值": "Please enter the mirror site address, the format is: https://domain.com, it can be left blank, if left blank, the default value of the channel will be used",
  "模型": "Model",
  "请选择该通道所支持的模型": "Please select the model supported by the channel",
  "填入基础模型": "Fill in the basic model",
  "填入所有模型": "Fill in all models",
  "清除所有模型": "Clear all models",
@@ -515,7 +515,7 @@
  "请输入自定义渠道的 Base URL": "Please enter the Base URL of the custom channel",
  "Homepage URL 填": "Fill in the Homepage URL",
  "Authorization callback URL 填": "Fill in the Authorization callback URL",
  "请为通道命名": "Please name the channel",
  "此项可选,用于修改请求体中的模型名称,为一个 JSON 字符串,键为请求中模型名称,值为要替换的模型名称,例如:": "This is optional, used to modify the model name in the request body, it's a JSON string, the key is the model name in the request, and the value is the model name to be replaced, for example:",
  "模型重定向": "Model redirection",
  "请输入渠道对应的鉴权密钥": "Please enter the authentication key corresponding to the channel",


@@ -2,6 +2,7 @@ package model

import (
	"github.com/songquanpeng/one-api/common"
	"gorm.io/gorm"
	"strings"
)

@@ -13,7 +14,7 @@ type Ability struct {
	Priority *int64 `json:"priority" gorm:"bigint;default:0;index"`
}

func GetRandomSatisfiedChannel(group string, model string, ignoreFirstPriority bool) (*Channel, error) {
	ability := Ability{}
	groupCol := "`group`"
	trueVal := "1"
@@ -23,8 +24,13 @@ func GetRandomSatisfiedChannel(group string, model string) (*Channel, error) {
	}

	var err error = nil
	var channelQuery *gorm.DB
	if ignoreFirstPriority {
		channelQuery = DB.Where(groupCol+" = ? and model = ? and enabled = "+trueVal, group, model)
	} else {
		maxPrioritySubQuery := DB.Model(&Ability{}).Select("MAX(priority)").Where(groupCol+" = ? and model = ? and enabled = "+trueVal, group, model)
		channelQuery = DB.Where(groupCol+" = ? and model = ? and enabled = "+trueVal+" and priority = (?)", group, model, maxPrioritySubQuery)
	}
	if common.UsingSQLite || common.UsingPostgreSQL {
		err = channelQuery.Order("RANDOM()").First(&ability).Error
	} else {
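In plain terms: the default path restricts candidates to the highest-priority tier for the group/model pair, while `ignoreFirstPriority` widens the pool to every enabled channel before the random pick. A database-free sketch of that selection logic (the struct and helper names here are illustrative):

```go
package main

import (
	"fmt"
	"math/rand"
)

type ability struct {
	ChannelId int
	Priority  int64
}

// pickChannel sketches the query above without gorm: by default only
// the max-priority tier is eligible; with ignoreFirstPriority every
// enabled ability is, mirroring the two WHERE clauses.
func pickChannel(abilities []ability, ignoreFirstPriority bool) int {
	pool := abilities
	if !ignoreFirstPriority {
		maxPriority := abilities[0].Priority
		for _, a := range abilities {
			if a.Priority > maxPriority {
				maxPriority = a.Priority
			}
		}
		pool = nil
		for _, a := range abilities {
			if a.Priority == maxPriority {
				pool = append(pool, a)
			}
		}
	}
	// random pick stands in for ORDER BY RAND()/RANDOM() ... LIMIT 1
	return pool[rand.Intn(len(pool))].ChannelId
}

func main() {
	abilities := []ability{{1, 10}, {2, 10}, {3, 0}}
	fmt.Println(pickChannel(abilities, false)) // 1 or 2: top tier only
	fmt.Println(pickChannel(abilities, true))  // 1, 2, or 3: all enabled
}
```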


@@ -205,7 +205,7 @@ func SyncChannelCache(frequency int) {
func CacheGetRandomSatisfiedChannel(group string, model string, ignoreFirstPriority bool) (*Channel, error) {
	if !config.MemoryCacheEnabled {
		return GetRandomSatisfiedChannel(group, model, ignoreFirstPriority)
	}
	channelSyncLock.RLock()
	defer channelSyncLock.RUnlock()


@@ -23,7 +23,7 @@ func CreateRootAccountIfNeed() error {
	var user User
	//if user.Status != util.UserStatusEnabled {
	if err := DB.First(&user).Error; err != nil {
		logger.SysLog("no user exists, creating a root user for you: username is root, password is 123456")
		hashedPassword, err := common.Password2Hash("123456")
		if err != nil {
			return err
@@ -35,9 +35,25 @@ func CreateRootAccountIfNeed() error {
			Status:      common.UserStatusEnabled,
			DisplayName: "Root User",
			AccessToken: helper.GetUUID(),
			Quota:       500000000000000,
		}
		DB.Create(&rootUser)
		if config.InitialRootToken != "" {
			logger.SysLog("creating initial root token as requested")
			token := Token{
				Id:             1,
				UserId:         rootUser.Id,
				Key:            config.InitialRootToken,
				Status:         common.TokenStatusEnabled,
				Name:           "Initial Root Token",
				CreatedTime:    helper.GetTimestamp(),
				AccessedTime:   helper.GetTimestamp(),
				ExpiredTime:    -1,
				RemainQuota:    500000000000000,
				UnlimitedQuota: true,
			}
			DB.Create(&token)
		}
	}
	return nil
}


@@ -14,7 +14,7 @@ type Redemption struct {
	Key          string `json:"key" gorm:"type:char(32);uniqueIndex"`
	Status       int    `json:"status" gorm:"default:1"`
	Name         string `json:"name" gorm:"index"`
	Quota        int64  `json:"quota" gorm:"bigint;default:100"`
	CreatedTime  int64  `json:"created_time" gorm:"bigint"`
	RedeemedTime int64  `json:"redeemed_time" gorm:"bigint"`
	Count        int    `json:"count" gorm:"-:all"` // only for api request


@@ -20,15 +20,26 @@ type Token struct {
	CreatedTime    int64  `json:"created_time" gorm:"bigint"`
	AccessedTime   int64  `json:"accessed_time" gorm:"bigint"`
	ExpiredTime    int64  `json:"expired_time" gorm:"bigint;default:-1"` // -1 means never expired
	RemainQuota    int64  `json:"remain_quota" gorm:"bigint;default:0"`
	UnlimitedQuota bool   `json:"unlimited_quota" gorm:"default:false"`
	UsedQuota      int64  `json:"used_quota" gorm:"bigint;default:0"` // used quota
}

func GetAllUserTokens(userId int, startIdx int, num int, order string) ([]*Token, error) {
	var tokens []*Token
	var err error
	query := DB.Where("user_id = ?", userId)

	switch order {
	case "remain_quota":
		query = query.Order("unlimited_quota desc, remain_quota desc")
	case "used_quota":
		query = query.Order("used_quota desc")
	default:
		query = query.Order("id desc")
	}

	err = query.Limit(num).Offset(startIdx).Find(&tokens).Error
	return tokens, err
}


@@ -26,9 +26,9 @@ type User struct {
	WeChatId         string `json:"wechat_id" gorm:"column:wechat_id;index"`
	VerificationCode string `json:"verification_code" gorm:"-:all"`                                    // this field is only for Email verification, don't save it to database!
	AccessToken      string `json:"access_token" gorm:"type:char(32);column:access_token;uniqueIndex"` // this token is for system management
	Quota            int64  `json:"quota" gorm:"bigint;default:0"`
	UsedQuota        int64  `json:"used_quota" gorm:"bigint;default:0;column:used_quota"` // used quota
	RequestCount     int    `json:"request_count" gorm:"type:int;default:0;"`             // request number
	Group            string `json:"group" gorm:"type:varchar(32);default:'default'"`
	AffCode          string `json:"aff_code" gorm:"type:varchar(32);column:aff_code;uniqueIndex"`
	InviterId        int    `json:"inviter_id" gorm:"type:int;column:inviter_id;index"`
@@ -40,9 +40,22 @@ func GetMaxUserId() int {
	return user.Id
}

func GetAllUsers(startIdx int, num int, order string) (users []*User, err error) {
	query := DB.Limit(num).Offset(startIdx).Omit("password").Where("status != ?", common.UserStatusDeleted)

	switch order {
	case "quota":
		query = query.Order("quota desc")
	case "used_quota":
		query = query.Order("used_quota desc")
	case "request_count":
		query = query.Order("request_count desc")
	default:
		query = query.Order("id desc")
	}

	err = query.Find(&users).Error
	return users, err
}

func SearchUsers(keyword string) (users []*User, err error) {
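Both `GetAllUserTokens` and `GetAllUsers` now map the `order` query parameter through a whitelist switch, so only known column orderings are possible and the raw parameter never reaches the SQL text. The mapping can be sketched as a pure function:

```go
package main

import "fmt"

// orderClause mirrors the switch in GetAllUsers: known values map to
// a sort column, anything else falls back to "id desc", so arbitrary
// user input cannot inject into the ORDER BY clause.
func orderClause(order string) string {
	switch order {
	case "quota":
		return "quota desc"
	case "used_quota":
		return "used_quota desc"
	case "request_count":
		return "request_count desc"
	default:
		return "id desc"
	}
}

func main() {
	fmt.Println(orderClause("used_quota"))          // used_quota desc
	fmt.Println(orderClause("1; DROP TABLE users")) // id desc — unknown input ignored
}
```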


@@ -31,17 +31,17 @@ func notifyRootUser(subject string, content string) {
func DisableChannel(channelId int, channelName string, reason string) {
	model.UpdateChannelStatusById(channelId, common.ChannelStatusAutoDisabled)
	logger.SysLog(fmt.Sprintf("channel #%d has been disabled: %s", channelId, reason))
	subject := fmt.Sprintf("通道「%s」(#%d)已被禁用", channelName, channelId)
	content := fmt.Sprintf("通道「%s」(#%d)已被禁用,原因:%s", channelName, channelId, reason)
	notifyRootUser(subject, content)
}

func MetricDisableChannel(channelId int, successRate float64) {
	model.UpdateChannelStatusById(channelId, common.ChannelStatusAutoDisabled)
	logger.SysLog(fmt.Sprintf("channel #%d has been disabled due to low success rate: %.2f", channelId, successRate*100))
	subject := fmt.Sprintf("通道 #%d 已被禁用", channelId)
	content := fmt.Sprintf("该渠道(#%d)在最近 %d 次调用中成功率为 %.2f%%,低于阈值 %.2f%%,因此被系统自动禁用。",
		channelId, config.MetricQueueSize, successRate*100, config.MetricSuccessRateThreshold*100)
	notifyRootUser(subject, content)
}

@@ -49,7 +49,7 @@ func MetricDisableChannel(channelId int, successRate float64) {
func EnableChannel(channelId int, channelName string) {
	model.UpdateChannelStatusById(channelId, common.ChannelStatusEnabled)
	logger.SysLog(fmt.Sprintf("channel #%d has been enabled", channelId))
	subject := fmt.Sprintf("通道「%s」(#%d)已被启用", channelName, channelId)
	content := fmt.Sprintf("通道「%s」(#%d)已被启用", channelName, channelId)
	notifyRootUser(subject, content)
}


@@ -48,7 +48,10 @@ func ConvertRequest(request model.GeneralOpenAIRequest) *ChatRequest {
 			MaxTokens:    request.MaxTokens,
 			Temperature:  request.Temperature,
 			TopP:         request.TopP,
+			TopK:         request.TopK,
+			ResultFormat: "message",
 		},
+		Tools: request.Tools,
 	}
 }
@@ -117,19 +120,11 @@ func embeddingResponseAli2OpenAI(response *EmbeddingResponse) *openai.EmbeddingR
 }

 func responseAli2OpenAI(response *ChatResponse) *openai.TextResponse {
-	choice := openai.TextResponseChoice{
-		Index: 0,
-		Message: model.Message{
-			Role:    "assistant",
-			Content: response.Output.Text,
-		},
-		FinishReason: response.Output.FinishReason,
-	}
 	fullTextResponse := openai.TextResponse{
 		Id:      response.RequestId,
 		Object:  "chat.completion",
 		Created: helper.GetTimestamp(),
-		Choices: []openai.TextResponseChoice{choice},
+		Choices: response.Output.Choices,
 		Usage: model.Usage{
 			PromptTokens:     response.Usage.InputTokens,
 			CompletionTokens: response.Usage.OutputTokens,
@@ -140,10 +135,14 @@ func responseAli2OpenAI(response *ChatResponse) *openai.TextResponse {
 }

 func streamResponseAli2OpenAI(aliResponse *ChatResponse) *openai.ChatCompletionsStreamResponse {
+	if len(aliResponse.Output.Choices) == 0 {
+		return nil
+	}
+	aliChoice := aliResponse.Output.Choices[0]
 	var choice openai.ChatCompletionsStreamResponseChoice
-	choice.Delta.Content = aliResponse.Output.Text
-	if aliResponse.Output.FinishReason != "null" {
-		finishReason := aliResponse.Output.FinishReason
+	choice.Delta = aliChoice.Message
+	if aliChoice.FinishReason != "null" {
+		finishReason := aliChoice.FinishReason
 		choice.FinishReason = &finishReason
 	}
 	response := openai.ChatCompletionsStreamResponse{
@@ -204,6 +203,9 @@ func StreamHandler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusC
 				usage.TotalTokens = aliResponse.Usage.InputTokens + aliResponse.Usage.OutputTokens
 			}
 			response := streamResponseAli2OpenAI(&aliResponse)
+			if response == nil {
+				return true
+			}
 			//response.Choices[0].Delta.Content = strings.TrimPrefix(response.Choices[0].Delta.Content, lastResponseText)
 			//lastResponseText = aliResponse.Output.Text
 			jsonResponse, err := json.Marshal(response)
@@ -226,6 +228,7 @@ func StreamHandler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusC
 }

 func Handler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusCode, *model.Usage) {
+	ctx := c.Request.Context()
 	var aliResponse ChatResponse
 	responseBody, err := io.ReadAll(resp.Body)
 	if err != nil {
@@ -235,6 +238,7 @@ func Handler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusCode, *
 	if err != nil {
 		return openai.ErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
 	}
+	logger.Debugf(ctx, "response body: %s\n", responseBody)
 	err = json.Unmarshal(responseBody, &aliResponse)
 	if err != nil {
 		return openai.ErrorWrapper(err, "unmarshal_response_body_failed", http.StatusInternalServerError), nil


@@ -1,5 +1,10 @@
 package ali

+import (
+	"github.com/songquanpeng/one-api/relay/channel/openai"
+	"github.com/songquanpeng/one-api/relay/model"
+)
+
 type Message struct {
 	Content string `json:"content"`
 	Role    string `json:"role"`
@@ -18,12 +23,14 @@ type Parameters struct {
 	IncrementalOutput bool    `json:"incremental_output,omitempty"`
 	MaxTokens         int     `json:"max_tokens,omitempty"`
 	Temperature       float64 `json:"temperature,omitempty"`
+	ResultFormat      string  `json:"result_format,omitempty"`
 }

 type ChatRequest struct {
 	Model      string       `json:"model"`
 	Input      Input        `json:"input"`
 	Parameters Parameters   `json:"parameters,omitempty"`
+	Tools      []model.Tool `json:"tools,omitempty"`
 }

 type EmbeddingRequest struct {
@@ -62,8 +69,9 @@ type Usage struct {
 }

 type Output struct {
-	Text         string `json:"text"`
-	FinishReason string `json:"finish_reason"`
+	//Text         string                      `json:"text"`
+	//FinishReason string                      `json:"finish_reason"`
+	Choices []openai.TextResponseChoice `json:"choices"`
 }

 type ChatResponse struct {


@@ -38,6 +38,7 @@ func ConvertRequest(textRequest model.GeneralOpenAIRequest) *Request {
 		MaxTokens:   textRequest.MaxTokens,
 		Temperature: textRequest.Temperature,
 		TopP:        textRequest.TopP,
+		TopK:        textRequest.TopK,
 		Stream:      textRequest.Stream,
 	}
 	if claudeRequest.MaxTokens == 0 {


@@ -1,6 +1,8 @@
 package gemini

+// https://ai.google.dev/models/gemini
+
 var ModelList = []string{
-	"gemini-pro", "gemini-1.0-pro-001",
+	"gemini-pro", "gemini-1.0-pro-001", "gemini-1.5-pro",
 	"gemini-pro-vision", "gemini-1.0-pro-vision-001",
 }


@@ -3,13 +3,14 @@ package ollama
 import (
 	"errors"
 	"fmt"
+	"io"
+	"net/http"
+
 	"github.com/gin-gonic/gin"
 	"github.com/songquanpeng/one-api/relay/channel"
 	"github.com/songquanpeng/one-api/relay/constant"
 	"github.com/songquanpeng/one-api/relay/model"
 	"github.com/songquanpeng/one-api/relay/util"
-	"io"
-	"net/http"
 )

 type Adaptor struct {
@@ -22,6 +23,9 @@ func (a *Adaptor) Init(meta *util.RelayMeta) {
 func (a *Adaptor) GetRequestURL(meta *util.RelayMeta) (string, error) {
 	// https://github.com/ollama/ollama/blob/main/docs/api.md
 	fullRequestURL := fmt.Sprintf("%s/api/chat", meta.BaseURL)
+	if meta.Mode == constant.RelayModeEmbeddings {
+		fullRequestURL = fmt.Sprintf("%s/api/embeddings", meta.BaseURL)
+	}
 	return fullRequestURL, nil
 }
@@ -37,7 +41,8 @@ func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *model.G
 	}
 	switch relayMode {
 	case constant.RelayModeEmbeddings:
-		return nil, errors.New("not supported")
+		ollamaEmbeddingRequest := ConvertEmbeddingRequest(*request)
+		return ollamaEmbeddingRequest, nil
 	default:
 		return ConvertRequest(*request), nil
 	}
@@ -51,7 +56,12 @@ func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, meta *util.Rel
 	if meta.IsStream {
 		err, usage = StreamHandler(c, resp)
 	} else {
-		err, usage = Handler(c, resp)
+		switch meta.Mode {
+		case constant.RelayModeEmbeddings:
+			err, usage = EmbeddingHandler(c, resp)
+		default:
+			err, usage = Handler(c, resp)
+		}
 	}
 	return
 }


@@ -5,6 +5,10 @@ import (
 	"context"
 	"encoding/json"
 	"fmt"
+	"io"
+	"net/http"
+	"strings"
+
 	"github.com/gin-gonic/gin"
 	"github.com/songquanpeng/one-api/common"
 	"github.com/songquanpeng/one-api/common/helper"
@@ -12,9 +16,6 @@ import (
 	"github.com/songquanpeng/one-api/relay/channel/openai"
 	"github.com/songquanpeng/one-api/relay/constant"
 	"github.com/songquanpeng/one-api/relay/model"
-	"io"
-	"net/http"
-	"strings"
 )

 func ConvertRequest(request model.GeneralOpenAIRequest) *ChatRequest {
@@ -139,6 +140,64 @@ func StreamHandler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusC
 	return nil, &usage
 }

+func ConvertEmbeddingRequest(request model.GeneralOpenAIRequest) *EmbeddingRequest {
+	return &EmbeddingRequest{
+		Model:  request.Model,
+		Prompt: strings.Join(request.ParseInput(), " "),
+	}
+}
+
+func EmbeddingHandler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusCode, *model.Usage) {
+	var ollamaResponse EmbeddingResponse
+	err := json.NewDecoder(resp.Body).Decode(&ollamaResponse)
+	if err != nil {
+		return openai.ErrorWrapper(err, "unmarshal_response_body_failed", http.StatusInternalServerError), nil
+	}
+
+	err = resp.Body.Close()
+	if err != nil {
+		return openai.ErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
+	}
+
+	if ollamaResponse.Error != "" {
+		return &model.ErrorWithStatusCode{
+			Error: model.Error{
+				Message: ollamaResponse.Error,
+				Type:    "ollama_error",
+				Param:   "",
+				Code:    "ollama_error",
+			},
+			StatusCode: resp.StatusCode,
+		}, nil
+	}
+
+	fullTextResponse := embeddingResponseOllama2OpenAI(&ollamaResponse)
+	jsonResponse, err := json.Marshal(fullTextResponse)
+	if err != nil {
+		return openai.ErrorWrapper(err, "marshal_response_body_failed", http.StatusInternalServerError), nil
+	}
+	c.Writer.Header().Set("Content-Type", "application/json")
+	c.Writer.WriteHeader(resp.StatusCode)
+	_, err = c.Writer.Write(jsonResponse)
+	return nil, &fullTextResponse.Usage
+}
+
+func embeddingResponseOllama2OpenAI(response *EmbeddingResponse) *openai.EmbeddingResponse {
+	openAIEmbeddingResponse := openai.EmbeddingResponse{
+		Object: "list",
+		Data:   make([]openai.EmbeddingResponseItem, 0, 1),
+		Model:  "text-embedding-v1",
+		Usage:  model.Usage{TotalTokens: 0},
+	}
+
+	openAIEmbeddingResponse.Data = append(openAIEmbeddingResponse.Data, openai.EmbeddingResponseItem{
+		Object:    `embedding`,
+		Index:     0,
+		Embedding: response.Embedding,
+	})
+	return &openAIEmbeddingResponse
+}
+
 func Handler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusCode, *model.Usage) {
 	ctx := context.TODO()
 	var ollamaResponse ChatResponse

@@ -35,3 +35,13 @@ type ChatResponse struct {
 	EvalDuration  int    `json:"eval_duration,omitempty"`
 	Error         string `json:"error,omitempty"`
 }
+
+type EmbeddingRequest struct {
+	Model  string `json:"model"`
+	Prompt string `json:"prompt"`
+}
+
+type EmbeddingResponse struct {
+	Error     string    `json:"error,omitempty"`
+	Embedding []float64 `json:"embedding,omitempty"`
+}


@@ -31,11 +31,8 @@ func (a *Adaptor) GetRequestURL(meta *util.RelayMeta) (string, error) {
 		task := strings.TrimPrefix(requestURL, "/v1/")
 		model_ := meta.ActualModelName
 		model_ = strings.Replace(model_, ".", "", -1)
-		// https://github.com/songquanpeng/one-api/issues/67
-		model_ = strings.TrimSuffix(model_, "-0301")
-		model_ = strings.TrimSuffix(model_, "-0314")
-		model_ = strings.TrimSuffix(model_, "-0613")
+		//https://github.com/songquanpeng/one-api/issues/1191
+		// {your endpoint}/openai/deployments/{your azure_model}/chat/completions?api-version={api_version}
 		requestURL = fmt.Sprintf("/openai/deployments/%s/%s", model_, task)
 		return util.GetFullRequestURL(meta.BaseURL, requestURL, meta.ChannelType), nil
 	case common.ChannelTypeMinimax:
@@ -73,8 +70,10 @@ func (a *Adaptor) DoRequest(c *gin.Context, meta *util.RelayMeta, requestBody io
 func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, meta *util.RelayMeta) (usage *model.Usage, err *model.ErrorWithStatusCode) {
 	if meta.IsStream {
 		var responseText string
-		err, responseText, _ = StreamHandler(c, resp, meta.Mode)
-		usage = ResponseText2Usage(responseText, meta.ActualModelName, meta.PromptTokens)
+		err, responseText, usage = StreamHandler(c, resp, meta.Mode)
+		if usage == nil {
+			usage = ResponseText2Usage(responseText, meta.ActualModelName, meta.PromptTokens)
+		}
 	} else {
 		err, usage = Handler(c, resp, meta.PromptTokens, meta.ActualModelName)
 	}


@@ -6,6 +6,7 @@ import (
 	"encoding/json"
 	"github.com/gin-gonic/gin"
 	"github.com/songquanpeng/one-api/common"
+	"github.com/songquanpeng/one-api/common/conv"
 	"github.com/songquanpeng/one-api/common/logger"
 	"github.com/songquanpeng/one-api/relay/constant"
 	"github.com/songquanpeng/one-api/relay/model"
@@ -53,7 +54,7 @@ func StreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*model.E
 					continue // just ignore the error
 				}
 				for _, choice := range streamResponse.Choices {
-					responseText += choice.Delta.Content
+					responseText += conv.AsString(choice.Delta.Content)
 				}
 				if streamResponse.Usage != nil {
 					usage = streamResponse.Usage


@@ -118,12 +118,9 @@ type ImageResponse struct {
 }

 type ChatCompletionsStreamResponseChoice struct {
 	Index int `json:"index"`
-	Delta struct {
-		Content string `json:"content"`
-		Role    string `json:"role,omitempty"`
-	} `json:"delta"`
-	FinishReason *string `json:"finish_reason,omitempty"`
+	Delta        model.Message `json:"delta"`
+	FinishReason *string       `json:"finish_reason,omitempty"`
 }

 type ChatCompletionsStreamResponse struct {


@@ -10,6 +10,7 @@ import (
 	"fmt"
 	"github.com/gin-gonic/gin"
 	"github.com/songquanpeng/one-api/common"
+	"github.com/songquanpeng/one-api/common/conv"
 	"github.com/songquanpeng/one-api/common/helper"
 	"github.com/songquanpeng/one-api/common/logger"
 	"github.com/songquanpeng/one-api/relay/channel/openai"
@@ -129,7 +130,7 @@ func StreamHandler(c *gin.Context, resp *http.Response) (*model.ErrorWithStatusC
 			}
 			response := streamResponseTencent2OpenAI(&TencentResponse)
 			if len(response.Choices) != 0 {
-				responseText += response.Choices[0].Delta.Content
+				responseText += conv.AsString(response.Choices[0].Delta.Content)
 			}
 			jsonResponse, err := json.Marshal(response)
 			if err != nil {


@@ -26,7 +26,11 @@ import (
 func requestOpenAI2Xunfei(request model.GeneralOpenAIRequest, xunfeiAppId string, domain string) *ChatRequest {
 	messages := make([]Message, 0, len(request.Messages))
+	var lastToolCalls []model.Tool
 	for _, message := range request.Messages {
+		if message.ToolCalls != nil {
+			lastToolCalls = message.ToolCalls
+		}
 		messages = append(messages, Message{
 			Role:    message.Role,
 			Content: message.StringContent(),
@@ -39,9 +43,33 @@ func requestOpenAI2Xunfei(request model.GeneralOpenAIRequest, xunfeiAppId string
 	xunfeiRequest.Parameter.Chat.TopK = request.N
 	xunfeiRequest.Parameter.Chat.MaxTokens = request.MaxTokens
 	xunfeiRequest.Payload.Message.Text = messages
+	if len(lastToolCalls) != 0 {
+		for _, toolCall := range lastToolCalls {
+			xunfeiRequest.Payload.Functions.Text = append(xunfeiRequest.Payload.Functions.Text, toolCall.Function)
+		}
+	}
 	return &xunfeiRequest
 }

+func getToolCalls(response *ChatResponse) []model.Tool {
+	var toolCalls []model.Tool
+	if len(response.Payload.Choices.Text) == 0 {
+		return toolCalls
+	}
+	item := response.Payload.Choices.Text[0]
+	if item.FunctionCall == nil {
+		return toolCalls
+	}
+	toolCall := model.Tool{
+		Id:       fmt.Sprintf("call_%s", helper.GetUUID()),
+		Type:     "function",
+		Function: *item.FunctionCall,
+	}
+	toolCalls = append(toolCalls, toolCall)
+	return toolCalls
+}
+
 func responseXunfei2OpenAI(response *ChatResponse) *openai.TextResponse {
 	if len(response.Payload.Choices.Text) == 0 {
 		response.Payload.Choices.Text = []ChatResponseTextItem{
@@ -53,8 +81,9 @@ func responseXunfei2OpenAI(response *ChatResponse) *openai.TextResponse {
 	choice := openai.TextResponseChoice{
 		Index: 0,
 		Message: model.Message{
 			Role:    "assistant",
 			Content: response.Payload.Choices.Text[0].Content,
+			ToolCalls: getToolCalls(response),
 		},
 		FinishReason: constant.StopFinishReason,
 	}
@@ -78,6 +107,7 @@ func streamResponseXunfei2OpenAI(xunfeiResponse *ChatResponse) *openai.ChatCompl
 	}
 	var choice openai.ChatCompletionsStreamResponseChoice
 	choice.Delta.Content = xunfeiResponse.Payload.Choices.Text[0].Content
+	choice.Delta.ToolCalls = getToolCalls(xunfeiResponse)
 	if xunfeiResponse.Payload.Choices.Status == 2 {
 		choice.FinishReason = &constant.StopFinishReason
 	}
@@ -121,7 +151,7 @@ func StreamHandler(c *gin.Context, textRequest model.GeneralOpenAIRequest, appId
 	domain, authUrl := getXunfeiAuthUrl(c, apiKey, apiSecret, textRequest.Model)
 	dataChan, stopChan, err := xunfeiMakeRequest(textRequest, domain, authUrl, appId)
 	if err != nil {
-		return openai.ErrorWrapper(err, "make xunfei request err", http.StatusInternalServerError), nil
+		return openai.ErrorWrapper(err, "xunfei_request_failed", http.StatusInternalServerError), nil
 	}
 	common.SetEventStreamHeaders(c)
 	var usage model.Usage
@@ -151,7 +181,7 @@ func Handler(c *gin.Context, textRequest model.GeneralOpenAIRequest, appId strin
 	domain, authUrl := getXunfeiAuthUrl(c, apiKey, apiSecret, textRequest.Model)
 	dataChan, stopChan, err := xunfeiMakeRequest(textRequest, domain, authUrl, appId)
 	if err != nil {
-		return openai.ErrorWrapper(err, "make xunfei request err", http.StatusInternalServerError), nil
+		return openai.ErrorWrapper(err, "xunfei_request_failed", http.StatusInternalServerError), nil
 	}
 	var usage model.Usage
 	var content string
@@ -171,11 +201,7 @@ func Handler(c *gin.Context, textRequest model.GeneralOpenAIRequest, appId strin
 		}
 	}
 	if len(xunfeiResponse.Payload.Choices.Text) == 0 {
-		xunfeiResponse.Payload.Choices.Text = []ChatResponseTextItem{
-			{
-				Content: "",
-			},
-		}
+		return openai.ErrorWrapper(err, "xunfei_empty_response_detected", http.StatusInternalServerError), nil
 	}

 	xunfeiResponse.Payload.Choices.Text[0].Content = content
@@ -202,15 +228,21 @@ func xunfeiMakeRequest(textRequest model.GeneralOpenAIRequest, domain, authUrl,
 	if err != nil {
 		return nil, nil, err
 	}
+	_, msg, err := conn.ReadMessage()
+	if err != nil {
+		return nil, nil, err
+	}

 	dataChan := make(chan ChatResponse)
 	stopChan := make(chan bool)
 	go func() {
 		for {
-			_, msg, err := conn.ReadMessage()
-			if err != nil {
-				logger.SysError("error reading stream response: " + err.Error())
-				break
+			if msg == nil {
+				_, msg, err = conn.ReadMessage()
+				if err != nil {
+					logger.SysError("error reading stream response: " + err.Error())
+					break
+				}
 			}
 			var response ChatResponse
 			err = json.Unmarshal(msg, &response)
@@ -218,6 +250,7 @@ func xunfeiMakeRequest(textRequest model.GeneralOpenAIRequest, domain, authUrl,
 				logger.SysError("error unmarshalling stream response: " + err.Error())
 				break
 			}
+			msg = nil
 			dataChan <- response
 			if response.Payload.Choices.Status == 2 {
 				err := conn.Close()


@@ -26,13 +26,18 @@ type ChatRequest struct {
 		Message struct {
 			Text []Message `json:"text"`
 		} `json:"message"`
+		Functions struct {
+			Text []model.Function `json:"text,omitempty"`
+		} `json:"functions"`
 	} `json:"payload"`
 }

 type ChatResponseTextItem struct {
 	Content string `json:"content"`
 	Role    string `json:"role"`
 	Index   int    `json:"index"`
+	ContentType  string          `json:"content_type"`
+	FunctionCall *model.Function `json:"function_call"`
 }

 type ChatResponse struct {


@@ -83,6 +83,24 @@ func RelayAudioHelper(c *gin.Context, relayMode int) *relaymodel.ErrorWithStatus
 			return openai.ErrorWrapper(err, "pre_consume_token_quota_failed", http.StatusForbidden)
 		}
 	}
+	succeed := false
+	defer func() {
+		if succeed {
+			return
+		}
+		if preConsumedQuota > 0 {
+			// we need to roll back the pre-consumed quota
+			defer func(ctx context.Context) {
+				go func() {
+					// negative means add quota back for token & user
+					err := model.PostConsumeTokenQuota(tokenId, -preConsumedQuota)
+					if err != nil {
+						logger.Error(ctx, fmt.Sprintf("error rollback pre-consumed quota: %s", err.Error()))
+					}
+				}()
+			}(c.Request.Context())
+		}
+	}()

 	// map model name
 	modelMapping := c.GetString("model_mapping")
@@ -104,10 +122,15 @@ func RelayAudioHelper(c *gin.Context, relayMode int) *relaymodel.ErrorWithStatus
 	}

 	fullRequestURL := util.GetFullRequestURL(baseURL, requestURL, channelType)
-	if relayMode == constant.RelayModeAudioTranscription && channelType == common.ChannelTypeAzure {
-		// https://learn.microsoft.com/en-us/azure/ai-services/openai/whisper-quickstart?tabs=command-line#rest-api
+	if channelType == common.ChannelTypeAzure {
 		apiVersion := util.GetAzureAPIVersion(c)
-		fullRequestURL = fmt.Sprintf("%s/openai/deployments/%s/audio/transcriptions?api-version=%s", baseURL, audioModel, apiVersion)
+		if relayMode == constant.RelayModeAudioTranscription {
+			// https://learn.microsoft.com/en-us/azure/ai-services/openai/whisper-quickstart?tabs=command-line#rest-api
+			fullRequestURL = fmt.Sprintf("%s/openai/deployments/%s/audio/transcriptions?api-version=%s", baseURL, audioModel, apiVersion)
+		} else if relayMode == constant.RelayModeAudioSpeech {
+			// https://learn.microsoft.com/en-us/azure/ai-services/openai/text-to-speech-quickstart?tabs=command-line#rest-api
+			fullRequestURL = fmt.Sprintf("%s/openai/deployments/%s/audio/speech?api-version=%s", baseURL, audioModel, apiVersion)
+		}
 	}

 	requestBody := &bytes.Buffer{}
@@ -123,7 +146,7 @@ func RelayAudioHelper(c *gin.Context, relayMode int) *relaymodel.ErrorWithStatus
 		return openai.ErrorWrapper(err, "new_request_failed", http.StatusInternalServerError)
 	}

-	if relayMode == constant.RelayModeAudioTranscription && channelType == common.ChannelTypeAzure {
+	if (relayMode == constant.RelayModeAudioTranscription || relayMode == constant.RelayModeAudioSpeech) && channelType == common.ChannelTypeAzure {
 		// https://learn.microsoft.com/en-us/azure/ai-services/openai/whisper-quickstart?tabs=command-line#rest-api
 		apiKey := c.Request.Header.Get("Authorization")
 		apiKey = strings.TrimPrefix(apiKey, "Bearer ")
@@ -188,20 +211,9 @@ func RelayAudioHelper(c *gin.Context, relayMode int) *relaymodel.ErrorWithStatus
 		resp.Body = io.NopCloser(bytes.NewBuffer(responseBody))
 	}
 	if resp.StatusCode != http.StatusOK {
-		if preConsumedQuota > 0 {
-			// we need to roll back the pre-consumed quota
-			defer func(ctx context.Context) {
-				go func() {
-					// negative means add quota back for token & user
-					err := model.PostConsumeTokenQuota(tokenId, -preConsumedQuota)
-					if err != nil {
-						logger.Error(ctx, fmt.Sprintf("error rollback pre-consumed quota: %s", err.Error()))
-					}
-				}()
-			}(c.Request.Context())
-		}
 		return util.RelayErrorHandler(resp)
 	}
+	succeed = true
 	quotaDelta := quota - preConsumedQuota
 	defer func(ctx context.Context) {
 		go util.PostConsumeQuota(ctx, tokenId, quotaDelta, quota, userId, channelId, modelRatio, groupRatio, audioModel, tokenName)
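The hunks above replace per-error-path refund code with a single deferred rollback, guarded by a `succeed` flag that is set only after the upstream response is accepted. The control-flow pattern in isolation, where `process` and `refund` are hypothetical stand-ins for the relay handler and the `model.PostConsumeTokenQuota` rollback:

```go
package main

import (
	"errors"
	"fmt"
)

// process pre-consumes quota before doing work; the deferred closure
// refunds it on every failure path, exactly once, without duplicating
// rollback code at each early return.
func process(fail bool, refund func()) error {
	succeed := false
	defer func() {
		if succeed {
			return // request completed; keep the pre-consumed quota
		}
		refund() // any early return below lands here
	}()

	if fail {
		return errors.New("upstream error") // rollback fires via defer
	}

	succeed = true // mark success only after the response is accepted
	return nil
}

func main() {
	refunds := 0
	_ = process(true, func() { refunds++ })  // failure path: refunded
	_ = process(false, func() { refunds++ }) // success path: not refunded
	fmt.Println(refunds)                     // prints 1
}
```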

View File

@@ -61,7 +61,7 @@ func RelayImageHelper(c *gin.Context, relayMode int) *relaymodel.ErrorWithStatus
 	if meta.ChannelType == common.ChannelTypeAzure {
 		// https://learn.microsoft.com/en-us/azure/ai-services/openai/dall-e-quickstart?tabs=dalle3%2Ccommand-line&pivots=rest-api
 		apiVersion := util.GetAzureAPIVersion(c)
-		// https://{resource_name}.openai.azure.com/openai/deployments/dall-e-3/images/generations?api-version=2023-06-01-preview
+		// https://{resource_name}.openai.azure.com/openai/deployments/dall-e-3/images/generations?api-version=2024-03-01-preview
 		fullRequestURL = fmt.Sprintf("%s/openai/deployments/%s/images/generations?api-version=%s", meta.BaseURL, imageRequest.Model, apiVersion)
 	}


@@ -5,25 +5,29 @@ type ResponseFormat struct {
 }

 type GeneralOpenAIRequest struct {
-	Model            string          `json:"model,omitempty"`
 	Messages         []Message       `json:"messages,omitempty"`
-	Prompt           any             `json:"prompt,omitempty"`
-	Stream           bool            `json:"stream,omitempty"`
-	MaxTokens        int             `json:"max_tokens,omitempty"`
-	Temperature      float64         `json:"temperature,omitempty"`
-	TopP             float64         `json:"top_p,omitempty"`
-	N                int             `json:"n,omitempty"`
-	Input            any             `json:"input,omitempty"`
-	Instruction      string          `json:"instruction,omitempty"`
-	Size             string          `json:"size,omitempty"`
-	Functions        any             `json:"functions,omitempty"`
+	Model            string          `json:"model,omitempty"`
 	FrequencyPenalty float64         `json:"frequency_penalty,omitempty"`
+	MaxTokens        int             `json:"max_tokens,omitempty"`
+	N                int             `json:"n,omitempty"`
 	PresencePenalty  float64         `json:"presence_penalty,omitempty"`
 	ResponseFormat   *ResponseFormat `json:"response_format,omitempty"`
 	Seed             float64         `json:"seed,omitempty"`
-	Tools            any             `json:"tools,omitempty"`
+	Stream           bool            `json:"stream,omitempty"`
+	Temperature      float64         `json:"temperature,omitempty"`
+	TopP             float64         `json:"top_p,omitempty"`
+	TopK             int             `json:"top_k,omitempty"`
+	Tools            []Tool          `json:"tools,omitempty"`
 	ToolChoice       any             `json:"tool_choice,omitempty"`
+	FunctionCall     any             `json:"function_call,omitempty"`
+	Functions        any             `json:"functions,omitempty"`
 	User             string          `json:"user,omitempty"`
+	Prompt           any             `json:"prompt,omitempty"`
+	Input            any             `json:"input,omitempty"`
+	EncodingFormat   string          `json:"encoding_format,omitempty"`
+	Dimensions       int             `json:"dimensions,omitempty"`
+	Instruction      string          `json:"instruction,omitempty"`
+	Size             string          `json:"size,omitempty"`
 }

 func (r GeneralOpenAIRequest) ParseInput() []string {


@@ -1,9 +1,10 @@
 package model

 type Message struct {
-	Role      string  `json:"role"`
-	Content   any     `json:"content"`
+	Role      string  `json:"role,omitempty"`
+	Content   any     `json:"content,omitempty"`
 	Name      *string `json:"name,omitempty"`
+	ToolCalls []Tool  `json:"tool_calls,omitempty"`
 }

 func (m Message) IsStringContent() bool {

relay/model/tool.go Normal file

@ -0,0 +1,14 @@
package model
type Tool struct {
Id string `json:"id,omitempty"`
Type string `json:"type"`
Function Function `json:"function"`
}
type Function struct {
Description string `json:"description,omitempty"`
Name string `json:"name"`
Parameters any `json:"parameters,omitempty"` // request
Arguments any `json:"arguments,omitempty"` // response
}


@ -9,7 +9,7 @@
1. 在 `web` 文件夹下新建一个文件夹,文件夹名为主题名。 1. 在 `web` 文件夹下新建一个文件夹,文件夹名为主题名。
2. 把你的主题文件放到这个文件夹下。 2. 把你的主题文件放到这个文件夹下。
3. 修改你的 `package.json` 文件,把 `build` 命令改为:`"build": "react-scripts build && mv -f build ../build/default"`,其中 `default` 为你的主题名。 3. 修改你的 `package.json` 文件,把 `build` 命令改为:`"build": "react-scripts build && mv -f build ../build/default"`,其中 `default` 为你的主题名。
4. 修改 `common/constants.go` 中的 `ValidThemes`,把你的主题名称注册进去。 4. 修改 `common/config/config.go` 中的 `ValidThemes`,把你的主题名称注册进去。
5. 修改 `web/THEMES` 文件,这里也需要同步修改。 5. 修改 `web/THEMES` 文件,这里也需要同步修改。
## 主题列表 ## 主题列表


@ -437,7 +437,7 @@ const ChannelsTable = () => {
if (success) { if (success) {
record.response_time = time * 1000; record.response_time = time * 1000;
record.test_time = Date.now() / 1000; record.test_time = Date.now() / 1000;
showInfo(`${record.name} 测试成功,耗时 ${time.toFixed(2)} 秒。`); showInfo(`${record.name} 测试成功,耗时 ${time.toFixed(2)} 秒。`);
} else { } else {
showError(message); showError(message);
} }
@ -447,7 +447,7 @@ const ChannelsTable = () => {
const res = await API.get(`/api/channel/test?scope=${scope}`); const res = await API.get(`/api/channel/test?scope=${scope}`);
const { success, message } = res.data; const { success, message } = res.data;
if (success) { if (success) {
showInfo('已成功开始测试通道,请刷新页面查看结果。'); showInfo('已成功开始测试渠道,请刷新页面查看结果。');
} else { } else {
showError(message); showError(message);
} }
@ -470,7 +470,7 @@ const ChannelsTable = () => {
if (success) { if (success) {
record.balance = balance; record.balance = balance;
record.balance_updated_time = Date.now() / 1000; record.balance_updated_time = Date.now() / 1000;
showInfo(`${record.name} 余额更新成功!`); showInfo(`${record.name} 余额更新成功!`);
} else { } else {
showError(message); showError(message);
} }
@ -481,7 +481,7 @@ const ChannelsTable = () => {
const res = await API.get(`/api/channel/update_balance`); const res = await API.get(`/api/channel/update_balance`);
const { success, message } = res.data; const { success, message } = res.data;
if (success) { if (success) {
showInfo('已更新完毕所有已启用通道余额!'); showInfo('已更新完毕所有已启用渠道余额!');
} else { } else {
showError(message); showError(message);
} }
@ -490,7 +490,7 @@ const ChannelsTable = () => {
const batchDeleteChannels = async () => { const batchDeleteChannels = async () => {
if (selectedChannels.length === 0) { if (selectedChannels.length === 0) {
showError('请先选择要删除的通道!'); showError('请先选择要删除的渠道!');
return; return;
} }
setLoading(true); setLoading(true);
@ -501,7 +501,7 @@ const ChannelsTable = () => {
const res = await API.post(`/api/channel/batch`, { ids: ids }); const res = await API.post(`/api/channel/batch`, { ids: ids });
const { success, message, data } = res.data; const { success, message, data } = res.data;
if (success) { if (success) {
showSuccess(`已删除 ${data} 个通道!`); showSuccess(`已删除 ${data} 个渠道!`);
await refresh(); await refresh();
} else { } else {
showError(message); showError(message);
@ -513,7 +513,7 @@ const ChannelsTable = () => {
const res = await API.post(`/api/channel/fix`); const res = await API.post(`/api/channel/fix`);
const { success, message, data } = res.data; const { success, message, data } = res.data;
if (success) { if (success) {
showSuccess(`已修复 ${data} 个通道!`); showSuccess(`已修复 ${data} 个渠道!`);
await refresh(); await refresh();
} else { } else {
showError(message); showError(message);
@ -633,7 +633,7 @@ const ChannelsTable = () => {
onConfirm={() => { testChannels("all") }} onConfirm={() => { testChannels("all") }}
position={isMobile() ? 'top' : 'left'} position={isMobile() ? 'top' : 'left'}
> >
<Button theme="light" type="warning" style={{ marginRight: 8 }}>测试所有通道</Button> <Button theme="light" type="warning" style={{ marginRight: 8 }}>测试所有渠道</Button>
</Popconfirm> </Popconfirm>
<Popconfirm <Popconfirm
title="确定?" title="确定?"
@ -648,16 +648,16 @@ const ChannelsTable = () => {
okType={'secondary'} okType={'secondary'}
onConfirm={updateAllChannelsBalance} onConfirm={updateAllChannelsBalance}
> >
<Button theme="light" type="secondary" style={{ marginRight: 8 }}>更新所有已启用通道余额</Button> <Button theme="light" type="secondary" style={{ marginRight: 8 }}>更新所有已启用渠道余额</Button>
</Popconfirm> */} </Popconfirm> */}
<Popconfirm <Popconfirm
title="确定是否要删除禁用通道?" title="确定是否要删除禁用渠道?"
content="此修改将不可逆" content="此修改将不可逆"
okType={'danger'} okType={'danger'}
onConfirm={deleteAllDisabledChannels} onConfirm={deleteAllDisabledChannels}
position={isMobile() ? 'top' : 'left'} position={isMobile() ? 'top' : 'left'}
> >
<Button theme="light" type="danger" style={{ marginRight: 8 }}>删除禁用通道</Button> <Button theme="light" type="danger" style={{ marginRight: 8 }}>删除禁用渠道</Button>
</Popconfirm> </Popconfirm>
<Button theme="light" type="primary" style={{ marginRight: 8 }} onClick={refresh}>刷新</Button> <Button theme="light" type="primary" style={{ marginRight: 8 }} onClick={refresh}>刷新</Button>
@ -673,7 +673,7 @@ const ChannelsTable = () => {
setEnableBatchDelete(v); setEnableBatchDelete(v);
}}></Switch> }}></Switch>
<Popconfirm <Popconfirm
title="确定是否要删除所选通道?" title="确定是否要删除所选渠道?"
content="此修改将不可逆" content="此修改将不可逆"
okType={'danger'} okType={'danger'}
onConfirm={batchDeleteChannels} onConfirm={batchDeleteChannels}
@ -681,7 +681,7 @@ const ChannelsTable = () => {
position={'top'} position={'top'}
> >
<Button disabled={!enableBatchDelete} theme="light" type="danger" <Button disabled={!enableBatchDelete} theme="light" type="danger"
style={{ marginRight: 8 }}>删除所选通道</Button> style={{ marginRight: 8 }}>删除所选渠道</Button>
</Popconfirm> </Popconfirm>
<Popconfirm <Popconfirm
title="确定是否要修复数据库一致性?" title="确定是否要修复数据库一致性?"


@ -261,7 +261,7 @@ const OperationSetting = () => {
value={inputs.ChannelDisableThreshold} value={inputs.ChannelDisableThreshold}
type='number' type='number'
min='0' min='0'
placeholder='单位秒,当运行通道全部测试时,超过此时间将自动禁用通道' placeholder='单位秒,当运行渠道全部测试时,超过此时间将自动禁用渠道'
/> />
<Form.Input <Form.Input
label='额度提醒阈值' label='额度提醒阈值'
@ -277,13 +277,13 @@ const OperationSetting = () => {
<Form.Group inline> <Form.Group inline>
<Form.Checkbox <Form.Checkbox
checked={inputs.AutomaticDisableChannelEnabled === 'true'} checked={inputs.AutomaticDisableChannelEnabled === 'true'}
label='失败时自动禁用通道' label='失败时自动禁用渠道'
name='AutomaticDisableChannelEnabled' name='AutomaticDisableChannelEnabled'
onChange={handleInputChange} onChange={handleInputChange}
/> />
<Form.Checkbox <Form.Checkbox
checked={inputs.AutomaticEnableChannelEnabled === 'true'} checked={inputs.AutomaticEnableChannelEnabled === 'true'}
label='成功时自动启用通道' label='成功时自动启用渠道'
name='AutomaticEnableChannelEnabled' name='AutomaticEnableChannelEnabled'
onChange={handleInputChange} onChange={handleInputChange}
/> />


@ -247,6 +247,8 @@ const TokensTable = () => {
const [editingToken, setEditingToken] = useState({ const [editingToken, setEditingToken] = useState({
id: undefined id: undefined
}); });
const [orderBy, setOrderBy] = useState('');
const [dropdownVisible, setDropdownVisible] = useState(false);
const closeEdit = () => { const closeEdit = () => {
setShowEdit(false); setShowEdit(false);
@ -269,7 +271,7 @@ const TokensTable = () => {
let pageData = tokens.slice((activePage - 1) * pageSize, activePage * pageSize); let pageData = tokens.slice((activePage - 1) * pageSize, activePage * pageSize);
const loadTokens = async (startIdx) => { const loadTokens = async (startIdx) => {
setLoading(true); setLoading(true);
const res = await API.get(`/api/token/?p=${startIdx}&size=${pageSize}`); const res = await API.get(`/api/token/?p=${startIdx}&size=${pageSize}&order=${orderBy}`);
const { success, message, data } = res.data; const { success, message, data } = res.data;
if (success) { if (success) {
if (startIdx === 0) { if (startIdx === 0) {
@ -289,7 +291,7 @@ const TokensTable = () => {
(async () => { (async () => {
if (activePage === Math.ceil(tokens.length / pageSize) + 1) { if (activePage === Math.ceil(tokens.length / pageSize) + 1) {
// In this case we have to load more data and then append them. // In this case we have to load more data and then append them.
await loadTokens(activePage - 1); await loadTokens(activePage - 1, orderBy);
} }
setActivePage(activePage); setActivePage(activePage);
})(); })();
@ -392,12 +394,12 @@ const TokensTable = () => {
}; };
useEffect(() => { useEffect(() => {
loadTokens(0) loadTokens(0, orderBy)
.then() .then()
.catch((reason) => { .catch((reason) => {
showError(reason); showError(reason);
}); });
}, [pageSize]); }, [pageSize, orderBy]);
const removeRecord = key => { const removeRecord = key => {
let newDataSource = [...tokens]; let newDataSource = [...tokens];
@ -452,6 +454,7 @@ const TokensTable = () => {
// if keyword is blank, load files instead. // if keyword is blank, load files instead.
await loadTokens(0); await loadTokens(0);
setActivePage(1); setActivePage(1);
setOrderBy('');
return; return;
} }
setSearching(true); setSearching(true);
@ -520,6 +523,23 @@ const TokensTable = () => {
} }
}; };
const handleOrderByChange = (e, { value }) => {
setOrderBy(value);
setActivePage(1);
setDropdownVisible(false);
};
const renderSelectedOption = (orderBy) => {
switch (orderBy) {
case 'remain_quota':
return '按剩余额度排序';
case 'used_quota':
return '按已用额度排序';
default:
return '默认排序';
}
};
return ( return (
<> <>
<EditToken refresh={refresh} editingToken={editingToken} visiable={showEdit} handleClose={closeEdit}></EditToken> <EditToken refresh={refresh} editingToken={editingToken} visiable={showEdit} handleClose={closeEdit}></EditToken>
@ -579,6 +599,21 @@ const TokensTable = () => {
await copyText(keys); await copyText(keys);
} }
}>复制所选令牌到剪贴板</Button> }>复制所选令牌到剪贴板</Button>
<Dropdown
trigger="click"
position="bottomLeft"
visible={dropdownVisible}
onVisibleChange={(visible) => setDropdownVisible(visible)}
render={
<Dropdown.Menu>
<Dropdown.Item onClick={() => handleOrderByChange('', { value: '' })}>默认排序</Dropdown.Item>
<Dropdown.Item onClick={() => handleOrderByChange('', { value: 'remain_quota' })}>按剩余额度排序</Dropdown.Item>
<Dropdown.Item onClick={() => handleOrderByChange('', { value: 'used_quota' })}>按已用额度排序</Dropdown.Item>
</Dropdown.Menu>
}
>
<Button style={{ marginLeft: '10px' }}>{renderSelectedOption(orderBy)}</Button>
</Dropdown>
</> </>
); );
}; };


@ -1,6 +1,6 @@
import React, { useEffect, useState } from 'react'; import React, { useEffect, useState } from 'react';
import { API, showError, showSuccess } from '../helpers'; import { API, showError, showSuccess } from '../helpers';
import { Button, Form, Popconfirm, Space, Table, Tag, Tooltip } from '@douyinfe/semi-ui'; import { Button, Form, Popconfirm, Space, Table, Tag, Tooltip, Dropdown } from '@douyinfe/semi-ui';
import { ITEMS_PER_PAGE } from '../constants'; import { ITEMS_PER_PAGE } from '../constants';
import { renderGroup, renderNumber, renderQuota } from '../helpers/render'; import { renderGroup, renderNumber, renderQuota } from '../helpers/render';
import AddUser from '../pages/User/AddUser'; import AddUser from '../pages/User/AddUser';
@ -139,6 +139,8 @@ const UsersTable = () => {
const [editingUser, setEditingUser] = useState({ const [editingUser, setEditingUser] = useState({
id: undefined id: undefined
}); });
const [orderBy, setOrderBy] = useState('');
const [dropdownVisible, setDropdownVisible] = useState(false);
const setCount = (data) => { const setCount = (data) => {
if (data.length >= (activePage) * ITEMS_PER_PAGE) { if (data.length >= (activePage) * ITEMS_PER_PAGE) {
@ -162,7 +164,7 @@ const UsersTable = () => {
}; };
const loadUsers = async (startIdx) => { const loadUsers = async (startIdx) => {
const res = await API.get(`/api/user/?p=${startIdx}`); const res = await API.get(`/api/user/?p=${startIdx}&order=${orderBy}`);
const { success, message, data } = res.data; const { success, message, data } = res.data;
if (success) { if (success) {
if (startIdx === 0) { if (startIdx === 0) {
@ -184,19 +186,19 @@ const UsersTable = () => {
(async () => { (async () => {
if (activePage === Math.ceil(users.length / ITEMS_PER_PAGE) + 1) { if (activePage === Math.ceil(users.length / ITEMS_PER_PAGE) + 1) {
// In this case we have to load more data and then append them. // In this case we have to load more data and then append them.
await loadUsers(activePage - 1); await loadUsers(activePage - 1, orderBy);
} }
setActivePage(activePage); setActivePage(activePage);
})(); })();
}; };
useEffect(() => { useEffect(() => {
loadUsers(0) loadUsers(0, orderBy)
.then() .then()
.catch((reason) => { .catch((reason) => {
showError(reason); showError(reason);
}); });
}, []); }, [orderBy]);
const manageUser = async (username, action, record) => { const manageUser = async (username, action, record) => {
const res = await API.post('/api/user/manage', { const res = await API.post('/api/user/manage', {
@ -239,6 +241,7 @@ const UsersTable = () => {
// if keyword is blank, load files instead. // if keyword is blank, load files instead.
await loadUsers(0); await loadUsers(0);
setActivePage(1); setActivePage(1);
setOrderBy('');
return; return;
} }
setSearching(true); setSearching(true);
@ -301,6 +304,25 @@ const UsersTable = () => {
} }
}; };
const handleOrderByChange = (e, { value }) => {
setOrderBy(value);
setActivePage(1);
setDropdownVisible(false);
};
const renderSelectedOption = (orderBy) => {
switch (orderBy) {
case 'quota':
return '按剩余额度排序';
case 'used_quota':
return '按已用额度排序';
case 'request_count':
return '按请求次数排序';
default:
return '默认排序';
}
};
return ( return (
<> <>
<AddUser refresh={refresh} visible={showAddUser} handleClose={closeAddUser}></AddUser> <AddUser refresh={refresh} visible={showAddUser} handleClose={closeAddUser}></AddUser>
@ -331,6 +353,22 @@ const UsersTable = () => {
setShowAddUser(true); setShowAddUser(true);
} }
}>添加用户</Button> }>添加用户</Button>
<Dropdown
trigger="click"
position="bottomLeft"
visible={dropdownVisible}
onVisibleChange={(visible) => setDropdownVisible(visible)}
render={
<Dropdown.Menu>
<Dropdown.Item onClick={() => handleOrderByChange('', { value: '' })}>默认排序</Dropdown.Item>
<Dropdown.Item onClick={() => handleOrderByChange('', { value: 'quota' })}>按剩余额度排序</Dropdown.Item>
<Dropdown.Item onClick={() => handleOrderByChange('', { value: 'used_quota' })}>按已用额度排序</Dropdown.Item>
<Dropdown.Item onClick={() => handleOrderByChange('', { value: 'request_count' })}>按请求次数排序</Dropdown.Item>
</Dropdown.Menu>
}
>
<Button style={{ marginLeft: '10px' }}>{renderSelectedOption(orderBy)}</Button>
</Dropdown>
</> </>
); );
}; };


@ -230,7 +230,7 @@ const EditChannel = (props) => {
localInputs.base_url = localInputs.base_url.slice(0, localInputs.base_url.length - 1); localInputs.base_url = localInputs.base_url.slice(0, localInputs.base_url.length - 1);
} }
if (localInputs.type === 3 && localInputs.other === '') { if (localInputs.type === 3 && localInputs.other === '') {
localInputs.other = '2023-06-01-preview'; localInputs.other = '2024-03-01-preview';
} }
if (localInputs.type === 18 && localInputs.other === '') { if (localInputs.type === 18 && localInputs.other === '') {
localInputs.other = 'v2.1'; localInputs.other = 'v2.1';
@ -348,7 +348,7 @@ const EditChannel = (props) => {
<Input <Input
label='默认 API 版本' label='默认 API 版本'
name='azure_other' name='azure_other'
placeholder={'请输入默认 API 版本例如2023-06-01-preview该配置可以被实际的请求查询参数所覆盖'} placeholder={'请输入默认 API 版本例如2024-03-01-preview该配置可以被实际的请求查询参数所覆盖'}
onChange={value => { onChange={value => {
handleInputChange('other', value) handleInputChange('other', value)
}} }}


@ -49,7 +49,7 @@ const typeConfig = {
base_url: "请填写AZURE_OPENAI_ENDPOINT", base_url: "请填写AZURE_OPENAI_ENDPOINT",
// 注意:通过判断 `other` 是否有值来判断是否需要显示 `other` 输入框, 默认是没有值的 // 注意:通过判断 `other` 是否有值来判断是否需要显示 `other` 输入框, 默认是没有值的
other: "请输入默认API版本例如2023-06-01-preview", other: "请输入默认API版本例如2024-03-01-preview",
}, },
modelGroup: "openai", // 模型组名称,这个值是给 填入渠道支持模型 按钮使用的。 填入渠道支持模型 按钮会根据这个值来获取模型组,如果填写默认是 openai modelGroup: "openai", // 模型组名称,这个值是给 填入渠道支持模型 按钮使用的。 填入渠道支持模型 按钮会根据这个值来获取模型组,如果填写默认是 openai
}, },

Binary image file changed (40 KiB → 4.2 KiB)


@ -3,186 +3,186 @@ export const CHANNEL_OPTIONS = {
key: 1, key: 1,
text: 'OpenAI', text: 'OpenAI',
value: 1, value: 1,
color: 'primary' color: 'success'
}, },
14: { 14: {
key: 14, key: 14,
text: 'Anthropic Claude', text: 'Anthropic Claude',
value: 14, value: 14,
color: 'info' color: 'primary'
}, },
3: { 3: {
key: 3, key: 3,
text: 'Azure OpenAI', text: 'Azure OpenAI',
value: 3, value: 3,
color: 'secondary' color: 'success'
}, },
11: { 11: {
key: 11, key: 11,
text: 'Google PaLM2', text: 'Google PaLM2',
value: 11, value: 11,
color: 'orange' color: 'warning'
}, },
24: { 24: {
key: 24, key: 24,
text: 'Google Gemini', text: 'Google Gemini',
value: 24, value: 24,
color: 'orange' color: 'warning'
}, },
28: { 28: {
key: 28, key: 28,
text: 'Mistral AI', text: 'Mistral AI',
value: 28, value: 28,
color: 'orange' color: 'warning'
}, },
15: { 15: {
key: 15, key: 15,
text: '百度文心千帆', text: '百度文心千帆',
value: 15, value: 15,
color: 'default' color: 'primary'
}, },
17: { 17: {
key: 17, key: 17,
text: '阿里通义千问', text: '阿里通义千问',
value: 17, value: 17,
color: 'default' color: 'primary'
}, },
18: { 18: {
key: 18, key: 18,
text: '讯飞星火认知', text: '讯飞星火认知',
value: 18, value: 18,
color: 'default' color: 'primary'
}, },
16: { 16: {
key: 16, key: 16,
text: '智谱 ChatGLM', text: '智谱 ChatGLM',
value: 16, value: 16,
color: 'default' color: 'primary'
}, },
19: { 19: {
key: 19, key: 19,
text: '360 智脑', text: '360 智脑',
value: 19, value: 19,
color: 'default' color: 'primary'
}, },
25: { 25: {
key: 25, key: 25,
text: 'Moonshot AI', text: 'Moonshot AI',
value: 25, value: 25,
color: 'default' color: 'primary'
}, },
23: { 23: {
key: 23, key: 23,
text: '腾讯混元', text: '腾讯混元',
value: 23, value: 23,
color: 'default' color: 'primary'
}, },
26: { 26: {
key: 26, key: 26,
text: '百川大模型', text: '百川大模型',
value: 26, value: 26,
color: 'default' color: 'primary'
}, },
27: { 27: {
key: 27, key: 27,
text: 'MiniMax', text: 'MiniMax',
value: 27, value: 27,
color: 'default' color: 'primary'
}, },
29: { 29: {
key: 29, key: 29,
text: 'Groq', text: 'Groq',
value: 29, value: 29,
color: 'default' color: 'primary'
}, },
30: { 30: {
key: 30, key: 30,
text: 'Ollama', text: 'Ollama',
value: 30, value: 30,
color: 'default' color: 'primary'
}, },
31: { 31: {
key: 31, key: 31,
text: '零一万物', text: '零一万物',
value: 31, value: 31,
color: 'default' color: 'primary'
}, },
8: { 8: {
key: 8, key: 8,
text: '自定义渠道', text: '自定义渠道',
value: 8, value: 8,
color: 'primary' color: 'error'
}, },
22: { 22: {
key: 22, key: 22,
text: '知识库FastGPT', text: '知识库FastGPT',
value: 22, value: 22,
color: 'default' color: 'success'
}, },
21: { 21: {
key: 21, key: 21,
text: '知识库AI Proxy', text: '知识库AI Proxy',
value: 21, value: 21,
color: 'purple' color: 'success'
}, },
20: { 20: {
key: 20, key: 20,
text: '代理OpenRouter', text: '代理OpenRouter',
value: 20, value: 20,
color: 'primary' color: 'success'
}, },
2: { 2: {
key: 2, key: 2,
text: '代理API2D', text: '代理API2D',
value: 2, value: 2,
color: 'primary' color: 'success'
}, },
5: { 5: {
key: 5, key: 5,
text: '代理OpenAI-SB', text: '代理OpenAI-SB',
value: 5, value: 5,
color: 'primary' color: 'success'
}, },
7: { 7: {
key: 7, key: 7,
text: '代理OhMyGPT', text: '代理OhMyGPT',
value: 7, value: 7,
color: 'primary' color: 'success'
}, },
10: { 10: {
key: 10, key: 10,
text: '代理AI Proxy', text: '代理AI Proxy',
value: 10, value: 10,
color: 'primary' color: 'success'
}, },
4: { 4: {
key: 4, key: 4,
text: '代理CloseAI', text: '代理CloseAI',
value: 4, value: 4,
color: 'primary' color: 'success'
}, },
6: { 6: {
key: 6, key: 6,
text: '代理OpenAI Max', text: '代理OpenAI Max',
value: 6, value: 6,
color: 'primary' color: 'success'
}, },
9: { 9: {
key: 9, key: 9,
text: '代理AI.LS', text: '代理AI.LS',
value: 9, value: 9,
color: 'primary' color: 'success'
}, },
12: { 12: {
key: 12, key: 12,
text: '代理API2GPT', text: '代理API2GPT',
value: 12, value: 12,
color: 'primary' color: 'success'
}, },
13: { 13: {
key: 13, key: 13,
text: '代理AIGC2D', text: '代理AIGC2D',
value: 13, value: 13,
color: 'primary' color: 'success'
} }
}; };


@ -51,7 +51,7 @@ const Register = () => {
<Grid item xs={12}> <Grid item xs={12}>
<Grid item container direction="column" alignItems="center" xs={12}> <Grid item container direction="column" alignItems="center" xs={12}>
<Typography component={Link} to="/login" variant="subtitle1" sx={{ textDecoration: 'none' }}> <Typography component={Link} to="/login" variant="subtitle1" sx={{ textDecoration: 'none' }}>
已经有帐号了?点击登录 已经有帐号了?点击登录
</Typography> </Typography>
</Grid> </Grid>
</Grid> </Grid>


@ -180,7 +180,7 @@ const LoginForm = ({ ...others }) => {
{({ errors, handleBlur, handleChange, handleSubmit, isSubmitting, touched, values }) => ( {({ errors, handleBlur, handleChange, handleSubmit, isSubmitting, touched, values }) => (
<form noValidate onSubmit={handleSubmit} {...others}> <form noValidate onSubmit={handleSubmit} {...others}>
<FormControl fullWidth error={Boolean(touched.username && errors.username)} sx={{ ...theme.typography.customInput }}> <FormControl fullWidth error={Boolean(touched.username && errors.username)} sx={{ ...theme.typography.customInput }}>
<InputLabel htmlFor="outlined-adornment-username-login">用户名</InputLabel> <InputLabel htmlFor="outlined-adornment-username-login">用户名 / 邮箱</InputLabel>
<OutlinedInput <OutlinedInput
id="outlined-adornment-username-login" id="outlined-adornment-username-login"
type="text" type="text"


@ -296,7 +296,7 @@ const RegisterForm = ({ ...others }) => {
<Box sx={{ mt: 2 }}> <Box sx={{ mt: 2 }}>
<AnimateButton> <AnimateButton>
<Button disableElevation disabled={isSubmitting} fullWidth size="large" type="submit" variant="contained" color="primary"> <Button disableElevation disabled={isSubmitting} fullWidth size="large" type="submit" variant="contained" color="primary">
Sign up 注册
</Button> </Button>
</AnimateButton> </AnimateButton>
</Box> </Box>


@ -3,6 +3,19 @@ import Label from "ui-component/Label";
import Stack from "@mui/material/Stack"; import Stack from "@mui/material/Stack";
import Divider from "@mui/material/Divider"; import Divider from "@mui/material/Divider";
function name2color(name) {
switch (name) {
case "default":
return "info";
case "vip":
return "warning"
case "svip":
return "error"
default:
return "info"
}
}
const GroupLabel = ({ group }) => { const GroupLabel = ({ group }) => {
let groups = []; let groups = [];
if (group === "") { if (group === "") {
@ -14,7 +27,7 @@ const GroupLabel = ({ group }) => {
return ( return (
<Stack divider={<Divider orientation="vertical" flexItem />} spacing={0.5}> <Stack divider={<Divider orientation="vertical" flexItem />} spacing={0.5}>
{groups.map((group, index) => { {groups.map((group, index) => {
return <Label key={index}>{group}</Label>; return <Label key={index} color={name2color(group)}>{group}</Label>;
})} })}
</Stack> </Stack>
); );


@ -10,6 +10,7 @@ const ChannelTableHead = () => {
<TableCell>类型</TableCell> <TableCell>类型</TableCell>
<TableCell>状态</TableCell> <TableCell>状态</TableCell>
<TableCell>响应时间</TableCell> <TableCell>响应时间</TableCell>
<TableCell>已消耗</TableCell>
<TableCell>余额</TableCell> <TableCell>余额</TableCell>
<TableCell>优先级</TableCell> <TableCell>优先级</TableCell>
<TableCell>操作</TableCell> <TableCell>操作</TableCell>


@ -93,7 +93,7 @@ export default function ChannelTableRow({
test_time: Date.now() / 1000, test_time: Date.now() / 1000,
response_time: time * 1000, response_time: time * 1000,
}); });
showInfo(`${item.name} 测试成功,耗时 ${time.toFixed(2)} 秒。`); showInfo(`${item.name} 测试成功,耗时 ${time.toFixed(2)} 秒。`);
} }
}; };
@ -170,6 +170,9 @@ export default function ChannelTableRow({
handle_action={handleResponseTime} handle_action={handleResponseTime}
/> />
</TableCell> </TableCell>
<TableCell>
{renderNumber(item.used_quota)}
</TableCell>
<TableCell> <TableCell>
<Tooltip <Tooltip
title={"点击更新余额"} title={"点击更新余额"}
@ -240,9 +243,9 @@ export default function ChannelTableRow({
</Popover> </Popover>
<Dialog open={openDelete} onClose={handleDeleteClose}> <Dialog open={openDelete} onClose={handleDeleteClose}>
<DialogTitle>删除</DialogTitle> <DialogTitle>删除</DialogTitle>
<DialogContent> <DialogContent>
<DialogContentText>是否删除{item.name}</DialogContentText> <DialogContentText>是否删除{item.name}</DialogContentText>
</DialogContent> </DialogContent>
<DialogActions> <DialogActions>
<Button onClick={handleDeleteClose}>关闭</Button> <Button onClick={handleDeleteClose}>关闭</Button>


@ -135,7 +135,7 @@ export default function ChannelPage() {
const res = await API.get(`/api/channel/test`); const res = await API.get(`/api/channel/test`);
const { success, message } = res.data; const { success, message } = res.data;
if (success) { if (success) {
showInfo('已成功开始测试所有通道,请刷新页面查看结果。'); showInfo('已成功开始测试所有渠道,请刷新页面查看结果。');
} else { } else {
showError(message); showError(message);
} }
@ -159,7 +159,7 @@ export default function ChannelPage() {
const res = await API.get(`/api/channel/update_balance`); const res = await API.get(`/api/channel/update_balance`);
const { success, message } = res.data; const { success, message } = res.data;
if (success) { if (success) {
showInfo('已更新完毕所有已启用通道余额!'); showInfo('已更新完毕所有已启用渠道余额!');
} else { } else {
showError(message); showError(message);
} }
@ -193,20 +193,14 @@ export default function ChannelPage() {
return ( return (
<> <>
<Stack direction="row" alignItems="center" justifyContent="space-between" mb={5}> <Stack direction="row" alignItems="center" justifyContent="space-between" mb={2.5}>
<Typography variant="h4">渠道</Typography> <Typography variant="h4">渠道</Typography>
<Button variant="contained" color="primary" startIcon={<IconPlus />} onClick={() => handleOpenModal(0)}> <Button variant="contained" color="primary" startIcon={<IconPlus />} onClick={() => handleOpenModal(0)}>
新建渠道 新建渠道
</Button> </Button>
</Stack> </Stack>
<Stack mb={5}>
<Alert severity="info">
OpenAI 渠道已经不再支持通过 key 获取余额,因此余额显示为 0,对于支持的渠道类型,请点击余额进行刷新。
</Alert>
</Stack>
<Card> <Card>
<Box component="form" onSubmit={searchChannels} noValidate> <Box component="form" onSubmit={searchChannels} noValidate sx={{marginTop: 2}}>
<TableToolBar filterName={searchKeyword} handleFilterName={handleSearchKeyword} placeholder={'搜索渠道的 ID名称和密钥 ...'} /> <TableToolBar filterName={searchKeyword} handleFilterName={handleSearchKeyword} placeholder={'搜索渠道的 ID名称和密钥 ...'} />
</Box> </Box>
<Toolbar <Toolbar
@ -220,7 +214,7 @@ export default function ChannelPage() {
> >
<Container> <Container>
{matchUpMd ? ( {matchUpMd ? (
<ButtonGroup variant="outlined" aria-label="outlined small primary button group"> <ButtonGroup variant="outlined" aria-label="outlined small primary button group" sx={{marginBottom: 2}}>
<Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}> <Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}>
刷新 刷新
</Button> </Button>


@ -41,7 +41,7 @@ const typeConfig = {
}, },
prompt: { prompt: {
base_url: "请填写AZURE_OPENAI_ENDPOINT", base_url: "请填写AZURE_OPENAI_ENDPOINT",
other: "请输入默认API版本例如2023-06-01-preview", other: "请输入默认API版本例如2024-03-01-preview",
}, },
}, },
11: { 11: {


@ -65,7 +65,7 @@ const StatisticalLineChartCard = ({ isLoading, title, chartData, todayValue }) =
) : ( ) : (
<CardWrapper border={false} content={false}> <CardWrapper border={false} content={false}>
<Box sx={{ p: 2.25 }}> <Box sx={{ p: 2.25 }}>
<Grid container direction="column"> <Grid>
<Grid item sx={{ mb: 0.75 }}> <Grid item sx={{ mb: 0.75 }}>
<Grid container alignItems="center"> <Grid container alignItems="center">
<Grid item xs={6}> <Grid item xs={6}>


@ -102,11 +102,11 @@ export default function Log() {
return ( return (
<> <>
<Stack direction="row" alignItems="center" justifyContent="space-between" mb={5}> <Stack direction="row" alignItems="center" justifyContent="space-between" mb={2.5}>
<Typography variant="h4">日志</Typography> <Typography variant="h4">日志</Typography>
</Stack> </Stack>
<Card> <Card>
<Box component="form" onSubmit={searchLogs} noValidate> <Box component="form" onSubmit={searchLogs} noValidate sx={{marginTop: 2}}>
<TableToolBar filterName={searchKeyword} handleFilterName={handleSearchKeyword} userIsAdmin={userIsAdmin} /> <TableToolBar filterName={searchKeyword} handleFilterName={handleSearchKeyword} userIsAdmin={userIsAdmin} />
</Box> </Box>
<Toolbar <Toolbar
@ -119,7 +119,7 @@ export default function Log() {
}} }}
> >
<Container> <Container>
<ButtonGroup variant="outlined" aria-label="outlined small primary button group"> <ButtonGroup variant="outlined" aria-label="outlined small primary button group" sx={{marginBottom: 2}}>
<Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}> <Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}>
刷新/清除搜索条件 刷新/清除搜索条件
</Button> </Button>


@ -141,7 +141,7 @@ export default function Redemption() {
return ( return (
<> <>
<Stack direction="row" alignItems="center" justifyContent="space-between" mb={5}> <Stack direction="row" alignItems="center" justifyContent="space-between" mb={2.5}>
<Typography variant="h4">兑换</Typography> <Typography variant="h4">兑换</Typography>
<Button variant="contained" color="primary" startIcon={<IconPlus />} onClick={() => handleOpenModal(0)}> <Button variant="contained" color="primary" startIcon={<IconPlus />} onClick={() => handleOpenModal(0)}>
@ -149,7 +149,7 @@ export default function Redemption() {
</Button> </Button>
</Stack> </Stack>
<Card> <Card>
<Box component="form" onSubmit={searchRedemptions} noValidate> <Box component="form" onSubmit={searchRedemptions} noValidate sx={{marginTop: 2}}>
<TableToolBar filterName={searchKeyword} handleFilterName={handleSearchKeyword} placeholder={'搜索兑换码的ID和名称...'} /> <TableToolBar filterName={searchKeyword} handleFilterName={handleSearchKeyword} placeholder={'搜索兑换码的ID和名称...'} />
</Box> </Box>
<Toolbar <Toolbar
@ -162,7 +162,7 @@ export default function Redemption() {
}} }}
> >
<Container> <Container>
<ButtonGroup variant="outlined" aria-label="outlined small primary button group"> <ButtonGroup variant="outlined" aria-label="outlined small primary button group" sx={{marginBottom: 2}}>
<Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}> <Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}>
刷新 刷新
</Button> </Button>


@ -371,7 +371,7 @@ const OperationSetting = () => {
value={inputs.ChannelDisableThreshold} value={inputs.ChannelDisableThreshold}
onChange={handleInputChange} onChange={handleInputChange}
label="最长响应时间" label="最长响应时间"
placeholder="单位秒,当运行通道全部测试时,超过此时间将自动禁用通道" placeholder="单位秒,当运行渠道全部测试时,超过此时间将自动禁用渠道"
disabled={loading} disabled={loading}
/> />
</FormControl> </FormControl>
@ -392,7 +392,7 @@ const OperationSetting = () => {
</FormControl> </FormControl>
</Stack> </Stack>
<FormControlLabel <FormControlLabel
label="失败时自动禁用通道" label="失败时自动禁用渠道"
control={ control={
<Checkbox <Checkbox
checked={inputs.AutomaticDisableChannelEnabled === "true"} checked={inputs.AutomaticDisableChannelEnabled === "true"}
@ -402,7 +402,7 @@ const OperationSetting = () => {
} }
/> />
<FormControlLabel <FormControlLabel
label="成功时自动启用道" label="成功时自动启用道"
control={ control={
<Checkbox <Checkbox
checked={inputs.AutomaticEnableChannelEnabled === "true"} checked={inputs.AutomaticEnableChannelEnabled === "true"}


@@ -141,9 +141,8 @@ export default function Token() {
 return (
 <>
-<Stack direction="row" alignItems="center" justifyContent="space-between" mb={5}>
+<Stack direction="row" alignItems="center" justifyContent="space-between" mb={2.5}>
 <Typography variant="h4">令牌</Typography>
 <Button
 variant="contained"
 color="primary"
@@ -155,13 +154,13 @@ export default function Token() {
 新建令牌
 </Button>
 </Stack>
-<Stack mb={5}>
+<Stack mb={2}>
 <Alert severity="info">
 OpenAI API 基础地址 https://api.openai.com 替换为 <b>{siteInfo.server_address}</b>,复制下面的密钥即可使用
 </Alert>
 </Stack>
 <Card>
-<Box component="form" onSubmit={searchTokens} noValidate>
+<Box component="form" onSubmit={searchTokens} noValidate sx={{marginTop: 2}}>
 <TableToolBar filterName={searchKeyword} handleFilterName={handleSearchKeyword} placeholder={'搜索令牌的名称...'} />
 </Box>
 <Toolbar
@@ -174,7 +173,7 @@ export default function Token() {
 }}
 >
 <Container>
-<ButtonGroup variant="outlined" aria-label="outlined small primary button group">
+<ButtonGroup variant="outlined" aria-label="outlined small primary button group" sx={{marginBottom: 2}}>
 <Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}>
 刷新
 </Button>


@@ -139,7 +139,7 @@ export default function Users() {
 return (
 <>
-<Stack direction="row" alignItems="center" justifyContent="space-between" mb={5}>
+<Stack direction="row" alignItems="center" justifyContent="space-between" mb={2.5}>
 <Typography variant="h4">用户</Typography>
 <Button variant="contained" color="primary" startIcon={<IconPlus />} onClick={() => handleOpenModal(0)}>
@@ -147,7 +147,7 @@ export default function Users() {
 </Button>
 </Stack>
 <Card>
-<Box component="form" onSubmit={searchUsers} noValidate>
+<Box component="form" onSubmit={searchUsers} noValidate sx={{marginTop: 2}}>
 <TableToolBar
 filterName={searchKeyword}
 handleFilterName={handleSearchKeyword}
@@ -164,7 +164,7 @@ export default function Users() {
 }}
 >
 <Container>
-<ButtonGroup variant="outlined" aria-label="outlined small primary button group">
+<ButtonGroup variant="outlined" aria-label="outlined small primary button group" sx={{marginBottom: 2}}>
 <Button onClick={handleRefresh} startIcon={<IconRefresh width={'18px'} />}>
 刷新
 </Button>


@@ -234,7 +234,7 @@ const ChannelsTable = () => {
 newChannels[realIdx].response_time = time * 1000;
 newChannels[realIdx].test_time = Date.now() / 1000;
 setChannels(newChannels);
-showInfo(`通道 ${name} 测试成功,耗时 ${time.toFixed(2)} 秒。`);
+showInfo(`渠道 ${name} 测试成功,耗时 ${time.toFixed(2)} 秒。`);
 } else {
 showError(message);
 }
@@ -244,7 +244,7 @@ const ChannelsTable = () => {
 const res = await API.get(`/api/channel/test?scope=${scope}`);
 const { success, message } = res.data;
 if (success) {
-showInfo('已成功开始测试通道,请刷新页面查看结果。');
+showInfo('已成功开始测试渠道,请刷新页面查看结果。');
 } else {
 showError(message);
 }
@@ -270,7 +270,7 @@ const ChannelsTable = () => {
 newChannels[realIdx].balance = balance;
 newChannels[realIdx].balance_updated_time = Date.now() / 1000;
 setChannels(newChannels);
-showInfo(`通道 ${name} 余额更新成功!`);
+showInfo(`渠道 ${name} 余额更新成功!`);
 } else {
 showError(message);
 }
@@ -281,7 +281,7 @@ const ChannelsTable = () => {
 const res = await API.get(`/api/channel/update_balance`);
 const { success, message } = res.data;
 if (success) {
-showInfo('已更新完毕所有已启用通道余额!');
+showInfo('已更新完毕所有已启用渠道余额!');
 } else {
 showError(message);
 }
@@ -333,6 +333,8 @@ const ChannelsTable = () => {
 setPromptShown("channel-test");
 }}>
 OpenAI 渠道已经不再支持通过 key 获取余额,因此余额显示为 0。对于支持的渠道类型,请点击余额进行刷新。
+<br/>
+渠道测试仅支持 chat 模型,优先使用 gpt-3.5-turbo,如果该模型不可用则使用你所配置的模型列表中的第一个模型。
 </Message>
 )
 }
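The notice added in the last hunk describes the channel-test model selection rule: only chat models are tested, gpt-3.5-turbo is preferred, and otherwise the first model in the channel's configured list is used. A minimal sketch of that rule (the `pickTestModel` helper is hypothetical, not a function from the repository):

```javascript
// Pick the model used for a channel test, per the rule stated in the notice:
// prefer gpt-3.5-turbo, otherwise fall back to the first configured model.
function pickTestModel(configuredModels) {
  if (!Array.isArray(configuredModels) || configuredModels.length === 0) {
    return null; // nothing configured, nothing to test
  }
  if (configuredModels.includes('gpt-3.5-turbo')) {
    return 'gpt-3.5-turbo';
  }
  return configuredModels[0];
}

console.log(pickTestModel(['gpt-4', 'gpt-3.5-turbo'])); // -> 'gpt-3.5-turbo'
console.log(pickTestModel(['claude-3-opus', 'claude-3-haiku'])); // -> 'claude-3-opus'
```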


@@ -94,7 +94,7 @@ const LoginForm = () => {
 fluid
 icon='user'
 iconPosition='left'
-placeholder='用户名'
+placeholder='用户名 / 邮箱地址'
 name='username'
 value={username}
 onChange={handleChange}


@@ -261,7 +261,7 @@ const OperationSetting = () => {
 value={inputs.ChannelDisableThreshold}
 type='number'
 min='0'
-placeholder='单位秒,当运行通道全部测试时,超过此时间将自动禁用通道'
+placeholder='单位秒,当运行渠道全部测试时,超过此时间将自动禁用渠道'
 />
 <Form.Input
 label='额度提醒阈值'
@@ -277,13 +277,13 @@ const OperationSetting = () => {
 <Form.Group inline>
 <Form.Checkbox
 checked={inputs.AutomaticDisableChannelEnabled === 'true'}
-label='失败时自动禁用通道'
+label='失败时自动禁用渠道'
 name='AutomaticDisableChannelEnabled'
 onChange={handleInputChange}
 />
 <Form.Checkbox
 checked={inputs.AutomaticEnableChannelEnabled === 'true'}
-label='成功时自动启用通道'
+label='成功时自动启用渠道'
 name='AutomaticEnableChannelEnabled'
 onChange={handleInputChange}
 />


@@ -48,9 +48,10 @@ const TokensTable = () => {
 const [searching, setSearching] = useState(false);
 const [showTopUpModal, setShowTopUpModal] = useState(false);
 const [targetTokenIdx, setTargetTokenIdx] = useState(0);
+const [orderBy, setOrderBy] = useState('');
 const loadTokens = async (startIdx) => {
-const res = await API.get(`/api/token/?p=${startIdx}`);
+const res = await API.get(`/api/token/?p=${startIdx}&order=${orderBy}`);
 const { success, message, data } = res.data;
 if (success) {
 if (startIdx === 0) {
@@ -70,7 +71,7 @@ const TokensTable = () => {
 (async () => {
 if (activePage === Math.ceil(tokens.length / ITEMS_PER_PAGE) + 1) {
 // In this case we have to load more data and then append them.
-await loadTokens(activePage - 1);
+await loadTokens(activePage - 1, orderBy);
 }
 setActivePage(activePage);
 })();
@@ -160,12 +161,12 @@ const TokensTable = () => {
 }
 useEffect(() => {
-loadTokens(0)
+loadTokens(0, orderBy)
 .then()
 .catch((reason) => {
 showError(reason);
 });
-}, []);
+}, [orderBy]);
 const manageToken = async (id, action, idx) => {
 let data = { id };
@@ -205,6 +206,7 @@ const TokensTable = () => {
 // if keyword is blank, load files instead.
 await loadTokens(0);
 setActivePage(1);
+setOrderBy('');
 return;
 }
 setSearching(true);
@@ -243,6 +245,11 @@ const TokensTable = () => {
 setLoading(false);
 };
+const handleOrderByChange = (e, { value }) => {
+setOrderBy(value);
+setActivePage(1);
+};
 return (
 <>
 <Form onSubmit={searchTokens}>
@@ -427,6 +434,18 @@ const TokensTable = () => {
 添加新的令牌
 </Button>
 <Button size='small' onClick={refresh} loading={loading}>刷新</Button>
+<Dropdown
+placeholder='排序方式'
+selection
+options={[
+{ key: '', text: '默认排序', value: '' },
+{ key: 'remain_quota', text: '按剩余额度排序', value: 'remain_quota' },
+{ key: 'used_quota', text: '按已用额度排序', value: 'used_quota' },
+]}
+value={orderBy}
+onChange={handleOrderByChange}
+style={{ marginLeft: '10px' }}
+/>
 <Pagination
 floated='right'
 activePage={activePage}
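The change above threads a new `orderBy` state into the token list request as `/api/token/?p=${startIdx}&order=${orderBy}`, with the sort key coming from a Dropdown. A small illustrative sketch of building that query string while restricting the sort key to the keys the Dropdown actually offers (`buildTokenQuery` is a hypothetical helper for illustration, not code from the repository):

```javascript
// Sort keys offered by the Dropdown in the diff above; anything else
// falls back to the default ordering instead of being passed through.
const TOKEN_ORDER_KEYS = ['', 'remain_quota', 'used_quota'];

function buildTokenQuery(page, orderBy) {
  const order = TOKEN_ORDER_KEYS.includes(orderBy) ? orderBy : '';
  return `/api/token/?p=${page}&order=${encodeURIComponent(order)}`;
}

console.log(buildTokenQuery(0, 'remain_quota')); // -> /api/token/?p=0&order=remain_quota
console.log(buildTokenQuery(2, 'unknown_key')); // -> /api/token/?p=2&order=
```

Validating the key on both sides keeps an unexpected `order` value from reaching the backend's ORDER BY handling.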


@@ -1,5 +1,5 @@
 import React, { useEffect, useState } from 'react';
-import { Button, Form, Label, Pagination, Popup, Table } from 'semantic-ui-react';
+import { Button, Form, Label, Pagination, Popup, Table, Dropdown } from 'semantic-ui-react';
 import { Link } from 'react-router-dom';
 import { API, showError, showSuccess } from '../helpers';
@@ -25,9 +25,10 @@ const UsersTable = () => {
 const [activePage, setActivePage] = useState(1);
 const [searchKeyword, setSearchKeyword] = useState('');
 const [searching, setSearching] = useState(false);
+const [orderBy, setOrderBy] = useState('');
 const loadUsers = async (startIdx) => {
-const res = await API.get(`/api/user/?p=${startIdx}`);
+const res = await API.get(`/api/user/?p=${startIdx}&order=${orderBy}`);
 const { success, message, data } = res.data;
 if (success) {
 if (startIdx === 0) {
@@ -47,19 +48,19 @@ const UsersTable = () => {
 (async () => {
 if (activePage === Math.ceil(users.length / ITEMS_PER_PAGE) + 1) {
 // In this case we have to load more data and then append them.
-await loadUsers(activePage - 1);
+await loadUsers(activePage - 1, orderBy);
 }
 setActivePage(activePage);
 })();
 };
 useEffect(() => {
-loadUsers(0)
+loadUsers(0, orderBy)
 .then()
 .catch((reason) => {
 showError(reason);
 });
-}, []);
+}, [orderBy]);
 const manageUser = (username, action, idx) => {
 (async () => {
@@ -110,6 +111,7 @@ const UsersTable = () => {
 // if keyword is blank, load files instead.
 await loadUsers(0);
 setActivePage(1);
+setOrderBy('');
 return;
 }
 setSearching(true);
@@ -148,6 +150,11 @@ const UsersTable = () => {
 setLoading(false);
 };
+const handleOrderByChange = (e, { value }) => {
+setOrderBy(value);
+setActivePage(1);
+};
 return (
 <>
 <Form onSubmit={searchUsers}>
@@ -322,6 +329,19 @@ const UsersTable = () => {
 <Button size='small' as={Link} to='/user/add' loading={loading}>
 添加新的用户
 </Button>
+<Dropdown
+placeholder='排序方式'
+selection
+options={[
+{ key: '', text: '默认排序', value: '' },
+{ key: 'quota', text: '按剩余额度排序', value: 'quota' },
+{ key: 'used_quota', text: '按已用额度排序', value: 'used_quota' },
+{ key: 'request_count', text: '按请求次数排序', value: 'request_count' },
+]}
+value={orderBy}
+onChange={handleOrderByChange}
+style={{ marginLeft: '10px' }}
+/>
 <Pagination
 floated='right'
 activePage={activePage}


@@ -83,6 +83,7 @@ const EditChannel = () => {
 data.model_mapping = JSON.stringify(JSON.parse(data.model_mapping), null, 2);
 }
 setInputs(data);
+setBasicModels(getChannelModels(data.type));
 } else {
 showError(message);
 }
@@ -99,9 +100,6 @@ const EditChannel = () => {
 }));
 setOriginModelOptions(localModelOptions);
 setFullModels(res.data.data.map((model) => model.id));
-setBasicModels(res.data.data.filter((model) => {
-return model.id.startsWith('gpt-3') || model.id.startsWith('text-');
-}).map((model) => model.id));
 } catch (error) {
 showError(error.message);
 }
@@ -137,6 +135,9 @@ const EditChannel = () => {
 useEffect(() => {
 if (isEdit) {
 loadChannel().then();
+} else {
+let localModels = getChannelModels(inputs.type);
+setBasicModels(localModels);
 }
 fetchModels().then();
 fetchGroups().then();
@@ -160,7 +161,7 @@ const EditChannel = () => {
 localInputs.base_url = localInputs.base_url.slice(0, localInputs.base_url.length - 1);
 }
 if (localInputs.type === 3 && localInputs.other === '') {
-localInputs.other = '2023-06-01-preview';
+localInputs.other = '2024-03-01-preview';
 }
 if (localInputs.type === 18 && localInputs.other === '') {
 localInputs.other = 'v2.1';
@@ -242,7 +243,7 @@ const EditChannel = () => {
 <Form.Input
 label='默认 API 版本'
 name='other'
-placeholder={'请输入默认 API 版本,例如:2023-06-01-preview,该配置可以被实际的请求查询参数所覆盖'}
+placeholder={'请输入默认 API 版本,例如:2024-03-01-preview,该配置可以被实际的请求查询参数所覆盖'}
 onChange={handleInputChange}
 value={inputs.other}
 autoComplete='new-password'
@@ -355,7 +356,7 @@ const EditChannel = () => {
 <div style={{ lineHeight: '40px', marginBottom: '12px' }}>
 <Button type={'button'} onClick={() => {
 handleInputChange(null, { name: 'models', value: basicModels });
-}}>填入基础模型</Button>
+}}>填入相关模型</Button>
 <Button type={'button'} onClick={() => {
 handleInputChange(null, { name: 'models', value: fullModels });
 }}>填入所有模型</Button>
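The change above replaces the old heuristic (filtering fetched models by the `gpt-3`/`text-` prefixes) with a per-channel-type lookup via `getChannelModels(type)`. A hypothetical sketch of that table-driven approach — the type ids and model lists below are illustrative placeholders, not the project's actual tables:

```javascript
// Illustrative per-channel-type model table; getChannelModels in the real
// codebase presumably maps a channel type id to its known model list.
const CHANNEL_MODEL_TABLE = {
  1: ['gpt-3.5-turbo', 'gpt-4'], // e.g. an OpenAI-style channel
  18: ['SparkDesk'],             // e.g. a single-model channel
};

function getChannelModelsSketch(type) {
  // Unknown types get an empty list rather than a prefix-based guess.
  return CHANNEL_MODEL_TABLE[type] || [];
}

console.log(getChannelModelsSketch(18)); // returns ['SparkDesk']
```

A lookup table keeps the "basic models" button meaningful for non-OpenAI channels, which the old prefix filter could not do.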