Commit `32393a2` (1 parent: `fd72894`), showing 3 changed files with 26 additions and 68 deletions.
# How to connect other local large language models
1. Copy `request_llms/bridge_llama2.py` and rename it to whatever you like.
2. Modify the `load_model_and_tokenizer` method to load your model and tokenizer (find a demo on the model's official site and copy-paste from it).
3. Modify the `llm_stream_generator` method to define the model's inference (again, copy-paste from the model's official demo).
4. Test it from the command line
    - Modify `tests/test_llms.py` (one look at the file will make the needed changes obvious)
    - Run `python tests/test_llms.py`
5. Once the test passes, make the final changes in `request_llms/bridge_all.py` to fully integrate your model into the framework (one look at the file will make the needed changes obvious).
6. Update the `LLM_MODEL` setting, then run `python main.py` and test the final result.
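The local-model steps above can be sketched end to end. The skeleton below is a hypothetical illustration, not the project's actual bridge code: the class name `MyModelBridge` and the echo model/tokenizer are stand-ins invented here, while `load_model_and_tokenizer` and `llm_stream_generator` are the two method names from steps 2 and 3.

```python
# Hypothetical skeleton of a local-model bridge. EchoTokenizer / EchoModel are
# dummy stand-ins for a real tokenizer and model (which you would normally load
# by copy-pasting from the model's official demo).

class EchoTokenizer:
    def encode(self, text):
        return list(text)           # one "token" per character, for illustration

    def decode(self, tokens):
        return "".join(tokens)


class EchoModel:
    def generate_stream(self, tokens):
        yield from tokens           # pretend to decode one token at a time


class MyModelBridge:
    def load_model_and_tokenizer(self):
        # Step 2: load your model and tokenizer here (real code would call
        # something like AutoTokenizer.from_pretrained(...) from transformers).
        self.tokenizer = EchoTokenizer()
        self.model = EchoModel()
        return self.model, self.tokenizer

    def llm_stream_generator(self, query):
        # Step 3: run inference, yielding the partial response as it grows,
        # which is what lets the UI stream output incrementally.
        tokens = self.tokenizer.encode(query)
        partial = []
        for tok in self.model.generate_stream(tokens):
            partial.append(tok)
            yield self.tokenizer.decode(partial)


if __name__ == "__main__":
    bridge = MyModelBridge()
    bridge.load_model_and_tokenizer()
    for chunk in bridge.llm_stream_generator("hi"):
        print(chunk)
```

Yielding the whole accumulated string (rather than single tokens) mirrors how chat UIs commonly redraw the full response on each update.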
# How to connect other online large language models
1. Copy `request_llms/bridge_zhipu.py` and rename it to whatever you like.
2. Modify `predict_no_ui_long_connection`.
3. Modify `predict`.
4. Test it from the command line
    - Modify `tests/test_llms.py` (one look at the file will make the needed changes obvious)
    - Run `python tests/test_llms.py`
5. Once the test passes, make the final changes in `request_llms/bridge_all.py` to fully integrate your model into the framework (one look at the file will make the needed changes obvious).
6. Update the `LLM_MODEL` setting, then run `python main.py` and test the final result.
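Both `predict` and `predict_no_ui_long_connection` ultimately have to turn the user input, chat history, and system prompt into the provider's request format. The helper below is a hypothetical sketch of that shared step, assuming an OpenAI-style message list and a history that alternates user and assistant turns; the real bridge files' signatures and request formats may differ, and the actual HTTP call is omitted.

```python
def build_messages(inputs, history, sys_prompt):
    """Assemble an OpenAI-style message list (an assumed format, for
    illustration). `history` is assumed to alternate user/assistant turns."""
    messages = [{"role": "system", "content": sys_prompt}]
    for i, turn in enumerate(history):
        role = "user" if i % 2 == 0 else "assistant"
        messages.append({"role": role, "content": turn})
    messages.append({"role": "user", "content": inputs})
    return messages


msgs = build_messages(
    inputs="Summarize this paper.",
    history=["Hello", "Hi, how can I help?"],
    sys_prompt="You are an academic assistant.",
)
print(len(msgs))  # 4 messages: system, user, assistant, user
```

In a streaming `predict`, the same message list would be sent with streaming enabled and each received delta appended to the chatbot display; in `predict_no_ui_long_connection`, the response is collected in one piece.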
```json
{
    "version": 3.61,
    "show_feature": true,
    "new_feature": "修复潜在的多用户冲突问题 <-> 接入Deepseek Coder <-> AutoGen多智能体插件测试版 <-> 修复本地模型在Windows下的加载BUG <-> 支持文心一言v4和星火v3 <-> 支持GLM3和智谱的API <-> 解决本地模型并发BUG <-> 支持动态追加基础功能按钮"
}
```
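The version file above is plain JSON, so a client-side update check reduces to parsing it and comparing version numbers. A minimal sketch (not the project's actual update-check code; the inline string here stands in for fetching the file from the repository):

```python
import json

# Stand-in for fetching the version file over HTTP.
raw = '{"version": 3.61, "show_feature": true, "new_feature": "..."}'

info = json.loads(raw)
local_version = 3.60
if info["version"] > local_version:
    print("Update available:", info["version"])
```

Storing versions as JSON numbers keeps the comparison simple, though a real scheme might prefer version strings compared component-wise, since a float like 3.60 renders back as 3.6.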