
The llamafactory-cli launch method is confusing and a poor user experience; please consider restoring the python src/... launch method #3582

Closed
ArcherShirou opened this issue May 6, 2024 · 27 comments · Fixed by #3596
Labels
solved This problem has been already solved

Comments

@ArcherShirou

ArcherShirou commented May 6, 2024

Reminder

  • I have read the README and searched the existing issues.

Reproduction

After the project update, the launch method has changed to llamafactory-cli xxx. Honestly, this is a poor experience: you cannot tell which file is actually being run. The previous python src/... style was clearer, made it easy to locate the file, and made errors easy to trace and debug when something failed. Please consider changing it back.

Expected behavior

No response

System Info

No response

Others

No response

@DinhLuan14

Same problem here. llamafactory-cli is not a good idea. :(

@DinhLuan14

I want to find the file src/export_model.py but I can't find it in this new version :(

@Essence9999

Started it up this morning and it completely threw me off. Everyone was used to the old way; there was really no need for this change.

@xiaochaich

Seconded.

@yjj828

yjj828 commented May 6, 2024

Also, I can't figure out where to change 127.0.0.1 to 0.0.0.0, so the instance I deployed on my server is no longer reachable.

@yjj828

yjj828 commented May 6, 2024

Also, I can't figure out where to change 127.0.0.1 to 0.0.0.0, so the instance I deployed on my server is no longer reachable.

What's worse, adding share=True as Gradio suggests doesn't help either.
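
(Note: Gradio honors the standard GRADIO_SERVER_NAME / GRADIO_SERVER_PORT environment variables; the following is a minimal sketch for exposing the Web UI on a server, assuming the webui launcher does not override these values.)

    # assumption: the launcher falls through to Gradio's defaults,
    # which honor these standard environment variables
    export GRADIO_SERVER_NAME=0.0.0.0
    export GRADIO_SERVER_PORT=7860
    llamafactory-cli webui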

@fst813

fst813 commented May 6, 2024

+1

@Anorid

Anorid commented May 6, 2024

[screenshot of the error]
Can anyone tell me what this error means?

@fst813

fst813 commented May 6, 2024

@Anorid Where is the error in your screenshot?

@johnmai-dev

+1

@codemayq
Collaborator

codemayq commented May 6, 2024

    llamafactory-cli api -h: launch an API server
    llamafactory-cli chat -h: launch a chat interface in CLI
    llamafactory-cli eval -h: do evaluation
    llamafactory-cli export -h: merge LoRA adapters and export model
    llamafactory-cli train -h: do training
    llamafactory-cli webchat -h: launch a chat interface in Web UI
    llamafactory-cli webui: launch LlamaBoard
    llamafactory-cli version: show version info
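
A concrete usage sketch of these subcommands (the YAML paths are illustrative; use the configs under the examples directory):

    # fine-tune with a config file, then chat with the resulting adapters
    llamafactory-cli train examples/lora_single_gpu/llama3_lora_sft.yaml
    llamafactory-cli chat examples/inference/llama3_lora_sft.yaml
    # each subcommand prints the arguments it accepts with -h
    llamafactory-cli train -h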

@GiulioZ94

using "export GRADIO_SHARE=True" does not work to create a public link of gradio when running "llamafactory-cli webui"

@CodingQinghao

Seconded!

@zzerrrro

zzerrrro commented May 6, 2024

+n!

@songxijun

using "export GRADIO_SHARE=True" does not work to create a public link of gradio when running "llamafactory-cli webui"

+1

@hiyouga hiyouga mentioned this issue May 6, 2024
@hiyouga
Owner

hiyouga commented May 6, 2024

The intent of this update was to provide more complete documentation and a more convenient launch method. Because the documentation was not updated in time, it caused some inconvenience, and we apologize for that. In the latest PR #3596 we made the following changes:

  1. Added train.py and webui.py, corresponding to train_bash.py and train_web.py from the previous version, so the old launch method works again
  2. Updated the project README with a three-step quick start for fine-tuning Llama3 models; see here for details
  3. Fully adapted the usage examples under the examples directory to Llama3 models and added more detailed feature descriptions: https://github.com/hiyouga/LLaMA-Factory/blob/main/examples/README_zh.md

If you have other suggestions, feel free to comment under this thread.
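
(Note: with the compatibility scripts described above, the script-style launch works again alongside the CLI; a minimal sketch, assuming both entry points accept the same YAML config path, which is illustrative here:)

    # equivalent launches via the CLI and the restored script entry point
    llamafactory-cli train examples/lora_single_gpu/llama3_lora_sft.yaml
    python src/train.py examples/lora_single_gpu/llama3_lora_sft.yaml
    # the script form also makes it easy to attach a debugger
    python -m pdb src/train.py examples/lora_single_gpu/llama3_lora_sft.yaml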

@hiyouga hiyouga added the solved This problem has been already solved label May 6, 2024
@hiyouga
Owner

hiyouga commented May 6, 2024

I want to find the file src/export_model.py but I can't find it in this new version :(

@DinhLuan14 We have updated the document, see https://github.com/hiyouga/LLaMA-Factory/blob/main/examples/README.md#merging-lora-adapters-and-quantization
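
(Note: in the new layout, the merge/export step that src/export_model.py used to handle is driven by a YAML config through the export subcommand; a minimal sketch, using the config path from the linked examples:)

    # merge LoRA adapters into the base model and export it; the YAML holds
    # model_name_or_path, adapter_name_or_path, export_dir and related options
    llamafactory-cli export examples/merge_lora/llama3_lora_sft.yaml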

@HZF316

HZF316 commented May 7, 2024

Seconded.

@CodingQinghao

(quoting @hiyouga's update above)

The API also needs somewhere to configure the address and port. I really hope api_demo.py can come back; it is genuinely needed.

@hiyouga
Owner

hiyouga commented May 7, 2024

@CodingQinghao https://github.com/hiyouga/LLaMA-Factory/blob/main/src/api.py
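
(Note: a minimal sketch for configuring the API address and port; it assumes the server reads API_HOST / API_PORT from the environment and that src/api.py takes the same YAML config as the CLI, with an illustrative config path:)

    # assumed environment variables for the OpenAI-style API server
    API_HOST=0.0.0.0 API_PORT=8000 llamafactory-cli api examples/inference/llama3_lora_sft.yaml
    # or via the compatibility script
    API_HOST=0.0.0.0 API_PORT=8000 python src/api.py examples/inference/llama3_lora_sft.yaml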

@chunxiaoguo

CUDA_VISIBLE_DEVICES=0 llamafactory-cli chat examples/merge_lora/llama3_lora_sft.yaml does not run for me.
[screenshot of the error]

@nuass

nuass commented May 17, 2024

Even if the one-command launch is faster, debugging is now twice as hard.

@YueWen1024

How is the llamafactory-cli launch method actually implemented?
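
(Note: the CLI is a console-script entry point installed with the package; a minimal sketch for locating the file that actually runs, assuming the entry point dispatches to the llamafactory.cli module:)

    # locate the installed wrapper and the module it dispatches to
    which llamafactory-cli
    python -c "import llamafactory.cli; print(llamafactory.cli.__file__)"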

@GlennCGL

Seconded. Please keep the python launch method as well; it makes debugging easier.

@hiyouga
Owner

hiyouga commented Sep 17, 2024

see #3582 (comment)

@xiaolling

I hope the script mode can be kept, along with corresponding examples.
