
OSError: exception: access violation reading 0x0000000000000000 when trying to run manually #89

Open
MacharaStormwing opened this issue Dec 25, 2024 · 7 comments

Comments

@MacharaStormwing

MacharaStormwing commented Dec 25, 2024

Describe the Issue
When running koboldcpp.py manually (so I don't have to wait for the exe to self-extract every time), I get the following error:

Traceback (most recent call last):
  File "C:\LargeLanguageModels\koboldcpp_rocm_files\koboldcpp.py", line 5168, in <module>
    main(parser.parse_args(),start_server=True)
  File "C:\LargeLanguageModels\koboldcpp_rocm_files\koboldcpp.py", line 4789, in main
    loadok = load_model(modelname)
  File "C:\LargeLanguageModels\koboldcpp_rocm_files\koboldcpp.py", line 925, in load_model
    ret = handle.load_model(inputs)
OSError: exception: access violation reading 0x0000000000000000

Full log at: https://pastebin.com/hMFwdmD4
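As far as I understand, koboldcpp.py calls into its compiled backend through ctypes, so a null-pointer crash inside the native library comes back to Python as this OSError. Roughly this kind of pattern (the library file name here is my guess, not the actual loader code):

import ctypes
import os

# Guess at the ROCm backend library name shipped with the release; the real
# loader in koboldcpp.py picks the library for whichever backend is selected.
dll_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "koboldcpp_hipblas.dll")
handle = ctypes.CDLL(dll_path)

# handle.load_model() is then called with a ctypes Structure of parameters.
# If rocBLAS fails to initialize for the chosen GPU, the native code can end
# up dereferencing a null pointer, and the Windows access violation is
# re-raised by ctypes as the OSError shown above.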

What I did was:

  • install Python 3.10.11 on Windows 11 (latest version, 24H2) and add C:\Users\drago\AppData\Local\Programs\Python\Python310 to PATH
  • install pip using "curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py" and "python get-pip.py", and add C:\Users\drago\AppData\Local\Programs\Python\Python310\Scripts to PATH
  • download and extract koboldcpp_rocm_files.zip from the KoboldCPP-v1.78.yr0-ROCm release
  • in CMD, go into the extracted directory and run "python koboldcpp.py"
  • select a model (prepared in Default.kcpps) and start it

PLEASE NOTE:
There is no such error when just starting koboldcpp_rocm.exe from the same release (v1.78.yr0) with the same kcpps loaded.

Could you please let me know what the problem is / how to fix it?

Additional Information:
Setup:

  • CPU: Ryzen 7 9800X3D with 64 GB RAM
  • GPU: Radeon RX 7900 XT
  • OS: Windows 11 24H2, latest updates installed
  • Installed latest Radeon drivers 24.12.1 and AMD ROCm HIP SDK 6.2.4

Please note: I originally got this error (even when selecting the right GPU in the GUI):

rocBLAS error: Could not initialize Tensile host: No devices found

I had to disable the CPU's iGPU to fix this.
Is there a way to keep the iGPU enabled?
I tried setting the environment variable "HIP_VISIBLE_DEVICES=3" (ID 3 was my RX 7900 XT, ID 1 the iGPU), but that did not help.
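A per-process variant of the same idea would look roughly like this (just a sketch; the path and device index are from my setup, and whether the HIP runtime honors the variable while the iGPU is enabled is exactly the question):

import os
import subprocess
import sys

# Sketch: scope HIP_VISIBLE_DEVICES to the koboldcpp process instead of
# setting it system-wide. "3" was the ID of my RX 7900 XT on this machine.
env = dict(os.environ, HIP_VISIBLE_DEVICES="3")
subprocess.run(
    [sys.executable, "koboldcpp.py"],
    cwd=r"C:\LargeLanguageModels\koboldcpp_rocm_files",  # extracted release directory
    env=env,
    check=False,
)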

@carcoonzyk

Same problem! Waiting for someone to solve it.

@carcoonzyk

This is because the release build of the project did not add support for the iGPU, e.g. the gfx1036 target.

@YellowRoseCx
Owner

This is because the release build of the project did not add support for the iGPU, e.g. the gfx1036 target.

rocblas wouldn't let me compile support for gfx1036

@carcoonzyk

This is because the release build of the project did not add support for the iGPU, e.g. the gfx1036 target.

rocblas wouldn't let me compile support for gfx1036

It seems that, starting from ROCm 6, HIP no longer supports the iGPU. With ROCm 5 there is no such issue.

@carcoonzyk

This is because the release build of the project did not add support for the iGPU, e.g. the gfx1036 target.

rocblas wouldn't let me compile support for gfx1036

So I hope a new version of koboldcpp.exe can add a setting for the environment variable HIP_VISIBLE_DEVICES to temporarily avoid this problem.
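Something along these lines, applied before the backend library gets loaded (just a sketch of the idea, not the project's actual code; the option name is made up):

import argparse
import os

parser = argparse.ArgumentParser()
# Hypothetical option name, purely to illustrate the requested setting.
parser.add_argument("--hip-visible-devices", default=None)
args, _ = parser.parse_known_args()

if args.hip_visible_devices is not None:
    # This has to happen before the HIP/rocBLAS libraries are loaded,
    # otherwise the runtime has already enumerated the GPUs (iGPU included).
    os.environ["HIP_VISIBLE_DEVICES"] = args.hip_visible_devices

# ... the backend DLL would be loaded after this point.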

@YellowRoseCx
Owner

YellowRoseCx commented Jan 9, 2025

This is because the release build of the project did not add support for the iGPU, e.g. the gfx1036 target.

rocblas wouldn't let me compile support for gfx1036

So I hope a new version of koboldcpp.exe can add a setting for the environment variable HIP_VISIBLE_DEVICES to temporarily avoid this problem.

KoboldCpp.exe does do that for Linux systems, but that command is not available on Windows. ROCm 5 did not support the iGPU with rocBLAS either:

ROCm currently doesn’t support integrated graphics. Should your system have an AMD IGP installed, disable it in the BIOS prior to using ROCm. If the driver can enumerate the IGP, the ROCm runtime may crash the system, even if told to omit it via HIP_VISIBLE_DEVICES.

https://rocm.docs.amd.com/en/docs-5.7.1/deploy/linux/installer/install.html

https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.2.1/install/amdgpu-install.html
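If anyone wants to check what the HIP runtime actually enumerates on Windows, here is a rough ctypes sketch (the runtime DLL name differs between ROCm releases, so both common names are tried; this assumes the HIP SDK's bin directory is on PATH):

import ctypes

# Try the common Windows HIP runtime DLL names; which one exists depends on
# the installed ROCm/HIP SDK version (this list is an assumption).
hip = None
for name in ("amdhip64_6.dll", "amdhip64.dll"):
    try:
        hip = ctypes.CDLL(name)
        break
    except OSError:
        pass
if hip is None:
    raise SystemExit("HIP runtime DLL not found; is the HIP SDK installed?")

hip.hipInit(0)  # lazy init usually suffices, but be explicit

count = ctypes.c_int(0)
err = hip.hipGetDeviceCount(ctypes.byref(count))
print("hipGetDeviceCount:", err, "-> devices:", count.value)

# List device names; the iGPU showing up here means the runtime enumerated it.
name_buf = ctypes.create_string_buffer(256)
for dev in range(count.value):
    hip.hipDeviceGetName(name_buf, len(name_buf), dev)
    print(dev, name_buf.value.decode(errors="replace"))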

@carcoonzyk

carcoonzyk commented Jan 9, 2025

This is because the release build of the project did not add support for the iGPU, e.g. the gfx1036 target.

rocblas wouldn't let me compile support for gfx1036

So I hope a new version of koboldcpp.exe can add a setting for the environment variable HIP_VISIBLE_DEVICES to temporarily avoid this problem.

KoboldCpp.exe does do that for Linux systems, but that command is not available on Windows. ROCm 5 did not support the iGPU with rocBLAS either:

ROCm currently doesn’t support integrated graphics. Should your system have an AMD IGP installed, disable it in the BIOS prior to using ROCm. If the driver can enumerate the IGP, the ROCm runtime may crash the system, even if told to omit it via HIP_VISIBLE_DEVICES.

https://rocm.docs.amd.com/en/docs-5.7.1/deploy/linux/installer/install.html

https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.2.1/install/amdgpu-install.html

This is something I discovered a few months ago: a personally compiled ollama.exe (I forget the specific source) built with rocBLAS support for the iGPU (gfx1036), and it really works. link
