
Add nvidia-driver-latest-dkms package #47

Merged: 1 commit merged into aws:main on Apr 19, 2022

Conversation

prateekchaudhry (Contributor)

Summary

This PR adds the nvidia-driver-latest-dkms package to the GPU installation script. It fixes the errors caused by the Nvidia packages when updating the kernel on old AMIs.
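As a rough illustration (not the exact diff), the change amounts to adding the package to the yum install invocation in the installation script; the surrounding package names are taken from the diff context shown later in this page:

```sh
# Illustrative sketch only -- package list abbreviated; see the diff context below.
sudo yum install -y \
    nvidia-driver-latest-dkms \
    nvidia-fabric-manager \
    pciutils \
    xorg-x11-server-Xorg
```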

Testing

Created an AMI with an old kernel (4.14.268) and ran the yum update kernel command (see the sketch after this section).

New tests cover the changes: no
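A minimal sketch of the manual test described above, assuming an instance launched from an AMI that still ships kernel 4.14.268:

```sh
# Check the running kernel version on the old AMI.
uname -r
# Update the kernel; before this change the update hit errors from the Nvidia packages.
sudo yum update -y kernel
```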

Description for the changelog

Add nvidia-driver-latest-dkms package

Licensing

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

nvidia-fabric-manager \
pciutils \
xorg-x11-server-Xorg \
docker-runtime-nvidia \
oci-add-hooks \
libnvidia-container \
libnvidia-container-tools \
nvidia-container-runtime-hook \
cuda-drivers \
nvidia-container-runtime-hook
Contributor:
is it okay if we add cuda packages here?

@prateekchaudhry (Contributor, Author), Apr 19, 2022:

Not entirely clear on it, but the AL team had some issues installing the cuda drivers in the same yum instruction, so we're keeping it in sync with them.
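For illustration only (not taken from the script), the distinction being discussed is whether cuda-drivers goes into the same yum transaction as the other GPU packages or into a separate one:

```sh
# Option A: cuda-drivers in the same yum transaction as the driver packages.
sudo yum install -y nvidia-driver-latest-dkms cuda-drivers

# Option B: cuda-drivers installed in a separate yum transaction.
sudo yum install -y nvidia-driver-latest-dkms
sudo yum install -y cuda-drivers
```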

prateekchaudhry merged commit 6a363ff into aws:main on Apr 19, 2022.
prateekchaudhry mentioned this pull request on Apr 20, 2022.