Upcoming Orion OS upgrade (June 12-13, 2024) #981
Comments
@RatkoVasic-NOAA fyi
Upgrade moved to May 22nd.
For now gcc@12 is not yet functional, and the sysadmins are working on it. Here is the TODO list:
Hi Ratko, the following GitHub issue says the spack-stack team is waiting for some action from admins. I assume you are referring to the MSU Orion admins? If so, what do you need from them? Is there a test case they can investigate?
Hi @RaghuReddy-NOAA, so far we have had problems with gcc@12.2.0.
CURRENT ORION STATUS (from email): I first started with version 1.6.0, since SRW and the ufs-weather-model use it, with the unified-env-rocky9 environment. Dom suggested that there were some problems with the system GNU installation, so I installed gcc@12.2.0 and openmpi@4.1.6. Using our GNU/OpenMPI installations, I first tested spack-stack 1.7.0 with the ue-gcc environment.
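For context, a minimal sketch of the workflow being described, assuming the standard spack-stack release procedure; the branch, site, and template names below are illustrative rather than the exact ones used on Orion:

```bash
# Check out the spack-stack release under test (1.7.0 here) with its submodules;
# the branch name is illustrative.
git clone --recurse-submodules -b release/1.7.0 https://github.com/JCSDA/spack-stack.git
cd spack-stack
source setup.sh

# Create and activate a site environment (site, template, and name are illustrative).
spack stack create env --site orion --template unified-dev --name ue-gcc
cd envs/ue-gcc
spack env activate .

# Use the locally installed gcc@12.2.0 and openmpi@4.1.6 instead of building them.
spack compiler find
spack external find openmpi

# Concretize and build everything else in the stack.
spack concretize 2>&1 | tee log.concretize
spack install 2>&1 | tee log.install
```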
Run directories of failing packages:
Good news:
We still have to fix py-matplotlib!
This is Huston @ MSU, doing the gcc-12 rebuild for Orion. Please confirm the expected packages for gcc-12.2.0; here's what was built today and will be synced after confirmation:
gcc-12.2.0, no mpi (note: ncview, ncl, and cdo had errors)
gcc-12.2.0, openmpi-4.1.4
gcc-12.2.0, mpich-4.1.1
gcc-12.2.0, mpich-4.0.2
Question 1: is there a use case for gcc-12 + intel-impi? I would sooner encourage using the Intel compilers with intel-impi over any others.
Question 2: should openmpi or mpich be updated to other versions?
Extra information:
Environment location: we're going to work towards leveraging environments more, i.e. one environment file per compiler+MPI combination: /apps/spack-managed/spack-devel/var/spack/environments/
Getting spack on HPC systems: the tentative plan is to make /apps/spack-managed/spack-devel a permanent path on all MSU systems. Our config is in there, under etc/spack, and the setup script is in the expected share/spack/.
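For anyone who wants to try those builds, a hedged sketch of how one of the managed environments might be picked up from the shared location above; the setup script's exact file name and the environment name are assumptions:

```bash
# Source spack from the shared MSU installation; "setup-env.sh" is the usual
# script name under share/spack/ and is assumed here.
source /apps/spack-managed/spack-devel/share/spack/setup-env.sh

# List the managed environments (one per compiler+MPI combination) and
# activate one; the environment name below is hypothetical.
spack env list
spack env activate gcc-12.2.0-openmpi-4.1.4

# Show what the activated environment provides.
spack find
```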
@snowbird294 Thanks for the info. I wasn't involved in any of the conversations about sysadmin spack builds on Orion after the Rocky9 transition, but what we usually do with spack-stack is use the compilers and MPI libraries provided by the system/sysadmins and build the rest ourselves. @AlexanderRichert-NOAA @RatkoVasic-NOAA please correct me if a different path was chosen for Orion. As far as compilers and MPI are concerned, gcc@12.x.y with openmpi@4.1.x or 5.0.x should do; we don't usually mix and match GNU compilers with Intel MPI, we use the Intel compilers (so far mostly the classic compilers, up to the very last release of oneAPI that had them, 2023.??.??) with Intel oneAPI MPI.
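For reference, a minimal sketch of that pattern (system compilers and MPI, everything else built by spack-stack); the module names are assumptions, while the spack commands are the generic way to register externally provided compilers and MPI:

```bash
# Load the sysadmin-provided compiler and MPI (module names are assumptions).
module load gcc/12.2.0
module load openmpi/4.1.6

# Register the loaded GNU compiler in compilers.yaml.
spack compiler find

# Register the loaded OpenMPI as an external package and keep spack from
# rebuilding it, so everything above links against the system MPI.
spack external find openmpi
spack config add "packages:openmpi:buildable:false"
```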
This time on Orion we went ahead and installed GNU and OpenMPI:
Update from JCSDA: spack-stack-1.7.0 is working for them, with the exception of having to load the
@AlexanderRichert-NOAA Where are we at with the 1.5.1 and 1.6.0 installs? Are they all done? If so, then we can close this issue as completed - I did 1.7.0 and we just merged the updates for develop. |
All done. The only ones not checked off in the list are two of the 1.7.0 ones, so if those are done, then let's close this puppy. |
They were done; I just forgot to update the list. Checked the boxes, closing!
Is your feature request related to a problem? Please describe.
UPDATE - the OS upgrade was postponed to June 12-13, 2024.
Email from the Orion sysadmins:
Describe the solution you'd like
We need to rebuild whatever versions of spack-stack we want to support (current for sure, how many back?) after the OS upgrade.
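A rough sketch of what that rebuild could look like once the list of supported versions is settled; the version list, branch names, and site/template names are placeholders:

```bash
#!/usr/bin/env bash
set -e

# Placeholder list of spack-stack versions to restore after the OS upgrade.
for ver in 1.6.0 1.7.0; do
  git clone --recurse-submodules -b "release/${ver}" \
      https://github.com/JCSDA/spack-stack.git "spack-stack-${ver}"
  (
    cd "spack-stack-${ver}"
    source setup.sh
    # Site and template names are illustrative.
    spack stack create env --site orion --template unified-dev --name "ue-${ver}"
    cd "envs/ue-${ver}"
    spack env activate .
    spack concretize 2>&1 | tee log.concretize
    spack install 2>&1 | tee log.install
  )
done
```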
Additional context
n/a