Loss of coordinate information from groupby.apply() on a stacked object #1483
Maybe not an issue for others, or I am missing something...
Instead of computing the mean over your non-stacked dimension with `dsg = dst.groupby('allpoints').mean()`, why not just call `dsg = dst.mean('time', keep_attrs=True)`, so that you collapse only the time dimension and preserve the attributes on your data? Then you can ...
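For illustration, here is a minimal sketch of that suggestion, assuming a small synthetic dataset with a `time` dimension and a variable named `tas` (both hypothetical). Collapsing `time` directly leaves the `allpoints` MultiIndex intact, so unstacking recovers `lat` / `lon`:

```python
import numpy as np
import xarray as xr

# Hypothetical dataset with dims (time, lat, lon); names are illustrative only
ds = xr.Dataset(
    {"tas": (("time", "lat", "lon"), np.random.rand(4, 3, 5))},
    coords={"time": np.arange(4), "lat": [10.0, 20.0, 30.0], "lon": np.arange(5.0)},
    attrs={"title": "example"},
)

dst = ds.stack(allpoints=["lat", "lon"])

# Collapse only the time dimension; keep_attrs preserves the attributes
dsg = dst.mean("time", keep_attrs=True)

# The 'allpoints' MultiIndex still carries its lat / lon levels,
# so unstacking restores the original coordinate names
print(dsg.unstack("allpoints"))
```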
@darothen This wasn't intentional. If we can fix it in a straightforward fashion, we definitely should.
In order to maintain a list of currently relevant issues, we mark issues as stale after a period of inactivity. If this issue remains relevant, please comment here or remove the stale label.
Sorry this didn't get traction. Can we add an MCVE? Currently ...
This is fixed if I use ...
I use this stack / groupby / unstack pattern quite frequently, e.g. here. An issue I have is that after `groupby('allpoints').apply()`, the coordinate names do not get carried through, i.e. the coordinate names are now `allpoints_level_0` and `allpoints_level_1`. Then, after unstacking, I rename them back to lat/lon etc. Do you ever encounter this? Is there a way to carry them through, and is this an issue for others?
Now we stack the data by allpoints. Note that the info about the original coordinates (lat / lon) is still there:

`dst = ds.stack(allpoints=['lat','lon'])`

Now apply `groupby().apply()`:

`dsg = dst.groupby('allpoints').apply(my_custom_function)`

So now we have lost `'lat'` and `'lon'`. However, if we skip the groupby part and go straight to `unstack`, they would be carried through:

`dst.unstack('allpoints')`
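A self-contained sketch of the workflow described above (the dataset, variable name, and per-point function are made up for illustration; the exact coordinate names you get back depend on your xarray version, and newer releases spell `.apply()` as `.map()`):

```python
import numpy as np
import xarray as xr

# Hypothetical dataset with dims (time, lat, lon)
ds = xr.Dataset(
    {"tas": (("time", "lat", "lon"), np.random.rand(4, 3, 5))},
    coords={"time": np.arange(4), "lat": [10.0, 20.0, 30.0], "lon": np.arange(5.0)},
)

# Stack lat / lon into a single 'allpoints' dimension
dst = ds.stack(allpoints=["lat", "lon"])

# Stand-in for my_custom_function: reduce over time for each point
def my_custom_function(point):
    return point.mean("time")

dsg = dst.groupby("allpoints").map(my_custom_function)  # .apply() on older xarray

# On the versions discussed in this issue, the MultiIndex levels come back
# as allpoints_level_0 / allpoints_level_1 instead of lat / lon
print(dsg.coords)

# Workaround from the report: unstack, then rename the level dims back
out = dsg.unstack("allpoints")
if "allpoints_level_0" in out.dims:
    out = out.rename({"allpoints_level_0": "lat", "allpoints_level_1": "lon"})
print(out)
```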