REGR: fix numpy accumulate ufuncs for DataFrame #39260
Conversation
# in theory could preserve int dtype for default axis=0
expected = pd.DataFrame({"a": [1.0, 3.0, 3.0, 4.0], "b": [0.1, 4.0, 4.0, 4.0]})
tm.assert_frame_equal(result, expected)
Mmm, I'd like this to be more than "in theory". I'd consider this a buggy test, since things should be done blockwise for axis=0.
Can you change the test case to have just floats or just ints (even if you have to manually split it for test coverage)?
I specifically used two dtypes to have two blocks, to ensure we handle this case correctly for axis=1 (which can never be done blockwise).
Just above there is already a case with only ints that preserves the int dtype.
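To make this concrete, here is a minimal sketch (not the actual test; the values are made up) of why two dtypes matter: the int and float columns are stored as two separate blocks, so an accumulation along axis=1 combines values from both blocks and can never be applied block by block.

    import numpy as np
    import pandas as pd

    # two dtypes -> stored internally as two separate blocks
    df = pd.DataFrame({"a": [1, 3, 2], "b": [0.1, 4.0, 3.0]})

    # accumulating across columns (axis=1) mixes values from both blocks,
    # so it has to operate on the full 2D (upcast) array rather than per block
    print(np.maximum.accumulate(df, axis=1))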
Note that before pandas 1.2.0, this also didn't preserve the dtypes per column, and 1.2.0 itself didn't calculate a proper result (so I would call this PR a strict improvement ;))
agreed re preserving for axis=0. couldn't we still use mgr.apply in that case?
To be clear, I could certainly detect the special case of accumulate with axis=0, and then pass axis=1 to the blocks, but:
1) that requires special-case code like this in array_ufunc:

    else:
        # the ufunc(dataframe) case
        ...  # other branches elided
        elif method == "accumulate" and ("axis" not in kwargs or kwargs["axis"] == 0):
            # swap axis for the transposed Block values
            kwargs["axis"] = 1
            result = mgr.apply(getattr(ufunc, method), **kwargs)
2) that requires Block.apply to be "aware" of axis. Currently it simply passes through keywords, but in this case it would need to interpret axis differently depending on whether its values are stored as 2D or 1D (and I know we already need to take this axis swapping into account in many places, e.g. with NDFrame._get_block_manager_axis(axis), and in the internals as well, but that's typically when axis is a keyword of our own, and not a user-specified kwarg of a generic applied function). So I certainly could add an ExtensionBlock.apply override to take this into account. But we could also decide to leave this as is for now.
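As a side note for readers, a NumPy-only sketch (not pandas internals code) of why the axis would need to be swapped: Block values are stored as the transpose of the DataFrame, so accumulating along axis=0 of the frame corresponds to axis=1 of the block's 2D array.

    import numpy as np

    frame_values = np.array([[1, 2], [3, 4], [5, 6]])  # 3 rows, 2 columns
    block_values = frame_values.T                      # blocks store the transpose: (2, 3)

    # accumulating down the rows of the frame ...
    acc_frame = np.add.accumulate(frame_values, axis=0)

    # ... equals accumulating along axis=1 of the block values, transposed back
    acc_block = np.add.accumulate(block_values, axis=1).T

    assert (acc_frame == acc_block).all()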
but in this case it would need to interpret axis differently depending on whether its values are stored as 2D or 1D
Once more with feeling: this wouldn't be an issue with 2D EAs.
Once more with feeling: this wouldn't be an issue with 2D EAs.
And I can also say: this wouldn't be an issue with only 1D arrays ...
@TomAugspurger @jbrockmendel I opened #39275 to keep track of the fact that this can be improved to preserve dtypes
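For context, a small illustration of the behaviour tracked in that issue (assuming a pandas version with this fix applied; the data is illustrative): with a mixed-dtype frame, the accumulation result is upcast to float for both columns, even for the default axis=0 where the integer column's dtype could in principle be preserved.

    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"a": [1, 3, 2, 4], "b": [0.1, 4.0, 3.0, 2.0]})

    result = np.maximum.accumulate(df)  # default axis=0
    print(result.dtypes)  # both columns come back as float64; "a" could stay int64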
doc/source/whatsnew/v1.2.1.rst
@@ -31,6 +31,7 @@ Fixed regressions
- Fixed regression in :meth:`DataFrame.replace` raising ``ValueError`` when :class:`DataFrame` has dtype ``bytes`` (:issue:`38900`)
- Fixed regression in :meth:`Series.fillna` that raised ``RecursionError`` with ``datetime64[ns, UTC]`` dtype (:issue:`38851`)
- Fixed regression in comparisons between ``NaT`` and ``datetime.date`` objects incorrectly returning ``True`` (:issue:`39151`)
- Fixed regression in calling NumPy ``accumulate`` ufuncs on DataFrames (:issue:`39259`)
can you properly reference accumulate. i would be more clear about this note, and show an example like np.maximum.accumulate(df) or similar (just the name, no need to show the computation)
Do you mean adding a reference to the numpy docs?
In any case, I already added the example here.
yes, a reference to the numpy docs
OK, done
thanks @jorisvandenbossche
@meeseeksdev backport 1.2.x
…39273) Co-authored-by: Joris Van den Bossche <jorisvandenbossche@gmail.com>
Closes #39259
So we shouldn't call non-__call__ ufuncs block-by-block, since they can work along a certain axis, like accumulate. In this case there were 2 problems: 1) we were applying the ufunc on the Block values, which is 2D but transposed compared to the DataFrame (so the accumulation was done in the wrong direction), and 2) the axis keyword was also not passed through in case the user specified it.
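To make the fix concrete, a short sketch of the intended behaviour (assuming pandas 1.2.1 with this change applied; the data is illustrative): the accumulation runs down each column by default, and a user-supplied axis keyword is now forwarded.

    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"a": [1, 3, 2, 4], "b": [0.1, 4.0, 3.0, 2.0]})

    # accumulates down each column (the DataFrame's axis=0), not along the
    # transposed block values as in the regressed behaviour
    print(np.maximum.accumulate(df))

    # the axis keyword is passed through, so row-wise accumulation also works
    print(np.maximum.accumulate(df, axis=1))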