Update thinc.ai to v8.1.1 #762
Merged
Conversation
* Ops: replace `FloatsType` by constrained typevar

  `Ops` used `FloatsType`, which had `FloatsXd` as its bound. MyPy could not infer that code such as the following is correct:

  ```
  def dish(self, X: FloatsType, inplace: bool = False) -> FloatsType:
      tmp = X * X
      # ...
  ```

  because the inferred type is the union (or a subtype of it). If we instead constrain the type variable as follows:

  ```
  FloatsType = TypeVar("FloatsType", Floats1d, Floats2d, Floats3d, Floats4d)
  ```

  the type parameter will be instantiated with a single concrete type, solving such issues.

* Remove a bunch of casts and ignores that are no longer necessary
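The bound-vs-constrained distinction can be reproduced outside thinc. In the following self-contained sketch, the `Floats1d`/`Floats2d` classes are simplified stand-ins for thinc's array types, not thinc's actual implementations; with a bound typevar mypy only knows `X` is *some* subtype of the union, whereas the constrained typevar instantiates exactly one concrete type per call, so `X * X` typechecks:

```python
from typing import TypeVar, Union

# Stand-ins for thinc's FloatsXd array types (illustrative only).
class Floats1d:
    def __init__(self, data):
        self.data = data

    def __mul__(self, other: "Floats1d") -> "Floats1d":
        return Floats1d([a * b for a, b in zip(self.data, other.data)])

class Floats2d:
    def __init__(self, data):
        self.data = data

    def __mul__(self, other: "Floats2d") -> "Floats2d":
        return Floats2d([[a * b for a, b in zip(r1, r2)]
                         for r1, r2 in zip(self.data, other.data)])

FloatsXd = Union[Floats1d, Floats2d]

# Bound typevar: mypy treats this as "some subtype of the union", so it
# cannot rule out multiplying a Floats1d by a Floats2d and rejects X * X.
FloatsTypeBound = TypeVar("FloatsTypeBound", bound=FloatsXd)

# Constrained typevar: each call site instantiates exactly one of the
# listed types, so inside the function X * X is well-typed.
FloatsType = TypeVar("FloatsType", Floats1d, Floats2d)

def square(X: FloatsType) -> FloatsType:
    return X * X

result = square(Floats1d([1.0, 2.0, 3.0]))
```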
* Unroll `argmax` in `maxout` for small sizes of `P`

  `maxout` uses the `argmax` function to determine the index of the maximum value of each group of `P` inputs. `argmax` uses a generic array loop, which impedes speculative execution and could also prevent unrolling of the outer `maxout` loop. This change unrolls `argmax` for small values of `P` using a variadic template, which leads to a small performance improvement.

* Unmodernize struct initialization

  This is purely a cosmetic change, but less confusing than thinc-io :).
* Add `with_signpost_interval` layer

  This layer wraps another layer, adding macOS interval signposts for the forward and backward pass. These intervals can then be visualized in the macOS Instruments.app timeline.

* Fix reference in api-layers.md

  Co-authored-by: Madeesh Kannan <shadeMe@users.noreply.github.com>

* End message is optional since signpost 0.0.3
* with_signpost_interval: also wrap init callback
* docs: we wrap init as well
* Add documentation fixes suggested by @svlandeg

  Co-authored-by: Madeesh Kannan <shadeMe@users.noreply.github.com>
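The wrapping idea behind `with_signpost_interval` can be sketched without macOS: a thinc-style layer returns an output plus a backprop callback, and the wrapper brackets each phase with an interval. The `RecordingSignposter` and `with_interval` names below are illustrative stand-ins, not thinc's or the signpost library's API — the real layer emits `os_signpost` events instead of recording durations:

```python
import time
from typing import Callable, List, Tuple

class RecordingSignposter:
    """Stand-in signposter: records (name, phase, duration) intervals
    instead of emitting macOS os_signpost events."""

    def __init__(self) -> None:
        self.intervals: List[Tuple[str, str, float]] = []

    def interval(self, name: str, phase: str):
        signposter = self

        class _Interval:
            def __enter__(self):
                self.start = time.perf_counter()
                return self

            def __exit__(self, *exc):
                signposter.intervals.append(
                    (name, phase, time.perf_counter() - self.start)
                )
                return False

        return _Interval()

def with_interval(name: str, signposter, forward: Callable):
    """Wrap a forward function so the forward and backward passes are
    each bracketed by a signpost interval."""
    def wrapped(X):
        with signposter.interval(name, "forward"):
            Y, backprop = forward(X)

        def wrapped_backprop(dY):
            with signposter.interval(name, "backprop"):
                return backprop(dY)

        return Y, wrapped_backprop
    return wrapped

# A toy thinc-style layer: forward returns output and a backprop callback.
def double(X):
    return 2 * X, lambda dY: 2 * dY

sp = RecordingSignposter()
layer = with_interval("double", sp, double)
Y, backprop = layer(3)
dX = backprop(1.0)
```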
* Add `Ops.(backprop_)dish` and CUDA kernel

  Dish is a Swish/GELU-like activation function. Since it does not rely on elementary operations like `exp` or `erf`, it can generally be computed faster than Swish and GELU: https://twitter.com/danieldekok/status/1484898130441166853

* Make mypy happy

  Apparently, X * X does not typecheck (?!?).

* test_compare_activations_to_torch: test with different dY

  Also fix the backprop_dish CUDA kernel, which would fail now (thanks @shadeMe).

* test_compare_activations_to_torch: be slightly more (absolute) tolerant

  Or the Dish test would fail (possibly different accuracies for sqrt?).

* doc fix
* Update dish types to use `FloatsXdT`
* docs: add version tag to `(backprop_)dish`
* Add Dish Thinc layer
* Add Dish layer docs

  Also update the description as suggested by @kadarakos.

* Fix dish description

  Co-authored-by: Madeesh Kannan <shadeMe@users.noreply.github.com>
Co-authored-by: explosion-bot <explosion-bot@users.noreply.github.com>
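The PR summary does not reproduce the Dish formula itself. The sketch below assumes the published definition dish(x) = 0.5·x·(x/√(1 + x²) + 1) from the linked description; check thinc's `Ops.dish` for the canonical implementation. Note how it uses only multiply, add, divide, and sqrt — no `exp` or `erf`:

```python
import numpy as np

def dish(X: np.ndarray) -> np.ndarray:
    # dish(x) = 0.5 * x * (x / sqrt(1 + x^2) + 1)
    return 0.5 * X * (X / np.sqrt(1.0 + X * X) + 1.0)

def backprop_dish(dY: np.ndarray, X: np.ndarray) -> np.ndarray:
    # Analytic derivative with s = sqrt(1 + x^2):
    #   d/dx dish(x) = 0.5 * (x / s + 1) + 0.5 * x / s**3
    s = np.sqrt(1.0 + X * X)
    return dY * (0.5 * (X / s + 1.0) + 0.5 * X / s**3)

X = np.array([-2.0, 0.0, 2.0])
Y = dish(X)
dX = backprop_dish(np.ones_like(X), X)
```

Like Swish and GELU, the function is close to the identity for large positive inputs and close to zero for large negative ones.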
* Remove redundant tests. Add confection to requirements.txt and setup.cfg. Adjust config.py.
* Add reference to confection in website/docs/usage-config.md.
* Update confection reference in docs.
* Extend imports from confection for backwards compatibility.
* `PyTorchGradScaler`: cache `_found_inf` on the CPU

  This prevents unnecessary overhead from launching kernels on the GPU in hot backward passes.

* Only pin `_found_inf` to the CPU
* Always store `_found_inf` as a `bool`
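The general pattern behind this change — synchronize a device-side flag to the host once, then serve later checks from a cached `bool` — can be sketched without PyTorch. `FakeDeviceTensor` below is a stand-in whose `.item()` call simulates a costly device-to-host synchronization; it is not the torch API, and the class names are illustrative:

```python
class FakeDeviceTensor:
    """Stand-in for a GPU tensor; .item() simulates a device->host
    synchronization, which we count to show it happens only once."""
    sync_count = 0

    def __init__(self, value: float) -> None:
        self.value = value

    def item(self) -> float:
        FakeDeviceTensor.sync_count += 1
        return self.value

class GradScalerSketch:
    """Cache the found-infinity flag as a host-side bool."""

    def __init__(self) -> None:
        self._found_inf = False

    def update(self, found_inf_device: FakeDeviceTensor) -> None:
        # One synchronization per update step...
        self._found_inf = bool(found_inf_device.item())

    def found_inf(self) -> bool:
        # ...and every subsequent check is a cheap host-side read,
        # launching no kernels in the hot backward pass.
        return self._found_inf

scaler = GradScalerSketch()
scaler.update(FakeDeviceTensor(0.0))
checks = [scaler.found_inf() for _ in range(100)]
```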
* work with cupy arrays and 2d arrays
* force mypy pass
* addressing comments
* return correct shape empty array
* test remap_ids with Ints2d
* Update thinc/layers/remap_ids.py (Co-authored-by: Daniël de Kok <me@github.danieldk.eu>)
* use numpy array
* remove cupy import
* mini fix
* more strict typing
* adjust test
* Update thinc/layers/remap_ids.py (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* remove check
* Update thinc/layers/remap_ids.py (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* address reviews
* Update thinc/layers/remap_ids.py (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* simplify casting
* Update thinc/layers/remap_ids.py (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* Update thinc/layers/remap_ids.py (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* remap_ids legacy
* legacy
* test version 1 and 2
* rename legacy to v1
* adding old test back
* remap_ids docs update
* Update website/docs/api-layers.md (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* Update website/docs/api-layers.md (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* make init/forward attribute setting more clear
* Update website/docs/api-layers.md (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* Update website/docs/api-layers.md (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* Update website/docs/api-layers.md (Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>)
* prettier
* update model type
* prettier
* Use new _v2 instead of renamed _v1

Co-authored-by: Daniël de Kok <me@github.danieldk.eu>
Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>
Co-authored-by: explosion-bot <explosion-bot@users.noreply.github.com>
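The operation this commit series extends (accepting 2d integer arrays, returning correctly shaped empty output) boils down to a shape-preserving table lookup. A minimal numpy-only sketch — the function name, signature, and default value here are illustrative, not thinc's `remap_ids` API:

```python
from typing import Dict, Hashable

import numpy as np

def remap_ids(mapping: Dict[Hashable, int], ids: np.ndarray,
              default: int = 0) -> np.ndarray:
    # Look each id up in the mapping table, falling back to `default`
    # for unseen keys. Flattening and reshaping preserves the input's
    # shape, so 1d and 2d arrays (and empty arrays) work alike.
    flat = [mapping.get(key, default) for key in ids.ravel().tolist()]
    return np.asarray(flat, dtype="int64").reshape(ids.shape)

table = {10: 0, 20: 1, 30: 2}
ids_2d = np.array([[10, 20], [30, 99]])
remapped = remap_ids(table, ids_2d)
```

The "return correct shape empty array" fix above corresponds to the reshape step: for an empty input the result is an empty array of the same shape rather than a scalar or 1d default.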
…w` (explosion#757) `tf.experimental.dlpack.from_dlpack` expects a `PyCapsule` object.
* Remove references to FastAPI being an Explosion product.
* Remove period at end of subheader.
No description provided.