chore(deps): bump transformers from 4.33.3 to 4.36.0 in /site/utils/embeddings #180
Merged: dependabot merged 1 commit into master from dependabot/pip/site/utils/embeddings/transformers-4.36.0 on Dec 20, 2023
Conversation
Bumps [transformers](https://github.com/huggingface/transformers) from 4.33.3 to 4.36.0.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v4.33.3...v4.36.0)

---
updated-dependencies:
- dependency-name: transformers
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Dec 20, 2023
@dependabot merge
…On Thu, 21 Dec 2023, 08:12 dependabot[bot] <***@***.***> wrote:
This automated pull request fixes a security vulnerability
<https://github.com/ajfisher/ajfisher.me/security/dependabot/96>
(critical severity).
Learn more about Dependabot security updates
<https://docs.github.com/github/managing-security-vulnerabilities/configuring-dependabot-security-updates>.
------------------------------
Bumps transformers <https://github.com/huggingface/transformers> from
4.33.3 to 4.36.0.
Release notes
*Sourced from transformers' releases
<https://github.com/huggingface/transformers/releases>.*
v4.36: Mixtral, Llava/BakLlava, SeamlessM4T v2, AMD ROCm, F.sdpa widespread support

New model additions

Mixtral
Mixtral is the new open-source model from Mistral AI, announced in the
blog post Mixtral of Experts <https://mistral.ai/news/mixtral-of-experts/>.
According to the benchmark results shared in the release blog post, the
model's capabilities are comparable to ChatGPT's.
The architecture is a sparse Mixture of Experts with a top-2 routing
strategy, similar to the NllbMoe architecture in transformers. You can use
it through the AutoModelForCausalLM interface:
```python
>>> import torch
>>> from transformers import AutoModelForCausalLM, AutoTokenizer

>>> model = AutoModelForCausalLM.from_pretrained(
...     "mistralai/Mixtral-8x7B", torch_dtype=torch.float16, device_map="auto"
... )
>>> tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B")

>>> prompt = "My favourite condiment is"

>>> model_inputs = tokenizer([prompt], return_tensors="pt").to(model.device)
>>> generated_ids = model.generate(**model_inputs, max_new_tokens=100, do_sample=True)
>>> tokenizer.batch_decode(generated_ids)[0]
```
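For intuition, here is a minimal sketch of the top-2 routing idea described above. It is illustrative only; the function name and tensor shapes are assumptions, not the actual transformers implementation:

```python
import torch
import torch.nn.functional as F

def top2_route(hidden_states, gate_weight):
    # Illustrative only: score every token against each expert,
    # keep the two highest-scoring experts per token, and renormalise
    # their gate probabilities so they sum to 1.
    logits = hidden_states @ gate_weight           # (num_tokens, num_experts)
    probs = F.softmax(logits, dim=-1)
    top2_probs, top2_experts = probs.topk(2, dim=-1)
    top2_probs = top2_probs / top2_probs.sum(dim=-1, keepdim=True)
    return top2_experts, top2_probs               # each token routed to 2 experts

# Example: 4 tokens of width 16, routed across 8 experts
experts, weights = top2_route(torch.randn(4, 16), torch.randn(16, 8))
```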
The model is compatible with existing optimisation tools such as Flash
Attention 2, bitsandbytes and the PEFT library. The checkpoints are released
under the mistralai <https://huggingface.co/mistralai> organisation on the
Hugging Face Hub.
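As a hedged illustration of that compatibility, a 4-bit bitsandbytes load might look like the sketch below (it assumes a CUDA GPU with enough memory and the bitsandbytes package installed; it is not taken from the release notes):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Quantise weights to 4-bit on load via bitsandbytes to shrink the
# memory footprint of the 8x7B checkpoint.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B",
    quantization_config=quant_config,
    device_map="auto",
)
```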
Llava / BakLlava
Llava is an open-source chatbot trained by fine-tuning LLaMA/Vicuna on
GPT-generated multimodal instruction-following data. It is an
auto-regressive language model based on the transformer architecture. In
other words, it is a multi-modal version of an LLM, fine-tuned for chat /
instructions.
The Llava model was proposed in Improved Baselines with Visual
Instruction Tuning <https://arxiv.org/pdf/2310.03744> by Haotian Liu,
Chunyuan Li, Yuheng Li and Yong Jae Lee.
- [Llava] Add Llava to transformers by @younesbelkada
<https://github.com/younesbelkada> in #27662
<https://redirect.github.com/huggingface/transformers/issues/27662>
- [LLaVa] Some improvements by @NielsRogge
<https://github.com/NielsRogge> in #27895
<https://redirect.github.com/huggingface/transformers/issues/27895>
The integration also includes BakLlava
<https://github.com/SkunkworksAI/BakLLaVA>, which is a Llava model trained
with a Mistral backbone.
The model is compatible with the "image-to-text" pipeline:
```python
from transformers import pipeline
from PIL import Image
import requests

model_id = "llava-hf/llava-1.5-7b-hf"
```
... (truncated)
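Since the snippet above is cut off, here is a hedged sketch of how such an "image-to-text" pipeline call typically proceeds; the image URL and prompt are illustrative assumptions, not the missing original code:

```python
from transformers import pipeline
from PIL import Image
import requests

model_id = "llava-hf/llava-1.5-7b-hf"
pipe = pipeline("image-to-text", model=model_id)

# Hypothetical image URL, for illustration only
url = "https://example.com/photo.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Llava-style chat prompt; <image> marks where the picture is inserted
prompt = "USER: <image>\nWhat is shown in this photo?\nASSISTANT:"
outputs = pipe(image, prompt=prompt, generate_kwargs={"max_new_tokens": 100})
print(outputs[0]["generated_text"])
```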
Commits
- 1466677 <huggingface/transformers@1466677> Release: v4.36.0
- accccdd <huggingface/transformers@accccdd> [Add Mixtral] Adds support for the Mixtral MoE (#27942 <https://redirect.github.com/huggingface/transformers/issues/27942>)
- 0676d99 <huggingface/transformers@0676d99> [from_pretrained] Make from_pretrained fast again (#27709 <https://redirect.github.com/huggingface/transformers/issues/27709>)
- 9f18cc6 <huggingface/transformers@9f18cc6> Fix SDPA dispatch & make SDPA CI compatible with torch<2.1.1 (#27940 <https://redirect.github.com/huggingface/transformers/issues/27940>)
- 7ea21f1 <huggingface/transformers@7ea21f1> [LLaVa] Some improvements (#27895 <https://redirect.github.com/huggingface/transformers/issues/27895>)
- 5e620a9 <huggingface/transformers@5e620a9> Fix SeamlessM4Tv2ModelIntegrationTest (#27911 <https://redirect.github.com/huggingface/transformers/issues/27911>)
- e96c1de <huggingface/transformers@e96c1de> Skip UnivNetModelTest::test_multi_gpu_data_parallel_forward (#27912 <https://redirect.github.com/huggingface/transformers/issues/27912>)
- 8d8970e <huggingface/transformers@8d8970e> [BEiT] Fix test (#27934 <https://redirect.github.com/huggingface/transformers/issues/27934>)
- 235be08 <huggingface/transformers@235be08> [DETA] fix backbone freeze/unfreeze function (#27843 <https://redirect.github.com/huggingface/transformers/issues/27843>)
- df5c5c6 <huggingface/transformers@df5c5c6> Fix typo (#27918 <https://redirect.github.com/huggingface/transformers/issues/27918>)
- Additional commits viewable in compare view <huggingface/transformers@v4.33.3...v4.36.0>
[image: Dependabot compatibility score]
<https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores>
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting @dependabot
rebase.
------------------------------
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits
that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after
your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge
and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- @dependabot show <dependency name> ignore conditions will show all
of the ignore conditions of the specified dependency
- @dependabot ignore this major version will close this PR and stop
Dependabot creating any more for this major version (unless you reopen the
PR or upgrade to it yourself)
- @dependabot ignore this minor version will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen the
PR or upgrade to it yourself)
- @dependabot ignore this dependency will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the PR
or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the Security
Alerts page <https://github.com/ajfisher/ajfisher.me/network/alerts>.
------------------------------
You can view, comment on, or merge this pull request online at:
#180
Commit Summary
- 64bca6f
<64bca6f>
chore(deps): bump transformers in /site/utils/embeddings
File Changes
(1 file <https://github.com/ajfisher/ajfisher.me/pull/180/files>)
- *M* site/utils/embeddings/requirements.txt
<https://github.com/ajfisher/ajfisher.me/pull/180/files#diff-56dbdfaef2bf8f7c4cbb8ecc8f0486446b937133ddbbf89f9b708244d7ecd521>
(2 changes)
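For context, the change itself is a single version bump. Assuming the dependency is pinned with `==` (the exact specifier in requirements.txt is not shown in this notification), the diff would look roughly like:

```diff
-transformers==4.33.3
+transformers==4.36.0
```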
Patch Links:
- https://github.com/ajfisher/ajfisher.me/pull/180.patch
- https://github.com/ajfisher/ajfisher.me/pull/180.diff
dependabot[bot] deleted the dependabot/pip/site/utils/embeddings/transformers-4.36.0 branch on December 20, 2023 at 22:01