Commit

update the license
billishyahao authored and Luca-Calabria committed Nov 21, 2024
1 parent 18aa28b commit 1648a4c
Showing 1 changed file with 2 additions and 6 deletions.
8 changes: 2 additions & 6 deletions optimum/habana/transformers/models/gemma2/modeling_gemma2.py
@@ -1,10 +1,6 @@
 # coding=utf-8
-# Copyright 2023 Mistral AI and the HuggingFace Inc. team. All rights reserved.
+# Copyright 2024 Google Inc. HuggingFace Inc. team. All rights reserved.
 #
-# This code is based on EleutherAI's GPT-NeoX library and the GPT-NeoX
-# and OPT implementations in this library. It has been modified from its
-# original forms to accommodate minor architectural differences compared
-# to GPT-NeoX and OPT used by the Meta AI team that trained the model.
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
@@ -1061,4 +1057,4 @@ def apply_customized_rope(q, k, cos, sin, position_ids):
         )
     else:
         # keep the same implementation as Transformers v4.37.2
-        return apply_rotary_pos_emb(q, k, cos[position_ids], sin[position_ids])
\ No newline at end of file
+        return apply_rotary_pos_emb(q, k, cos[position_ids], sin[position_ids])
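
For context, the fallback branch in the second hunk applies standard rotary position embeddings. Below is a minimal sketch of what that call computes, assuming the Hugging Face rotate_half convention used around Transformers v4.37.2; the standalone definitions, tensor shapes, and the unsqueeze_dim parameter here are illustrative assumptions, not the exact library code.

    import torch

    def rotate_half(x):
        # Split the last dimension in half and rotate: (x1, x2) -> (-x2, x1).
        x1 = x[..., : x.shape[-1] // 2]
        x2 = x[..., x.shape[-1] // 2 :]
        return torch.cat((-x2, x1), dim=-1)

    def apply_rotary_pos_emb(q, k, cos, sin, unsqueeze_dim=1):
        # q, k: [batch, num_heads, seq_len, head_dim]
        # cos, sin: [batch, seq_len, head_dim], already gathered per position,
        # matching the diff's call with cos[position_ids] and sin[position_ids].
        cos = cos.unsqueeze(unsqueeze_dim)  # broadcast across the heads dimension
        sin = sin.unsqueeze(unsqueeze_dim)
        q_embed = (q * cos) + (rotate_half(q) * sin)
        k_embed = (k * cos) + (rotate_half(k) * sin)
        return q_embed, k_embed

Pre-indexing cos/sin with position_ids at the call site keeps this function shape-agnostic: it only has to broadcast and combine, which is why the same fallback works for both prompt and decode phases.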
