Running `examples/generate_json.py` (and other Mistral-based examples) fails with:
```
Traceback (most recent call last):
  File "/private/home/matthewchang/work/transformers-CFG/examples/generate_json_array.py", line 24, in <module>
    grammar = IncrementalGrammarConstraint(grammar_str, "root", tokenizer)
  File "/private/home/matthewchang/work/transformers-CFG/transformers_cfg/grammar_utils.py", line 13, in __init__
    super().__init__(*args, **kwargs)
  File "/private/home/matthewchang/work/transformers-CFG/transformers_cfg/token_grammar_recognizer.py", line 167, in __init__
    super().__init__(grammar_str, tokenizer, start_rule_name, unicode)
  File "/private/home/matthewchang/work/transformers-CFG/transformers_cfg/token_grammar_recognizer.py", line 36, in __init__
    self.unicode_trie = ByteTrie.from_tokenizer(tokenizer, unicode=unicode)
  File "/private/home/matthewchang/work/transformers-CFG/transformers_cfg/tokenization/trie.py", line 60, in from_tokenizer
    mapping = get_mapping(tokenizer, unicode=unicode)
  File "/private/home/matthewchang/work/transformers-CFG/transformers_cfg/tokenization/mapping.py", line 12, in get_mapping
    log.debug(f"tokenizer model type: {get_tokenizer_model_type(tokenizer)}")
  File "/private/home/matthewchang/work/transformers-CFG/transformers_cfg/utils.py", line 90, in get_tokenizer_model_type
    or tokenizer_json["pre_tokenizer"]["pretokenizers"][1]["type"]
KeyError: 'pretokenizers'
```
The `KeyError` shows that `get_tokenizer_model_type` assumes the tokenizer's `pre_tokenizer` config contains a `pretokenizers` list (i.e. that it is a `Sequence` pre-tokenizer), which does not hold for this tokenizer. I can reproduce this on `main` with a clean conda environment. It is fixed by this PR: #61
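For illustration only, here is a minimal sketch of the kind of defensive lookup that avoids the crash. The helper name `get_pre_tokenizer_types` is hypothetical (the actual fix is whatever #61 does), and it assumes a Hugging Face fast tokenizer, whose backend exposes its JSON config via `backend_tokenizer.to_str()`:

```python
import json

def get_pre_tokenizer_types(tokenizer):
    """Sketch: read pre-tokenizer type(s) without assuming a 'pretokenizers' list."""
    # Fast tokenizers expose their full JSON config through the backend tokenizer.
    tokenizer_json = json.loads(tokenizer.backend_tokenizer.to_str())
    pre_tokenizer = tokenizer_json.get("pre_tokenizer") or {}
    if pre_tokenizer.get("type") == "Sequence":
        # Only a Sequence pre-tokenizer carries a "pretokenizers" list.
        return [p.get("type") for p in pre_tokenizer.get("pretokenizers", [])]
    # Single pre-tokenizer (or none): fall back to its own type.
    return [pre_tokenizer.get("type")]
```

With a tokenizer whose `pre_tokenizer` is a single `Metaspace` (as with some Mistral tokenizers), this returns `["Metaspace"]` instead of raising `KeyError: 'pretokenizers'`.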