[Codegen][ORT][Static Seq Length] TextGenerationPipeline (#946)
* initial commit

* coreys simplifications

* finishing the second model static

* ready, time for beautification

* ready for review

* moved the code to examples

* fix eos logic

* add argument num_tokens_to_generate
dbogunowicz authored and markurtz committed Jun 8, 2023
1 parent b1cf01b commit 0a3f48d
Showing 4 changed files with 533 additions and 5 deletions.
30 changes: 30 additions & 0 deletions examples/codegen/README.md
@@ -0,0 +1,30 @@
<!--
Copyright (c) 2021 - present / Neuralmagic, Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

Example of how to run the pipeline:

```python
from examples.codegen.text_generation import TextGenerationPipeline

codegen = TextGenerationPipeline(
    model_path="/network/damian/static-codegen-350M-multi",
    engine_type="onnxruntime",
    sequence_length=128,
)

out = codegen(sequences=["def hello_world():", "def fibonacci(x):"])
for seq in out.sequences:
    print(seq)
```
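The commit messages above mention adding a `num_tokens_to_generate` argument. The diff shown here does not document where that argument is accepted, so the following is only a minimal sketch under the assumption that it is passed as a keyword argument at call time to cap the number of generated tokens per sequence:

```python
from examples.codegen.text_generation import TextGenerationPipeline

codegen = TextGenerationPipeline(
    model_path="/network/damian/static-codegen-350M-multi",
    engine_type="onnxruntime",
    sequence_length=128,
)

# Assumption: num_tokens_to_generate limits how many tokens are generated
# per input sequence; the keyword name comes from the commit message, but
# its exact placement in the API is not confirmed by this diff.
out = codegen(
    sequences=["def fibonacci(x):"],
    num_tokens_to_generate=64,
)
print(out.sequences[0])
```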
