[BETA FEATURE] Chunked Prompts - Prompt OpenAI with text way beyond the input limits #535
Replies: 4 comments 5 replies
-
Hi, does it require an OpenAI API key? I don't have a credit card to create a paid account with OpenAI.
-
This is great! I got it working! Thanks for your YouTube video on QuickAdd. I was having odd behavior with the macro not being recognized, but you dealt with the same issue when running a user script, so I was able to solve it. Not having seen a video of how this is used: is the expected behavior that, when you run the script, you paste your text into the "Chunk" modal? I thought it would simply chunk the selected text, but a modal titled "Chunk" appears and I paste the selected text in there. Either way, this is exciting. Thanks for making this.
-
Hi Christian, I'm excited to try to get this working! One question: following the directions above, I get a "Chunk" modal that looks like it's waiting for input while "Assistant is chunking. Creating prompt chunks with text and prompt template" is also showing. What am I supposed to do here? And is there a way to not have this pop up?
-
Released a new update that allows variables to be given and used.
-
What do you do if you have some text you want summarized? Easy: ask ChatGPT to do it.
Now, what if you have a large body of text you want summarized?
Do you ask ChatGPT to do so by repeatedly copying over parts of the text?
That's tedious, and we don't do that here.
Let me introduce Chunked Prompts¹ for QuickAdd.
This will chunk your text into pieces that fit within the input limits of the various models. The feature inserts these chunks into your predefined prompt template and makes a request to the OpenAI API for each of them.
It's even able to merge multiple chunks together if there's space.
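The chunking-and-merging idea can be sketched as follows. This is an illustrative sketch, not the actual script: it splits on paragraph breaks and greedily merges paragraphs into a chunk while there's still room under a character budget. The `maxChars` limit of 4000 is a placeholder, not the script's real setting.

```javascript
// Split text into chunks that fit a model's input limit, greedily
// merging consecutive paragraphs while there is still space.
// NOTE: illustrative sketch only; the real script's chunking logic
// and limits may differ.
function chunkText(text, maxChars = 4000) {
  const paragraphs = text.split("\n\n");
  const chunks = [];
  let current = "";
  for (const para of paragraphs) {
    // Merge this paragraph into the current chunk if it still fits
    // (+2 accounts for the "\n\n" separator).
    if (current.length + para.length + 2 <= maxChars) {
      current = current ? current + "\n\n" + para : para;
    } else {
      if (current) chunks.push(current);
      current = para;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

A real token-based limit would count tokens rather than characters, but the merge-while-it-fits loop is the same shape.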
This is incredibly useful for summarizing or transforming large bodies of text. Here are some of the things I've done with it:
I'm kind of on the fence about whether this should be a built-in feature for QuickAdd, so I've made a script that does the same thing in order to get feedback.
It's a very specific feature, whereas I prefer the plugin to have more primitive features that can be used as building blocks.
Setup
First, you'll need the associated script. Here's how you can use scripts in macros (video instructions):
You include the script by creating it as a file in your vault, e.g. `chunked-prompt.js`, and then pasting the script into the file. It will then appear in the scripts dropdown as an option you can add.
Find the script below:
You can now create a macro with the script. As an example, you can copy my setup. The steps should be as follows:
The script should have the following settings:
Here are explanations for each of the script settings:
If you use `{{selected}}`, it'll use the selected text. You could also hook it into another macro and use the value from some prior step, e.g. the book summarizer or YouTube summarizer from here. Using my settings will allow you to select a bunch of text, which it'll send to OpenAI in chunks. The resulting text gets put back together and inserted into the active file.
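The per-chunk request loop can be sketched like this. The `{{chunk}}` placeholder name and the `callApi` parameter are illustrative assumptions, not the script's real identifiers; `callApi` is injected here so the sketch stays self-contained, whereas the real script would call the OpenAI API with your key.

```javascript
// Insert each chunk into the prompt template, send it off, and join
// the per-chunk results back into one text. Sketch only: the template
// placeholder and API call are stand-ins for the script's internals.
async function processChunks(chunks, promptTemplate, callApi) {
  const results = [];
  for (const chunk of chunks) {
    // Insert the chunk into the predefined prompt template.
    const prompt = promptTemplate.replace("{{chunk}}", chunk);
    results.push(await callApi(prompt));
  }
  // Reassemble the per-chunk outputs into one text.
  return results.join("\n\n");
}
```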
The Capture settings are rather simple - you just send the output to the active file:
Be aware that Obsidian might freeze after you've selected the text. It's just chunking; it'll unfreeze momentarily.
Footnotes
Name pending. ↩
I extracted the text. It can't parse PDFs. ↩