feat: Add support for component names (#120)
## Proposed changes

~Change the API to remove the necessity of a wrapper for
`gsx.Component`. It is still useful to reference either the
`gsx.Component` type or the `gsx.ComponentProps` type, but they are
technically optional if your component does not accept children.~

Cleaned up the execution model (removes the extra nested function).

~I opted not to remove `gsx.StreamComponent`. It is possible, but it
requires that `jsx` infer whether a component is a stream component
from the fact that its result is a `Streamable`. This gets a bit
strange because arrays are technically `IterableIterator`s, so when
`props.stream` is undefined it can be ambiguous whether a result is an
array (which should not be collapsed) or an iterator (which should be
collapsed). By keeping the `StreamComponent` wrapper, the developer
makes their intention very clear.~
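The ambiguity described above is easy to demonstrate. This is an illustrative sketch, not gsx code — both an array result and a generator result satisfy the iterable protocol, so a runtime check alone cannot tell the framework whether to collapse the result:

```typescript
// Illustrative sketch of the ambiguity (not part of the gsx API):
// arrays and generator objects both expose Symbol.iterator, so a
// runtime "is this a stream?" check cannot distinguish them.
function looksStreamable(result: unknown): boolean {
  return (
    result != null &&
    typeof (result as Iterable<unknown>)[Symbol.iterator] === "function"
  );
}

function* tokenStream() {
  yield "hello";
  yield "world";
}

const arrayResult = ["topic 1", "topic 2"]; // should NOT be collapsed
const iteratorResult = tokenStream(); // SHOULD be collapsed

// Both pass the same structural check, so the developer's intent has
// to come from somewhere else — e.g. an explicit StreamComponent wrapper.
console.log(looksStreamable(arrayResult)); // true
console.log(looksStreamable(iteratorResult)); // true
```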

~I'm on the fence as to whether this is an overall improvement to the
framework or not. On one hand, it enables slightly simpler component
definitions, but on the other, if you forget to use the
`gsx.ComponentProps` or `gsx.Component` type, the compiler will not
allow you to pass children to the component. As far as I can tell,
there is no typing magic we can use to make the compiler behave
otherwise. So this introduces a footgun with limited benefit (you
still need to use `gsx.Component` anywhere you define a component, but
it's just a type now).~

After a bunch of discussion, we opted to keep the `gsx.Component`
wrapper and add a `name` parameter. We still fixed up the execution
model to remove the unnecessary nested function.
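The resulting call shape — a name string followed by the implementation function — can be sketched with a minimal stand-in for the wrapper. The real `gsx.Component` does considerably more (JSX integration, children, execution tracking); this only illustrates the new signature:

```typescript
// Minimal stand-in for the gsx.Component wrapper, illustrating only
// the new (name, fn) signature — NOT the real implementation.
type ComponentFn<P, O> = (props: P) => O | Promise<O>;

function Component<P, O>(name: string, fn: ComponentFn<P, O>) {
  const wrapped = (props: P) => fn(props);
  // Attach the name so tooling (logs, traces) can identify the component.
  Object.defineProperty(wrapped, "name", { value: name });
  return wrapped;
}

interface GreeterProps {
  who: string;
}

// Components are now named at definition time:
const Greeter = Component<GreeterProps, string>(
  "Greeter",
  ({ who }) => `Hello, ${who}!`,
);

console.log(Greeter.name); // "Greeter"
```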
jmoseley authored Jan 17, 2025
1 parent 6a36077 commit cc5d69c
Showing 26 changed files with 620 additions and 442 deletions.
6 changes: 6 additions & 0 deletions .vscode/settings.json
@@ -14,5 +14,11 @@
"cSpell.words": ["gensx", "jsxs", "Streamable"],
"[jsonc]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
},
"files.associations": {
"*.tsx.template": "typescriptreact",
"*.ts.template": "typescript",
"*.json.template": "json",
"*.md.template": "markdown"
}
}
65 changes: 36 additions & 29 deletions docs/src/content/docs/overview.mdx
@@ -43,6 +43,7 @@ interface LLMResearchProps {
}
type LLMResearchOutput = string;
const LLMResearch = gsx.Component<LLMResearchProps, LLMResearchOutput>(
"LLMResearch",
async ({ topic }) => {
console.log("📚 Researching topic:", topic);
const systemPrompt = `You are a helpful assistant that researches topics...`;
@@ -71,19 +72,22 @@ All GenSX components support nesting, a pattern to access component outputs via

```tsx
export const BlogWritingWorkflow =
gsx.StreamComponent<BlogWritingWorkflowProps>(async ({ prompt }) => {
return (
<OpenAIProvider apiKey={process.env.OPENAI_API_KEY}>
<ParallelResearch prompt={prompt}>
{(research) => (
<LLMWriter prompt={prompt} research={research.flat()}>
{(draft) => <LLMEditor draft={draft} stream={true} />}
</LLMWriter>
)}
</ParallelResearch>
</OpenAIProvider>
);
});
gsx.StreamComponent<BlogWritingWorkflowProps>(
"BlogWritingWorkflow",
async ({ prompt }) => {
return (
<OpenAIProvider apiKey={process.env.OPENAI_API_KEY}>
<ParallelResearch prompt={prompt}>
{(research) => (
<LLMWriter prompt={prompt} research={research.flat()}>
{(draft) => <LLMEditor draft={draft} stream={true} />}
</LLMWriter>
)}
</ParallelResearch>
</OpenAIProvider>
);
},
);
```

There is no need for a DSL or graph API to define the structure of your workflow. More complex patterns like cycles and agents can be encapsulated in components that use standard loops and conditionals. TypeScript and JSX unify workflow definition and execution in plain old TypeScript, with the ability to express all of the same patterns.
@@ -145,22 +149,25 @@ GenSX makes this easy to handle with `StreamComponent`.
A single component implementation can be used in both streaming and non-streaming contexts by setting the `stream` prop:

```tsx
const LLMEditor = gsx.StreamComponent<LLMEditorProps>(async ({ draft }) => {
console.log("🔍 Editing draft");
const systemPrompt = `You are a helpful assistant that edits blog posts...`;

return (
<ChatCompletion
stream={true}
model="gpt-4o-mini"
temperature={0}
messages={[
{ role: "system", content: systemPrompt },
{ role: "user", content: draft },
]}
/>
);
});
const LLMEditor = gsx.StreamComponent<LLMEditorProps>(
"LLMEditor",
async ({ draft }) => {
console.log("🔍 Editing draft");
const systemPrompt = `You are a helpful assistant that edits blog posts...`;

return (
<ChatCompletion
stream={true}
model="gpt-4o-mini"
temperature={0}
messages={[
{ role: "system", content: systemPrompt },
{ role: "user", content: draft },
]}
/>
);
},
);
```

From there you can use the component in a streaming context:
1 change: 1 addition & 0 deletions docs/src/content/docs/quickstart.mdx
@@ -43,6 +43,7 @@ interface RespondProps {
type RespondOutput = string;

const Respond = gsx.Component<RespondProps, RespondOutput>(
"Respond",
async ({ userInput }) => {
return (
<ChatCompletion
111 changes: 61 additions & 50 deletions examples/blogWriter/blogWriter.tsx
@@ -10,7 +10,7 @@ interface LLMResearchBrainstormOutput {
const LLMResearchBrainstorm = gsx.Component<
LLMResearchBrainstormProps,
LLMResearchBrainstormOutput
>(({ prompt }) => {
>("LLMResearchBrainstorm", ({ prompt }) => {
console.log("🔍 Starting research for:", prompt);
const systemPrompt = `You are a helpful assistant that brainstorms topics for a researching a blog post. The user will provide a prompt and you will brainstorm topics based on the prompt. You should return 3 - 5 topics, as a JSON array.
@@ -39,6 +39,7 @@ interface LLMResearchProps {
}
type LLMResearchOutput = string;
const LLMResearch = gsx.Component<LLMResearchProps, LLMResearchOutput>(
"LLMResearch",
({ topic }) => {
console.log("📚 Researching topic:", topic);
const systemPrompt = `You are a helpful assistant that researches topics. The user will provide a topic and you will research the topic. You should return a summary of the research, summarizing the most important points in a few sentences at most.`;
@@ -62,6 +63,7 @@ interface LLMWriterProps {
}
type LLMWriterOutput = string;
const LLMWriter = gsx.Component<LLMWriterProps, LLMWriterOutput>(
"LLMWriter",
({ prompt, research }) => {
const systemPrompt = `You are a helpful assistant that writes blog posts. The user will provide a prompt and you will write a blog post based on the prompt. Unless specified by the user, the blog post should be 200 words.
@@ -84,38 +86,42 @@ Here is the research for the blog post: ${research.join("\n")}`;
interface LLMEditorProps {
draft: string;
}
const LLMEditor = gsx.StreamComponent<LLMEditorProps>(({ draft }) => {
console.log("🔍 Editing draft");
const systemPrompt = `You are a helpful assistant that edits blog posts. The user will provide a draft and you will edit it to make it more engaging and interesting.`;
const LLMEditor = gsx.StreamComponent<LLMEditorProps>(
"LLMEditor",
({ draft }) => {
console.log("🔍 Editing draft");
const systemPrompt = `You are a helpful assistant that edits blog posts. The user will provide a draft and you will edit it to make it more engaging and interesting.`;

return (
<ChatCompletion
stream={true}
model="gpt-4o-mini"
temperature={0}
messages={[
{ role: "system", content: systemPrompt },
{ role: "user", content: draft },
]}
/>
);
});
return (
<ChatCompletion
stream={true}
model="gpt-4o-mini"
temperature={0}
messages={[
{ role: "system", content: systemPrompt },
{ role: "user", content: draft },
]}
/>
);
},
);

interface WebResearcherProps {
prompt: string;
}
type WebResearcherOutput = string[];
const WebResearcher = gsx.Component<WebResearcherProps, WebResearcherOutput>(
async ({ prompt }) => {
console.log("🌐 Researching web for:", prompt);
const results = await Promise.resolve([
"web result 1",
"web result 2",
"web result 3",
]);
return results;
},
);
export const WebResearcher = gsx.Component<
WebResearcherProps,
WebResearcherOutput
>("WebResearcher", async ({ prompt }) => {
console.log("🌐 Researching web for:", prompt);
const results = await Promise.resolve([
"web result 1",
"web result 2",
"web result 3",
]);
return results;
});

type ParallelResearchOutput = [string[], string[]];
interface ParallelResearchComponentProps {
@@ -124,31 +130,36 @@ interface ParallelResearchComponentProps {
const ParallelResearch = gsx.Component<
ParallelResearchComponentProps,
ParallelResearchOutput
>(({ prompt }) => (
<>
<LLMResearchBrainstorm prompt={prompt}>
{({ topics }) => {
return topics.map((topic) => <LLMResearch topic={topic} />);
}}
</LLMResearchBrainstorm>
<WebResearcher prompt={prompt} />
</>
));
>("ParallelResearch", ({ prompt }) => {
return (
<>
<LLMResearchBrainstorm prompt={prompt}>
{({ topics }) => {
return topics.map((topic) => <LLMResearch topic={topic} />);
}}
</LLMResearchBrainstorm>
<WebResearcher prompt={prompt} />
</>
);
});

interface BlogWritingWorkflowProps {
prompt: string;
}
export const BlogWritingWorkflow =
gsx.StreamComponent<BlogWritingWorkflowProps>(({ prompt }) => {
return (
<OpenAIProvider apiKey={process.env.OPENAI_API_KEY}>
<ParallelResearch prompt={prompt}>
{(research) => (
<LLMWriter prompt={prompt} research={research.flat()}>
{(draft) => <LLMEditor draft={draft} stream={true} />}
</LLMWriter>
)}
</ParallelResearch>
</OpenAIProvider>
);
});
gsx.StreamComponent<BlogWritingWorkflowProps>(
"BlogWritingWorkflow",
({ prompt }) => {
return (
<OpenAIProvider apiKey={process.env.OPENAI_API_KEY}>
<ParallelResearch prompt={prompt}>
{(research) => (
<LLMWriter prompt={prompt} research={research.flat()}>
{(draft) => <LLMEditor draft={draft} stream={true} />}
</LLMWriter>
)}
</ParallelResearch>
</OpenAIProvider>
);
},
);