Prompt
Use prompt engineering to perform tasks, get responses, and generate data.
`<Prompt>` Server Component
The most basic building block for AI-powered applications. It handles a text prompt or an array of messages and returns the response as-is.
Passing a single prompt
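A minimal sketch of passing a single prompt. The import path below is a placeholder for the library's package name; the `prompt` value reuses the example from the API reference.

```tsx
// Sketch: a Server Component that renders the model's answer as-is.
// NOTE: 'your-ai-components-package' is a placeholder import path.
import { Prompt } from 'your-ai-components-package'

export default function Page() {
  return (
    <main>
      {/* Sends a single `user` message and renders the response text */}
      <Prompt prompt="The longest river in the world" />
    </main>
  )
}
```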
Passing an array of messages
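A sketch of passing a conversation instead of a single prompt, reusing the `messages` example from the API reference. The import path is a placeholder.

```tsx
// Sketch: passing a full conversation via the `messages` prop.
// NOTE: 'your-ai-components-package' is a placeholder import path.
import { Prompt } from 'your-ai-components-package'

export default function Page() {
  return (
    <Prompt
      messages={[
        {
          role: 'system',
          content:
            'You are a professional chef and esteemed poet. You answer cooking questions with poetry and rhyme.',
        },
        { role: 'user', content: 'What is the best way to cook a steak?' },
      ]}
    />
  )
}
```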
Wrapping element
The response can be wrapped in an element by using the `as` prop. In addition, this allows you to pass in any of the standard HTML attributes as expected by that element type.
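A sketch of wrapping the response with the `as` prop. The import path and the prompt text are placeholders; the prompt here is chosen to match the sample output shown after the code.

```tsx
// Sketch: wrap the response in a chosen element and pass that element's
// standard HTML attributes (e.g. `className`).
// NOTE: the import path and prompt text are placeholders.
import { Prompt } from 'your-ai-components-package'

export default function Page() {
  return <Prompt prompt="How big is the moon?" as="p" className="text-xl" />
}
```

The rendered output might look something like: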
The moon has a diameter of about 3,474 kilometers (2,159 miles) and a circumference of approximately 10,921 kilometers (6,783 miles).
`usePrompt` hook
`usePrompt` is a utility hook that allows for full access to the same features as `Prompt`, in addition to the ability to enable JSON mode with the `format` option.
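A sketch of calling `usePrompt` from a Client Component. The import path is a placeholder, and `data` is assumed here to hold the response text in the default text format.

```tsx
'use client'
// Sketch: Client Component using the `usePrompt` hook.
// NOTE: 'your-ai-components-package/client' is a placeholder import path, and
// the exact shape of `data` may differ; it is assumed to be the response text.
import { usePrompt } from 'your-ai-components-package/client'

export default function RiverFact() {
  const { isLoading, isError, data } = usePrompt({
    prompt: 'The longest river in the world',
  })

  if (isLoading) return <p>Loading...</p>
  if (isError) return <p>Something went wrong.</p>

  return <p>{String(data ?? '')}</p>
}
```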
`usePrompt` in JSON mode with a strict schema
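A sketch of enabling JSON mode. Describing the expected shape inside the prompt is one possible way to get a strict schema; the import path and the schema wording are placeholders, and whether `data` comes back as a parsed object or a JSON string may vary.

```tsx
'use client'
// Sketch: `usePrompt` with `format: 'JSON'` and a schema hinted in the prompt.
// NOTE: the import path and schema wording are placeholders.
import { usePrompt } from 'your-ai-components-package/client'

export default function RiverJson() {
  const { isLoading, data } = usePrompt({
    prompt:
      'The longest river in the world, as { "name": string, "lengthKm": number }',
    format: 'JSON',
  })

  if (isLoading) return <p>Loading...</p>

  // Render whatever came back; validating the JSON is up to the caller.
  return <pre>{JSON.stringify(data, null, 2)}</pre>
}
```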
Custom Server Component with `getPrompt`
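A sketch of a custom Server Component that calls `getPrompt` directly on the server. The import path is a placeholder; `getPrompt` is assumed to return a Promise of the `ChatGptCompletionResponse` described in the API reference below.

```tsx
// Sketch: custom Server Component using `getPrompt` (no API route round-trip).
// NOTE: 'your-ai-components-package' is a placeholder import path.
import { getPrompt } from 'your-ai-components-package'

export default async function SteakPost() {
  const { responseText, errorMessage } = await getPrompt({
    prompt: 'A blog post about the best way to cook a steak',
  })

  if (errorMessage) return <p>{errorMessage}</p>

  return <article>{responseText}</article>
}
```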
Setup
Add `prompt: handlePromptRequest()` to the API route handler.
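A sketch of the route handler setup. Only the `prompt: handlePromptRequest()` entry comes from the step above; the file path, the `handleAIRequests` factory name, and the import path are placeholders for however your installation wires up its API route handler.

```ts
// app/api/ai/route.ts (illustrative path)
// NOTE: `handleAIRequests` and the import path are placeholder names.
import { handleAIRequests, handlePromptRequest } from 'your-ai-components-package'

const handler = handleAIRequests({
  // Registers the prompt handler; `openAiApiKey` defaults to
  // `process.env.OPENAI_API_KEY` when not passed explicitly.
  prompt: handlePromptRequest(),
})

export { handler as GET, handler as POST }
```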
API Reference
Types
PromptOptions
Prompt API route handler options.
Properties:
- openAiApiKey: string. Default: `process.env.OPENAI_API_KEY`.

PromptRequestBody (interface)
Prompt request body.
Properties:
- prompt: string. A text description of the desired output. Used to send a simple `user` message to chat completion. Example: 'A blog post about the best way to cook a steak'.
- messages: array. A list of chat completion messages comprising a conversation. `messages` are inserted before `prompt` if both are provided. Example: `[{ role: 'system', content: 'You are a professional chef and esteemed poet. You answer cooking questions with poetry and rhyme.' }, { role: 'user', content: 'What is the best way to cook a steak?' }]`. See https://www.npmjs.com/package/openai (`openai`) for full `OpenAI.ChatCompletionMessageParam` type information.
- format: 'text' | 'JSON'. Enforces the response format to 'text' or 'JSON'. When the format is `text`, the model generates a string of text. When the format is `JSON`:
  - Enables JSON mode, which constrains the model to only generate strings that parse into a valid JSON object.
  - Adds 'Return JSON' to an initial system message to avoid the API returning an error.

ChatGptCompletionResponse
The response body for `getPrompt`.
Properties:
- responseText: string. The response text extracted from the completion message content.
- tokensUsed: number. Total number of tokens used in the request (prompt + completion).
- finishReason: string. The reason the chat stopped. This will be `stop` if the model hit a natural stop point or a provided stop sequence, `length` if the maximum number of tokens specified in the request was reached, `content_filter` if content was omitted due to a flag from our content filters, `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function.
- errorMessage: string. The error message if there was an error.

Components
PromptProps (interface<object>)
Props to pass to the `<Prompt>` Server Component.
Properties:
- prompt: string. A text description of the desired output. Example: 'The longest river in the world'.
- messages: array. A list of chat completion messages comprising a conversation.
- children: ReactNode. Children to render before the response content.

Prompt (function)
Prompt Server Component. Renders a prompt response as its children. The response can be wrapped in an element by using the `as` prop.
Parameters:
- props: AsComponent<C, PromptProps> (required). Example `as` usage: `<Prompt prompt="The longest river in the world" as="div" className="text-xl" />`. See PromptProps.
Hooks
usePrompt (function)
A client-side fetch handler hook that answers a `prompt` or an array of `messages` and returns the response as-is.
Parameters:
- body (required)
- config: UseRequestConsumerConfig<PromptRequestBody>. Fetch utility hook request options without the `fetcher`. Allows for overriding the default `request` config.
Returns:
- isLoading: boolean. Fetch loading state. `true` if the fetch is in progress.
- isError: boolean. Fetch error state. `true` if an error occurred.
- error: unknown. Fetch error object if `isError` is `true`.
- data: T | undefined. Fetch response data if the fetch was successful.
- refetch: function. Refetches the data.

Utilities
getPrompt (function)
Answers a `prompt` or an array of `messages` and returns the response as-is. Server Action that calls the third-party API directly on the server. This avoids calling the Next.js API route handler, allowing for performant Server Components.
Parameters:
- request: PromptRequestBody (required)
- options: PromptOptions. See PromptOptions.

fetchPrompt (function)
Answers a `prompt` or an array of `messages` and returns the response as-is. Client-side fetch handler that calls the internal Next.js API route handler, then the third-party API. Best used for Client Components and client-side functionality.
Parameters:
- body: PromptRequestBody (required)
- config: RequestConfigOnly. Fetch utility request options without the `body`.
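A sketch of calling `fetchPrompt` from client-side code. The import path is a placeholder, and the return value is assumed to match the `ChatGptCompletionResponse` shape described above.

```ts
// Sketch: client-side call through the internal Next.js API route handler.
// NOTE: 'your-ai-components-package/client' is a placeholder import path.
import { fetchPrompt } from 'your-ai-components-package/client'

export async function askSteakQuestion() {
  const { responseText, errorMessage } = await fetchPrompt({
    prompt: 'What is the best way to cook a steak?',
  })

  if (errorMessage) throw new Error(errorMessage)
  return responseText
}
```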