
@trikinco/fullstack-components is no longer maintained.

Prompt

Use prompt engineering to do tasks, get responses and generate data.

<Prompt> Server Component

The most basic building block for AI-powered applications. It handles a text prompt or an array of messages and returns the response as-is.

Passing a single prompt

Live Example

The Nile River is considered the longest river in the world, with a length of approximately 6,650 kilometers (4,130 miles).
<Prompt prompt="What's the longest river in the world?" />

Passing an array of messages

Live Example

The fox scurries over the ice
<Prompt
  messages={[
    {
      role: 'system',
      content:
        'You translate Norwegian to English. You return the translated text directly',
    },
    {
      role: 'user',
      content: 'Reven rasker over isen',
    },
  ]}
/>

Wrapping element

The response can be wrapped in an element by using the `as` prop. This also lets you pass any of the standard HTML attributes expected by that element type.

Live Example

The moon has a diameter of about 3,474 kilometers (2,159 miles) and a circumference of approximately 10,921 kilometers (6,783 miles).

<Prompt
  prompt="How big is the moon?"
  as="h3"
  className="text-xl"
/>

usePrompt hook

`usePrompt` is a utility hook with full access to the same features as `Prompt`, plus the ability to enable JSON mode with the `format` option.

Live Example

'use client'
import { usePrompt } from '@trikinco/fullstack-components/client'
 
export default function Page() {
  const { isLoading, data } = usePrompt({
    prompt: 'What is the longest river in the world?',
  })
 
  if (isLoading) {
    return 'Loading...'
  }
 
  return <p>{data}</p>
}

usePrompt in JSON mode with a strict schema

Live Example

Specifying a JSON schema
'use client'
import { usePrompt } from '@trikinco/fullstack-components/client'
 
export default function Page() {
  // Optional, type variable to infer the return type for `data`
  const { isLoading, data } = usePrompt<{ rivers: string[] }>({
    // Required, JSON mode
    format: 'JSON',
    prompt: `
    What are the 5 longest rivers in the world?
    Return JSON: {"rivers": string[]}
    `,
    // 👆 Specifying the schema in the prompt is required.
    // You can play around with the wording.
  })
 
  if (isLoading) {
    return 'Loading...'
  }
 
  return (
    <ol>
      {data?.rivers.map((river) => <li key={river}>{river}</li>)}
    </ol>
  )
}

Custom Server Component with getPrompt

Server Component
import { getPrompt } from '@trikinco/fullstack-components'
 
export default async function Page() {
  const { responseText } = await getPrompt({
    prompt: 'Tell me about TailwindCSS',
  })
 
  return <p>{responseText}</p>
}

Setup

Add `prompt: handlePromptRequest()` to the API route handler.

app/api/fsutils/[...fscomponents]/route.ts
import {
  handleFSComponents,
  handlePromptRequest,
  type FSCOptions,
} from '@trikinco/fullstack-components'
 
const fscOptions: FSCOptions = {
  prompt: handlePromptRequest({
    openAiApiKey: process.env.OPENAI_API_KEY || '',
  }),
  // Additional options and handlers...
}
 
const fscHandler = handleFSComponents(fscOptions)
 
export { fscHandler as GET, fscHandler as POST }

API Reference

Types

PromptOptions

Prompt API route handler options.
Properties

openAiApiKey (string)

  • default: `process.env.OPENAI_API_KEY`

PromptRequestBody (interface)

Prompt request body.
Properties

prompt (string)

A text description of the desired output. Used to send a simple `user` message to chat completion.
  • example: 'A blog post about the best way to cook a steak'

messages (array)

A list of chat completion messages comprising a conversation. `messages` are inserted before `prompt` if both are provided.

format ('text' | 'JSON')

Enforces the response format, either 'text' or 'JSON'. When the format is `text`, the model generates a string of text. When the format is `JSON`:
  • Enables JSON mode, which constrains the model to only generate strings that parse into a valid JSON object.
  • Adds 'Return JSON' to an initial system message to avoid the API returning an error.
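As a sketch, the fields above can be combined in a single request body. The interface below is redeclared locally for illustration only; the library exports the real `PromptRequestBody` type.

```typescript
// Local mirror of the documented PromptRequestBody shape (illustrative only)
interface PromptRequestBody {
  prompt?: string
  messages?: { role: 'system' | 'user' | 'assistant'; content: string }[]
  format?: 'text' | 'JSON'
}

// `messages` provide the context; `prompt` is inserted after them
// as a simple `user` message when both are provided.
const body: PromptRequestBody = {
  format: 'JSON',
  messages: [
    {
      role: 'system',
      content: 'You are a geography assistant. Return JSON: {"rivers": string[]}',
    },
  ],
  prompt: 'What are the 5 longest rivers in the world?',
}
```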

ChatGptCompletionResponse

The response body for `getPrompt`.
Properties

responseText (string)

The response text extracted from the completion message content.

tokensUsed (number)

Total number of tokens used in the request (prompt + completion).

finishReason (string)

The reason the chat stopped. This will be `stop` if the model hit a natural stop point or a provided stop sequence, `length` if the maximum number of tokens specified in the request was reached, `content_filter` if content was omitted due to a flag from the content filters, `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function.

errorMessage (string)

The error message if there was an error.
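A small sketch of consuming these fields after a completion. The interface is redeclared locally for illustration; `getPrompt` returns the library's real `ChatGptCompletionResponse`.

```typescript
// Local mirror of the documented ChatGptCompletionResponse shape (illustrative)
interface ChatGptCompletionResponse {
  responseText?: string
  tokensUsed?: number
  finishReason?: string
  errorMessage?: string
}

// Decide what to render based on how the completion ended
function summarize(res: ChatGptCompletionResponse): string {
  if (res.errorMessage) return `Error: ${res.errorMessage}`
  if (res.finishReason === 'length') {
    // The response hit the token limit and may be cut off mid-sentence
    return `${res.responseText ?? ''} (truncated at the token limit)`
  }
  return res.responseText ?? ''
}
```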

Components

PromptProps (interface)

Props to pass to the `<Prompt>` Server Component.
Properties

prompt (string)

A text description of the desired output.
  • example: 'The longest river in the world'

messages (array)

A list of chat completion messages comprising a conversation.

children (ReactNode)

Children to render before the response content.

Prompt (function)

import { Prompt } from '@trikinco/fullstack-components' 
Prompt Server Component. Renders a prompt response as its children. The response can be wrapped in an element by using the `as` prop.
Parameters

props (AsComponent<C, PromptProps>), required

  • example: `as` usage `<Prompt prompt="The longest river in the world" as="div" className="text-xl" />`
  • link: PromptProps

returns Promise<JSX.Element>

Hooks

usePrompt (function)

import { usePrompt } from '@trikinco/fullstack-components/client'
A client-side fetch handler hook that answers a `prompt` or an array of `messages` and returns the response as-is.
Parameters

body (required)

config (UseRequestConsumerConfig<PromptRequestBody>)

Fetch utility hook request options without the `fetcher`. Allows for overriding the default `request` config.
Returns

isLoading (boolean)

Fetch loading state. `true` if the fetch is in progress.

isError (boolean)

Fetch error state. `true` if an error occurred.

error (unknown)

Fetch error object if `isError` is `true`.

data (T | undefined)

Fetch response data if the fetch was successful.

refetch (function)

Refetches the data.
returns undefined
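A minimal sketch of using the error and refetch values above in a Client Component. The prompt text here is only an example.

```typescript
'use client'
import { usePrompt } from '@trikinco/fullstack-components/client'

// Handles the loading and error states, and refetches on demand
export default function Fact() {
  const { isLoading, isError, error, data, refetch } = usePrompt({
    prompt: 'Tell me a fun fact about rivers',
  })

  if (isLoading) return <p>Loading...</p>
  if (isError) return <p>Something went wrong: {String(error)}</p>

  return (
    <div>
      <p>{data}</p>
      <button onClick={() => refetch()}>New fact</button>
    </div>
  )
}
```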

Utilities

getPrompt (function)

import { getPrompt } from '@trikinco/fullstack-components'
Answers a `prompt` or an array of `messages` and returns the response as-is. A Server Action that calls the third-party API directly on the server. This avoids calling the Next.js API route handler, allowing for performant Server Components.
Parameters

request (PromptRequestBody), required

options (PromptOptions)

returns Promise<ChatGptCompletionResponse>

fetchPrompt (function)

import { fetchPrompt } from '@trikinco/fullstack-components/client' 
Answers a `prompt` or an array of `messages` and returns the response as-is. Client-side fetch handler that calls the internal Next.js API route handler, which then calls the third-party API. Best used in Client Components and client-side functionality.
Parameters

body (PromptRequestBody), required

config (RequestConfigOnly)

Fetch utility request options without the `body`.
returns Promise<string>
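A minimal sketch of calling `fetchPrompt` from a client-side event handler, reusing the translation example from earlier. The function name `translate` is only illustrative.

```typescript
'use client'
import { fetchPrompt } from '@trikinco/fullstack-components/client'

// Calls the internal Next.js API route handler and resolves
// to the response text as a plain string
export async function translate(text: string): Promise<string> {
  return fetchPrompt({
    messages: [
      {
        role: 'system',
        content:
          'You translate Norwegian to English. You return the translated text directly',
      },
      { role: 'user', content: text },
    ],
  })
}
```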