How to Add a Chatbot to Your Next.js App

AI is everywhere, and many companies are building AI tools to improve productivity. We are in the era of the AI copilot, where people work alongside AI to get more done. One of those tools is the AI chatbot, which can help with a wide range of tasks. In this blog we will learn how to add a chatbot to your Next.js app.


What is an AI Chatbot?

A chatbot is a computer program that has been trained to answer questions using the information you provide. With advances in natural language processing (NLP), you can build powerful chatbots that handle a wide range of tasks in plain language such as English. It all starts with pre-trained large language models (LLMs) that have been trained on information from the web and then refined with human feedback, which makes them more accurate and useful. The result is a powerful tool with many applications, one that will change how businesses everywhere operate.

How to Add a Chatbot to Next.js App Router

Adding an AI chatbot is easy with the help of the Vercel AI SDK. Many AI startups are using Next.js to build the next generation of AI applications, and Vercel has been developing tools that make working with AI providers like OpenAI seamless. Check out their Next.js App Router Quickstart to learn more. In this part I will walk you through how I added a chatbot to my website.

Create OpenAI API Key

The Vercel AI SDK supports many AI SDK providers; the one you will be using is OpenAI, so you will need an OpenAI API key. Sign in to the OpenAI platform and open the API keys dashboard at https://platform.openai.com/api-keys. From there you can create an API key for the chatbot. Save the key as an environment variable called OPENAI_API_KEY in your local env file, .env.local. For more information, check out Getting Your OpenAI API Key.

.env.local
OPENAI_API_KEY="OPENAI_API_KEY"
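
Once @ai-sdk/openai is installed (see Setup below), its openai provider reads the OPENAI_API_KEY environment variable by default, so nothing else is required. If you prefer to pass the key explicitly, for example when loading it from somewhere other than .env.local, a minimal sketch looks like this; the file path is only an illustration:

src/lib/ai/openai.ts
import { createOpenAI } from '@ai-sdk/openai'

// Create a provider instance with an explicit API key instead of
// relying on the default OPENAI_API_KEY lookup.
export const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
})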

Setup

Install the following dependencies:

pnpm add @ai-sdk/openai ai nanoid

These are the dependencies for creating an OpenAI chatbot that uses the OpenAI Chat Completions API. This is an LLM that can hold a conversation with the user; for that to work, the LLM needs the chat history so it can answer questions that refer back to previous messages. There are a few ways to build a chatbot, and you will be creating one using AI and UI state, which is the latest approach the SDK supports. Read Introducing AI SDK 3.0 with Generative UI support.
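
Note that the Chat component later in this post also imports clsx for conditional class names; if it is not already in your project, install it as well:

pnpm add clsx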

Create useStreamableText Hook

You will need a hook that converts a streamable text value into a string. Create a hook called useStreamableText in useStreamableText.tsx.

src/hooks/useStreamableText.tsx
import { StreamableValue, readStreamableValue } from 'ai/rsc'
import { useEffect, useState } from 'react'

export const useStreamableText = (content: string | StreamableValue<string>) => {
  const [rawContent, setRawContent] = useState(typeof content === 'string' ? content : '')

  useEffect(() => {
    ;(async () => {
      if (typeof content === 'object') {
        let value = ''
        for await (const delta of readStreamableValue(content)) {
          if (typeof delta === 'string') {
            setRawContent((value = value + delta))
          }
        }
      }
    })()
  }, [content])

  return rawContent
}

Create UI Message Components

With the AI and UI state approach you need two UI components: one for the assistant and one for the user. The assistant component renders the chatbot's response, and the user component renders the prompt message.

src/components/chat/assistant-message.tsx
'use client'

import { OpenaiIcon } from '@/components/icons'
import { StreamableValue } from 'ai/rsc'
import { useStreamableText } from '@/hooks/useStreamableText'

type AssistantMessageProps = {
  content: string | StreamableValue<string>
  loading?: boolean
}

function AssistantMessage(props: AssistantMessageProps) {
  const { content, loading } = props

  const text = useStreamableText(content)

  return (
    <>
      <div className="relative flex flex-shrink-0 flex-col items-end">
        <div className="relative flex h-6 w-6 items-center justify-center rounded-full border border-[#424242] bg-[#212121] p-1 text-[#ececec]">
          <OpenaiIcon size={16} />
        </div>
      </div>
      <div className="relative flex w-full min-w-0 flex-col">
        <div className="select-none font-semibold">{'ChatGPT: '}</div>
        <div className="flex-col gap-1 md:gap-3">
          <div className="flex max-w-full flex-grow flex-col">
            <div
              dir="auto"
              className="flex min-h-[20px] flex-col items-start gap-3 overflow-x-auto whitespace-pre-wrap break-words"
            >
              <div className="prose-invert w-full break-words">
                <p>
                  {loading ? (
                    <svg
                      fill="none"
                      stroke="currentColor"
                      strokeWidth="1.5"
                      viewBox="0 0 24 24"
                      strokeLinecap="round"
                      strokeLinejoin="round"
                      xmlns="http://www.w3.org/2000/svg"
                      className="size-5 animate-spin stroke-zinc-400"
                    >
                      <path d="M12 3v3m6.366-.366-2.12 2.12M21 12h-3m.366 6.366-2.12-2.12M12 21v-3m-6.366.366 2.12-2.12M3 12h3m-.366-6.366 2.12 2.12"></path>
                    </svg>
                  ) : (
                    text
                  )}
                </p>
              </div>
            </div>
          </div>
        </div>
      </div>
    </>
  )
}

export default AssistantMessage

src/components/chat/user-message.tsx
import { memo } from 'react'
import Avatar from '@/components/avatar'

type UserMessageProps = {
  content: string
}

function UserMessage(props: UserMessageProps) {
  const { content } = props

  return (
    <>
      <div className="relative flex flex-shrink-0 flex-col items-end">
        <Avatar size={24} />
      </div>
      <div className="relative flex w-full min-w-0 flex-col">
        <div className="select-none font-semibold">{'You: '}</div>
        <div className="flex-col gap-1 md:gap-3">
          <div className="flex max-w-full flex-grow flex-col">
            <div
              dir="auto"
              className="flex min-h-[20px] flex-col items-start gap-3 overflow-x-auto whitespace-pre-wrap break-words"
            >
              <div className="prose-invert w-full break-words">
                <p>{content}</p>
              </div>
            </div>
          </div>
        </div>
      </div>
    </>
  )
}

export default memo(UserMessage)

Both of these UI components will be used in the chatbot and represent the messages in the chat.
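
The server action and the playground page later in this post import these components from '@/components/chat', so I also keep a small barrel file in that folder. This is an assumption based on those imports; adjust it to match how your project re-exports components (the Chat component it references is created in a later step):

src/components/chat/index.ts
export { default as AssistantMessage } from './assistant-message'
export { default as UserMessage } from './user-message'
// Chat is the client component created in the "Create Chat Component" step.
export { default } from './chat'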

Create Server Action For Chat

Server Actions are asynchronous functions that are executed on the server. They can be called from both server and client components, which is how the chat UI will talk to the model. For this you will create an actions.tsx file.

src/lib/chat/actions.tsx
import 'server-only'

import { createAI, getMutableAIState, streamUI, createStreamableValue } from 'ai/rsc'
import { openai } from '@ai-sdk/openai'
import { nanoid } from 'nanoid'
import { AssistantMessage } from '@/components/chat'

export type Message = {
  role: 'user' | 'assistant' | 'system' | 'function' | 'data' | 'tool'
  content: string
  id: string
  name?: string
}

export type AIState = {
  chatId: string
  messages: Message[]
}

export type UIState = {
  id: string
  display: React.ReactNode
}[]

async function submitUserMessage(content: string) {
  'use server'

  const aiState = getMutableAIState<typeof AI>()

  aiState.update({
    ...aiState.get(),
    messages: [
      ...aiState.get().messages,
      {
        id: nanoid(),
        role: 'user',
        content,
      },
    ],
  })

  let textStream: undefined | ReturnType<typeof createStreamableValue<string>>
  let textNode: undefined | React.ReactNode

  const result = await streamUI({
    model: openai('gpt-3.5-turbo'),
    system: 'You are a friendly bot',
    messages: [
      ...aiState.get().messages.map((message: any) => ({
        role: message.role,
        content: message.content,
        name: message.name,
      })),
    ],
    text: ({ content, delta, done }) => {
      if (!textStream) {
        textStream = createStreamableValue('')
        textNode = <AssistantMessage content={textStream.value as string} />
      }

      if (done) {
        textStream.done()
        aiState.done({
          ...aiState.get(),
          messages: [
            ...aiState.get().messages,
            {
              id: nanoid(),
              role: 'assistant',
              content,
            },
          ],
        })
      } else {
        textStream.update(delta)
      }

      return textNode
    },
    maxTokens: 1000,
    temperature: 0.5,
    topP: 0.5,
  })

  return {
    id: nanoid(),
    display: result.value,
  }
}

export const AI = createAI<AIState, UIState>({
  actions: {
    submitUserMessage,
  },
  initialUIState: [],
  initialAIState: { chatId: nanoid(), messages: [] },
})

In this file we define both the AIState and the UIState. The AIState stores the chat history, and the UIState holds the UI components that will be displayed in the chat. There is an action called submitUserMessage that makes a request to the OpenAI API, sending everything it needs: the OpenAI model "gpt-3.5-turbo", the system prompt, and the chat history in messages. The response is a stream of text that is converted to a string and displayed in the chat. The return value is a UIState entry containing an AssistantMessage.
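
To make the two state shapes concrete, here is roughly what the AIState looks like after one exchange; the ids are illustrative nanoid values:

const exampleAIState: AIState = {
  chatId: 'V1StGXR8_Z5jdHi6B-myT',
  messages: [
    { id: 'FyhTFzzZ9mrqfgFnIcv66', role: 'user', content: 'What is the Vercel AI SDK?' },
    { id: 'x4K9pQn2LwYdMvA7sRbE3', role: 'assistant', content: 'It is a TypeScript toolkit for building AI-powered apps.' },
  ],
}

Each entry in the matching UIState pairs an id with the React node (a UserMessage or AssistantMessage) that renders that message.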

Create Chat Component

Next, create a chat component that will display all of the messages along with the prompt input. Create a component called Chat.

src/components/chat/chat.tsx
'use client'
import { useState, useRef } from 'react'
import { useActions, useUIState } from 'ai/rsc'
import clsx from 'clsx'
import { nanoid } from 'nanoid'
import UserMessage from './user-message'
import AssistantMessage from './assistant-message'

import type { AI } from '@/lib/chat/actions'

export default function Chat() {
  const [loading, setLoading] = useState(false)
  const { submitUserMessage } = useActions()
  const [messages, setMessages] = useUIState<typeof AI>()
  const [input, setInput] = useState('')
  const textareaRef = useRef<HTMLTextAreaElement>(null)

  const handleSubmit = async (event: React.FormEvent<HTMLFormElement>) => {
    event.preventDefault()

    const value = input.trim()
    setInput('')
    if (!value) return

    setMessages((currentMessages) => [
      ...currentMessages,
      {
        id: nanoid(),
        display: <UserMessage content={value} />,
      },
    ])

    setLoading(true)

    const responseMessage = await submitUserMessage(value)
    setMessages((currentMessages) => [...currentMessages, responseMessage])
    setLoading(false)
  }

  function handleKeyDown(event: React.KeyboardEvent<HTMLTextAreaElement>) {
    if (event.key === 'Enter' && !event.shiftKey && !event.nativeEvent.isComposing) {
      event.preventDefault()
      const submitButtonElement = document.getElementById('chat-submit-button') as HTMLButtonElement
      if (!submitButtonElement.disabled) {
        submitButtonElement.click()
      }
    }
  }

  const handleInput = (event: React.FormEvent<HTMLTextAreaElement>) => {
    if (textareaRef.current) {
      const eventTarget = event.target as HTMLTextAreaElement
      textareaRef.current.style.height = 'auto'
      textareaRef.current.style.height = `${eventTarget.scrollHeight + 2}px`
    }
  }

  const handleInputChange = (event: React.ChangeEvent<HTMLTextAreaElement>) => {
    setInput(event.target.value)
  }

  const show = input.length > 0

  return (
    <div className="mb-4 flex h-full w-full flex-col items-center">
      <div className="mb-4 h-full w-full flex-1 overflow-y-auto  md:max-w-3xl lg:max-w-[40rem] xl:max-w-[48rem]">
        <div>
          {messages.map((message, index) => {
            const isLast = messages.length - 1 == index

            return (
              <div key={index}>
                <div className="mx-auto mb-[28px] flex flex-1 gap-3 text-base md:max-w-3xl lg:max-w-[40rem] xl:max-w-[48rem]">
                  {message.display}
                </div>
                {isLast && loading && (
                  <div className="mx-auto mb-[28px] flex flex-1 gap-3 text-base md:max-w-3xl lg:max-w-[40rem] xl:max-w-[48rem]">
                    <AssistantMessage content="" loading />
                  </div>
                )}
              </div>
            )
          })}
        </div>
      </div>
      <form onSubmit={handleSubmit} className="relative w-full md:max-w-3xl lg:max-w-[40rem] xl:max-w-[48rem]">
        <textarea
          ref={textareaRef}
          className="chat-editor-input  h-auto max-h-[25dvh] rounded-2xl border border-zinc-700 bg-zinc-700/[0.15] py-[10px] pl-4 pr-10 text-zinc-200 shadow-zinc-800/5 outline-none placeholder:text-zinc-500 md:py-3.5 md:pl-6 md:pr-12"
          autoComplete="off"
          rows={1}
          autoCapitalize="off"
          value={input}
          maxLength={2048}
          placeholder="Message ChatGPT"
          onChange={handleInputChange}
          onInput={handleInput}
          onKeyDown={handleKeyDown}
          autoFocus
        />
        <button
          id="chat-submit-button"
          aria-label="Submit Button"
          type="submit"
          className={clsx(
            'absolute bottom-[14px] right-2 rounded-lg border border-white bg-white p-0.5 text-white transition-colors hover:bg-white md:bottom-[18px] md:right-3',
            show ? 'opacity-1 text-white' : 'cursor-default text-gray-400 opacity-10'
          )}
        >
          <svg width="24" height="24" viewBox="0 0 24 24" fill="none" className="text-black">
            <path
              d="M7 11L12 6L17 11M12 18V7"
              stroke="currentColor"
              strokeWidth="2"
              strokeLinecap="round"
              strokeLinejoin="round"
            ></path>
          </svg>
        </button>
      </form>
    </div>
  )
}

Create Chat Playground Page

Now you can create a Next.js page for the chatbot. OpenAI offers a playground after you log in where you can experiment with their models, and you can build a similar playground in your Next.js app by creating a page.tsx file.

src/app/playground/chat/page.tsx
import Chat from '@/components/chat'
import { nanoid } from 'nanoid'
import { AI } from '@/lib/chat/actions'

export default async function ChatPlaygroundPage() {
  const id = nanoid()

  return (
    <AI initialAIState={{ chatId: id, messages: [] }}>
      <Chat />
    </AI>
  )
}

The most important part is marking the page component as async, which allows the server component to run the server action asynchronously. That's everything you need to add an OpenAI chatbot to your Next.js app! There are many more features you could add to your chatbot. You can try it out at the Chat Playground.

Summary

Many businesses are adding AI to their products and looking for ways to use AI tools to improve productivity, reduce costs, and provide value to their customers. The AI software engineer is an emerging role that requires an in-depth understanding of AI principles and the ability to integrate AI into web applications. The Vercel AI SDK is a powerful tool for interacting with LLM providers like OpenAI and for streaming data back from LLMs. In this blog you added a chatbot to your Next.js app. Check out the Next.js AI Chatbot GitHub for the code.

About the Author

Jordan is a full stack engineer with years of experience working at startups. He enjoys learning about software development and building something people want. What makes him happy is music. He is passionate about discovering music and is an aspiring DJ. He wants to create his own music and is in the process of finding his own sound.