React & Next.js AI Chatbot Integration: Complete Developer Guide for 2026

React · Next.js · Integration · TypeScript · Developer Guide

Adding an AI chatbot to your React or Next.js application opens up powerful possibilities for user engagement, support automation, and lead generation. This comprehensive guide walks you through every integration approach—from simple script embeds to fully custom implementations.

Why React & Next.js for AI Chatbots?

The React Ecosystem Advantage

React's component-based architecture makes it ideal for chatbot integration:

  • Component Isolation: Chat widgets live as self-contained components
  • State Management: Easy integration with Redux, Zustand, or Context
  • Hooks: Custom hooks for chat logic and API communication
  • SSR/SSG Support: Next.js enables SEO-friendly chat implementations
  • TypeScript: Full type safety for chat interfaces and API responses

Common Use Cases

Use Case          | Implementation Complexity | Best Approach
Support widget    | Low                       | Script embed
In-app assistant  | Medium                    | Component library
Custom AI chat    | High                      | API integration
Multi-tenant chat | High                      | Custom + API

Method 1: Script Embed (Quickest)

The fastest way to add a chatbot to any React/Next.js app.

Basic Script Injection

// components/ChatWidget.tsx
'use client';

import { useEffect } from 'react';

export function ChatWidget() {
  useEffect(() => {
    // Prevent duplicate script injection
    if (document.getElementById('chat-widget-script')) return;

    const script = document.createElement('script');
    script.id = 'chat-widget-script';
    script.src = 'https://widget.example.com/chat.js';
    script.async = true;
    script.dataset.widgetId = 'YOUR_WIDGET_ID';

    document.body.appendChild(script);

    return () => {
      // Cleanup on unmount
      const existingScript = document.getElementById('chat-widget-script');
      if (existingScript) {
        existingScript.remove();
      }
    };
  }, []);

  return null;
}

Using in Next.js App Router

// app/layout.tsx
import { ChatWidget } from '@/components/ChatWidget';

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body>
        {children}
        <ChatWidget />
      </body>
    </html>
  );
}

Using Next.js Script Component

Next.js provides an optimized Script component:

// app/layout.tsx
import Script from 'next/script';

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body>
        {children}
        <Script
          src="https://widget.example.com/chat.js"
          strategy="lazyOnload"
          data-widget-id="YOUR_WIDGET_ID"
        />
      </body>
    </html>
  );
}

Script Loading Strategies

// Different loading strategies for different needs

// Load during browser idle time, after the page has loaded (recommended for chat)
<Script src="..." strategy="lazyOnload" />

// Load early, right after the page starts hydrating, without blocking it
<Script src="..." strategy="afterInteractive" />

// Load before any Next.js code and before hydration (not recommended for chat)
<Script src="..." strategy="beforeInteractive" />

Method 2: Component Library Integration

For more control, use a React component library or build your own.

Using a Chat UI Library

npm install @chatscope/chat-ui-kit-react @chatscope/chat-ui-kit-styles

// components/CustomChat.tsx
'use client';

import { useState, useCallback } from 'react';
import {
  MainContainer,
  ChatContainer,
  MessageList,
  Message,
  MessageInput,
  TypingIndicator,
} from '@chatscope/chat-ui-kit-react';
import '@chatscope/chat-ui-kit-styles/dist/default/styles.min.css';

interface ChatMessage {
  message: string;
  sender: 'user' | 'bot';
  direction: 'incoming' | 'outgoing';
}

export function CustomChat() {
  const [messages, setMessages] = useState<ChatMessage[]>([
    {
      message: "Hi! How can I help you today?",
      sender: 'bot',
      direction: 'incoming',
    },
  ]);
  const [isTyping, setIsTyping] = useState(false);

  const handleSend = useCallback(async (text: string) => {
    // Add user message
    const userMessage: ChatMessage = {
      message: text,
      sender: 'user',
      direction: 'outgoing',
    };

    setMessages((prev) => [...prev, userMessage]);
    setIsTyping(true);

    try {
      // Call your AI API
      const response = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ message: text }),
      });

      const data = await response.json();

      const botMessage: ChatMessage = {
        message: data.reply,
        sender: 'bot',
        direction: 'incoming',
      };

      setMessages((prev) => [...prev, botMessage]);
    } catch (error) {
      console.error('Chat error:', error);
    } finally {
      setIsTyping(false);
    }
  }, []);

  return (
    <div style={{ height: '500px', width: '400px' }}>
      <MainContainer>
        <ChatContainer>
          <MessageList
            typingIndicator={
              isTyping ? <TypingIndicator content="AI is thinking..." /> : null
            }
          >
            {messages.map((msg, i) => (
              <Message
                key={i}
                model={{
                  message: msg.message,
                  direction: msg.direction,
                  position: 'single',
                }}
              />
            ))}
          </MessageList>
          <MessageInput
            placeholder="Type your message..."
            onSend={handleSend}
            attachButton={false}
          />
        </ChatContainer>
      </MainContainer>
    </div>
  );
}

Building a Minimal Chat Component

// components/MinimalChat.tsx
'use client';

import { useState, useRef, useEffect, FormEvent } from 'react';

interface Message {
  id: string;
  role: 'user' | 'assistant';
  content: string;
  timestamp: Date;
}

export function MinimalChat() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [input, setInput] = useState('');
  const [isLoading, setIsLoading] = useState(false);
  const messagesEndRef = useRef<HTMLDivElement>(null);

  const scrollToBottom = () => {
    messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
  };

  useEffect(() => {
    scrollToBottom();
  }, [messages]);

  const handleSubmit = async (e: FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isLoading) return;

    const userMessage: Message = {
      id: crypto.randomUUID(),
      role: 'user',
      content: input.trim(),
      timestamp: new Date(),
    };

    setMessages((prev) => [...prev, userMessage]);
    setInput('');
    setIsLoading(true);

    try {
      const response = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          messages: [...messages, userMessage].map((m) => ({
            role: m.role,
            content: m.content,
          })),
        }),
      });

      const data = await response.json();

      const assistantMessage: Message = {
        id: crypto.randomUUID(),
        role: 'assistant',
        content: data.reply,
        timestamp: new Date(),
      };

      setMessages((prev) => [...prev, assistantMessage]);
    } catch (error) {
      console.error('Failed to send message:', error);
    } finally {
      setIsLoading(false);
    }
  };

  return (
    <div className="flex flex-col h-[500px] w-full max-w-md border rounded-lg">
      {/* Messages */}
      <div className="flex-1 overflow-y-auto p-4 space-y-4">
        {messages.map((message) => (
          <div
            key={message.id}
            className={`flex ${
              message.role === 'user' ? 'justify-end' : 'justify-start'
            }`}
          >
            <div
              className={`max-w-[80%] rounded-lg px-4 py-2 ${
                message.role === 'user'
                  ? 'bg-blue-500 text-white'
                  : 'bg-gray-200 text-gray-900'
              }`}
            >
              {message.content}
            </div>
          </div>
        ))}
        {isLoading && (
          <div className="flex justify-start">
            <div className="bg-gray-200 rounded-lg px-4 py-2">
              <span className="animate-pulse">Thinking...</span>
            </div>
          </div>
        )}
        <div ref={messagesEndRef} />
      </div>

      {/* Input */}
      <form onSubmit={handleSubmit} className="border-t p-4">
        <div className="flex gap-2">
          <input
            type="text"
            value={input}
            onChange={(e) => setInput(e.target.value)}
            placeholder="Type a message..."
            className="flex-1 border rounded-lg px-4 py-2 focus:outline-none focus:ring-2 focus:ring-blue-500"
            disabled={isLoading}
          />
          <button
            type="submit"
            disabled={isLoading || !input.trim()}
            className="bg-blue-500 text-white px-4 py-2 rounded-lg hover:bg-blue-600 disabled:opacity-50 disabled:cursor-not-allowed"
          >
            Send
          </button>
        </div>
      </form>
    </div>
  );
}

Method 3: API Integration with Streaming

For the best user experience, implement streaming responses.

Next.js API Route with Streaming

// app/api/chat/route.ts
import { NextRequest } from 'next/server';

export async function POST(req: NextRequest) {
  const { messages } = await req.json();

  // Create a TransformStream for streaming
  const encoder = new TextEncoder();
  const stream = new TransformStream();
  const writer = stream.writable.getWriter();

  // Start the AI response in the background
  (async () => {
    try {
      const response = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
        },
        body: JSON.stringify({
          model: 'gpt-4',
          messages,
          stream: true,
        }),
      });

      const reader = response.body?.getReader();
      if (!reader) throw new Error('No reader');

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        // Parse SSE data (simplified: assumes each chunk contains complete `data:` lines;
        // production code should buffer partial lines across chunks)
        const text = new TextDecoder().decode(value);
        const lines = text.split('\n').filter((line) => line.startsWith('data:'));

        for (const line of lines) {
          const data = line.replace('data: ', '');
          if (data === '[DONE]') continue;

          try {
            const parsed = JSON.parse(data);
            const content = parsed.choices[0]?.delta?.content;
            if (content) {
              await writer.write(encoder.encode(content));
            }
          } catch {
            // Skip invalid JSON
          }
        }
      }
    } catch (error) {
      console.error('Streaming error:', error);
    } finally {
      await writer.close();
    }
  })();

  return new Response(stream.readable, {
    headers: {
      'Content-Type': 'text/plain; charset=utf-8',
      'Transfer-Encoding': 'chunked',
    },
  });
}

React Hook for Streaming

// hooks/useStreamingChat.ts
'use client';

import { useState, useCallback } from 'react';

interface Message {
  role: 'user' | 'assistant';
  content: string;
}

export function useStreamingChat() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [isStreaming, setIsStreaming] = useState(false);

  const sendMessage = useCallback(async (content: string) => {
    const userMessage: Message = { role: 'user', content };
    const newMessages = [...messages, userMessage];

    setMessages(newMessages);
    setIsStreaming(true);

    try {
      const response = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ messages: newMessages }),
      });

      if (!response.body) throw new Error('No response body');

      const reader = response.body.getReader();
      const decoder = new TextDecoder();
      let assistantContent = '';

      // Add empty assistant message
      setMessages((prev) => [...prev, { role: 'assistant', content: '' }]);

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        const chunk = decoder.decode(value);
        assistantContent += chunk;

        // Update the last message with streamed content
        setMessages((prev) => {
          const updated = [...prev];
          updated[updated.length - 1] = {
            role: 'assistant',
            content: assistantContent,
          };
          return updated;
        });
      }
    } catch (error) {
      console.error('Streaming error:', error);
    } finally {
      setIsStreaming(false);
    }
  }, [messages]);

  const clearMessages = useCallback(() => {
    setMessages([]);
  }, []);

  return {
    messages,
    isStreaming,
    sendMessage,
    clearMessages,
  };
}

Streaming Chat Component

// components/StreamingChat.tsx
'use client';

import { useState, useRef, useEffect, FormEvent } from 'react';
import { useStreamingChat } from '@/hooks/useStreamingChat';

export function StreamingChat() {
  const { messages, isStreaming, sendMessage } = useStreamingChat();
  const [input, setInput] = useState('');
  const messagesEndRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
  }, [messages]);

  const handleSubmit = (e: FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isStreaming) return;
    sendMessage(input.trim());
    setInput('');
  };

  return (
    <div className="flex flex-col h-screen max-w-2xl mx-auto">
      <div className="flex-1 overflow-y-auto p-4">
        {messages.map((message, index) => (
          <div
            key={index}
            className={`mb-4 ${
              message.role === 'user' ? 'text-right' : 'text-left'
            }`}
          >
            <div
              className={`inline-block max-w-[80%] p-3 rounded-lg ${
                message.role === 'user'
                  ? 'bg-blue-500 text-white'
                  : 'bg-gray-100 text-gray-900'
              }`}
            >
              <div className="whitespace-pre-wrap">{message.content}</div>
              {isStreaming &&
                index === messages.length - 1 &&
                message.role === 'assistant' && (
                  <span className="inline-block w-2 h-4 ml-1 bg-gray-400 animate-pulse" />
                )}
            </div>
          </div>
        ))}
        <div ref={messagesEndRef} />
      </div>

      <form onSubmit={handleSubmit} className="p-4 border-t">
        <div className="flex gap-2">
          <input
            type="text"
            value={input}
            onChange={(e) => setInput(e.target.value)}
            placeholder="Ask anything..."
            className="flex-1 p-3 border rounded-lg focus:outline-none focus:ring-2 focus:ring-blue-500"
            disabled={isStreaming}
          />
          <button
            type="submit"
            disabled={isStreaming || !input.trim()}
            className="px-6 py-3 bg-blue-500 text-white rounded-lg hover:bg-blue-600 disabled:opacity-50"
          >
            {isStreaming ? 'Thinking...' : 'Send'}
          </button>
        </div>
      </form>
    </div>
  );
}

Method 4: Using Vercel AI SDK

The Vercel AI SDK provides an excellent abstraction for AI chat.

Installation

npm install ai @ai-sdk/openai

API Route with Vercel AI SDK

// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4-turbo'),
    system: 'You are a helpful assistant for our website.',
    messages,
  });

  return result.toDataStreamResponse();
}

Chat Component with useChat Hook

// components/VercelAIChat.tsx
'use client';

import { useChat } from 'ai/react';

export function VercelAIChat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } =
    useChat({
      api: '/api/chat',
    });

  return (
    <div className="flex flex-col h-[600px] max-w-xl mx-auto border rounded-lg">
      <div className="flex-1 overflow-y-auto p-4 space-y-4">
        {messages.map((message) => (
          <div
            key={message.id}
            className={`flex ${
              message.role === 'user' ? 'justify-end' : 'justify-start'
            }`}
          >
            <div
              className={`rounded-lg px-4 py-2 max-w-[80%] ${
                message.role === 'user'
                  ? 'bg-blue-500 text-white'
                  : 'bg-gray-100'
              }`}
            >
              {message.content}
            </div>
          </div>
        ))}
      </div>

      <form onSubmit={handleSubmit} className="p-4 border-t">
        <div className="flex gap-2">
          <input
            value={input}
            onChange={handleInputChange}
            placeholder="Type your message..."
            className="flex-1 p-2 border rounded-lg"
            disabled={isLoading}
          />
          <button
            type="submit"
            disabled={isLoading}
            className="px-4 py-2 bg-blue-500 text-white rounded-lg disabled:opacity-50"
          >
            Send
          </button>
        </div>
      </form>
    </div>
  );
}

State Management Integration

Using Zustand for Chat State

// store/chatStore.ts
import { create } from 'zustand';
import { persist } from 'zustand/middleware';

interface Message {
  id: string;
  role: 'user' | 'assistant';
  content: string;
  timestamp: number;
}

interface ChatState {
  messages: Message[];
  isOpen: boolean;
  isLoading: boolean;
  addMessage: (message: Omit<Message, 'id' | 'timestamp'>) => void;
  setLoading: (loading: boolean) => void;
  toggleChat: () => void;
  clearMessages: () => void;
}

export const useChatStore = create<ChatState>()(
  persist(
    (set) => ({
      messages: [],
      isOpen: false,
      isLoading: false,
      addMessage: (message) =>
        set((state) => ({
          messages: [
            ...state.messages,
            {
              ...message,
              id: crypto.randomUUID(),
              timestamp: Date.now(),
            },
          ],
        })),
      setLoading: (loading) => set({ isLoading: loading }),
      toggleChat: () => set((state) => ({ isOpen: !state.isOpen })),
      clearMessages: () => set({ messages: [] }),
    }),
    {
      name: 'chat-storage',
    }
  )
);

Chat Widget with Zustand

// components/ChatWidgetWithStore.tsx
'use client';

import { useChatStore } from '@/store/chatStore';
import { useState, FormEvent } from 'react';

export function ChatWidgetWithStore() {
  const {
    messages,
    isOpen,
    isLoading,
    addMessage,
    setLoading,
    toggleChat,
  } = useChatStore();
  const [input, setInput] = useState('');

  const handleSubmit = async (e: FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isLoading) return;

    addMessage({ role: 'user', content: input.trim() });
    setInput('');
    setLoading(true);

    try {
      const response = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ message: input.trim() }),
      });
      const data = await response.json();
      addMessage({ role: 'assistant', content: data.reply });
    } catch (error) {
      console.error('Chat error:', error);
    } finally {
      setLoading(false);
    }
  };

  if (!isOpen) {
    return (
      <button
        onClick={toggleChat}
        className="fixed bottom-4 right-4 w-14 h-14 bg-blue-500 text-white rounded-full shadow-lg hover:bg-blue-600"
      >
        💬
      </button>
    );
  }

  return (
    <div className="fixed bottom-4 right-4 w-96 h-[500px] bg-white border rounded-lg shadow-xl flex flex-col">
      <div className="p-4 border-b flex justify-between items-center">
        <h3 className="font-semibold">Chat Support</h3>
        <button onClick={toggleChat} className="text-gray-500 hover:text-gray-700">✕</button>
      </div>

      <div className="flex-1 overflow-y-auto p-4 space-y-3">
        {messages.map((msg) => (
          <div
            key={msg.id}
            className={`flex ${msg.role === 'user' ? 'justify-end' : 'justify-start'}`}
          >
            <div
              className={`max-w-[80%] rounded-lg px-3 py-2 ${
                msg.role === 'user'
                  ? 'bg-blue-500 text-white'
                  : 'bg-gray-100'
              }`}
            >
              {msg.content}
            </div>
          </div>
        ))}
      </div>

      <form onSubmit={handleSubmit} className="p-4 border-t">
        <div className="flex gap-2">
          <input
            value={input}
            onChange={(e) => setInput(e.target.value)}
            placeholder="Type a message..."
            className="flex-1 p-2 border rounded-lg"
            disabled={isLoading}
          />
          <button
            type="submit"
            disabled={isLoading}
            className="px-4 py-2 bg-blue-500 text-white rounded-lg disabled:opacity-50"
          >
            Send
          </button>
        </div>
      </form>
    </div>
  );
}

TypeScript Best Practices

Type Definitions

// types/chat.ts
export interface Message {
  id: string;
  role: 'user' | 'assistant' | 'system';
  content: string;
  timestamp: Date;
  metadata?: MessageMetadata;
}

export interface MessageMetadata {
  tokens?: number;
  model?: string;
  finishReason?: 'stop' | 'length' | 'error';
}

export interface ChatConfig {
  apiEndpoint: string;
  model: string;
  temperature?: number;
  maxTokens?: number;
  systemPrompt?: string;
}

export interface ChatResponse {
  message: Message;
  usage?: {
    promptTokens: number;
    completionTokens: number;
    totalTokens: number;
  };
}

export interface ChatError {
  code: string;
  message: string;
  details?: unknown;
}

Type-Safe API Client

// lib/chatClient.ts
import { Message, ChatResponse, ChatConfig, ChatError } from '@/types/chat';

export class ChatClient {
  private config: ChatConfig;

  constructor(config: ChatConfig) {
    this.config = config;
  }

  async sendMessage(
    messages: Message[]
  ): Promise<ChatResponse | ChatError> {
    try {
      const response = await fetch(this.config.apiEndpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          messages: messages.map((m) => ({
            role: m.role,
            content: m.content,
          })),
          model: this.config.model,
          temperature: this.config.temperature ?? 0.7,
          max_tokens: this.config.maxTokens ?? 1000,
        }),
      });

      if (!response.ok) {
        return {
          code: 'API_ERROR',
          message: `HTTP ${response.status}: ${response.statusText}`,
        };
      }

      const data = await response.json();
      return data as ChatResponse;
    } catch (error) {
      return {
        code: 'NETWORK_ERROR',
        message: error instanceof Error ? error.message : 'Unknown error',
      };
    }
  }
}
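
Because sendMessage returns a union of ChatResponse and ChatError, callers should narrow the result before touching its fields. Here's a small usage sketch (the isChatError helper and the ask function are illustrative, not part of the client):

// lib/chatClientExample.ts (hypothetical usage of ChatClient)
import { ChatClient } from '@/lib/chatClient';
import { ChatResponse, ChatError, Message } from '@/types/chat';

// Only ChatError carries a `code` field, so `in` narrows the union
function isChatError(result: ChatResponse | ChatError): result is ChatError {
  return 'code' in result;
}

const client = new ChatClient({
  apiEndpoint: '/api/chat',
  model: 'gpt-4-turbo',
  temperature: 0.5,
});

export async function ask(history: Message[]): Promise<string | null> {
  const result = await client.sendMessage(history);

  if (isChatError(result)) {
    // Errors come back as values, so no try/catch is needed here
    console.error(`${result.code}: ${result.message}`);
    return null;
  }

  return result.message.content;
}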

Performance Optimization

Lazy Loading the Chat Widget

// components/LazyChat.tsx
'use client';

import dynamic from 'next/dynamic';
import { useState } from 'react';

const ChatWidget = dynamic(
  () => import('./ChatWidget').then((mod) => mod.ChatWidget),
  {
    loading: () => <div className="animate-pulse">Loading chat...</div>,
    ssr: false,
  }
);

export function LazyChat() {
  const [showChat, setShowChat] = useState(false);

  return (
    <>
      {!showChat ? (
        <button
          onClick={() => setShowChat(true)}
          className="fixed bottom-4 right-4 w-14 h-14 bg-blue-500 text-white rounded-full shadow-lg"
        >
          💬
        </button>
      ) : (
        <ChatWidget />
      )}
    </>
  );
}

Debounced Input

// hooks/useDebouncedValue.ts
import { useState, useEffect } from 'react';

export function useDebouncedValue<T>(value: T, delay: number): T {
  const [debouncedValue, setDebouncedValue] = useState(value);

  useEffect(() => {
    const timer = setTimeout(() => {
      setDebouncedValue(value);
    }, delay);

    return () => clearTimeout(timer);
  }, [value, delay]);

  return debouncedValue;
}
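
The hook above isn't wired into a component yet. One way to use it is to debounce the raw input so side effects (draft persistence, typing indicators, suggestion lookups) only fire after the user pauses. A hypothetical sketch, assuming a 300 ms delay and localStorage draft saving:

// components/ChatInputWithDraft.tsx (hypothetical usage of useDebouncedValue)
'use client';

import { useEffect, useState } from 'react';
import { useDebouncedValue } from '@/hooks/useDebouncedValue';

export function ChatInputWithDraft() {
  const [input, setInput] = useState('');
  // Only react once the user has stopped typing for 300 ms
  const debouncedInput = useDebouncedValue(input, 300);

  useEffect(() => {
    if (!debouncedInput) return;
    // Illustrative side effect: persist a draft so it survives reloads
    localStorage.setItem('chat-draft', debouncedInput);
  }, [debouncedInput]);

  return (
    <input
      value={input}
      onChange={(e) => setInput(e.target.value)}
      placeholder="Type a message..."
      className="w-full p-2 border rounded-lg"
    />
  );
}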

Testing Your Chat Integration

Unit Testing with Jest

// __tests__/MinimalChat.test.tsx
import { render, screen, fireEvent, waitFor } from '@testing-library/react';
import { MinimalChat } from '@/components/MinimalChat';

// Mock fetch
global.fetch = jest.fn();

describe('MinimalChat', () => {
  beforeEach(() => {
    (fetch as jest.Mock).mockClear();
  });

  it('sends message on form submit', async () => {
    (fetch as jest.Mock).mockResolvedValueOnce({
      ok: true,
      json: async () => ({ reply: 'Hello!' }),
    });

    render(<MinimalChat />);

    const input = screen.getByPlaceholderText(/type/i);
    const button = screen.getByRole('button', { name: /send/i });

    fireEvent.change(input, { target: { value: 'Hi there' } });
    fireEvent.click(button);

    await waitFor(() => {
      expect(fetch).toHaveBeenCalledWith('/api/chat', expect.any(Object));
    });
  });

  it('displays bot response', async () => {
    (fetch as jest.Mock).mockResolvedValueOnce({
      ok: true,
      json: async () => ({ reply: 'How can I help?' }),
    });

    render(<MinimalChat />);

    const input = screen.getByPlaceholderText(/type/i);
    fireEvent.change(input, { target: { value: 'Hello' } });
    fireEvent.submit(input.closest('form')!);

    await waitFor(() => {
      expect(screen.getByText('How can I help?')).toBeInTheDocument();
    });
  });
});

Deployment Considerations

Environment Variables

# .env.local
OPENAI_API_KEY=sk-...
CHAT_WIDGET_ID=your-widget-id
NEXT_PUBLIC_CHAT_API_URL=/api/chat
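
Only variables prefixed with NEXT_PUBLIC_ are inlined into the client bundle; the API key stays server-side. A minimal sketch of how these might be read (the lib/config.ts module is illustrative):

// lib/config.ts (illustrative helpers for the variables above)

// OPENAI_API_KEY has no NEXT_PUBLIC_ prefix, so it is only available in server
// code (API routes, server components) and is never shipped to the browser.
export function getServerConfig() {
  const apiKey = process.env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error('OPENAI_API_KEY is not set');
  }
  return { apiKey, widgetId: process.env.CHAT_WIDGET_ID };
}

// NEXT_PUBLIC_ variables are replaced at build time and safe to read client-side.
export const chatApiUrl = process.env.NEXT_PUBLIC_CHAT_API_URL ?? '/api/chat';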

Next.js Config for External Scripts

// next.config.js
/** @type {import('next').NextConfig} */
const nextConfig = {
  async headers() {
    return [
      {
        source: '/api/chat',
        headers: [
          { key: 'Access-Control-Allow-Origin', value: '*' },
          { key: 'Access-Control-Allow-Methods', value: 'POST' },
        ],
      },
    ];
  },
};

module.exports = nextConfig;

Common Issues and Solutions

Issue: Chat Widget Not Rendering on SSR

// Solution: Use dynamic import with ssr: false
import dynamic from 'next/dynamic';

const ChatWidget = dynamic(
  () => import('./ChatWidget').then((mod) => mod.ChatWidget),
  { ssr: false }
);

Issue: Multiple Script Injections

// Solution: Check for existing script
useEffect(() => {
  if (document.getElementById('chat-script')) return;
  // ... inject script
}, []);

Issue: State Lost on Navigation

// Solution: Use persistent storage
import { persist } from 'zustand/middleware';

const useChatStore = create(
  persist(
    (set) => ({ /* ... */ }),
    { name: 'chat-storage' }
  )
);

Next Steps

After implementing your chat integration:

  1. Add analytics - Track conversation metrics
  2. Implement error boundaries - Handle failures gracefully (see the sketch after this list)
  3. Add accessibility - Ensure WCAG compliance
  4. Optimize performance - Monitor and improve load times
  5. A/B test - Experiment with different chat triggers
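
For point 2, here's a minimal error boundary sketch that keeps a crashed chat widget from taking down the rest of the page (the component name and fallback behavior are illustrative):

// components/ChatErrorBoundary.tsx (a minimal sketch)
'use client';

import { Component, ReactNode } from 'react';

interface Props {
  children: ReactNode;
}

interface State {
  hasError: boolean;
}

export class ChatErrorBoundary extends Component<Props, State> {
  state: State = { hasError: false };

  static getDerivedStateFromError(): State {
    return { hasError: true };
  }

  componentDidCatch(error: Error) {
    // Report to your monitoring tool of choice
    console.error('Chat widget crashed:', error);
  }

  render() {
    if (this.state.hasError) {
      // Fail quietly: the page keeps working without the chat
      return null;
    }
    return this.props.children;
  }
}

// Usage: <ChatErrorBoundary><ChatWidget /></ChatErrorBoundary>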

React and Next.js provide the perfect foundation for building powerful AI chat experiences. Whether you choose a simple script embed or build a fully custom solution, the component-based architecture makes it easy to iterate and improve over time.

About the author

Widget Chat is a team of developers and designers passionate about creating the best AI chatbot experience for Flutter, web, and mobile apps.