Vue.js & Nuxt.js AI Chatbot Integration: Complete Developer Guide for 2026
Adding an AI chatbot to your Vue.js or Nuxt.js application enhances user engagement and automates support. This guide covers the main integration methods, from simple script embeds to custom Vue composables with real-time streaming.
Why Vue.js & Nuxt.js for AI Chatbots?
The Vue Ecosystem Advantage
Vue's reactivity system and Composition API make chatbot integration elegant (a minimal sketch follows the list below):
- Reactivity: Automatic UI updates when chat state changes
- Composition API: Reusable chat logic via composables
- Nuxt 3: SSR/SSG support with auto-imports and server routes
- TypeScript: First-class support for type-safe chat interfaces
- Pinia: Lightweight state management for chat persistence
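As a minimal sketch of the first two points (the composable name and message shape here are illustrative, not from any library): a ref holds the conversation, a computed derives from it, and every component using the composable re-renders automatically when a message is pushed.
// composables/useChatPreview.ts (illustrative sketch)
import { ref, computed } from 'vue'
interface PreviewMessage {
  role: 'user' | 'assistant'
  content: string
}
export function useChatPreview() {
  // Reactive conversation state; pushing to it updates every consumer
  const messages = ref<PreviewMessage[]>([])
  // Derived state stays in sync automatically
  const lastAssistantReply = computed(
    () => messages.value.filter(m => m.role === 'assistant').at(-1)?.content ?? ''
  )
  const addMessage = (role: PreviewMessage['role'], content: string) => {
    messages.value.push({ role, content })
  }
  return { messages, lastAssistantReply, addMessage }
}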
Integration Approaches
| Approach | Complexity | Best For |
|---|---|---|
| Script embed | Low | Quick deployment |
| Vue component | Medium | Custom UI needs |
| Composable + API | Medium | Reusable logic |
| Full custom | High | Complete control |
Method 1: Script Embed (Quickest)
The fastest way to add a chatbot to any Vue/Nuxt app.
Vue 3 Component Wrapper
<!-- components/ChatWidget.vue -->
<script setup lang="ts">
import { onMounted, onUnmounted } from 'vue'
const props = defineProps<{
widgetId: string
}>()
onMounted(() => {
// Prevent duplicate injection
if (document.getElementById('chat-widget-script')) return
const script = document.createElement('script')
script.id = 'chat-widget-script'
script.src = 'https://widget.example.com/chat.js'
script.async = true
script.dataset.widgetId = props.widgetId
document.body.appendChild(script)
})
onUnmounted(() => {
const script = document.getElementById('chat-widget-script')
script?.remove()
})
</script>
<template>
<!-- Widget renders itself -->
</template>
Using the Wrapper in a Nuxt 3 App
<!-- app.vue -->
<script setup lang="ts">
const config = useRuntimeConfig()
</script>
<template>
<NuxtLayout>
<NuxtPage />
<ChatWidget :widget-id="config.public.chatWidgetId" />
</NuxtLayout>
</template>
Nuxt 3 useHead for Script Loading
<!-- components/ChatWidget.vue -->
<script setup lang="ts">
const props = defineProps<{
widgetId: string
}>()
useHead({
script: [
{
src: 'https://widget.example.com/chat.js',
async: true,
'data-widget-id': props.widgetId,
},
],
})
</script>
<template>
<div id="chat-widget-container"></div>
</template>
Nuxt Plugin Approach
// plugins/chat-widget.client.ts
export default defineNuxtPlugin(() => {
const config = useRuntimeConfig()
if (document.getElementById('chat-widget-script')) return
const script = document.createElement('script')
script.id = 'chat-widget-script'
script.src = 'https://widget.example.com/chat.js'
script.async = true
script.dataset.widgetId = config.public.chatWidgetId
document.body.appendChild(script)
})
Method 2: Custom Vue Chat Component
Build your own chat interface with full control.
Basic Chat Component
<!-- components/CustomChat.vue -->
<script setup lang="ts">
import { ref, nextTick, watch } from 'vue'
interface Message {
id: string
role: 'user' | 'assistant'
content: string
timestamp: Date
}
const messages = ref<Message[]>([])
const input = ref('')
const isLoading = ref(false)
const messagesContainer = ref<HTMLElement | null>(null)
const scrollToBottom = async () => {
await nextTick()
if (messagesContainer.value) {
messagesContainer.value.scrollTop = messagesContainer.value.scrollHeight
}
}
watch(messages, scrollToBottom, { deep: true })
const sendMessage = async () => {
if (!input.value.trim() || isLoading.value) return
const userMessage: Message = {
id: crypto.randomUUID(),
role: 'user',
content: input.value.trim(),
timestamp: new Date(),
}
messages.value.push(userMessage)
const userInput = input.value
input.value = ''
isLoading.value = true
try {
const response = await fetch('/api/chat', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ message: userInput }),
})
const data = await response.json()
const assistantMessage: Message = {
id: crypto.randomUUID(),
role: 'assistant',
content: data.reply,
timestamp: new Date(),
}
messages.value.push(assistantMessage)
} catch (error) {
console.error('Chat error:', error)
} finally {
isLoading.value = false
}
}
const handleKeydown = (event: KeyboardEvent) => {
if (event.key === 'Enter' && !event.shiftKey) {
event.preventDefault()
sendMessage()
}
}
</script>
<template>
<div class="chat-container">
<div ref="messagesContainer" class="messages">
<div
v-for="message in messages"
:key="message.id"
:class="['message', message.role]"
>
<div class="message-content">
{{ message.content }}
</div>
</div>
<div v-if="isLoading" class="message assistant">
<div class="message-content typing">
<span class="dot"></span>
<span class="dot"></span>
<span class="dot"></span>
</div>
</div>
</div>
<form @submit.prevent="sendMessage" class="input-form">
<input
v-model="input"
type="text"
placeholder="Type your message..."
:disabled="isLoading"
@keydown="handleKeydown"
/>
<button type="submit" :disabled="isLoading || !input.trim()">
Send
</button>
</form>
</div>
</template>
<style scoped>
.chat-container {
display: flex;
flex-direction: column;
height: 500px;
width: 100%;
max-width: 400px;
border: 1px solid #e0e0e0;
border-radius: 12px;
overflow: hidden;
}
.messages {
flex: 1;
overflow-y: auto;
padding: 1rem;
display: flex;
flex-direction: column;
gap: 0.75rem;
}
.message {
display: flex;
}
.message.user {
justify-content: flex-end;
}
.message.assistant {
justify-content: flex-start;
}
.message-content {
max-width: 80%;
padding: 0.75rem 1rem;
border-radius: 1rem;
}
.message.user .message-content {
background: #3b82f6;
color: white;
border-bottom-right-radius: 0.25rem;
}
.message.assistant .message-content {
background: #f3f4f6;
color: #1f2937;
border-bottom-left-radius: 0.25rem;
}
.typing {
display: flex;
gap: 4px;
padding: 0.75rem 1rem;
}
.dot {
width: 8px;
height: 8px;
background: #9ca3af;
border-radius: 50%;
animation: bounce 1.4s infinite ease-in-out;
}
.dot:nth-child(1) { animation-delay: -0.32s; }
.dot:nth-child(2) { animation-delay: -0.16s; }
@keyframes bounce {
0%, 80%, 100% { transform: scale(0); }
40% { transform: scale(1); }
}
.input-form {
display: flex;
gap: 0.5rem;
padding: 1rem;
border-top: 1px solid #e0e0e0;
}
.input-form input {
flex: 1;
padding: 0.75rem 1rem;
border: 1px solid #e0e0e0;
border-radius: 0.5rem;
outline: none;
}
.input-form input:focus {
border-color: #3b82f6;
}
.input-form button {
padding: 0.75rem 1.5rem;
background: #3b82f6;
color: white;
border: none;
border-radius: 0.5rem;
cursor: pointer;
}
.input-form button:disabled {
opacity: 0.5;
cursor: not-allowed;
}
</style>
Floating Chat Widget
<!-- components/FloatingChat.vue -->
<script setup lang="ts">
import { ref } from 'vue'
const isOpen = ref(false)
const hasUnread = ref(false)
const toggleChat = () => {
isOpen.value = !isOpen.value
if (isOpen.value) {
hasUnread.value = false
}
}
</script>
<template>
<div class="floating-chat">
<Transition name="slide">
<div v-if="isOpen" class="chat-panel">
<div class="chat-header">
<h3>Chat Support</h3>
<button @click="toggleChat" class="close-btn">×</button>
</div>
<CustomChat />
</div>
</Transition>
<button
v-if="!isOpen"
@click="toggleChat"
class="chat-trigger"
:class="{ 'has-unread': hasUnread }"
>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" fill="currentColor">
<path d="M12 2C6.48 2 2 6.48 2 12c0 1.85.5 3.58 1.36 5.07L2 22l4.93-1.36C8.42 21.5 10.15 22 12 22c5.52 0 10-4.48 10-10S17.52 2 12 2z"/>
</svg>
</button>
</div>
</template>
<style scoped>
.floating-chat {
position: fixed;
bottom: 1.5rem;
right: 1.5rem;
z-index: 1000;
}
.chat-panel {
position: absolute;
bottom: 0;
right: 0;
width: 380px;
background: white;
border-radius: 12px;
box-shadow: 0 10px 40px rgba(0, 0, 0, 0.15);
overflow: hidden;
}
.chat-header {
display: flex;
justify-content: space-between;
align-items: center;
padding: 1rem;
background: #3b82f6;
color: white;
}
.chat-header h3 {
margin: 0;
font-size: 1rem;
}
.close-btn {
background: none;
border: none;
color: white;
font-size: 1.5rem;
cursor: pointer;
line-height: 1;
}
.chat-trigger {
position: relative; /* containing block for the unread badge */
width: 60px;
height: 60px;
border-radius: 50%;
background: #3b82f6;
color: white;
border: none;
cursor: pointer;
display: flex;
align-items: center;
justify-content: center;
box-shadow: 0 4px 12px rgba(59, 130, 246, 0.4);
transition: transform 0.2s, box-shadow 0.2s;
}
.chat-trigger:hover {
transform: scale(1.05);
box-shadow: 0 6px 16px rgba(59, 130, 246, 0.5);
}
.chat-trigger svg {
width: 28px;
height: 28px;
}
.chat-trigger.has-unread::after {
content: '';
position: absolute;
top: 0;
right: 0;
width: 16px;
height: 16px;
background: #ef4444;
border-radius: 50%;
border: 2px solid white;
}
.slide-enter-active,
.slide-leave-active {
transition: all 0.3s ease;
}
.slide-enter-from,
.slide-leave-to {
opacity: 0;
transform: translateY(20px) scale(0.95);
}
</style>
Method 3: Vue Composables for Chat Logic
Create reusable chat logic with the Composition API.
Basic Chat Composable
// composables/useChat.ts
import { ref, computed } from 'vue'
export interface Message {
id: string
role: 'user' | 'assistant'
content: string
timestamp: Date
}
export function useChat(apiEndpoint = '/api/chat') {
const messages = ref<Message[]>([])
const isLoading = ref(false)
const error = ref<string | null>(null)
const lastMessage = computed(() =>
messages.value[messages.value.length - 1]
)
const sendMessage = async (content: string) => {
if (!content.trim() || isLoading.value) return
error.value = null
const userMessage: Message = {
id: crypto.randomUUID(),
role: 'user',
content: content.trim(),
timestamp: new Date(),
}
messages.value.push(userMessage)
isLoading.value = true
try {
const response = await fetch(apiEndpoint, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
messages: messages.value.map(m => ({
role: m.role,
content: m.content,
})),
}),
})
if (!response.ok) {
throw new Error(`HTTP ${response.status}`)
}
const data = await response.json()
const assistantMessage: Message = {
id: crypto.randomUUID(),
role: 'assistant',
content: data.reply,
timestamp: new Date(),
}
messages.value.push(assistantMessage)
} catch (err) {
error.value = err instanceof Error ? err.message : 'Unknown error'
} finally {
isLoading.value = false
}
}
const clearMessages = () => {
messages.value = []
error.value = null
}
return {
messages,
isLoading,
error,
lastMessage,
sendMessage,
clearMessages,
}
}
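A quick usage sketch for this composable (the component name is made up for illustration):
<!-- components/SimpleChat.vue (illustrative usage) -->
<script setup lang="ts">
import { ref } from 'vue'
import { useChat } from '~/composables/useChat'
const { messages, isLoading, error, sendMessage } = useChat()
const input = ref('')
const submit = async () => {
  const text = input.value
  input.value = ''
  await sendMessage(text)
}
</script>
<template>
  <div>
    <p v-for="msg in messages" :key="msg.id" :class="msg.role">{{ msg.content }}</p>
    <p v-if="error" class="error">{{ error }}</p>
    <form @submit.prevent="submit">
      <input v-model="input" :disabled="isLoading" placeholder="Ask something..." />
      <button type="submit" :disabled="isLoading || !input.trim()">Send</button>
    </form>
  </div>
</template>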
Streaming Chat Composable
// composables/useStreamingChat.ts
import { ref } from 'vue'
export interface Message {
id: string
role: 'user' | 'assistant'
content: string
}
export function useStreamingChat(apiEndpoint = '/api/chat/stream') {
const messages = ref<Message[]>([])
const isStreaming = ref(false)
const currentStreamedContent = ref('')
const sendMessage = async (content: string) => {
if (!content.trim() || isStreaming.value) return
const userMessage: Message = {
id: crypto.randomUUID(),
role: 'user',
content: content.trim(),
}
messages.value.push(userMessage)
isStreaming.value = true
currentStreamedContent.value = ''
// Add placeholder for assistant message
const assistantMessage: Message = {
id: crypto.randomUUID(),
role: 'assistant',
content: '',
}
messages.value.push(assistantMessage)
const assistantIndex = messages.value.length - 1
try {
const response = await fetch(apiEndpoint, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
messages: messages.value.slice(0, -1).map(m => ({
role: m.role,
content: m.content,
})),
}),
})
if (!response.body) throw new Error('No response body')
const reader = response.body.getReader()
const decoder = new TextDecoder()
while (true) {
const { done, value } = await reader.read()
if (done) break
const chunk = decoder.decode(value, { stream: true })
currentStreamedContent.value += chunk
messages.value[assistantIndex].content = currentStreamedContent.value
}
} catch (error) {
console.error('Streaming error:', error)
messages.value[assistantIndex].content = 'Sorry, an error occurred.'
} finally {
isStreaming.value = false
currentStreamedContent.value = ''
}
}
const clearMessages = () => {
messages.value = []
}
return {
messages,
isStreaming,
currentStreamedContent,
sendMessage,
clearMessages,
}
}
Using Composable in Component
<!-- components/ChatWithComposable.vue -->
<script setup lang="ts">
import { ref } from 'vue'
import { useStreamingChat } from '~/composables/useStreamingChat'
const { messages, isStreaming, sendMessage, clearMessages } = useStreamingChat()
const input = ref('')
const handleSubmit = () => {
sendMessage(input.value)
input.value = ''
}
</script>
<template>
<div class="chat">
<div class="messages">
<div
v-for="msg in messages"
:key="msg.id"
:class="['message', msg.role]"
>
{{ msg.content }}
<span
v-if="isStreaming && msg.role === 'assistant' && msg === messages[messages.length - 1]"
class="cursor"
>|</span>
</div>
</div>
<form @submit.prevent="handleSubmit">
<input
v-model="input"
placeholder="Ask anything..."
:disabled="isStreaming"
/>
<button type="submit" :disabled="isStreaming || !input.trim()">
{{ isStreaming ? 'Thinking...' : 'Send' }}
</button>
</form>
<button @click="clearMessages" class="clear-btn">Clear Chat</button>
</div>
</template>
Method 4: Nuxt 3 Server Routes
Build your chat API directly in Nuxt.
Basic API Route
// server/api/chat.post.ts
export default defineEventHandler(async (event) => {
const body = await readBody(event)
const { messages } = body
const config = useRuntimeConfig()
const response = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: `Bearer ${config.openaiApiKey}`,
},
body: JSON.stringify({
model: 'gpt-4',
messages,
temperature: 0.7,
max_tokens: 1000,
}),
})
const data = await response.json()
return {
reply: data.choices[0].message.content,
}
})
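For production use you will likely want to validate the request body and handle upstream failures instead of letting them surface as opaque 500s. A hardened sketch, assuming the request shape matches the useChat composable above:
// server/api/chat.post.ts (hardened sketch)
export default defineEventHandler(async (event) => {
  const body = await readBody(event)
  const messages = body?.messages
  // Reject malformed requests early
  if (!Array.isArray(messages) || messages.length === 0) {
    throw createError({ statusCode: 400, statusMessage: 'messages must be a non-empty array' })
  }
  const config = useRuntimeConfig()
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${config.openaiApiKey}`,
    },
    body: JSON.stringify({ model: 'gpt-4', messages, temperature: 0.7, max_tokens: 1000 }),
  })
  if (!response.ok) {
    // Don't leak upstream error details to the client
    throw createError({ statusCode: 502, statusMessage: 'Upstream chat provider error' })
  }
  const data = await response.json()
  return { reply: data.choices[0].message.content }
})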
Streaming API Route
// server/api/chat/stream.post.ts
export default defineEventHandler(async (event) => {
const body = await readBody(event)
const { messages } = body
const config = useRuntimeConfig()
const response = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: `Bearer ${config.openaiApiKey}`,
},
body: JSON.stringify({
model: 'gpt-4',
messages,
stream: true,
}),
})
// Stream plain text back to the client (the useStreamingChat composable reads raw chunks)
setResponseHeaders(event, {
'Content-Type': 'text/plain; charset=utf-8',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
})
const reader = response.body?.getReader()
if (!reader) {
throw createError({ statusCode: 500, message: 'No response body' })
}
const stream = new ReadableStream({
async start(controller) {
const decoder = new TextDecoder()
let buffer = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
// Buffer partial lines: SSE events can be split across network chunks
buffer += decoder.decode(value, { stream: true })
const lines = buffer.split('\n')
buffer = lines.pop() ?? ''
for (const line of lines) {
if (!line.startsWith('data:')) continue
const data = line.slice(5).trim()
if (data === '[DONE]') continue
try {
const parsed = JSON.parse(data)
const content = parsed.choices[0]?.delta?.content
if (content) {
controller.enqueue(new TextEncoder().encode(content))
}
} catch {
// Skip invalid JSON
}
}
}
controller.close()
},
})
return sendStream(event, stream)
})
State Management with Pinia
Chat Store
// stores/chat.ts
import { defineStore } from 'pinia'
interface Message {
id: string
role: 'user' | 'assistant'
content: string
timestamp: number
}
interface ChatState {
messages: Message[]
isOpen: boolean
isLoading: boolean
}
export const useChatStore = defineStore('chat', {
state: (): ChatState => ({
messages: [],
isOpen: false,
isLoading: false,
}),
getters: {
messageCount: (state) => state.messages.length,
lastMessage: (state) => state.messages[state.messages.length - 1],
},
actions: {
addMessage(role: 'user' | 'assistant', content: string) {
this.messages.push({
id: crypto.randomUUID(),
role,
content,
timestamp: Date.now(),
})
},
async sendMessage(content: string) {
if (!content.trim() || this.isLoading) return
this.addMessage('user', content.trim())
this.isLoading = true
try {
const response = await fetch('/api/chat', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ message: content.trim() }),
})
const data = await response.json()
this.addMessage('assistant', data.reply)
} catch (error) {
console.error('Chat error:', error)
this.addMessage('assistant', 'Sorry, something went wrong.')
} finally {
this.isLoading = false
}
},
toggleChat() {
this.isOpen = !this.isOpen
},
clearMessages() {
this.messages = []
},
},
persist: true, // Requires pinia-plugin-persistedstate
})
Using Pinia Store in Component
<!-- components/PiniaChat.vue -->
<script setup lang="ts">
import { ref } from 'vue'
import { useChatStore } from '~/stores/chat'
import { storeToRefs } from 'pinia'
const chatStore = useChatStore()
const { messages, isOpen, isLoading } = storeToRefs(chatStore)
const input = ref('')
const handleSend = () => {
chatStore.sendMessage(input.value)
input.value = ''
}
</script>
<template>
<div v-if="isOpen" class="chat-window">
<div class="messages">
<div
v-for="msg in messages"
:key="msg.id"
:class="msg.role"
>
{{ msg.content }}
</div>
</div>
<form @submit.prevent="handleSend">
<input
v-model="input"
:disabled="isLoading"
placeholder="Type a message..."
/>
<button :disabled="isLoading">Send</button>
</form>
</div>
<button @click="chatStore.toggleChat" class="toggle-btn">
{{ isOpen ? 'Close' : 'Chat' }}
</button>
</template>
TypeScript Integration
Type Definitions
// types/chat.ts
export interface Message {
id: string
role: 'user' | 'assistant' | 'system'
content: string
timestamp: Date
metadata?: MessageMetadata
}
export interface MessageMetadata {
tokens?: number
model?: string
finishReason?: 'stop' | 'length' | 'error'
}
export interface ChatConfig {
apiEndpoint: string
model?: string
temperature?: number
maxTokens?: number
systemPrompt?: string
}
export interface ChatResponse {
reply: string
usage?: {
promptTokens: number
completionTokens: number
totalTokens: number
}
}
export interface ChatError {
code: string
message: string
}
Type-Safe Composable
// composables/useTypedChat.ts
import { ref, readonly } from 'vue'
import type { Message, ChatConfig, ChatResponse } from '~/types/chat'
const defaultConfig: ChatConfig = {
apiEndpoint: '/api/chat',
temperature: 0.7,
maxTokens: 1000,
}
export function useTypedChat(config: Partial<ChatConfig> = {}) {
const mergedConfig = { ...defaultConfig, ...config }
const messages = ref<Message[]>([])
const isLoading = ref(false)
const sendMessage = async (content: string): Promise<ChatResponse | null> => {
if (!content.trim()) return null
const userMessage: Message = {
id: crypto.randomUUID(),
role: 'user',
content: content.trim(),
timestamp: new Date(),
}
messages.value.push(userMessage)
isLoading.value = true
try {
const response = await $fetch<ChatResponse>(mergedConfig.apiEndpoint, {
method: 'POST',
body: {
messages: messages.value.map(m => ({
role: m.role,
content: m.content,
})),
...mergedConfig,
},
})
const assistantMessage: Message = {
id: crypto.randomUUID(),
role: 'assistant',
content: response.reply,
timestamp: new Date(),
metadata: {
tokens: response.usage?.completionTokens,
},
}
messages.value.push(assistantMessage)
return response
} catch (error) {
console.error('Chat error:', error)
return null
} finally {
isLoading.value = false
}
}
return {
messages: readonly(messages),
isLoading: readonly(isLoading),
sendMessage,
}
}
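Usage looks the same as the plain composable, but the config argument and the return value are now type-checked; the endpoint and prompt below are just placeholders:
// In a component's <script setup lang="ts"> (endpoint and prompt are hypothetical)
const { messages, sendMessage } = useTypedChat({
  apiEndpoint: '/api/support-chat',
  temperature: 0.3,
  systemPrompt: 'You are a concise support assistant.',
})
const ask = async () => {
  const response = await sendMessage('How do I reset my password?')
  console.log('tokens used:', response?.usage?.totalTokens)
}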
Testing Vue Chat Components
Component Testing with Vitest
// tests/ChatWidget.test.ts
import { mount } from '@vue/test-utils'
import { describe, it, expect, vi, beforeEach } from 'vitest'
import ChatWidget from '~/components/CustomChat.vue'
// Mock fetch
global.fetch = vi.fn()
describe('ChatWidget', () => {
beforeEach(() => {
vi.clearAllMocks()
})
it('renders input and send button', () => {
const wrapper = mount(ChatWidget)
expect(wrapper.find('input').exists()).toBe(true)
expect(wrapper.find('button[type="submit"]').exists()).toBe(true)
})
it('sends message on form submit', async () => {
(fetch as any).mockResolvedValueOnce({
ok: true,
json: async () => ({ reply: 'Hello!' }),
})
const wrapper = mount(ChatWidget)
await wrapper.find('input').setValue('Hi there')
await wrapper.find('form').trigger('submit')
expect(fetch).toHaveBeenCalledWith('/api/chat', expect.any(Object))
})
it('displays messages', async () => {
(fetch as any).mockResolvedValueOnce({
ok: true,
json: async () => ({ reply: 'How can I help?' }),
})
const wrapper = mount(ChatWidget)
await wrapper.find('input').setValue('Hello')
await wrapper.find('form').trigger('submit')
// Wait for async operations
await new Promise(resolve => setTimeout(resolve, 100))
expect(wrapper.text()).toContain('Hello')
expect(wrapper.text()).toContain('How can I help?')
})
it('disables input while loading', async () => {
(fetch as any).mockImplementationOnce(
() => new Promise(resolve => setTimeout(resolve, 1000))
)
const wrapper = mount(ChatWidget)
await wrapper.find('input').setValue('Test')
await wrapper.find('form').trigger('submit')
expect(wrapper.find('input').attributes('disabled')).toBeDefined()
})
})
Composable Testing
// tests/useChat.test.ts
import { describe, it, expect, vi, beforeEach } from 'vitest'
import { useChat } from '~/composables/useChat'
global.fetch = vi.fn()
describe('useChat', () => {
beforeEach(() => {
vi.clearAllMocks()
})
it('sends message and updates state', async () => {
(fetch as any).mockResolvedValueOnce({
ok: true,
json: async () => ({ reply: 'Bot response' }),
})
const { messages, sendMessage, isLoading } = useChat()
expect(messages.value).toHaveLength(0)
await sendMessage('Hello')
expect(messages.value).toHaveLength(2)
expect(messages.value[0].content).toBe('Hello')
expect(messages.value[1].content).toBe('Bot response')
expect(isLoading.value).toBe(false)
})
it('handles errors gracefully', async () => {
(fetch as any).mockRejectedValueOnce(new Error('Network error'))
const { messages, sendMessage, error } = useChat()
await sendMessage('Test')
expect(error.value).toBe('Network error')
expect(messages.value).toHaveLength(1) // Only user message
})
})
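The streaming composable can be tested the same way by giving the mocked fetch a ReadableStream body; a sketch (chunk contents are arbitrary):
// tests/useStreamingChat.test.ts (sketch)
import { it, expect, vi } from 'vitest'
import { useStreamingChat } from '~/composables/useStreamingChat'
it('accumulates streamed chunks into the assistant message', async () => {
  const encoder = new TextEncoder()
  const body = new ReadableStream({
    start(controller) {
      controller.enqueue(encoder.encode('Hello, '))
      controller.enqueue(encoder.encode('world!'))
      controller.close()
    },
  })
  // The composable only reads response.body, so a minimal mock is enough
  vi.stubGlobal('fetch', vi.fn().mockResolvedValueOnce({ body }))
  const { messages, sendMessage } = useStreamingChat()
  await sendMessage('Hi')
  expect(messages.value).toHaveLength(2)
  expect(messages.value[1].content).toBe('Hello, world!')
})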
Performance Optimization
Lazy Load Chat Widget
<!-- components/LazyChatLoader.vue -->
<script setup lang="ts">
import { ref, defineAsyncComponent } from 'vue'
const ChatWidget = defineAsyncComponent(() =>
import('./ChatWidget.vue')
)
const showChat = ref(false)
</script>
<template>
<button v-if="!showChat" @click="showChat = true" class="chat-trigger">
Open Chat
</button>
<Suspense v-else>
<template #default>
<ChatWidget />
</template>
<template #fallback>
<div class="loading">Loading chat...</div>
</template>
</Suspense>
</template>
Virtual Scrolling for Long Conversations
<!-- components/VirtualChat.vue -->
<script setup lang="ts">
import { useVirtualList } from '@vueuse/core'
import { ref } from 'vue'
const messages = ref([/* many messages */])
const { list, containerProps, wrapperProps } = useVirtualList(messages, {
itemHeight: 60,
})
</script>
<template>
<div v-bind="containerProps" class="messages-container">
<div v-bind="wrapperProps">
<div
v-for="{ data: message, index } in list"
:key="index"
class="message"
>
{{ message.content }}
</div>
</div>
</div>
</template>
Nuxt Configuration
Runtime Config
// nuxt.config.ts
export default defineNuxtConfig({
runtimeConfig: {
openaiApiKey: process.env.OPENAI_API_KEY,
public: {
chatWidgetId: process.env.NUXT_PUBLIC_CHAT_WIDGET_ID,
},
},
modules: [
'@pinia/nuxt',
'@pinia-plugin-persistedstate/nuxt',
],
})
Environment Variables
# .env
OPENAI_API_KEY=sk-...
NUXT_PUBLIC_CHAT_WIDGET_ID=your-widget-id
Common Issues and Solutions
Issue: Hydration Mismatch
<!-- Solution: Use ClientOnly wrapper -->
<template>
<ClientOnly>
<ChatWidget />
<template #fallback>
<div>Loading chat...</div>
</template>
</ClientOnly>
</template>
Issue: State Lost on Navigation
// Solution: Use Pinia with persistence
import { defineStore } from 'pinia'
export const useChatStore = defineStore('chat', {
// ... store definition
persist: {
storage: persistedState.localStorage,
},
})
Issue: Multiple Widget Instances
// Solution: Use singleton pattern in plugin
// plugins/chat.client.ts
let initialized = false
export default defineNuxtPlugin(() => {
if (initialized) return
initialized = true
// ... initialize widget
})
Next Steps
After implementing your Vue.js chatbot:
- Add analytics - Track conversation metrics with composables
- Implement error boundaries - Use Vue's errorCaptured hook (see the sketch after this list)
- Add accessibility - Ensure ARIA labels and keyboard navigation
- Optimize bundle - Use dynamic imports and tree shaking
- A/B test - Experiment with different chat triggers and prompts
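For the error-boundary item above, a small wrapper built on Vue's errorCaptured hook (onErrorCaptured in the Composition API) might look like this sketch:
<!-- components/ChatErrorBoundary.vue (sketch) -->
<script setup lang="ts">
import { ref, onErrorCaptured } from 'vue'
const hasError = ref(false)
onErrorCaptured((err) => {
  console.error('Chat widget error:', err)
  hasError.value = true
  return false // stop the error from propagating further
})
</script>
<template>
  <div v-if="hasError" class="chat-error">Chat is temporarily unavailable.</div>
  <slot v-else />
</template>
Wrapping CustomChat (or any other chat component) in this boundary keeps a render error in the widget from breaking the rest of the page.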
Vue.js and Nuxt.js provide an excellent foundation for building reactive, performant AI chat experiences. The Composition API makes it easy to create reusable chat logic, while Nuxt's server routes simplify API development.