Flutter AI Chatbot Offline Mode: Background Sync, Local Caching & Retry Queues


Tags: Flutter, Offline Mode, Background Sync, Dart, Hive


Your users are not always online. They open your app on the subway, in a building basement, in a rural area with one bar of signal. If your chatbot just shows a spinner and dies when connectivity drops, you have a problem. Users abandon. Messages are lost. Trust erodes.

This guide walks through building offline support around the widget_chat Flutter package -- message queuing, local caching, background sync with WorkManager, and retry logic with exponential backoff. By the end, your Flutter AI chatbot will handle connectivity drops gracefully and sync everything when the connection returns.

Detecting connectivity state in Flutter

Before you can respond to offline conditions, you need to know when they happen. The connectivity_plus package gives you both the current state and a stream of changes.

dependencies:
  flutter:
    sdk: flutter
  widget_chat: ^0.0.8
  connectivity_plus: ^6.1.1
  hive_flutter: ^1.1.0
  workmanager: ^0.5.2
  http: ^1.2.0

Build a connectivity service that the rest of your app can depend on:

import 'dart:async';
import 'package:connectivity_plus/connectivity_plus.dart';

class ConnectivityService {
  final _connectivity = Connectivity();
  bool _isOnline = true;
  // Nullable so dispose() is safe even if initialize() never completed.
  StreamSubscription<List<ConnectivityResult>>? _subscription;

  bool get isOnline => _isOnline;

  Stream<bool> get onConnectivityChanged =>
      _connectivity.onConnectivityChanged
          .map((results) => !results.contains(ConnectivityResult.none));

  Future<void> initialize() async {
    final results = await _connectivity.checkConnectivity();
    _isOnline = !results.contains(ConnectivityResult.none);

    _subscription = _connectivity.onConnectivityChanged.listen((results) {
      _isOnline = !results.contains(ConnectivityResult.none);
    });
  }

  void dispose() => _subscription?.cancel();
}

One thing to watch: connectivity_plus tells you whether a network interface is available, not whether the internet is actually reachable. A phone connected to Wi-Fi with no internet will still report ConnectivityResult.wifi. For critical flows, combine this with an actual HTTP ping to your backend or to a reliable endpoint.

import 'package:http/http.dart' as http;

Future<bool> hasActualInternet() async {
  try {
    final response = await http
        .get(Uri.parse('https://widget-chat.com/api/health'))
        .timeout(const Duration(seconds: 3));
    return response.statusCode == 200;
  } catch (_) {
    return false;
  }
}

Building a message queue with Hive

When the user sends a message while offline, you cannot just drop it. You need a local queue that persists across app restarts. Hive is a good fit here -- it is fast, requires no native dependencies, and works on every platform Flutter supports.

Define a model for queued messages:

import 'package:hive/hive.dart';

part 'pending_message.g.dart';

@HiveType(typeId: 0)
class PendingMessage extends HiveObject {
  @HiveField(0)
  final String id;

  @HiveField(1)
  final String content;

  @HiveField(2)
  final DateTime createdAt;

  @HiveField(3)
  int retryCount;

  @HiveField(4)
  String status; // 'pending', 'sending', 'failed'

  @HiveField(5)
  final String? conversationId;

  PendingMessage({
    required this.id,
    required this.content,
    required this.createdAt,
    this.retryCount = 0,
    this.status = 'pending',
    this.conversationId,
  });
}
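
The @HiveType annotations above are processed by code generation, which needs build_runner and hive_generator in your dev dependencies. A minimal sketch -- the version constraints are illustrative, so check pub.dev for current releases:

```yaml
dev_dependencies:
  build_runner: ^2.4.6
  hive_generator: ^2.0.1
```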

Run dart run build_runner build to generate the adapter (flutter pub run build_runner build still works but is deprecated). Then build the queue service:

import 'package:hive_flutter/hive_flutter.dart';

class MessageQueueService {
  static const _boxName = 'pending_messages';
  late Box<PendingMessage> _box;

  Future<void> initialize() async {
    await Hive.initFlutter();
    // Guard the registration: main() may have already registered the
    // adapter, and registering the same typeId twice throws a HiveError.
    if (!Hive.isAdapterRegistered(0)) {
      Hive.registerAdapter(PendingMessageAdapter());
    }
    _box = await Hive.openBox<PendingMessage>(_boxName);
  }

  Future<void> enqueue(String content, {String? conversationId}) async {
    final message = PendingMessage(
      id: DateTime.now().millisecondsSinceEpoch.toString(),
      content: content,
      createdAt: DateTime.now(),
      conversationId: conversationId,
    );
    await _box.put(message.id, message);
  }

  List<PendingMessage> getPending() {
    return _box.values
        .where((m) => m.status == 'pending' || m.status == 'failed')
        .toList()
      ..sort((a, b) => a.createdAt.compareTo(b.createdAt));
  }

  Future<void> markSent(String id) async => await _box.delete(id);

  Future<void> markFailed(String id) async {
    final message = _box.get(id);
    if (message != null) {
      message.status = 'failed';
      message.retryCount++;
      await message.save();
    }
  }

  int get pendingCount => getPending().length;
}

This queue survives app kills, restarts, and device reboots. Messages stay until they are successfully delivered and acknowledged.
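
As a quick sanity check of the lifecycle, here is a hypothetical drain loop against the service above. The trySend callback is a stand-in for whatever API call you make; it is not part of widget_chat:

```dart
// Hypothetical helper -- assumes MessageQueueService from above has been
// initialized, and that trySend performs your actual API call.
Future<void> drainQueue(
  MessageQueueService queue,
  Future<bool> Function(PendingMessage) trySend,
) async {
  // getPending() returns oldest-first, so conversation order is preserved.
  for (final message in queue.getPending()) {
    if (await trySend(message)) {
      await queue.markSent(message.id); // delivered: remove from the box
    } else {
      await queue.markFailed(message.id); // bump retryCount, keep it queued
    }
  }
}
```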

Integrating the queue with ChatWidget

Now wire the offline queue into your chatbot screen. The ChatWidget from widget_chat handles the AI conversation when online. Your wrapper intercepts sends when offline and manages the queue:

import 'package:flutter/material.dart';
import 'package:widget_chat/widget_chat.dart';

class OfflineChatScreen extends StatefulWidget {
  @override
  State<OfflineChatScreen> createState() => _OfflineChatScreenState();
}

class _OfflineChatScreenState extends State<OfflineChatScreen> {
  final _connectivityService = ConnectivityService();
  final _messageQueue = MessageQueueService();
  bool _isOnline = true;
  bool _servicesReady = false;

  @override
  void initState() {
    super.initState();
    _initServices();
  }

  // initState cannot be async, so await the service setup in a helper
  // and rebuild once everything is ready.
  Future<void> _initServices() async {
    await _connectivityService.initialize();
    await _messageQueue.initialize();
    if (!mounted) return;
    setState(() {
      _servicesReady = true;
      _isOnline = _connectivityService.isOnline;
    });
    _connectivityService.onConnectivityChanged.listen((online) {
      if (!mounted) return;
      setState(() => _isOnline = online);
      if (online) _flushQueue();
    });
  }

  @override
  void dispose() {
    _connectivityService.dispose();
    super.dispose();
  }

  Future<void> _flushQueue() async {
    final pending = _messageQueue.getPending();
    for (final message in pending) {
      await _retrySend(message);
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Column(
        children: [
          if (!_isOnline) _buildOfflineBanner(),
          // Only read the queue once Hive has been initialized.
          if (_servicesReady && _messageQueue.pendingCount > 0) _buildQueueIndicator(),
          Expanded(
            child: ChatWidget(
              configuration: BotConfiguration(
                projectSecretKey: const String.fromEnvironment('WIDGET_CHAT_KEY'),
                userID: 'user-123',
                name: 'Support Assistant',
              ),
            ),
          ),
        ],
      ),
    );
  }

  Widget _buildOfflineBanner() {
    return Container(
      width: double.infinity,
      padding: const EdgeInsets.symmetric(vertical: 8, horizontal: 16),
      color: Colors.orange.shade800,
      child: const Text(
        'You are offline. Messages will be sent when connection returns.',
        style: TextStyle(color: Colors.white, fontSize: 13),
      ),
    );
  }

  Widget _buildQueueIndicator() {
    return Container(
      padding: const EdgeInsets.symmetric(vertical: 6, horizontal: 16),
      color: Colors.blue.shade50,
      child: Row(
        children: [
          const SizedBox(width: 16, height: 16, child: CircularProgressIndicator(strokeWidth: 2)),
          const SizedBox(width: 12),
          Text('${_messageQueue.pendingCount} message(s) waiting to sync'),
        ],
      ),
    );
  }
}

The offline banner and queue indicator are essential UX elements. Users need to know why they are not getting a response and that their messages are safe.

Background sync with WorkManager

The user might close the app while offline. When connectivity returns, you still want those queued messages to sync. That is where WorkManager comes in -- it schedules background tasks that Android and iOS will execute even if the app is not running.

import 'package:workmanager/workmanager.dart';

const syncTaskName = 'com.widgetchat.syncPendingMessages';

@pragma('vm:entry-point')
void callbackDispatcher() {
  Workmanager().executeTask((task, inputData) async {
    if (task == syncTaskName) {
      final queue = MessageQueueService();
      await queue.initialize();

      final pending = queue.getPending();
      if (pending.isEmpty) return true;

      for (final message in pending) {
        // Give up on messages that have exhausted their retry budget
        // instead of hammering the API forever.
        if (message.retryCount >= 5) continue;
        try {
          await _sendToApi(message.content, message.conversationId);
          await queue.markSent(message.id);
        } catch (e) {
          await queue.markFailed(message.id);
        }
      }
      return true;
    }
    return false;
  });
}

void registerBackgroundSync() {
  Workmanager().initialize(callbackDispatcher, isInDebugMode: false);
  Workmanager().registerPeriodicTask(
    'sync-chat-messages',
    syncTaskName,
    frequency: const Duration(minutes: 15),
    constraints: Constraints(
      networkType: NetworkType.connected,
    ),
    existingWorkPolicy: ExistingWorkPolicy.keep,
  );
}

Key details: the Constraints(networkType: NetworkType.connected) ensures the task only runs when the device has connectivity. The existingWorkPolicy: ExistingWorkPolicy.keep prevents duplicate registrations. Call registerBackgroundSync() once during app initialization.

On iOS, background execution is more restricted. WorkManager uses BGTaskScheduler under the hood, which requires adding the Background fetch capability in Xcode and registering your task identifier in Info.plist. Periodic tasks on iOS have a minimum interval of 15 minutes and are subject to the system's discretionary scheduling.
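
Concretely, the Info.plist entry is the standard BGTaskScheduler allowlist. The identifier string must match the task identifier you hand to Workmanager -- shown here mirroring this article's syncTaskName:

```xml
<key>BGTaskSchedulerPermittedIdentifiers</key>
<array>
  <string>com.widgetchat.syncPendingMessages</string>
</array>
```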

Retry logic with exponential backoff

Hammering a failing endpoint with immediate retries is wasteful and can get you rate-limited. Exponential backoff spaces out retries progressively:

import 'dart:math';

class RetryService {
  static const _maxRetries = 5;
  static const _baseDelay = Duration(seconds: 2);
  static final _random = Random();

  static Future<T> withBackoff<T>(Future<T> Function() operation) async {
    int attempt = 0;

    while (true) {
      try {
        return await operation();
      } catch (e) {
        attempt++;
        if (attempt >= _maxRetries) rethrow;

        final delay = _baseDelay * (1 << attempt); // 4s, 8s, 16s, 32s
        // Real randomness for the jitter -- the clock's millisecond value
        // is not random enough to de-synchronize a fleet of clients.
        final jitter = Duration(
          milliseconds: (delay.inMilliseconds * 0.2 * _random.nextDouble()).round(),
        );
        await Future.delayed(delay + jitter);
      }
    }
  }
}

Use it when flushing the queue:

Future<void> _retrySend(PendingMessage message) async {
  try {
    await RetryService.withBackoff(() async {
      await _sendToApi(message.content, message.conversationId);
    });
    await _messageQueue.markSent(message.id);
  } catch (e) {
    await _messageQueue.markFailed(message.id);
  }
}

The jitter prevents the thundering herd problem -- if a thousand devices go offline at the same time (server outage, regional network issue), you do not want them all retrying at the exact same intervals.
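
To make the spacing concrete, here is a standalone sketch (plain Dart, no Flutter) that prints the delay schedule a backoff like the one above produces. The 20% jitter bound is carried over from the code as an assumption:

```dart
import 'dart:math';

// Delay before retry number [attempt] (1-based), mirroring the
// base * 2^attempt formula used by RetryService.
Duration backoffDelay(int attempt, {Duration base = const Duration(seconds: 2)}) {
  final delay = base * (1 << attempt); // 4s, 8s, 16s, 32s for attempts 1..4
  final jitterMs = (delay.inMilliseconds * 0.2 * Random().nextDouble()).round();
  return delay + Duration(milliseconds: jitterMs);
}

void main() {
  for (var attempt = 1; attempt < 5; attempt++) {
    print('attempt $attempt: ${backoffDelay(attempt)}');
  }
}
```

Each device lands somewhere inside its jitter window, so a synchronized outage does not produce synchronized retries.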

Conflict resolution when syncing

What happens when a queued message no longer makes sense by the time it syncs? The user might have sent "What time does the store close?" at 8 PM while offline, but by the time the message syncs it is 11 PM. You need a strategy.

Time-based expiration is the simplest approach. Discard messages older than a threshold:

List<PendingMessage> getValidPending({Duration maxAge = const Duration(hours: 2)}) {
  final cutoff = DateTime.now().subtract(maxAge);
  return _box.values
      .where((m) =>
          (m.status == 'pending' || m.status == 'failed') &&
          m.createdAt.isAfter(cutoff))
      .toList()
    ..sort((a, b) => a.createdAt.compareTo(b.createdAt));
}

Deduplication prevents the same message from being sent twice if the original request actually made it through but the response was lost:

Future<void> enqueue(String content, {String? conversationId}) async {
  // Check for duplicate content within the last 5 minutes
  final recentDuplicates = _box.values.where((m) =>
      m.content == content &&
      m.createdAt.isAfter(DateTime.now().subtract(const Duration(minutes: 5))));

  if (recentDuplicates.isNotEmpty) return; // Skip duplicate

  final message = PendingMessage(
    id: DateTime.now().millisecondsSinceEpoch.toString(),
    content: content,
    createdAt: DateTime.now(),
    conversationId: conversationId,
  );
  await _box.put(message.id, message);
}

Order preservation matters for conversational context. Always sort by createdAt and send messages sequentially, not in parallel. If message 3 fails, stop the batch -- sending message 4 without the context of message 3 could confuse the AI.

Future<void> flushQueueSequentially() async {
  final pending = getValidPending();
  for (final message in pending) {
    try {
      await RetryService.withBackoff(() => _sendToApi(message.content, message.conversationId));
      await markSent(message.id);
    } catch (e) {
      await markFailed(message.id);
      break; // Stop processing -- subsequent messages depend on this one
    }
  }
}

Caching previous responses for offline browsing

Users should be able to scroll through previous conversations while offline. Cache the last N responses locally:

class ResponseCacheService {
  static const _boxName = 'cached_responses';
  late Box<Map> _box;

  Future<void> initialize() async {
    _box = await Hive.openBox<Map>(_boxName);
  }

  Future<void> cacheResponse(String userMessage, String botResponse) async {
    await _box.add({
      'userMessage': userMessage,
      'botResponse': botResponse,
      'timestamp': DateTime.now().toIso8601String(),
    });

    // Keep only last 100 exchanges
    if (_box.length > 100) {
      await _box.deleteAt(0);
    }
  }

  List<Map> getCachedConversation() {
    return _box.values.toList();
  }

  String? findCachedAnswer(String question) {
    // Simple keyword matching for common repeat questions
    final normalized = question.toLowerCase().trim();
    final matches = _box.values.where(
      (entry) => _similarity(entry['userMessage'].toString().toLowerCase(), normalized) > 0.8,
    );
    return matches.isNotEmpty ? matches.last['botResponse'] as String : null;
  }

  // Naive Jaccard similarity over whitespace-separated words. Good enough
  // for near-exact repeats; swap in something smarter for fuzzier matching.
  double _similarity(String a, String b) {
    final wordsA = a.split(RegExp(r'\s+')).toSet();
    final wordsB = b.split(RegExp(r'\s+')).toSet();
    if (wordsA.isEmpty || wordsB.isEmpty) return 0;
    return wordsA.intersection(wordsB).length / wordsA.union(wordsB).length;
  }
}

When the user asks a question while offline, check the cache first. If there is a previous answer to a similar question, show it with a clear label indicating it is a cached response:

if (!_isOnline) {
  final cached = _responseCache.findCachedAnswer(userInput);
  if (cached != null) {
    _showMessage(
      'Based on a previous answer (you are currently offline):\n\n$cached',
      isCached: true,
    );
    return;
  }
  _messageQueue.enqueue(userInput);
  _showMessage('Message queued. You will get a response when back online.', isSystem: true);
}

Putting it all together

Here is the initialization sequence for your app:

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  // Hive setup and adapter registration happen inside
  // MessageQueueService.initialize(), so they are not repeated here --
  // registering the same typeId twice throws a HiveError.

  final connectivityService = ConnectivityService();
  await connectivityService.initialize();

  final messageQueue = MessageQueueService();
  await messageQueue.initialize();

  final responseCache = ResponseCacheService();
  await responseCache.initialize();

  registerBackgroundSync();

  runApp(MyApp(
    connectivityService: connectivityService,
    messageQueue: messageQueue,
    responseCache: responseCache,
  ));
}

The architecture is straightforward: ConnectivityService monitors network state, MessageQueueService persists unsent messages in Hive, ResponseCacheService stores past answers for offline access, RetryService handles backoff logic, and WorkManager syncs everything in the background.

The ChatWidget from widget_chat remains your primary conversation interface. These services wrap around it, catching messages when the network is unavailable and replaying them when it returns. Your users get a chatbot that works everywhere -- on the subway, in a dead zone, in a concrete parking garage. They never lose a message. They never see a blank screen.

That is what production-ready offline support looks like.


About the author

Widget Chat is a team of developers and designers passionate about creating the best AI chatbot experience for Flutter, web, and mobile apps.
