API Live Sync #2: Live Source Data Structures and Types


In this second post, I’ll delve into creating the structures and foundations that we can build upon later to fully implement live-sync. You can find the repository here.
The Mission: Eliminating Manual API Documentation
In case you just stumbled onto this part, or so we don't lose sight of what we're doing and why: picture this. You're building an API, adding new endpoints and changing parameters, while your testing tool (Postman/Hoppscotch) sits there with yesterday's manually created requests. You have to create new requests by hand, update parameters, fix request bodies, and maintain two sources of truth - your code and your collection. It's like having to update your GPS manually every time a new road is built.
That's exactly the problem I want to solve with the Live Sync feature for Hoppscotch. The goal? Enable code-first collection generation, similar to how Swagger generates documentation from annotated code. Think of it as giving your API testing tool the ability to stay in sync with your actual code automatically.
Phase 1: Building the Foundation (The "Don't Break Everything" Phase)
The Architecture: Like Building a House, But for Data
Before we could make anything "live," we needed solid foundations. Think of it like building a house - you don't start with the smart home features; you start by ensuring the walls won't fall.
Our architecture follows a clean separation of concerns:
┌─────────────────────────────────────────────────────────────┐
│ Live Sync Feature │
├─────────────────────────────────────────────────────────────┤
│ ┌─────────────────┐ ┌─────────────────┐ ┌──────────────┐ │
│ │ Live Source │ │ Sync Engine │ │ View Layer │ │
│ │ Manager │ │ │ │ │ │
│ └─────────────────┘ └─────────────────┘ └──────────────┘ │
├─────────────────────────────────────────────────────────────┤
│ Existing Hoppscotch Core │
└─────────────────────────────────────────────────────────────┘
The beauty of this approach? We're not reinventing the wheel. We're building on top of Hoppscotch's existing OpenAPI importer, which already knows how to turn specifications into collections. We're just making it...well, less manual.
The Type System
First things first - we needed to define what a "live source" actually is.
interface LiveSpecSource {
  id: string
  type: 'url' | 'file' // Because APIs live in two places: the cloud and your hard drive
  name: string
  status: 'connected' | 'error' | 'disconnected' | 'syncing'
  config: URLSourceConfig | FileSourceConfig
  syncStrategy: 'replace-all' | 'incremental'
  lastSync?: Date
  lastError?: string
  createdAt: Date
  updatedAt: Date
}
Why this structure? Well, we learned that OpenAPI specs generated from code can live in different places. Sometimes they're served by your development server at /openapi.json (the well-behaved ones), and sometimes they're generated as files during your build process (the ones that like to hide in your project folder).
The status field is particularly important - it's like a traffic light for your API connection. Green means "all good," red means "something's broken," and yellow means "we're working on it."
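To make the traffic-light analogy concrete, a tiny helper (hypothetical - `statusColor` is not part of the actual codebase) could map each status to a UI colour:

```typescript
// Hypothetical helper: maps a source status to the "traffic light"
// colour the UI could render. Illustrative only.
type SourceStatus = "connected" | "error" | "disconnected" | "syncing"

function statusColor(status: SourceStatus): "green" | "red" | "yellow" | "gray" {
  switch (status) {
    case "connected":
      return "green" // all good
    case "error":
      return "red" // something's broken
    case "syncing":
      return "yellow" // we're working on it
    case "disconnected":
      return "gray" // the source is paused or unreachable
  }
}
```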
Configuration
Different sources need different configurations. URL sources need polling intervals and authentication headers, while file sources need file paths and watch settings.
interface URLSourceConfig {
  url: string
  pollInterval?: number // Default: 30 seconds (so that it doesn't become spam)
  headers?: Record<string, string> // For those APIs that need a secret handshake
  timeout?: number // Because waiting forever is not a strategy
}

interface FileSourceConfig {
  filePath: string
  watchEnabled?: boolean // File watching: like a security camera for your specs
}
The polling interval deserves special mention. I defaulted to 30 seconds because:
It's frequent enough to catch code changes during active development
It's not so frequent that it overwhelms your development server
It aligns well with typical development workflows (code, test, repeat)
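As a sketch of how the poll interval could drive a fetch loop - `startPolling` and `onSpec` are illustrative names I'm assuming here, not the real service:

```typescript
// Minimal polling sketch (illustrative only): fetches the spec at the
// configured interval, falling back to the 30-second default.
interface URLSourceConfig {
  url: string
  pollInterval?: number
  headers?: Record<string, string>
  timeout?: number
}

const DEFAULT_POLL_INTERVAL_MS = 30_000 // the 30-second default discussed above

function startPolling(
  config: URLSourceConfig,
  onSpec: (body: string) => void
): () => void {
  const interval = config.pollInterval ?? DEFAULT_POLL_INTERVAL_MS
  const timer = setInterval(async () => {
    try {
      const res = await fetch(config.url, { headers: config.headers })
      if (res.ok) onSpec(await res.text())
    } catch {
      // Network errors are expected while the dev server restarts; keep polling.
    }
  }, interval)
  // The caller stops polling via the returned function.
  return () => clearInterval(timer)
}
```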
Storage
I implemented two storage strategies because flexibility is key:
InMemoryLiveSpecStorage: Perfect for testing and "I just want to try this out" scenarios
LocalStorageLiveSpecStorage: For when you want your settings to survive a browser refresh
interface LiveSpecStorage {
  loadSources(): Promise<LiveSpecSource[]>
  saveSources(sources: LiveSpecSource[]): Promise<void>
  loadSyncHistory(sourceId: string): Promise<SyncHistoryEntry[]>
  saveSyncHistoryEntry(entry: SyncHistoryEntry): Promise<void>
  // ... and more methods
}
The LocalStorage implementation handles date serialization (because JavaScript dates are... special), manages history cleanup (because infinite storage is a myth), and gracefully handles corruption (because localStorage is not as reliable as we'd like to believe).
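The date-serialization problem, for instance, needs an explicit revival step on the way out of localStorage. Here's a simplified sketch of the idea (the shape mirrors LiveSpecSource but is trimmed down; this is not the exact implementation):

```typescript
// Sketch: reviving Date fields after a JSON/localStorage round-trip.
interface StoredSource {
  id: string
  createdAt: Date
  updatedAt: Date
  lastSync?: Date
}

// JSON.stringify silently turns Dates into ISO strings...
function serializeSources(sources: StoredSource[]): string {
  return JSON.stringify(sources)
}

// ...so deserialization must turn those strings back into Date objects.
function deserializeSources(json: string): StoredSource[] {
  type RawSource = { id: string; createdAt: string; updatedAt: string; lastSync?: string }
  const raw: RawSource[] = JSON.parse(json)
  return raw.map((s) => ({
    id: s.id,
    createdAt: new Date(s.createdAt),
    updatedAt: new Date(s.updatedAt),
    lastSync: s.lastSync ? new Date(s.lastSync) : undefined,
  }))
}
```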
Utilities
I created a utility library that handles all the mundane-but-critical tasks:
// ID generation (because UUIDs are better than "source-1", "source-2")
export function generateSourceId(): string {
  return `live-spec-${uuidv4()}`
}

// Content hashing (for detecting changes without downloading the entire spec every time)
export function generateContentHash(content: string): string {
  // A simple but effective hash function
  let hash = 0
  for (let i = 0; i < content.length; i++) {
    const char = content.charCodeAt(i)
    hash = ((hash << 5) - hash) + char
    hash = hash & hash // Convert to 32-bit integer
  }
  return Math.abs(hash).toString(36)
}

// Time formatting (because "2024-01-15T10:30:00.000Z" is not user-friendly)
export function formatSyncTime(date: Date): string {
  const diffMinutes = Math.floor((Date.now() - date.getTime()) / (1000 * 60))
  if (diffMinutes < 1) return 'Just now'
  if (diffMinutes < 60) return `${diffMinutes} minute${diffMinutes === 1 ? '' : 's'} ago`
  // ... and so on
}
The content hashing function is particularly important - it's our way of quickly checking "has the generated spec changed?" without having to do a full comparison every time. It's like having a fingerprint for your API spec that updates when you modify your code.
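In practice the hash gates the sync: only re-import when the fingerprint changes. A sketch of that check - the `specChanged` helper and its per-source bookkeeping are illustrative assumptions, not the real sync engine:

```typescript
// The simple hash function from the utilities above.
function generateContentHash(content: string): string {
  let hash = 0
  for (let i = 0; i < content.length; i++) {
    hash = ((hash << 5) - hash) + content.charCodeAt(i)
    hash = hash & hash // keep it a 32-bit integer
  }
  return Math.abs(hash).toString(36)
}

// Illustrative bookkeeping: remember the last fingerprint per source.
const lastHashBySource = new Map<string, string>()

// Returns true (and records the new fingerprint) only when the spec changed.
function specChanged(sourceId: string, spec: string): boolean {
  const hash = generateContentHash(spec)
  if (lastHashBySource.get(sourceId) === hash) return false
  lastHashBySource.set(sourceId, hash)
  return true
}
```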
Validation
We implemented validation because, let's face it, users will input the most creative things, and we have to make sure our code won't break when faced with the wild.
export function validateURLSource(config: URLSourceConfig): SourceValidationResult {
  const errors: string[] = []
  const warnings: string[] = []

  try {
    const url = new URL(config.url)

    // Check protocol (because ftp://api.example.com is not going to work)
    if (!['http:', 'https:'].includes(url.protocol)) {
      errors.push('URL must use HTTP or HTTPS protocol')
    }

    // Warn about HTTP (because security matters)
    if (url.protocol === 'http:' && !isLocalhost(url.hostname)) {
      warnings.push('HTTP URLs are not secure. Consider using HTTPS.')
    }
  } catch (error) {
    errors.push('Invalid URL format')
  }

  // More validation logic...
  return { isValid: errors.length === 0, errors, warnings }
}
The validation is designed to be helpful, not just restrictive. We provide warnings for things that might work but aren't ideal (like HTTP URLs), and clear error messages for things that definitely won't work.
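Usage looks roughly like this. This is a self-contained restatement of the validator above for illustration; `isLocalhost` is a minimal stand-in for the real helper:

```typescript
interface SourceValidationResult {
  isValid: boolean
  errors: string[]
  warnings: string[]
}

// Minimal stand-in for the real isLocalhost helper.
function isLocalhost(hostname: string): boolean {
  return hostname === "localhost" || hostname === "127.0.0.1" || hostname === "[::1]"
}

function validateURLSource(config: { url: string }): SourceValidationResult {
  const errors: string[] = []
  const warnings: string[] = []
  try {
    const url = new URL(config.url)
    if (!["http:", "https:"].includes(url.protocol)) {
      errors.push("URL must use HTTP or HTTPS protocol")
    }
    if (url.protocol === "http:" && !isLocalhost(url.hostname)) {
      warnings.push("HTTP URLs are not secure. Consider using HTTPS.")
    }
  } catch {
    errors.push("Invalid URL format")
  }
  return { isValid: errors.length === 0, errors, warnings }
}

// Valid but not ideal: a warning, not an error.
const result = validateURLSource({ url: "http://api.example.com/openapi.json" })
```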
Testing: Because "It Works on My Machine" Isn't Good Enough
We wrote comprehensive tests for everything, because untested code is like a parachute that might work. Here's a taste:
describe('InMemoryLiveSpecStorage', () => {
  it('should limit sync history entries based on settings', async () => {
    // Set max entries to 2
    await storage.saveSettings({
      ...DEFAULT_LIVE_SPEC_SETTINGS,
      maxSyncHistoryEntries: 2,
    })

    // Add 3 entries
    for (let i = 1; i <= 3; i++) {
      const entry: SyncHistoryEntry = {
        id: `sync-${i}`,
        sourceId: 'source-1',
        timestamp: new Date(Date.now() + i * 1000),
        status: 'success',
      }
      await storage.saveSyncHistoryEntry(entry)
    }

    const history = await storage.loadSyncHistory('source-1')
    expect(history).toHaveLength(2) // Should keep only the most recent
  })
})
This test ensures that our storage doesn't become a digital hoarder, keeping every sync event since the dawn of time.
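The trimming rule the test pins down can be sketched as a pure function - keep only the `maxEntries` most recent entries. The `trimHistory` name and newest-first ordering are assumptions for illustration:

```typescript
// Simplified entry shape, mirroring the test above.
interface SyncHistoryEntry {
  id: string
  sourceId: string
  timestamp: Date
  status: "success" | "error"
}

// Sketch: keep only the most recent maxEntries, newest first.
function trimHistory(
  entries: SyncHistoryEntry[],
  maxEntries: number
): SyncHistoryEntry[] {
  return [...entries] // copy so the caller's array isn't mutated
    .sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime())
    .slice(0, maxEntries)
}
```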
What We've Accomplished So Far
We've successfully implemented:
Complete Type System: All the interfaces and types needed for live sync
Storage Layer: Both in-memory and localStorage implementations
Utility Functions: ID generation, content hashing, time formatting, and more
Validation System: Comprehensive validation for URLs, file paths, and configurations
Test Coverage: Extensive unit tests ensure everything works as expected
What's Next: The Road Ahead
The Live Source Manager Service
The next major milestone is implementing the LiveSpecSourceService - the orchestrator that will:
Manage connections to development servers and generated spec files
Handle fetching specs from development endpoints (FastAPI, Express, Spring, etc.)
Watch for file changes in generated OpenAPI specs
Coordinate with the existing OpenAPI importer
Provide events for UI updates
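The "provide events for UI updates" part could look something like this. This is purely speculative - the real service hasn't been built yet, and every name here is a placeholder:

```typescript
// Speculative sketch of the event side of the upcoming service.
type SyncEvent =
  | { type: "spec-updated"; sourceId: string }
  | { type: "sync-error"; sourceId: string; message: string }

class SyncEventBus {
  private listeners = new Set<(e: SyncEvent) => void>()

  // Subscribe; the returned function unsubscribes.
  onEvent(listener: (e: SyncEvent) => void): () => void {
    this.listeners.add(listener)
    return () => {
      this.listeners.delete(listener)
    }
  }

  // The service would call this after each sync attempt.
  emit(event: SyncEvent): void {
    for (const l of this.listeners) l(event)
  }
}
```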
Lessons Learned
Date Serialization is Tricky: JSON.stringify() turns dates into strings, but JSON.parse() doesn't turn them back into Date objects. I learned to handle this explicitly in our storage layer.
Testing Edge Cases Matters: The "what if the user enters an empty string?" scenarios are more common than you'd think.
Type Safety is Your Friend: TypeScript caught numerous potential runtime errors during development.
Progressive Enhancement: Start simple (manual refresh), add complexity gradually
Fail Gracefully: When things go wrong, keep the last known good state
Be Explicit: Clear error messages, obvious status indicators, no magic
Respect User Data: Never lose customizations without warning
Next up: Implementing the Live Source Manager Service - where we'll connect to the development servers and make collections truly code-driven! Stay tuned!
Written by Chijioke Ugwuanyi