# Performance Tips

Optimize your Otesse integration for speed and efficiency with these strategies.
## Reduce API Calls

### Cache Frequently Accessed Data

Service configurations, industries, and zone data change infrequently. Cache them locally:
```typescript
import NodeCache from 'node-cache';

const cache = new NodeCache({ stdTTL: 3600 }); // 1 hour TTL

async function getIndustries() {
  const cached = cache.get('industries');
  if (cached) return cached;

  const industries = await otesse.services.listIndustries();
  cache.set('industries', industries);
  return industries;
}
```
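If you would rather avoid a dependency, the same pattern fits in a few lines with a plain `Map`. This is a minimal sketch; `cachedFetch` and the `store` map are illustrative helpers, not part of the Otesse SDK:

```typescript
// Minimal TTL cache: each entry stores a value plus an expiry timestamp.
const store = new Map<string, { value: unknown; expires: number }>();

async function cachedFetch<T>(
  key: string,
  fetcher: () => Promise<T>,
  ttlMs = 3_600_000, // 1 hour, matching the node-cache example above
): Promise<T> {
  const hit = store.get(key);
  if (hit && hit.expires > Date.now()) return hit.value as T; // cache hit

  const value = await fetcher();
  store.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}
```

Usage mirrors `getIndustries()` above: `cachedFetch('industries', () => otesse.services.listIndustries())`.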
### Use Webhooks Instead of Polling

Polling the API wastes requests and introduces latency. Use webhooks for real-time updates:
| Approach | Requests per Day | Latency |
|---|---|---|
| Poll every minute | 1,440 | Up to 60 seconds |
| Poll every 5 minutes | 288 | Up to 5 minutes |
| Webhooks | ~0 (only triggered by events) | Under 1 second |
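The table's request counts follow directly from the interval arithmetic. A quick sketch (`pollRequestsPerDay` is illustrative, not part of any SDK):

```typescript
// Requests per day when polling every `minutes` minutes.
function pollRequestsPerDay(minutes: number): number {
  const minutesPerDay = 24 * 60; // 1,440 minutes in a day
  return Math.floor(minutesPerDay / minutes);
}

pollRequestsPerDay(1); // 1440 — poll every minute
pollRequestsPerDay(5); // 288 — poll every 5 minutes
```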
### Batch Operations

Instead of fetching records one at a time, use list endpoints with filters and process the results locally:
```typescript
// SLOW — individual requests
for (const id of customerIds) {
  const customer = await otesse.customers.get(id);
  // process
}

// FAST — one request
const { data: customers } = await otesse.customers.list({
  ids: customerIds.join(','),
  perPage: 100,
});
// process all at once
```
## Optimize Request Patterns

### Use Field Selection

Request only the fields you need (when supported):

```
GET /v1/customers?fields=id,email,name,stats.total_bookings
```
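Building that query string safely is easiest with `URLSearchParams`, which handles encoding for you. A sketch under the assumption that the endpoint accepts a comma-separated `fields` parameter as shown above (`buildFieldsUrl` is a hypothetical helper):

```typescript
// Join field names into a single `fields` query parameter.
function buildFieldsUrl(base: string, fields: string[]): string {
  const params = new URLSearchParams({ fields: fields.join(',') });
  return `${base}?${params.toString()}`;
}

buildFieldsUrl('/v1/customers', ['id', 'email', 'name', 'stats.total_bookings']);
// → "/v1/customers?fields=id%2Cemail%2Cname%2Cstats.total_bookings"
```

Note that `URLSearchParams` percent-encodes the commas (`%2C`); servers decode this transparently.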
### Parallel Requests

When you need data from multiple endpoints, fetch in parallel:

```typescript
// SLOW — sequential
const customers = await otesse.customers.list();
const bookings = await otesse.bookings.list();
const invoices = await otesse.invoices.list();

// FAST — parallel
const [customers, bookings, invoices] = await Promise.all([
  otesse.customers.list(),
  otesse.bookings.list(),
  otesse.invoices.list(),
]);
```
### Connection Pooling

Reuse HTTP connections instead of creating new ones for each request:

```typescript
import { Agent } from 'https';

const agent = new Agent({
  keepAlive: true,
  maxSockets: 10,
});

// Pass to your HTTP client
const otesse = new OtesseClient({
  apiKey: process.env.OTESSE_API_KEY!,
  httpAgent: agent,
});
```
## Webhook Processing Performance

### Process Asynchronously

Never do heavy processing in the webhook handler:

```typescript
// SLOW — blocks the response
app.post('/webhook', async (req, res) => {
  await syncToDatabase(req.body);        // 200ms
  await updateExternalService(req.body); // 500ms
  await sendNotification(req.body);      // 300ms
  res.status(200).send('OK'); // 1 second later
});

// FAST — respond immediately
app.post('/webhook', async (req, res) => {
  await queue.add(req.body); // ~5ms
  res.status(200).send('OK');
});
```
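`queue.add` above assumes a real job queue (BullMQ, for example). A dependency-free sketch of the same idea — buffer the job, return immediately, and process it after the response has gone out — might look like this (`SimpleQueue` is illustrative, not part of the Otesse SDK):

```typescript
type Job = Record<string, unknown>;

// Minimal in-process queue: add() is O(1); a drain loop runs the handler
// on the next tick, after the webhook response has been sent.
class SimpleQueue {
  private jobs: Job[] = [];

  constructor(private handler: (job: Job) => Promise<void>) {}

  add(job: Job): void {
    this.jobs.push(job);
    setImmediate(() => void this.drain()); // defer heavy work off the request path
  }

  private async drain(): Promise<void> {
    while (this.jobs.length > 0) {
      const job = this.jobs.shift()!;
      await this.handler(job);
    }
  }
}
```

An in-process queue loses jobs if the process crashes or restarts, so for production a persistent, Redis-backed queue is the safer choice; the sketch only illustrates the respond-first pattern.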
### Batch Database Writes

If processing multiple webhook events, batch your database writes:

```typescript
// Instead of individual inserts per event,
// collect events and write in batches every N seconds.
const eventBuffer: WebhookEvent[] = [];

setInterval(async () => {
  if (eventBuffer.length === 0) return;
  const batch = eventBuffer.splice(0, eventBuffer.length);
  await db.webhookEvents.insertMany(batch);
}, 5000);
## Monitoring

Track these metrics to identify performance issues:
- API response times — Alert if p95 exceeds 500ms
- Rate limit proximity — Alert when `X-RateLimit-Remaining` drops below 20%
- Webhook processing time — Alert if events take more than 5 seconds to process
- Error rates — Alert if error rate exceeds 1%
- Cache hit rate — Should be above 90% for cached endpoints
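The p95 threshold above is straightforward to compute from a window of recent response times. A sketch using the nearest-rank method, one common definition of a percentile (`percentile` is an illustrative helper):

```typescript
// p95 by nearest-rank: sort ascending, take the value at ceil(0.95 * n) - 1.
function percentile(values: number[], p: number): number {
  if (values.length === 0) throw new Error('empty sample');
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[rank - 1];
}

const latenciesMs = [120, 95, 310, 80, 540, 150, 200, 110, 90, 130];
percentile(latenciesMs, 95); // → 540 — would trip a 500ms p95 alert
```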