Rate Limiting
API rate limits ensure fair usage and system stability for all clients.
Rate Limits
Per-minute limit: maximum requests allowed within a 60-second window
Per-hour limit: maximum requests allowed within a 60-minute window
Note: Rate limits are applied per API key. Each tenant can have multiple API keys.
Response Headers
Every API response includes rate limit headers to help you track your usage:
X-RateLimit-Limit: Maximum number of requests allowed in the current window
X-RateLimit-Remaining: Number of requests remaining in the current window
X-RateLimit-Reset: Unix timestamp (milliseconds) when the rate limit window resets
HTTP/1.1 200 OK
Content-Type: application/json
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 95
X-RateLimit-Reset: 1702905660000
{
  "id": "inv_abc123",
  "status": "PENDING",
  ...
}

Rate Limit Exceeded
When you exceed the rate limit, the API returns a 429 Too Many Requests status code:
HTTP/1.1 429 Too Many Requests
Content-Type: application/json
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 0
X-RateLimit-Reset: 1702905660000
Retry-After: 30
{
  "error": {
    "code": "RATE_LIMIT_EXCEEDED",
    "message": "Rate limit exceeded. Please retry after 30 seconds.",
    "retryAfter": 30
  }
}

Use the Retry-After header to determine when to retry your request.

Best Practices
1. Monitor Rate Limit Headers
Check X-RateLimit-Remaining in responses and slow down when approaching the limit.
2. Implement Exponential Backoff
When receiving 429 responses, implement exponential backoff using the Retry-After value as a starting point (a sketch follows this list).
3. Cache Responses
Cache frequently accessed data to reduce unnecessary API calls.
4. Use Webhooks
Instead of polling for invoice status updates, use webhooks to receive real-time notifications.
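Below is a minimal sketch of exponential backoff for item 2, assuming a request function that returns a standard fetch Response. The withBackoff helper and its parameters are hypothetical, not part of any FromChain SDK; it honors the Retry-After header when the server provides one and otherwise doubles the wait on each attempt, capped at 60 seconds.

// Hypothetical helper (sketch only): retry a request on 429 with exponential backoff.
async function withBackoff(
  fn: () => Promise<Response>,
  maxRetries = 5,
): Promise<Response> {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const response = await fn();
    if (response.status !== 429) {
      return response;
    }
    // Prefer the server's Retry-After hint; otherwise double the wait each attempt (capped at 60 s).
    const retryAfter = parseInt(response.headers.get('Retry-After') || '0', 10);
    const delayMs = retryAfter > 0
      ? retryAfter * 1000
      : Math.min(1000 * 2 ** attempt, 60_000);
    await new Promise(resolve => setTimeout(resolve, delayMs));
  }
  return fn(); // Final attempt, returned as-is even if still rate limited
}

In a real integration, a wrapper like this could be applied around the raw fetch call inside the client shown in the Implementation Example below, instead of the single fixed Retry-After sleep it uses.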
Implementation Example
class FromChainClient {
  private baseURL = 'https://api.fromchain.io';
  private apiKey: string;

  constructor(apiKey: string) {
    this.apiKey = apiKey;
  }

  async request(endpoint: string, options: RequestInit = {}) {
    const url = `${this.baseURL}${endpoint}`;
    const response = await fetch(url, {
      ...options,
      headers: {
        'Authorization': `Bearer ${this.apiKey}`,
        'Content-Type': 'application/json',
        ...options.headers,
      },
    });

    // Check rate limit headers
    const remaining = parseInt(response.headers.get('X-RateLimit-Remaining') || '0');
    const reset = parseInt(response.headers.get('X-RateLimit-Reset') || '0');

    // Warn when approaching the limit
    if (remaining < 10) {
      console.warn(`Rate limit warning: ${remaining} requests remaining (window resets at ${new Date(reset).toISOString()})`);
    }

    // Handle rate limit exceeded
    if (response.status === 429) {
      const retryAfter = parseInt(response.headers.get('Retry-After') || '60');
      console.error(`Rate limited. Retrying after ${retryAfter} seconds`);
      await this.sleep(retryAfter * 1000);
      return this.request(endpoint, options); // Retry the original request
    }

    if (!response.ok) {
      const error = await response.json();
      throw new Error(error.error?.message || 'API request failed');
    }

    return response.json();
  }

  private sleep(ms: number): Promise<void> {
    return new Promise(resolve => setTimeout(resolve, ms));
  }
}
// Usage
const client = new FromChainClient('gw_live_...');
try {
  const invoice = await client.request('/v1/invoices', {
    method: 'POST',
    body: JSON.stringify({
      amountExpected: '100.00',
      externalId: 'order_123',
    }),
  });
  console.log('Invoice created:', invoice.id);
} catch (error) {
  console.error('Error:', error.message);
}

Advanced: Request Batching
For high-volume integrations, consider batching operations to stay within rate limits:
class BatchProcessor {
  private queue: Array<() => Promise<any>> = [];
  private processing = false;
  private requestsPerMinute = 90; // Leave a buffer below the per-minute limit
  private delay = (60 * 1000) / this.requestsPerMinute;

  async add<T>(request: () => Promise<T>): Promise<T> {
    return new Promise((resolve, reject) => {
      this.queue.push(async () => {
        try {
          const result = await request();
          resolve(result);
        } catch (error) {
          reject(error);
        }
      });

      if (!this.processing) {
        this.process();
      }
    });
  }

  private async process() {
    this.processing = true;

    while (this.queue.length > 0) {
      const request = this.queue.shift();
      if (request) {
        await request();
        await this.sleep(this.delay);
      }
    }

    this.processing = false;
  }

  private sleep(ms: number): Promise<void> {
    return new Promise(resolve => setTimeout(resolve, ms));
  }
}
// Usage
const batch = new BatchProcessor();
const invoices = await Promise.all([
  batch.add(() => client.request('/v1/invoices', {
    method: 'POST',
    body: JSON.stringify({ amountExpected: '100.00' }),
  })),
  batch.add(() => client.request('/v1/invoices', {
    method: 'POST',
    body: JSON.stringify({ amountExpected: '200.00' }),
  })),
  // ... more requests
]);
console.log(`Created ${invoices.length} invoices`);