## Installation
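The package name below is an assumption inferred from the `costrace.init()` identifier used later in this README; verify it against your registry before installing:

```shell
# Assumed npm package name ("costrace"); confirm before use.
npm install costrace
```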
## Basic Usage
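A minimal sketch of the intended flow. This README confirms only that `costrace.init()` must run before LLM clients are created; the import form and option names are assumptions, and a local stub stands in for the package so the sketch is self-contained:

```typescript
// Stub standing in for `import costrace from "costrace"` so this sketch
// runs on its own; the option names here are assumptions, not the real API.
const costrace = {
  initialized: false,
  init(_options: { apiKey: string }) {
    // The real SDK would start tracing provider calls from this point on.
    this.initialized = true;
  },
};

// 1. Initialize Costrace before constructing any LLM client.
costrace.init({ apiKey: "YOUR_COSTRACE_API_KEY" });

// 2. Then create provider clients as usual; their calls get traced, e.g.:
//    const openai = new OpenAI();
//    await openai.chat.completions.create({ ... });

console.log(costrace.initialized); // → true
```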
## Configuration
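As a sketch of what a configuration object might look like: every field name below is an assumption — this README implies only an API key, a dashboard, and a custom-endpoint override — so treat this as illustration, not the SDK's real option set:

```typescript
// Hypothetical configuration shape; all field names are assumptions.
interface CostraceOptions {
  apiKey: string;    // dashboard API key (see Troubleshooting)
  endpoint?: string; // self-hosted/local backend URL (see Custom Endpoint)
  enabled?: boolean; // illustrative: e.g. disable tracing in tests
}

const options: CostraceOptions = {
  apiKey: "YOUR_COSTRACE_API_KEY",
  endpoint: "http://localhost:8080/traces", // illustrative URL
};

console.log(options.endpoint); // → http://localhost:8080/traces
```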
### Custom Endpoint
Point to a self-hosted or local backend by overriding the trace endpoint at initialization.

## Supported Providers
### OpenAI

- GPT-5 family: `gpt-5.2`, `gpt-5`, `gpt-5-mini`, `gpt-5-nano`
- GPT-4 family: `gpt-4o`, `gpt-4o-mini`, `gpt-4.1`, `gpt-4-turbo`, `gpt-4`
- GPT-3.5: `gpt-3.5-turbo`
- Other: `o3`, `o4-mini`
### Anthropic

- Opus: `claude-opus-4-6`, `claude-opus-4-1-20250805`, `claude-opus-4-20250514`
- Sonnet: `claude-sonnet-4-6`, `claude-sonnet-4-5-20250929`, `claude-sonnet-4-20250514`, `claude-3-7-sonnet-20250219`
- Haiku: `claude-haiku-4-5-20251001`, `claude-3-haiku-20240307`
### Google Gemini

- Gemini 2.0: `gemini-2.0-flash`, `gemini-2.0-flash-lite`
- Gemini 1.5: `gemini-1.5-pro`, `gemini-1.5-flash`, `gemini-1.5-flash-8b`
## What Gets Tracked

Every LLM API call sends a trace to your Costrace backend.

## Manual Cost Calculation

Costs can also be calculated locally, without sending traces.

## Requirements
- Node.js: 18 or higher (for native `fetch` support)
- Dependencies: none at runtime. Provider SDKs are peer dependencies.
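The manual cost calculation mentioned above can be sketched as a small pricing helper. The per-million-token prices, model entries, and function shape below are illustrative assumptions, not the SDK's real pricing data:

```typescript
// Illustrative price table (USD per 1M tokens); these numbers are
// placeholders, not the SDK's real pricing data.
const PRICES: Record<string, { input: number; output: number }> = {
  "gpt-4o": { input: 5, output: 15 },
  "gemini-1.5-flash": { input: 1, output: 3 },
};

// Hypothetical helper: cost of one call from its token counts.
function calculateCost(
  model: string,
  inputTokens: number,
  outputTokens: number,
): number {
  const price = PRICES[model];
  if (!price) throw new Error(`Unknown model: ${model}`);
  return (inputTokens * price.input + outputTokens * price.output) / 1_000_000;
}

// 1,000 input + 500 output tokens on the illustrative gpt-4o rates:
console.log(calculateCost("gpt-4o", 1000, 500)); // → 0.0125
```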
## Troubleshooting

### No traces appearing in the dashboard

- Check that `costrace.init()` is called before creating LLM clients
- Verify that your API key is correct
- Check the browser console (if client-side) or the terminal for warning messages
### TypeScript errors

The SDK is fully typed. If you see type errors, make sure the relevant provider SDK is installed.

### Traces not being sent

Check your browser console or terminal for `[Costrace]` warnings; the SDK reports errors via `console.warn()`.