Use Perplexity’s Sonar API with OpenAI’s client libraries for seamless integration.
Supported request parameters:

- model - Model name (use Perplexity model names)
- messages - Chat messages array
- temperature - Sampling temperature (0-2)
- max_tokens - Maximum tokens in the response
- top_p - Nucleus sampling parameter
- frequency_penalty - Frequency penalty (-2.0 to 2.0)
- presence_penalty - Presence penalty (-2.0 to 2.0)
- stream - Enable streaming responses
- search_domain_filter - Limit results to, or exclude, specific domains
- search_recency_filter - Filter by content recency
- return_citations - Include citation URLs in the response
- return_images - Include image URLs in the response
- return_related_questions - Include related questions
- search_mode - “web” (default) or “academic” mode selector

Install the OpenAI client library to get started:

pip install openai
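For example, a minimal request might look like the following sketch (the model name, environment variable, and the extra_body values are illustrative; adjust them to your own setup):

```python
import os

from openai import OpenAI

# Point the standard OpenAI client at Perplexity's API endpoint.
client = OpenAI(
    api_key=os.environ["PERPLEXITY_API_KEY"],  # your Perplexity API key
    base_url="https://api.perplexity.ai",
)

response = client.chat.completions.create(
    model="sonar-pro",  # use a Perplexity model name
    messages=[
        {"role": "system", "content": "Be precise and concise."},
        {"role": "user", "content": "What is new in renewable energy this month?"},
    ],
    temperature=0.2,
    max_tokens=500,
    # Perplexity-specific parameters are not keyword arguments the OpenAI
    # client recognizes, so pass them through extra_body.
    extra_body={
        "search_recency_filter": "month",
        "search_mode": "web",
    },
)

print(response.choices[0].message.content)
```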
Key response fields:

- choices[0].message.content: The main model response
- model: The model used
- usage: Token usage details
- citations: (Perplexity-specific) List of source URLs
- search_results: (Perplexity-specific) Array of search result objects

Keep two differences from OpenAI in mind: the model field must use Perplexity model names (sonar-pro, sonar-reasoning, etc.), and authentication passes your Perplexity API key using the Bearer token format in the Authorization header.
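As a sketch of reading these fields with the OpenAI Python client: the Perplexity-specific fields (citations, search_results) are not part of the client's typed response model, so the example below dumps the response to a plain dict and reads them defensively (the search-result keys shown are assumptions):

```python
# `response` is the ChatCompletion object returned by chat.completions.create.
print(response.choices[0].message.content)  # main model response
print(response.model)                       # model used
print(response.usage.total_tokens)          # token usage details

# citations and search_results are Perplexity extensions, so read them from
# a plain dict rather than relying on typed attributes.
data = response.model_dump()
for url in data.get("citations", []):
    print("source:", url)
for result in data.get("search_results", []):
    print(result.get("title"), "->", result.get("url"))
```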