Version: 1.0.5

OpenAI Compatibility

Use ASI:One's API with OpenAI's client libraries for seamless integration.

Overview

ASI:One's API is fully compatible with OpenAI's Chat Completions API format. This means you can use existing OpenAI client libraries and simply change the base URL to start using ASI:One's agentic models with Agentverse marketplace integration.
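
As a minimal sketch of that setup, the only client-side changes from a stock OpenAI configuration are the API key (an ASI:One key) and the base_url; the value below matches the examples later in this page.

# Minimal client setup - only the key and base URL differ from a stock OpenAI setup
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",    # ASI:One API key, not an OpenAI key
    base_url="https://api.asi1.ai/v1"  # point the existing client at ASI:One
)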

API Compatibility

Standard OpenAI Parameters

These parameters work exactly the same as they do with OpenAI's API:

  • model - Model name (use ASI:One model names)
  • messages - Chat messages array
  • temperature - Sampling temperature (0-2)
  • max_tokens - Maximum tokens in response
  • top_p - Nucleus sampling parameter
  • frequency_penalty - Frequency penalty (-2.0 to 2.0)
  • presence_penalty - Presence penalty (-2.0 to 2.0)
  • stream - Enable streaming responses

ASI:One-Specific Parameters

These ASI:One-specific parameters are also supported (a short sketch after this list shows how to pass them through the OpenAI SDK):

  • web_search - Enable web search capabilities
  • x-session-id - Session ID for agentic model persistence (header)
  • Tool calling parameters for Agentverse marketplace agent integration
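
With the OpenAI Python SDK, these extras are not named arguments; they go through the SDK's generic pass-throughs: web_search in extra_body and x-session-id in extra_headers. The sketch below only illustrates that mechanism; which parameters a given model actually needs is shown in the full examples and the model table further down.

# Sketch: ASI:One extras go through the SDK's extra_body / extra_headers pass-throughs
response = client.chat.completions.create(
    model="asi1-extended",
    messages=[{"role": "user", "content": "Hello"}],
    extra_body={"web_search": True},             # ASI:One body parameter
    extra_headers={"x-session-id": "your-uuid"}  # ASI:One header (needed for *-agentic models)
)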

Examples with OpenAI SDK

Install the OpenAI library: pip install openai

# Complete Request & Response
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

response = client.chat.completions.create(
    model="asi1-mini",
    messages=[
        {"role": "system", "content": "Be precise and concise."},
        {"role": "user", "content": "What is agentic AI and how does it work?"}
    ],
    temperature=0.2,
    top_p=0.9,
    max_tokens=1000,
    presence_penalty=0,
    frequency_penalty=0,
    stream=False,
    extra_body={
        "web_search": False
    }
)

print(response.choices[0].message.content)
print(f"Usage: {response.usage}")

# Agentic Model with Session - Working Example
import uuid
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

# Generate a session ID for agentic models
session_id = str(uuid.uuid4())

print(f"🆔 Session ID: {session_id}")
print("🔄 Making request to asi1-agentic...")

response = client.chat.completions.create(
    model="asi1-agentic",
    messages=[
        {"role": "user", "content": "Check the latest flight arrival status at Delhi airport."}
    ],
    extra_headers={
        "x-session-id": session_id
    },
    temperature=0.7,
    stream=True
)

print("📡 Response received, streaming content:\n")

# Handle the streaming response, printing only chunks that carry content
for chunk in response:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

print("\n\n🏁 Stream completed!")

# Web Search Integration
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

response = client.chat.completions.create(
    model="asi1-extended",
    messages=[
        {"role": "user", "content": "Latest developments in AI research"}
    ],
    extra_body={
        "web_search": True
    }
)

print(response.choices[0].message.content)

Understanding the Response Structure

After making a request, your response object includes both standard OpenAI fields and ASI:One-specific fields:

  • choices[0].message.content: The main model response
  • model: The model used
  • usage: Token usage details
  • executable_data: (ASI:One) Agent manifests and tool calls from Agentverse marketplace
  • intermediate_steps: (ASI:One) Multi-step reasoning traces
  • thought: (ASI:One) Model reasoning process

# Accessing response fields
print(response.choices[0].message.content)  # Main answer
print(response.model)                       # Model name
print(response.usage)                       # Token usage

# ASI:One-specific fields
if hasattr(response, 'executable_data'):
    print(response.executable_data)         # Agent calls
if hasattr(response, 'intermediate_steps'):
    print(response.intermediate_steps)      # Reasoning steps
if hasattr(response, 'thought'):
    print(response.thought)                 # Model thinking

Model Selection for OpenAI SDK

Choose the right ASI:One model based on your use case (a sketch after the table shows one way to handle the x-session-id requirement):

Model                   Best For                       OpenAI SDK Usage
asi1-mini               Fast responses, general chat   Standard OpenAI parameters
asi1-fast               Ultra-low latency              Standard OpenAI parameters
asi1-extended           Complex reasoning              Standard OpenAI parameters
asi1-agentic            Agent orchestration            Requires x-session-id header
asi1-fast-agentic       Real-time agents               Requires x-session-id header
asi1-extended-agentic   Complex workflows              Requires x-session-id header
asi1-graph              Data visualization             Standard OpenAI parameters
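
If your application switches between agentic and non-agentic models at runtime, one possible pattern (a sketch, not an official helper; the function name and the substring check on the model name are assumptions) is to attach the x-session-id header only when an agentic model is selected:

# Sketch: attach the session header only for *-agentic models (illustrative helper)
import uuid
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

def create_chat(model, messages, session_id=None, **kwargs):
    headers = {}
    if "agentic" in model:  # asi1-agentic, asi1-fast-agentic, asi1-extended-agentic
        headers["x-session-id"] = session_id or str(uuid.uuid4())
    return client.chat.completions.create(
        model=model,
        messages=messages,
        extra_headers=headers,
        **kwargs
    )

response = create_chat("asi1-mini", [{"role": "user", "content": "Hello!"}])
print(response.choices[0].message.content)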

Next Steps

Ready to get started with ASI:One's OpenAI-compatible API? Here's what to do next:

  1. Get your API key - Sign up and create your ASI:One API key
  2. Try the quickstart - Make your first API call in minutes
  3. Explore agentic models - Discover the power of Agentverse marketplace integration
  4. Learn about tool calling - Extend your applications with custom tools

Need help? Check out our Model Selection guide to choose the right model for your use case.