Developer Documentation
gemini-2.5-flash-lite API Integration Guide
A complete API integration guide with Python, Node.js, and cURL code examples. Integrate Google's model in 5 minutes.
Code Examples
Python
from openai import OpenAI

# Configure the GetGoAPI client
client = OpenAI(
    api_key="YOUR_GETGOAPI_KEY",  # Obtain your key from https://getgoapi.com
    base_url="https://api.getgoapi.com/v1"
)

# Call gemini-2.5-flash-lite
response = client.chat.completions.create(
    model="gemini-2.5-flash-lite",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

print(response.choices[0].message.content)

Node.js
import OpenAI from 'openai';

// Configure the GetGoAPI client
const client = new OpenAI({
  apiKey: 'YOUR_GETGOAPI_KEY', // Obtain your key from https://getgoapi.com
  baseURL: 'https://api.getgoapi.com/v1'
});

// Call gemini-2.5-flash-lite
async function main() {
  const response = await client.chat.completions.create({
    model: 'gemini-2.5-flash-lite',
    messages: [
      { role: 'user', content: 'Hello, how are you?' }
    ]
  });
  console.log(response.choices[0].message.content);
}

main();

cURL
curl https://api.getgoapi.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_GETGOAPI_KEY" \
  -d '{
    "model": "gemini-2.5-flash-lite",
    "messages": [
      {
        "role": "user",
        "content": "Hello, how are you?"
      }
    ]
  }'

API Parameter Reference
| Parameter | Type | Description |
|---|---|---|
| model | string | Model name: gemini-2.5-flash-lite |
| messages | array | Array of conversation messages |
| temperature | number | Sampling temperature (0-2); defaults to 1 |
| max_tokens | number | Maximum number of output tokens |
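Putting the optional parameters together, a minimal Python sketch (reusing the client configuration from the example above) might look like the following; the temperature and max_tokens values shown are illustrative choices, not recommended settings.

from openai import OpenAI

# Same GetGoAPI client configuration as in the Python example above
client = OpenAI(
    api_key="YOUR_GETGOAPI_KEY",  # Obtain your key from https://getgoapi.com
    base_url="https://api.getgoapi.com/v1"
)

# Request using the optional parameters from the table;
# the values 0.7 and 256 are illustrative, not recommendations
response = client.chat.completions.create(
    model="gemini-2.5-flash-lite",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
    temperature=0.7,   # sampling temperature in the 0-2 range (default 1)
    max_tokens=256     # cap on the number of output tokens
)

print(response.choices[0].message.content)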