
DeepSeek

1. Overview

DeepSeek offers a cost-effective LLM family with relatively low prompt and completion costs, making it especially suitable for translation and other high-volume workloads.

2. Request

  • Method: POST
  • Endpoint:

    https://gateway.serevixai.ai/v1/chat/completions
    

3. Parameters

3.1 Header Parameters

  • Content-Type (string, required): Sets the request content type. Must be application/json.
  • Accept (string, required): Sets the response content type. application/json is recommended.
  • Authorization (string, required): API key used for authentication, in the format Bearer $YOUR_API_KEY.

3.2 Body Parameters (application/json)

  • model (string, required): The model ID to use. See the Model List for available versions. Example: deepseek-v3
  • messages (array, required): The chat message list, in an OpenAI-compatible format. Each object contains a role and a content field. Example: [{"role": "user", "content": "Hello"}]
      • role (string): The message role. Supported values: system, user, and assistant. Example: user
      • content (string): The message text. Example: Hello, tell me a joke.
  • temperature (number, optional): Sampling temperature in the range 0-2. Higher values make the output more random; lower values make it more focused and deterministic. Example: 0.7
  • top_p (number, optional): Nucleus sampling, in the range 0-1: only tokens within the top top_p probability mass are considered. Usually adjusted instead of temperature, not together with it. Example: 0.9
  • n (number, optional): How many completions to generate for each input message. Example: 1
  • stream (boolean, optional): Whether to stream the output. When true, the API returns incremental chunks in the OpenAI streaming format instead of a single response. Example: false
  • stop (string or array, optional): Up to 4 stop sequences. Generation halts when any of them appears in the output. Example: "\n"
  • max_tokens (number, optional): The maximum number of tokens to generate in a single reply, limited by the model's context window. Example: 1024
  • presence_penalty (number, optional): Range -2.0 to 2.0. Positive values encourage the model to introduce new topics; negative values reduce that tendency. Example: 0
  • frequency_penalty (number, optional): Range -2.0 to 2.0. Positive values reduce verbatim repetition; negative values increase it. Example: 0
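
The required fields and optional sampling parameters above combine into a single JSON request body. A minimal sketch in Python (the parameter names come from the table above; the specific values are illustrative, not tuned recommendations):

```python
import json

# Illustrative request body combining the required fields with a few of the
# optional sampling parameters documented above.
payload = {
    "model": "deepseek-v3",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Hello, tell me a joke."},
    ],
    "temperature": 0.7,  # 0-2; higher values give more random output
    # "top_p": 0.9,      # alternative to temperature; usually not set together
    "max_tokens": 1024,  # capped by the model's context window
    "stop": "\n\n",      # generation halts when this sequence appears
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

The resulting JSON string is what gets sent as the POST body to https://gateway.serevixai.ai/v1/chat/completions, together with the headers from section 3.1.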

4. Request Examples

HTTP:

POST /v1/chat/completions
Content-Type: application/json
Accept: application/json
Authorization: Bearer $YOUR_API_KEY

{
    "model": "deepseek-v3",
    "messages": [
        {
            "role": "user",
            "content": "Hello, can you explain quantum mechanics to me?"
        }
    ]
}

cURL:

curl https://gateway.serevixai.ai/v1/chat/completions \
    -H "Content-Type: application/json" \
    -H "Accept: application/json" \
    -H "Authorization: Bearer $YOUR_API_KEY" \
    -d '{
        "model": "deepseek-v3",
        "messages": [{
            "role": "user",
            "content": "Hello, can you explain quantum mechanics to me?"
        }]
    }'
Go:

package main

import (
    "context"
    "fmt"

    "github.com/openai/openai-go"
    "github.com/openai/openai-go/option"
)

func main() {
    // Placeholder key; replace with your own API key.
    apiKey := "sk-123456789012345678901234567890123456789012345678"

    client := openai.NewClient(
        option.WithAPIKey(apiKey),
        option.WithBaseURL("https://gateway.serevixai.ai/v1"),
    )

    resp, err := client.Chat.Completions.New(
        context.Background(),
        openai.ChatCompletionNewParams{
            Model: "deepseek-v3",
            Messages: []openai.ChatCompletionMessageParamUnion{
                openai.UserMessage("Hello, can you explain quantum mechanics to me?"),
            },
        },
    )

    if err != nil {
        fmt.Println("error:", err)
        return
    }

    fmt.Println(resp.Choices[0].Message.Content)
}
Python:

#!/usr/bin/env python3

from openai import OpenAI

def main():
    # Placeholder key; replace with your own API key.
    api_key = "sk-123456789012345678901234567890123456789012345678"

    client = OpenAI(
        api_key=api_key,
        base_url="https://gateway.serevixai.ai/v1"
    )

    response = client.chat.completions.create(
        model="deepseek-v3",
        messages=[
            {"role": "user", "content": "Hello, can you explain quantum mechanics to me?"}
        ]
    )

    print(response.choices[0].message.content)

if __name__ == "__main__":
    main()
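
When stream is set to true, the API returns incremental chunks rather than a single completion. A hedged sketch of accumulating the streamed text, assuming OpenAI-style chunks in which choices[0].delta.content carries each text fragment (the chunks below are mocked for illustration, not captured gateway output):

```python
def collect_stream(chunks):
    """Concatenate the content deltas from a stream of chunk dicts."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)

# Mock chunks in the OpenAI streaming shape, for illustration only:
# the first chunk carries the role, the last carries the finish_reason.
mock_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Quantum "}}]},
    {"choices": [{"delta": {"content": "mechanics..."}}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},
]

print(collect_stream(mock_chunks))  # -> Quantum mechanics...
```

With the OpenAI SDKs shown above, the same accumulation is done by iterating over the response object returned when stream is enabled.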

5. Response Example

{
    "id": "chatcmpl-1234567890",
    "object": "chat.completion",
    "created": 1699999999,
    "model": "deepseek-v3",
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": "Quantum mechanics is the branch of physics that studies the microscopic world..."
            },
            "finish_reason": "stop"
        }
    ],
    "usage": {
        "prompt_tokens": 10,
        "completion_tokens": 30,
        "total_tokens": 40
    }
}
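
The fields of the response can be read programmatically. For example, in Python, using the sample payload above as input:

```python
import json

# The sample response from section 5, embedded verbatim.
sample = """
{
    "id": "chatcmpl-1234567890",
    "object": "chat.completion",
    "created": 1699999999,
    "model": "deepseek-v3",
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": "Quantum mechanics is the branch of physics that studies the microscopic world..."
            },
            "finish_reason": "stop"
        }
    ],
    "usage": {
        "prompt_tokens": 10,
        "completion_tokens": 30,
        "total_tokens": 40
    }
}
"""

resp = json.loads(sample)
answer = resp["choices"][0]["message"]["content"]
finish = resp["choices"][0]["finish_reason"]  # "stop" = model ended naturally
usage = resp["usage"]                          # token accounting for billing

print(answer)
print(finish, usage["total_tokens"])
```

The usage block is what high-volume workloads (such as the translation use case in the overview) would track to monitor cost.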