Use Cases

Function Calling enables models to interact with external tools to retrieve real-time information or perform specific operations. This extends the model beyond plain text generation and improves data accuracy, supporting more dynamic and practical application scenarios. Example use cases for Function Calling:
  • Dynamic Information Queries: Retrieve real-time data such as weather, news, and stock prices from external systems via API calls. For example, when a user asks about the current weather, the model can call a weather API and report the current conditions rather than outdated forecasts.
  • Task Automation: Execute specific operations through function calls, allowing users to trigger automated backend operations through conversation. For example, when a user asks how to purchase tickets for an attraction, the model can call a ticketing website API and help the user complete the booking directly instead of only explaining the booking process.

Supported Models

The following models support Function Calling:

Usage

  1. Define the functions the model may call.
  2. Pass the tools parameter in the request to describe those functions to the model.

Examples

The following is a complete Python example demonstrating how to use Function Calling to query the current weather for a location. For the specific API format for Function Calling, see Create Chat Completion API.

1. Initialize the Client

You need to initialize the client with your Myrouter API key.
from openai import OpenAI
import json

client = OpenAI(
    base_url="https://api.myrouter.ai/openai",
    api_key="<Your API Key>",
)

model = "deepseek/deepseek-v3"

2. Define the Functions to Call

Define the functions for the model to call. The following Python example demonstrates a function for getting weather information.
# Example function to simulate fetching weather data.
def get_weather(location):
    """Get the current weather for a specified location"""
    print("Calling get_weather function, location: ", location)
    # In a real application, you would call an external weather API here.
    # This is a simplified example that returns hardcoded data.
    return json.dumps({"location": location, "temperature": "20 Celsius"})
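In a real application you will usually expose more than one tool. One common pattern (a sketch, not part of the API above) is to keep a registry mapping tool names, as declared in the tools parameter, to Python callables, so the dispatch code in step 4 does not need a separate branch per function. The get_time tool below is hypothetical, added only to illustrate the registry:

```python
import json

def get_weather(location):
    """Return hardcoded weather data for a location (stub)."""
    return json.dumps({"location": location, "temperature": "20 Celsius"})

def get_time(location):
    """Hypothetical second tool: return a hardcoded local time (stub)."""
    return json.dumps({"location": location, "time": "12:00"})

# Map each tool name (as declared in the tools parameter) to its callable
TOOL_REGISTRY = {
    "get_weather": get_weather,
    "get_time": get_time,
}

def dispatch(name, arguments_json):
    """Look up a tool by its declared name and call it with the
    JSON-encoded arguments produced by the model."""
    func = TOOL_REGISTRY[name]
    args = json.loads(arguments_json)
    return func(**args)

print(dispatch("get_weather", '{"location": "Shanghai"}'))
```

With this pattern, adding a tool means registering one more entry instead of editing the dispatch logic.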

3. Construct the API Request with Tools and User Message

Create the API call request. This request includes the tools parameter, which defines the functions for the model to use, along with the user’s message.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a location. The user must first provide the location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City information, e.g.: Shanghai",
                    }
                },
                "required": ["location"]
            },
        }
    },
]
messages = [
    {
        "role": "user",
        "content": "What's the weather like in Shanghai?"
    }
]
# Send the request and print the response
response = client.chat.completions.create(
    model=model,
    messages=messages,
    tools=tools,
)
# The model may answer directly instead of calling a tool,
# in which case tool_calls is None
tool_calls = response.choices[0].message.tool_calls
tool_call = tool_calls[0] if tool_calls else None
if tool_call:
    print(tool_call.model_dump())
Output:
{'id': '0', 'function': {'arguments': '{"location": "Shanghai"}', 'name': 'get_weather'}, 'type': 'function'}

4. Process the Function Call Result and Get the Final Answer

Next, process the function call, execute the get_weather function, and send the result back to the model to generate the final response for the user.
# Ensure tool_call is defined from the previous step
if tool_call:
    # Extend conversation history with the assistant tool call message
    messages.append(response.choices[0].message)

    function_name = tool_call.function.name
    if function_name == "get_weather":
        function_args = json.loads(tool_call.function.arguments)
        # Execute the function and get the response
        function_response = get_weather(
            location=function_args.get("location"))
        # Add the function response to the messages
        messages.append(
            {
                "tool_call_id": tool_call.id,
                "role": "tool",
                "content": function_response,
            }
        )

    # Get the final response from the model with function results
    answer_response = client.chat.completions.create(
        model=model,
        messages=messages,
        # Note: do not include the tools parameter here
    )
    print(answer_response.choices[0].message)
Output:
ChatCompletionMessage(content="The current temperature in Shanghai is 20 Celsius. Please note that weather conditions may change at any time. It is recommended to check the latest weather forecast for more accurate information.", refusal=None, role='assistant', function_call=None, tool_calls=None)
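The example above handles a single tool call, but a model may request several tool calls in one response, and each one needs a matching "tool" role message keyed by its tool_call_id. The sketch below shows that loop in isolation; the fake_calls objects are stand-ins that only mimic the shape of response.choices[0].message.tool_calls, so the loop can be shown without a live API call:

```python
import json
from types import SimpleNamespace

def get_weather(location):
    """Return hardcoded weather data for a location (stub)."""
    return json.dumps({"location": location, "temperature": "20 Celsius"})

TOOLS = {"get_weather": get_weather}

def run_tool_calls(tool_calls):
    """Execute every requested tool call and build the list of
    'tool' role messages to append to the conversation history."""
    tool_messages = []
    for call in tool_calls:
        func = TOOLS.get(call.function.name)
        if func is None:
            # Report unknown tools back to the model instead of crashing
            content = json.dumps({"error": f"unknown tool: {call.function.name}"})
        else:
            args = json.loads(call.function.arguments)
            content = func(**args)
        tool_messages.append({
            "tool_call_id": call.id,  # must match the id the model assigned
            "role": "tool",
            "content": content,
        })
    return tool_messages

# Simulated tool calls mimicking the SDK's tool_call objects
fake_calls = [
    SimpleNamespace(id="0", function=SimpleNamespace(
        name="get_weather", arguments='{"location": "Shanghai"}')),
]
print(run_tool_calls(fake_calls))
```

In the real flow you would pass response.choices[0].message.tool_calls instead of fake_calls, append the returned messages after the assistant message, and then make the follow-up request as in step 4.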

Complete Code

from openai import OpenAI
import json

client = OpenAI(
    base_url="https://api.myrouter.ai/openai",
    api_key="<Your API Key>",
)

model = "deepseek/deepseek-v3"

# Example function to simulate fetching weather data.
def get_weather(location):
    """Get the current weather for a specified location"""
    print("Calling get_weather function, location: ", location)
    # In a real application, you would call an external weather API here.
    # This is a simplified example that returns hardcoded data.
    return json.dumps({"location": location, "temperature": "20 Celsius"})

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a location. The user must first provide the location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City information, e.g.: Shanghai",
                    }
                },
                "required": ["location"]
            },
        }
    },
]

messages = [
    {
        "role": "user",
        "content": "What's the weather like in Shanghai?"
    }
]

# Send the request and print the response
response = client.chat.completions.create(
    model=model,
    messages=messages,
    tools=tools,
)

# The model may answer directly instead of calling a tool,
# in which case tool_calls is None
tool_calls = response.choices[0].message.tool_calls
tool_call = tool_calls[0] if tool_calls else None
if tool_call:
    print(tool_call.model_dump())

# Ensure tool_call is defined from the previous step
if tool_call:
    # Extend conversation history with the assistant tool call message
    messages.append(response.choices[0].message)

    function_name = tool_call.function.name
    if function_name == "get_weather":
        function_args = json.loads(tool_call.function.arguments)
        # Execute the function and get the response
        function_response = get_weather(
            location=function_args.get("location"))
        # Add the function response to the messages
        messages.append(
            {
                "tool_call_id": tool_call.id,
                "role": "tool",
                "content": function_response,
            }
        )

    # Get the final response from the model with function results
    answer_response = client.chat.completions.create(
        model=model,
        messages=messages,
        # Note: do not include the tools parameter here
    )
    print(answer_response.choices[0].message)