Overview
Qwen models support function calling capabilities, allowing the model to intelligently decide when to call external functions and generate appropriate parameters. This enables integration with APIs, databases, and other external services.
Qwen supports both Qwen-style and GPT-style function calling formats, providing flexibility for different use cases.
Qwen-Style Functions
Qwen uses a custom function definition format that includes both human-readable and model-readable descriptions:
functions = [
    {
        'name_for_human': '谷歌搜索',  # "Google Search"
        'name_for_model': 'google_search',
        # Description (Chinese, intentional; see Best Practices): "Google Search is a
        # general-purpose search engine that can be used to access the internet, look up
        # encyclopedic knowledge, and follow current events."
        'description_for_model': '谷歌搜索是一个通用搜索引擎,可用于访问互联网、查询百科知识、了解时事新闻等。 Format the arguments as a JSON object.',
        'parameters': [
            {
                'name': 'search_query',
                'description': '搜索关键词或短语',  # "Search keywords or phrase"
                'required': True,
                'schema': {'type': 'string'},
            }
        ],
    },
    {
        'name_for_human': '文生图',  # "Text-to-Image"
        'name_for_model': 'image_gen',
        # Description (Chinese): "Text-to-Image is an AI painting (image generation)
        # service: given a text description, it returns the URL of an image generated
        # from that text."
        'description_for_model': '文生图是一个AI绘画(图像生成)服务,输入文本描述,返回根据文本作画得到的图片的URL。 Format the arguments as a JSON object.',
        'parameters': [
            {
                'name': 'prompt',
                'description': '英文关键词,描述了希望图像具有什么内容',  # "English keywords describing the desired image content"
                'required': True,
                'schema': {'type': 'string'},
            }
        ],
    },
]
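Before inference, these definitions are flattened into a tool-description section of the model's prompt. The template below is an illustrative approximation of that rendering, not the exact string the model was trained on:

```python
import json

# Approximate per-tool prompt template (assumption; the real template may differ).
TOOL_TEMPLATE = (
    '{name_for_model}: Call this tool to interact with the {name_for_human} API. '
    'What is the {name_for_human} API useful for? {description_for_model} '
    'Parameters: {parameters}'
)

def render_tool(fn):
    # Flatten one Qwen-style function dict into a single prompt line.
    return TOOL_TEMPLATE.format(
        name_for_model=fn['name_for_model'],
        name_for_human=fn['name_for_human'],
        description_for_model=fn['description_for_model'],
        parameters=json.dumps(fn['parameters'], ensure_ascii=False),
    )

fn = {
    'name_for_human': 'Google Search',
    'name_for_model': 'google_search',
    'description_for_model': 'A general-purpose search engine. Format the arguments as a JSON object.',
    'parameters': [
        {'name': 'search_query', 'description': 'search phrase',
         'required': True, 'schema': {'type': 'string'}},
    ],
}

print(render_tool(fn))
```

The rendered lines for all tools are concatenated into the system portion of the prompt, which is why clear `description_for_model` text matters.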
GPT-Style Functions
Qwen also supports OpenAI-compatible function definitions:
functions = [
    {
        'name': 'get_current_weather',
        'description': 'Get the current weather in a given location.',
        'parameters': {
            'type': 'object',
            'properties': {
                'location': {
                    'type': 'string',
                    'description': 'The city and state, e.g. San Francisco, CA',
                },
                'unit': {
                    'type': 'string',
                    'enum': ['celsius', 'fahrenheit'],
                },
            },
            'required': ['location'],
        },
    }
]
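Because `parameters` is a JSON Schema object, the arguments the model produces can be checked against it before execution. A minimal required-key check is sketched below (full schema validation would use a library such as jsonschema):

```python
import json

# The 'parameters' schema from the definition above.
schema = {
    'type': 'object',
    'properties': {
        'location': {'type': 'string'},
        'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']},
    },
    'required': ['location'],
}

def check_arguments(raw_arguments, schema):
    # Parse the JSON string the model produced and verify required keys exist.
    args = json.loads(raw_arguments)
    missing = [k for k in schema.get('required', []) if k not in args]
    if missing:
        raise ValueError(f'missing required arguments: {missing}')
    return args

args = check_arguments('{"location": "Boston, MA"}', schema)
```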
Using Function Calling with OpenAI API
Qwen provides an OpenAI-compatible API server for function calling:
Basic Function Call
import json
import openai

# Configure the OpenAI-compatible API endpoint.
# Note: this example uses the legacy openai SDK (openai<1.0), which provides
# openai.ChatCompletion.
openai.api_base = 'http://localhost:8000/v1'
openai.api_key = 'none'

def call_qwen(messages, functions=None):
    if functions:
        response = openai.ChatCompletion.create(
            model='Qwen',
            messages=messages,
            functions=functions,
        )
    else:
        response = openai.ChatCompletion.create(
            model='Qwen',
            messages=messages,
        )
    return response.choices[0]['message']

# Example: weather query ("What is the weather like in Boston?")
messages = [
    {'role': 'user', 'content': '波士顿天气如何?'}
]

functions = [
    {
        'name': 'get_current_weather',
        'description': 'Get the current weather in a given location.',
        'parameters': {
            'type': 'object',
            'properties': {
                'location': {
                    'type': 'string',
                    'description': 'The city and state, e.g. San Francisco, CA',
                },
                'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']},
            },
            'required': ['location'],
        },
    }
]

response = call_qwen(messages, functions)
print(response)
# Output includes a function_call with arguments: {"location": "Boston, MA"}
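The returned message carries the call as a JSON string, which the client decodes before dispatching. The message dict below mirrors the shape of that output; the values are illustrative:

```python
import json

# Shape of the assistant message returned when the model chooses to call a function
# (values here are illustrative, not captured from a live server).
message = {
    'role': 'assistant',
    'content': None,
    'function_call': {
        'name': 'get_current_weather',
        'arguments': '{"location": "Boston, MA"}',
    },
}

if message.get('function_call'):
    name = message['function_call']['name']
    arguments = json.loads(message['function_call']['arguments'])
```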
Setting Up the OpenAI API Server
Install Dependencies
pip install fastapi uvicorn openai pydantic sse_starlette
Clone Repository
git clone https://github.com/QwenLM/Qwen-7B
cd Qwen-7B
Start API Server
The server will start on http://localhost:8000/v1
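The launch command is not shown above; assuming the repository's `openai_api.py` script with default options, it would look like this:

```shell
# Launch the OpenAI-compatible API server from the repository root
# (script name and defaults assumed from the QwenLM/Qwen repository).
python openai_api.py
```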
Function Calling Workflow
1. Define Functions: Create function definitions with clear descriptions and parameter schemas.
2. Send User Query: Send the user's message along with the available functions to the model.
3. Model Decides: The model decides whether to call a function and generates appropriate arguments.
4. Execute Function: Your application executes the function with the provided arguments.
5. Return Results: Send the function results back to the model for final response generation.
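Steps 4 and 5 can be sketched offline as message-list bookkeeping. The model reply and the `get_current_weather` implementation below are hypothetical stand-ins; in a real loop the final `messages` list would go back to the model via `call_qwen(messages)`:

```python
import json

# Hypothetical local implementation of the function the model may request.
def get_current_weather(location, unit='celsius'):
    # A real application would call a weather API here.
    return json.dumps({'location': location, 'temperature': 22, 'unit': unit})

# Step 3 output as the model might return it (hardcoded for illustration).
assistant_message = {
    'role': 'assistant',
    'content': None,
    'function_call': {
        'name': 'get_current_weather',
        'arguments': '{"location": "Boston, MA"}',
    },
}

messages = [{'role': 'user', 'content': 'What is the weather like in Boston?'}]
messages.append(assistant_message)

# Steps 4-5: execute the function, then append its result with role 'function'
# so the model can generate the final natural-language answer.
if assistant_message.get('function_call'):
    call = assistant_message['function_call']
    args = json.loads(call['arguments'])
    result = get_current_weather(**args)
    messages.append({
        'role': 'function',
        'name': call['name'],
        'content': result,
    })
```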
Integration with LangChain
Qwen works seamlessly with LangChain for agent-based function calling:
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(
    model_name='Qwen',
    openai_api_base='http://localhost:8000/v1',
    openai_api_key='EMPTY',
    streaming=False,
)

tools = load_tools(['arxiv'])

agent_chain = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)

agent_chain.run('查一下论文 1605.08386 的信息')  # "Look up information on paper 1605.08386"
Best Practices
Important considerations:
- Clear Descriptions: Provide detailed function descriptions to help the model understand when to use each function.
- Parameter Validation: Always validate function arguments before execution.
- Error Handling: Implement proper error handling for function calls.
- Chinese vs. English: The current version (as of 2023.08) of Qwen-7B-Chat performs better with Chinese tool-use prompts than English ones.
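The error-handling point can be made concrete with a dispatcher that returns an error payload instead of raising, so the model can recover on the next turn. The registry and weather function below are illustrative, not part of the Qwen API:

```python
import json

def get_current_weather(location, unit='celsius'):
    # Hypothetical implementation; replace with a real weather API call.
    return json.dumps({'location': location, 'temperature': 22, 'unit': unit})

# Map function names the model may emit to local callables.
FUNCTION_REGISTRY = {'get_current_weather': get_current_weather}

def execute_function_call(function_call):
    """Run a model-requested function; on failure, return an error payload
    (as the function result) instead of raising, so the model can react."""
    try:
        fn = FUNCTION_REGISTRY[function_call['name']]
        args = json.loads(function_call['arguments'])
        return fn(**args)
    except KeyError:
        return json.dumps({'error': 'unknown function: ' + function_call.get('name', '?')})
    except (json.JSONDecodeError, TypeError) as exc:
        return json.dumps({'error': 'bad arguments: ' + str(exc)})
```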
Supported Models
Function calling is supported on:
Qwen-7B-Chat
Qwen-14B-Chat
Qwen-72B-Chat
Qwen-1.8B-Chat
All models support both Qwen-style and GPT-style function definitions.
Example Use Cases
- Web Search: Integrate search engines to answer factual questions
- Image Generation: Create images based on text descriptions
- Weather APIs: Get real-time weather information
- Database Queries: Retrieve information from databases
- Calculator: Perform mathematical calculations
- Code Execution: Execute code and return results
Next Steps
Tool Use: Learn about ReAct prompting and tool integration
Agent Building: Build intelligent agents with Qwen