mistral_common.protocol.instruct.request
ChatCompletionRequest(**data)
Bases: BaseCompletionRequest, Generic[ChatMessageType]
Request for a chat completion.
Attributes:
Name | Type | Description |
---|---|---|
model | Optional[str] | The model to use for the chat completion. |
messages | List[ChatMessageType] | The messages to use for the chat completion. |
response_format | ResponseFormat | The format of the response. |
tools | Optional[List[Tool]] | The tools to use for the chat completion. |
tool_choice | ToolChoice | The tool choice to use for the chat completion. |
truncate_for_context_length | bool | Whether to truncate the messages to fit the model's context length. |
continue_final_message | bool | Whether to continue the final message. |
Examples:
>>> from mistral_common.protocol.instruct.messages import UserMessage, AssistantMessage
>>> from mistral_common.protocol.instruct.tool_calls import Tool, ToolTypes, ToolChoice, Function
>>> from mistral_common.protocol.instruct.request import ChatCompletionRequest, ResponseFormat, ResponseFormats
>>> request = ChatCompletionRequest(
... messages=[
... UserMessage(content="Hello!"),
... AssistantMessage(content="Hi! How can I help you?"),
... ],
... response_format=ResponseFormat(type=ResponseFormats.text),
... tools=[Tool(type=ToolTypes.function, function=Function(name="get_weather", parameters={}))],
... tool_choice=ToolChoice.auto,
... truncate_for_context_length=True,
... )
from_openai(messages, tools=None, continue_final_message=False, **kwargs) (classmethod)
Create a chat completion request from the OpenAI format.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
messages | List[Dict[str, Union[str, List[Dict[str, Union[str, Dict[str, Any]]]]]]] | The messages in the OpenAI format. | required |
tools | Optional[List[Dict[str, Any]]] | The tools in the OpenAI format. | None |
continue_final_message | bool | Whether to continue the final message. | False |
**kwargs | Any | Additional keyword arguments to pass to the constructor. These should be the same as the fields of the request class or the OpenAI API equivalent. | {} |
Returns:
Type | Description |
---|---|
ChatCompletionRequest | The chat completion request. |
to_openai(**kwargs)
Convert the request messages and tools into the OpenAI format.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
kwargs | Any | Additional parameters to be added to the request. | {} |
Returns:
Type | Description |
---|---|
Dict[str, List[Dict[str, Any]]] | The request in the OpenAI format. |
Examples:
>>> from mistral_common.protocol.instruct.messages import UserMessage
>>> from mistral_common.protocol.instruct.tool_calls import Tool, Function
>>> from mistral_common.protocol.instruct.request import ChatCompletionRequest
>>> request = ChatCompletionRequest(messages=[UserMessage(content="Hello, how are you?")], temperature=0.15)
>>> request.to_openai(stream=True)
{'temperature': 0.15, 'top_p': 1.0, 'response_format': {'type': 'text'}, 'tool_choice': 'auto', 'continue_final_message': False, 'messages': [{'role': 'user', 'content': 'Hello, how are you?'}], 'stream': True}
>>> request = ChatCompletionRequest(messages=[UserMessage(content="Hello, how are you?")], tools=[
... Tool(function=Function(
... name="get_current_weather",
... description="Get the current weather in a given location",
... parameters={
... "type": "object",
... "properties": {
... "location": {
... "type": "string",
... "description": "The city and state, e.g. San Francisco, CA",
... },
... "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
... },
... "required": ["location"],
... },
... ),
... )])
>>> request.to_openai()
{'temperature': 0.7, 'top_p': 1.0, 'response_format': {'type': 'text'}, 'tool_choice': 'auto', 'continue_final_message': False, 'messages': [{'role': 'user', 'content': 'Hello, how are you?'}], 'tools': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}]}
InstructRequest(**data)
Bases: MistralBase, Generic[ChatMessageType, ToolType]
A valid Instruct request to be tokenized.
Attributes:
Name | Type | Description |
---|---|---|
messages | List[ChatMessageType] | The history of the conversation. |
system_prompt | Optional[str] | The system prompt to be used for the conversation. |
available_tools | Optional[List[ToolType]] | The tools available to the assistant. |
truncate_at_max_tokens | Optional[int] | The maximum number of tokens to truncate the conversation at. |
continue_final_message | bool | Whether to continue the final message. |
Examples:
>>> from mistral_common.protocol.instruct.messages import UserMessage
>>> from mistral_common.protocol.instruct.request import InstructRequest
>>> request = InstructRequest(
...     messages=[UserMessage(content="Hello, how are you?")], system_prompt="You are a helpful assistant."
... )
from_openai(messages, tools=None, continue_final_message=False, **kwargs) (classmethod)
Create an instruct request from the OpenAI format.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
messages | List[Dict[str, Union[str, List[Dict[str, Union[str, Dict[str, Any]]]]]]] | The messages in the OpenAI format. | required |
tools | Optional[List[Dict[str, Any]]] | The tools in the OpenAI format. | None |
continue_final_message | bool | Whether to continue the final message. | False |
**kwargs | Any | Additional keyword arguments to pass to the constructor. These should be the same as the fields of the request class or the OpenAI API equivalent. | {} |
Returns:
Type | Description |
---|---|
InstructRequest | The instruct request. |
to_openai(**kwargs)
Convert the request messages and tools into the OpenAI format.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
kwargs | Any | Additional parameters to be added to the request. | {} |
|
Returns:
Type | Description |
---|---|
Dict[str, List[Dict[str, Any]]] | The request in the OpenAI format. |
Examples:
>>> from mistral_common.protocol.instruct.messages import UserMessage
>>> from mistral_common.protocol.instruct.tool_calls import Tool, Function
>>> from mistral_common.protocol.instruct.request import InstructRequest
>>> request = InstructRequest(messages=[UserMessage(content="Hello, how are you?")])
>>> request.to_openai(temperature=0.15, stream=True)
{'continue_final_message': False, 'messages': [{'role': 'user', 'content': 'Hello, how are you?'}], 'temperature': 0.15, 'stream': True}
>>> request = InstructRequest(
... messages=[UserMessage(content="Hello, how are you?")],
... available_tools=[
... Tool(function=Function(
... name="get_current_weather",
... description="Get the current weather in a given location",
... parameters={
... "type": "object",
... "properties": {
... "location": {
... "type": "string",
... "description": "The city and state, e.g. San Francisco, CA",
... },
... "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
... },
... "required": ["location"],
... },
... ),
... )])
>>> request.to_openai()
{'continue_final_message': False, 'messages': [{'role': 'user', 'content': 'Hello, how are you?'}], 'tools': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}]}
ResponseFormat(**data)
Bases: MistralBase
The format of the response.
Attributes:
Name | Type | Description |
---|---|---|
type | ResponseFormats | The type of the response. |
Examples: