Mistral: Ministral 3B

Ministral 3B is a 3B parameter model optimized for on-device and edge computing. It excels in knowledge, commonsense reasoning, and function-calling, outperforming larger models like Mistral 7B on most benchmarks. Supporting up to 128k context length, it’s ideal for orchestrating agentic workflows and specialist tasks with efficient inference.

Architecture

Modality:
text -> text
Input Modalities:
text
Output Modalities:
text
Tokenizer:
Mistral

Context and Limits

Context Length:
32,768 tokens
Max Response Tokens:
0 tokens
Moderation:
Disabled
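The limits above imply a simple token budget: prompt and completion together must fit in the 32,768-token window. A minimal budgeting sketch, with the listing's values hard-coded and the assumption (not confirmed here) that a max-response value of 0 means "no explicit cap":

```python
CONTEXT_LENGTH = 32_768  # context window from this listing
MAX_RESPONSE = 0         # assumed to mean "no explicit cap"

def completion_budget(prompt_tokens: int) -> int:
    """Tokens left for the completion after the prompt is counted."""
    remaining = CONTEXT_LENGTH - prompt_tokens
    if remaining <= 0:
        raise ValueError("prompt alone exceeds the context window")
    # With no explicit cap, the whole remainder is available.
    return remaining if MAX_RESPONSE == 0 else min(remaining, MAX_RESPONSE)
```

For example, a 30,000-token prompt leaves 2,768 tokens for the model's reply.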

Pricing (RUB)

Request:
Image:
Web Search:
Internal Reasoning:
Prompt (per 1K tokens):
Completion (per 1K tokens):

Default Parameters

Temperature:
0.3
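With that default, a request can be sketched against an OpenAI-compatible chat-completions endpoint. This is a minimal payload-building example; the model ID below is a placeholder, not a confirmed identifier for this service:

```python
import json

def build_chat_request(user_message: str) -> dict:
    """Assemble a chat-completions payload using this listing's defaults."""
    return {
        "model": "mistralai/ministral-3b",  # hypothetical model ID
        "temperature": 0.3,                 # default from this listing
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("Summarize function calling in one sentence.")
print(json.dumps(payload, indent=2))
```

Lower temperatures such as 0.3 favor more deterministic output, which suits the function-calling and specialist tasks this model targets.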
