Mistral: Mistral 7B Instruct v0.2

A high-performing, industry-standard 7.3B parameter model, with optimizations for speed and context length. An improved version of [Mistral 7B Instruct](/models/mistralai/mistral-7b-instruct-v0.1), with the following changes:

- 32k context window (vs 8k context in v0.1)
- Rope-theta = 1e6
- No Sliding-Window Attention


Architecture

Modality:
text->text
Input Modalities:
text
Output Modalities:
text
Tokenizer:
Mistral
Instruction Type:
mistral
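Since the instruction type is `mistral`, prompts follow the Mistral instruct template. Below is a minimal sketch of that formatting in Python; the exact template applied by the model's tokenizer (e.g. via `apply_chat_template` in Hugging Face `transformers`) is authoritative, and this simplified version assumes strictly alternating user/assistant turns.

```python
def build_mistral_prompt(messages):
    """Format a chat history using the Mistral instruct template (sketch).

    `messages` is a list of {"role": ..., "content": ...} dicts with
    alternating "user" and "assistant" turns.
    """
    prompt = "<s>"  # beginning-of-sequence token
    for msg in messages:
        if msg["role"] == "user":
            # User turns are wrapped in [INST] ... [/INST]
            prompt += f"[INST] {msg['content']} [/INST]"
        else:
            # Assistant turns are appended and closed with </s>
            prompt += f" {msg['content']}</s>"
    return prompt
```

For example, a single user turn `"Hi"` produces `<s>[INST] Hi [/INST]`, which the model then completes with the assistant response.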

Context and Limits

Context Length:
32768 Tokens
Max Response Tokens:
0 Tokens
Moderation:
Disabled

Pricing (RUB)

Request:
Image:
Web Search:
Internal Reasoning:
Prompt 1K Tokens:
Completion 1K Tokens:

Default Parameters

Temperature:
0.3
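To apply the default temperature of 0.3 in a request, the payload for an OpenAI-compatible chat-completions endpoint might look like the sketch below. The model slug `mistralai/mistral-7b-instruct-v0.2` and the field names are assumptions based on common API conventions, not confirmed by this page.

```python
def make_chat_request(prompt, max_tokens=256):
    """Build a chat-completions payload for this model (sketch;
    model slug and field names assume an OpenAI-compatible API)."""
    return {
        "model": "mistralai/mistral-7b-instruct-v0.2",  # assumed slug
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.3,  # the listed default
        "max_tokens": max_tokens,
    }
```

The payload would then be POSTed to the provider's chat-completions endpoint with an API key.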
