Goliath 120B

A large language model created by merging two fine-tuned Llama 2 70B models, Xwin and Euryale, into a single 120B model. Credits to:
- [@chargoddard](https://huggingface.co/chargoddard) for developing [mergekit](https://github.com/cg123/mergekit), the framework used to merge the models.
- [@Undi95](https://huggingface.co/Undi95) for helping with the merge ratios.

#merge
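This kind of merge (sometimes called a "frankenmerge") builds a deeper model by stacking layer slices taken from two donor models of the same architecture. A conceptual sketch in Python, where the slice boundaries are purely illustrative and not Goliath's actual recipe:

```python
# Conceptual sketch of a layer-interleaving merge: build a deeper model
# by concatenating layer slices from two same-architecture donors.
# Slice boundaries below are illustrative, NOT Goliath's actual recipe.
def interleave_layers(model_a_layers, model_b_layers, slices):
    merged = []
    for source, start, end in slices:
        donor = model_a_layers if source == "a" else model_b_layers
        merged.extend(donor[start:end])
    return merged

# Llama 2 70B has 80 transformer layers per donor.
xwin = [f"xwin.layer{i}" for i in range(80)]
euryale = [f"euryale.layer{i}" for i in range(80)]

# Illustrative plan: overlapping slices from each donor yield a deeper stack.
plan = [("a", 0, 40), ("b", 20, 60), ("a", 40, 80)]
merged = interleave_layers(xwin, euryale, plan)
print(len(merged))  # 120 layers in this illustrative plan
```

In practice mergekit performs this slicing over real model weights; the sketch only shows why combining two 70B donors can produce a model in the 120B range.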

Architecture

Modality: text->text
Input Modalities: text
Output Modalities: text
Tokenizer: Llama 2
Instruction Type: airoboros
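The airoboros instruction type is a plain-text USER/ASSISTANT chat template. A minimal formatting sketch, assuming the common single-turn variant of the template (the exact system line can vary between fine-tunes):

```python
def format_airoboros(user_message: str, system: str = "A chat.") -> str:
    # Airoboros-style single-turn prompt: a short system line, then
    # USER/ASSISTANT turns. Generation continues after "ASSISTANT:".
    return f"{system}\nUSER: {user_message}\nASSISTANT:"

prompt = format_airoboros("Summarize this model card in one sentence.")
print(prompt)
```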

Context and Limits

Context Length: 6,144 tokens
Max Response Tokens: 512 tokens
Moderation: Disabled
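The prompt and the completion share the 6,144-token context window, and the completion is additionally capped at 512 tokens. A small sketch of the resulting token budget (the helper name is illustrative):

```python
CONTEXT_LENGTH = 6144      # total tokens the model can attend to
MAX_RESPONSE_TOKENS = 512  # per-response generation cap

def clamp_max_tokens(prompt_tokens: int) -> int:
    """Largest completion budget that still fits in the context window,
    never exceeding the per-response cap."""
    remaining = CONTEXT_LENGTH - prompt_tokens
    return max(0, min(MAX_RESPONSE_TOKENS, remaining))

print(clamp_max_tokens(1000))  # 512: the response cap applies
print(clamp_max_tokens(6000))  # 144: only 144 tokens of context remain
```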

Pricing (RUB)

Request:
Image:
Web Search:
Internal Reasoning:
Prompt (per 1K tokens):
Completion (per 1K tokens):

Default Parameters

Temperature: 0
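A default temperature of 0 makes sampling greedy (deterministic). A sketch of a request payload for an OpenAI-compatible chat endpoint using this page's defaults; the model slug is a placeholder assumption:

```python
def build_request(prompt: str, temperature: float = 0.0,
                  max_tokens: int = 512) -> dict:
    # Payload for an OpenAI-compatible chat completions endpoint.
    # "goliath-120b" is a placeholder slug, not a confirmed identifier.
    return {
        "model": "goliath-120b",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,  # page default: 0 (greedy decoding)
        "max_tokens": max_tokens,    # page limit: 512 response tokens
    }

payload = build_request("Hello!")
```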
