
SOLAR 10.7B: Scaling Large Language Models with Simple yet Effective Depth Up-Scaling



Dahyun Kim and 17 other authors

Abstract: We introduce SOLAR 10.7B, a large language model (LLM) with 10.7 billion parameters that demonstrates superior performance across a variety of natural language processing (NLP) tasks. Inspired by recent efforts to efficiently up-scale LLMs, we present depth up-scaling (DUS), a method for scaling LLMs that combines depthwise scaling with continued pretraining. In contrast to other LLM up-scaling approaches that rely on mixture-of-experts, DUS requires no complex changes for efficient training and inference. We show experimentally that DUS is a simple yet effective way to scale up high-performance LLMs from smaller ones. Building on the DUS model, we additionally present SOLAR 10.7B-Instruct, a variant fine-tuned for instruction following that surpasses Mixtral-8x7B-Instruct. SOLAR 10.7B is publicly available under the Apache 2.0 license, promoting broad access and application in the LLM field.
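The abstract only sketches depthwise scaling at a high level. Below is a minimal, hedged illustration of one plausible reading of that step: duplicate the base model's transformer layers, trim the two copies where they meet, and concatenate them into a deeper stack (for example, a 32-layer base trimmed by 8 layers per copy yields 48 layers), after which continued pretraining recovers and improves performance. The function name `depth_up_scale`, the `n_remove` parameter, and the assumption that layers are exposed as a `torch.nn.ModuleList` are illustrative choices, not the paper's actual code.

```python
import copy

from torch import nn


def depth_up_scale(base_layers: nn.ModuleList, n_remove: int = 8) -> nn.ModuleList:
    """Sketch of depthwise scaling (DUS), not the authors' implementation.

    Duplicates the base model's layer stack, drops the last `n_remove` layers
    from the first copy and the first `n_remove` layers from the second copy,
    then concatenates the two. With a 32-layer base and n_remove=8 this gives
    2 * (32 - 8) = 48 layers. Continued pretraining is still required afterwards.
    """
    layers = list(base_layers)
    # First copy: keep everything except the last n_remove layers.
    first_half = layers[: len(layers) - n_remove]
    # Second copy: deep-copied so parameters are not shared, minus its first n_remove layers.
    second_half = [copy.deepcopy(layer) for layer in layers[n_remove:]]
    return nn.ModuleList(first_half + second_half)
```

As a usage sketch, applying this to a hypothetical 32-layer decoder-only base model would produce a 48-layer model with roughly 10.7B parameters at the 7B base scale, which is then further trained rather than used as-is.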

Submission history

From: Chanjun Park
[v1] Sat, 23 Dec 2023 05:11:37 UTC (557 KB)
[v2] Fri, 29 Dec 2023 01:51:29 UTC (783 KB)



