
Snowflake Arctic Instruct

Snowflake Arctic Instruct: Open-source enterprise LLM with 480B parameters, excelling in SQL, coding, and instruction following tasks. Apache-2.0 licensed.

Efficient enterprise-grade LLM with dense-MoE architecture for diverse AI applications.

Basic Information

Model Name: Snowflake Arctic Instruct

Developer/Creator: Snowflake AI Research Team

Release Date: April 24, 2024

Version: Not specified

Model Type: Large Language Model (LLM)

Description

Overview

Snowflake Arctic Instruct is an efficient, intelligent, and open-source language model developed by the Snowflake AI Research Team. It combines a dense transformer model with a Mixture of Experts (MoE) architecture, resulting in a powerful and flexible foundation for building AI-powered applications.

Key Features
  • Dense-MoE Hybrid transformer architecture
  • 480 billion total parameters, 17 billion active parameters
  • Optimized for inference efficiency
  • Instruction-tuned for improved performance on enterprise tasks
  • Apache-2.0 license for free use in research, prototypes, and products
Intended Use

Snowflake Arctic Instruct is designed for enterprise-level AI applications, excelling at tasks such as:

  • SQL generation
  • Code generation and understanding
  • Complex instruction following
  • Dialogue and conversational AI
  • Summarization
  • General language understanding and generation
Language Support

The model supports text input and output, including code generation.

Technical Details

Architecture

Snowflake Arctic Instruct features a unique Dense-MoE Hybrid transformer architecture:

  • 10 billion parameter dense transformer model
  • Residual 128x3.66 billion parameter MoE Multilayer Perceptron (MLP)
  • Top-2 gating technique for selecting active parameters
  • 35 transformer layers
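As a back-of-the-envelope check (the per-token routing assumption here, dense trunk plus top-2 experts, is ours), the quoted 480B total and 17B active parameter figures follow directly from these architecture numbers:

```python
# Sanity check of Arctic's parameter counts from the architecture figures above.
# Assumption: each token passes through the dense trunk plus its top-2 experts.
DENSE_PARAMS = 10e9     # dense transformer trunk
NUM_EXPERTS = 128       # residual MoE experts
EXPERT_PARAMS = 3.66e9  # parameters per expert MLP
TOP_K = 2               # top-2 gating

total = DENSE_PARAMS + NUM_EXPERTS * EXPERT_PARAMS   # all experts counted
active = DENSE_PARAMS + TOP_K * EXPERT_PARAMS        # only routed experts counted

print(f"total:  {total / 1e9:.0f}B")   # ~478B, quoted as ~480B
print(f"active: {active / 1e9:.1f}B")  # ~17.3B, quoted as ~17B
```

This is why MoE models like Arctic can be inference-efficient despite their size: compute per token scales with the ~17B active parameters, not the full 480B.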
Training Data

The training process for Arctic was split into three distinct stages, totaling approximately 3.5 trillion tokens:

  1. Phase 1: 1 trillion tokens
  2. Phase 2: 1.5 trillion tokens
  3. Phase 3: 1 trillion tokens

This curriculum-style staging introduced different competencies in a deliberate order, sharpening the model's performance on enterprise-focused tasks.

Knowledge Cutoff

The model's knowledge cutoff is early 2024.

Performance Metrics

Snowflake Arctic Instruct demonstrates strong performance across various benchmarks:

  • Excels at enterprise-specific tasks
  • Outperforms DBRX, Mixtral 8x7B, and Llama 2 70B on average across enterprise benchmarks
  • Competitive performance on general commonsense reasoning benchmarks
  • Achieves a score of 7.95 on MTBench, with a turn-1 score of 8.31
  • Performs competitively on the Helpful, Honest, & Harmless (HHH) alignment dataset

Usage

Code Samples
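As a hedged sketch, the model can be queried through the Hugging Face `transformers` library using the public `Snowflake/snowflake-arctic-instruct` checkpoint. Running inference requires very large GPU memory, so the heavy imports are deferred inside the generation helper and the example call is left commented out; treat this as a template rather than a turnkey script.

```python
# Sketch: querying Snowflake Arctic Instruct via Hugging Face `transformers`.
# The repo id matches the public checkpoint; actually generating requires
# hundreds of GB of GPU memory across multiple devices.

def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat format consumed by apply_chat_template."""
    return [{"role": "user", "content": user_prompt}]

def generate(user_prompt: str, max_new_tokens: int = 256) -> str:
    # Deferred imports: torch/transformers (and large GPUs) are only
    # needed when generation is actually invoked.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "Snowflake/snowflake-arctic-instruct"
    tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        trust_remote_code=True,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
    # Build the prompt with the model's own chat template.
    input_ids = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

# Example (requires suitable GPU hardware), using an enterprise-style
# SQL task where the model is reported to excel:
# print(generate("Write a SQL query returning the top 5 customers by total order value."))
```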

Ethical Guidelines

Snowflake has not published model-specific ethical guidelines. The model is released under the Apache-2.0 license, allowing free use in research, prototypes, and products.

Licensing

License Type: Apache-2.0

The Apache-2.0 license allows users to freely use, modify, and distribute the model in both commercial and non-commercial applications.
