Snowflake Arctic vs DBRX: 10 Differences That You Should Know (2024)

AI has made incredible progress in recent years, with LLMs playing a significant part in this growth. It seems like almost every other week, a new LLM hits the market! These powerful language models can not only understand and create human-like text, but they can also generate code, making them useful in a ton of different ways. Two really impressive LLMs that have popped up recently are Snowflake Arctic and DBRX. Both of them have received a lot of attention for their impressive capabilities and open nature.

In this article, we will compare Snowflake Arctic vs DBRX across 10 key differences: architecture, parameters, hardware infrastructure, development timeline and cost, training tokens, SQL generation, programming and mathematical reasoning, instruction following (IFEval), common sense reasoning, and MMLU. Then, we'll highlight the unique capabilities and use cases of each model by giving them challenges to solve and seeing how they perform on each task.

What Is Snowflake Arctic?

Snowflake Arctic was announced last week as Snowflake's new cutting-edge, enterprise-grade open source large language model (LLM), causing quite a buzz in the LLM and AI communities.

Snowflake Arctic (Source: Snowflake.com) - Snowflake Arctic vs DBRX

Snowflake Arctic differentiates itself from other LLMs by its unique Dense-MoE Hybrid Transformer architecture, which delivers top-tier intelligence with excellent efficiency on a massive scale. It has 480 billion parameters distributed across 128 fine-grained experts and uses a top-2 gating technique to choose 17 billion active parameters.

Snowflake Arctic's key distinguishing feature is its open nature—Snowflake has released its weights under an Apache 2.0 license, creating a new standard for openness in enterprise LLM technology.

Snowflake Arctic has several key features, such as:

1) Efficiently Intelligent

Snowflake Arctic excels at enterprise tasks such as SQL generation, coding, and instruction following benchmarks, even when compared to open source models trained with significantly higher compute budgets. It sets a new baseline for cost-effective training, enabling Snowflake users to create high-quality custom models for their enterprise needs at a very low cost.

2) Breakthrough Efficiency

Snowflake Arctic offers top-tier enterprise intelligence among open source LLMs, excelling at tasks such as SQL, code generation, complex instruction following, and the ability to produce grounded answers.

3) Truly Open Source

Snowflake Arctic is available under an Apache 2.0 license, providing ungated access to its weights and code. Also, Snowflake has open sourced all of its data recipes and research insights.

4) Enterprise AI Focused

The Snowflake Arctic model is specifically tailored for enterprise AI needs, focusing on high-quality enterprise tasks. It's great for core applications like data analysis and automation, and it shines by outperforming larger models without needing lots of computing power (compared to other open models).

What Is Databricks DBRX?

Databricks DBRX, announced last month, is another cutting-edge, state-of-the-art language model. DBRX is a transformer-based, decoder-only large language model with a total of 132 billion parameters, 36 billion of which are active on any given input, thanks to its fine-grained Mixture of Experts (MoE) architecture. What makes DBRX special is that it uses 16 experts and selects 4 of them for any given input, yielding roughly 65 times more possible combinations of experts than coarser MoE designs, enhancing the model's overall quality and performance.
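That "~65 times" figure can be sanity-checked with a bit of combinatorics: choosing 4 of 16 experts gives C(16, 4) = 1,820 possible expert combinations per token, versus C(8, 2) = 28 for a coarser 8-expert, top-2 design (roughly the Mixtral-style layout Databricks compares against), and 1820 / 28 = 65. A quick check:

```python
from math import comb

dbrx_combos = comb(16, 4)    # 16 experts, 4 active per token
coarse_combos = comb(8, 2)   # an 8-expert, top-2 MoE for comparison
print(dbrx_combos, coarse_combos, dbrx_combos / coarse_combos)  # 1820 28 65.0
```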

Databricks DBRX (Source: Databricks.com) - Snowflake Arctic vs DBRX

Here are the key features of Databricks DBRX:

1) Advanced Architecture

DBRX distinguishes itself from other large language models (LLMs) with its advanced fine-grained Mixture Of Experts (MoE) architecture and innovative training process.

2) Built with Powerful Tools and Infrastructure

The Databricks DBRX model was built using powerful hardware infrastructure, advanced software tools, and deep expertise in large language model development. This includes:

  • Unity Catalog for data management and governance
  • Lilac AI for data exploration and analysis
  • Apache Spark and Databricks notebooks for data processing and cleaning
  • Optimized versions of open source training libraries like MegaBlocks, LLM Foundry, Composer, and Streaming for model training
  • MLflow for results logging

3) Transformer-Based and Decoder-Only

DBRX is a transformer-based, decoder-only large language model with 132 billion parameters, of which 36 billion are actively used during inference thanks to its unique fine-grained Mixture Of Experts (MoE) architecture.

4) Performance and Benchmarks

DBRX outperforms nearly all established open models and even rivals closed-source LLM giants like GPT-4, Gemini, and Mistral across various benchmarks. Databricks has released two variants, DBRX Base and DBRX Instruct, with DBRX Instruct excelling in various benchmarks, particularly in programming and mathematics.

5) Open Nature

The weights of the base model (DBRX Base) and the fine-tuned model (DBRX Instruct) are available on Hugging Face under an open license, promoting transparency and collaboration within the AI community.

6) Easy Setup and Accessibility

Databricks has made DBRX accessible through various user-friendly platforms and third-party integrations, catering to a wide range of users and use cases. It's also possible to set up DBRX on a local machine, although it requires significant processing/compute power and memory.

Top 10 Key Differences—Snowflake Arctic vs DBRX

If you're in a hurry, here's a quick rundown of 10 key differences between Snowflake Arctic and DBRX.

Snowflake Arctic vs DBRX top 10 key differences - Snowflake Arctic vs DBRX

1). Snowflake Arctic vs DBRX—Architecture Breakdown

Databricks DBRX: Fine-grained MoE architecture

Databricks DBRX uses a transformer-decoder architecture, a popular choice for large language models. Unlike some models, which can encode and decode text, DBRX concentrates solely on text generation. To optimize efficiency, it uses a fine-grained Mixture Of Experts (MoE) technique, meaning DBRX has a pool of 132 billion parameters but only activates a relevant subset of 36 billion for each task.

Basically, it works like a team of experts, calling upon the most relevant ones to address a given request. This allows DBRX to strike a balance between delivering powerful results and maintaining resource efficiency.

Snowflake Arctic: Dense MoE hybrid transformer architecture

Snowflake Arctic uses a unique method with its Dense-MoE hybrid transformer architecture, which blends a foundational dense transformer model with a significantly larger residual MoE MLP (Multilayer Perceptron).

While the MLP has a large number of parameters (about 470 billion), only a tiny fraction (approximately 17 billion) is used during inference due to the MoE nature. This combination offers efficient training and inference.

The dense transformer provides a strong base, while the MoE MLP acts like a pool of experts, activating only the relevant sub-model for a specific task.

To increase efficiency, Snowflake Arctic utilizes top-2 gating within the MoE MLP, which ensures that the most relevant experts are chosen for the job.

As a whole, this architecture enables Snowflake Arctic to achieve excellent performance while preserving resources.
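To make the routing idea concrete, here is a toy sketch of top-2 gating. This is an illustration only, not Arctic's actual routing code: a router scores all 128 experts for a token, the two highest-scoring experts are kept, and their weights are renormalized so the selected experts' contributions sum to one.

```python
import numpy as np

def top2_gate(router_logits):
    """Pick the 2 highest-scoring experts and softmax-normalize their weights."""
    top2 = np.argsort(router_logits)[-2:]  # indices of the 2 best experts
    w = np.exp(router_logits[top2] - router_logits[top2].max())
    return top2, w / w.sum()

rng = np.random.default_rng(0)
logits = rng.normal(size=128)  # one router score per expert (Arctic has 128)
experts, weights = top2_gate(logits)
# Only 2 of the 128 experts run for this token; their weights sum to 1.
```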

2). Snowflake Arctic vs DBRX—Total Parameters and Active Parameters

How Many Parameters Does DBRX Have?

DBRX is a transformer-based LLM with a total of 132 billion parameters. However, due to its MoE architecture, only a subset of these parameters, 36 billion, is active on any given input. This fine-grained MoE design allows DBRX to selectively utilize different combinations of experts, resulting in improved model quality and efficiency.

How Many Parameters Does Snowflake Arctic Have?

Snowflake Arctic contains a remarkable 480 billion total parameters, making it one of the largest LLMs currently available. However, like DBRX, Snowflake Arctic's MoE architecture means that not all of these parameters are active at once. Instead, Arctic employs a top-2 gating mechanism to choose 17 billion active parameters from its 128 experts.

This dense-MoE hybrid design, with a large number of total parameters and many experts, allows Snowflake Arctic to leverage a vast model capacity for top-tier intelligence. At the same time, it maintains resource-efficient training and inference by engaging a moderate number of active parameters.
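The contrast in routing sparsity is worth spelling out: Arctic activates only about 3.5% of its parameters per token (17B of 480B), while DBRX activates roughly 27% (36B of 132B). A quick back-of-the-envelope check:

```python
arctic_active, arctic_total = 17, 480  # billions of parameters
dbrx_active, dbrx_total = 36, 132

print(f"Arctic: {arctic_active / arctic_total:.1%} active")  # Arctic: 3.5% active
print(f"DBRX:   {dbrx_active / dbrx_total:.1%} active")      # DBRX:   27.3% active
```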

3). Snowflake Arctic vs DBRX—Hardware Infrastructure Used

DBRX: 3072 NVIDIA H100s

The development of DBRX was a computationally intensive task, requiring substantial hardware resources. According to Databricks, DBRX was trained on 3,072 NVIDIA H100 GPUs, which are among the most powerful and efficient GPUs currently available for AI workloads.

Snowflake Arctic: 1000 GPUs

The development of Snowflake Arctic was also a computationally intensive task, but it required significantly fewer hardware resources compared to DBRX. According to Snowflake, Arctic was trained on 1,000 NVIDIA GPUs, approximately one-third of the GPU count used for DBRX.

4). Snowflake Arctic vs DBRX—Development Timeline and Cost

Databricks DBRX: 3 months + ~$10 million

The development of DBRX was a significant undertaking, spanning three months from start to finish. This intense computation cost Databricks around $10 million, including pre-training, post-training, evaluation, and refinement.

Snowflake Arctic: 3 months + ~$2 million

Remarkably, Snowflake Arctic was developed over a similar three-month timeline. However, its training compute budget was significantly lower, coming in at approximately $2 million (less than 3,000 GPU-weeks), roughly one-fifth of the cost incurred for Databricks DBRX.
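The "$2 million in under 3,000 GPU-weeks" claim hangs together under a plausible cloud rate. Here is a rough sanity check assuming about $4 per H100-class GPU-hour; the rate is our assumption, not Snowflake's figure:

```python
gpu_weeks = 3000         # Snowflake's stated compute budget (upper bound)
hours_per_week = 7 * 24
assumed_rate = 4.0       # assumed $/GPU-hour; not an official figure

gpu_hours = gpu_weeks * hours_per_week  # 504,000 GPU-hours
est_cost = gpu_hours * assumed_rate     # estimated training cost in dollars
print(f"≈ ${est_cost / 1e6:.2f}M")      # ≈ $2.02M
```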

5). Snowflake Arctic vs DBRX—Training Tokens

Databricks DBRX: 12 Trillion tokens

DBRX was pre-trained on an expansive dataset comprising 12 trillion tokens of text and code data. This massive training corpus, carefully curated by the Databricks team, played a crucial role in shaping DBRX's broad language understanding and generation capabilities across various domains.

Snowflake Arctic: 3.5 Trillion tokens (3 stages)

While Snowflake Arctic's training dataset was smaller than DBRX's, at 3.5 trillion tokens, it was meticulously designed to enhance the model's performance on enterprise-focused tasks. Snowflake Arctic was trained with a three-stage curriculum, with each stage focusing on different skills. The first stage (1 trillion tokens) concentrated on generic skills, while the latter two stages (1.5 trillion and 1 trillion tokens, respectively) emphasized enterprise-specific capabilities such as coding, math, and SQL.
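The three-stage curriculum described above adds up as expected; a tiny check of the token budget:

```python
# Token budget (in trillions) per curriculum stage, as described by Snowflake
stages = {
    "stage 1: generic skills": 1.0,
    "stage 2: enterprise focus": 1.5,
    "stage 3: enterprise focus": 1.0,
}
total = sum(stages.values())
print(total)  # 3.5 (trillion tokens)
```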

Performance Comparison

6). Snowflake Arctic vs DBRX—Code (SQL) Generation Benchmark Comparison

Databricks DBRX: 76.3% (Spider)

On the Spider benchmark, which evaluates SQL generation capabilities, DBRX achieved an impressive score of 76.3%. This result highlights DBRX's proficiency in understanding natural language queries and translating them into precise SQL statements, a crucial skill for data analysis and querying tasks.

Snowflake Arctic: 79% (Spider)

On the same Spider benchmark, Snowflake Arctic takes a significant leap forward with an impressive 79% score, surpassing DBRX's 76.3% and showcasing Arctic's strength in SQL generation tasks.

7). Snowflake Arctic vs DBRX—Programming and Mathematical Reasoning Benchmark Comparison

HumanEval Benchmark Comparison

Databricks DBRX: 61.0% vs Snowflake Arctic: 64.3%

On the HumanEval benchmark, which tests programming abilities, Snowflake Arctic outperforms DBRX with a score of 64.3% compared to DBRX's 61.0%. This result highlights Arctic's strength as a capable code model.

GSM8k Benchmark Comparison

Databricks DBRX: 73.5% vs Snowflake Arctic: 74.2%

When it comes to mathematical reasoning, as evaluated by the GSM8k benchmark, Arctic takes the lead with a score of 74.2%, while DBRX achieves a slightly lower score of 73.5%.

8). Snowflake Arctic vs DBRX—IFEval Benchmark Comparison

Databricks DBRX: 27.6%

The IFEval (Instruction-Following Evaluation) benchmark assesses a model's ability to understand and follow complex, multi-step instructions.

On this IFEval benchmark, DBRX scored 27.6%.

Snowflake Arctic: 52.4%

In the same benchmark, Snowflake Arctic demonstrated a significant advantage, scoring 52.4%. This result highlights Arctic's strength in comprehending and executing complex instructions, a crucial capability for enterprise applications.

9). Snowflake Arctic vs DBRX—Common Sense Benchmark Comparison

Assessing common sense reasoning is an important aspect of evaluating LLM capabilities. Both Snowflake Arctic and DBRX perform similarly on commonsense reasoning benchmarks, with a slightly better score for DBRX.

Databricks DBRX: 74.8%

On this commonsense reasoning benchmark, DBRX scored 74.8%.

Snowflake Arctic: 73.1%

Snowflake Arctic, on the other hand, scored slightly lower at 73.1% for common sense reasoning.

10). Snowflake Arctic vs DBRX—MMLU Benchmark Comparison

The MMLU (Massive Multitask Language Understanding) benchmark evaluates a model's language understanding and reasoning capabilities across various tasks and domains.

Databricks DBRX: 73.7%

On this benchmark, DBRX outperforms Snowflake Arctic with a score of 73.7%.

Snowflake Arctic: 67.3%

On the MMLU benchmark, Snowflake Arctic scores 67.3%, which is lower than DBRX's performance.

Step-By-Step Guide to Getting Started With DBRX—on Databricks Platform

Let's begin with a step-by-step tutorial for getting started with Databricks DBRX.

Prerequisites:

  • Databricks Account Access
  • Databricks workspace enabled for Unity Catalog.
  • Access to compute resources (SQL warehouse or cluster) that support Unity Catalog.
  • Appropriate privileges on Databricks
  • Users and groups added to the workspace.

Step 1—Login to Databricks

Access the Databricks platform using your account credentials.

Logging into Databricks - Snowflake Arctic vs DBRX 

Step 2—Navigate to Marketplace

Once you have logged in, go to the Marketplace section within Databricks.

Navigate to Databricks Marketplace - Snowflake Arctic vs DBRX

Step 3—Search for DBRX Models

In the Marketplace search bar, type "DBRX" or "DBRX models" to locate the available options.

Searching for DBRX models - Snowflake Arctic vs DBRX

Step 4—Get Instant Access to DBRX

Click on “DBRX Models” and then select "Get instant access" to proceed with the installation and setup.

Installing Databricks DBRX model - Snowflake Arctic vs DBRX

Step 5—Select Usage Location (Databricks)

Now, select the "More Options" option, click on "on Databricks" when asked where you are planning to use the model, choose a name for the catalog, and then click "Get instant access".

Installing Databricks DBRX model - Snowflake Arctic vs DBRX
Alternatively, you can serve the model through the AI Playground:

Step 6—Open the Playground From the Left Navigation Pane

In the Databricks interface, find the left navigation pane and click on "Playground" under the "Machine Learning" section.

Databricks AI playground - Snowflake Arctic vs DBRX

Step 7—Serve a Model From the Marketplace

Within the Playground, select the option "Serve model from marketplace" to access the available models.

Serving DBRX Model From the Databricks Marketplace - Snowflake Arctic vs DBRX

Step 8—Open the DBRX Model

Choose the DBRX model and open it.

Opening Databricks DBRX Model - Snowflake Arctic vs DBRX

Step 9—Serve the DBRX Model and Create It

Once you've opened the DBRX model, click on "Serve this model" and follow the steps provided to create and set up the model.

Serving DBRX Model From the Databricks Marketplace - Snowflake Arctic vs DBRX

Step 10—Select the Model You Want to Interact With

Head back to your "Playground" option, and then use the dropdown list on the top left corner to choose the specific model for interaction.

Selecting DBRX Model in Databricks AI playground - Snowflake Arctic vs DBRX

Step 11—Interact With the Model

Boom! Now type your question or prompt into the designated area, or select a sample AI instruction from the list provided in the window.

Optional—Compare Multiple Models

To compare multiple model responses side-by-side, select the "+" icon to add an endpoint.

Selecting DBRX Model in Databricks AI playground - Snowflake Arctic vs DBRX

Where Can I Access a Ready-To-Use DBRX Demo for Free?

If you want to try out DBRX before setting it up yourself, Databricks provides several platforms where you can access ready-to-use demos and interact with the model for free.

Step-By-Step Guide to Getting Started With Snowflake Arctic—on Snowflake Platform

Let's start with a step-by-step tutorial for getting started with Snowflake Arctic. Keep in mind that we'll be using the Snowflake Cortex LLM function; if you're unfamiliar with Snowflake Cortex, we've written an article about it, so read that before getting started.

Prerequisites:

  • Snowflake account and access credentials: To connect to your Snowflake instance and use the Snowflake Cortex LLM function, you'll need a Snowflake account and the appropriate access credentials (username, password, account identifier, etc.).
  • Required Privileges: To use Snowflake Cortex LLM functions, you must be granted the relevant privileges by an account administrator.
  • Basic Knowledge of SQL: Basic knowledge of SQL is required to follow the examples and integrate the LLM functions into your workflows.
  • Snowflake Cortex Supported Region: Snowflake Cortex LLM functions are currently available in the following regions:
    • AWS US East (N. Virginia)
    • AWS US West (Oregon)
    • AWS Europe (Frankfurt)
    • Azure East US 2 (Virginia)
    • Azure West Europe (Netherlands)

Step 1—Login to Snowflake

First, sign up or log in to your Snowflake account.

Step 2—Create a New SQL Worksheet

Open a new SQL worksheet to write and execute your code.

Creating a New SQL Worksheet - Snowflake Arctic vs DBRX

Step 3—Create Database and Schema

If you haven't already, create a new database and schema to work with Snowflake Cortex LLM functions.

CREATE DATABASE IF NOT EXISTS snowflake_arctic_db;
CREATE SCHEMA IF NOT EXISTS snowflake_arctic_schema;

Step 4—Switch to the Desired Role

Make sure that you have the necessary privileges to access Snowflake Cortex LLM functions. This typically involves switching to a role that has been granted the CORTEX_USER database role. For example:

USE ROLE arctic_user_role;

OR

USE ROLE ACCOUNTADMIN;
See Required Privileges and Roles for Accessing Snowflake Cortex LLM functions

Once you have the appropriate role activated, you can call Snowflake Cortex LLM functions directly within your SQL queries. Each function has its own syntax and parameters.

Step 5—Use Snowflake Cortex COMPLETE Function

COMPLETE function generates text based on a given prompt or conversation. It supports multiple language models, including Snowflake Arctic. All you have to do is specify the model and the prompt as parameters, as seen below:

SELECT SNOWFLAKE.CORTEX.COMPLETE('snowflake-arctic', 'Write a brief introduction about Snowflake Arctic');
See Step-by-Step Guide to Use Snowflake Cortex LLM Functions

Where Can I Access a Ready-To-Use Snowflake Arctic Demo for Free?

If you wish to test Snowflake Arctic before setting it up yourself, Snowflake offers several platforms where you can access and interact with the model for free.

Also, Snowflake stated that Arctic will shortly be accessible on a variety of other platforms, such as AWS, Azure, Lamini, Perplexity, and Together.ai.

Snowflake Arctic vs DBRX in Action—Evaluating Writing, Translating and Coding Capabilities

Now that you have a basic understanding of Snowflake Arctic vs DBRX, let’s test ‘em both to see how they perform in terms of writing, translating, and coding capabilities, and then determine which one is the best choice.

Challenge 1—Writing Long Form Creative Content

Let's kick things off with the first challenge! We're going to give both models the same prompt to write long-form content and evaluate their responses based on creativity, hallucination, word count, grammar, and the overall quality and engagement of their writing.

Here's the prompt we'll use:

Write a 500-word short fiction tale set in a post-apocalyptic future where humanity is forced to live underground due to a catastrophic disaster that renders the Earth's surface uninhabitable. The story should focus on themes such as resilience, hope, and the battle to survive in a tough environment.

Snowflake Arctic Response:

Snowflake Arctic Response - Snowflake Arctic vs DBRX

Databricks DBRX Response:

Databricks DBRX Response - Snowflake Arctic vs DBRX

Here's a full comparison between DBRX and Snowflake Arctic's outputs:

Observation 1: Creativity—Snowflake Arctic vs DBRX:

Databricks DBRX's response shows a higher level of creativity and storytelling ability. It tells a coherent story through characters (Lily and Max), a well-developed plot (the underground garden, discovering the surface), and a core message of resilience and hope. Snowflake Arctic's response, while imaginative, falls short in terms of story coherence and character development. It establishes the concept effectively, but does not fully develop the story before abruptly ending it.

Observation 2: Hallucination—Snowflake Arctic vs DBRX:

Both Snowflake Arctic's and DBRX's responses demonstrate some degree of hallucination, since they generate fictional content that is not directly based on real data. But DBRX's response looks more grounded and consistent with the fictitious world it portrays.

Observation 3: Word Count—Snowflake Arctic vs DBRX:

DBRX's response is roughly 547 words long, slightly more than the 500-word limit. Snowflake Arctic's response is approximately 345 words, which falls short of the 500-word requirement.

Observation 4: Grammar—Snowflake Arctic vs DBRX:

Both Snowflake Arctic's and DBRX's responses demonstrate a strong command of grammar, including precise sentence structure and punctuation. There are no serious grammatical problems in either response.

Observation 5: Overall Quality and Engagement—Snowflake Arctic vs DBRX:

DBRX's response is more engaging and immersive, with a well-developed plot, characters, and a sense of narrative progression. Snowflake Arctic's response starts off strongly but loses momentum, leaving the reader wanting more development and resolution.

Challenge 2—Translating a Paragraph into Multiple Languages

Let's take a look at another challenge—comparing the translation abilities of Snowflake Arctic and DBRX. We're going to give both models the same task to translate and then evaluate their responses based on how accurately they capture the meaning and context, and how natural and smooth the translated text sounds.

Here's the prompt we'll use:

Translate the following paragraph into Japanese, French, and Hindi:
‘AI has made incredible progress in recent years, with Large Language Models (LLMs), playing a significant part in this growth. It feels like every other week, a new LLM hits the market! These powerful tools can not only understand and create human-like text, but they can even generate code, which makes them useful in a ton of different ways.’

Databricks DBRX:

Databricks DBRX Response - Snowflake Arctic vs DBRX
Observation 1: Japanese Translation:

The Japanese translation provided by DBRX is accurate and conveys the meaning and context of the original English paragraph well. The language used is fluent and natural, making it easy to understand for native Japanese speakers.

Observation 2: French Translation:

The French translation by DBRX is also accurate and maintains the original meaning and context effectively. The language used is fluent and natural, making it sound like it was written by a native French speaker.

Observation 3: Hindi Translation:

The Hindi translation provided by DBRX is also accurate and preserves the overall meaning and context of the original text well. The language used is fluent and natural, making it easy to understand for native Hindi speakers.

Snowflake Arctic:

Snowflake Arctic Response - Snowflake Arctic vs DBRX
Observation 1: Japanese Translation:

The Japanese translation provided by Snowflake Arctic is also accurate and preserves the overall meaning and context of the original text. However, there are a few minor grammatical errors and awkward phrasing, which slightly affects the fluency and naturalness of the translation.

Observation 2: French Translation:

The French translation by Snowflake Arctic is generally accurate, but there are a few instances where the meaning or context is slightly altered or lost. The language used is mostly fluent, but there are some awkward phrasings that affect the overall naturalness of the translation.

Observation 3: Hindi Translation:

The Hindi translation by Snowflake Arctic is generally accurate, but there are a few instances where the meaning or context is slightly altered or lost. The language used is mostly fluent, but there are some awkward phrasings and instances of code-switching (mixing Hindi and English) that affect the overall naturalness of the translation.

TL;DR: Both Snowflake Arctic and DBRX offered reasonably accurate translations in all three languages. But DBRX's translations have a slight edge in precisely preserving content and context while sounding more fluent and natural, particularly in French and Hindi.

Challenge 3—Testing Coding Capabilities by Building Todo App in JS

Let's move on to our final challenge, where we'll put Snowflake Arctic and DBRX to the test with a coding task. We're going to ask both models to create a simple to-do app, then evaluate their code on correctness and functionality, adherence to best practices and coding standards, and organization and structure, including indentation, comments, readability, and overall usability.

Here's the prompt we'll use:

Create a very simple Todo application using HTML and JavaScript that allows users to:
1) Add new tasks to a list
2) Delete tasks from the list
The application should be simple and clean, with appropriate input fields and buttons for each functionality.

Databricks DBRX:

Here is the code it generated for us.

Databricks DBRX Response - Snowflake Arctic vs DBRX

Here's what the final version of our Todo app looks like.

Databricks DBRX Response - Snowflake Arctic vs DBRX

Snowflake Arctic:

Here is the code it generated for us.

Snowflake Arctic Response - Snowflake Arctic vs DBRX

Here's what the final version of our Todo app looks like.

Snowflake Arctic Response - Snowflake Arctic vs DBRX
Observation 1: Functionality and Correctness—Snowflake Arctic vs DBRX:
  • Both codes offer functionality to add and delete tasks.
  • Databricks DBRX code uses an onclick event on the "Add Task" button to call the addTask function, while Snowflake Arctic adds an event listener on the form's submit event. Both approaches are valid.
  • Databricks DBRX adds an alert message if the user tries to add a task without entering any text. Snowflake Arctic's code does not have this feature but trims any whitespace before adding the task.
Observation 2: Coding Best Practices and Standards—Snowflake Arctic vs DBRX:
  • Both codes use JavaScript best practices like DOM manipulation, event handling, and efficient element creation.
  • Databricks DBRX's deleteTask function uses the removeChild method to remove the entire list item directly, while Snowflake Arctic's code uses the li.remove method to remove the list item, which is a more modern approach.
Observation 3: Organization and Structure—Snowflake Arctic vs DBRX:
  • Snowflake Arctic's code is better organized, with proper indentation. It also uses a more modern <!DOCTYPE html> declaration.
  • Databricks DBRX's code mixes JavaScript and HTML in places, like using onclick events inside the HTML itself. While this is not necessarily incorrect, it can be considered less organized and harder to maintain compared to Snowflake Arctic's approach.
Observation 4: Readability and Usability—Snowflake Arctic vs DBRX:
  • Snowflake Arctic's code includes more descriptive variable names, making it easier to understand the purpose of each variable.
  • Databricks DBRX's code uses var for variable declarations, while Snowflake Arctic uses const. The use of const is generally considered a better practice, as it ensures that variables cannot be reassigned accidentally.
TL;DR: Both Databricks DBRX and Snowflake Arctic provided functional code that achieves the desired result. But Snowflake Arctic's code is better organized, more readable, and demonstrates better adherence to modern coding practices and standards than Databricks DBRX's.

Conclusion

And that's it! We explored two groundbreaking open LLMs, Snowflake Arctic and Databricks DBRX. Arctic is intended for enterprise intelligence, handling tasks such as SQL generation, coding, and complex instruction following. It has a unique architecture and efficient training process, delivering top-notch performance in enterprise benchmarks. DBRX, on the other hand, is a versatile LLM that excels at a variety of benchmarks, including programming, mathematics, and understanding languages. Its quick training and inference capabilities make it ideal for a variety of applications. Both are completely open and promote total transparency.

In this article, we have covered:

  • What Is Snowflake Arctic?
  • What Is Databricks DBRX?
  • Top 10 Key Differences—Snowflake Arctic vs DBRX
    • Snowflake Arctic vs DBRX—Architecture Breakdown
    • Snowflake Arctic vs DBRX—Total Parameters and Active Parameters
    • Snowflake Arctic vs DBRX—Hardware Infrastructure Used
    • Snowflake Arctic vs DBRX—Development Timeline and Cost
    • Snowflake Arctic vs DBRX—Training Tokens
    • Snowflake Arctic vs DBRX—Code (SQL) Generation Benchmark Comparison
    • Snowflake Arctic vs DBRX—Programming and Mathematical Reasoning Benchmark Comparison
    • Snowflake Arctic vs DBRX—IFEval Benchmark Comparison
    • Snowflake Arctic vs DBRX—Common Sense Benchmark Comparison
    • Snowflake Arctic vs DBRX—MMLU Benchmark Comparison
  • Step-By-Step Guide to Getting Started With DBRX
  • Step-By-Step Guide to Getting Started With Snowflake Arctic
  • Snowflake Arctic vs DBRX in Action—Evaluating Writing, Translating and Coding Capabilities

What People Are Saying—Snowflake Arctic vs DBRX:

"Playing with the model I wanted to do a few things: Test its knowledge of things I know a lot about. Get to know the limits of the system serving it. In this case, I confirmed that DBRX-Instruct is a solid, sub-GPT-4, not too verbose model. Within that, there are tons of relevant details".
- Nathan Lambert, Author (Interconnects.io)

In my testing, DBRX showcased truly impressive coding abilities. When tasked with creating a Python tic-tac-toe game, the model generated functional code remarkably quickly. While there was a minor formatting issue with indentation, the core logic was flawless. DBRX handled this coding challenge with ease, demonstrating its strength in areas like programming that require logical reasoning and code generation. What struck me was DBRX's blitz-like speed in producing the solution. The model's computational efficiency enables blazing-fast inference times that left me impressed. This performance edge, combined with DBRX's coding prowess, makes it an extremely compelling option for developers and coding-focused applications.
- David Chew, Author (Medium)

“Arctic has a very different architecture when compared to most open models these days. I played with Arctic a little bit in their demo, but it didn’t pass my basic challenge tasks around who I am, RLHF details, and the stretch question of writing DPO code. It’s clear from the announcements that this is a model with a niche in coding and reasoning, and that’s a good thing when it comes to the open LLM strategy.”
- Nathan Lambert, Author (Interconnects.io)

“Snowflake Arctic is a true beast. This model is going to rattle so many companies that it will be unbelievable. It sets a new baseline for cost-effective training, enabling Snowflake customers to create high-quality custom models for their enterprise needs at a low cost. Snowflake Arctic is amazing, no doubt about that. It excels at enterprise tasks such as SQL generation, coding, and instruction following benchmarks, even surpassing other models like Llama 3. Snowflake Arctic is truly open, providing ungated access to weights and code, along with open data recipes and research insights. Its performance in benchmarks is better than Llama 3, making it the best LLM for enterprise AI.”
- Fahd Mirza Chughtai, Content Creator (YouTube)

FAQs

What is Snowflake Arctic?

Snowflake Arctic is an open source, enterprise-grade large language model (LLM) developed by Snowflake with 480 billion parameters and a unique Dense-MoE Hybrid Transformer architecture.

What is DBRX?

DBRX is a transformer-based, decoder-only large language model developed by Databricks with 132 billion parameters, utilizing a fine-grained Mixture of Experts (MoE) architecture.

What is the key difference between the architectures of Snowflake Arctic vs DBRX?

Snowflake Arctic uses a Dense-MoE Hybrid Transformer architecture, while DBRX uses a fine-grained Mixture of Experts (MoE) architecture.

How many active parameters do Snowflake Arctic vs DBRX have?

Snowflake Arctic has 17 billion active parameters out of its 480 billion total parameters, while DBRX has 36 billion active parameters out of its 132 billion total parameters.
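The two models make very different sparsity trade-offs, which is easier to see as a ratio. A quick back-of-the-envelope check, using only the parameter counts quoted above:

```python
# Active-parameter ratio: what fraction of each model's total weights
# participates in a single forward pass (figures quoted in this article).
models = {
    "Snowflake Arctic": {"total_b": 480, "active_b": 17},
    "DBRX": {"total_b": 132, "active_b": 36},
}

for name, p in models.items():
    ratio = p["active_b"] / p["total_b"]
    print(f"{name}: {p['active_b']}B / {p['total_b']}B active = {ratio:.1%}")
# Snowflake Arctic: 17B / 480B active = 3.5%
# DBRX: 36B / 132B active = 27.3%
```

So although Arctic is over three times larger in total, it activates only about 3.5% of its weights per token, versus roughly 27% for DBRX.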

How do I obtain the model weights and code for Snowflake Arctic and DBRX?

Snowflake Arctic's model checkpoints (base and instruct-tuned versions) are available for download on Hugging Face under an Apache 2.0 license. DBRX's weights (DBRX Base and DBRX Instruct) can also be downloaded from Hugging Face under the Databricks Open Model License.

How much hardware infrastructure was used to train Snowflake Arctic vs DBRX?

Snowflake Arctic was trained on 1,000 GPUs, while DBRX was trained on 3,072 NVIDIA H100 GPUs.

What were the development timelines and costs for Snowflake Arctic vs DBRX?

Both models were developed over a 3-month timeline, but Snowflake Arctic had a significantly lower training compute budget of $2 million, compared to $10 million for DBRX.

How many training tokens were used for Snowflake Arctic vs DBRX?

Snowflake Arctic was trained on 3.5 trillion tokens, while DBRX was trained on 12 trillion tokens.
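Taking the reported budgets and token counts at face value, a rough cost-per-token comparison (an illustration only, ignoring differences in hardware, data pipelines, and what each budget actually covers):

```python
# Approximate training cost per trillion tokens, using the figures
# quoted in this article: $2M / 3.5T tokens for Snowflake Arctic,
# $10M / 12T tokens for DBRX.
arctic_cost_m, arctic_tokens_t = 2, 3.5
dbrx_cost_m, dbrx_tokens_t = 10, 12

print(f"Arctic: ${arctic_cost_m / arctic_tokens_t:.2f}M per trillion tokens")
print(f"DBRX:   ${dbrx_cost_m / dbrx_tokens_t:.2f}M per trillion tokens")
# Arctic: $0.57M per trillion tokens
# DBRX:   $0.83M per trillion tokens
```

On these numbers, Arctic's training was cheaper both in absolute terms and per token processed.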

How did Snowflake Arctic vs DBRX perform on the Spider benchmark for SQL generation?

Snowflake Arctic achieved a score of 79% on the Spider benchmark, while DBRX scored 76.3%.

How did Snowflake Arctic vs DBRX perform on the HumanEval benchmark for programming abilities?

On the HumanEval benchmark, Snowflake Arctic outperformed DBRX with a score of 64.3% compared to DBRX's 61.0%.

How did Snowflake Arctic vs DBRX perform on the GSM8k benchmark for mathematical reasoning?

On the GSM8k benchmark, Snowflake Arctic scored 74.2%, while DBRX scored 73.5%.

How did Snowflake Arctic vs DBRX perform on the IFEval benchmark for instruction following?

Snowflake Arctic demonstrated a significant advantage on the IFEval benchmark, scoring 52.4% compared to DBRX's 27.6%.

How did Snowflake Arctic vs DBRX perform on common sense reasoning benchmarks?

DBRX had a slightly better score of 74.8% on common sense reasoning benchmarks, compared to Snowflake Arctic's 73.1%.

How did Snowflake Arctic vs DBRX perform on the MMLU benchmark for language understanding and reasoning?

DBRX outperformed Arctic on the MMLU benchmark with a score of 73.7%, while Snowflake Arctic scored 67.3%.

What platforms can be used to access a ready-to-use demo of DBRX for free?

DBRX can be accessed for free on platforms like Mosaic AI Model Serving, Databricks AI Playground, You.com, and Perplexity Labs.

What platforms can be used to access a ready-to-use demo of Snowflake Arctic for free?

Snowflake Arctic can be accessed for free on platforms like Streamlit Community Cloud, Hugging Face, NVIDIA API Catalog, and Replicate.