When a Cloud Wars Top 10 company releases an LLM, it’s a big deal. Following the lead of other companies in this competitive group, Snowflake has announced the launch of Snowflake Arctic, an LLM that the company describes as “the most open, enterprise-grade LLM on the market.”
“This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI,” said Sridhar Ramaswamy, CEO of Snowflake. “By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open-source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.”
Snowflake Arctic Differentiators
Snowflake Arctic uses a Mixture-of-Experts (MoE) architecture, meaning each request is routed to a small set of specialized expert sub-networks whose outputs are combined, improving accuracy and allowing the model to handle more complex and diverse queries. However, what stands out about Arctic, and the focus Snowflake is keen to encourage, is the LLM’s openness.
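To make the MoE idea concrete, here is a minimal, illustrative sketch of top-k expert routing in PyTorch. It is not Snowflake’s actual Arctic implementation; the layer sizes, the number of experts, and the gating scheme are all assumptions chosen only to show how a router sends each token to a few experts and blends their outputs.

```python
# Illustrative Mixture-of-Experts layer: a learned gate picks the top-k
# expert feed-forward networks for each token and mixes their outputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_hidden=128, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # router over experts
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                              # x: (tokens, d_model)
        scores = self.gate(x)                          # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1) # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):                    # weighted sum of chosen experts
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask][:, k:k+1] * expert(x[mask])
        return out

tokens = torch.randn(4, 64)
print(TinyMoELayer()(tokens).shape)  # torch.Size([4, 64])
```

Because only the selected experts run for any given token, an MoE model can hold a very large total parameter count while keeping per-token compute comparatively low, which is the efficiency argument behind architectures like Arctic.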
Snowflake is, of course, a data company, and thousands of organizations already use the Snowflake platform as a springboard for leveraging existing open LLMs, giving customers the flexibility to choose among models. Now, with Arctic, Snowflake has taken the logical step of launching an enterprise-grade open LLM of its own.
The Snowflake Arctic LLM carries an Apache 2.0 license, meaning personal, research, and commercial use of the technology is fully ungated. What’s more, Snowflake is providing the model weights and code templates, as well as inference and training recipes, so users can deploy and customize the model for their intended use cases. Current deployment frameworks include NVIDIA NIM with NVIDIA TensorRT-LLM, vLLM, and Hugging Face.
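For readers who want to experiment with the open weights, the sketch below shows one plausible route: pulling the checkpoint from Hugging Face with the transformers library. The repository id shown is the name used around launch and the hardware requirements are substantial for a model of this size, so treat both the repo id and the loading options as assumptions to verify against Snowflake’s published instructions.

```python
# Hedged sketch: loading the Arctic instruct checkpoint from Hugging Face.
# Verify the repo id and hardware requirements before running.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # shard the model across available GPUs
    trust_remote_code=True,   # Arctic ships custom modeling code
)

prompt = "Write a SQL query that lists the top 5 customers by revenue."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```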
Beyond this, Arctic is available right away via Cortex, Snowflake’s fully managed service where users can currently build with leading LLMs from Google, Meta, and Mistral AI. In addition, they will soon be able to build through AWS, Microsoft Azure, Perplexity, and Together AI, among others.
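As a rough illustration of the Cortex path, the snippet below calls Arctic from Snowpark Python. It assumes the Cortex `Complete` helper from the snowflake-ml-python package and the model name "snowflake-arctic"; check the current Cortex documentation for the exact identifiers, required privileges, and region availability.

```python
# Hedged sketch: calling Arctic through Snowflake Cortex from Snowpark Python.
from snowflake.snowpark import Session
from snowflake.cortex import Complete

# Placeholder connection details; fill in your own account settings.
connection_parameters = {
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
}
session = Session.builder.configs(connection_parameters).create()

answer = Complete(
    "snowflake-arctic",  # assumed Cortex model name
    "Summarize last quarter's sales pipeline in three bullet points.",
    session=session,
)
print(answer)
```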
Snowflake Arctic was developed unusually quickly: The company’s research team trained the model on Amazon Elastic Compute Cloud (Amazon EC2) P5 instances in under three months and at roughly one-eighth the cost of building comparable open models. This speed demonstrates what can be achieved without the major investments of time and money that many other models have required.
So, how does Arctic compare to other similar enterprise-grade models? The company claims its efficiency in both training and inference dramatically exceeds that of the new Databricks LLM DBRX as well as Meta’s Llama 3. It also outperforms DBRX and others in coding and SQL generation, the company says.
Closing Thoughts
“There has been a massive wave of open-source AI in the past few months,” said Clement Delangue, CEO and Co-Founder of Hugging Face, in response to the announcement. “We’re excited to see Snowflake contributing significantly with this release not only of the model with an Apache 2.0 license but also with details on how it was trained. It gives the necessary transparency and control for enterprises to build AI and for the field as a whole to break new ground.”
We can agree with this. Recently, we reported on the launch of DBRX, the open, general-purpose LLM from Databricks. And now, just a few weeks later, one of the world’s most influential companies is also taking an open approach. Furthermore, the Snowflake Arctic LLM builds on Snowflake’s other open models in the Arctic model family, which includes practical text-embedding models for retrieval use cases.
Snowflake is responding to customer demand for choice and flexibility, and from day one has enabled companies to operationalize LLMs and associated technologies. Now, with its own open LLM, Snowflake has taken a major step forward. Couple this with full transparency around model training, and Snowflake could well be at the forefront of the next stage of the rapidly evolving GenAI movement.