
Snowflake Summit 2023 is underway in Las Vegas, and Snowflake is pulling out all the stops to validate the event’s billing as the “world’s largest data, apps, and AI conference” with a wide range of new products and strategic partnerships in the AI realm.
The new developments — targeting customers and developers — fall under three categories:
- Snowflake as a single platform supporting an expanding range of data types
- Deploying, distributing, and monetizing apps
- Programmability of the Snowflake platform with container services and AI-powered search
Microsoft Partnership Extends to Generative AI
One of the most significant announcements is the expansion of Snowflake’s existing partnership with Microsoft. The companies will focus on bringing generative AI models and advanced ML functionality to Snowflake’s Data Cloud through new product integrations with Azure ML, Azure OpenAI, and Microsoft Cognitive Services.
The aim is to enable Snowflake customers to develop next-generation data products by leveraging Azure OpenAI and Microsoft Cognitive Services with Snowflake data, while the Azure ML integration will support accelerated ML workflows for joint customers. The benefits of the expanded partnership extend beyond AI and ML, however.
Joint customers can expect new and updated product integrations between the Snowflake Data Cloud and Microsoft products including Power Apps and Power Automate for low-code/no-code app development, Azure Data Factory for extract, transform, load (ETL) tasks, and Power BI for data visualization, among others.
LLM for Extracting Data From Documents
The company is also launching Document AI today, which leverages Snowflake’s new, first-party large language model (LLM) and enables users to extract data from documents with greater speed, ease, and accuracy. The capability picks up where Snowflake’s existing support for unstructured document processing left off. Document AI is currently in private preview.
With Document AI built into Snowflake, users can reduce the tedious manual work otherwise required to gain insights from unstructured data, extracting information through natural language queries in a visual interface.
The Document AI release comes from Snowflake’s acquisition of Applica in September 2022 and is driven by the latter’s multimodal LLM. While phase one of the strategy with Applica’s technology focuses on documents, Snowflake plans to expand functionality to cover other unstructured data types.
Native App Framework Updates
Another major focus at this week’s Snowflake Summit is the ability to use Snowflake to develop and monetize apps within the Data Cloud via the platform’s native marketplace. There are currently 25-plus Snowflake Native Apps on Snowflake Marketplace created by brands including Capital One, Goldman Sachs, and Matillion.
Matillion, for example, has developed a Snowflake Native App — the Matillion Connector for Google Sheets — that enables users to build and schedule pipelines for direct data ingestion from Google Sheets into Snowflake.
Capital One has developed an app for data cost governance in the Snowflake environment, while Goldman Sachs is combining its Financial Cloud for Data with its open-source data platform to generate business insights for clients, business partners, and engineers.
Snowflake’s Native App Framework, initially launched in June 2022, enables any developer to create applications alongside the existing offerings. Using the framework, developers can build apps quickly and securely within Snowflake and bring them to the marketplace, where Snowflake users can purchase and run them directly within their accounts.
The company’s approach makes app discoverability and usability easy for customers because their data never has to leave Snowflake, avoiding the security and privacy issues associated with data exports or external access requests. The framework also removes the need to build separate billing systems, offering Custom Event Billing (currently in public preview) and on-platform monetization (now generally available) via Snowflake Marketplace.
Further, Snowflake has launched the Marketplace Capacity Drawdown Program, which enables customers to purchase data and Snowflake Native Apps within their Snowflake Capacity commitment. The use of already-committed capacity means customers can shorten procurement and contracting steps including vendor onboarding, contract negotiations, and payment logistics.
Container Services and NVIDIA Partnership
Snowpark is Snowflake’s multi-language development framework that enables developers to write and run code directly in Snowflake. A new feature, Snowpark Container Services, expands what Snowpark can do by letting users run a wider variety of workloads on a broader set of infrastructure. The new service also gives users in-account access to third-party software applications, including LLMs and MLOps tools, from partners such as Dataiku, SAS, and others.
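To make that concrete, here is a minimal sketch of the kind of code developers already run directly in Snowflake with Snowpark for Python; the table and column names are hypothetical placeholders, and the connection parameters would come from your own account.

```python
# Minimal Snowpark for Python sketch: connect to an account, then filter and
# aggregate a table entirely inside Snowflake (names below are placeholders).
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Hypothetical connection details; use your own account, user, and warehouse.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# DataFrame operations are translated to SQL and executed in Snowflake,
# so the data never leaves the platform.
orders = session.table("ORDERS")  # hypothetical table
open_order_totals = (
    orders.filter(col("STATUS") == "OPEN")
          .group_by("REGION")
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)
open_order_totals.show()

session.close()
```

Because Snowpark pushes these operations down to Snowflake’s engine, the data stays inside the platform, and the new container services extend that same in-account model to a broader range of workloads and third-party tools.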
Beyond this, a new partnership with NVIDIA enables Snowflake users to leverage NVIDIA’s NeMo™ LLM development platform and to accelerate compute with NVIDIA GPUs. Snowflake users can use data directly from their accounts to develop custom LLMs for a range of generative AI use cases. This cost-efficient approach preserves trusted security and governance protocols while reducing latency, making Snowflake a go-to platform for developing and releasing generative AI models.
AI-Powered Search and Generative AI
Snowflake recently announced the acquisition of the AI-powered search platform Neeva. During a briefing ahead of the summit, Christian Kleinerman, SVP of Product at Snowflake, explained how Snowflake is leveraging Neeva to power generative AI capabilities. “Neeva is a key component to the broader platform for gen AI apps and experiences,” Kleinerman said. “The reality of language models is the demos are all great, but controlling things like biases, safety, incorrectness, or results of hallucinations, is difficult.
“The Neeva team is at the forefront of combining traditional information retrieval techniques with language models to provide conversational experiences and conversational results but being able to ensure the accuracy of results and precision of answers,” he added. “We will be incorporating all of this into our own first-party experiences and the platform so that our customers can build (trustworthy) generative AI experiences.”
Closing Thoughts
Snowflake is already widely known as an all-in-one data platform that supports a multitude of data-driven tasks. The company’s enablement of in-platform app development and monetization, together with its positioning as a unified space for creating, hosting, and using LLMs and generative AI models, is now adding new strength to that position.