
– by Elizabeth on July 28, 2022

Bridging the exploratory analytics gap with Deepnote for Snowflake

A story of SQL-first notebooks, Deepnote—now a Snowflake Select Technology partner—and how Deepnote and Snowflake work together to power your SQL + Python workflows.

Snowflake offers, among other products, a cloud data warehousing platform with out-of-the-box features like scalable compute, data sharing, and data cloning. Products within Snowflake’s ecosystem, like Snowsight, allow analysts to explore and visualize their data for ad-hoc use cases. But what if you’d like to turn your ad-hoc queries into compelling narratives and easily shareable assets? With Deepnote, now a Snowflake Select Technology Partner, your team can do just that.

In this article, we will walk you through the key pillars that make the Deepnote and Snowflake experience delightful, and show how you can bring both into your toolkit to make your querying and exploration easier, more secure, and collaborative.

For a quick start, we’ve built a template project to show you the ropes. Use our playground warehouse or connect to your own Snowflake instance via the Integrations menu.

1. Turning your notebook into a SQL editor


Whether you're a seasoned analyst or new to the whole notebooks thing, you’ve likely encountered the SQLAlchemy API, which lets you connect to data sources from within your notebook environment. This is a very common way of extracting data and preparing it for manipulation in a dataframe. However, while widely used, it also comes with a few pain points:

  1. Syntax inconsistency: Multi-line SQL queries typically need a \ character appended to the end of each line so that the notebook environment recognizes the whole query as a single string.
  2. Non-SQL-friendly experience: Sometimes it’s easier to write one overarching query that extracts all records from a table, then filter it down as your analysis proceeds. This is commonly done to reduce the amount of SQL that needs to be written, given the less-than-ideal experience of writing SQL in a non-SQL-friendly IDE.
  3. Memory constraints: Extracting large volumes of data from a Snowflake warehouse into a standard notebook environment can quickly trigger out-of-memory errors, as the sheer volume of data or the transformations you run strain your notebook environment. This naturally degrades your notebook experience and will likely lead to soul-searching for better options.
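To see the first pain point concretely, here’s a minimal sketch of how SQL strings end up looking in a plain Python notebook (the `orders` table and its columns are hypothetical):

```python
# The classic notebook pattern: a query assembled as a Python string.
# Without triple quotes, each line needs a trailing backslash so the
# interpreter treats it all as a single string.
query = "SELECT customer_id, order_total " \
        "FROM orders " \
        "WHERE order_date >= '2022-01-01'"

# A triple-quoted string reads better, but it is still just a plain
# Python string: no SQL syntax checking, highlighting, or autocomplete.
query_multiline = """
SELECT customer_id, order_total
FROM orders
WHERE order_date >= '2022-01-01'
"""
```

Either way, the SQL lives inside Python string literals, which is exactly the experience SQL blocks are designed to replace.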

Here’s what we did to alleviate these concerns. In Deepnote, there's no need to use a connector directly to extract data from your Snowflake warehouse. Through the power of SQL blocks, you can run queries directly on top of your Snowflake warehouse, from the comfort of your notebook. All you have to do is add your Snowflake credentials and voilà, your notebook is now an interface for exploring your Snowflake data (in a collaborative way too, should you choose to do so!).

In addition to a first-class SQL experience, Deepnote helps you eliminate memory constraints. You can simply switch over to a more powerful VM instance, or spin down dynamically based on your compute needs.

2. 10x-ing your exploration speed


Having access to your Snowflake warehouse coupled with the exploratory powers of a notebook means you can write queries, save the results into pandas DataFrames, and visualize them all in one go. You can even inject Python variables into your SQL queries with jinjasql to build more complex queries and move seamlessly between SQL and Python.
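The idea behind jinjasql-style templating can be sketched with the standard library alone. This is a deliberately simplified stand-in, not the jinjasql API itself, and the table and column names are hypothetical; the key point is that Python values become bind parameters rather than being interpolated straight into the SQL:

```python
import re

def prepare_query(template: str, params: dict):
    """Replace {{ name }} placeholders with %(name)s bind markers,
    returning the rewritten query plus the matching bind parameters.
    Binding values separately avoids SQL injection."""
    query = re.sub(r"\{\{\s*(\w+)\s*\}\}", r"%(\1)s", template)
    bind = {name: params[name] for name in re.findall(r"%\((\w+)\)s", query)}
    return query, bind

template = "SELECT * FROM orders WHERE region = {{ region }} AND total > {{ min_total }}"
query, bind = prepare_query(template, {"region": "EMEA", "min_total": 100})
# query -> "SELECT * FROM orders WHERE region = %(region)s AND total > %(min_total)s"
# bind  -> {"region": "EMEA", "min_total": 100}
```

The resulting query and bind parameters can then be handed to any DB-API cursor, which is essentially what jinjasql’s `prepare_query` produces for you.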

With the built-in DataFrame viewer, we can examine the dataset for missing values, the most common categorical values, distributions of numeric columns, and more. The built-in filters and sorting make it easy to gain a deeper understanding of the data and relationships that might impact our future model's predictions. Lastly, built-in no-code charts allow us to examine our target variable as a function of other features without having to write any additional code. We can seamlessly switch from code to visualizations, and go right back into querying as needed.
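The checks the DataFrame viewer surfaces automatically correspond to a few lines of pandas you would otherwise write by hand. A small sketch, with a made-up result set standing in for a SQL block’s output:

```python
import pandas as pd

# A hypothetical result set, as if returned by a SQL block.
df = pd.DataFrame({
    "region": ["EMEA", "EMEA", "APAC", None],
    "order_total": [120.0, 80.0, None, 45.0],
})

# The manual equivalents of what the viewer shows at a glance:
missing = df.isna().sum()                  # missing values per column
top_regions = df["region"].value_counts()  # most common categorical values
stats = df["order_total"].describe()       # distribution of a numeric column
```

With the viewer, this profiling happens without any extra code, so the notebook stays focused on the analysis itself.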

3. Keeping your Snowflake data secure


As data teams move to hosted notebooks, security risks arise around metadata and authentication credentials. Frequent git commits and pushes to shared repositories increase the chance of leaked credentials and potential security breaches.

At Deepnote, we set out to build the best notebooks experience that lets you collaborate effectively and build better data products faster, and do so in a secure way. All files in Deepnote are encrypted at rest, whether they are files you create or anything you upload. For sensitive information (such as database integrations or environment variables), we apply a layer of AES-256-CBC encryption before storing it in our database, with the decryption keys stored separately.

Better yet, you can configure the connection at the Workspace level and decide whether to make it available to all members and shared projects or just a specific project. Once the Snowflake connection is set up, your data is secure, and you don’t have to worry about re-configuring things again. Say goodbye to multiple JSON files for sharing credentials and simply share access to your project.

4. Unlocking cross-team collaboration


At Deepnote, we believe collaboration is key to the notebooks experience. At the end of the day, data work is a team sport. We put a lot of thought into improving in-team collaboration for data teams, but equally, we want to make sure their work is accessible to others: subject matter experts, leaders, and data consumers. This emphasis on collaboration maps to different steps in your exploratory journey:

  • Environment setup: Configure your Snowflake integration once and never worry about it again. The ability to set up an Environment for your Workspace with granular, role-based permissions means that everyone has access to the right things.
  • Analysis: The whole exploratory and analytics process can be made collaborative. You can share your project in one click, with all the logic, data, and dependencies in one place. You can bring others in to share feedback right in the notebook via comments, or troubleshoot things in real time.
  • Presentation: While you can easily share your notebook with one click at any point, you can go a step further and make your production-ready work accessible and interactive for anyone. You can turn your notebook into a simple yet powerful data app by parameterizing it to introduce interactive user inputs, and scheduling it so that you’re always pulling in fresh data from Snowflake. Once that’s done, all you need to do is hit publish.

To get started with Snowflake and Deepnote, create a free Deepnote account, or head over to our Snowflake docs.


Elizabeth Dlha

Head of Community & Partnerships at Deepnote

Follow Elizabeth on Twitter and LinkedIn
