
Introducing modules: reusable workflows for your entire team

By Filip Žitný

Updated on March 13, 2025

We're excited to announce modules in Deepnote — a powerful new way to create consistent, reusable workflows across your workspace.


What are modules?

Modules are your solution to code fragmentation and inconsistent analysis. They let you package essential elements (code snippets, SQL queries, data transformations, and visualizations) into reusable building blocks that can be seamlessly imported into any notebook across your workspace.

This "build once, use everywhere" approach ensures your entire team leverages standardized tools and methodologies without duplicating efforts. Whether it's complex data cleaning routines or custom visualization functions, modules make your best and most trusted work instantly accessible.


Use cases

With modules, the possibilities are vast. Here are a few ways you can leverage this feature:

  • Standard metrics & queries: Create a semantic layer using standard metrics and queries, building a trusted foundation with consistent definitions for key metrics like weekly active users (WAU) and churn rate. Ensure everyone calculates metrics the same way, fostering consistency and accuracy across analyses. We will show you how to build this semantic layer in detail below.
  • Data pipelines: Create modular ETL workflows by breaking complex transformations into logical steps. This modular approach not only makes your data pipelines easier to maintain but also simplifies debugging.
  • Code reusability: Share common functions, visualization code, and data processing logic across projects. Avoid the hassle of copying and pasting the same code between notebooks and instead centralize your efforts into a single module (see the sketch after this list).
  • ML experimentation: Package machine learning models to easily test them across different datasets and parameters. Modules allow you to compare results consistently across experiments, enhancing the robustness of your findings.
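
As a hypothetical illustration of the code-reusability case, a shared module notebook might define a couple of helpers like the ones below. This is only a sketch: the function names, column names, and cleaning rules are assumptions for the example, not part of Deepnote's API.

```python
# A minimal sketch of what a shared "helpers" module notebook could contain.
import pandas as pd
import matplotlib.pyplot as plt


def clean_event_data(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the team's standard cleaning rules to a raw events table."""
    cleaned = df.copy()
    cleaned.columns = [c.strip().lower().replace(" ", "_") for c in cleaned.columns]
    cleaned = cleaned.drop_duplicates()
    cleaned["event_time"] = pd.to_datetime(cleaned["event_time"])
    return cleaned


def plot_metric(series: pd.Series, title: str):
    """Render a metric time series with one consistent house style."""
    fig, ax = plt.subplots(figsize=(8, 3))
    series.plot(ax=ax, marker="o")
    ax.set_title(title)
    ax.grid(alpha=0.3)
    return fig
```

Exporting blocks like these from a module would let any notebook in the workspace call the same functions instead of keeping its own copy.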

Building a semantic layer with modules

A particularly powerful application that modules unlock is building a semantic layer for key performance indicators (KPIs). This approach ensures that everyone in your organization calculates metrics consistently, promoting accuracy and trust in your data-driven decisions.

Imagine your team needs to track multiple KPIs such as Weekly Active Users (WAU), monthly churn rates, customer lifetime value, and net promoter score (NPS). With modules, you can create a dedicated data notebook for each KPI.

Create a notebook and define your KPI

Start by creating a dedicated notebook for each KPI. In this notebook, clearly define the KPI's purpose, calculation methodology, and data sources. Ensure that the definition aligns with organizational standards and business objectives.
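
For example, the WAU notebook might contain a block along these lines. This is a minimal sketch that assumes an events table with `user_id` and `event_time` columns; in practice the data would come from a SQL block or a warehouse integration rather than being constructed inline.

```python
import pandas as pd

# Hypothetical events data standing in for a SQL block or integration.
events = pd.DataFrame({
    "user_id": [1, 2, 2, 3, 1, 4],
    "event_time": pd.to_datetime([
        "2025-03-03", "2025-03-04", "2025-03-05",
        "2025-03-10", "2025-03-11", "2025-03-12",
    ]),
})

# One shared definition of WAU: distinct users per calendar week.
wau_df = (
    events
    .assign(week=events["event_time"].dt.to_period("W").dt.start_time)
    .groupby("week", as_index=False)["user_id"]
    .nunique()
    .rename(columns={"user_id": "wau"})
)
wau_df
```

The resulting `wau_df` table is the kind of output you would later export from the module.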


Create a module

Transform your KPI notebook into a reusable module by encapsulating all the necessary data queries, transformations, and visualization code. Make sure the module has clear inputs and outputs, and document any dependencies or assumptions.

Click the module button in the notebook's upper right corner.


Select which blocks to export. The output of exported blocks will be available when running the module from another notebook. Export any code or SQL blocks by clicking the export block action and giving the export a name.


To add parameters, simply include input blocks in your notebook. These will become configurable when others import your module.
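
Continuing the WAU sketch above, an input block named, say, `lookback_weeks` (a hypothetical parameter name) behaves like an ordinary variable in the module's code, and whoever imports the module can override its value:

```python
# In the module notebook this value would come from an input block;
# the plain assignment below is only a stand-in so the sketch runs.
lookback_weeks = 4

# Restrict the exported WAU table to the requested window.
cutoff = wau_df["week"].max() - pd.Timedelta(weeks=lookback_weeks)
wau_df = wau_df[wau_df["week"] >= cutoff]
```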


Repeat these steps for all KPIs

Follow the same process for each KPI that you want to include in your dashboard. This ensures that all metrics are standardized and consistently defined across your organization.


Importing modules

Click the Module button in the notebook footer.


Search for the module by project or notebook title.


Configure the module: set any required input parameters and choose which exports to import (optionally renaming the returned variables).


When you run the module block, the notebook executes in a separate environment and returns your selected exports. All returned variables are added to your current notebook's memory, seamlessly integrating with your existing work.
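
Conceptually, once a module exporting `wau_df` (the name from the sketch above) has been run, you can use the returned variable like any locally defined one:

```python
# wau_df was returned by the imported module; no re-computation needed here.
latest = wau_df.sort_values("week").iloc[-1]
print(f"Latest WAU ({latest['week']:%Y-%m-%d}): {latest['wau']}")
```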

Create a data app/dashboard by combining the modules

Build a unified dashboard or Deepnote app that imports all your KPI modules. Arrange the visualizations in a logical way that tells a coherent story about your business performance and helps stakeholders make informed decisions.
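
As a rough sketch, the dashboard notebook could place the imported KPI outputs side by side. The two DataFrames below are stand-ins for the variables the WAU and churn modules would return; the numbers are illustrative only.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Stand-ins for the exports returned by the imported KPI modules.
wau_df = pd.DataFrame({
    "week": pd.date_range("2025-02-03", periods=6, freq="W-MON"),
    "wau": [120, 135, 128, 150, 162, 170],
})
churn_df = pd.DataFrame({
    "month": pd.to_datetime(["2025-01-01", "2025-02-01", "2025-03-01"]),
    "churn_rate": [0.041, 0.038, 0.035],
})

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 3))
ax1.plot(wau_df["week"], wau_df["wau"], marker="o")
ax1.set_title("Weekly active users")
ax2.plot(churn_df["month"], churn_df["churn_rate"], marker="o", color="tab:red")
ax2.set_title("Monthly churn rate")
fig.tight_layout()
plt.show()
```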


Update modules when metrics change

When a KPI definition or calculation needs to be modified, simply update the corresponding module. The changes will automatically propagate to all notebooks and dashboards where the module is used, ensuring consistency throughout your analytics ecosystem.

Finding modules

Access all published modules through the Modules section in your workspace navigation sidebar. This centralized location makes it easy to discover and utilize the collective knowledge and tools created by your team.


Benefits of this approach

  • Standardization: Ensures consistent definitions and calculations across the organization
  • Maintainability: Makes updates easier as changes only need to be made in one place
  • Scalability: Allows for easy addition of new KPIs without disrupting existing dashboards
  • Collaboration: Enables different team members to work on different KPIs simultaneously
  • Transparency: Provides clear documentation of how each metric is defined and calculated

If you want to see the full power of a semantic layer built with modules, check out this example notebook.

We've shown you how to build a semantic layer using notebooks, and this isn't just theory—we use modules the same way at Deepnote to organize our data infrastructure, streamline collaboration, and ensure consistency across our analytics work. Give this approach a try and see how it can simplify your own data workflows.

Filip Žitný

Data Scientist

Follow Filip on Twitter, LinkedIn and GitHub
