
How to connect to BigQuery with Python

By Nick Barth

Updated on March 6, 2024

Looking for a data tool that combines Python's power, SQL's agility, and BigQuery's data capabilities? In this guide, we'll explore how to connect Python to Google BigQuery, a powerful data warehouse, from a Deepnote Jupyter notebook. With Deepnote's collaborative, cloud-based Jupyter notebook, you can perform exploratory data science with ease.

We will discuss using BigQuery Python connectors, the pandas module, and executing code in Jupyter notebooks for a seamless connection. By the end of this article, you should be able to connect BigQuery to a Jupyter notebook.

Whether you're a data engineer, analyst, or a curious tech enthusiast, understanding how to harness the capabilities of BigQuery within a Python ecosystem can significantly elevate your data management prowess. From installing the BigQuery connector in a Deepnote Jupyter notebook to exploring Python code examples for BigQuery connectivity, we've got you covered.

In 2024, connecting to BigQuery from Python is both necessary and easy. Once connected, you can focus on analyzing the data that matters. Python, SQL, and Deepnote's collaboration features make data management and analysis easier.

This guide shows you how to connect BigQuery and Python using Deepnote, so you can tap into BigQuery's data warehouse directly from a Jupyter notebook.

Integrating BigQuery opens up a gateway to query databases and navigate your warehouse's structure seamlessly. Imagine an advanced "SQL editor" enriched with all the productivity-enhancing features inherent in Deepnote's notebook environment. Here's a glimpse of what this integration offers:

1. Deepnote AI: Deepnote's autonomous AI can read your schema and database, then generate and execute Python for you. This best-in-class Python AI and SQL AI lets users breeze through their exploratory data science.

2. Unified SQL and Python environment: Combine native SQL queries and Python code within a single notebook for a cohesive analytical experience.

3. Effortless warehouse exploration: Utilize the integrated schema explorer to instantly search through your entire warehouse, streamlining the process of locating specific tables, columns, or databases.

4. Interactive data exploration: Dive into data exploration interactively, eliminating the need for extra code. Visualize and analyze data directly within the familiar Jupyter notebook interface, facilitating a smoother analysis process.

Connecting to BigQuery from your notebook with Deepnote’s integration

The easiest way to integrate BigQuery into your Jupyter notebook is by using Deepnote's built-in integrations. Just click "Add integration", enter your BigQuery connection details (such as your Google Cloud project ID and service account key), and you're done. You can now browse your BigQuery schema, query BigQuery with SQL, or use pandas to load your BigQuery data, as in the sketch below. A quick Google search for "Python connect to BigQuery" will return hundreds of results, but trust us, this is the easiest.
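If you prefer to pull data straight into pandas, here is a minimal sketch using the pandas-gbq package (not part of the integration setup itself; it assumes pandas-gbq is installed and credentials are available in your environment, and "your-project-id" is a placeholder):

import pandas_gbq

# Runs the query in BigQuery and returns the result as a pandas DataFrame
# (requires the pandas-gbq package; project_id below is a placeholder)
df = pandas_gbq.read_gbq(
    "SELECT name, number FROM `bigquery-public-data.usa_names.usa_1910_current` LIMIT 10",
    project_id="your-project-id",
)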

Authenticating with a service account

A service account provides a shared connection to Google BigQuery. That is, all collaborators with at least Editor privileges will be able to run queries against databases provisioned in the service account.

To use the service account authentication, you will need to supply a JSON service account key. Click here for a guide on creating a JSON service account key. Your service account key will be encrypted and stored in Deepnote's database.

Authenticating to BigQuery with Google OAuth

With BigQuery's Google OAuth authentication you can give every member of your Deepnote workspace their own set of credentials. This ensures greater security by using short-lived tokens and enabling the use of multi-factor authentication. Follow the principle of least privilege and use granular access control for various BigQuery resources to ensure users can only access the data they need. Click here to learn how to set up BigQuery's Google OAuth authentication in Deepnote.

Connecting to BigQuery from your notebook with Python

Deepnote is a general-purpose notebook that fully supports Python, including letting you change your Python environment. You can write a few lines of Python, import the BigQuery Python client, and easily create a connection. Simply install the BigQuery client library in your Python Jupyter notebook.

First, open a new notebook in Deepnote and import the necessary libraries. You will need the BigQuery Python client library, which you can easily install using the pip package manager. Simply run the command "pip install google-cloud-bigquery" in a code cell.

Next, you can establish a connection to BigQuery by providing your credentials. Replace the placeholders with your actual Google Cloud project ID and service account key details. This information is essential for establishing a successful connection.
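As a minimal sketch, connecting with a downloaded service account key might look like this (the file path below is a placeholder, and the project is taken from the key itself):

from google.oauth2 import service_account
from google.cloud import bigquery

# Load credentials from your downloaded service account key file (placeholder path)
credentials = service_account.Credentials.from_service_account_file(
    "path/to/service-account-key.json"
)

# Create a BigQuery client for the key's project
client = bigquery.Client(credentials=credentials, project=credentials.project_id)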

Using Python to connect to BigQuery

To go beyond querying (like listing tables, creating datasets, etc.), you may need to use the official Python client library (docs).

Use this code snippet to authenticate the Python client using the integration's service account. Once the connection is established, you can start querying the BigQuery database using SQL. In the example further below, we execute a simple query that retrieves 1,000 rows from a public dataset, but you can modify the query to suit your specific needs.

import json
import os
from google.oauth2 import service_account
from google.cloud import bigquery

# Build credentials from the service account key stored in the integration's
# environment variable, then create a client for that project
bq_credentials = service_account.Credentials.from_service_account_info(
    json.loads(os.environ['INTEGRATION_NAME_SERVICE_ACCOUNT']))
client = bigquery.Client(credentials=bq_credentials,
    project=bq_credentials.project_id)

Just replace INTEGRATION_NAME with the uppercased, underscore-separated name of your integration. If you have trouble finding it, run this one-liner to list environment variables that contain service accounts:

[var for var in os.environ if '_SERVICE_ACCOUNT' in var]

Once the BigQuery client is initialized, you can use it to run queries and materialize the results as dataframes like this:

sql = """
SELECT *
FROM `bigquery-public-data.usa_names.usa_1910_current`
LIMIT 1000
"""
df = client.query(sql).to_dataframe()
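The client is not limited to queries. As mentioned earlier, it also covers administrative tasks such as listing datasets and tables; a minimal sketch (output depends on what exists in your project):

# List every dataset in the project, then the tables inside each one
for dataset in client.list_datasets():
    print(dataset.dataset_id)
    for table in client.list_tables(dataset.dataset_id):
        print(f"  {table.table_id}")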

The combination of Python, SQL, and Deepnote makes data management and analysis a breeze. You can leverage Python's data manipulation libraries, such as pandas, to perform complex data transformations, as in the sketch below. Additionally, Deepnote's collaborative features allow multiple team members to work on the same notebook simultaneously, making it easier to collaborate and share insights.
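For instance, a quick pandas transformation of the dataframe returned above (a sketch that assumes the name and number columns of the public usa_names table):

# Total occurrences per name, sorted to show the ten most common
top_names = (
    df.groupby("name", as_index=False)["number"]
      .sum()
      .sort_values("number", ascending=False)
      .head(10)
)
print(top_names)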

By connecting BigQuery and Python using Deepnote, you can harness the full potential of BigQuery's data warehouse and perform advanced analytics on your data. Whether you are analyzing sales trends, customer behavior, or any other business metrics, this integration provides a seamless and efficient workflow.

In conclusion, connecting BigQuery and Python using Deepnote is a straightforward process that empowers you to unlock the true value of your data. With the ability to write and run Python code, execute SQL queries, and collaborate with your team, you can streamline your data analysis workflow and make informed decisions based on actionable insights. So why wait? Start exploring the possibilities today!
