
What file formats does Snowflake support?

By Nick Barth

Updated on March 6, 2024

Snowflake, the cloud data platform, supports multiple file formats for the import and export of data. When you're connecting to Snowflake from Deepnote or any other data science notebook environment, understanding supported file formats is critical for smooth data operations.

Here are the most commonly supported file formats in Snowflake:

1. CSV (Comma-Separated Values)

  • A widely used, simple text format in which each line of the file represents a single data record, with values separated by commas (a sample CSV file format definition appears after this list).

2. JSON (JavaScript Object Notation)

  • A lightweight data-interchange format that is easy for humans to read and write, and easy for machines to parse and generate.

3. Parquet

  • An open-source columnar storage format optimized for use with big data processing frameworks.

4. Avro

  • A row-oriented data serialization framework developed within the Apache Hadoop project.

5. ORC (Optimized Row Columnar)

  • A columnar storage format that is highly optimized for heavy read workloads.

6. XML (eXtensible Markup Language)

  • A markup language that defines a set of rules for encoding documents in a format that is both human-readable and machine-readable.
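To give a concrete sense of how formats are declared, here is a minimal sketch of named file format objects for CSV and Parquet. The format names and option values are illustrative choices, not required settings:

-- Illustrative named file format for CSV files with a header row
CREATE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1
  NULL_IF = ('NULL', '');

-- Illustrative named file format for Parquet files
CREATE FILE FORMAT my_parquet_format
  TYPE = 'PARQUET';

Named file formats like these can then be referenced from COPY INTO statements or stage definitions, which keeps loading options defined in one place.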

It's important to note that Snowflake can load semi-structured formats (such as JSON, Avro, ORC, Parquet, and XML) directly into a VARIANT column, so you can query nested fields with SQL and run complex analytics without defining a rigid schema up front.
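As a rough sketch of what this looks like in practice (the table and field names below are hypothetical), semi-structured data is typically landed in a VARIANT column and then queried with Snowflake's path notation:

-- Hypothetical staging table with a VARIANT column for raw JSON
CREATE TABLE raw_events (payload VARIANT);

-- Query nested JSON fields directly, casting them to SQL types
SELECT
  payload:user.id::STRING    AS user_id,
  payload:event_type::STRING AS event_type
FROM raw_events;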

If you're working in Deepnote, a collaborative data science notebook environment, and want to connect it to Snowflake, these are the file formats you'll use to move data between the two platforms. The right choice depends on factors such as the nature of the data, the need for compression, and the specific requirements of the analysis you plan to perform in Deepnote.

Remember to configure file format options correctly in Snowflake's COPY INTO command when importing data, or in any client libraries you use for extraction and loading, to ensure compatibility and efficiency.

Here's a basic example of how you might define a JSON file format in Snowflake and use it in a load:

CREATE FILE FORMAT my_json_format TYPE = 'JSON';

COPY INTO my_table
FROM @my_stage/my_file.json
FILE_FORMAT = (FORMAT_NAME = 'my_json_format');

When working with these formats, you'll typically use COPY INTO <table> for bulk data loads into Snowflake, and COPY INTO <location>, pointing at a stage, to unload data back out.
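As a minimal sketch of an unload (the stage path, table name, and option values below are placeholders), note that file format options can also be given inline rather than through a named file format object:

-- Unload query results to a stage as gzip-compressed CSV files
COPY INTO @my_stage/unload/
FROM my_table
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
OVERWRITE = TRUE;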

For further details or tutorials, refer to Snowflake's official documentation or the integration guides specific to your data science environment.

Nick Barth

Product Engineer

Nick has been interested in data science ever since he recorded all his poops in a spreadsheet and found that, on average, he pooped 1.41 times per day. When he isn't coding or writing content, he spends his time enjoying various leisurely pursuits.

Follow Nick on LinkedIn and GitHub
