How to Install Snowpark Into IntelliJ IDEA

Welcome to our second blog in our series on Snowpark! This post will discuss the Snowflake Data Cloud’s new product, Snowpark, and its installation with IntelliJ IDEA. 

Let’s dive in!

About Snowpark

Snowpark is a new application programming interface (API) from Snowflake that allows developers, data engineers, and data scientists to programmatically perform data operations. Snowpark allows organizations to simplify their IT architecture by bringing more data pipelines into Snowflake’s single, governed core data platform.

Snowpark is implemented as a Java API that exposes SQL-like data operations to Java and Scala developers. By writing code in a language like Scala or Java, engineering teams can build more sophisticated applications with more robust testing than they could with SQL alone. 
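To give a feel for those SQL-like operations, here is a minimal Snowpark Scala sketch. The table name and the config file path are assumptions for illustration, and running it requires a live Snowflake connection:

```scala
import com.snowflake.snowpark.Session
import com.snowflake.snowpark.functions.col

object SnowparkExample {
  def main(args: Array[String]): Unit = {
    // Connection parameters are read from a local properties file.
    // "profile.properties" is an assumed path; point it at your own config.
    val session = Session.builder.configFile("profile.properties").create

    // Lazily evaluated, SQL-like DataFrame operations that are
    // pushed down and executed inside Snowflake.
    val df = session
      .table("sample_product_data") // assumed table name
      .filter(col("id") < 10)
      .select(col("id"), col("name"))

    df.show()
  }
}
```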

How to Install Snowpark via IntelliJ IDEA

Snowpark contains a robust client library with many helpful installation guides. You can find the guide for installing Snowpark with different IDEs here. In general, this guide is very good and straightforward to follow. However, if you’re like us and prefer to build projects with automation tools, Maven or Gradle will be familiar territory. We will show you how to set up Snowpark via Maven on IntelliJ IDEA.

Maven is used for project build automation using Java. It helps you map out how a particular application is built, as well as its different dependencies. Although Maven is heavily used for Java development, plugins can be used to support Scala development as well.

The following sections cover the prerequisites for installing Snowpark.

Ideally, you will use a dependency tag in your project’s pom.xml file so that packages are imported automatically. This can be achieved by including the Snowpark 0.6.0 Maven dependency available in the OSGeo Release Repository. This simplifies application setup and enables us to create standalone Java/Scala applications, which can be deployed in Snowflake’s backend or other frameworks.

To integrate the Snowpark library into your Maven project, add the Snowpark repository to the repositories tag in your pom.xml file.

    <id>OSGeo Release Repository</id>
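Only the `<id>` element survives above; a fuller sketch of the `<repository>` entry looks like this (the URL is an assumption, based on the standard OSGeo release repository):

```xml
<repositories>
  <repository>
    <id>OSGeo Release Repository</id>
    <!-- URL is an assumption: the standard OSGeo release repository -->
    <url>https://repo.osgeo.org/repository/release/</url>
  </repository>
</repositories>
```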
Then add the dependency to your dependencies.
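A sketch of the dependency entry, using Snowpark’s published Maven coordinates (`com.snowflake:snowpark`):

```xml
<dependencies>
  <dependency>
    <groupId>com.snowflake</groupId>
    <artifactId>snowpark</artifactId>
    <version>0.6.0</version>
  </dependency>
</dependencies>
```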




If a newer version than 0.6.0 is released, update the <version>0.6.0</version> tag accordingly, then clean and rebuild your project.

You may need to include additional plugins in your project, such as the scala-maven-plugin (net.alchim31.maven) for compiling Scala files and the maven-shade-plugin for creating a standalone application that bundles all dependencies into your jar file.
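A sketch of that plugin configuration; the plugin versions shown are assumptions, so pin whichever versions suit your setup:

```xml
<build>
  <plugins>
    <!-- Compiles Scala sources alongside any Java sources -->
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>4.5.6</version> <!-- assumed version -->
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
    <!-- Builds an uber-jar that bundles all dependencies -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.4</version> <!-- assumed version -->
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```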

Now that you have a Scala project with all your Snowpark dependencies included, you can proceed with development. However, if you encounter an issue with multiple Scala versions (see the figure below), you can fix it either by uninstalling the Scala version on your system and installing the required one (not recommended, as other applications may depend on it) or by letting Maven manage the Scala dependency.

A screenshot that shows potential issues with multiple Scala versions

One of the many benefits of using Maven is that it uses an XML file, “pom.xml,” to describe the project that you are building and its dependencies on third-party modules. More details on Maven can be found here. If you already have a Scala version installed on your system and it is not compatible with the supported version, you are in luck! Since you are using Maven, just include the Scala dependency in your pom.xml file like this:

a screenshot showing a Scala dependency in a pom.xml file
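A sketch of what that dependency looks like, with the Scala version factored into a property:

```xml
<properties>
  <scala.version>2.12.11</scala.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
</dependencies>
```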

The ${scala.version} is set in the properties block. At the time of this article, the Snowpark version is 0.6.0 and the Scala version is 2.12.11.

Hopefully, this blog sheds some light on how to properly install Snowpark in IntelliJ IDEA. Stay tuned to our blog as we cover more topics and how-tos for Snowpark.
