June 15, 2021

How to Connect to Snowpark With VSCode

By Keith Smith

Congratulations on your interest in Snowpark! 

To help you get started, we put together this handy tutorial to walk you through getting connected to your Snowflake account. Our goal is that you find the following tutorial helpful on your journey to unlocking data engineering capabilities natively on the Snowflake data cloud.

If you have any feedback, requests, or questions, feel free to reach out and we will do our best to answer!

Visual Studio Code (VSCode)

The first step in this process is ensuring that you have VSCode installed on the device of your choice. 


Once VSCode is installed, we will want to install and enable the Metals extension from the marketplace.

In VSCode, you can install Metals via the following process, or you can click the marketplace link above.

Extensions shortcut icon:

a graphic of the VSCode extension shortcut icon that contains 4 squares.

Select and install the metals extension:

a screenshot of the Metals extension

Once installed, the sideways Scala logo should appear on the left-hand side of the VSCode application.

Sideways picture of the Scala logo

Create a new Scala Project

We are now ready to start a new Scala project! Let’s prepare a place to store our work and get VSCode + Metals to do the rest!

Directory Structure

This can be set up to your liking, but for simplicity we are going to build a simple “Hello World” example to verify both our local configuration and our Snowflake account connectivity.

Set up a base directory to work from.


mkdir -p ~/snowflake/snowpark/hello-vscode

Once you have a directory to work from, let’s make sure VSCode can open it as well.

File > Open > Navigate to “hello-vscode” directory > Click Open

Screenshot of how to open the VSCode editor

At this point, VSCode should open (or reopen) the editor with the directory you just created as the base directory. The directory should be empty (since we just created it).

New Scala Project

Now we are going to use the power of Metals to automate our environment setup for Scala and Spark.

Select the Metals icon on the left-hand side.

Sideways picture of the Scala logo

The following window should appear once Metals is open.

Screenshot that says, "New Scala project."

Click “New Scala project” and this should populate a drop-down list of options.

Screenshot that says, "scala/hello-world.g8"

Select “scala/hello-world.g8” and then name the project. In this example it is “hello-vscode”.

Screenshot that says "hello-vscode"

At this point, VSCode will ask if you want to open the project in a new window; this is optional.

Screenshot that says, "Do you want to open the new project in a new window?"

Regardless of what you choose, the next task is to configure the correct version of Scala. It should be noted that, at this point in time, the only Scala version compatible with Snowpark is Scala 2.12, while the Metals template defaults to Scala 2.13. We need to update “build.sbt”.

Screenshot of the Scala configuration.
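As a sketch of that change, the relevant line in build.sbt becomes the following. The exact patch version shown is an assumption (any 2.12.x release supported by Snowpark should work), and the template’s other settings (name, version, etc.) are left untouched:

```scala
// build.sbt (fragment) -- Snowpark requires Scala 2.12, so pin the version.
// The patch version 2.12.13 is an assumption; substitute a 2.12.x release
// supported by your Snowpark version.
scalaVersion := "2.12.13"
```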

Once you save the above configuration, you need to import the correct Scala version. VSCode makes this easy by prompting you with a pop-up; select “Import build”.

a screenshot that says, "New sbt workspace detected, would you like to import the build?"
Screenshot that says, "sbt bloopinstall"

Snowpark Setup

At this point your environment should be complete for Scala development and ready to integrate with Snowpark specific steps.

Snowpark JAR

This JAR file needs to be downloaded from Snowflake, available locally, and extracted.


tar -xvf snowpark-X.Y.Z-bundle.tar.gz

Snowpark JAR Setup

Now that we have the JAR, we need to import it into our project by copying both the “lib” directory and the Snowpark JAR file from the extracted bundle into the project.

Screenshot that shows the Snowpark JAR file
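The copy can also be done from the command line. The sketch below runs in a throwaway temp directory so it is safe to execute as-is; in practice, point BUNDLE at the directory you extracted the download into and PROJECT at the Metals-generated project (all paths here are hypothetical stand-ins):

```shell
# All paths below are hypothetical stand-ins for demonstration.
SANDBOX=$(mktemp -d)
BUNDLE="$SANDBOX/snowpark-bundle"     # where the tarball was extracted
PROJECT="$SANDBOX/hello-vscode"       # the Metals-generated project

# Stand-ins for the extracted bundle contents (skip these two lines when
# using your real download).
mkdir -p "$BUNDLE/lib" "$PROJECT"
touch "$BUNDLE/snowpark.jar" "$BUNDLE/lib/dependency.jar"

# The actual step: copy lib/ and the Snowpark JAR into the project.
mkdir -p "$PROJECT/lib"
cp -r "$BUNDLE/lib/." "$PROJECT/lib/"
cp "$BUNDLE"/snowpark*.jar "$PROJECT/lib/"
```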

Add Classpath

To pick up these JARs, we need to again open the “build.sbt” file and add the following line at the end of the file:

unmanagedClasspath/includeFilter := "lib"

a screenshot of the build.sbt file
A screenshot with the alert message saying, "New sbt workspace detected, would you like to import the build?"

Snowpark Example

Now we are ready to test whether our project is configured correctly and can connect to the Snowflake environment. It is important to ensure that your Snowflake permissions are set up correctly before executing the following example; if they aren’t, Snowflake will throw permission errors.

Replace the “Main.scala” file with the following (be sure to fill out the required information):

import com.snowflake.snowpark._
import com.snowflake.snowpark.functions._

object Main {
  def main(args: Array[String]): Unit = {
    // Replace the <placeholders> below.
    // Authenticate with either PRIVATEKEY (key-pair auth) or PASSWORD;
    // remove the entry you are not using.
    val configs = Map(
      "URL" -> "https://<account>.snowflakecomputing.com:443",
      "USER" -> "<user name>",
      "PRIVATEKEY" -> "<private rsa key copied from your private key file>",
      "PASSWORD" -> "<placeholder password>",
      "ROLE" -> "<role name>",
      "WAREHOUSE" -> "<warehouse name>",
      "DB" -> "<database name>",
      "SCHEMA" -> "<schema name>"
    )
    val session = Session.builder.configs(configs).create
    session.sql("show tables").show()
  }
}

This will execute a “SHOW TABLES” command and you should be able to see the tables in your environment.

a screenshot of the SHOW TABLES command

Congratulations, you are now ready to enhance your Snowflake Data Engineering experience with Snowpark!
