May 21, 2025

How to Deploy Snowflake Streamlit Apps: The Easiest Method Explained Using dbt

By Rajib Prasad

Deploying Snowflake AI Data Cloud Streamlit Apps efficiently is key to streamlining analytics and enhancing data decision-making. However, manual deployment can be error-prone and difficult to scale.

By using dbt, teams can automate parts of the deployment process, ensuring consistency, reducing operational overhead, and seamlessly integrating app updates with data model changes.

However, this approach is not recommended for managing or deploying production-level applications. While dbt Core can help structure and automate parts of the deployment process, it lacks a continuous integration and deployment (CI/CD) framework to move changes efficiently across environments; dbt Cloud does provide this functionality, making it a better fit for production use. Without proper CI/CD, managing multiple versions, rolling back changes, or ensuring application stability in production becomes challenging.

That said, this approach can be useful in cases like:

  1. Quick Prototyping – Testing ideas before full deployment.

  2. Data Exploration – Allowing analysts to interact with Snowflake data easily.

  3. Internal Dashboards – Delivering real-time insights without a complex setup.

For production, we recommend using CI/CD tools like GitHub Actions, Bitbucket Pipelines, or Airflow to automate testing and deployment.

In this blog, we’ll explore:

  • Overview of Streamlit in Snowflake.

  • How to Create a Single-Page Streamlit App Using Snowsight

  • How do you integrate a Streamlit App with dbt and deploy it?

 Let’s jump in!

What is Streamlit in Snowflake?

Streamlit in Snowflake (SiS) is a fully managed service that allows users to build and deploy interactive data applications directly within Snowflake. Unlike traditional Streamlit, which requires separate hosting and infrastructure, SiS eliminates the need for external deployment by integrating natively within Snowflake.

With SiS, Data Scientists, Machine Learning Engineers, and Data Engineers can quickly create secure, Python-based data applications that interact seamlessly with Snowflake tables, stored procedures, and query results—without moving data outside Snowflake. This ensures governance, security, and scalability while enabling rapid prototyping and decision-making.

Why Does it Matter?

  • Simplicity: Streamlit allows you to create powerful UIs with just a few lines of Python code. Streamlit focuses on enabling developers to build functional web apps quickly and efficiently.

  • Interactivity: Streamlit makes it easy to add interactive widgets like sliders, text inputs, buttons, and file uploaders. These elements enable users to modify inputs dynamically and visualize the effects immediately.

  • Real-time updates: One of Streamlit’s standout features is its ability to rerun scripts in real-time as users interact with the app. When a user changes a widget or input, Streamlit automatically re-executes the script from top to bottom, updating the content in real-time.

  • No web development needed: Streamlit abstracts away the complexities of web development, allowing you to focus on building your app’s functionality rather than worrying about the front-end design.

  • Integration with data science tools: Streamlit integrates seamlessly with Python’s most popular libraries, such as Pandas, NumPy, Matplotlib, and Plotly, making it easy to visualize data or integrate machine learning models with minimal effort.

How do you Integrate Your Snowflake Streamlit Apps With the dbt Project?

Integrating Snowflake Streamlit Apps with your dbt project allows you to leverage dbt transformations within your application. This means that any changes to your data models are automatically reflected in your Streamlit app, ensuring consistency, accuracy, and up-to-date insights without manual intervention.

From a developer’s perspective, this approach simplifies data management by automating transformations, reducing redundant SQL logic, and maintaining a single source of truth for data-driven applications. For the organization, it enhances data governance, improves collaboration, and ensures data consistency across different environments.

We can utilize dbt macros and hooks to deploy Streamlit Apps directly from the dbt project, streamlining the deployment process. Below is an example of a dbt project setup for Streamlit application integration.

Step 1: Create an Infrastructure Folder

At the root of your project, create a new directory named infrastructure. This folder will be a dedicated space for all Streamlit-related code, maintaining a clean and organized project structure. 

  • Directory: infrastructure/streamlit/

This directory is designated specifically for Streamlit-related files, ensuring the application code is isolated from other project components.

  • File: app.py

Located within the infrastructure/streamlit/ folder, app.py serves as the main entry point for the Streamlit application. This file contains the code for initiating and displaying the Streamlit UI, querying data, and presenting the results.

Step 2: Streamlit Code Snippets for the UI

Copy the Python code below into your app.py file. Replace the placeholder database, schema, and table names with your own.

import streamlit as st
from snowflake.snowpark.context import get_active_session

# App title
st.title("WELCOME TO STREAMLIT APPS")

# Header
st.header("Source Table Details")

# Get the active Snowpark session provided by Streamlit in Snowflake
session = get_active_session()

query = """
    SELECT id, name, age, city
    FROM <database_name>.<schema_name>.<table_name>"""

# Run the query and render the results as an interactive dataframe
df = session.sql(query).to_pandas()
st.dataframe(df)

Step 3: Create an Internal Stage to Load Streamlit .py file

This is a prerequisite for Streamlit app deployment. We can create it from the Snowsight UI using the below command.

CREATE STAGE <stage_name>
  ENCRYPTION = (TYPE = 'SNOWFLAKE_SSE');

Step 4: Create a Macro to Deploy .py File Into the Internal Stage

Using the PUT command, create a macro to upload the .py file from the infrastructure folder to a Snowflake internal stage. Before executing PUT, remove any existing copy of the file: by default, PUT skips the upload when a file with the same name already exists in the stage.

macros/load_file_to_stage.sql

{% macro load_file_to_stage(param) %}
    {% set query1 -%}
        REMOVE @DEV_STREAMLIT_BLOG_DB.RAW.internal_stage/app.py;
    {%- endset %}
    {% set result1 = run_query(query1) %}

    {% set query2 -%}
        PUT file://infrastructure/streamlit/app.py @DEV_STREAMLIT_BLOG_DB.RAW.internal_stage AUTO_COMPRESS=FALSE;
    {%- endset %}
    {% set result2 = run_query(query2) %}
{% endmacro %}

Note: Alternatively, running the PUT command with the OVERWRITE parameter set to TRUE replaces existing files, making the REMOVE step unnecessary. Without this parameter, the PUT command skips files that already exist.

PUT file://path_to_file/*.py @internal_stage_path OVERWRITE = TRUE;

Step 5: Create a Macro to Create or Deploy Your Streamlit Application

We will now create a macro that executes the CREATE OR REPLACE STREAMLIT command to deploy the application.

macros/create_streamlit_app.sql

{% macro create_streamlit_app() %}
    {% set query1 -%}
        CREATE OR REPLACE STREAMLIT DEV_STREAMLIT_BLOG_DB.RAW.hello_streamlit_from_dbt
        ROOT_LOCATION = '@DEV_STREAMLIT_BLOG_DB.RAW.INTERNAL_STAGE'
        MAIN_FILE = 'app.py'
        QUERY_WAREHOUSE = COMPUTE_WH;
    {%- endset %}
    {% set result1 = run_query(query1) %}
{% endmacro %}

Step 6: Create dbt Hooks to Call the dbt Macros

This is the final step: creating dbt hooks to invoke the macros.
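For example, the macros can be wired up as an on-run-end hook in dbt_project.yml (a minimal sketch: the macro names match the earlier steps, and the 'app.py' argument is illustrative since the example macro hard-codes its file paths):

```yaml
# dbt_project.yml (fragment)
# After each dbt run, upload app.py to the stage and (re)create the app
on-run-end:
  - "{{ load_file_to_stage('app.py') }}"
  - "{{ create_streamlit_app() }}"
```

An on-run-start hook would work as well; on-run-end ensures the app is deployed only after the models it reads from have been rebuilt.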

Step 7: Execute the dbt run Command and View the App in the Snowflake Streamlit UI

After executing the dbt run command, you can view the app in Snowflake’s Streamlit section.
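The macros can also be invoked on their own with dbt's run-operation command (a sketch; macro and argument names follow the examples above):

```shell
# Deploy as a side effect of building models (hooks fire on run end)
dbt run

# Or invoke the deployment macros directly, without building models
dbt run-operation load_file_to_stage --args '{param: app.py}'
dbt run-operation create_streamlit_app
```

This is handy when only the Streamlit code changed and a full model rebuild is unnecessary.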

Best Practices

  1. The Streamlit app operates under the owner’s privileges by default, meaning it inherits the owner’s access rights rather than those of the user interacting with it. To enable your peers to view or use the app, you must explicitly grant access by assigning appropriate permissions to other roles.

GRANT USAGE ON STREAMLIT identifier($streamlit_app_name) TO ROLE identifier($viewer_rl_name);
  2. With Streamlit, you make changes to your data script file, save it, and Streamlit displays the changes automatically. When the source code changes or the user interacts with the web application, Streamlit reruns the entire script. Use Streamlit caching to avoid running expensive code multiple times and to improve performance. Streamlit provides two caching decorators:

  • @st.cache_data – for serializable objects such as dataframes, API call results, and other function return values.

  • @st.cache_resource – for unserializable objects such as ML models and database connections.

  3. Use environment checks to control where your deployment macros run. For example, the following macro creates the Streamlit app only when dbt runs against the dev target, and logs a skip message otherwise:

{% macro create_streamlit_app(database_name, schema_name) %}

    {% if target.name == 'dev' %}
        {% set query1 -%}
            CREATE OR REPLACE STREAMLIT {{ database_name }}.{{ schema_name }}.hello_streamlit_from_dbt
            ROOT_LOCATION = '@{{ database_name }}.{{ schema_name }}.INTERNAL_STAGE'
            MAIN_FILE = 'app.py'
            QUERY_WAREHOUSE = COMPUTE_WH;
        {%- endset %}
        {% set result1 = run_query(query1) %}
    {% else %}
        {# The macro will not run in non-dev environments (tst, prd) #}
        {% do log('Skipping Streamlit app creation: Not in dev environment.', info=True) %}
    {% endif %}

{% endmacro %}

Closing

You can create a basic, single-page Streamlit app in Snowflake’s Snowsight using the integrated Python editor, allowing you to write, modify, and run the code for your app directly. However, this method is not recommended for managing or deploying production-level applications. It lacks a continuous integration and deployment (CI/CD) process to move changes across environments efficiently.

For more advanced development, you can use Snowflake CLI commands to build and test your Streamlit app locally and leverage CI/CD workflows for deploying multi-page applications. This provides better control over the deployment and versioning process.

Stay tuned for our next blog, where we’ll dive into building a data entry form that interacts with Snowflake database objects.


Looking for help?

If you’re looking to streamline your Streamlit app deployment with best practices, phData’s experts are here to help! Reach out to learn how we can support your development and CI/CD workflows.

FAQs

How do you deploy a Streamlit app in Snowflake?

To deploy a Streamlit app in Snowflake, you’ll utilize Snowflake’s integration to set up and manage the app directly in your environment. This involves specifying the app’s root location and the main script file and ensuring proper data access by linking to the relevant Snowflake database. You must also configure permissions to ensure the app runs successfully within your Snowflake setup.

Who can access a Streamlit app in Snowflake?

In Snowflake, access to a Streamlit app is controlled through user roles and permissions. By default, the app runs with the owner’s privileges. To enable other users to interact with the app, you must assign access to specific roles, ensuring that the right individuals or teams have the appropriate permissions to use or view the app.
