June 16, 2020

Snowflake Names phData as Emerging Partner Of The Year

By Keith Smith

Snowflake Inc. announced this week during its Virtual Partner Summit that phData has been named the FY2020 Emerging Partner of the Year. With multiple joint customer wins over the past year, we’re honored to be recognized with such a high distinction and excited about our future in the Snowflake ecosystem.

Snowflake Emerging Partner of the Year

You can learn more about the award in our press release here. In this blog, let’s talk about what we’ve learned about building data products on Snowflake’s cloud data platform. Hopefully, these three tips can help ensure your Snowflake-based data product is successful.

Tip 1: Unlock All of Your Data

The landscape of technologies our customers use to empower their businesses is vast. Being able to analyze all of the data generated by these solutions allows our customers to discover insights that drive their business forward. This is especially true in areas that were previously considered difficult to obtain and analyze, including:

IoT

  • Sensors in the field, factory, or plant
  • Processing 4 billion data points daily

SAP Offload

  • SAP ECC
  • SAP HANA
  • SAP BW

Legacy Database Migration

Snowflake Machine Learning

  • Snowflake as a Feature Store
  • Snowflake Data Marketplace monetization

Tip 2: Automate Your Data Cloud

Snowflake is the leading cloud data platform and allows you to tap into all of the data listed in the previous section. Enabling these solutions requires infrastructure, management, and standardization to be successful. We have identified best practices that help our customers succeed faster.

Data Pipeline Automation

Ingesting over 27,000 source tables from SAP can be a tedious process that includes analyzing the source system, creating the schema in Snowflake, and creating continuous pipelines that move data between the systems. Each of these Snowflake pipelines requires at least five SQL statements, including Snowflake Pipes, Snowflake Streams, and Snowflake Tasks. During our engagements we’ve learned that automatically generating these pipelines is not only possible but also centralizes both successes and mistakes, leading to lower long-term costs. Learn how phData’s Streamliner is helping our customers accelerate their pipeline success.
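
To make that concrete, here is a minimal sketch of what one generated pipeline can look like. The stage, table, column, and warehouse names below are illustrative placeholders, not the exact SQL Streamliner emits:

    -- Stage pointing at files landed from the source system
    -- (credentials / storage integration omitted for brevity)
    CREATE STAGE sap_raw_stage
      URL = 's3://example-bucket/sap/mara/'
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"');

    -- Landing table that mirrors the source structure
    CREATE TABLE mara_landing (matnr STRING, maktx STRING);

    -- Pipe that continuously copies newly arriving files into the landing table
    -- (AUTO_INGEST assumes event notifications are configured on the bucket)
    CREATE PIPE mara_pipe AUTO_INGEST = TRUE AS
      COPY INTO mara_landing FROM @sap_raw_stage;

    -- Stream that tracks changes on the landing table
    CREATE STREAM mara_stream ON TABLE mara_landing;

    -- Task that periodically merges tracked changes into the curated table
    CREATE TASK mara_merge_task
      WAREHOUSE = ingest_wh
      SCHEDULE = '5 MINUTE'
    AS
      MERGE INTO mara AS tgt
      USING mara_stream AS src
        ON tgt.matnr = src.matnr
      WHEN MATCHED THEN UPDATE SET tgt.maktx = src.maktx
      WHEN NOT MATCHED THEN INSERT (matnr, maktx) VALUES (src.matnr, src.maktx);

    ALTER TASK mara_merge_task RESUME;

Multiply that handful of statements by 27,000 tables and the case for generating, rather than hand-writing, every pipeline becomes clear.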

User, Application, & Data Management Automation

Managing users, groups, applications, approvals, and data can be challenging. Gaining access to the right data at the right time helps drive the business forward, but provisioning that access can require 25 different SQL statements, each executed flawlessly. Having identified this challenge, we built Tram, your ride into Snowflake made easy.
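
As a rough, hedged illustration of why the statement count climbs so quickly, here is the kind of provisioning SQL involved in granting a single group read access to a single schema (role, database, warehouse, and user names are placeholders, not Tram’s actual output):

    -- Role representing one business group's read access
    CREATE ROLE IF NOT EXISTS analytics_reader;

    -- Grants down the object hierarchy: database, schema, existing and future tables
    GRANT USAGE ON DATABASE sales_db TO ROLE analytics_reader;
    GRANT USAGE ON SCHEMA sales_db.reporting TO ROLE analytics_reader;
    GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.reporting TO ROLE analytics_reader;
    GRANT SELECT ON FUTURE TABLES IN SCHEMA sales_db.reporting TO ROLE analytics_reader;

    -- Warehouse access so the group can actually run queries
    GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analytics_reader;

    -- Attach the role to a user (or to a parent role in the hierarchy)
    CREATE USER IF NOT EXISTS jdoe DEFAULT_ROLE = analytics_reader;
    GRANT ROLE analytics_reader TO USER jdoe;

Repeat that for every group, environment, and data domain, and consistent automation quickly beats hand-typed SQL.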

Cloud Infrastructure Automation

Production data pipelines usually require cloud infrastructure beyond what Snowflake provides. For example, you might need Apache Airflow, Apache Kafka, AWS SageMaker, or Databricks. Cloud Foundation is a library of pre-built infrastructure-as-code paired with CI/CD to ensure governed, automated deployments.

Tip 3: Drive Down Processing Costs

Everyone is concerned about costs these days, and as with any other technology, there are right and wrong ways to use the tools offered to you. A couple of best practices are vital to driving down the cost of Snowflake compute resources.

Ingestion Design Matters

Loading data into Snowflake is something every customer deals with; it is the lifeblood of the data products being built. This means there is always a process, and an associated cost, for the ingestion stage of the data product pipeline. Ensuring that data is broken into appropriately sized files has a dramatic impact on the cost of continuously running ingestion. Learn how our Data Engineering team helps right-size your ingestion pipeline.
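
As a hedged rule of thumb, Snowflake’s load guidance favors compressed files of roughly 100–250 MB so that COPY work parallelizes evenly across the warehouse. A quick way to see whether your pipelines are in that range is to audit recent load history (this sketch assumes access to the SNOWFLAKE.ACCOUNT_USAGE.COPY_HISTORY view):

    -- Average ingested file size per table over the last 7 days;
    -- tables far outside the ~100-250 MB compressed range are candidates
    -- for splitting or coalescing files upstream.
    SELECT
        table_name,
        COUNT(*)                       AS files_loaded,
        AVG(file_size) / (1024 * 1024) AS avg_file_size_mb,
        SUM(row_count)                 AS rows_loaded
    FROM snowflake.account_usage.copy_history
    WHERE last_load_time > DATEADD(day, -7, CURRENT_TIMESTAMP())
    GROUP BY table_name
    ORDER BY avg_file_size_mb;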

Leveraging the Snowflake Cloud Services Layer

When building data pipelines in Snowflake, it’s important to leverage the tools available in the Snowflake Cloud Services Layer. This layer offers many advantages, including zero-copy cloning, security, and query optimization, but it can also help drive down processing costs. Using a Snowflake Stream can dramatically reduce cost and time because table change tracking occurs in the Cloud Services Layer and doesn’t require a running warehouse to check for offset changes.
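
A small sketch of that pattern (table, stream, and warehouse names are placeholders): the task’s WHEN condition is evaluated by the Cloud Services Layer, so the warehouse only spins up when the stream actually has new rows.

    -- Change tracking is maintained by the Cloud Services Layer
    CREATE STREAM orders_stream ON TABLE orders_landing;

    -- The WHEN clause is checked without resuming the warehouse; compute
    -- credits are only consumed when there are rows to process
    CREATE TASK orders_merge_task
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
    AS
      INSERT INTO orders (order_id, order_ts, amount)
      SELECT order_id, order_ts, amount
      FROM orders_stream
      WHERE METADATA$ACTION = 'INSERT';

    ALTER TASK orders_merge_task RESUME;

Without the WHEN condition, the task would resume the warehouse every 15 minutes just to discover there is nothing to do.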

Looking Ahead

While we celebrate this award, we still have an eye on the future and on continuing our joint success with Snowflake. We are incredibly excited about working in this space and will continue to identify places where customers can get more out of their Snowflake data products. We can’t wait to see all the new features coming as part of the Snowflake Data Cloud and to be part of that journey. Want to take that journey with us? Reach out to sales@phdata.io to learn how we can help your business accelerate success.
