Case Study

Major Regional Bank Automates Commercial Lending Reporting

The Customer’s Challenge

As a major regional bank continues to grow, it is finding it increasingly cumbersome to manually generate the reports that measure the productivity and efficiency of its commercial lending team.

phData’s Solution

After an initial Proof of Value engagement, the client brought in phData to create a centralized, automated data product for their Commercial Lending team. Our data engineering team quickly cleaned the data and automated the pipeline, giving the team optimized, live dashboards for their Commercial Lending reporting.

The Full Story

The client is one of the leading regional banks in the Southeast and offers robust banking solutions, including consumer banking, mortgage, small business banking, commercial banking, and wealth management.

Reporting for the Commercial Lending team was largely manual, with many moving parts and disparate Excel worksheets. Generating even basic charts and statistics to measure how the Commercial Lending team was performing required significant manual effort.

On top of that, investigating anything found within those reports required someone to manually comb through that same disparate data.

As the client began looking toward building a cloud data system, they decided to bring phData on to help automate Commercial Lending reporting and lay the foundation for the successful adoption of that new system.

Why phData?

The team at phData first joined to run a Proof of Value after the client had reached out about the Snowflake Data Cloud. The success of that initial engagement paved the way for the client to bring phData on to help lay the foundation for their first data product. To accomplish this, we had to:

  • Establish an Information Architecture for Snowflake
  • Demonstrate software engineering best practices to their data team
  • Automate and optimize CI/CD of data products with dbt

Establishing an Information Architecture for Snowflake

We wanted to start by making sure we could not only automate the creation of any necessary source and product databases, but also accomplish this in a highly repeatable way. To do this, we used our internal software, Project Administration.

Project Administration allowed us to architect a repeatable structure for the client’s data sources and data products, as well as establish a proper role hierarchy within Snowflake. This makes it easy for the client’s Data Team to do two things (sketched in the example below):

  • Automatically deploy any new objects and roles through GitHub Actions 
  • Onboard and automate the assignment of roles to various team members
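
As a rough illustration of the first point, the sketch below shows the kind of SQL a deployment job triggered from GitHub Actions might run against Snowflake. It is a minimal sketch, not the actual Project Administration tooling: the database and role names are hypothetical, and it assumes the snowflake-connector-python package with credentials supplied through environment variables.

```python
import os

import snowflake.connector  # assumes the snowflake-connector-python package

# All names below are illustrative, not the client's actual objects.
STATEMENTS = [
    # Source and product databases for the Commercial Lending data product
    "CREATE DATABASE IF NOT EXISTS COMMERCIAL_LENDING_SRC",
    "CREATE DATABASE IF NOT EXISTS COMMERCIAL_LENDING_PROD",
    # Access roles and a functional role for the data team
    "CREATE ROLE IF NOT EXISTS CL_READ",
    "CREATE ROLE IF NOT EXISTS CL_WRITE",
    "CREATE ROLE IF NOT EXISTS DATA_ENGINEER",
    # Grant database privileges to the access roles
    "GRANT USAGE ON DATABASE COMMERCIAL_LENDING_PROD TO ROLE CL_READ",
    "GRANT USAGE, CREATE SCHEMA ON DATABASE COMMERCIAL_LENDING_PROD TO ROLE CL_WRITE",
    # Roll the access roles up into the functional role
    "GRANT ROLE CL_READ TO ROLE CL_WRITE",
    "GRANT ROLE CL_WRITE TO ROLE DATA_ENGINEER",
]


def deploy() -> None:
    """Run each statement; IF NOT EXISTS and re-grants keep the job safe to re-run."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",  # simplification for this sketch; a scoped admin role is preferable
    )
    try:
        with conn.cursor() as cur:
            for statement in STATEMENTS:
                cur.execute(statement)
    finally:
        conn.close()


if __name__ == "__main__":
    deploy()
```

Because every statement uses IF NOT EXISTS or a plain GRANT, the job can be re-run safely whenever a new object or role is added to version control.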

Demonstrating Software Engineering Best Practices to Their Data Team

Anyone who has ever worked with data knows that it is tough to manage or even revert changes. Thankfully, with modern tools like dbt and Fivetran, we’re able to bring software engineering best practices to the ELT/ETL process. This lets us not only better manage changes and merges, but also focus on making sure that our data process is truly idempotent.
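
To make “idempotent” concrete, here is a minimal, hypothetical sketch of a transformation step that can be re-run any number of times and always produces the same table, rather than appending duplicates. It stands in for the kind of rebuild a dbt model gives us; the table and column names are invented for illustration, and the example assumes the snowflake-connector-python package.

```python
import os

import snowflake.connector  # assumes the snowflake-connector-python package

# Rebuilds the summary table from source on every run, so re-running
# never changes the result. All object names are hypothetical.
IDEMPOTENT_TRANSFORM = """
CREATE OR REPLACE TABLE COMMERCIAL_LENDING_PROD.REPORTING.LOAN_SUMMARY AS
SELECT
    officer_id,
    DATE_TRUNC('month', funded_date) AS funded_month,
    COUNT(*)                         AS loans_funded,
    SUM(loan_amount)                 AS total_funded
FROM COMMERCIAL_LENDING_SRC.RAW.LOANS
GROUP BY officer_id, DATE_TRUNC('month', funded_date)
"""


def run_transform() -> None:
    """Rebuild the summary table from source; safe to run any number of times."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",  # hypothetical warehouse name
    )
    try:
        with conn.cursor() as cur:
            cur.execute(IDEMPOTENT_TRANSFORM)
    finally:
        conn.close()


if __name__ == "__main__":
    run_transform()
```

In the actual project, dbt manages this kind of rebuild (and its incremental variants) for us, with the SQL versioned alongside the rest of the codebase.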

We used Fivetran to manage the Extract and Load portions of the ELT/ETL process. Fivetran made it easy to get data out of the disparate source systems and into Snowflake following our established Information Architecture, while dbt let us focus on versioning and automating the deployment of our transformation process.

Once Fivetran had brought the source systems in, we used dbt to transform the data and arrive at the same answers the Commercial Lending team had been producing manually, but in a completely automated way.

Ultimately, this allows the team to view their charts daily instead of monthly and gives them the ability to drill down into their data to make it easier to do exploratory analysis and find problems.

Automating and Optimizing CI/CD of Data Products with dbt

With all these pieces in place, we could focus on automating the entire process. This lets the data team manage their changes through version control while an automated process tests each deployment before it actually reaches the production environment.

Thanks to dbt, we were also able to optimize those deployments using Slim CI to deploy only new or changed models, helping lower the computational cost of each deployment.

Slim CI allows us to deploy only the pieces of the transformation layer that actually changed, while the daily feed incrementally brings in new data instead of completely re-creating the warehouse every morning.
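
A minimal sketch of what such a Slim CI step can look like is shown below, assuming the dbt Core CLI is installed and the production run’s artifacts (manifest.json) have been downloaded to a local folder. The folder path and target name are hypothetical; the flags themselves (--select state:modified+, --defer, --state) are dbt’s standard state-based selection.

```python
import subprocess


def slim_ci_build() -> None:
    """Build and test only the models that changed relative to production.

    state:modified+ selects modified models plus their downstream dependents;
    --defer lets unchanged upstream models resolve to the production versions.
    """
    subprocess.run(
        [
            "dbt", "build",
            "--select", "state:modified+",
            "--defer",
            "--state", "./prod-artifacts",  # hypothetical path to production manifest.json
            "--target", "ci",               # hypothetical CI target from profiles.yml
        ],
        check=True,
    )


if __name__ == "__main__":
    slim_ci_build()
```

Because only modified models and their downstream dependents are built and tested, a small change no longer triggers a full rebuild of the warehouse in CI.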

Results

The client has seen four key results from this project:

  • The Commercial Lending team can pull and look at reports on a daily basis instead of once per month.
  • A repeatable framework was established to expedite and automate how the enterprise pushes and accesses data in Snowflake.
  • Automated QA, Validation, and CI/CD allow the data team to focus on providing value with new data products instead of spending time manually deploying and fixing production when a deployment goes wrong.
  • General excitement across the organization about the future of its cloud migration.
 

Armed with new technology and new techniques, the client’s data team will be able to address process inefficiencies and push the organization to the forefront of banking.

Take the next step with phData.

Need assistance developing your company’s Snowflake strategy?
