Posts in Technical Guide
Automating Your Snowflake Cloning Strategy with AWS

Developing against old, bad data is counterproductive and an old-world way of working. What I mean is that in a cloud-based ecosystem there are plenty of ways to achieve data parity across Dev, Test, and Production environments with little effort, yet few companies take this approach in their data strategy. This time I'll show how to accomplish it on AWS for Snowflake.

Read More
Automating Your Snowflake Database Cloning with GCP

This blog will focus on cloning databases in Snowflake so that Snowflake customers can break free of those constraints. Using Snowflake's Zero-Copy Clone feature, we can give developers and testers Production-level parity on a regular basis with near-zero cost implications. We can also automate and schedule these refreshes so developers are always working with data that is representative of the real world. This enhances quality throughout the entire SDLC. Seems like a no-brainer, right?
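At its core, a zero-copy clone is a single SQL statement: `CREATE DATABASE ... CLONE` is Snowflake's documented syntax, and the clone shares the source's underlying storage rather than copying data. As a minimal sketch of what an automated refresh would issue (the database names here are illustrative, not from the post):

```python
# Sketch: building the Zero-Copy Clone statement a scheduled refresh would run.
# PROD_DB / DEV_DB are hypothetical names; CREATE OR REPLACE DATABASE ... CLONE
# is standard Snowflake SQL. No data is copied when the clone is created --
# the clone initially references the source's existing micro-partitions.

def build_clone_sql(source_db: str, target_db: str) -> str:
    """Return the statement that (re)creates target_db as a clone of source_db."""
    return f"CREATE OR REPLACE DATABASE {target_db} CLONE {source_db}"

print(build_clone_sql("PROD_DB", "DEV_DB"))
# -> CREATE OR REPLACE DATABASE DEV_DB CLONE PROD_DB
```

In practice you would execute this string through whatever driver your scheduler uses (for example the `snowflake-connector-python` package); `CREATE OR REPLACE` makes the refresh idempotent, so the same statement can run on every schedule tick.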

Read More
Dynamically Duplicating A BigQuery DataSet’s Tables

Fresh data is always better to develop against than stale data. In the past, we were constrained in a variety of ways that kept us from using Production-like data or achieving close data parity with the applications we develop from and for. This creates many different issues, so my thought was: "Why continue in the old ways if we can fundamentally change how we work by working in the cloud?"

Read More