Benn Stancil, President & Co-founder
January 24, 2019 • Updated on August 29, 2022
Last week, I urged data science teams to focus on delivering value quickly and iteratively (thanks to Ian Blumenfeld for highlighting the importance of keeping an eye on the horizon throughout this process, and not only thinking about your next step).
In addition to Ian's comments, I received another question: How do you ship value early when you have to build infrastructure first?
After all, analysis requires data. You can't have data unless you have a way to collect it, and a place to store it. And you can't get anything out of it without a means for querying and analyzing it. Regardless of how agile your team might be, you're limited by how quickly you can provision a warehouse, build ETL pipelines, set up event tracking, and integrate an analytics tool. Doesn't this all take significant time and expertise to build?
The short answer is, not anymore.
All of these systems can be set up in a matter of hours, not weeks and months. As Tristan Handy of Fishtown Analytics explains in this excellent post, modern data tools make it possible for the first iterations of a data infrastructure to be built without the help of specialist data engineers.
Today, analysts themselves can get an entire data infrastructure up and running using off-the-shelf tools. Cloud-first warehouses like Redshift, BigQuery, and Snowflake can be spun up in minutes via web interfaces. Data can be instantly fed into these warehouses without writing a single line of code, thanks to tracking systems like Segment or Snowplow, and ETL tools like Stitch, Fivetran, or ETLeap. And an analytics tool like Mode can be integrated in minutes, immediately giving companies the ability to query, visualize, and share their data.
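Once events land in a warehouse, the "query it" step really is minutes of work. As a minimal sketch, here is the kind of first question an analyst might answer with plain SQL; it uses an in-memory SQLite database as a stand-in for a cloud warehouse, and the table, columns, and sample events are all hypothetical:

```python
import sqlite3

# In-memory SQLite database standing in for a cloud warehouse
# (Redshift, BigQuery, and Snowflake all speak a similar SQL dialect).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (user_id TEXT, event TEXT, occurred_at TEXT)"
)

# Hypothetical tracked events, as a tool like Segment or Snowplow
# might deliver them.
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("u1", "signed_up", "2019-01-20"),
        ("u2", "signed_up", "2019-01-20"),
        ("u1", "created_report", "2019-01-21"),
        ("u3", "signed_up", "2019-01-21"),
    ],
)

# A typical first question: how many signups per day?
rows = conn.execute(
    """
    SELECT occurred_at AS day, COUNT(*) AS signups
    FROM events
    WHERE event = 'signed_up'
    GROUP BY occurred_at
    ORDER BY occurred_at
    """
).fetchall()
print(rows)  # → [('2019-01-20', 2), ('2019-01-21', 1)]
```

The point isn't the SQL itself; it's that nothing between "event fired" and "answer on screen" requires custom engineering anymore.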
This new stack represents the most significant development in data engineering of the last five years. These tools may not seem as exciting as technologies that solve scale and latency on the bleeding edge of data science, but these "meat-and-potatoes" technologies solve a much broader problem: they make analytics accessible to everyone.
I used to joke with a former coworker that he was always thirty minutes from "broke." Today's businesses, even those short on time and people, are now in a similar, albeit far more positive, situation: Data engineer or not, regardless of what analytics infrastructure you have or haven't built, you're only ever thirty minutes from answers.
The first technical hurdles in analytics have never been lower, so don't let the idea of starting keep you from actually starting. The sooner you get over them, the sooner you can get to work building an analytical decision-making culture.