Head over to Twitch on any given day and you'll find 250,000 broadcasters live streaming content to more than two million concurrent viewers. Launched in 2007 under the name Justin.tv, Twitch was intended to be a “stream anything” platform, but pivoted to video game streaming once it proved to be the most popular content on the site.
Twitch contains multitudes. Esports professionals battle it out in live tournaments, speedrunners race through levels, and people play games poorly—on purpose—for entertainment's sake. When broadcasts of creative pursuits gained interest, Twitch added sections for Food and Creative.
These big decisions—to home in on particular audiences, to create new categories—are informed by the behavior of Twitch's users. And when Twitch's data science team assists in making product decisions—big and small—user data is one of the first places they turn.
“Broadcasters are the heart and soul of Twitch,” says Drew Harry, Director of Twitch's Science team, “and so understanding that every channel is distinct and important in its own way is the foundation for all of our analysis work.”
He's not kidding. When we chatted with him recently about data science at Twitch, we could sense this user-centric approach in how the Science team is structured, in their new hire processes, and in plans for team expansion.
Twitch's Science team. Drew is sitting in the front row, far left.
Read on to find out how Drew and his team:
- Onboard new data scientists
- Make time for mentorship
- Structure the Science team
- Balance qualitative research
- Define best practices for collaborating with other teams
- Promote the value of data science throughout the company
Onboarding and mentoring new data scientists
The Twitch ecosystem is incredibly vast. How important is it that new data scientists understand it all to be effective?
One of the big questions we talk about is how much domain knowledge we should expect a data scientist to have coming in. If you're running an insurance company, would you expect your analyst to come in knowing the ins and outs of the insurance industry? You probably wouldn't. You'd say, "They can learn that."
Saturation is probably the generic answer. You've got to dig in and get your hands dirty and find something in the product that you love. It doesn't have to be video games. You may love the technology, you may love the creative aspect, you may just love chat. You may like hanging out with people online. You've got to find something to latch onto.
We really encourage new members of our team to take time to learn what's going on in Twitch's communities, maybe participate in some of those communities themselves. Around the office are screens broadcasting various channels. There are lots of opportunities to meet broadcasters.
We have TwitchCon, which is a big convention where fans and broadcasters and staff get together and talk about all things Twitch. That's a really great experience for new people to get some hands-on experience with the community.
How do you think about mentorship on the Science team? How do you make time for it?
When we're thinking about the hierarchy of ways we could spend time, mentorship always flies straight to the top. If you can be a multiplier for your colleagues—if you can help someone learn to do something better—that's always better than doing it yourself, because they can then work faster, do better work, and teach other people in turn.
The ability and desire to mentor is something that we look for in people we hire, and it's something that we try to actively reward. We have team structures around it, too.
For example, we have a weekly stand-up in Slack and a weekly research review meeting where people present to the whole science team. They say, "This is what I'm working on. This is where I could use some advice. This is where I'm stuck." These are both nice opportunities for people to talk about specific problems, create little interest groups, and share expertise and advice.
Structuring and building the data science team
How is the Science team structured? How does the team fit into the organization as a whole?
We have historically been just data scientists, but we're in a phase where we're diversifying. We want the Science team to be supported by three pillars: data science research, user experience research, and data governance.
Data scientists typically draw their work from a specific part of the organization—for example, a product team that they're working with closely. At Twitch, those teams are based on customer segment. There's a community team that thinks about viewers and communities. There's a broadcaster team that thinks about video ingest, video replication, content discovery, and monetization.
And then there's the developer team, which we just launched. That team thinks a lot about game publishers, how they interact with their game communities, and how they build tools on top of the Twitch platform—games, or chatbots, or browser plug-ins, or whatever.
While data scientists sit together and report through one Science organization, each of us can tie our work back to one of these groups. Since we get to regularly collaborate with the same group of people who understand their data and their product, it's easier for us to impact product direction and strategy.
We're also building a qualitative research team, and that's the second pillar. It's people who do hands-on work with users: ethnography, interviews, diary studies, and usability work. We'll bring users into the office and show them pre-release products and try to understand what works and what doesn't.
The third pillar is data governance. We're building a team that cares about how data moves through the Twitch ecosystem, how we define what we should be collecting, how we manage the data, and how we distribute it to people.
Could you speak more about why you decided to build a qualitative research team? It's a bit unique that data science and qualitative user research are part of the same team.
While some organizations put different research methods in different places in the organization, I think of them as strongly complementary and have seen lots of benefits to having them adjacent.
Data science works best when you have strong data sources. We can do some work before a product's release to predict if it'll be successful, but we can do a ton more after the release, once the data's being collected. Data science is good at answering how much and how often and who does certain things.
To dig into those “why” questions, we have to actually ship something and wait for enough data to roll in. That could be really expensive. So we've brought in qualitative researchers who work in the early stage of the process, who can help us explore the design space around products that haven't been released yet. We like to have people who can say, "We're thinking about how viewers want to interact with each other when their favorite channel is offline. Someone needs to be out there talking to viewers to understand their social networks and the other tools they use to interact with fans of the same channel."
What are some of the Science team's main focuses?
There are two main branches of data science right now. One is focused on making good decisions with data. I think of those as product strategy-oriented data scientists: people who are trying to support product managers primarily.
The other branch is what I would call data product data scientists: people who are trying to build specific algorithms and techniques that can turn into new products that are informed by data.
Strategy-oriented work cycles are different from data product work cycles. Strategy-oriented data scientists tend to play a consultant role, and they're more embedded in the team.
Data products tend to take longer to bake, and there we actually act a little more like product managers than consultants. We're trying to identify data product opportunities and their potential impact, and then we need to build a case for spending engineering time on them.
We've established a strong hand in product decision-making, and now we're growing a new practice in the data product space, where we're helping to identify opportunities and create great data products based on all that we know about how people view content on Twitch.
Collaborating across teams
You said that strategy-oriented data scientists do a lot of consulting. Do you have any tips for how other data scientists can be great consultants?
That's really hard. We think a lot about customer education—in this case, our customers are Twitch employees outside the Science team. How can we help them learn to use us better? I think there are a few things data scientists can do to make the consultation process smoother.
The consultation process is the heart of what a good data scientist does. They teach their customers how to ask good questions and how to interpret the results. They don't just hand someone numbers. They say, "Given all that I know about the context, here's what I think you should do and here's the chain of evidence that supports that argument."
First, we need to help our customers approximate the right level of detail. It's like a Goldilocks scenario. There are incredibly high-level questions—"How do we measure the impact of events at Twitch?"—which are too broad to act on, and really low-level questions—"Can you tell me how many times people who watched this event went on to watch that event?"—which have no context or impact attached. We want to work in the middle, with a question like: "Are events a cost-effective way of attracting new viewers?"
Second, we need to gauge expectations, by asking things like:
- What would be a good outcome for you?
- What decision are you making?
- What results are you expecting to get back, and what would you do if you got the reverse answer?
That last one is a really important question to ask. Sometimes people have made their minds up already, and it'll take a lot of negative evidence to convince them otherwise.
Third, data scientists should never just accept descriptions of work at face value. So we go back to the person we're helping and ask:
- Can you explain to me the context around this decision?
- Can you help me understand what outcomes would be useful for you?
- How much confidence do you need?
Sometimes they need a level of confidence we can't plausibly get in a couple of weeks. Or sometimes the level of confidence needed is so low that a good guess is perfectly acceptable.
Fourth, we need to know when it's not a data science question at all. Sometimes we get an extremely complex problem where there really isn't good data to support a decision one way or the other. It's more of a professional intuition question.
The best bet there is to try something and then carefully measure what happens afterwards. Not everything needs a ton of pre-work to make a good decision.
Selling ideas and results
How do you convince colleagues that data science projects are worth the investment?
Selling ideas is always hard. I think product managers are frequently rewarded for having big, audacious visions and for those visions to come from them. If you're a product manager, you probably have everybody else at the company telling you all the time what you should be doing and complaining that you're not doing the thing that they want. Breaking through that cacophony requires delicacy, solid relationships, and really strong evidence.
The other thing that is really important is selling results. You need to shout your findings loudly—find anyone who will listen. We have a variety of tools for doing that—we send announcements to an internal mailing list of a couple hundred people, and we inject slides of our learnings into the company presentation every Friday. Customer education is an important piece, because we don't just want to convince a product manager; we want to convince the whole company that something is a great idea.
It's not just about selling one person, making a great report, or having a great insight so other data scientists think you're really clever. It's spreading your insights to every corner of the company—because that's what drives the business forward. You should be able to ask anybody in the company what the Science team has discovered recently that had a real impact on the business, and they should be able to tell you something.
Do you have any recommendations for data scientists at companies where the executives are less familiar with the value of data science? How can those folks start the process of selling widely and shouting from the rooftops?
You need to be showing what you can do early and often.
Probably the fastest path to success is identifying people whose intractable problems you can solve. Find those problems, solve them in a way that makes people really happy, and then tell everybody that you solved their problems.
It's really tough if you don't have executives who have a clear understanding of what data science can do. That's one of the primary reasons why people leave data science jobs. They say, "I didn't feel supported," or they say, "I felt like I had to produce results that made someone look good, and I didn't think it was intellectually honest."
That's toxic for a data science team. It goes back to customer education. You have to be selling the value of—and the methods for—making great data-driven decisions all the time. You need a certain amount of support from enough of the execs that they hold each other to a high standard. If you have that support, it'll send people in your direction who feel confident that the best way to get work done is to get the data science team involved early on in their projects.