What I Learned From My First Beta
We’re working on a brand-new basketball product here at Hudl. It’s an extremely exciting opportunity for our company — we’re creating a whole new way of capturing, consuming, and analyzing basketball video. During the 2012 – 13 basketball season, we laid the technical foundations for the product while doing our initial research into pain points with the current system and scouting opportunities the new product could explore. After a lot of work in the offseason, we were ready — or so we thought — to put more pressure on the new product as we entered our beta testing phase for the 2013 – 14 season.
As it turns out, beta testing a completely new, standalone product like this is a lot different from our typical rollout process. This was my first beta at this scale, and I’d be lying if I told you we had a flawless strategy and executed perfectly from every angle. The team and I learned some valuable lessons over the course of the season that I’d like to share in this post.
Oh Hey, That’s Broken
One of our biggest mistakes was assuming everything we’d implemented so far would go fairly smoothly from a technical standpoint. After all, we’d tested things out ourselves and put the product through its paces with a limited number of coaches — what could possibly go wrong?
Turns out, a surprising amount of stuff could (and did) go wrong. Coaches were using cameras with video formats we’d never seen, which caused encoding to fail far more often than was acceptable. They ran into situations with our Tag a Game feature that we hadn’t accounted for (odd game scenarios, connectivity problems in gyms). A few college teams on the Pro tier of our existing service tried the new version, and some features didn’t play nicely with that at all.
To top it all off, the thing we were most excited and optimistic about wasn’t working well either. Team stats from our Tag a Game app weren’t matching up with game video at an acceptable rate.

It was rough for a while. Some coaches bailed on the beta immediately because it wasn’t stable enough. We spent the first six weeks of the season getting the product to the level of stability we wanted. From there, we were able to refocus on usability and workflows. But the experience meant we weren’t able to test all the assumptions we’d wanted to during the season.
However, by the end of the season, we felt extremely confident in the technical feasibility of the product, and that early-season pain helped us get there.
Goals Aren’t Guesses — Build, Measure, Learn
Just before the season, I met with Kyle, Hudl’s head of Coaching Tools, and Allie, our basketball business development person, to discuss goals for the season. I’d come up with some metrics I wanted to track, along with goals I thought would help us determine whether aspects of the product were performing well. We talked through those metrics, tweaked the goals, set up our dashboards, and hoped for the best.
In retrospect, while it was good to have metrics in mind, most of our goals were way, way off because they were simply guesses. A better approach would have been:
- Define our One Metric That Matters (OMTM) for each aspect of the product or assumption we wanted to test.
- Establish a baseline during the first couple of weeks of the season.
- Use that information to inform our next set of changes and goals for that metric.
- Measure the results of those changes.
- Iterate or move on to the next challenge.
I want to emphasize that I’m not anti-goal. I’m anti-guess. Product goals should be based in reality, which means grounding them in solid baseline numbers and tying them to an aspect of your product you’re actively testing.
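To make that loop concrete, here’s a minimal sketch of what baseline-first goal setting could look like. Everything in it is hypothetical: the metric (the share of games whose tagged stats matched the video), the event fields, and the ten-point uplift are illustrative stand-ins, not our actual dashboards or data model.

```python
# A minimal sketch of baseline-first goal setting. All names and numbers
# are hypothetical; this is not Hudl's actual data model.

from statistics import mean

def weekly_match_rate(events, week):
    """Fraction of games in a given week whose tagged stats matched the video."""
    games = [e for e in events if e["week"] == week]
    if not games:
        return None
    return mean(1.0 if g["stats_matched_video"] else 0.0 for g in games)

def goal_from_baseline(events, baseline_weeks=2, uplift=0.10):
    """Measure the first few weeks to establish a baseline, then set the
    next goal relative to that baseline instead of guessing up front."""
    samples = [weekly_match_rate(events, w) for w in range(1, baseline_weeks + 1)]
    baseline = mean(s for s in samples if s is not None)
    return baseline, min(1.0, baseline + uplift)

# Example: two weeks of made-up game events establish a 50% baseline,
# so the next goal becomes 60% rather than an arbitrary number.
events = [
    {"week": 1, "stats_matched_video": True},
    {"week": 1, "stats_matched_video": False},
    {"week": 2, "stats_matched_video": False},
    {"week": 2, "stats_matched_video": True},
]
baseline, goal = goal_from_baseline(events)
print(f"baseline={baseline:.0%}, next goal={goal:.0%}")
```

The point of the sketch is the ordering: measure first, then derive the goal from what you measured.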
What’s a “Scoring Threat Inbound”?
Every design choice you make, down to the button labels, is an assumption about what you think is best. You have to continually validate those assumptions through rapid, regular feedback. One of our challenges this season was keeping qualitative feedback coming in throughout the beta.
While there were a few beta coaches within driving distance, we had a lot of beta teams scattered across the country. Prior to the season, we brought a number of coaches into the office to try out the app, which was ideal for testing usability and our app’s workflows. During the season, our team continued its regular practice of calling a few coaches a week to hear how the beta was going. I also encouraged coaches to email me or call support with any urgent issues.
Despite all that, we still felt like we weren’t getting enough information about how things were going. We did two things that really amped up the quantity and quality of feedback. First, around the midpoint of the season, our designer Craig switched our feedback system from UserVoice to a standalone forum for beta coaches. This helped coaches reach out more directly to us with their ideas and issues, without feeling like they had to search for or vote on preexisting ideas.
The most valuable thing we did all season was embed ourselves for two days with a basketball team in De Soto, KS. That visit showed us how coaches were really using both our current and beta products. We saw aspects of the products they really loved, and features we hadn’t validated as thoroughly as we should have. You can read more about our takeaways from the visit here.
How’d Things End Up?
Even though things didn’t go perfectly, we learned a ton from being in beta for an entire season. There were some concepts that clicked:
- A live score that updates!
- Team stats automatically associated with video!
And some that didn’t:
- What does this button do? What’s a “scoring threat inbound”?
- Why are you asking me to “bookmark” my film?
We ended up with a much clearer vision of where our new product needs to be in order to provide a better basketball video experience for our coaches and athletes. We’re stoked about the future of Hudl Basketball in 2014.