Understanding User Interactions on Mobile
Understanding how users interact with your product is key to lean software development. Watching people trip over your navigation or completely upend your expected use case can be cringe-inducing, but it’s critical to understanding whether what you built delivers. We’ll take a look at software we’ve used at Hudl to see exactly how a person uses an app, in a non-intrusive manner. It hasn’t just saved us loads of time; it has also greatly increased our sample size.
There are several well-known methodologies for observing and understanding actual use of desktop and web software, from low-tech methods like sitting behind someone and watching them use your product to high-tech tools like UserTesting.com and Verify. It isn’t as easy with mobile. For me, hovering over a person while they tap on a five-inch screen has proven to be a tough experience: it doesn’t feel like a user’s true usage environment (for them or for us), and small screens can be difficult to crowd around.
One tool that helps me better understand Hudl’s mobile users is AppSee [edit: This product is no longer available]. It’s a mobile analytics SaaS that records videos of people using our mobile apps. We use it to sample a small percentage of random sessions to see how users are truly using our apps. We’ll search for videos based on an action such as “Edited Breakdown Data” to see Hudl in action. We’ve used the content to help find flaws in our design assumptions that user interviews didn’t uncover.
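Random session sampling like this is simple to reason about. Here’s a minimal sketch in Python of how a client might decide, once per session, whether to record; the `should_record_session` helper and the 2% rate are our own illustration, not AppSee’s actual API:

```python
import random

SAMPLE_RATE = 0.02  # hypothetical: record ~2% of sessions


def should_record_session(rng: random.Random = random.Random()) -> bool:
    """Decide at session start whether to capture this session."""
    return rng.random() < SAMPLE_RATE

# Decide once when the session begins, then tag the recording with the
# actions performed ("Edited Breakdown Data", etc.) so videos can be
# searched by action later.
```

Deciding once per session (rather than per screen) keeps recordings coherent end-to-end, which is what makes them useful for spotting workflow problems.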
A surprising, yet obvious, discovery
After releasing the redesigned 4.0 version of the Hudl iOS app, we noticed people love to tap their profile pictures. A lot. When users viewed our main menu, many would tap their profile picture, presumably expecting it to take them to a profile view. A profile view was on our roadmap, but it wasn’t prioritized for the initial release.
This wasn’t an action tracked by any of our standard analytics because it wasn’t something we expected. After watching several sample sessions we recognized the problem, reprioritized the profile and linked the profile picture to an editable profile page.
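Once an unexpected interaction like this surfaces, the analytics-side fix is to start tracking it explicitly as a named event. A minimal sketch of such an event tracker (the names and structure are illustrative, not Hudl’s actual analytics code):

```python
from collections import Counter
from datetime import datetime, timezone


class AnalyticsTracker:
    """Tiny in-memory event tracker, for illustration only."""

    def __init__(self):
        self.events = []        # full event records, in order
        self.counts = Counter()  # per-event-name tallies

    def track(self, name: str, **properties) -> None:
        """Record a named event with arbitrary key/value properties."""
        self.events.append({
            "name": name,
            "properties": properties,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.counts[name] += 1


tracker = AnalyticsTracker()
# The newly discovered interaction, now tracked explicitly:
tracker.track("Tapped Profile Picture", source="main_menu")
```

With the event named and counted, its frequency can justify (or not) reprioritizing the feature it implies.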

It’s just too slow!
My favorite example of watching users to see how they interact with the app was with a highly-requested feature we built for iPads called “Edit Breakdown Data.”
In our initial prototype we modeled much of the user experience on our web implementation of the feature: more or less a spreadsheet that lets our coaches enter data about the clip they’re watching (e.g., offensive formation, play, yards gained).
We gathered feedback on wireframes and hi-res mocks in user interviews, and everything seemed to line up. We developed a beta of the feature over the course of a couple of weeks and released it to 20 beta users.
We got in our users’ way
We pride ourselves on developing software that helps users get their jobs done with minimal friction. After watching people use Edit Breakdown Data, we realized we’d done the opposite. Users were entering the same information for multiple plays but were slowed down by having to scroll through a large list and select the same data each time. It was taking users 20 minutes to edit the data columns on fewer than 20 video clips, something that would take two or three minutes on the web. Qualitative feedback from verbal user interviews indicated the same.
If at first you don’t succeed
With this in mind we went back to the drawing board for version two. Based on feedback from users and what we saw in AppSee, we wanted to make it easy to select data that is used repeatedly. When the user taps a cell to edit it, we now show a “Recent Data” column alongside the full list of data, displaying recently chosen values as well as the most-used ones. We also sped up certain animations to create a feeling of speed: while the app didn’t process data any faster, small changes like this can improve the feel of the overall user experience.
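A “Recent Data” column like this boils down to ordering candidate values by recency, then falling back to overall frequency. A rough sketch of that ordering logic, as our own illustration under assumed parameters (the `limit` and `recent_window` defaults are hypothetical, not Hudl’s implementation):

```python
from collections import Counter


def recent_data(history: list[str], limit: int = 5,
                recent_window: int = 3) -> list[str]:
    """Suggest cell values: most recent first, then most-used overall."""
    suggestions: list[str] = []
    # Most recent distinct entries first, newest at the front.
    for value in reversed(history[-recent_window:]):
        if value not in suggestions:
            suggestions.append(value)
    # Fill any remaining slots with the most-used values overall.
    for value, _count in Counter(history).most_common():
        if len(suggestions) == limit:
            break
        if value not in suggestions:
            suggestions.append(value)
    return suggestions[:limit]
```

For example, with a history of `["I-Form", "Shotgun", "I-Form", "Trips", "Shotgun"]`, the suggestions lead with the three most recent distinct values, so a coach tagging a string of similar plays rarely has to scroll the full list.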
We’ll be releasing this update to the same beta group in a couple of weeks and going through the same methodology for qualitative and quantitative feedback. Shortly thereafter, the feature will be released to millions of users. As coaches add more data to their video, this will both save them time and give them data-based insights into how to improve their game.

Originally posted on Medium.