A/B Testing and Experimentation Lessons From the Boston Celtics and Moneyball

By SiteSpect Marketing

August 22, 2013

Analytics and experimentation are famous in sports. A perfect example is Moneyball, the popular book by Michael Lewis about Billy Beane, General Manager of the Oakland Athletics, who used statistical analysis to assemble and optimize a winning team on a limited budget.

Recently we have begun seeing this statistics-based approach in other sports, such as basketball. Brad Stevens, the new head coach of the Boston Celtics, is known for being numbers-driven. In fact, when he was coaching at Butler University, he was credited as the first-ever college basketball coach to hire a basketball analyst and take a statistics-based approach. His record at Butler, including a run to the national championship game in only his third season as head coach, where the team lost to Duke by just two points, made him arguably the most successful third-year coach in college basketball history.

Marketing experimentation is very similar to this Moneyball approach, except that instead of player statistics, we rely on digital analytics in the form of visits, bounce rates, conversion rates, heatmaps, and more. In fact, as digital marketers we're lucky that we can A/B test changes to our sites concurrently and quickly see results, whereas basketball coaches can only run their experiments sequentially, one game at a time.
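
As a rough illustration of what "concurrent" means in practice, here is a minimal sketch, with hypothetical names (not SiteSpect's actual API), of how a site might bucket visitors into variants by hashing a visitor ID, so that every variant collects traffic at the same time:

```python
import hashlib

VARIANTS = ["control", "new_headline", "new_cta"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one of the concurrent variants.

    Hashing the visitor ID gives a stable, evenly spread assignment, so all
    variants gather data simultaneously rather than one after another.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

# The same visitor always lands in the same variant across visits.
print(assign_variant("visitor-12345"))
```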

Let’s explore the analytics-based approach to basketball that Brad Stevens is now building with the Celtics before we get to how this can help you.

On July 9, Brad Stevens hired Drew Cannon (who was also at Butler) as a Basketball Operations Analyst for the Celtics. At Butler, Drew developed numerous reports on player substitutions that maximized the team’s performance, and studied which team practice drills led to better game play. In addition, he developed hundreds of statistics and ways to look at game performance, on and off the court.

To truly understand the culture and performance mindset that Brad is building with analytics, we have to look back at his time as head coach at Butler University. Before the 2013-14 NBA season even begins, Brad is sure to pore over previous game tapes, meet with individual players, and develop models for measuring team performance. He will also likely look at all of his players' shooting percentages, not only in the open court but also all around the parquet and at the three-point and free-throw lines. Other key statistics he will likely examine include player efficiency, time on court, injuries, and how each of these metrics changes depending on who is on the floor. After poring over all of these statistics with his coaching staff, he will put together an optimized lineup and keep testing ways to improve it.

Experimentation and A/B testing work much the same way: as digital marketers and analysts, we form a hypothesis based on a number of inputs, including customer feedback and web analytics, and then A/B test that theory.
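
To make that workflow concrete, here is a minimal sketch of how you might evaluate such a test once the data is in. It uses a generic two-proportion z-test with made-up numbers; this is an illustration of the statistics involved, not SiteSpect's actual methodology:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))                # two-sided p-value

# Hypothetical test: 200/10,000 control conversions vs. 260/10,000 variant conversions.
p_value = ab_test_p_value(200, 10_000, 260, 10_000)
print(f"p-value: {p_value:.4f}")  # a value below 0.05 suggests a real difference
```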

So what can we learn from this approach to sports analytics as it applies to A/B testing and experimentation?

  1. Filter out the noise. Often as marketers, we have an overwhelming amount of data available to us. Big data is a popular phrase right now for the vast amount of information flowing in from all of the customer and digital touch points being measured. Whether your information comes from big data or a single source, keep your main A/B testing goals in view and filter out any data that will not move you closer to those goals.
  2. Build on your wins. Basketball players and coaches get a chance at halftime to review all of the statistics from the first half: for example, how many shots each player made, how often the team shot from the three-point line, and so on. Based on their successes in the first half, the team can iteratively build on what is going well and focus on winning the game. With your A/B testing and experimentation, get some wins and keep building on them to sustain the success of your program.
  3. Don't be afraid to try out-of-the-box ideas. Brad Stevens adopted a data-driven approach as head coach at Butler University, and it led to a winning streak. This data- and A/B testing-based approach was new to college basketball, and the coaching staff at Butler had to step out of their comfort zone to get started. With your own A/B testing strategy, if you have a hypothesis (especially one backed by data) that would require testing something out of the box, don't be afraid to go after that opportunity. Even if your hypothesis does not turn out to be a winner, it is a chance to learn and grow your A/B testing skills.
  4. Recognize that everyone on your team has strengths and weaknesses. Just as some basketball players are better at shooting three-pointers than at driving to the basket for layups, the members of your A/B testing team have their own strengths and weaknesses. Knowing those strengths and molding your team around them has huge benefits, in business and in sports. Maybe one or two analysts are naturally gifted at digging through data, while another can use that data to find out-of-the-box ideas. These skillsets fit together like a jigsaw puzzle and will maximize your optimization program, provided you acknowledge each person's strengths. Determine where each individual is strong and put those strengths to use when designing and creating tests.

To learn more about SiteSpect, visit our website.

SiteSpect Marketing

SiteSpect Marketing focuses on authoring content for the web and social media.
