Metrics: Twyman’s law

You are a product manager. You run a weekly metrics sync to review your product's key metrics. Your analytics team presents these four analyses:

  1. Time spent on the home page went up last week, so user engagement must have gone up.
  2. You changed the sign-up flow so that new users enter their birthday. A week later, nearly 10% of all users were born on January 1st.
  3. There were zero active users between 2:00 AM and 3:00 AM PST on March 8th, 2020, yet no network outage was reported.
  4. The team shipped a release on Wednesday, and over the following five days call volumes have gone down. Call volume peaked on Tuesday, the day before the release, and has declined through Wednesday, Thursday, Friday, Saturday, and Sunday. You must decide whether to revert the release as soon as possible.

Taken at face value, these data points could lead you to celebrate a win like #1, be confused by an insight like #2, worry about an unknown issue like #3, and possibly panic over #4.
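Before reacting, it helps to quantify how anomalous a figure really is. A back-of-the-envelope check for case #2, written as a minimal Python sketch (the 10% figure is the one from the scenario above; a uniform distribution of birthdays is an assumed baseline for comparison):

```python
# Sanity check for case #2: how surprising is a 10% share of
# January 1st birthdays if birthdays were roughly uniform?
expected_share = 1 / 365   # ~0.27% of users per calendar day (uniform assumption)
observed_share = 0.10      # the ~10% figure reported by the analytics team

excess = observed_share / expected_share
print(f"January 1st is over-represented by a factor of {excess:.1f}")  # 36.5
```

A 36-fold excess is far too large to be natural fluctuation, which is exactly the kind of signal Twyman's law tells you to distrust.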

Situations like these keep presenting themselves. Tony Twyman, a dominant figure in the technical and methodological development of audience research, propounded the principle now known as Twyman's law [1]:

The more unusual or interesting the data, the more likely they are to have been the result of an error of one kind or another

I recently stumbled upon Twyman's law and was intrigued. By staying cognizant of the fact that behind every data point there are deeper levels of insight, I could save myself from errors that might otherwise have hurt the business.

So let us look a little deeper at the four cases above and ask what else could have been going on to explain each situation:

  1. A change released a week ago broke the homepage: the CTA did not register a click until users clicked it multiple times, so visitors spent more time on the page. When time spent on a page goes up, also look at bounce rate, page load time, and other key metrics. The key insight is to always read a metric alongside its trade-off metrics.
  2. Users dislike adding a birth date while signing up. Since the field is mandatory, they pick the most convenient date: 01/01/01. When adding friction to a user experience, make sure the data collected will be meaningful, because if there is a way to avoid sharing meaningful data, users will in all likelihood avoid it. Food for thought: have you ever filled in the one-question survey that YouTube asks before starting a video?
  3. Daylight saving time kicked in: on March 8th, 2020, Pacific clocks jumped from 2:00 AM straight to 3:00 AM, so that hour never existed on the wall clock. Does your analytics code handle daylight saving and other such time zone changes? Bugs due to time zone transitions are more common than you think.
  4. Compare data day of the week to day of the week. We observed strict weekly seasonality in the data, in which call volume peaked on Tuesday by default. We decided to keep the test running longer and, no surprise, saw no impact on call volume in the long run.
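The daylight-saving gap in case #3 can be reproduced directly in code. A minimal Python sketch using the standard-library `zoneinfo` module; the UTC round-trip is one common idiom for detecting a nonexistent wall-clock time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

pacific = ZoneInfo("America/Los_Angeles")

# 2:30 AM on March 8th, 2020 falls inside the "spring forward" gap:
# Pacific clocks jumped from 2:00 AM directly to 3:00 AM.
gap_time = datetime(2020, 3, 8, 2, 30, tzinfo=pacific)

# Round-tripping through UTC exposes the gap: a valid wall-clock time
# survives the round trip unchanged, a nonexistent one shifts forward.
round_trip = gap_time.astimezone(ZoneInfo("UTC")).astimezone(pacific)
print(gap_time.hour, round_trip.hour)  # 2 3 -- the 2 AM hour never happened
```

If your analytics pipeline buckets events by naive local timestamps, the 2:00-3:00 AM bucket on a spring-forward day is empty by construction, not because of an outage.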

In a startup, feedback cycles are shorter and there is pressure to launch something great, and to launch it soon. In such an environment it is easy to fall prey to statistical mistakes. So if you are a product manager responsible for your organization's key metrics, follow the mantra:

Be a skeptic: Trust no one — including data

Especially when you see interesting data that you cannot explain, doubt it.

References:

[1] Twyman's law, Wikipedia: https://en.wikipedia.org/wiki/Twyman%27s_law

[2] Ronny Kohavi, Twyman's Law and Controlled Experiments: https://www.exp-platform.com/Documents/TwymansLaw.pdf
