How to Spot Bad Marketing Experiments

When marketers and agencies experiment with a new platform or run A/B tests, they love to share their insights.

And hey, we love a good experiment as much as the next marketer.

But most marketers aren’t statisticians… so they often craft case studies that are full of statistical errors.

Why this matters

Many of these studies aren't representative of marketing as a whole.

And taking bad experiments to heart could hurt your own marketing.

So the next time you come across an experiment or case study…

Here are the red flags to look out for.

The study was conducted with a small or unspecified budget

We recently read a study that ran on a budget of less than $400.

On many platforms, that simply isn't enough traffic to produce representative results.

A few hundred dollars only buys a small sample, and the margin of error on a small sample is just too big to draw confident conclusions from.
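As a rough illustration, here's a quick sketch of the math. The budget, cost per click, and conversion numbers below are hypothetical (not taken from any particular study), but they show how wide the uncertainty gets at that level of spend:

```python
import math

def wilson_interval(conversions, visitors, z=1.96):
    """Wilson score 95% confidence interval for a conversion rate."""
    p = conversions / visitors
    denom = 1 + z**2 / visitors
    center = (p + z**2 / (2 * visitors)) / denom
    half_width = (z / denom) * math.sqrt(p * (1 - p) / visitors + z**2 / (4 * visitors**2))
    return center - half_width, center + half_width

# Hypothetical numbers: a $400 budget at roughly $2 per click buys ~200 visits.
# Suppose 10 of those 200 visitors convert (a 5% observed rate).
low, high = wilson_interval(conversions=10, visitors=200)
print(f"Observed rate: 5.0%, 95% interval: {low:.1%} to {high:.1%}")
# -> roughly 2.7% to 9.0%
```

In other words, the "true" conversion rate could plausibly be anywhere from under 3% to about 9%, a range wide enough to swallow most claims about a winning variant.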

Too many variables were being tested at once

There are lots of variables that go into marketing.

And if a study doesn't explicitly control for every variable except the one it's testing, you should be wary of the data.
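To see why, here's a toy simulation (the effect sizes are invented, purely for illustration) of a test that changes two things at once:

```python
import random

random.seed(0)

def simulate_conversions(visitors, rate):
    """Simulate `visitors` independent visits converting at `rate`."""
    return sum(random.random() < rate for _ in range(visitors))

# Hypothetical effect sizes: suppose a new headline adds +1% and a new
# image adds +2% to a 5% baseline rate, and the "experiment" ships both
# changes in the same variant.
BASELINE = 0.05
control_conversions = simulate_conversions(5000, BASELINE)
variant_conversions = simulate_conversions(5000, BASELINE + 0.01 + 0.02)

print(f"Control: {control_conversions / 5000:.1%}")
print(f"Variant: {variant_conversions / 5000:.1%}")
# The variant wins -- but nothing in the data says whether the headline,
# the image, or both drove the lift. The two changes are confounded.
```

Changing one variable at a time (or running a proper multivariate design with enough traffic for every combination) is what lets you attribute the result to a specific change.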

Data being presented as objective, broad insights

Any good experimenter will recognize that, although the data may be useful, it isn't the be-all and end-all for the industry.

If you ever see someone presenting their data as the objective, final truth, you should be cautious.

Don’t get us wrong… We encourage all marketers to share as much data as possible. That’s how we all get better at this thing!

But you know the classic adage: Don’t believe everything you read on the internet.
