At Toggl, we’re big on reading Kissmetrics, HubSpot and other marketing blogs, but at the end of the day, conversion success can come from the most random of places.

Last month we decided to (once again) jump on the April Fools bandwagon. Because when everybody’s doing it, you can kind of get away with it.

Long story short, we came up with the idea of replacing the Toggl front page video with a parody of office productivity. Using the Toggl staff as the cast, we shot scenes of people tracking their slacking under false time entries.

The Fake Video

For example, this is yours truly:

[Image: still from the Toggl April Fools video, featuring Mart]

This isn’t staged – I slack hard.

And this is our support team lead Grethel:

[Image: still from the Toggl April Fools video, featuring Grethel]

You can still see the video here, sans the timer and text overlay, which were added on the front end.

We got the video done fairly quickly. From storyboard to canned file, it took four workdays. The budget was limited to 7€ (the cost of an SD card adapter), though you could argue that using the staff’s time would bump that up by quite a bit.

But we were OK with the cost. The video was something very different from what Toggl had done in previous years. We also figured there was a fair chance it might cause ripples in the social media space, so we could write the whole thing off as a marketing success.

So we pushed the video live, set it to play for all users for the duration of April 1st (in their timezones), sat back and waited.
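As a side note, “for the duration of April 1st in their timezones” is a small gating problem in itself. Here’s a minimal sketch of how such a check could work – this is my own illustration in Python, not our actual front-end code:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def show_fools_video(user_timezone: str, now_utc: datetime) -> bool:
    """Show the parody video only while it's April 1st in the user's own timezone."""
    local = now_utc.astimezone(ZoneInfo(user_timezone))
    return (local.month, local.day) == (4, 1)

# Late evening UTC on March 31st is already April 1st in Tallinn,
# but still March 31st in Los Angeles.
utc_now = datetime(2015, 3, 31, 23, 30, tzinfo=ZoneInfo("UTC"))
print(show_fools_video("Europe/Tallinn", utc_now))       # True
print(show_fools_video("America/Los_Angeles", utc_now))  # False
```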


And waited some more.

And nothing happened.

The video absolutely failed to create any buzz on social media. It’s difficult to tell why. Perhaps others simply didn’t find it as funny, or perhaps people were just tired of the annual avalanche of tech April Fools’ tricks. Whatever the reason, the joke (like so many others around the web) bombed.


But just as it seemed we were done with the video, our leadgen manager Annika noticed a strange spike in new sign-ups on April 1st:

[Chart: new user sign-ups, showing the April 1st spike]

Week-over-week comparison between March 25th and April 1st.

The unusual Wednesday spike coincided with the fake video running in place of our standard conversion-generating cat piece.

Now, most mature marketers would’ve made some offhand remark about how sometimes weird stuff happens, disregarded the tiny bump and moved on with their work.

But humans (and marketers) can be an irrational bunch, and wars have been started for lesser reasons than this.

The Fake Test

Choosing not to ignore the small bump in sign-ups seemingly caused by the April Fools video, we thought it might be interesting to test the video’s conversion power for a week, and took the idea to the CEO’s table.

He was initially reluctant to experiment further with the video.

In general, we favour a “kill your darlings” approach – meaning that if something isn’t working, it’s best to kill the idea quickly and move on to other things. Since time is never in ample supply, this tactic helps us try many different ideas in a relatively short time.

As a joke, the video had not really produced the impact we had hoped for, and had thus seemingly earned its death sentence. But because the tiny conversion bump was just sharp enough not to ignore, we ended up giving the video a chance – this time not as a joke, but as a conversion tool.

We agreed to run the video on the home page for a week to see if it had any effect on traffic converting to sign-ups.

And as our numbers would have us believe later – it totally did.

But as hinted in the post title, the “test” was a fake one. Because our developers have plenty on their hands (including other onboarding tests), we had to run the fake video test on very basic assumptions, and without a control group.


If you listen very carefully, you’ll hear a thousand statisticians scream in terror.


Normally, with A/B testing you want to run two variants in parallel, each shown to a random selection of users. This helps minimise the effect of week-over-week variations in user behaviour. Simply showing one video to all front page visitors over one week carries the false assumption that any change in user behaviour on that given week was driven by that video.
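For illustration, a proper split could be as simple as deterministically bucketing each visitor into a control or a variant group, so both videos run over the same week. This is a hypothetical sketch – the hashing scheme and the 50/50 split are my assumptions, not how our front end was actually wired:

```python
import hashlib

def assign_group(visitor_id: str, experiment: str = "fools-video") -> str:
    """Deterministically bucket a visitor into 'control' or 'variant'.

    Hashing the visitor ID together with the experiment name gives a
    stable, roughly uniform split, so a returning visitor always sees
    the same video.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "variant" if int(digest, 16) % 100 < 50 else "control"  # 50/50 split

# Both groups run in parallel over the same week, so any week-over-week
# swing in user behaviour hits control and variant alike.
print(assign_group("visitor-123"))
print(assign_group("visitor-456"))
```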

For the same reason, we also didn’t track what happened to new Toggl users after they’d signed up on the home page. All we measured was new traffic converting into sign-ups over one week.

The test was as basic as they come.

Nonetheless, we had our green light. We ran our little experiment and waited for the numbers to come in. Once they did, we were in for a surprise – the conversion rate was up 18.2 per cent.

Which, if you know anything about marketing, is no small thing.

The Real Lesson(s)

So did it work? Certainly, 18.2% is a big leap. And looking at the weeks since the April Fools experiment, our home page conversion rate hasn’t fallen back. What does that mean?

It might just mean nothing.

I absolutely love this post by David Kadavy on how he managed to get statistically significant differences running A/A tests – that is, running identical experiments side by side. Sometimes our minds see what they want to see.
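You can see how easily that happens with a quick simulation (my own sketch, not Kadavy’s): draw two groups from the exact same 5% conversion rate, run a naive significance check, and repeat. Roughly one run in twenty will still come out “significant”:

```python
import random

random.seed(42)

def simulate_aa_tests(n_tests=1000, visitors=2000, rate=0.05):
    """Run many A/A tests where both groups share the SAME true conversion
    rate, and count how often a naive z-test still flags a 'significant'
    difference."""
    false_positives = 0
    for _ in range(n_tests):
        a = sum(random.random() < rate for _ in range(visitors))
        b = sum(random.random() < rate for _ in range(visitors))
        p_pool = (a + b) / (2 * visitors)                # pooled conversion rate
        se = (2 * p_pool * (1 - p_pool) / visitors) ** 0.5
        z = abs(a - b) / (visitors * se) if se else 0.0  # two-proportion z-score
        if z > 1.96:  # "significant" at the 95% level
            false_positives += 1
    return false_positives / n_tests

print(simulate_aa_tests())  # ~0.05: one A/A test in twenty "wins" by pure chance
```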

Maybe the video caused the lift, maybe it didn’t. Unless you run tests the proper way – with a sound understanding of how statistical analysis works, on the right assumptions and with the right sample sizes – you’re really just wasting precious time and energy at the expense of activities that actually matter.
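To make “the right sample sizes” concrete: the number of visitors you need per group is something you can compute before the test. The sketch below uses the standard two-proportion sample size formula; the 5% baseline conversion rate is a hypothetical stand-in, since we never measured a real baseline against a control:

```python
from math import sqrt

def sample_size_per_group(p1: float, p2: float) -> int:
    """Visitors needed per group to detect a lift from p1 to p2 with a
    two-sided two-proportion z-test (95% confidence, 80% power)."""
    z_alpha, z_beta = 1.96, 0.84
    p_bar = (p1 + p2) / 2
    n = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p2 - p1) ** 2
    return int(n) + 1

# Hypothetical: a 5% baseline with an 18.2% relative lift (to ~5.9%)
baseline = 0.05
print(sample_size_per_group(baseline, baseline * 1.182))  # ~9,765 per group
```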

But does this mean we should not have tried the test at all?

No, because one of the things you might miss out on when worrying too much about whether the test is set up properly is coming up with new ideas.

Yes, that’s a big one too.

There is real substance to the idea of “just going with it.” We could have just written a blog post about Toggl syncing its data with the NSA’s servers. Or we could’ve done nothing at all. But by giving a more elaborate prank a shot, and taking it out for a longer spin, we made way for new ideas in the future.

Yes, the test was overly simplified, and the results put the video’s converting power somewhere between Bigfoot and fair elections in truthfulness – but it did have the effect of inspiring other ideas. I don’t think we’ll be changing our home page video any time soon (I mean, the cat is just terrific!), but since toying around with the April Fools video we’ve been bouncing around a few other visual ideas we perhaps wouldn’t have thought of trying otherwise.

And let’s not forget about an added benefit – we now have more hands-on experience with video shooting and editing, which might save us precious learning time in the future.

But the real lesson for us was this – never stop doing new things. Just as you don’t want to get stuck doing endless A/B tests, you don’t want to get stuck in a status quo. Lack of a proper A/B testing platform should never be a reason for not trying out a new idea. Planning is important (if you’re flexible with it), but overthinking is a great way of shooting down good ideas.

“Lack of a proper A/B testing platform should never be a reason for not trying out a new idea.”

Looking beyond the point of valuing ideas, our little fake test was also great for learning about the dangers of A/B testing. We learn by practice, not by reading – our onboarding manager Liisa didn’t get good at her job by simply reading articles; she learned it by doing. Likewise, by analysing the numbers coming in from our little test, we learned to recognise potential false positives, discrepancies and fallacies.


When you read about online marketing, advertising, content marketing, lead generation, user onboarding and so on, you’ll be swamped by hundreds if not thousands of tactical guides, tutorials and how-tos. The good ones never fail to mention that there is no single answer.

And a lot of it comes down to saying “screw this meeting, let’s do this.”