Hacker News
How Obama Raised $60 Million by Running a Simple Experiment (optimizely.com)
211 points by dsiroker on Nov 29, 2010 | hide | past | favorite | 53 comments


"That would have been a huge mistake since it turns out that all of the videos did worse than all of the images."

It annoys me no end when websites have videos explaining their product and no equivalent text. I hate watching videos about products. Always make the video optional and secondary.


Yes, always. Assuming your audience is maxklein. Otherwise, best to test.


You're right that they should test, because their audience might be different. But I admit to being the same way as him: when I hit an article that's nothing but video, I skip it.

I can read a lot faster than they can talk and I could have read a lot of other things in the time it takes to watch a video. I also have a lot of things I want to read and nowhere near enough time to read them all. In fact, there was an article here on HN that linked to a bare video. I was going to read the article, but when I saw it was a video, I closed the tab immediately.

So be sure to test and make sure, but remember that there might well be a lot of others like me who would rather read than watch a video.


I'm similar. I'll watch about 2 minutes of a video and see if the presentation is engaging enough to warrant spending the time to watch the rest.

If I can download the video, I do that and watch it on my train ride home.

Otherwise, the window gets closed.


Obama's audience was massive: millions and millions of people. I believe that's representative of the general taste of people.


True, but extrapolating this single test with a particular audience looking for a particular type of information and being presented with these particular images and videos out to all websites is probably not the wisest thing.


Of course, that's true, but we can't test EVERYTHING. If a very large test has been made with a very general population, then I think one should just make the assumption that it will work similarly on your audience.

If we start testing everything, we'll just be stuck testing the whole time! There are an infinite number of things to test. If there is a large and credible study showing one particular thing to be true, and no "anti" study, then I will just accept it and not retest it. Rather, I test things unique to my product or presentation.

Running a test, particularly a multivariate test, is not an easy or quick task. It requires a lot to do it properly, and if I had to do hundreds of them, I'd be wasting a lot of time.
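For what it's worth, part of "doing it properly" is checking that an observed lift isn't just noise. Here's a minimal sketch of a two-proportion z-test; the sample sizes are made up, and the rates loosely echo the article's signup figures:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 8.26% vs 11.6% signup rate, 10,000 visitors per variant
z, p = two_proportion_z_test(826, 10000, 1160, 10000)
```

With samples this large the difference is decisive; with a few hundred visitors per variant, the same rates would not be.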

Most of the time, as a small company, one should not be focusing on whether a button is red or green, but on whether you are in the right business at all.

Such small-scale and detailed testing is really a way to stick your head in the sand, when larger optimisations or corrections could lead to a much much bigger jump.


I get what you're saying, and you're not wrong in general, but in this specific case, there are plenty of counter-examples. For example, there's an awful lot of affiliate marketers out there who have found that video squeeze pages convert at a much higher rate than text squeeze pages. And they only know that because they tested the shit out of the options, not because they prefer one over the other.

I hate watching a 5-minute video when I could skim-read a 30-second block of text instead. But I'm not necessarily representative of the population as a whole. And specifically as it relates to text over video, a huge number of people are finding that video converts better than text for their audience. YMMV.


>I hate watching a 5-minute video when I could skim-read a 30-second block of text instead.

But it may be that even if the majority of people feel that way, the people who do watch the video will convert better than the people who read or skim-read a block of text.


It would be interesting to know how Dropbox is doing. http://dropbox.com/ shows nothing but their logo, a big play button (with “Watch a Video” as a caption) and a big blue “Download Dropbox” button.

No explanation beyond the video (more than two minutes of runtime), not even a hint of what Dropbox does on that page. You have to click if you want to know what Dropbox is. Did they test that? Does it work because it is hard to concisely describe all the strengths of Dropbox? Maybe it works because the main vector for the recruitment of users is invitation emails, which do include a short description of Dropbox?


I like when an image can turn into a video when clicked on, that way it acts as another screen for explanation but also a video if I want to watch it.


The "Learn More" button and this whole page are spammy. I remember being annoyed by this splash page; I just wanted to get to the main site. When users saw the "Sign Up" or "Join Us" buttons, they probably realized it was an email phishing form and then searched for the hidden "Continue to Website" link. The "Learn More" button, on the other hand, seems to be what you are looking for, an escape, but when the button complained about the missing email, some users probably went ahead and entered it even though they didn't want to, because it seemed like the only way into the site.


Precisely. It doesn't take very many A/B tests on lead-gen forms to realize you can get much better conversion rates if you're willing to trick your users into clicking on stuff. "Learn More" isn't a better title for that button, it's a deliberately misleading title for that button. You can have a button labelled "Get a free puppy" and a button labelled "Spam me every 24 hours and sell my e-mail address to the highest bidder". Guess which one will convert better. Guess which one is honest.


The best thing about this article is that it is a giant ad. Informative, educational and useful, but an ad nonetheless.

Wonderful marketing and great article as well.


Most company blogs are giant ads. What matters though is this article still provides value despite being an ad. I agree that this is wonderful marketing.


Indeed. I thought it was particularly clever how he apparently used the campaign data, but ran it through his product to get the "experiment results" screen shots.


I assumed that was Google Website Optimizer or whatever they actually used back in 2008. If not, they should make that clearer.


This post is an ad and at the same time an effective demonstration of the usefulness of advanced multivariate A/B tests. This dual purpose content/ad is very relevant to this community because we care about usability and good UIs. So while being a self-promoting piece it is also a well written tutorial of sorts.


It was educational and informative. It wasn't until you mentioned it as being an ad that I went back to see who actually wrote the article. You increased its effectiveness by focusing on it...and now even further. ;)


I don't mind that to be honest, because it's a product that I may end up using. It provides a real value, just like the ad did. :-)


Kind of a side note, a complete noob when it comes to A/B, I tried out Optimizely after reading this and I've gotta say, this thing is put together well. I think I created 6 variations for our home page in about 2 minutes, saved it, and started watching results flow in. Amazing execution.


I completely agree. I just started using Optimizely today, and the ability to make a small change instantly across all pages on your site is extremely valuable.


He seems to assume that the contribution and volunteer rate of the added marginal people (those who signed up after seeing the best-performing media/button, but would not have signed up with the default media/button) is as good as that of the rest (those who would have signed up regardless).


He doesn't mention it, but that's easy to determine. Simply see what the average contribution of people who contributed during the A/B testing was per image/button combination. If they're not all about the same, then there may be some effect (negative or positive) from the added marginal people. If they're about the same, I think it's a decent assumption that it will be the same going forward.

And of course, you can simply validate it by seeing if it is the same going forward.
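A sketch of that check, assuming you have a record of which variation each contributor saw; the variation names and dollar amounts here are entirely hypothetical:

```python
from collections import defaultdict

# Hypothetical records: (variation the contributor saw, contribution in dollars)
contributions = [
    ("family_image + learn_more", 45.0),
    ("family_image + learn_more", 80.0),
    ("video + sign_up", 120.0),
    ("video + sign_up", 60.0),
]

# Accumulate total dollars and count of contributors per variation
totals = defaultdict(lambda: [0.0, 0])
for variation, amount in contributions:
    totals[variation][0] += amount
    totals[variation][1] += 1

# Average contribution per variation; large gaps would suggest the
# marginal signups behave differently from the baseline signups.
averages = {v: s / n for v, (s, n) in totals.items()}
```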


Simply see what the average contribution of people who contributed during the A/B testing was per image/button combination

We didn't measure the average contribution for each variation in this particular experiment, but in later splash page tests we measured eventual $ per pageview for each variation, and it was very closely correlated with % signup rate.


$400/month for 200K visits? The pricing seems to be based on how much they'll contribute to your bottom line rather than what the market will bear, what competitors are charging, whether they should focus on growth rather than early profit, etc. This might be a great opportunity for someone to clone the business and charge 25% of their prices, which will still be $250/month for 1 million visits.


You think the opportunity to be the cheap, crappy competitor with an audience of technically inclined poor people is attractive? I think that is terrible positioning. If you charged $50, that would still be a whole lot of ramen, and you'd lose your customers to the first still cheaper, still crappier alternative or to free options. (There exist compelling free options for A/B testing if one is technically inclined.)

$400 is not a lot of money when you can probably demonstrate increases in revenue. (It is a fraction of BCC's gains from A/B testing. Nationwide political campaigns, Fortune 500 companies, and real estate firms all have slightly more money to spend than BCC does.)


The experiment showed that video was a poor choice for Obama. Images worked better for him. However, I do not think anyone should generalize from this example. Obama was supported by specific demographics. The poor and working classes favored Obama to a disproportionate degree. These are the people most likely to still be on dial-up Internet connections, with the least access to broadband. Therefore, these are the people for whom video would be a poor choice.


I wonder why this was downvoted? What part of this do people disagree with? I made 3 main points:

1.) Obama had disproportionate support from the poor and working classes. This is easy to verify; many news organizations covered it. Here is one example:

http://geocommons.com/overlays/6095

2.) The poor and working classes have less access to broadband Internet. Is anyone seriously going to dispute this?

3.) Therefore video was a poor choice for Obama, since many of his supporters lacked broadband Internet access.

My friends, instead of downvoting, why don't you explain what you are thinking? Does someone disagree with my thesis? I feel like I'm stating something fairly obvious here.


I didn't downvote you, but I do think you're mistaken.

1) Your source cites the November 2008 exit polls. The campaigning mentioned in the article was being done through 2007, when Hillary Clinton still had her hat in the ring. Now, it's hard to use campaign donations as a gauge of support for presidential candidates (particularly among people who are too poor to even afford broadband). However it's easy to point out that Hillary Clinton's platform was more beneficial to the working class, whereas Barack Obama's vague Yes-We-Can appealed more to broadband-connected hipsters like me. As you can see from About's (pro-Clinton) US Liberals blog[1], half of the Obama campaign chest going into 2008 came from individual donors, whose email addresses were harvested on websites just like the one described in the OP.

2) I'll also dispute the claim that the poor have less access to broadband, having grown up debilitatingly poor myself. In America, internet access is cheap, and even for the poor, the value of a fast, reliable internet connection far outweighs the cost. Those who remain on dial-up connections are residents of low-density rural areas where broadband is unavailable. These people are a) unlikely to have been Obama supporters, and b) unlikely to have gone around donating campaign money on the Internet.

3) Barack Obama's was the most Internet-connected campaign in US history. If dial-up users were as large a chunk of his support base as you say they were, he would have spent a lot more time going door to door in Des Moines, and a lot less time on YouTube.

[1] I wanted to link to the New York Times article, but it's probably more informative all around if I just link to a blog which quote-mines it. At least About.com doesn't make you log in. http://usliberals.about.com/b/2007/07/19/hard-truths-about-d...


This is not a subject where anecdotal information reveals much, because the USA is diverse and few of us understand what all of the different population segments face, in terms of information access, or Internet access. According to Wikipedia the digital divide still exists, and income plays a role:

http://en.wikipedia.org/wiki/Digital_divide

While it admits to several definitions of the divide, it mentions broadband access as one aspect of the problem.

My original point remains: Obama had disproportionate support from those population segments that were likely to have limited broadband access, therefore video would be a poor choice for him.

The larger point is that you can not generalize from this one example, mentioned in the originally linked article. Video was a poor choice for Obama, but might be a good choice for a new web startup that is pitched to a demographic where broadband access can be assumed.


Of course the digital divide is affected by income, but you're missing key pieces of the story.

1) The digital divide between classes is negligible compared to the divide between rural and urban dwellers, particularly wrt broadband access[1]. Also, n.b. class based Internet analytics is prone to error, since the poorest segment of America lives in remote rural areas where broadband access is unavailable to begin with.

2) As I pointed out above, the OP was in reference to 2007, when the Democratic candidacy was still up in the air. Further, Barack Obama drew support (particularly individual campaign donations) from young Internet-savvy people, whereas the real blue-chip donations went to Hillary Clinton (and to a lesser extent, Chris Dodd). Your data is from the exit polls in 2008, after the Democrats had (more or less) formed into a united front.

This makes your larger point untrue[2]. The data from the Obama campaign indicates that these particular videos were less effective than the images, and to assume this is due to bandwidth complaints is specious at best.

[1]http://pewinternet.org/Media-Mentions/2006/Rural-areas-laggi...

[2] Well, it is true that you can't generalize from this one example, and that video may still be a good choice for web startups, but not for the reasons you suggest.


College students have better-than-average access to broadband via campus networks.


A majority of folks earning 200k+/year went for Obama, as well.


How many people in the USA make over $200k a year? And how many people in the USA can be classified as poor or working class?

If you make $200K a year, then you are in the upper 1% of income in the USA.

When I said "poor and working classes" I was thinking of the bottom 2 quintiles of income distribution. I think this is a fairly common reference. Certainly the usage is common in articles that write about the demographics of elections.

You realize there are more votes in the lower 40% than in the top 1%?


"Sending email to people who signed up on our splash page and asking them to volunteer typically converted 10% of them into volunteers."

Is it wrong for me to be a little startled by that conversion rate? I wouldn't have expected anything that high.


Is it wrong for me to be a little startled by that conversion rate? I wouldn't have expected anything that high.

It depends on how much of a commitment they're asking from volunteers. If you've already signed up on the website, it's not much more to volunteer to pass out fliers at your office. If they want you to work a benefit or something, then yeah, I'd agree it's a little high.

Keep in mind too that these are people who have already actively signed on to support the campaign.


Good points, though the best-testing design was the one that didn't phrase signing up as a commitment, but as an opportunity to "Learn More", hence my surprise.


That's a good point. When I was reading the article, I actually predicted that the "Learn More" button would win because it was the least committal, but I never drew the connection between that and volunteer signups.

A few theories:

1) Maybe the feeling of not committing by pressing the button made people more comfortable signing up to be a volunteer, figuring that it probably wouldn't be too much of a commitment

2) Maybe the content on the Learn More page was just so good that the only thing that really mattered was getting more people onto that page and they would be comfortable

3) Perhaps the type of person more likely to click after seeing the picture of the family (Stay at home moms perhaps) was more likely to have time/desire to volunteer.

As I think about it, assuming one of these ideas is right (which is probably not the case), I would guess that it's #3. I don't know the demographics of the people who volunteered, but around me it seems like it would be the moms who go to the site, and they would probably be more likely to be skewed toward clicking with the family image and the non-committal button, while also being more likely to have the time to volunteer.

Disclaimer: This is pure, assumption based, uninformed speculation of the worst kind.


As I understand it there is no "Learn More" page, just a button: you give your e-mail and zip code and press the button.

After that they send you the e-mail with the suggestion to volunteer. So 10% looks impressive.


As I understand it there is no "Learn More" page, just a button: you give your e-mail and zip code and press the button.

You are correct. All of the variations would redirect to the home page regardless of what the button said.

So 10% looks impressive.

Keep in mind that this is the percentage of people who volunteered at some point during the campaign after signing up on our splash page-- not the percentage of people who immediately decided to volunteer after signing up.


Speculation, yes, but food for thought.


If you wouldn't expect anything that high, and they got something that high, and you weren't startled, that would be wrong.

The startlement is an indication that your predictions were bad and you should update them.


Ask a question, get a meaninglessly literal answer...


You asked a meaningless question.

So you never expected anything that high. So ... what?

Why are you startled? What are you asking? In what ways could being startled be "wrong" of you?


I asked a question you aren't capable of grasping, for whatever reason. There's clearly no value for either of us in continuing this thread.


I asked a question and you prefer to snub and mock me than answer it. There's clearly no value for me in continuing this thread.


His question was, paraphrased, "Is there some obvious evidence I should have been aware of—perhaps because I was previously presented with it but ignored it—such that I would not be as startled about this as I am now? Did I make a misstep in the past which has led me to become the me I am now, who has to make such a large shift in perspective?"


Glad they were able to actually write a short case-study on such a high worth (I'm assuming) case. That's real transparency.


http://www.designing-obama.com/ => I snagged the PDF when it was still free. EDIT: Still free. See responses.

VSA Partners on the Logo:

Part One: http://www.youtube.com/watch?v=etEP1Bhgui0

Part Two: http://www.youtube.com/watch?v=ukIMW833EPE

There's a ton of info out there on Obama's campaign, it's really one of the best branding and marketing success stories.


The PDF is still free: http://digital.designing-obama.com/

You can also read "Barack Obama Social Media Toolkit by Edelman" : http://www.scribd.com/full/10807015?access_key=key-nc75wovez...



For any of those interested in seeing the Mixergy interview with Dan Siroker (founder of Optimizely), check out this link: http://mixergy.com/dan-siroker-optimizely-interview/



