A few weeks ago I challenged readers to design a better book cover for How to Fail at Almost Everything and Still Win Big. I promised that if one of the new designs tested better than my original art I would use the new design for the upcoming softcover book release. I embarked on this path because a lot of people told me the cover I designed for the book was hurting sales. I decided to test that hypothesis and perhaps improve things for the softcover release.
I predicted that a new cover design would not outperform the original. Today I have the results of the marketing tests. Before I tell you the results, do you think you can predict a winning book cover design?
First, some background on how I selected the covers for testing.
I picked three designs to test. One is my original cover. The second is from a professional designer who competed against other professional designers on a site called 99designs.com. That cover design is labelled Lilam. The third cover I tested was the “best” of the ones submitted by blog readers – according to my publisher and me – created by Daniel Thornton. (Nice work, Daniel of qualifycomics.com)
My publisher created Twitter card ads and ran them randomly to see which ones generated the most click-throughs. I expected each cover to have similar results. But one of them blew away the other two designs. It wasn’t even close.
Are you ready to test your powers of prediction? Here are the three designs as they appeared on the Twitter ads. Tell me in the comments which one you thought was the runaway winner in click-throughs.
Daniel Thornton’s Design:
Scott’s original design:
Lilam’s design:
Okay, if you have your guess in mind, here are the click results:
Scott’s Original Cover: 1,312
Daniel Thornton’s Cover: 481
Lilam’s Cover: 326
This is far from a scientific test, and one can see lots of holes in the method. But it seems an easy decision for me to keep the original art.
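For what it’s worth, the gap is large enough that even a rough significance check supports the “not even close” reading. Here is a minimal sketch using a chi-square goodness-of-fit test on the click counts, under one assumption the post doesn’t state: that each cover received roughly the same number of ad impressions, so equal clicks would be expected if the covers performed equally.

```python
# Rough significance check on the click-through counts.
# Assumption (not given in the post): each cover got about the same
# number of impressions, so under the null hypothesis each cover
# would collect roughly one third of the total clicks.

clicks = {"Scott": 1312, "Daniel": 481, "Lilam": 326}

total = sum(clicks.values())
expected = total / len(clicks)  # ~706 clicks per cover if all performed equally

# Chi-square goodness-of-fit statistic against a uniform split
chi_sq = sum((observed - expected) ** 2 / expected
             for observed in clicks.values())

# Critical value for df = 2 at the 0.05 significance level
CRITICAL_95 = 5.991

print(f"chi-square statistic: {chi_sq:.1f} (critical value: {CRITICAL_95})")
```

The statistic lands in the hundreds, far past the 5.991 threshold, so the spread between covers is very unlikely to be random click noise. None of this cures the methodological holes mentioned above (familiarity bias and so on); it only says the observed difference itself is real, whatever its cause.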
I have a few ideas about why my art outperformed the others. One suspicion is that people have seen the original cover art so seeing it again was just reinforcement. That would bias the test.
Another factor is color. My original art uses a color that has tested well on the Internet for attracting clicks. On CalendarTree.com we changed one button from green to burnt orange and increased clicks 13% with no other change. So I know my cover color was smoking the two competitors.
My cover design also has some intentional complexity that makes you stop and wonder how it will all turn out. Will the giant shoe crush the stick figure or will he leap to safety on the bag of money? The science would say that forcing you to stop and think will make a bigger impression. Daniel’s art depicts the aftermath of the fall and is straightforward. Lilam’s art has some mystery about how the petal-picking will turn out, but that might seem a bit overused as a metaphor.
My original hypothesis was that different cover designs for this book would test about the same. I was very wrong about that. And it does indicate how important it might be for publishers to do rigorous cover design testing. Could a good cover design really triple the click-throughs as my unscientific test suggests?
I also inadvertently confirmed the talents of my editor and publisher. From the start of the book project I resisted their suggestion that I create the cover art myself. I argued that I don’t have that specific flavor of talent and that we needed a professional designer. The evidence suggests I was wrong. But in my defense, I only did the line art. The title and color treatment came from the publisher’s professional designer.
The good news for me is that my original cover art probably helped sales more than it hurt. The bad news for me is that I don’t have a new cover design that will test better than the original.
I hope this was interesting to you.
Listen to me yakking with James Altucher on The James Altucher Show podcast.