Copywriter Bonks His Head, Loses Faith In Testing

Please, don’t call a doctor just yet.

Sure, I may be the last person you expected to downplay the role of testing in copywriting. After all, I’m the copywriter who tests before, during and after writing copy for clients.

And EVEN I am a bit hesitant to say this publicly, as I do think more people need to be testing their sales copy.

But there’s something I’ve noticed going on in some of the Internet Marketing forums that I frequent. Specifically the Warrior Forum.

People ask for advice on a specific strategy, tactic, technique or what have you… and instead of getting valuable feedback, they hear a loud echo: “You need to test!”

While that response is better than the untested opinions a large number of respondents chime in with, it’s also a response I find troublesome.

So I replied…

Sometimes I fear “You need to test” has become just a different way to say “I don’t know.”

Indeed, “You need to test” has become a cop out.

Yes, I believe everything should be up for a test at all times when reasonable.

But that doesn’t mean you start with two random variables and pit them against each other to see which comes out on top. There’s a better way.

In that Warrior Forum thread I likened this to waiting on a room full of typing monkeys to perfectly replicate Shakespeare. It may technically happen at some point, but it’s the long… even wrong way to go about it.

Instead of testing two random variables, you should start with a proven, successful model and start testing different variables from there.

And if that’s what you do, then asking others “what works” deserves a better response than shutting down the conversation with a call to “test it yourself.”

In conclusion, I think responding with “You need to test” can actually be the WRONG answer when you can find plenty of evidence already pointing in one direction if you choose to look.

And while testing may always be the final answer, modeling success should often be the first step.

10 thoughts on “Copywriter Bonks His Head, Loses Faith In Testing”

  1. Paul Hancox Oct 4, 2010 12:44 pm

    I agree, and in fact your post gave me pause for thought. I’m a big split testing advocate, but at the same time, just saying “Test!” can be a bit of a cop out.

    For example, copywriters know there are certain principles behind headlines, so just by looking at two different headlines, we may be able to predict which is likely to work better.

    As you say, testing gives the final answer, but modeling success and also using time-tested principles is a great starting place.

  2. Kevin Oct 5, 2010 1:39 am

    Stephen,

    I’ve always handled this by saying: I believe “X” will work, but you’ll have to test it to be sure.

    The truth is, one tactic will work like gangbusters for one list… while it’ll fall on its face for another.

    Beyond the obvious and well-known copy conversion devices, there are far too many variables to know what will work best — and when — without testing.

    So, I often give the best answer I can, with the testing qualifier. Not so much to give myself wiggle room, but because it’s honest, rather than playing the hot-shot authority.

    It’s embarrassing to give a strategic answer, and be wrong.

    Just my two cents.

  3. Stephen Dean Oct 5, 2010 12:07 pm

    Thanks Paul, happy to have your backing and proud to have made you think 🙂

    Kevin – That’s the perfect response in my opinion. It doesn’t stop the conversation, it adds to it.

  4. Dr Martin Russell Oct 6, 2010 1:06 am

    Two points Stephen.

    I like James Brausch’s testing suggestion
    http://jamesbrausch.com/blog/testresultsinternetmarketingforums

    But I also thought you might be interested to know what Jim Stone of SplitTestAccelerator.com is putting together for generating testing ideas. [The site is having a revamp this week – prompted by the feedback he got, I suspect!]

    http://www.optimizersclub.com

  5. Stephen Dean Oct 6, 2010 1:39 pm

    Thanks Martin. I forgot about James’ post, but I do remember reading it.

    For those who haven’t read it, James asked a series of questions on several different marketing forums.

    Each question only had 2 possible answers and only 1 correct answer.

    Pure random guessing would get the answer right 50% of the time. But the people who responded on the forums apparently got the answer wrong more than 80% of the time.

    And he reports that the Warrior Forum had the worst guessers, getting the answer wrong more than 93% of the time!

    I’m not sure whether James was baiting with his questions… deliberately picking questions where the right answer runs contrary to popular opinion…

    …but it really doesn’t matter.

    Apparently, so few people test for themselves that it’s hard to find a successful model to follow on IM forums.

  6. Jim Sansi Oct 7, 2010 8:16 am

    You’re hanging out in forums? 😉

    I think it’s easy to fall into that trap, especially when something is converting into sales and you don’t want to mess with it.

  7. Ryan Healy Oct 8, 2010 2:45 pm

    I really liked this post. I agree that “Test it!” is often a cop-out. Because there are certain principles of advertising that, in most cases, shouldn’t be violated.

    Now, when it comes to nitty-gritty details like, “Should I say Dear Friend or Dear Copywriter at the beginning of the letter?” then my answer is definitely… test it!

    …but only if it’s worth testing. Usually there are far better things to test: headlines, subheads, lead copy, etc.

    Ryan

  8. Stephen Dean Oct 8, 2010 4:58 pm

    Yup, I still hang out there periodically, Jim. For the right reasons, I think, and that strategy has been good to me.

  9. Stephen Dean Oct 8, 2010 5:04 pm

    Thanks Ryan, I’m feeling satisfied now that I have more backup 🙂

  10. Gogo Oct 8, 2010 5:05 pm

    Stephen,

    Thanks for an excellent post. Your answer within the post is the “right” answer if there can be such a thing.

    Think about how a scientist might approach an unbounded problem, one not fully defined.

    The process must start with a series of observations, upon which a hypothesis to be tested will rest.

    In marketing, the repeatedly proven and observed principles and phenomena would form the bases of such hypotheses… which could then be set up as A/B or other tests.

    The point is that there must be some fundamentally grounded principles which both shape the direction of inquiry/observation, and which in turn get to be shaped by the findings of your subsequent tests.

    Of course, Ryan also brings up a great point. Tests should be “strategically relevant” and prioritized in accordance with the overall objective.

    Great post.
