A/B Testing should not be "Ready, Fire, Aim"

David Byrd : Raven Call
David Byrd is the Founder and Chief Creative Officer of Raven Guru Marketing. Previously, he was the CMO and EVP of Sales for CloudRoute. Prior to CloudRoute, he was CMO at ANPI, CMO & EVP of Sales at Broadvox, VP of Channels and Alliances for Telcordia, and Director of eBusiness Development with i2 Technologies. He has also held executive positions with Planet Hollywood Online, Hewlett-Packard, Tandem Computers, Sprint and Ericsson.
Raven Guru Marketing: http://www.ravenguru.com/


Although it seems that the world moves at a lightning pace, the best things are still done after some thoughtful research, planning and excellent execution. 

A/B testing (also known as split testing or bucket testing) is a method of comparing two versions of a webpage, email or app against each other to determine which one performs better. For too many marketing people, it has become an acceptable path to developing digital marketing and inbound marketing campaigns.
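In practice, the "split" simply means assigning each visitor to one of the two versions in a stable, roughly even way. Below is a minimal sketch in Python of one common approach, deterministic bucketing by hashing a visitor id; the function name, experiment label and use of MD5 are illustrative assumptions rather than a reference to any particular testing platform.

```python
# A minimal, assumed sketch of how visitors might be split into an "A" or "B"
# bucket for a test. Not tied to any specific A/B testing tool.
import hashlib

def assign_variant(user_id: str, experiment: str = "landing_page_test") -> str:
    """Hash the visitor id together with the experiment name so each visitor
    always sees the same version for the lifetime of the test."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("visitor-1001"))  # e.g. "B"
print(assign_variant("visitor-1002"))  # e.g. "A"
```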


The simple concept behind A/B testing is to evaluate landing pages, emails and other digital lead generation elements by altering design, content or subject lines to improve conversions. After the variants have run, their conversion rates are compared and the best performing version is used to communicate with the larger pool of prospects. The assumption is that the design, content or subject line with the highest conversion rate will best serve the objectives of the marketing campaign. However, too many digital marketers use A/B testing as a crutch, neglecting to perform the necessary research and analysis of their markets and product receptivity. Instead, they build lead generation campaigns and continually change them, hoping to eventually hit upon a winning combination.
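To make the comparison step concrete, here is a minimal sketch in Python of how two conversion rates might be compared. The visitor and conversion counts, the function names, and the choice of a two-proportion z-test are illustrative assumptions, not anything prescribed by a particular testing tool.

```python
# A minimal sketch of comparing the results of an A/B test.
# The visitor and conversion counts below are made-up illustrative numbers.
from math import sqrt, erf

def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted."""
    return conversions / visitors

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-statistic and two-sided p-value for the difference
    between two conversion rates (normal approximation)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided p-value
    return z, p_value

# Hypothetical results: version A (control) vs. version B (new subject line)
z, p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=150, n_b=4000)
print(f"A: {conversion_rate(120, 4000):.2%}  B: {conversion_rate(150, 4000):.2%}")
print(f"z = {z:.2f}, p = {p:.3f}")  # only declare a winner if p is small enough
```

The point of the statistics is simply that with too little traffic, the "winning" version can easily be noise, which is one more reason a test cannot substitute for research and planning.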

“Ready, Fire, Aim” was popularized by Ross Perot as the antithesis to “Aim, Aim, Aim”. The point was that too many companies suffer from “analysis paralysis” and never get to the execution stage. However, “Ready, Fire, Aim” was never meant to replace all logic and research; a reasonable understanding of the direction in which to fire was still required before pulling the trigger. Too many marketers are using A/B testing without performing even rudimentary research. A/B testing is most successful when you have clearly established the target audience, products and messaging that best communicate your value proposition and business. The better marketing professionals use A/B testing to achieve incremental performance gains through minor alterations, rather than expecting dramatic improvements. A/B testing does not replace good data collection, interpretation and planning. If those steps are skipped, the underlying problems with your prospecting, marketing campaigns, pricing, promotions, etc. will remain.

While A/B testing is an important step in optimizing conversion rates, it is not a shortcut to marketing excellence. So, as you leverage A/B testing to improve your lead generation, make sure you're not forever stuck cycling through “Ready, Fire, Aim”.


