A/B (or split) testing refers to the practice of comparing the effectiveness of two versions of a web page. Typically when we talk about A/B testing it's in the context of a PPC, email, or other campaign designed to drive traffic to those pages. There's no doubt that this sort of testing can yield valuable results and improve the effectiveness of your campaigns. However, A/B testing outside of a specific campaign can also be a valuable tool for improving both the user experience and the content on your site.
I'll use an example from a website that I manage. I was tasked with creating a new section of the website for a new medical service. I've done this sort of task many times in the past, but this project was unique because the service was available to both men and women, in very different forms. With this in mind, the new section had to appeal to both men and women and encourage both groups to engage and click through to other pages to learn more.
I decided that the best way to appeal to the different audiences was to split them up from the beginning. I created a landing page for the new section that immediately made it clear that women should follow one path and men should follow another. My idea was that both groups would really only be interested in what specifically applied to them.
In the interest of ensuring that I was creating the most useful pages possible, I decided to run some A/B tests against a page that took a different approach. The second page started off with more generic information about the service: an overview and a few paragraphs discussing its key benefits, with the opportunity to dive into gender-specific details toward the end of the page.
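For context, the core mechanic of a split test like this is just assigning each visitor to one variant and keeping that assignment stable. The testing tool handles this for you, but a minimal sketch of the idea (the function name and the use of a visitor ID such as a cookie value are my own illustration, not anything from a specific tool) looks like:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to a test variant.

    Hashing a stable visitor ID (e.g. a cookie value) keeps the split
    roughly 50/50 across visitors while ensuring that a returning
    visitor always sees the same version of the page.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

The deterministic hash matters: if a visitor saw version A on one visit and version B on the next, their behavior would muddy both buckets.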
After setting up the testing I sat back and waited for Google to tell me that I was right. :) As it turns out, I was completely wrong. Visitors who came to the second page were much more engaged, going on to spend more time on the site and visit more pages than those who landed on the first version. It appeared that general interest in the technology drew more engagement than information targeted at each specific audience. In the end, being wrong wasn't a bad thing, since I was able to correct my mistakes and develop the most successful page possible.
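Before declaring a winner, it's worth checking that a difference like this isn't just noise. The testing tool reports this for you, but the underlying check is a standard two-proportion z-test. Here's a minimal sketch; the visitor and click-through counts below are invented purely for illustration:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/conv_b: number of converting visitors (e.g. click-throughs)
    n_a/n_b: total visitors shown each variant
    Returns the z statistic and two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the two samples to estimate the shared conversion rate
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/1000 click-throughs on variant A
# versus 160/1000 on variant B
z, p = two_proportion_z_test(120, 1000, 160, 1000)
```

With these made-up numbers the p-value comes in well under 0.05, so the difference would be worth acting on; with smaller samples the same gap might not be.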
I went on to test video content and headlines and further optimize this particular section of the website. The end result is content that is more engaging and relevant to the user, which in turn drives more leads and a better return. If I hadn't done that testing, I never would have known what I was missing.
I would encourage you not to think about A/B testing only when you launch a new campaign. Look for opportunities on your site to improve the user experience, and see what you can learn about your customers and how they engage with your site.