MarketingSherpa Video Archive

Landing Page Optimization: How CRC Health transformed decision-making across 140 sites

Daniel Burstein, MECLABS, and Jon Ciampi, CRC Health Group



In this MarketingSherpa webinar replay, you will hear from Jon Ciampi, Vice President, Marketing, Business Development & Corporate Development, CRC Health Group. Ciampi and Daniel Burstein, Director of Editorial Content, MECLABS, discuss how Ciampi chose to engage in optimization and A/B testing, and the results of that testing, which included a $500,000 profit increase in the first five months along with an astounding 14,000% increase in CTR for branded PPC ads.

In regard to achieving testing approval from upper management, "The biggest impact, if you can, is showing the result of an A/B test. If you can show a good result, that will get everybody's attention. When you come back with a 20%, 100% or sometimes a 300% lift, all of a sudden people's eyes are turning, their ears are open and they want to listen to what you have to say. But stay away from details, because if you get too into the details, the conversation quickly ends," Ciampi said.

In this webinar, you will learn:
  • The results of a landing page test from one of CRC Health's rehabilitation facilities

  • How you can use testing to reliably base your decisions on evidence in the marketplace

  • Why you should not stop at the basic learning of a test, but look deeper into what it is really revealing about customers

  • And much more


Download the slides to this presentation

Related Resources

Optimization Summit 2013 Wrap-up: Top 5 takeaways for testing websites, pay-per-click ads and email

Web Usability: Long landing page nets 220% more leads than above the fold call-to-action — from MarketingExperiments

Online Testing: 6 test ideas to optimize the value of testimonials on your site — from MarketingExperiments


Video Transcription


Burstein: Hello, and welcome to another MarketingSherpa webinar. Thank you for joining us. Today we're going to be talking about landing page optimization. But we're really going to be talking about so much more than that. We're really going to be talking about how to make business decisions. How to make decisions for your marketing. How CRC Health is making decisions for more than 100 websites based on their landing page optimization and A/B testing.

So to do that we're going to be talking to Jon Ciampi, the VP of Marketing, Business Development, and Corporate Development for CRC Health. Thanks for joining us today, Jon.

Ciampi: Thanks, Daniel. Good to be here.

Burstein: So Jon is out in Cupertino, Calif. I am Daniel Burstein, Director of Editorial Content at MECLABS. I'm here at our headquarters and studio in Jacksonville Beach, Fla. I mentioned Cupertino because in the pre-call, Jon was telling me how he's right next to Apple Computer and has an IT background. I know we always have a lot of IT questions, so don't feel like this is just about healthcare. If you have any IT questions or B2B questions, ask them, because what Jon did, at the core of it, is about learning more about your customer.

If you're not familiar with MarketingSherpa webinars, let me give you a quick primer. This is a little different than other webinars. We're not giving a presentation today. We're certainly not trying to sell anything. We are asking a very high-performing marketer, that would be Jon, not me, lots of questions, lots of your questions. We got questions when you registered for this webinar.

We have many questions, but you can ask questions live on the line now too. You can ask questions through the Q&A function on ReadyTalk, right there. You can also go to #SherpaWebinar on Twitter and ask your questions. I will pose them to Jon. On #SherpaWebinar, you can not only ask your question but you can share what has worked for you with landing page optimization and A/B testing. You can find peers from around the world to run tests by. Great way to network. #SherpaWebinar.

I will also be sharing some background information about Jon's story. We've done several reports with Jon. We've done case studies with him. He presented at Optimization Summit in Boston. I'll be sharing some of those links on #SherpaWebinar while we're talking.

With that, let me give you some quick background about Jon. As I said, he works at CRC Health. He is responsible for all web sales at CRC Health. It is a $450 million revenue company. Very interesting backstory: Jon has two patents pending that involve machine learning. Like I said, he's got that IT background as well.

Let's jump in by giving you a better understanding of CRC Health, and kick it off with a question here from Werner. He's a general manager. He said, "I deal with the smallest medical device against pain. It works against pain in the back and in all joints. Is it recommended to create separate squeeze pages for every organ? I wish to work mostly via email marketing."

So I asked this of you, Jon, because I wanted you to tell us a little bit about CRC Health and about the decision-making challenges you had, because you have several brands to work with, right?

Ciampi: Yeah, we have 140-plus facilities across the United States. We manage over 300 websites. Only about 140 of those are for the facilities. Then we have these marketing sites. We deal with everything in behavioral health from drug addiction and alcohol addiction to eating disorders, all the way through to behavioral disorders. It's a quite broad spectrum of facilities and services that we offer in terms of marketing.

Burstein: As we look at your challenge, that was part of the challenge you faced: there were many different marketing decisions you could make. I wanted to look at some questions from our audience here and see how they tied into your challenge. Jon, who's in sports marketing, asked a very general question: how to fully market your company's potential. Which, at the core, I think, Jon, was the challenge business leaders were throwing at you, the one we're going to talk about in a minute.

I wanted to dive into this question from Paula. She's a UX consultant. She says, "How are you able to convince management on the value of A/B testing for the first test? What are the most important things to test on the page?"

Jon is in a unique position in that he is part of management. I wanted, Jon, to get your answer to the question in terms of how you would want someone to present testing to you. Now that you're aware of it, if you had a marketing manager or marketing director, how would you want them to come to you and say, "Hey, I need budget for A/B testing. This is really helpful stuff."

Ciampi: Yeah, I think the interesting thing is, the way to present it is to pull yourself out of the marketer's world, because executives don't want to hear about A/B testing. They already thoroughly understand the concepts behind it. I can promise you most executives understand probability, understand statistics, and understand how A/B testing works.

They may not understand it on the web but they understand the concepts. They didn't get to where they are by not understanding them. But when you dive into that language they start to tune out. They just go, "What are you talking about?"

It's similar to what you have on the slide. That's the question we always get: should we spend more money? The way I feel you get the best results with A/B testing is to either pick something that you want to test and say, "Hey, look, we would like to try this idea. Here's roughly what it costs," and not even get into the A/B test, or, if you can, the biggest impact is to show the results of an A/B test.

If you can show a good result that will get everybody's attention. As we talked about Daniel, when you come back with a 20% lift, 100% lift, sometimes 300% lift, all of a sudden people's eyes are turning, their ears are opening and they want to listen to what you have to say. Make sure you just stay out of the details. If you get into the details the conversation quickly ends.

Burstein: As we talk about getting out of the details, I think it's a great time to talk about measurement and metrics. What we have up here are some of the results that Jon realized from this really awesome case study. We have a question here from Jennifer. She's a senior manager. She wants to know about measurement.

"What is the best metric: clickthrough rate, conversion, etc.?" So on two levels, Jon. What would you recommend? What kind of metrics should marketers be talking to business leaders about as they try to get by and get budget for A/B testing. Also what kind of metrics are you looking at? What kind of metrics are important to you?

Ciampi: Yeah, that's a great question, and it varies. The most important metric, I feel, is to talk about either profit or revenue. It depends on the organization, where your leaders are focused and kind of how the organization is run. They'll either be focused on profit or revenue. You'll hear fancy terms like EBITDA as another form of profit. I always start there because those are hard dollars that finance understands, the CEO understands, and any marketing leaders understand. They all get that.

I don't like to focus too much on that, though, because we do marketing and not everything is measurable. So when we run our A/B tests, we account for what we call the spillover effect. Meaning we run a bunch of ads, and some people saw our ads but maybe didn't activate right there; they activated in some other way that we didn't measure. And that's very difficult to measure.

We actually back up the funnel and we say, OK, if we ran a bunch of ads, how many leads did we get? Did we generate more calls? Did we generate more interest? So we use the profit as a way of putting the rails on our budget. But we use the leads as a way of showing week-to-week, month-to-month performance changes, because there are lots of things that can also affect the profit or the revenue depending on what's happening down funnel in other activities that are going on in the organization.

Burstein: So we just got a great question from Mariana, Jon, and it kind of challenges what you said about presenting increased results to business leaders. She said, "What if the results are drastically worse, but still not statistically relevant? Just wait?" So Mariana, we're going to get into statistical relevance.

We're going to touch on it briefly a little later in the webinar. But Jon, I wanted to get your thoughts on negative tests. What should someone do with negative tests?

Ciampi: That's where you've got to become a real marketer. You have to talk about the negative tests in terms of what you learned. I think MECLABS and Flint always talk about how every test has a learning, whatever the result was. The negative ones are hard. I had to swallow a few of those pills and I never liked doing it.

When you actually start to see the results come in and you see them negative ... First off, if it hasn't validated yet, wait. Unless you absolutely feel like, "No, we're just pouring money down the drain." But the biggest thing is, "What did you learn?"

I feel that that's where marketers really have to step out of the technology and really start to understand "what were we trying to do," and "why didn't the customer behave in the manner that we thought they would behave?"

Burstein: So, let's talk about that, Jon. Let's talk about what you learned in one of your tests here. We had a question from Sean. He's in business development. He said, "Can you show examples, templates of your highest converting landing pages, please?" Actually, my colleague Jon Powell, in our sister brand MarketingExperiments, recently had a Web clinic that showed three high-performing page templates that we here at MECLABS have tested over many, many different sites. If you message me on #SherpaWebinar @DanielBurstein, I can share that with you, Sean.

Let's take a look at a specific test from Jon here. This is not a template per se, but it's a landing page they tested. Jon, can you tell us a little bit about Sierra Tucson and how you were trying to test for value?

Ciampi: Yeah, Sierra Tucson is a behavioral health facility for treatment of addictions, eating disorders and other mental health issues. It's located in Arizona. It's high end, meaning it attracts typically a wealthier population.

We were focused on what needs to be communicated to customers. Now we're talking about the top of the funnel. That was probably one of the biggest mindset changes. We had a short squeeze page, I think someone used that term in the previous question, and it didn't provide a lot of information.

What we felt was, since you're dealing with a higher-income family, they were looking for more information. We just had a hunch that said, "You know what? We don't believe that this page provides enough information, when you're talking about those kinds of dollars, to get somebody to contact us." And that contact button is a huge hurdle.

If you think about anyone who's ever suffered from any behavioral disorders, it's a huge hurdle to all of a sudden expose yourself and to talk about that issue. So looking at our page the idea was "Well, wait a minute. Are we really giving enough information for them to make that decision? Is it enough value?" That's when we looked at redesigning it, and we challenged several different paradigms out there on what these types of landing pages should look like.

Burstein: Here you can see the control. If you're someone who really likes to get into the creative samples, on the MarketingSherpa SlideShare, we've posted these slides. You can take a deeper look.

We're going to be tweeting the link to the SlideShare through #SherpaWebinar on Twitter. We'll also have a video replay of this webinar that we're going to be sending out to everyone so you can watch and take a closer look at the controls on that replay and you can share it with your peers who weren't able to make it today.

Looking very briefly at the control, Jon, this is what you were talking about. It's an average short-form page, and you've got a call-to-action along with that form in the upper right, correct?

Ciampi: Well, yeah, if you actually add up all the calls to action, there are probably five or six on that page. Which is also a sign of desperately trying to get somebody to contact you. Which was also an issue we saw: overselling.

Burstein: Please, please, please, please … contact us. Now, let's take a look. This is the experiment page, Jon. That's just the top of the page. As you can see on the left, this is a long page. This was in some ways a daring test. Here let's click through this. There you go. The form is way at the bottom there.

First of all, I have to ask: are you in your right mind, putting a form that low on a page? But let me instead ask you some questions we had from our audience about forms. Nikos, a marketing manager, wanted to know your thoughts about maximum form fields versus minimum form fields.

Shelly, a digital marketing executive, wanted to know if there's any advice on forms: whether you should put them on the landing page or link to them from the landing page. Do you have any thoughts about that?

Ciampi: For us? Yes. The shortest form, with the fewest fields, was the most successful. We wanted to keep people in the flow of the page. We had been building up to the form all the way down the page. We did not want to have them click off to another page. I felt that just added an extra step to the process.

I didn't do it at the time, Daniel, but it would have been great to take a poll. Because I guarantee 99% of the people would have said not only would this page perform worse, but they hated the layout and design. They loved the colorful pictures on the control. So, it was interesting seeing people's emotional feelings on the page versus the theory behind why we chose it.

Burstein: Yeah, when we see these pages side by side, we can really see the drastic differences there, of just how long that page is. I mean, aside from it being less colorful, less emotional, with fewer pictures, just how long that page is and how low the form is at the bottom.

So Jon, I'm just going to ask you a few questions from our audience here. Just think about them for a moment because I want to share one other thing. Then we're going to get to the results of this test. Nick, an online marketing coordinator wants to ask, "What is the best placement for a lead generation form?" As you can see Nick, there are two drastically different placements in this test. An upper right for the control, and the very bottom of a long page for the treatment.

Danielle, a director of marketing, wants to know, "How much copy is the correct amount for a B2B landing page?" Again, that treatment has significantly more copy. And Ben, a marketing programs manager, wants to know, "How much copy do you recommend putting on my new page as we sell relatively complex enterprise software and struggle with copy length." Jon, I think you have a complex sale too here. It's a little different obviously than enterprise software.

Think about those for just one moment. We're going to get to the results in just one moment. First, I want to tell you, the audience about Lead Gen Summit which is coming up end of September beginning of October in San Francisco. You can learn more at MECLABS.com/SanFran. We are fortunate enough to have Jon Ciampi with us for Lead Gen Summit.

What you're looking at, these slides here, is something that Jon presented at Optimization Summit. He let me interview him on stage and ask him tons of questions about this. Unfortunately, we're not going to have time today to get through his entire Optimization Summit presentation. But hopefully enough to give you some real value that you can take back to your office and make a difference with.

If you can join us at Lead Gen Summit in San Francisco, you can see Jon's full new presentation there. We were talking just the other day. He's got some interesting new landing page tests. He's got some great email marketing tests, too, where he used automation and triggers, which I know is a major topic for lots of Lead Generation marketers. If you can join us in San Francisco we'd love to see you there. You can see some more information at MECLABS.com/SanFran.

So, let's dive into the results. And let's think about a few of those questions for a second, Jon. Do you have any thoughts on, one, placement of forms on lead generation pages and two, copy length for landing pages?

Ciampi: That's a good question. I don't know what will work for other businesses; it all depends, and each is unique. But to give some guidance, at least, here's what we learned.

For people who are coming to your website or landing page with high motivation, meaning they want to engage right away, we wanted to put something right at the top. We do put a phone number there. They're just going to engage right away; they don't need to read the copy. But what we found is that, by understanding the psychology and where these people are, we know that they need certain pieces of information put in front of them before they're going to make the call. So we put one after we get through those pieces of information.

If you notice, also, we don't provide any navigation. You can't leave the page. That way we make sure that we get the information we want across. What was the second question, Daniel?

Burstein: The second question had to do with copy length.

Ciampi: Copy length is interesting. We always go back to, I forget who said it, but "cross out every third word." We always try to go shorter is better. But what we're really looking at is, have we answered the questions that are causing people to resist taking the call to action? And we keep honing in on that.

This is where we work across departments. We work with our call centers. We work with our sales teams to understand what questions they're getting. Then we actively change the page and see whether or not we get better conversion rates. So, we're constantly fiddling with copy, placement, information, to see what's happening.

We do that for two reasons. One is to have better conversion. Two, we also clean up the funnel. Meaning we weed out any bad leads that we're getting. If we know that people aren't going to be qualified, let's put in information that weeds them out.

Burstein: We also have a comment here from Brad. He says, "Dan Kennedy, whose success goes without saying, has sales pages that are 10-plus pages long." So I'll point to the results of these tests, which you can see here, and to Jon's comments. We'd love to give you the exact right answer here: "You want 200 words. Then you want the form right there in the middle of the landing page." But there's really no one right answer.

This worked for Jon, and he found that out by testing. If he'd just followed best practices, for example, "keep landing pages short" as some people say, and "put the form above the fold," Jon would have lost out on this 220% increase. So Jon, let's talk about these results for a second. We had a comment here from Sasha, who thinks the number of leads they receive will vary based on maybe just who is coming to these pages.

But you didn't see the number of leads vary right? You had a pretty clear winner here with pretty significant results?

Ciampi: Oh, yeah. This wasn't just leads. This was actually to revenue. So this was based on admissions. This is a very hard number. The second thing that was pretty interesting about it was it was clear every time we ran it. We took this page and actually ran it on another facility, and it was very similar and it worked. But then, we took it to other facilities that were very similar and it didn't work.

So what we were able to determine with this was that this page works for this facility. And it worked every single time. I know that if I put it up there against anything else that we've already tested, it's going to work, or have the same predictable outcome.

Burstein: Talking about that predictable outcome, Jon knows it's predictable because these results are significant; they're statistically significant. We have a lot of questions about that. I'm going to read just a few, then I want to address that point at a high level, so everyone on the call who is maybe not familiar with A/B testing understands what it will take to learn something like Jon has. You can't just go with some very basic numbers.

Jeff had a question: "What's the minimum level of viable records needed for an A/B email test?" We had Jody, a senior marketing specialist, asking, "What is the sweet spot for the number of variables in a successful A/B split? Subject line, content, layout, etc." Marko, a marketing manager, wants to know, "I manage a website with about half a million visits per day. How long should my A/B testing go? I could collect data on a split test from 100 visitors in a few hours; would it be enough?"

And Linda, a partner, wants to know, "Do you also do A/B testing on email subject lines and or content? If so, how long after the test do you send the real email? Is it important to maintain the same time of day, day of week as the test?"

Jon, you can pitch in too if you have any thoughts. I wanted to read all those questions together because they get into a really core principle: when we're testing, the numbers can mislead you if you don't understand statistical significance. So I just wanted to tell you about a few terms. You can go to MarketingExperiments.com and learn a lot more about them, but we don't want to guide you wrong, where you just put two different landing page treatments in an A/B splitter and assume that it works.

So, the minimum level of viable records, to Jeff's question: that's going to really vary based on the results you get, right? You're going to need more and more records if the results are closer together. So for a very, very big difference like this 220%, you need fewer people to test it on. Because what we're really trying to see with the test is: are two things different? When the results are pretty close, it's harder to say if those two things are different, so we need to show it to more and more people to see if that represents the real world.

When the results are drastically different, for example, and I'm not saying this is valid, but if you had only eleven people and ten went one way and one went the other, you could probably say pretty confidently that the ten is the winner. As opposed to if six went one way and five went the other. So the sweet spot for the number of variables will also be affected by how much difference you have and how many people you send it to.

So if you do an A/B test, you're able to do that with a smaller number of recipients. Because if you do A/B/C, A/B/C/D or A/B/C/D/E, for example, the more variables you test, the more you're going to take that audience you have and split it down into smaller and smaller segments. So the bigger the audience you have, and the bigger the difference you see, the more variables you'll be able to test.
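As a rough sketch of the sample size point Burstein is making, here is a minimal two-proportion z-test, one standard way to check whether two conversion rates really differ. This is not necessarily the method MECLABS or CRC Health used, and the function name, visitor counts and conversion counts below are hypothetical, chosen only to mirror the big-lift versus small-lift comparison.

# A minimal two-proportion z-test sketch (hypothetical numbers).
from math import sqrt, erf

def two_proportion_z_test(conv_a, visitors_a, conv_b, visitors_b):
    """Return (z statistic, two-sided p-value) comparing two conversion rates."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)      # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))       # two-sided
    return z, p_value

# A large lift validates on modest traffic ...
print(two_proportion_z_test(10, 500, 32, 500))   # ~220% relative lift, p < 0.01
# ... while a small lift on the same traffic is still inconclusive.
print(two_proportion_z_test(10, 500, 12, 500))   # ~20% relative lift, p is about 0.67

With 500 visitors per treatment, a lift on the order of 220% is clearly significant while a 20% lift is not yet, which is why smaller differences, and more test variations, demand far more traffic before you can call a winner.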

To Marko's point, he wants to know, if he has a half million visits per day can he just collect data for a few hours and then be able to call a winner. One thing he's looking at is if it's statistically valid, which is important. So with that number, he might be able to say it's statistically valid, but I want you to keep in mind that there are other threats to testing as well.

One is the history effect. If he just runs that for a few hours on a Wednesday morning and then says, for example, that one headline is better than another, what he might not realize is that his audience and traffic change over the week, and his audience on a Wednesday morning might not be the same as his audience on a Saturday morning. You also have to look at how long you run a test for, to see if you're taking the history effect into account and making sure that whatever you learn is true for the audience you are showing it to. So keep in mind that audiences might change over time.

That's why Linda asked, for email A/B testing, how long she should wait after the test before sending the winner. That's the other thing to keep in mind. Really, the bigger, higher-level question to ask is: why am I getting the results I'm getting? For example, if I'm sending this at eight to 10 in the morning, would the results be the same if I sent it at two in the afternoon? Is that the same group of people? Do they resonate with the same things?

So as you see, it really all comes back to learning. That's why I wanted to talk about what Jon learned from this test. When I first saw this test I thought, "Hey, that's a great example that you don't always want to put the form above the fold." Jon learned a few more things, and I'm going to ask him about them in a moment along with this question from Cheryl, a nonprofit consultant. She says, "Testing is also used in the nonprofit sector to evaluate different types of outreach. How do you see your advice as it pertains to nonprofits securing new donors, and donor retention?"

Jon, I think what you really learned was not necessarily something about the health care field. But what you really learned was about your customers and how to learn about your customers. So, how would that apply to nonprofit and really any field of any marketer listening today?

Ciampi: You know, I think that's the broader question, Daniel. Because now we've seen this in many industries: we assume that people are coming to us because they want to buy. And most often the initial engagement is, can I trust you? I'm not here to buy. Maybe if it's a widget and I know I can get it everywhere and anywhere, then I don't care as much.

But that was the "aha" in this. And it's resonated all over which was; we start putting out all the information on our services and how great we are and we do all this chest pounding. We haven't actually told them; "You're handing us your loved one. You can trust us. We're here to help. More importantly, we're going to help."

Until you break through that ... I mean, think if you walked into the doctor's office and you don't trust the doctor. You don't care what the doctor says. You just want to get out of the office. That was really the one thing nobody spoke about, nobody talked about; I had never heard it.

In all the testing and all the efforts that we went through nobody talked about trust. We were always focused on luxury, thread count, Le Cordon Bleu certified chefs, everything around the luxury aspect. Here the test came back and really it was an eye opener for us. And it wasn't just in marketing. It was across all aspects of the business.

Burstein: So let's take a look, in the time we have remaining. You said it's across all aspects of the business, and what you learned from this test also helped with your pay-per-click advertising. So, do you want to tell us a little bit about what you were doing with your pay-per-click advertising?

Ciampi: Yeah, it was similar. At the highest level, we break our pay-per-click into two categories: branded and non-branded. Are we using the facility name or not? These are local facilities, or they have a local name presence, brand, all that.

What was really interesting was how we were putting the ads out there. If you look at the left ad and you look at the right ad: "Sierra Tucson Care Center considered a top recovery clinic." Very chest-pounding, company-focused. I don't exactly know what "top" means, but that's what it was. "Get a free assessment. Call now." I love the "Call now" with the exclamation point, because if you don't ... I don't know. It's just very sales-y, company-focused.

The non-branded one: AZ stands for Arizona; it's a geo-targeted ad. "Alcohol detox facility ... exclusive." Again, you can see all the luxury aspects. "Exclusive, luxury, one clinical staff per three patients."

Again, as a buyer, you are into ... I would hope our buyers, if these ads worked, had checklists of criteria they were using to evaluate our services versus a competitor's. If they don't, my guess is that non-branded ad won't work.

Burstein: So, let's take a look at what you call the customer logic, what you learned from your A/B test, and how you made changes to these pay-per-click ads.

Ciampi: Yeah, probably what's most surprising is just how subtle the changes are. If you look at the left ad, the branded ad, again we did not change anything except for the last line. I still don't like the word "top." We did change ... sorry, the one before said, I think, "top addiction," and this one said "top depression," and that had no bearing; we've tested that out.

The last line says, "Traditional and alternative therapies." You'd think that's the most benign statement ever. But as a buyer, and it's the same on the right, what this says in a buyer's mind is, "You're not going to judge me, you're not going to put me in a box. You're not just going to give me the cookie cutter."

When people go to the doctor, I'll use the physical doctor, primary care physician as an example because everyone can relate to it. You don't want them to hand you, "Oh well, this is what I prescribe all my patients." You want them to actually take time to notice you and what makes you unique.

Well, everyone thinks the free assessment: it's free, it's got to be, that will work. "Call now," oh, that's a great call to action, put it up on the ad, we'll get them to call right now. Let's even put a phone number. None of that works. What we found is: build the trust and tell them, "I'm going to help you." That's where the buyer is at this point.

Burstein: And tying into the buyer really produced these results. Sandy wanted to know how to increase conversion rate. So when I show these results you had here, Jon, this, I would say, is an example of how to increase conversion rate. Do you want to talk through these results? Because they just seem ridiculous.

Ciampi: They do seem ridiculous. I don't ever really mention them because people just can't believe them. I mean, if I say 100% or 200%, people will believe me. But when I say 14,000%, people just don't want to believe it. So, interestingly, we got a comment from Google. We met with Google on this and they informally said that this was the highest conversion rate that they had seen in health care. That doesn't mean it is the highest.

We thought that was a vote of confidence in our efforts. A 44% clickthrough rate on a pay-per-click ad is pretty astonishing considering it was only converting at three-tenths of a percent before. You would say, "Oh, that's branded, and branded acts differently than non-branded." But if you look at the non-branded, again, you got a 3,000% increase and we just changed one line. And it was one line that really changed the psychology of how the buyer was engaging with us.

Burstein: So Jon had many, many tests and we can only share two of those with you today. Let's take a look, at a high level, at the results of all his testing here. Dan from marketing and sales wanted to know about the total cost of what you did. I don't really want to put you on the spot and ask for your total cost, but I want to get a sense of your total return.

What was the return on this entire campaign you ran within your marketing department to learn more about your value proposition and your customers? And how has that return impacted not just the company in financial terms, as we've seen with some of these results on screen, but also the marketing department and your company as a whole in terms of how you approach your customers?

Ciampi: I think that's a great question. Numbers are always ... I mean, I work in a $450 million company, and if I were to talk to somebody who works in a $20 billion company we would have totally different perspectives. I think what's important here is that we were in the neighborhood of over a million dollars of additional contribution.

So from a one-channel, one-ad perspective, or one landing page I should say, because those ads fed into it, that was a huge lift. From the overall organization's perspective, what was interesting ... and it goes back to that comment about how do I get money.

Now, I'm in a unique position because I control my own budget, so obviously I was able to fund this. What I was able to do was gain a lot of support because we were able to show such dramatic increases. And your increases, I hope to God they're all the same as that, but if they're not, even if it's 20% ... I mean, most businesses are not growing at 20% in this economy. So if you can show 20% growth from a change in an A/B test, all of a sudden everyone starts paying attention.

This graph that Daniel has up, this graph has been shown multiple times in various states because obviously time moves forward and we keep updating it. But it's been shown across the entire company as to some of the things that are happening.

Burstein: OK, Jon, we only have one minute left. But to help other people get these results, we have a question from David, who is a founder. He says, "Any detail that would allow duplication of results, ideally the most impactful results?" So, in the last minute you have remaining, what advice can you give to marketers on the call to get the same results you did?

Ciampi: Yeah, obviously first start with testing. And, put it down on paper. Don't just do testing to do testing. Because that's unfortunately what I find from people who want to get into it. They immediately go to the technology side which is the testing. That's the easy side.

Start thinking about your customer and lay out what your thoughts are on the customer. Why are they making certain actions and doing certain things? Then start your testing. Don't stop at the test.

See, we've had tests that looked like they were going to, you know ... Somebody brought up earlier that they had a negative lift. We had tests just like that. Trust was not at the top of our minds as a buying factor. You have to sit there and play with it, but take time.

We live in such a rushed world. Take time just to sit with your team and talk about it. Talk about it at lunch when you go out. Just talk and talk and talk because what ends up happening is you just come across these little intuitive ideas that seem benign. Then you put them into a test and all of a sudden, it opens up a door into how your customers are thinking.

Then the last piece that I would say is, all the time I hear people talk about the technology, and when we get to the Summit in San Francisco I'll hear about it all the time: email platforms and all this stuff ... The technology means nothing to me. It's honestly about really understanding the customer, because when you understand that, the technology is just there to enable you.

Burstein: Excellent. Thank you for your time today, Jon. Thank you for sharing this huge success with us.

Ciampi: All right. Thank you to everyone. I appreciate the opportunity.

Burstein: And thank you to everyone who called in and you can see many more marketing case studies at MarketingSherpa.com. Have a great day.