February 24, 2006
Case Study

How Scholastic Doubled eCommerce Revenues with a Site Survey

SUMMARY: Despite its famous brand name, ecommerce sales at Scholastic's site got off to a rockier-than-expected start. So, the team launched an online survey. Results were so helpful in improving the bottom line that the survey's been continuously served up to shoppers for four years now. Discover which two major changes, suggested by survey results, helped turn Scholastic's conversion rate from blah to boffo.
CHALLENGE

Scholastic is an 82-year-old company with a focus on offering tools "for helping your child learn and grow."

So despite the fact that the brand is best known for books and magazines, the ecommerce team decided the site would stock mass market products such as 'Tickle Me Elmo' in addition to Scholastic's own bestsellers.

After all, if Amazon could evolve from a trusted bookstore brand online to selling practically everything, why not Scholastic?

However, sales were more sluggish than expected. Maryssa Miller, Scholastic's Senior Manager eCommerce, was assigned to figure out why.

She carefully examined Web analytics reports and sales data spreadsheets, trying to pinpoint the problem. Why were products that were selling like crazy on other sites not moving off Scholastic's famous-name shelves? The reports were fascinating, but ultimately frustrating. Miller needed to see inside the mind of the customer, to ask them directly, "What's the problem with our site?"

CAMPAIGN

So her team launched an ongoing series of online satisfaction surveys in October 2002 to ask consumers for help.

The site served a pop-up to roughly 4% of site visitors chosen at random, and cookied those visitors so they wouldn't be served the same survey again on later visits.
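The sampling mechanism described above -- a random draw against a roughly 4% rate, suppressed for anyone already cookied -- can be sketched in a few lines. This is an illustrative assumption, not Scholastic's actual code; the cookie name and function names are invented for the example:

```typescript
// Hypothetical sketch of the survey-sampling rule from the case study.
const SAMPLE_RATE = 0.04;              // roughly 4% of visitors, per the article
const COOKIE_NAME = "survey_shown";    // assumed cookie name

// Decide whether to serve the survey pop-up, given the visitor's raw
// cookie string (e.g. from document.cookie) and a random draw in [0, 1).
function shouldServeSurvey(cookies: string, draw: number): boolean {
  const alreadySurveyed = cookies
    .split(";")
    .some((c) => c.trim().startsWith(`${COOKIE_NAME}=`));
  // Only visitors who have never seen the survey and fall inside
  // the sample rate get the pop-up.
  return !alreadySurveyed && draw < SAMPLE_RATE;
}
```

In a real page the `draw` argument would come from `Math.random()` and the cookie would be set (with a long expiry) when the pop-up is shown, so a visitor is only ever asked once.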

The pop-up's introductory copy read:

"Thank you for visiting our store. Now tell us what you think! Please answer this brief survey and let us know how we can improve the store to serve you better. No personally identifiable information is required and results are strictly confidential."

In total, the surveys had about 30 questions, which Miller acknowledges is a lot. However, most were multiple choice check-boxes so they didn't take long to answer. Of the 30, 25 questions were standard defaults that appeared in every survey served month after month for years, so the site could track user satisfaction metrics over time.

These 25 standard questions asked things such as, "How likely are you to recommend this store to someone else?" or "How likely are you to purchase from this store?" Answers were scored on a scale of one to ten, with one being "not very likely" or "poor," and ten being "very likely" or "excellent."

The remaining five or so questions were changed about every 90 days. The team used these questions to dig into areas of particular concern such as product lines and site sections that were underperforming.

For example, Miller learned that customers weren't finding what they wanted on the site. So she used the five rotating questions to dig deeper: was it because the site search wasn't working well, or because the site didn't carry the products users wanted to purchase?

Then, Miller and her team used the resulting data, combined with more typical metrics such as sales and Web analytics data, to pitch upper management on making significant changes to site design and products offered.

RESULTS

"One of the best things we did was launch the customer survey tool," Miller exclaims. "We have had great revenue growth. We're growing by double-digit increases."

Roughly 5% of site visitors who see the survey pop-up take a minute to complete it. "We're surprised by the number of people who really want to tell [us] things," Miller says. "They take the time to fill out the custom questions, too, and give hints of what they want, what they don't want."

In fact, through surveys, the team discovered that shoppers were very frustrated by the site's internal search results. They simply couldn't find what they were looking for.

-> Discovery #1. Shoppers wanted more Scholastic books

Miller discovered the biggest reason shoppers weren't finding what they wanted was that the site didn't carry the products they were looking for.

For example, shoppers were seeking backlist titles, books they remembered reading in, say, 1975. "We weren't featuring those because we didn't think people would be interested in older titles, but that's why they were coming to the publisher. They assumed we'd have them."

The site began carrying older titles and also implemented a feature that allows customers to sign up to receive an email when an out-of-stock book is available.

About 35% of the people who receive one of these emails go on to purchase the book. The email feature also helps Miller's team understand demand. "If there's one product that's out-of-stock and 50 people just requested it, we know there's a higher demand and we need to get more of that product."

On the other hand, it turns out shoppers were not hugely interested in purchasing many of the non-Scholastic-branded products offered. "They did sell, but we didn't see the demand we were planning." In 2003, based on the survey results, the team decided that those items didn't represent the brand. "Our brand is about the best learning product for your child."

The product mix was shifted from generic products to products that will help children develop and that can't be found at other retailers. "We can't compete on price, so we don't want to offer stuff you can get cheaper at Wal-Mart or Target."

In all, about 300 SKUs were marked down and sold out. Books are still the number-one category, but there are also successful toys that tie into books, such as a Clifford book and doll.

-> Discovery #2. Kids, teachers and parents search differently

Based on survey data, the team overhauled the site's internal search function in September 2005. The main change was in taxonomy, the words used for search result headings and categories. "The key for us is that Scholastic serves so many types of customers -- parents, kids and teachers," Miller explains. They all use different words for the same thing. "A parent probably calls it 'reading' where a teacher calls it 'phonemic awareness.'"

The team refined the taxonomy so each category meant the same thing across all segments, but the words used were different. "The Scholastic store for parents would have one word, while the teacher store would have a synonym for that word."

Plus, to make search even more convenient, the team also added narrow-your-search filters by age and product type -- the two most popular filters -- as well as by characters, series and price-points.

Ninety days after the revamped search launched, surveyed customer satisfaction scores shot upwards.

Useful links related to this article

Scholastic samples: http://www.marketingsherpa.com/cs/scholastic/study.html

Foresee Results -- the user satisfaction surveying firm Scholastic uses: http://www.foreseeresults.com/

Scholastic http://www.scholastic.com

Note: Scholastic is a member of Shop.org, a forum for retailing online executives to share information, lessons-learned, new perspectives, insights and intelligence. More info at http://www.shop.org
