Dynamic content testing tailors marketing content to individual users, improving relevance and boosting conversions. This article explores five case studies where businesses used data-driven strategies to personalize user experiences and achieve measurable results. Here’s what you’ll learn:
- WildEarth: Combined customer data with Amazon’s tools, increasing purchases by 4% and recovering $140,000 from failed transactions.
- Billabong: Used geo-specific shipping offers and progress bars, raising average order values by 10–20%.
- Going: Changed a CTA button from "Sign up for free" to "Trial for free", increasing trial starts by 104%.
- World of Wonder: Leveraged AI to test landing page variations, boosting conversions by nearly 20%.
- Booking.com: Personalized search results and promotions, increasing bookings and revenue significantly.
Each case highlights the power of small, targeted adjustments based on user behavior, from tweaking CTAs to leveraging AI and geo-targeting. These examples show how personalization can reduce bounce rates, increase engagement, and drive conversions.
Case Study 1: WildEarth’s Personalized Product Recommendations

Testing Strategy
In 2023, WildEarth, a company specializing in plant-based pet food, teamed up with WBX Commerce to explore a cross-platform personalization strategy using Amazon Marketing Cloud (AMC). By merging their own customer data with Amazon’s, they aimed to evaluate how advertising across multiple channels influenced consumer behavior.
Their approach involved dividing media spending across Streaming TV, display ads, and Sponsored Products to pinpoint which platforms led to the most website conversions. By integrating first-party customer data into AMC, they could track overlaps between customers exposed to Amazon ads and those who later made purchases on their website. This data-driven method allowed them to fine-tune their personalization efforts based on actual customer interactions rather than assumptions, ensuring their strategy was both targeted and measurable.
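To make the overlap idea concrete, here is a minimal pandas sketch of that kind of analysis. It is illustrative only, not WildEarth’s actual AMC queries: the column names, IDs, and channel labels are hypothetical.

```python
import pandas as pd

# Hypothetical inputs: hashed IDs of customers exposed to Amazon ads,
# and hashed IDs of customers who later purchased on the brand's own site.
ad_exposed = pd.DataFrame({
    "customer_id": ["a1", "a2", "a3", "a4", "a5"],
    "channel": ["Streaming TV", "Display", "Sponsored Products", "Display", "Streaming TV"],
})
site_purchases = pd.DataFrame({
    "customer_id": ["a2", "a5", "b9"],
    "order_value": [48.0, 62.0, 35.0],
})

# Customers who both saw an ad and later purchased on the website.
overlap = ad_exposed.merge(site_purchases, on="customer_id", how="inner")

# Share of site purchasers who were previously ad-exposed, plus a per-channel
# breakdown to guide budget reallocation toward higher-performing campaigns.
overlap_rate = overlap["customer_id"].nunique() / site_purchases["customer_id"].nunique()
by_channel = overlap.groupby("channel")["order_value"].agg(["count", "sum"])

print(f"Ad-exposed share of site purchasers: {overlap_rate:.0%}")
print(by_channel)
```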
Results
The analysis uncovered a 15% overlap between customers who saw Amazon ads and those who made purchases directly on WildEarth’s website. This insight led to reallocating their budget toward higher-performing campaigns, resulting in a 4% boost in total website purchases.
"Finding an advertising partner who understands the latest Amazon Ads product offerings helped us understand where and when to invest." – Steve Simitzis, Chief Marketing Officer, WildEarth
In addition, WildEarth implemented a machine learning-powered recovery model through Recharge to tackle failed transactions. This solution, tailored to individual payment behaviors, recovered 88% of over 1,970 failed transactions, reclaiming $140,000 in revenue. Instead of relying on generic retry rules, the system adapted to each customer’s unique payment patterns.
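Recharge’s model itself is proprietary, but the underlying idea of adapting retries to each customer’s payment patterns can be sketched in a few lines. The rule below (retry at the hour of day when that customer’s past payments most often succeeded) is a hypothetical simplification, not the actual algorithm.

```python
from collections import Counter
from datetime import datetime, timedelta

def next_retry_time(failed_at: datetime, past_successes: list[datetime]) -> datetime:
    """Schedule a retry at the hour when this customer's payments usually succeed.

    Falls back to a generic 24-hour delay when there is no payment history.
    """
    if not past_successes:
        return failed_at + timedelta(hours=24)
    best_hour = Counter(ts.hour for ts in past_successes).most_common(1)[0][0]
    candidate = failed_at.replace(hour=best_hour, minute=0, second=0, microsecond=0)
    # Never schedule a retry in the past; push to the next day if the hour already passed.
    return candidate if candidate > failed_at else candidate + timedelta(days=1)

# Example: a card declined at 02:10 for a customer who historically pays around 18:00.
history = [datetime(2024, 1, d, 18, 5) for d in (3, 10, 17)]
print(next_retry_time(datetime(2024, 2, 1, 2, 10), history))  # 2024-02-01 18:00
```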
Lessons Learned
Several key takeaways emerged from WildEarth’s experience. They discovered that personalization is most effective when it spans the omnichannel customer experience – from initial ad exposure to resolving payment issues. Their machine learning model became increasingly effective over time, as it adapted to customer behaviors and pinpointed the causes of payment errors.
"The model isn’t one-size-fits-all. It personalizes its approach to each brand as it gets to know their customers and the factors behind their payment errors, meaning it delivers better results over time." – Recharge Case Study
This case demonstrated that personalization goes far beyond recommending products. By measuring cross-channel performance and tailoring strategies to individual customer journeys, businesses can achieve more impactful results.
Case Study 2: Billabong’s Geo-Specific Shipping Offers

Testing Strategy
Billabong, a surf and lifestyle brand, took on one of the biggest challenges in ecommerce: unexpected shipping costs. Since 48% of cart abandonments are linked to surprise shipping fees, they introduced a geo-location detection system. This system identified each visitor’s region and displayed shipping details tailored to their location.
Visitors were either redirected to their region-specific store (like the UK site) or shown rates customized for areas such as the contiguous US, Hawaii, Alaska, and Canada. On top of that, Billabong added dynamic progress bars to the cart, updating in real time to show how much more a customer needed to spend to qualify for free shipping. For example, the message might read: "You are $125 away from free shipping". This turned shipping costs into a motivational challenge, encouraging customers to add extra items to hit the free shipping threshold.
To keep shipping incentives front and center, Billabong used a multi-touchpoint promotion system, featuring banners and promotional ribbons throughout the shopping experience. From the homepage to checkout, these visual cues ensured customers were always aware of free shipping opportunities. This approach tackled the common issue of hidden thresholds that can frustrate new visitors.
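The mechanic behind the progress bar is simple: look up the visitor’s regional threshold and show how far the cart is from it. The sketch below illustrates this; the region codes and dollar thresholds are made up, since Billabong’s actual values and geo-detection stack are not published.

```python
# Hypothetical free-shipping thresholds per region (USD).
FREE_SHIPPING_THRESHOLDS = {
    "US_CONTIGUOUS": 150.00,
    "US_AK_HI": 200.00,
    "CANADA": 175.00,
}

def shipping_message(region: str, cart_total: float) -> str:
    """Return the progress-bar copy shown in the cart for a visitor's region."""
    threshold = FREE_SHIPPING_THRESHOLDS.get(region)
    if threshold is None:
        return "Shipping calculated at checkout"
    remaining = threshold - cart_total
    if remaining <= 0:
        return "You've unlocked free shipping!"
    return f"You are ${remaining:.2f} away from free shipping"

print(shipping_message("US_CONTIGUOUS", 25.00))  # "You are $125.00 away from free shipping"
```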
Results
This location-specific strategy delivered clear improvements. By making shipping thresholds visible and relevant to each region, Billabong turned a common frustration into a sales booster. The dynamic progress bars alone drove a 10–20% increase in Average Order Value as customers added more to their carts to unlock free shipping. The geo-targeted system also reduced customer dissatisfaction – someone in Alaska, for instance, didn’t encounter promotions that only applied to the lower 48 states.
Real-time cart updates added urgency by showing exactly how much more needed to be spent, removing the guesswork that often leads to abandoned carts.
Lessons Learned
Billabong’s success highlighted the importance of threshold optimization. Setting the free shipping minimum too high can discourage customers, while setting it too low can hurt profits. The key is finding the right balance, which varies by region – a challenge their geo-specific system handled effectively.
This case also underscored the value of clear and consistent shipping messaging. By integrating dynamic content at multiple stages – from product pages to checkout – Billabong ensured customers always knew their options. Context made all the difference: a shopper in Canada had different needs than one in California, and treating them the same would have created unnecessary roadblocks that could derail sales.
Case Study 3: Going’s CTA Button Text Variations

This case highlights how a small tweak in CTA language made a huge difference for Going (formerly Scott’s Cheap Flights).
Testing Strategy
Going faced a challenge with low homepage conversion rates. In May 2024, Forrest Schaffer, their Senior Manager of Growth, decided to test two different CTAs using Unbounce’s A/B testing tool. The original button read "Sign up for free," while the alternative was "Trial for free." Schaffer believed that "Sign up" felt like a chore, while "Trial" sounded more inviting and less intimidating. To ensure accurate results, traffic was evenly split (50/50), and a custom conversion script was used to track outcomes precisely.
"Unbounce ensured a perfect 50/50 split with just one URL."
– Forrest Schaffer, Senior Manager of Growth, Going
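Unbounce handled the split for Going, but the general idea of a deterministic, roughly even split on a single URL can be sketched by hashing a visitor ID. This is an illustrative approximation, not Unbounce’s implementation.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to one of two CTA variants (~50/50).

    Hashing the visitor ID keeps the assignment stable across page loads
    without storing any extra state.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return "Sign up for free" if int(digest, 16) % 2 == 0 else "Trial for free"

# The same visitor always sees the same CTA text.
print(assign_variant("visitor-123"))
print(assign_variant("visitor-123"))
```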
Results
The results were striking. The "Trial for free" button led to a 104% increase in trial starts compared to the previous month. Even more impressive, conversions from paid channels outperformed organic ones for the first time, enabling the team to scale their media spending significantly.
"This experiment, as small as it was, legitimately changed the way that we’re able to spend on media. Our conversion rate through paid channels is now higher than organic for the first time ever."
– Forrest Schaffer, Senior Manager of Growth, Going
Lessons Learned
This test proved that even small changes in wording can have a massive impact. By replacing "Sign up" with "Trial", Going shifted the user experience from feeling like a task to an opportunity for exploration. This subtle adjustment tapped into the Framing Effect – a psychological principle that influences how people perceive choices.
For marketers, the takeaway is clear: carefully review your CTAs. Are they focused on the process ("Register", "Submit") or on the value the user will gain ("Get Access", "Start Exploring")? Small, thoughtful changes in micro-copy can lead to substantial improvements, especially when they encourage meaningful user engagement. This case underscores how precise wording can play a pivotal role in personalization strategies, paving the way for additional insights in future examples.
Case Study 4: World of Wonder’s AI-Driven Content Personalization

World of Wonder, the creative powerhouse behind RuPaul’s Drag Race, sought an automated way to fine-tune their landing pages for better performance. During an intensive four-week period in early 2026, Designer Maggie Tielker and Chief of Staff Kelly Dirck turned to Unbounce’s Smart Traffic AI tool. Their goal? To optimize three key campaigns: DragCon UK (focused on lead generation), a Vegas live show (geared toward ticket sales), and their streaming platform, WOW Presents Plus.
Testing Strategy
Using Smart Traffic, the team harnessed visitor data like location and device type to direct users to specific landing page variants. This approach allowed them to test without overhauling their designs. The experiments covered:
- Visual styles: image-heavy layouts versus text-focused designs.
- Background media: videos compared to static images.
- Headline clarity: tweaking messaging for better understanding.
- CTA button designs: experimenting with color-reversed buttons and other variations.
These targeted changes helped uncover what resonated most with different audience segments.
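Unbounce has not published Smart Traffic’s internals, but the general pattern (send each visitor to the variant that has performed best for similar visitors, while still exploring) can be sketched as a simple epsilon-greedy rule over visitor segments. Everything below, including the segment keys and variant names, is illustrative.

```python
import random
from collections import defaultdict

# Observed outcomes per (segment, variant); a segment here is (country, device).
stats = defaultdict(lambda: {"shows": 0, "conversions": 0})
VARIANTS = ["image_heavy", "text_focused", "video_background"]
EPSILON = 0.1  # fraction of traffic reserved for exploration

def pick_variant(country: str, device: str) -> str:
    segment = (country, device)
    if random.random() < EPSILON:
        return random.choice(VARIANTS)  # keep exploring new options
    # Otherwise exploit: pick the variant with the best observed rate for this segment.
    def rate(variant: str) -> float:
        s = stats[(segment, variant)]
        return s["conversions"] / s["shows"] if s["shows"] else 0.0
    return max(VARIANTS, key=rate)

def record(country: str, device: str, variant: str, converted: bool) -> None:
    s = stats[((country, device), variant)]
    s["shows"] += 1
    s["conversions"] += int(converted)

# Example: a UK mobile visitor is routed, then their outcome is recorded.
chosen = pick_variant("UK", "mobile")
record("UK", "mobile", chosen, converted=True)
```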
"We build these pages with our best knowledge and intention, but we haven’t really done a lot of testing at all – so it’ll be cool just to see what really works. I’m all for using AI as a tool to help us."
– Maggie Tielker, Designer, World of Wonder
Results
The outcome? A 19.7% overall boost in conversions across all three landing pages. The streaming service page saw the most dramatic improvement, with its top-performing variant achieving a 29.74% lift, jumping from 10.57% to 17.09%. Meanwhile, the DragCon UK page hit a 31.88% conversion rate (up by 19.1%), and the Vegas live show ticket page reached an impressive 54.07% conversion rate, a 10.51% increase. These numbers far exceeded the typical media and entertainment benchmark of 7.9%.
| Landing Page Campaign | Primary Goal | Total Conversion Rate | Conversion Lift from AI |
|---|---|---|---|
| DragCon UK | Lead Generation | 31.88% | 19.1% |
| Vegas Live Show | Ticket Sales | 54.07% | 10.51% |
| Drag Race Streaming | Service Sign-ups | 13.56% | 29.74% |
Lessons Learned
This case showed how AI-driven personalization can uncover hidden opportunities and simplify landing page goals by aligning content with user behavior. Even small changes – like tweaking CTA button colors or switching static images for videos – led to significant gains when paired with audience-specific targeting. For marketers with limited resources, AI offers a way to bypass labor-intensive A/B testing while still delivering tailored experiences. It also helps address the needs of diverse audiences, including those seeking localized content.
"Some of this stuff is so simple that I don’t see why we couldn’t just throw up a variant with Smart Traffic. It’s pretty quick and it’s not like you have to change the whole page."
– Maggie Tielker, Designer, World of Wonder
Case Study 5: Booking.com’s Local Search Recommendations

Booking.com handles millions of searches daily across 75 countries and 43 languages, making personalization a tough but necessary challenge. To meet this demand, the platform developed a dynamic content system that tailors search results based on user location, search habits, and market trends. Instead of delivering the same results to everyone, the platform adjusts rankings and displays for each user in real time.
Testing Strategy
Booking.com relied on personalized ranking algorithms to reorder search results dynamically. These rankings were influenced by user preferences, market trends, and property performance data. For instance, the Android app’s "Quick Trips" feature categorized nearby destinations into groups like "Quick Getaways", "Popular", and "Deals", based on proximity.
They also tested dynamic rules to fine-tune availability and pricing. One example: minimum stay restrictions were lifted for "gap nights" between reservations, allowing bookings that would otherwise be blocked. Another test focused on the Genius program, which targeted high-value travelers with personalized discounts and perks such as free breakfast or room upgrades.
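Booking.com’s production ranking models are far more sophisticated, but the basic shape of personalized re-ranking (score each property against the user’s known preferences and its observed performance, then sort) can be sketched as follows. The fields and weights are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Property:
    name: str
    distance_km: float       # distance from the searched location
    base_popularity: float   # 0-1, from historical booking performance
    has_free_breakfast: bool

def score(p: Property, prefers_breakfast: bool, max_km: float = 50.0) -> float:
    """Blend proximity, popularity, and a simple preference match; weights are made up."""
    proximity = max(0.0, 1.0 - p.distance_km / max_km)
    preference = 1.0 if (prefers_breakfast and p.has_free_breakfast) else 0.0
    return 0.4 * proximity + 0.4 * p.base_popularity + 0.2 * preference

def rank(results: list[Property], prefers_breakfast: bool) -> list[Property]:
    return sorted(results, key=lambda p: score(p, prefers_breakfast), reverse=True)

results = [
    Property("Harbour Hotel", 3.0, 0.8, False),
    Property("Ski Lodge", 40.0, 0.9, True),
]
for p in rank(results, prefers_breakfast=True):
    print(p.name, round(score(p, True), 2))
```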
With 1,000 experiments running simultaneously, Booking.com uses Group Sequential Design (GSD) to analyze data more quickly while maintaining statistical rigor. This decentralized testing culture empowers employees to launch experiments without requiring top-down approval, although about 90% of these tests fail to deliver positive outcomes.
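Booking.com’s GSD setup is not public, but the core idea is that you peek at the data at planned interim points against a stricter critical value, so you can stop a test early without inflating false positives. A minimal sketch with a Pocock-style boundary follows; the constant is the approximate two-sided critical value for five equally spaced looks at an overall 5% alpha.

```python
import math

POCOCK_Z = 2.413  # approx. two-sided critical value for 5 looks, overall alpha = 0.05

def interim_check(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[str, float]:
    """Compare two conversion rates at an interim look; stop early only if the
    z-statistic crosses the stricter Pocock boundary."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return ("stop early", z) if abs(z) > POCOCK_Z else ("keep collecting", z)

print(interim_check(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000))
```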
Results
The dynamic content initiatives proved effective. Properties in the Genius program saw 30% more search views, up to 45% more bookings, and a 40% revenue boost on average. Meanwhile, properties that achieved a perfect "property page score" – by fully tagging amenities, facilities, and high-quality photos – experienced up to 18% more bookings than those with incomplete listings.
In Q4 2025, Booking Holdings reported a 16% year-over-year revenue increase, reaching $6.35 billion. The company also added 24 million room nights booked, totaling 285 million nights, a 9.2% year-over-year increase.
Some tests yielded surprising insights. Kristina Gibson, former Director of Product, noted that showing "sold out" labels alongside available properties boosted overall bookings. The scarcity signals, like "1 left", created urgency and drove more conversions.
"Showing sold-out properties alongside available ones increased overall bookings. The unavailable options heightened travelers’ desire to book what they could still get."
– Kristina Gibson, former Director of Product at Booking.com
Lessons Learned
Booking.com’s case highlights the power of data-driven testing to improve user engagement and revenue. For example, their expansion into Germany revealed unexpected demand. While executives initially focused on Berlin, search data showed Dutch travelers were more interested in Winterberg, a small ski village. As a result, the company opened its first German office in Winterberg instead.
Testing also uncovered that users prioritize outcomes over technical details. A failed experiment rating "WiFi Strength" (1-100) led to qualitative research showing guests cared more about whether they could stream Netflix or send emails. A follow-up test with labels like "Fast Netflix Streaming" delivered much better results.
"Internet wasn’t important – jobs done through the Internet were."
– Bhavya Sahni, VWO
Despite a 90% failure rate, Booking.com’s high-volume testing approach creates compounding returns. With a 10% success rate and an average 1% revenue lift per winning test, 1,000 simultaneous experiments produce roughly 100 winners, enough for about a 100% total revenue uplift once those gains are added up. This strategy shows how consistent testing, even with frequent failures, can drive long-term growth and performance gains.
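The arithmetic behind that claim, as a quick sanity check (summing the 1% lifts gives the article’s 100% figure; if each winner compounds on the last, the total is even higher):

```python
experiments = 1_000
success_rate = 0.10
lift_per_win = 0.01

wins = int(experiments * success_rate)                 # 100 successful tests
additive_uplift = wins * lift_per_win                  # 1.00 -> +100% revenue
compounding_uplift = (1 + lift_per_win) ** wins - 1    # ~1.70 -> +170% if gains compound

print(f"{wins} wins, additive: +{additive_uplift:.0%}, compounding: +{compounding_uplift:.0%}")
```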
Comparison of Results Across Case Studies

5 Dynamic Content Testing Case Studies: Results and Key Metrics Comparison
Metrics and Insights
The five case studies highlight diverse strategies that led to success, but they all share one common thread: testing tailored to specific audiences consistently outperforms generic approaches. For instance, while WildEarth and Booking.com leaned heavily on data-driven personalization, Going achieved significant growth with a small tweak to its call-to-action (CTA) wording.
Each company defined success in its own way. Going focused on trial start rates, WildEarth tracked purchase rates, and Billabong zeroed in on reducing cart abandonment. Despite these differences, they all found that small, targeted adjustments often delivered better results than sweeping overhauls. A great example is Going’s decision to change its CTA from "Sign up for free" to "Trial for free", which led to paid channel conversions surpassing organic rates for the first time in the company’s history.
Here’s a quick breakdown of how these tailored strategies translated into measurable performance gains:
| Case Study | Test Focus | Key Change | Key Result |
|---|---|---|---|
| WildEarth | Cross-channel personalization | First-party data merged with Amazon ad data | 4% more website purchases; $140,000 recovered |
| Billabong | Geo-specific offers | Location-based shipping thresholds and progress bars | 10–20% higher average order value |
| Going | CTA text | "Sign up for free" changed to "Trial for free" | 104% increase in trial starts |
| World of Wonder | AI-driven content | Behavior-based variant routing | 19.7% conversion increase |
| Booking.com | Local recommendations | Geo-matched search results and perks | Up to 45% more bookings (Genius properties) |
These examples show how different methods – like personalized content and refined messaging – can lead to significant improvements. WildEarth and Booking.com focused on personalized content to engage users, while Going achieved its results by optimizing messaging. World of Wonder blended these strategies, using AI-powered "Smart Traffic" to dynamically deliver the best-performing content variation.
Ultimately, these case studies reinforce a key takeaway: tailoring content and messaging to match user behavior is a powerful driver of conversions. Sometimes, the simplest changes can make the biggest difference. Going’s three-word adjustment required minimal effort but completely transformed how customers viewed their offer.
Conclusion
Summary of Lessons Learned
These case studies clearly show that personalized content consistently outperforms generic approaches. The most effective strategies went far beyond surface-level personalization, like simply adding a customer’s name. Instead, companies tapped into deep data – factors like device types, browsing habits, geographic location, and individual interests – to craft experiences that felt truly relevant.
Another key takeaway? Small changes can drive big results. For example, Campaign Monitor saw a 31.4% increase in trial signups by aligning landing page verbs with search queries. Even minor adjustments – like changing a button color, swapping an image, or rephrasing a call-to-action (CTA) – can make a noticeable difference.
These insights offer a roadmap for getting started with dynamic content testing.
Next Steps for Dynamic Content Testing
Start small. HP’s Cathy Howard began testing dynamic email content with just 2% of their U.S. database back in February 2012. The result? A 300% increase in open rates before scaling the effort. Focusing on your most engaged audience first can provide quicker validation of your strategy.
Use the data you already have. For instance, Sprint’s Pam Messier tailored newsletters using device type and service plan information. If you’re running pay-per-click (PPC) campaigns, try Dynamic Text Replacement (DTR) to match landing page text with search query verbs. ConversionLab demonstrated the power of this method, achieving statistically significant results during a test conducted from October 31, 2017, to January 16, 2018.
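Conceptually, DTR reads a parameter that the ad platform appends to the landing page URL and swaps it into the headline, falling back to default copy when the parameter is missing. Here is a minimal sketch; the parameter name and copy are hypothetical, not ConversionLab’s actual setup.

```python
from urllib.parse import urlparse, parse_qs

def dynamic_headline(landing_url: str, default: str = "Solve your reporting problem") -> str:
    """Swap the ad's search phrase into the headline when the URL carries one."""
    params = parse_qs(urlparse(landing_url).query)
    phrase = params.get("kw", [""])[0]   # e.g. ?kw=automate+reporting -> "automate reporting"
    return phrase.capitalize() if phrase else default

print(dynamic_headline("https://example.com/lp?kw=automate+reporting"))  # "Automate reporting"
print(dynamic_headline("https://example.com/lp"))                        # fallback copy
```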
Make testing an ongoing effort. DSW, for example, conducted 15 A/B tests in a single year using Movable Ink’s Optimizer tool. Experiment with everything – headlines, images, CTA placement, button colors, and even form locations. Keep a close eye on the metrics that matter most to your goals, like trial signups, purchase rates, or cart abandonment. Let the data lead the way.
"Our hypothesis was that a verb defines HOW you solve a challenge… if we could meet the visitor’s definition of solving their problem we would have a greater chance of converting." – Rolf Inge Holden, ConversionLab
FAQs
What counts as dynamic content testing?
Dynamic content testing is all about crafting personalized content variations for each user and testing them systematically. The goal? To figure out which tailored elements connect best with your audience. By doing this, you can boost engagement, improve conversions, and fine-tune the overall user experience.
How do I start personalization tests with limited traffic?
Start with small, targeted experiments focusing on key areas like headlines, call-to-action buttons, or specific content sections. Leverage A/B testing tools to identify which changes make the biggest difference, and keep the testing period short to quickly gather actionable data. Prioritize personalization techniques that pack a punch – like dynamic keyword insertion or customizing content based on user details. These approaches can deliver valuable insights, even with lower traffic, while helping to improve conversions efficiently.
What metrics should I track for dynamic content tests?
To gauge how well your dynamic content is performing, keep an eye on key metrics like:
- Conversion rates: This includes actions like purchases or sign-ups.
- Bounce rates: Tracks how often visitors leave without interacting.
- Cost per lead: Helps measure the efficiency of your campaigns.
- Revenue per visitor: Shows the monetary value each visitor brings.
For e-commerce, focus on metrics such as:
- Click-through rates: How often users click on your links.
- Add-to-cart rates: The percentage of visitors adding items to their carts.
- Purchase rates: The number of completed purchases.
In email campaigns, monitor:
- Open rates: How many recipients open your emails.
- Click rates: The percentage of clicks on links within your emails.
- Engagement levels: Tracks how actively users interact with your content.
These KPIs provide a clear picture of how effectively your dynamic content is achieving its goals.