AI builds your ad from a single prompt

May 14, 2026
Pulse Strength & Conditioning is a 4,200-square-foot strength and conditioning studio in a mid-sized Midwestern metro. Twelve coaches, six daily group class slots, three specialty programs (CrossFit-style metcon, powerlifting, mobility), and a 9-month membership ramp that had stalled in late 2025.
The studio's marketing budget had gone almost entirely to Meta, plus a small Google Local Services Ads spend. Lead volume was reasonable. Lead quality was not. The owners were paying $58 per lead on Meta and converting around 11% of those leads into trial-class signups, which put the effective cost per trial signup north of $500. For a $159/month membership, the math wasn't holding up.
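A quick sketch of that pre-TV math, using only the figures above (the payback framing is ours, not the studio's):

```python
# Pre-TV unit economics from the article: $58 Meta CPL, ~11% lead-to-trial rate.
cost_per_lead = 58.00
lead_to_trial_rate = 0.11
monthly_membership = 159.00

cost_per_trial = cost_per_lead / lead_to_trial_rate
print(f"Cost per trial signup: ${cost_per_trial:,.2f}")  # ~$527 — "north of $500"

# Months of dues needed just to recoup the cost of one trial signup
payback_months = cost_per_trial / monthly_membership
print(f"Payback: {payback_months:.1f} months of dues per trial")  # ~3.3 months
```

And that's before accounting for trials that never convert to members, which is why the owners went looking for a cheaper path to trials.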
In February 2026, the owners decided to test CTV alongside their existing digital mix. They launched their first Adwave campaign on March 3, 2026, and ran it through May 9, 2026, a 67-day test window. Here's what happened.
Pulse had a strong organic presence (4.9 stars on Google, an engaged Instagram following, regular member referrals) but had hit a ceiling. New trial signups had plateaued at around 18 per month for the previous six months. Coach utilization in the 5:30 a.m. and 6:30 a.m. class slots was at 62%. Those empty slots were costing the business roughly $4,200 a month in unrealized revenue.
The owners had a few constraints going in:
Budget cap of $1,800/month. They couldn't afford to scale TV up if it didn't work, so the test had to prove out at small spend.
Local targeting required. Pulse's drive-time radius was about 12 miles. Anything beyond that wouldn't convert.
Trial class signups were the success metric. Not impressions, not website visits. Real trial bookings on the calendar.
We agreed up front that any campaign needed to deliver at least 25 incremental trial signups per month to be worth continuing. That meant a cost per trial under $72, and ideally lower.
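That threshold is just the budget divided by the minimum trial count:

```python
# Go/no-go threshold agreed before launch: 25 incremental trials/month
# on the $1,800/month budget cap.
monthly_budget = 1800
min_trials_per_month = 25
max_cost_per_trial = monthly_budget / min_trials_per_month
print(max_cost_per_trial)  # 72.0 — the $72 cost-per-trial ceiling
```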
The campaign ran across 100+ premium streaming networks (Hulu, Peacock, Tubi, Roku, and similar) on Adwave's standard subscription tier. Three creative variations were produced through Adwave's AI tools and rotated through the campaign:
"The 5:30 Club" spot. A morning-energy emotional appeal aimed at early risers and 40+ professionals trying to fit fitness into a full schedule. Heavy emphasis on the coaching, community, and consistency. Soft CTA: "Try your first class free."
"Stronger Than Last Year" spot. Aspiration-driven, emphasizing the strength-and-conditioning identity rather than weight loss. Same soft CTA.
"For People Who've Tried Everything" spot. Direct response framing aimed at consumers who'd tried other gyms and not stuck with them. Light testimonial element. Slightly harder CTA: "Book your first class on our site."
Targeting was set to the 12-mile drive radius. Daily cap started at $60 and held steady for the full run.
For attribution, Pulse implemented three tracking layers:
A unique vanity URL (pulse5amclub.com, redirecting to the trial booking page) mentioned in the 5:30 Club spot
A "How did you hear about us?" question on the trial booking form with TV as an explicit answer
A pre/post baseline comparison of total trial signups against the prior six-month average
The first month produced more measured results than the team expected, but they pointed in the right direction.
Two patterns worth noting. First, the attributed TV trials missed the $72 cost-per-trial target. Second, the total trial volume jumped from 18 to 41, a +128% lift, even though only 19 of those were directly attributed to TV. The owners' working theory: TV was lifting their other channels by building familiarity that made Meta ads and Google searches convert better.
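The month-1 numbers check out in a few lines (assuming the full $60/day cap — $1,800 for the month — was actually delivered; the article doesn't state exact spend):

```python
# Month-1 results as reported: 41 total trials vs an 18/month baseline,
# 19 of them directly attributed to TV.
baseline_trials = 18
month1_total = 41
tv_attributed = 19
monthly_spend = 1800  # ASSUMPTION: full $60/day cap delivered

lift_pct = (month1_total - baseline_trials) / baseline_trials * 100
print(f"Total lift: +{lift_pct:.0f}%")  # +128%

attributed_cpt = monthly_spend / tv_attributed
print(f"Attributed cost per trial: ${attributed_cpt:.2f}")  # ~$95 vs the $72 target

# Incremental trials the direct-attribution layers did NOT capture — the
# cross-channel "halo" the owners credited to TV
halo_trials = (month1_total - baseline_trials) - tv_attributed
print(f"Incremental trials beyond direct attribution: {halo_trials}")
```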
The other early signal was qualitative. Trial-class attendees in month one mentioned the TV ad unprompted in conversation with coaches more often than the team had expected. Several were already familiar with Pulse's name before they booked, even when they'd come in through what looked like a Meta lead.
After the first 30 days, the team made three changes based on the dashboard data:
1. Tightened the targeting. Geographic reporting showed roughly 22% of impressions landing in ZIPs outside Pulse's real drive radius. Adwave support tightened the targeting to a narrower set of ZIPs that matched the studio's actual member geography.
2. Paused the weakest creative. The "For People Who've Tried Everything" spot had the lowest conversion rate per impression and the highest completion-rate drop-off. The team paused it and let the other two carry the load.
3. Adjusted daily cap timing. Adwave's geographic data showed strongest delivery on weekday evenings and Sunday mornings, which matched Pulse's known booking patterns. The team raised daily caps modestly to push more weight into those windows without increasing total monthly spend.
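The three changes can be pictured as a before/after settings sketch. This is purely illustrative — the article doesn't expose Adwave's actual configuration format, so every field name here is hypothetical:

```python
# Hypothetical before/after of the week-5 adjustments. Adwave's real campaign
# settings are not shown in the article; all field names are illustrative only.
before = {
    "geo": {"type": "radius", "miles": 12},
    "creatives": [
        "The 5:30 Club",
        "Stronger Than Last Year",
        "For People Who've Tried Everything",
    ],
    "daily_cap_usd": 60,  # flat cap, no dayparting
}
after = {
    # ZIP list matched to actual member geography (the ~22% of impressions
    # landing out-of-radius motivated this change)
    "geo": {"type": "zip_list", "zips": ["<member ZIPs>"]},
    # weakest spot paused
    "creatives": ["The 5:30 Club", "Stronger Than Last Year"],
    # heavier weekday-evening / Sunday-morning weight, same monthly total
    "daily_cap_usd": {"weekday_evening": 75, "sunday_morning": 75, "default": 45},
}
print(len(before["creatives"]) - len(after["creatives"]), "creative paused")
```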
The adjustments compounded.
By month 3, cost per attributed trial had dropped under the $72 target and was still trending down. More importantly, total trial volume climbed to 53 in month 2 and was on pace to exceed that in month 3.
The tighter targeting did most of the work. CPM came down modestly because impressions were landing on better-converting inventory. Frequency stabilized in the 4-5 sweet spot. Reach narrowed but on more relevant households.
Trial signups were the campaign's primary metric, but the bigger question was whether those trials converted to members at the same rate as Pulse's historical baseline, or better, or worse.
Historical baseline: Pulse converted about 38% of trial-class attendees to full members within 30 days.
TV-attributed trials over the 67-day test: conversion rate of 47%.
That nine-point bump translated into significant downstream revenue. Across 69 attributed TV trials, it meant 32 new members rather than the 26 a baseline conversion rate would have predicted. At $159/month, those six incremental members are worth approximately $11,448 in first-year membership revenue.
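The arithmetic behind that figure is straightforward (assuming, as the article implicitly does, a full 12 months of dues per incremental member):

```python
# Downstream revenue from the conversion-rate bump, using the article's figures.
attributed_trials = 69
baseline_conversion = 0.38   # historical trial-to-member rate
tv_conversion = 0.47         # rate observed for TV-attributed trials
monthly_dues = 159

tv_members = round(attributed_trials * tv_conversion)              # 32
baseline_members = round(attributed_trials * baseline_conversion)  # 26
incremental_members = tv_members - baseline_members                # 6

# ASSUMPTION: 12 months of dues per member, no churn modeled
first_year_value = incremental_members * monthly_dues * 12
print(f"Incremental members: {incremental_members}, "
      f"first-year value: ${first_year_value:,}")  # 6, $11,448
```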
The team's hypothesis on the higher trial-to-member conversion rate: TV-attributed prospects came in pre-warmed by the campaign's emotional framing. They'd already been imagining themselves at Pulse before booking. That warmer mindset translated into higher commitment when the trial happened.
Three things didn't go as planned; each is worth sharing for any studio considering a similar test.
The third creative underperformed. The "For People Who've Tried Everything" spot was the one the team was most excited about going in. In the dashboard, it consistently lagged the other two on both conversion and completion rate. The lesson: instinct about what will resonate is often wrong. The data corrects it.
Attribution undercounted real TV impact. Even with three attribution layers, the team estimates 30-40% of TV's actual lift was invisible in the direct attribution numbers. The pre/post baseline (18 trials/month vs 53 trials/month at peak) captured more of the real impact than the vanity-URL or "how did you hear about us" tracking did. If they'd only used direct attribution, they'd have undervalued the campaign.
The campaign took time to ramp. First-month performance was meaningfully worse than what came later. A team without patience for the 2-3 week dashboard-ramp curve might have pulled the campaign in week 2 based on weak early signals. Sticking to the original 67-day window mattered.
Based on months 2-3 performance, Pulse's owners modeled out a year-one run continuing the campaign at the same $1,800/month budget:
Total annual TV spend: $21,600
Projected TV-attributed trials: roughly 330 (28/month average)
Projected TV-attributed members at 47% conversion: 155 new members
First-year membership revenue from TV-attributed members: $295,740
Plus the lift to non-TV channels (Meta CPL and conversion both improved over the test period)
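Plugging the article's own inputs into a quick model reproduces those projections (with one simplification: 12 full months of dues per new member, no churn modeled):

```python
# Year-one projection from the article's inputs: $1,800/month budget,
# ~330 attributed trials, 47% trial-to-member conversion, $159/month dues.
annual_spend = 1800 * 12            # $21,600
attributed_trials = 330             # "roughly 330" (28/month average)
new_members = round(attributed_trials * 0.47)

# ASSUMPTION: 12 months of dues per member, no churn
first_year_revenue = new_members * 159 * 12
roas = first_year_revenue / annual_spend
print(new_members, f"${first_year_revenue:,}", f"{roas:.1f}x")  # 155 $295,740 13.7x
```

The 13.7x rounds to the "14x" quoted below; the true figure depends heavily on the retention assumption.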
A roughly 14x first-year ROAS ($295,740 in projected revenue on $21,600 of spend) is the kind of number that turns a "let's test it" into "this is now a permanent line item." Pulse has continued the campaign as of this writing and is exploring scaling to $2,400/month in its seasonal pre-summer window.
This was one studio's experience over 67 days. It doesn't predict identical results for every fitness business. A few patterns from Pulse's run translate well to other studios considering a similar test:
Small-budget CTV tests can work. Pulse never spent more than $1,800/month. The trust-and-familiarity effects compound at small budgets just as they do at large ones.
Multi-creative tests beat single-creative campaigns. Pulse's two surviving creatives carried the bulk of the conversion. Without rotation, the studio would have been guessing.
Local targeting needs vigilance. The targeting tweak after month 1 unlocked most of the gains. Any small business running CTV should verify in week 3 that impressions are landing in real customer geography.
Attribution needs multiple layers. Direct attribution will miss a meaningful share of TV's true impact (the team estimated 30-40%). Pre/post baseline comparisons surface the rest.
Trial-to-member conversion may rise. The 9-point bump in conversion rate likely reflects the emotional priming TV does that other channels can't match.
How long did it take Pulse to see results?
Total trial volume jumped in the first month (+128% over baseline), but cost-per-attributed-trial took 60-90 days to hit the team's $72 target. Studios considering CTV should plan for a 60-90 day test window rather than judging performance off the first 14 days.
What was the most important targeting decision?
Tightening the geographic radius in week 5 after dashboard data showed impressions leaking outside the studio's real drive radius. That single adjustment did more for cost-per-trial than every other optimization combined.
Did the studio cut Meta or Google spend during the TV test?
No. The owners held Meta and Google steady through the 67-day window to avoid confounding the test. Meta CPL and conversion rate both improved during the TV window, which the team attributed to TV building familiarity that lifted Meta's performance. Cutting Meta after seeing TV results is now under consideration.
Could a smaller studio (one or two coaches) run this same playbook?
Yes, at smaller scale. A studio with a $600-900 monthly TV budget can still build meaningful local frequency in a tight geography. The trade-off is fewer impressions and a slower trial ramp, but the trust dividend builds at any budget level.
What was the single biggest surprise from the campaign?
The trial-to-member conversion lift (9 points above baseline). The owners expected TV to drive new trials but didn't expect it to increase the quality of those trials. The emotional priming TV does is the most underappreciated part of the channel for most small business owners.
Is the studio sharing its specific creative anywhere?
The studio has shared parts of its setup at local fitness business meetups, but the creative itself is kept private for competitive reasons within their metro. The general approach (one morning-energy emotional spot, one aspirational identity spot, one direct-response spot) is replicable for any studio with Adwave's AI creative tools.
The takeaway from Pulse isn't that TV is magic. It's that CTV is now accessible enough for small fitness businesses to test, and the trust effects of TV (lean-back attention, premium content context, emotional priming) compound at small budgets in ways most studio owners haven't yet experienced.
If you're running fitness-studio marketing in 2026 and you haven't tested CTV, the cost of finding out whether it works for your studio is a 60-90 day campaign at $1,000-$2,000 per month. The cost of not testing is leaving a trust-building channel completely uncovered while every digital competitor competes for the same cold clicks.
Ready to test what Pulse tested? Create your first ad with Adwave in about two minutes, set your service area and budget, and start your own 60-day case study.