A/B Testing
ELI5 — The Vibe Check
Show half your users a blue button and the other half a green button, then see which one gets more clicks. That's A/B testing — science experiments on your users (with their consent, hopefully). It's how Netflix decides which thumbnail makes you click and how Google picks the right shade of blue.
Real Talk
A controlled experiment methodology where users are randomly split into groups that see different variants of a feature. Statistical analysis determines which variant performs better against defined metrics. Requires proper sample sizes, statistical significance, and guardrail metrics to produce valid results.
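The mechanics above can be sketched in a few lines of Python: deterministic bucketing of users into variants, plus a two-proportion z-test to check whether the observed lift is statistically significant. The variant names, conversion counts, and helper functions here are hypothetical, for illustration only.

```python
import hashlib
import math

def assign_variant(user_id: str, variants=("control", "treatment")) -> str:
    """Deterministically bucket a user by hashing their ID (stable across runs)."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from the normal tail
    return z, p_value

# Hypothetical results: 480/10,000 control conversions vs 560/10,000 treatment
z, p = two_proportion_z_test(480, 10_000, 560, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}, significant at 0.05 = {p < 0.05}")
```

Hashing the user ID (rather than calling a random number generator per request) matters: it guarantees the same user always lands in the same variant, so they never see the blue button one day and the green one the next.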
When You'll Hear This
"Our A/B test showed the simplified checkout flow increased conversions by 12%." / "Don't ship based on A/B test results until you hit statistical significance — p < 0.05 minimum."
Related Terms
Canary Analysis
Named after canaries in coal mines — you send a small version of your new deployment into production first to see if it dies. Route 5% of traffic to the new version, watch its error rates and latency, and only roll forward if it stays healthy.
Chaos Monkey
Netflix built a program that randomly kills servers in production — on purpose. It's like hiring someone to randomly unplug things in your office to make sure your systems can survive failures.
Feature Flag Service
Feature flags are like light switches for your code. You deploy new features turned OFF, then flip the switch for 1% of users, then 10%, then everyone. If something breaks, you flip the switch back off — no redeploy needed.
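A percentage-based rollout like the one described above can be sketched with the same hashing trick used for A/B bucketing. This is a minimal sketch, not the API of any real flag service; the `FLAGS` mapping, flag name, and `is_enabled` helper are all assumed for illustration.

```python
import hashlib

# Hypothetical in-memory flag store: flag name -> rollout percentage (0-100).
FLAGS = {"new_checkout": 10}  # currently rolled out to 10% of users

def is_enabled(flag: str, user_id: str) -> bool:
    """Hash flag+user into a 0-99 bucket so each user gets a stable yes/no."""
    pct = FLAGS.get(flag, 0)
    bucket = int(hashlib.md5(f"{flag}:{user_id}".encode()).hexdigest(), 16) % 100
    return bucket < pct

print(is_enabled("new_checkout", "user-123"))
```

Because the bucket is derived from a hash rather than a coin flip, the same user keeps the same answer across requests; widening the rollout just raises the percentage, and setting it to 0 acts as an instant kill switch.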