# Content A/B testing

This guide explains how to set up an A/B test on article content using the experiment framework. Both content variants live in the same Markdown file and are served in a single Fastly-cached response. Client-side JavaScript swaps which variant is visible based on the user's experiment group.

For background on A/B testing at GitHub Docs, see [the A/B testing guide](https://github.com/github/docs-team/blob/main/analytics/ab-test.md).

> [!IMPORTANT]
> Only one experiment can have `includeVariationInContext: true` at a time, because `experiment_variation` is a single key in the event context. Multiple experiments can run concurrently if the others use `sendExperimentSuccess` for tracking instead. See the [experiments README](./README.md) for details.

## Authoring an experiment article

Wrap the control (original) and treatment (rewritten) content in HTML divs. Both variants render as normal Markdown inside the divs.

```markdown
---
title: Your article title
---

<div class="exp-control" data-experiment="readability_copilot">

Original article body here. Markdown renders normally inside the div.

Links, lists, code blocks, and all other Markdown features work as usual.

</div>
<div class="exp-treatment" data-experiment="readability_copilot" hidden data-nosnippet>

Rewritten article body here. This is the treatment variant.

</div>
```

### Rules

* **Blank lines required**: Leave a blank line after the opening `<div>` and before the closing `</div>` so the Markdown inside the div renders correctly.
* **`data-experiment` attribute**: Must match the experiment key registered in `experiments.ts` (currently `readability_copilot`).
* **`hidden` attribute**: Always add `hidden` to the treatment div so the control is shown by default (a safe fallback if JavaScript fails to run).
* **`data-nosnippet` attribute**: Always add `data-nosnippet` to the treatment div. This tells search engines to exclude the treatment text from search snippets.
* **Multiple pairs**: You can have multiple control/treatment pairs in one article if only some sections differ. Each pair must use the matching `data-experiment` value, classes, and attributes.

## Previewing the treatment

* **Staff cookie**: If you have the `staffonly` cookie set, you will always see the treatment.
* **URL parameter**: Add `?feature=readability` to any article URL to force the treatment variant.
* **Console override**: In the browser console, run:
  ```javascript
  window.overrideControlGroup('readability_copilot', 'treatment')
  ```
  Reload the page to see the treatment. Use `'control'` to switch back.

## How tracking works

When `includeVariationInContext` is `true` (as it is for this experiment), **every** analytics event on the page automatically includes `experiment_variation: "control"` or `"treatment"` in its context. This means:

* Page view events → measure impressions per variant
* Link click events → measure CTA clicks per variant
* Exit events → measure scroll depth (`exit_scroll_length`), time on page (`exit_visit_duration`), and scroll engagement (`exit_scroll_flip`) per variant
* No extra tracking code is needed in the content

CTA links that point to external sites (like `github.com/features/copilot`) are already tracked by the existing link event system. The `link_samesite: false` flag identifies external (CTA) clicks.

To analyze additional event types (such as scroll depth or time on page), add queries to the dashboard; no code changes are needed.
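
Conceptually, the context enrichment works like the sketch below. The helper name and event shape are illustrative assumptions, not the framework's real internals; only the `experiment_variation` and `link_samesite` keys come from this guide.

```typescript
// Hypothetical sketch: merging the user's variation into every outgoing
// event's context. Names and shapes here are assumptions for illustration.
interface AnalyticsEvent {
  type: string
  context: Record<string, unknown>
}

function addExperimentContext(
  event: AnalyticsEvent,
  variation: 'control' | 'treatment',
): AnalyticsEvent {
  // Copy the event and attach the single experiment_variation key
  return { ...event, context: { ...event.context, experiment_variation: variation } }
}

// An external-link click keeps its own fields and gains the variation:
const clickEvent = addExperimentContext(
  { type: 'link', context: { link_samesite: false } },
  'treatment',
)
// clickEvent.context → { link_samesite: false, experiment_variation: 'treatment' }
```

Because the variation rides along on the shared context rather than on any one event type, the same dashboard queries can segment page views, clicks, and exit events by variant without per-event instrumentation.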

## Analyzing results

Use the **[Docs Experiment Results dashboard](https://gh.io/docs-8c0c)** to track split verification, CTA click-through rates, sequential significance testing, and per-article breakdowns. The dashboard has parameters for experiment name, path product, and minimum detectable effect.

Dashboard source config: [`docs-team/analytics/dashboard-builder/experiment-results.config.ts`](https://github.com/github/docs-team/blob/main/analytics/dashboard-builder/experiment-results.config.ts)

## Ending the experiment

1. Set `isActive: false` in `src/events/components/experiments/experiments.ts`.
2. Remove the experiment divs from the articles, keeping whichever variant won.
3. Open a PR documenting the results.
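
Step 1 might look like the following. The exact shape of the entry in `experiments.ts` is an assumption; only the `isActive` and `includeVariationInContext` keys are taken from this guide.

```typescript
// Hypothetical entry in src/events/components/experiments/experiments.ts.
// Field names other than isActive and includeVariationInContext are assumptions.
export const experiments = {
  readability_copilot: {
    key: 'readability_copilot',
    isActive: false, // was true while the experiment was running
    includeVariationInContext: true,
  },
} as const
```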