# Experiments

Experiments is your A/B testing tracker. It turns "let's try this and see" into a structured loop with variants, success metrics, time windows, and recorded learnings.

### Anatomy of an experiment

* **Name** — short, recognizable label
* **Description** — what you're actually testing and why
* **Variant A (control)** — your current/standard approach
* **Variant B (test)** — the new thing you're trying
* **Success metric** — what tells you it worked (engagement rate, replies, click-through, follows)
* **Window (hours)** — how long the test runs before judging
* **Platform** — X, IG, or FB
* **Status** — draft → running → completed (or cancelled)
* **Learnings** — what you concluded after it ran
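
If it helps to picture the record behind the form, here's a minimal sketch in TypeScript. The field names are illustrative assumptions that mirror the list above, not the product's actual schema:

```typescript
// Hypothetical shape of an experiment record; field names are
// illustrative, not the app's real schema.
type Platform = "X" | "IG" | "FB";
type Status = "draft" | "running" | "completed" | "cancelled";

interface Experiment {
  name: string;           // short, recognizable label
  description: string;    // what you're testing and why
  variantA: string;       // control: your current/standard approach
  variantB: string;       // test: the new thing you're trying
  successMetric: string;  // e.g. "engagement rate", "replies"
  windowHours: number;    // how long the test runs before judging
  platform: Platform;
  status: Status;
  learnings?: string;     // recorded after the run
}
```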

### Workflow

1. Click **New Experiment**
2. Fill out the form (you can also have it pre-filled from [Competitor Analysis](/thecontentforge-docs/feature-guides/competitor-analysis.md) — see below)
3. Save as draft, then click **Start** when the test is live
4. After the window closes, click **Complete** and record the learning
5. Use the learning to inform Brand Voice, your thinking in Patterns, and future experiments
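
The numbered steps imply a simple status lifecycle plus a time check. Here is a hedged sketch of both; the helper names and the exact set of allowed transitions are assumptions, not the app's API:

```typescript
type Status = "draft" | "running" | "completed" | "cancelled";

// Assumed transition table for the draft → running → completed
// (or cancelled) lifecycle described above.
const transitions: Record<Status, Status[]> = {
  draft: ["running", "cancelled"],
  running: ["completed", "cancelled"],
  completed: [],
  cancelled: [],
};

function canTransition(from: Status, to: Status): boolean {
  return transitions[from].includes(to);
}

// Step 4: a running experiment is ready to judge once its window elapses.
function windowClosed(startedAt: Date, windowHours: number): boolean {
  return Date.now() - startedAt.getTime() >= windowHours * 3_600_000;
}
```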

### Pre-fill from Competitor Analysis

In Competitor Analysis you can click **Create Experiment** on any insight. The Experiments page will open with the form pre-filled — name, description, Variant B, and an impact badge — and a banner reminding you to fill in Variant A (your control) and a success metric before saving.
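
To picture the hand-off, here is an assumed sketch of the pre-filled form state. The exact fields passed between the two pages aren't documented, so treat every name and value as a hypothetical example:

```typescript
// Assumed form state after clicking Create Experiment on an insight.
// All names and values here are hypothetical illustrations.
const prefilled = {
  name: "Hook-first threads",                        // from the insight
  description: "Competitor threads with strong hooks outperform ours",
  variantB: "Open every thread with a one-line hook", // the suggested test
  impact: "high",                                     // the impact badge
  variantA: "",       // the banner reminds you: fill in your control...
  successMetric: "",  // ...and a success metric before saving
};
```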

### Tips

* Test one variable at a time. If you change format AND topic AND time-of-day, you've learned nothing
* Pick a window long enough to be meaningful but short enough to act on (24–72 hours is typical for X)
* Always record learnings — even "no difference" is a useful result


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://thecontentforge.gitbook.io/thecontentforge-docs/feature-guides/experiments.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
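
For example, a minimal sketch in TypeScript using the built-in `fetch` (the endpoint is the one documented above; the question text is just an example, and it must be URL-encoded):

```typescript
// Ask the docs a question via the documented ?ask= endpoint.
const question = "How do I cancel a running experiment?"; // example question
const url =
  "https://thecontentforge.gitbook.io/thecontentforge-docs/feature-guides/experiments.md" +
  `?ask=${encodeURIComponent(question)}`;

const res = await fetch(url);  // plain HTTP GET
console.log(await res.text()); // direct answer plus excerpts and sources
```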
