
Dylan Sather

almost 3 years ago
We’re running an A/A experiment (random assignment to two variants, but users see the same product). Typically there’s been small variation for metrics in GrowthBook, and most of the violin plots are centered close to 0 now that ~50,000 users have been assigned. Our assignments should be random, but one of our core binomial metrics is showing a 2.3% drop in the “experiment” group. GrowthBook computes a 3.68% chance that the variant will beat the control, when I’d expect ~50%, since the product is the same.

This is a ratio metric, with the denominator set to one of the other states in our funnel. The denominator’s metric shows the expected 0% change. There’s a conversion window on the metric, but I tried copying the metric and 1) setting the denominator to all experiment users and 2) modifying the conversion window, and the effect size / probability distribution is roughly the same.

This experiment has been running for 4 months. I created a new experiment phase and looked at just the data from the last month, and I wasn’t able to reproduce this: I’m seeing a 0% change for the new phase.

I had a few questions, and I’m open to any advice people have on interpreting the results of A/A tests in GrowthBook:
• What could help explain this outcome? Are these false positives expected, even after such a long experiment?
• Could this suggest a flaw in our assignment logic or our metric definition? Again, the difference in most other metrics is close to 0, and the second phase showed a 0% difference, so the first result (on 4x the users) is confusing.
• Does this suggest that I should be skeptical of any experiment result that moves the metric less than ~2.3%? Is the A/A test helping me understand the underlying natural variance for the metric in any way?
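One way to build intuition here: under a standard two-sided test at α = 0.05, an A/A comparison will flag a "significant" difference about 5% of the time, no matter how many users are assigned — more data narrows the interval but doesn't eliminate false positives. Below is a minimal Monte Carlo sketch of that, with assumed parameters (25,000 users per arm, 10% baseline conversion, a frequentist z-test rather than GrowthBook's Bayesian engine); the function names are my own, not anything from GrowthBook.

```javascript
// Monte Carlo sketch of A/A false-positive rates. Assumed parameters:
// 25,000 users per arm, 10% baseline conversion, two-sided z-test at
// alpha = 0.05. Each trial simulates two identical arms and checks whether
// the test "detects" a difference that isn't there.

function normalSample() {
  // Box-Muller transform: standard normal from two uniforms
  const u = 1 - Math.random();
  const v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

function simulateArm(n, p) {
  // Normal approximation to Binomial(n, p) conversion counts
  const count = Math.round(n * p + Math.sqrt(n * p * (1 - p)) * normalSample());
  return Math.min(n, Math.max(0, count));
}

function aaFalsePositiveRate({ trials = 10000, n = 25000, p = 0.1, zCrit = 1.96 } = {}) {
  let falsePositives = 0;
  for (let i = 0; i < trials; i++) {
    const c1 = simulateArm(n, p);
    const c2 = simulateArm(n, p);
    const pooled = (c1 + c2) / (2 * n);
    const se = Math.sqrt(pooled * (1 - pooled) * (2 / n));
    const z = (c2 / n - c1 / n) / se;
    if (Math.abs(z) > zCrit) falsePositives++;
  }
  return falsePositives / trials;
}

console.log(aaFalsePositiveRate()); // roughly 0.05 by construction
```

This also speaks to the last bullet: the spread of A/A deltas you observe is a direct estimate of the metric's natural variance at your sample size, so a ~2.3% swing appearing by chance tells you real effects near that size will be hard to distinguish from noise.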

Jon Eide Johnsen

almost 3 years ago
Hi, team! We’re running an experiment where the GrowthBook analysis does not find users allocated to `variant_id = 0`. Looking at the raw data from BigQuery, the variants (`0` and `1`) are equally distributed (see screenshot), but variant `0` users do not show up in the experiment results. I’ve also verified that there are no `userId`s exposed to both variants. Here’s the query from the analysis:
-- Activation (binomial)
WITH __rawExperiment as (
  SELECT
    user_id,
    received_at as timestamp,
    experiment_id,
    variation_id
  FROM
    `rudderstack_data.experiment`
),
__experiment as (
  -- Viewed Experiment
  SELECT
    e.user_id as user_id,
    cast(e.variation_id as string) as variation,
    CAST(e.timestamp as DATETIME) as conversion_start,
    DATETIME_ADD(CAST(e.timestamp as DATETIME), INTERVAL 336 HOUR) as conversion_end
  FROM
    __rawExperiment e
  WHERE
    e.experiment_id = 'hello-sanity_template'
    AND CAST(e.timestamp as DATETIME) >= DATETIME("2022-08-15 19:00:00")
),
__metric as (
  -- Metric (Activation)
  SELECT
    user_id as user_id,
    1 as value,
    CAST(m.timestamp as DATETIME) as conversion_start,
    CAST(m.timestamp as DATETIME) as conversion_end
  FROM
    (
      SELECT
        user_id,
        CAST(activation_date AS TIMESTAMP) AS timestamp
      FROM
        `dbt_production.dim_user`
      WHERE
        is_activated = TRUE
    ) m
  WHERE
    CAST(m.timestamp as DATETIME) >= DATETIME("2022-08-15 19:00:00")
),
__distinctUsers as (
  -- One row per user/dimension
  SELECT
    e.user_id,
    cast('All' as string) as dimension,
    (
      CASE
      WHEN count(distinct e.variation) > 1 THEN '__multiple__'
      ELSE max(e.variation) END
    ) as variation,
    MIN(e.conversion_start) as conversion_start,
    MIN(e.conversion_end) as conversion_end
  FROM
    __experiment e
  GROUP BY
    e.user_id
),
__userMetric as (
  -- Add in the aggregate metric value for each user
  SELECT
    d.variation,
    d.dimension,
    1 as value
  FROM
    __distinctUsers d
    JOIN __metric m ON (m.user_id = d.user_id)
  WHERE
    m.conversion_start >= d.conversion_start
    AND m.conversion_start <= d.conversion_end
  GROUP BY
    variation,
    dimension,
    d.user_id
),
__overallUsers as (
  -- Number of users in each variation
  SELECT
    variation,
    dimension,
    COUNT(*) as users
  FROM
    __distinctUsers
  GROUP BY
    variation,
    dimension
),
__stats as (
  -- Sum all user metrics together to get a total per variation/dimension
  SELECT
    variation,
    dimension,
    COUNT(*) as count,
    AVG(value) as mean,
    STDDEV(value) as stddev
  FROM
    __userMetric
  GROUP BY
    variation,
    dimension
)
SELECT
  s.variation,
  s.dimension,
  s.count,
  s.mean,
  s.stddev,
  u.users
FROM
  __stats s
  JOIN __overallUsers u ON (
    s.variation = u.variation
    AND s.dimension = u.dimension
  )
Are you able to spot what’s wrong here?
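I can't tell the cause from the query alone, but one pattern worth ruling out: the query emits `cast(variation_id as string)` values, and those strings have to match the variation keys configured on the GrowthBook experiment exactly; if the data produces something like `' 0'` (whitespace) or the experiment settings expect a different key, one arm silently disappears from results. Here's a small sketch for checking that — `checkVariationKeys` and the sample rows are hypothetical, with rows shaped like the `__distinctUsers` CTE output.

```javascript
// Hypothetical sanity check: compare the distinct `variation` values the
// query produces against the variation keys the GrowthBook experiment
// expects. A mismatch makes one arm silently drop out of the results.

function checkVariationKeys(rows, expectedKeys) {
  // rows: [{ user_id, variation }] as returned by the __distinctUsers CTE
  const seen = new Map();
  for (const row of rows) {
    const key = String(row.variation);
    seen.set(key, (seen.get(key) || 0) + 1);
  }
  const missing = expectedKeys.filter(k => !seen.has(k));
  const unexpected = [...seen.keys()].filter(k => !expectedKeys.includes(k));
  return { counts: Object.fromEntries(seen), missing, unexpected };
}

// Example: a padded ' 0' in the data won't match the configured key '0'
const report = checkVariationKeys(
  [
    { user_id: 'a', variation: ' 0' },
    { user_id: 'b', variation: '1' },
  ],
  ['0', '1'],
);
console.log(report); // → missing: ['0'], unexpected: [' 0']
```

Running the `__distinctUsers` CTE on its own in BigQuery and eyeballing `SELECT variation, COUNT(*) ... GROUP BY variation` against the experiment's configured variations would give you the same information directly.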

Shelagh Lewins

over 2 years ago
Hiya, I am trying to use Puppeteer to auto-generate data for my GrowthBook demo. My Puppeteer script appears to work perfectly, but none of the GA events (defined with GTM) are firing, except the initial page_view. I’ve googled but can’t find any reason why the GA events would fire when I visit the page and perform actions manually, but not when Puppeteer does the same thing. I’m running a React app locally on port 3000, and GrowthBook locally on port 4000. GA events from Puppeteer are not showing up in BigQuery, and when I intercept requests, it appears that they are not being sent. Forgive me if this question is too out-of-scope; I realise there’s no fault in GrowthBook, I’m just out of ideas! Here’s my Puppeteer script:
const puppeteer = require('puppeteer');
const seedrandom = require('seedrandom');

const mainCTAProbabilities = {
  freeControl: 0.3,
  freeExperiment: 0.5,
  paidControl: 0.6,
  paidExperiment: 0.7,
};

const requestUrlParamsToJSON = requestURL => {
  // Split request parameters and store as key-value object for easy access
  let params = requestURL.split('?')[1];
  return JSON.parse(
    '{"' +
      decodeURI(params)
        .replace(/"/g, '\\"')
        .replace(/&/g, '","')
        .replace(/=/g, '":"') +
      '"}',
  );
};

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  page
    .on('console', message =>
      console.log(
        `${message.type().substr(0, 3).toUpperCase()} ${message.text()}`,
      ),
    )
    .on('pageerror', ({ message }) => console.log(message))
    .on('response', response =>
      console.log(`${response.status()} ${response.url()}`),
    )
    .on('requestfailed', request =>
      console.log(`${request.failure().errorText} ${request.url()}`),
    );

  await page.setRequestInterception(true);
  page.on('request', req => {
    const requestURL = req.url();
    console.log('all url', requestURL);
    if (requestURL.indexOf('google-analytics.com/g/collect') > -1) {
      console.log('Intercepted: ' + requestURL);
      console.log('full req', requestUrlParamsToJSON(requestURL));
      //req.abort();
      req.continue();
    } else {
      req.continue();
    }
  });

  await page.goto('http://localhost:3000/');

  // Set screen size
  await page.setViewport({ width: 1080, height: 1024 });

  let userCount = 1;

  const accountType = 'free';

  const username = `${accountType}-puppeteer-${userCount}`;

  // Enter username
  await page.type('.username-input', username);

  // Click login button
  const loginButton = '.login-button';
  await page.waitForSelector(loginButton);
  await page.click(loginButton);

  // Locate the main panel and read the title
  console.log('logged in as', username);
  const textSelector = await page.waitForSelector('.main-panel h2');
  const panelTitle = await textSelector.evaluate(el => el.textContent);

  // Print the full title
  console.log('The panel title is "%s".', panelTitle);

  // Decide whether to click the Main CTA button
  // use a seeded random number generator to ensure the same user always makes the same decision
  const encounterType =
    panelTitle === 'Welcome (control)' ? 'Control' : 'Experiment';
  const key = `${accountType}${encounterType}`;

  const seededRandomNumberGenerator = seedrandom(username);
  console.log('Key', key, mainCTAProbabilities[key]);
  const dieRoll = seededRandomNumberGenerator();
  // click with probability mainCTAProbabilities[key] (the original `>=` inverted this)
  const rollToClickCTA = dieRoll < mainCTAProbabilities[key];
  console.log('die roll', dieRoll);
  if (rollToClickCTA) {
    console.log('gonna click it');
    const mainCTAButton = '#main-cta-button';
    await page.waitForSelector(mainCTAButton);
    await page.click(mainCTAButton);
    console.log('clicked it!');
  } else {
    console.log('nah');
  }

  // logout
  const logoutButton = '.logout-button';
  await page.waitForSelector(logoutButton);
  await page.click(logoutButton);

  await browser.close();
})();
From the console output I can see that the dummy user logs in and clicks the button, but no GA event is sent. Yet if I visit the page, log in and click the button myself, the GA event is sent and shows up in BigQuery moments later. Does anyone have any experience of using Puppeteer with Google Tag Manager? Any idea why it’d be different in Headless Chrome? Many thanks for any ideas on how to explore this problem!
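Two things that often bite in headless runs, offered as hypotheses to rule out rather than confirmed causes: GA4/GTM frequently sends hits via `navigator.sendBeacon`, which can be dropped if `browser.close()` fires before the beacon flushes (the script clicks logout and closes almost immediately); and some tag setups behave differently in headless mode. A sketch of how I'd test both — running headful for comparison, and waiting for a `/g/collect` request after the click instead of closing right away:

```javascript
// Predicate for GA4 "collect" hits; easy to unit-test without a browser.
const isGaCollectHit = url => url.includes('google-analytics.com/g/collect');

async function runWithGaWait() {
  // require() inside the function so the predicate above stays testable
  // without Puppeteer installed.
  const puppeteer = require('puppeteer');

  // Headful first: if events fire here but not headless, the tag setup is
  // behaving differently in (or detecting) headless mode.
  const browser = await puppeteer.launch({ headless: false });
  const page = await browser.newPage();
  await page.goto('http://localhost:3000/', { waitUntil: 'networkidle2' });

  // ... log in and click the CTA as in the original script ...

  // Wait until a GA hit actually leaves the browser before closing;
  // browser.close() can otherwise race navigator.sendBeacon and drop it.
  try {
    await page.waitForRequest(req => isGaCollectHit(req.url()), { timeout: 10000 });
    console.log('GA collect request observed');
  } catch (err) {
    console.log('no GA collect request within 10s');
  }

  await browser.close();
}

// runWithGaWait(); // uncomment to run against the local app
```

If the hit still never appears headful, the next suspect would be GTM consent or trigger configuration rather than Puppeteer itself.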