# ask-questions
s
Hi there, we started running our first GrowthBook experiment using the React SDK and we noticed the following error. Why is this error happening?
*Multiple Exposures Warning*. 1,310 users (10.61%) saw multiple variations and were automatically removed from results. Check for bugs in your implementation, event tracking, or data pipeline.
The experiment has been exposed to 11,033 total users over one week. Also, the split between groups seems uneven: 58% vs 42% vs 0% (in the config we are splitting users 50/50 between control and variant 1). Why is the split not even?
More info:
• We used the user ID as the main key for splitting users into groups; the user ID is a unique UUID that we have for each user.
• We understand that GrowthBook has a deterministic function to assign a user to one of the groups, so we do not understand how a user can be assigned to two variants at the same time.
• We use the trackingCallback to store the exposure data in Google BigQuery and give GrowthBook read-only access to it to generate the reports.
r
@steep-grass-70833 has opened an issue.
Our official support hours are 6:30am - 5pm Pacific Time, Monday through Friday. You may occasionally hear from us outside of these hours. Your support request has been logged in our system. Our support team will get back to you very soon!
s
@green-kite-41783 @prehistoric-printer-86780 FYI
p
I can also add information about our settings:
• assignment attribute: id (always the same for an individual user)
• attribute targeting: {"isSpecificBrowser": true, "transactions": true, "country": {"$ne": "pl"}}
• default value: Variant 1
r
Hi Mohamed and Monika, thanks for reaching out! I've asked the team if there could be any other cause of the Multiple Exposures Warning. I'm also looking into why the split would be uneven (42/58). This might not matter, but just to cover it, are you self-hosting or using the cloud-hosted version of GrowthBook? Could you please send some screenshots of the configurations in your dashboard, as well as the GrowthBook-related code you're using?
f
There are typically a few reasons for SRMs (sample ratio mismatches), and the most common is adjusting the split mid-experiment.
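For context, an SRM check is just a chi-square test of the observed per-variation user counts against the configured traffic weights. A minimal sketch of such a check (this is not GrowthBook's internal implementation, and the counts below are only estimated from the 58% / 42% figures quoted above):
```js
// A quick self-check (not GrowthBook's internal SRM test): chi-square on the
// observed per-variation user counts vs. the configured traffic weights.
function srmChiSquare(observedCounts, expectedWeights) {
  const total = observedCounts.reduce((sum, n) => sum + n, 0);
  return observedCounts.reduce((chi2, observed, i) => {
    const expected = total * expectedWeights[i];
    return chi2 + (observed - expected) ** 2 / expected;
  }, 0);
}

// Illustration only: counts estimated from the 58% / 42% split of ~11,033
// exposed users mentioned above, checked against the configured 50/50 split.
const chi2 = srmChiSquare([6399, 4634], [0.5, 0.5]);

// With 1 degree of freedom, a chi2 above ~10.8 corresponds to p < 0.001, the
// usual cutoff for flagging a sample ratio mismatch.
console.log(chi2.toFixed(1));
```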
s
• We are using the cloud version.
• We did not change the assignment or any settings mid-experiment.
• Here are a few screenshots from the configurations:
Transaction amount metric query:
```sql
SELECT
  user_id,
  timestamp,
  orderAmount_USD as value
FROM
  `desktop_growthbook.dt_cb_savings_1_0_transactions`
```
Transaction amount metric query from the experiment dashboard → View Queries:
```sql
-- Transaction amount (30-day window) (revenue)
WITH
  __rawExperiment AS (
    SELECT
      user_id,
      timestamp,
      experiment_id,
      variation_id,
      country
    FROM
      `osp-bu-desktop.desktop_growthbook.dt_cb_savings_1_0_assignments`
  ),
  __experimentExposures AS (
    -- Viewed Experiment
    SELECT
      e.user_id as user_id,
      cast(e.variation_id as string) as variation,
      CAST(e.timestamp as DATETIME) as timestamp
    FROM
      __rawExperiment e
    WHERE
      e.experiment_id = 'dt_cb_savings_1.0'
      AND e.timestamp >= '2023-10-30 12:50:00'
      AND e.timestamp <= '2023-11-07 07:15:04'
  ),
  __experimentUnits AS (
    -- One row per user
    SELECT
      e.user_id AS user_id,
      cast('All' as string) AS dimension,
      MIN(e.timestamp) AS first_exposure_timestamp,
      (
        CASE
          WHEN count(distinct e.variation) > 1 THEN '__multiple__'
          ELSE max(e.variation)
        END
      ) AS variation
    FROM
      __experimentExposures e
    GROUP BY
      e.user_id
  ),
  __distinctUsers AS (
    SELECT
      user_id,
      dimension,
      variation,
      first_exposure_timestamp AS timestamp,
      date_trunc(first_exposure_timestamp, DAY) AS first_exposure_date
    FROM
      __experimentUnits
  ),
  __metric as ( -- Metric (Transaction amount (30-day window))
    SELECT
      user_id as user_id,
      m.value as value,
      CAST(m.timestamp as DATETIME) as timestamp
    FROM
      (
        SELECT
          user_id,
          timestamp,
          orderAmount_USD as value
        FROM
          `desktop_growthbook.dt_cb_savings_1_0_transactions`
      ) m
    WHERE
      m.timestamp >= '2023-10-30 12:50:00'
      AND m.timestamp <= '2023-12-07 07:15:04'
  ),
  __userMetricJoin as (
    SELECT
      d.variation AS variation,
      d.dimension AS dimension,
      d.user_id AS user_id,
      (
        CASE
          WHEN m.timestamp >= d.timestamp
          AND m.timestamp <= DATETIME_ADD(d.timestamp, INTERVAL 720 HOUR) THEN m.value
          ELSE NULL
        END
      ) as value
    FROM
      __distinctUsers d
      LEFT JOIN __metric m ON (m.user_id = d.user_id)
  ),
  __userMetricAgg as (
    -- Add in the aggregate metric value for each user
    SELECT
      variation,
      dimension,
      user_id,
      SUM(COALESCE(value, 0)) as value
    FROM
      __userMetricJoin
    GROUP BY
      variation,
      dimension,
      user_id
  )
  -- One row per variation/dimension with aggregations
SELECT
  m.variation AS variation,
  m.dimension AS dimension,
  COUNT(*) AS users,
  'mean' as statistic_type,
  'count' as main_metric_type,
  SUM(COALESCE(m.value, 0)) AS main_sum,
  SUM(POWER(COALESCE(m.value, 0), 2)) AS main_sum_squares
FROM
  __userMetricAgg m
GROUP BY
  m.variation,
  m.dimension
```
p
Code:
```jsx
const growthbook = new GrowthBook({
  apiHost: 'https://cdn.growthbook.io',
  clientKey: 'sdk-fGO40cEeEBCXoNxF',
  enableDevMode: true,
  subscribeToChanges: true,
  trackingCallback: (experiment, result) => {
    if (experiment.key === EXPERIMENT_KEY.SAVINGS_GOALS_V1 && result.key) {
      StatsService.sendStatABTestEvent(
        result.key,
        LocalStorageService.getItemForUser('huuid'),
        experiment.key,
      );
    }
  },
});

export default function ABTestWrapper({ attributes, children }) {
  const setUserAttributes = async (attributes) => {
    const hashedUserId = await hash(LocalStorageService.getUserId());
    LocalStorageService.setItemForUser('huuid', hashedUserId);
    await growthbook.setAttributes({
      id: hashedUserId,
      ...attributes,
    });
  };

  useEffect(() => {
    growthbook.loadFeatures();
  }, []);

  useEffect(() => {
    if (attributes) {
      setUserAttributes(attributes);
    }
  }, [attributes]);

  return <GrowthBookProvider growthbook={growthbook}>{children}</GrowthBookProvider>;
}
```

```jsx
const featureVariant = useFeatureValue(EXPERIMENT_KEY.SAVINGS_GOALS_V1);
{featureVariant === 'control' && (
```
s
@brief-honey-45610 @fresh-football-47124 any thoughts here?
b
Multiple exposures means that in the Experiment Assignment Query, for whatever Identifier Type you are using, we are finding multiple variation IDs for one Identifier ID, e.g.:
```
user_id | experiment_id | variation_id
1234    | abc           | 1
1234    | abc           | 0
```
This normally happens when there is a problem in the code. Some other troubleshooting questions:
• Why is there a variation 2 that has no traffic?
• How are you tracking and setting IDs?
Lastly, the 52/48 split is just random chance.
s
• Why is there a variation 2 that has no traffic?
This is intentional; we are not sending any traffic to it.
• How are you tracking and setting IDs?
We use a hashed UUID as a user identifier for the experiment; it looks something like this:
d6092e3a09794b66caa6ac60ae2e9ead48d50a13b07d715c97277eebbfbb498c
h
With so many multiple exposures, it's likely that there's a difference between the ID being used for randomization and the one you're using in your `trackingCallback`. Is it possible that for some reason the GrowthBook attributes aren't being updated and therefore don't match the `huuid` in local storage?
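One way to guarantee those two values can never drift apart (a sketch based on the code posted above; it assumes a recent JS SDK version, where the experiment result exposes `hashAttribute`/`hashValue`) is to log the value GrowthBook actually hashed, rather than re-reading local storage inside the callback:
```js
const growthbook = new GrowthBook({
  apiHost: 'https://cdn.growthbook.io',
  clientKey: 'sdk-fGO40cEeEBCXoNxF',
  trackingCallback: (experiment, result) => {
    if (experiment.key === EXPERIMENT_KEY.SAVINGS_GOALS_V1 && result.key) {
      // result.hashAttribute / result.hashValue report which attribute and
      // which value GrowthBook actually hashed to pick this variation, so the
      // ID written to BigQuery is, by construction, the ID used for the split.
      StatsService.sendStatABTestEvent(
        result.key,
        result.hashValue, // instead of LocalStorageService.getItemForUser('huuid')
        experiment.key,
      );
    }
  },
});
```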
f
Is
`const hashedUserId = await hash(LocalStorageService.getUserId());`
always returning the same value for the same user? Is there a possibility of a race condition? Like you're doing the split with one value, but in the tracking callback you also call local storage, so it's possible that value has changed in the meantime.
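If a race like that is possible, one way to rule it out (a sketch reusing the ABTestWrapper posted above, not a drop-in fix) is to hash the ID once, set it as the GrowthBook attribute, and only load features and render children after that, so the value used for the split and the value read later can never diverge:
```jsx
import { useEffect, useState } from 'react';

export default function ABTestWrapper({ attributes, children }) {
  const [ready, setReady] = useState(false);

  useEffect(() => {
    if (!attributes) return;
    let cancelled = false;
    (async () => {
      // Hash the user ID once, persist it, and set it as the GrowthBook `id`
      // attribute *before* features are loaded or evaluated, so the value
      // used for the split cannot differ from the one read in the callback.
      const hashedUserId = await hash(LocalStorageService.getUserId());
      LocalStorageService.setItemForUser('huuid', hashedUserId);
      await growthbook.setAttributes({ id: hashedUserId, ...attributes });
      await growthbook.loadFeatures();
      if (!cancelled) setReady(true);
    })();
    return () => {
      cancelled = true;
    };
  }, [attributes]);

  // Hold off rendering children (and therefore any useFeatureValue calls)
  // until attributes and features are in place.
  if (!ready) return null;

  return <GrowthBookProvider growthbook={growthbook}>{children}</GrowthBookProvider>;
}
```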
p
@fresh-football-47124 thank you for your response! This is probably our case.