# announcements
r
Hey @future-teacher-7046 We are currently running an A/B test where the numbers look pretty weird: the total number of signups shown in the experiment does not match what we can see in Mixpanel. It is basically a fraction of what it should be. As you can see in the image, over the last few days (the experiment started on January 16) we should have around 1,500 signups, but the experiment shows only a bit over 600. Could you please help us understand why the numbers do not match?
@future-teacher-7046 Could you help us understand what is happening here? I will add two more images showing how it looks when considering just this test.
f
We filter out metric conversions if they don't happen within the conversion window configured in GrowthBook. Can you check for that metric in GrowthBook and see what the conversion window is set to?
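To illustrate the effect (a minimal sketch, not GrowthBook's actual query logic; the function name, event shape, and window value are assumptions): a conversion-window filter only counts conversions that happen within N hours of a user's first exposure, so later signups silently disappear from the experiment totals even though Mixpanel still shows them.

```javascript
// Sketch: drop conversions that fall outside the conversion window.
// The event shape ({ userId, ts }) and helper name are illustrative.
function countConversions(exposures, conversions, windowHours) {
  const windowMs = windowHours * 60 * 60 * 1000;

  // Find each user's first exposure timestamp.
  const firstSeen = new Map();
  for (const e of exposures) {
    const prev = firstSeen.get(e.userId);
    if (prev === undefined || e.ts < prev) firstSeen.set(e.userId, e.ts);
  }

  // Count only conversions inside [exposure, exposure + window].
  let count = 0;
  for (const c of conversions) {
    const start = firstSeen.get(c.userId);
    if (start !== undefined && c.ts >= start && c.ts - start <= windowMs) {
      count++;
    }
  }
  return count;
}
```

With a 72-hour window, a signup that happens 100 hours after exposure is excluded from the experiment count, which is one common reason experiment totals lag behind raw Mixpanel numbers.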
r
this is the window for the signup metric
This issue is very odd; we always test things the same way, or at least we try to, but this one experiment has this strange problem
f
Hmm. Debugging Mixpanel issues can be a little tedious. Basically, you need to view the queries in GrowthBook, copy the JQL into Mixpanel, and start deleting filtering logic until all the conversions you expect show up. That can help narrow down what the issue is.
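The filter-deletion approach can be sketched like this (illustrative only; the filter names and event fields are assumptions, not the actual JQL): apply the query's filters cumulatively and count how many events survive each one, so the first big drop points at the filter responsible for the missing conversions.

```javascript
// Sketch: apply each named filter in turn and report survivors,
// to locate which filter is dropping the missing conversions.
function filterFunnel(events, namedFilters) {
  const report = [];
  let remaining = events;
  for (const [name, pred] of namedFilters) {
    remaining = remaining.filter(pred);
    report.push({ filter: name, remaining: remaining.length });
  }
  return report;
}
```

Running this against an exported event list with the query's filters listed in order gives a step-by-step count, which is faster than deleting clauses one at a time by hand.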
r
@fresh-football-47124 or @future-teacher-7046 I am trying to debug this one today. Do I have the option to change the experiment's query so I can debug it better?
f
Yes, you can edit the experiment query from the data source page
r
I tried several different things there, but the numbers still do not match what I see in Mixpanel.
Could this experiment somehow be conflicting with another one?
I think I finally found the line of the query that is filtering out the users:
else if (state.variation !== event.properties["Variant name"]) {
  state.multipleVariants = true;
  continue;
}
But how are users seeing multiple variants? Our implementation is the same as before, so why is this happening?
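For context, the check quoted above excludes any user whose exposure events carry more than one variant name. A minimal sketch of that per-user logic (illustrative; not the actual GrowthBook JQL reducer):

```javascript
// Sketch: scan one user's exposure events in order; if a second, different
// variant name ever appears, flag the user so they can be excluded.
function classifyUser(exposureEvents) {
  const state = { variation: null, multipleVariants: false };
  for (const event of exposureEvents) {
    const variant = event.properties["Variant name"];
    if (state.variation === null) {
      state.variation = variant; // first exposure fixes the variant
    } else if (state.variation !== variant) {
      state.multipleVariants = true; // conflicting variant seen
    }
  }
  return state;
}
```

So a user whose events say "A", then "B", is flagged and dropped from the analysis, which would explain the shortfall relative to the raw Mixpanel signup count.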
f
what are you randomizing by?
r
Nothing has changed regarding that, to be honest; we get the traffic split and that is it. I saw that we sometimes trigger the callback more than once, but for some reason it is now returning different values even though it is the same user
f
maybe you could DM your SDK implementation code, and we can see if there is anything up
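One thing worth checking (a hedged guess, not a confirmed diagnosis): variant assignment is normally a deterministic hash of a stable attribute, so the same user must always hash into the same bucket. If the attribute passed to the SDK changes between calls (for example, an anonymous id regenerated on each page load), the same person lands in different variants and trips the multiple-variants filter. A minimal sketch of deterministic bucketing (the FNV-1a hash and bucket math here are illustrative, not GrowthBook's actual algorithm):

```javascript
// Sketch: deterministic bucketing. The same (seed, userId) pair always
// yields the same variant; an unstable userId yields inconsistent ones.
function fnv1a(str) {
  let h = 0x811c9dc5; // FNV-1a 32-bit offset basis
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0; // FNV prime, kept unsigned
  }
  return h;
}

function assignVariant(seed, userId, variants) {
  const bucket = fnv1a(seed + ":" + userId) % variants.length;
  return variants[bucket];
}
```

If the randomization attribute is stable, repeated callback invocations are harmless because they always resolve to the same variant; different values on repeat calls strongly suggest the attribute itself is changing between calls.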