# ask-questions
In this SQL every row's value is 1, and the same user_id can be on many rows: `SELECT 1 AS value, c.created_at AS timestamp, c.employee_account_id AS user_id`. I have set up the experiment with conversion type COUNT. The thing I can't figure out is whether I need to have SUM(value) in the User Value Aggregation field, because I get a different result when I have it.
Hi Samuel - can you give us a bit more context as to what you're looking to do?
what does the metric represent?
Thanks! Every row is a contact request, with a request id, sent from a user_id. So every row is a request and every row has a user_id field. One user_id can be on many rows.
We want to see which variation has the most sent requests, and the experiment is split on user_id.
here is the full sql
```sql
SELECT
  1 AS value,
  c.created_at AS timestamp,
  c.employee_account_id AS user_id,
  c.company_id AS company_id
FROM
  demando-data.prod_public.connection c
  INNER JOIN demando-data.prod_public.account AS a ON = c.employee_account_id
WHERE
  c.assistant_id IS NULL
  AND c.initiated_by = 'employee'
```
okay, so where are you having an issue?
When I use SUM in User Value Aggregation on value,
I get a different result than when I don't have SUM there.
Which is the right result?
@fresh-football-47124 can we book a short meeting with someone on your team? It feels like I can't trust these results as they are right now.
Right now we track an experiment by these KPIs: sent requests, how many requests are accepted, and the rate (%) of sent requests that are accepted. At the moment the results point in an illogical direction: the variation's contact requests are +9%, accepted is −6%, yet the rate is +37% …
Our default is `COALESCE(SUM(value), 0)`, so if you have NULLs in your data and you have `SUM(value)` as your aggregation, then you can get different results.
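A toy sqlite sketch (hypothetical data, not your actual tables) of why NULLs make `SUM(value)` and `COALESCE(SUM(value), 0)` disagree: a user whose rows are all NULL aggregates to NULL under plain SUM, but to 0 under the coalesced form.

```python
import sqlite3

# Toy metric rows: user "b" has only a NULL value.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE metric (user_id TEXT, value INTEGER)")
con.executemany(
    "INSERT INTO metric VALUES (?, ?)",
    [("a", 1), ("a", 1), ("b", None)],
)

# Plain SUM(value): user "b" aggregates to NULL, not 0.
plain = con.execute(
    "SELECT user_id, SUM(value) FROM metric GROUP BY user_id ORDER BY user_id"
).fetchall()
print(plain)  # [('a', 2), ('b', None)]

# COALESCE(SUM(value), 0): user "b" becomes 0 and is counted as such.
coalesced = con.execute(
    "SELECT user_id, COALESCE(SUM(value), 0) FROM metric "
    "GROUP BY user_id ORDER BY user_id"
).fetchall()
print(coalesced)  # [('a', 2), ('b', 0)]
```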
you mean SUM(COALESCE(value, 0)) ?
I get the same strange results 😞
Screenshot 2024-02-18 at 14.25.58.png,Screenshot 2024-02-18 at 14.26.04.png
Another question: can I set the date in the experiment to a specific earlier date, like the 15th of January? Or is it too late now?
I'm telling you to set that field to `COALESCE(SUM(value), 0)`, not `SUM(COALESCE(value, 0))`.
I think I had that before.
Can you set it to that and then share the results table with the full statistics? You can share it in a DM if you don't want to publish it
Screenshot 2024-02-21 at 07.56.41.png
Now I get:
Please set the field under User Value Aggregation to the number `1`. This is what I have been asking you to do.
Like this?
This makes no sense.
I also get an error.
Ah, I thought it would work for sure, maybe MAX(1) would work instead, I haven't tested it. This makes sense because you seem to be trying to replicate a binomial metric using a count metric. I don't think you should be doing this at all because you should rely on our binomial metric to handle this type of metric, but you seem to want to check the results with a count metric.
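A toy sqlite sketch of what that capping does (illustrative only, not the product's actual aggregation engine): with `MAX(1)` every user who has at least one row aggregates to exactly 1, which is the 0/1-per-user shape of a binomial metric, whereas the default summing keeps the full per-user counts.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE metric (user_id TEXT, value INTEGER)")
# User "a" has three metric rows, user "b" has one.
con.executemany("INSERT INTO metric VALUES (?, 1)",
                [("a",), ("a",), ("a",), ("b",)])

# Default-style aggregation: sums rows, so users can exceed 1.
sums = dict(con.execute(
    "SELECT user_id, SUM(value) FROM metric GROUP BY user_id"))
print(sums)   # {'a': 3, 'b': 1}

# MAX(1)-style aggregation: every user with any row collapses to 1.
maxes = dict(con.execute(
    "SELECT user_id, MAX(1) FROM metric GROUP BY user_id"))
print(maxes)  # {'a': 1, 'b': 1}
```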
If you remind me what your goal is, maybe I can help pull us out of this back and forth and provide higher-level feedback/suggestions.
There's nothing advanced I want to achieve. I want to count all accepts across all users.
Ok, and an "accept" is one row in this query?
And it's ok if one user has more than one accept? e.g. their value can be 2, 3, 4 or more?
And in the binomial I want to get what percentage of all sent requests are accepted.
Yes, there can be more than one accept per user.
One user can send many requests,
and a request can be accepted.
Now I just want to count accepts in one experiment,
and I want the result to be logical.
> Now I just want to count accepts in one experiment
What do you mean by this?
Sorry, I meant metric, not experiment.
One metric in my experiment is to count accepts.
One metric is to count sent.
One metric is a binomial to see accept / sent.
Ok, and there can be multiple requests sent by one user?
Ok, then you want three metrics, all three of them will be type "count":

**Metric 1 (count sent)**
- Type: count
- SQL: one row per sent request, with `1 as value` in the query
- User Value Aggregation: leave it blank so it accepts the default; you basically want to sum the rows, and the default will do that if you have `1 as value`
- Denominator: do not pick one. We will use all experiment users as default

**Metric 2 (count accepted)**
- Type: count
- SQL: one row per accepted request, with `1 as value` in the query
- User Value Aggregation: again leave it blank
- Denominator: do not pick one. We will use all experiment users as default

**Metric 3 (ratio accepted / sent)**
- Type: count (not binomial, that's something different where each user is either a 0/1)
- SQL: same as count accepted above
- Denominator: `count sent` should be picked as your denominator
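To sanity-check the arithmetic behind these three metrics, here is a toy sqlite sketch (invented data and an invented `accepted` flag, purely for illustration): metrics 1 and 2 each sum a constant `1` per row per user, and the ratio metric divides the accepted total by the sent total.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE requests (user_id TEXT, accepted INTEGER)")
# One row per sent request; accepted = 1 if that request was accepted.
con.executemany(
    "INSERT INTO requests VALUES (?, ?)",
    [("a", 1), ("a", 0), ("a", 1), ("b", 0), ("b", 1)],
)

# Metric 1 (count sent): 1 as value per row, summed per user.
sent = dict(con.execute(
    "SELECT user_id, SUM(1) FROM requests GROUP BY user_id"))
# Metric 2 (count accepted): 1 as value per accepted row, summed per user.
acc = dict(con.execute(
    "SELECT user_id, SUM(1) FROM requests WHERE accepted = 1 GROUP BY user_id"))
print(sent)  # {'a': 3, 'b': 2}
print(acc)   # {'a': 2, 'b': 1}

# Metric 3 (ratio): total accepted over total sent across users.
ratio = sum(acc.values()) / sum(sent.values())
print(ratio)  # 0.6
```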
This could also be achieved a bit more clearly with fact tables, but we can talk about that later if you're interested, for now the above should work for you. Feel free to share screenshots and I can tell you if things look right.
Okay, thanks.
Why not binomial?
Isn't that for this use case, to calculate a %?
Can you give an example of when to use binomial?
Yeah, it won't be rendered as a percentage, because we don't guarantee that the numerator is < the denominator. You could use the same flow I describe to compute `sent / accepted`, which would be larger than 1. But at the end of the day it's because binomial forces users to be either 0 or 1, which isn't what you want here.
You should use binomial when you want to count the % of USERS that ever do some action.
e.g. if you wanted *% of users who accept at least one request*, then you could use binomial.
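As a sketch of the distinction (toy data, users and counts invented): a count metric keeps each user's full number of accepts, while a binomial-style metric collapses each user to 0/1 ("ever accepted?") and reports the share of exposed users who converted.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accepts (user_id TEXT)")
# User "a" has 3 accepts, "b" has 1; "c" is in the experiment but never accepts.
con.executemany("INSERT INTO accepts VALUES (?)",
                [("a",), ("a",), ("a",), ("b",)])
exposed_users = ["a", "b", "c"]

# Count-style: accepts per user, values can exceed 1.
per_user = dict(con.execute(
    "SELECT user_id, COUNT(*) FROM accepts GROUP BY user_id"))
print(per_user)  # {'a': 3, 'b': 1}

# Binomial-style: each user is 0 or 1, then take the share of exposed users.
converted = {u for (u,) in con.execute("SELECT DISTINCT user_id FROM accepts")}
pct = len(converted) / len(exposed_users)
print(round(pct, 3))  # 0.667
```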
Okay, so I use it when it's 1-to-1 with users?
The Fact Table interface makes this a bit clearer
Yeah, that's right.
Okay, got it, thanks!