# experimentation
Good morning team! I noticed you guys released a new version yesterday with new features. Kudos! I was just looking over the Bandit experimentation in the docs and I have a question. We have a bandit experiment we would like to run, but our scenario needs the bandit experimentation to be calculated by a dimension (the user's geographical state). We still have a single metric to optimize towards, but our results vary significantly by this dimension, thus we would like to optimize based on this. How could we run this experiment today?
Hi Rodolfo, it's great that you're excited about Bandits. When you say that the bandit experimentation needs to be calculated by geographical state, does that mean that the treatment applied varies by geographical state (as in a contextual bandit), or just that the metric varies significantly across geographical states? If the latter, at the end of the experiment, do you plan on rolling out the same variation across all states, or could you potentially roll out different variations in different states or regions? If it's by region, you may want to consider different bandits for different regions. Thanks, Luke
Hi Luke! Thanks for the reply! The treatment would apply differently per state, as the results are too different. What would be your suggestion in this scenario? Keep in mind we would be talking about all 50 US states; if we had to create an experiment per state, it might be cumbersome just to manage. I'm not sure if the API would make it any easier to set up. Looking forward to your answer. Thank you!
Hi Rodolfo, different Luke here! This seems like a case for contextual bandits which can optimize within each dimension. We don't currently have support for contextual bandits, but we plan on working towards them.
An experiment per state would definitely be a bit clunky in the UI. We are waiting to build API endpoints until Bandits are out of Beta, so it would require quite a bit of manual setup. In the meantime, you could run just 3 or 4 bandits with pre-grouped regions of states.
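A minimal sketch of what that regional grouping could look like on the client side; the region definitions (US Census regions) and the experiment key format are illustrative assumptions, not anything from this thread or a specific product API:

```python
# Hypothetical sketch: group the 50 US states into a few regions,
# then run one bandit experiment per region instead of one per state.
US_REGIONS = {
    "Northeast": ["CT", "ME", "MA", "NH", "RI", "VT", "NJ", "NY", "PA"],
    "Midwest": ["IL", "IN", "MI", "OH", "WI",
                "IA", "KS", "MN", "MO", "NE", "ND", "SD"],
    "South": ["DE", "FL", "GA", "MD", "NC", "SC", "VA", "WV",
              "AL", "KY", "MS", "TN", "AR", "LA", "OK", "TX"],
    "West": ["AZ", "CO", "ID", "MT", "NV", "NM", "UT", "WY",
             "AK", "CA", "HI", "OR", "WA"],
}

# Invert to a state -> region lookup for fast assignment at request time.
STATE_TO_REGION = {
    state: region for region, states in US_REGIONS.items() for state in states
}

def bandit_experiment_key(user_state: str) -> str:
    """Return the (hypothetical) bandit experiment key a user should be
    enrolled in, based on their geographical state."""
    region = STATE_TO_REGION[user_state]
    return f"homepage-bandit-{region.lower()}"
```

With this mapping you manage 4 experiments instead of 50, at the cost of optimizing per region rather than per state; the lookup would decide which bandit a given user is bucketed into, e.g. `bandit_experiment_key("CA")` returns `"homepage-bandit-west"`.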
Hi @helpful-application-7107! Thank you so much for the insights! I'll give it some thought and potentially give it a go. Many thanks!!!!!