# ask-questions
p
Hey everyone! 👋 Hope you're all having a productive week! I'm diving into the world of SEO A/B testing and was wondering if anyone in this brilliant community has experience using Cloudflare Workers Edge Apps (or similar edge compute solutions) together with GrowthBook for this purpose?

Specifically, I'm trying to achieve something akin to how SEOPilot operates, where you designate specific URLs as either control or variant for SEO experiments. The beauty of the edge-side approach is that the changes are server-side rendered, making them fully crawlable by Googlebot and other search engines. Integrating this with tools like BigQuery, using Google Search Console data to track ranking position and other SEO metrics, seems incredibly powerful.

However, I'm currently scratching my head over the best way to implement this within GrowthBook. My initial thought is to create two separate experiments:
• Control: forced to 100% control.
• Variation: forced to 100% variation.

Then I'd adapt the BigQuery results query to pull data based on both experiment machine names rather than a single one (rough sketches of both pieces below). Has anyone explored a similar approach, or have any insights or alternative methods to share? I'm all ears for your wisdom and any "aha!" moments you might have had. Thanks in advance for your help!
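To make that concrete, here's roughly the edge piece I'm picturing. Totally untested sketch; the hash, the selector, and the title rewrite are all placeholders for whatever change is actually under test:

```ts
// Minimal sketch, not tested: URLs (not users) are the randomization unit,
// bucketed deterministically so every crawl of a page sees the same version.
// A real version would also check the content-type before rewriting.

// FNV-1a: tiny, dependency-free, deterministic hash of the pathname.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return hash >>> 0;
}

export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    const isVariant = fnv1a(url.pathname) % 2 === 1;

    const origin = await fetch(request);
    if (!isVariant) return origin; // control pages pass through untouched

    // The variant change lands in the HTML itself, so Googlebot sees it
    // on the initial response with no client-side flicker.
    return new HTMLRewriter()
      .on("title", {
        element(el) {
          el.setInnerContent("Placeholder variant title"); // hypothetical change
        },
      })
      .transform(origin);
  },
};
```

And on the analysis side, something like this for the adapted assignment query. The dataset, table, column, and experiment names are all made up, and it assumes GrowthBook is configured with the page URL as the identifier, since URLs rather than users are the unit here:

```ts
// Sketch of the adapted GrowthBook experiment assignment query: one query
// pulls exposures from both forced experiments and collapses them into a
// single logical experiment. All names below are placeholders.
const assignmentQuery = `
  SELECT
    page_url AS url,                      -- randomization unit
    first_served_at AS timestamp,
    'seo_title_test' AS experiment_id,    -- single logical experiment
    CASE experiment_name
      WHEN 'seo_title_test_control' THEN '0'
      ELSE '1'
    END AS variation_id
  FROM \`my_project.analytics.seo_experiment_exposures\`
  WHERE experiment_name IN (
    'seo_title_test_control',
    'seo_title_test_variant'
  )
`;
```

The idea being that the analysis layer then sees one experiment with two variations, even though assignment was forced through two separate experiments.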
h
Depending on how you implement it, SEO A/B testing tends to be a slightly different beast from user A/B testing. It sounds like you're trying to set things up so that your experimentation/randomization unit is URLs instead of users. You'd likely also define a success metric like page rank/position, is-indexed, is-crawled, or even amount of organic traffic (much further downstream). These tend to be quite difficult to measure accurately and also come with considerable lag. There's also the issue of URL bucketing/randomization; plain randomization over URLs can be surprisingly noisy (quick sketch of one mitigation below). Wondering if you can expand a bit on the goals and methodology you're exploring?
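To make the bucketing point concrete: organic traffic across URLs is heavy-tailed, so a plain coin flip per URL can easily land most of your traffic in one arm. One common mitigation is stratified pairing: sort pages by baseline traffic, pair neighbors, and randomize within each pair. A rough sketch with made-up names and data:

```ts
// Sketch only: stratified pairing of URLs for an SEO test. Baseline
// clicks here would come from something like Google Search Console.
interface Page {
  url: string;
  baselineClicks: number;
}

function stratifiedAssignment(
  pages: Page[],
  seed = 42 // must be nonzero for xorshift below
): Map<string, "control" | "variant"> {
  // xorshift32: tiny deterministic PRNG so the split is reproducible.
  let s = seed >>> 0;
  const rand = () => {
    s ^= s << 13; s >>>= 0;
    s ^= s >>> 17;
    s ^= s << 5; s >>>= 0;
    return s / 4294967296;
  };

  // Sort by baseline traffic, then pair adjacent pages so each pair is
  // roughly comparable; randomize which side of the pair gets the variant.
  const sorted = [...pages].sort((a, b) => b.baselineClicks - a.baselineClicks);
  const out = new Map<string, "control" | "variant">();
  for (let i = 0; i + 1 < sorted.length; i += 2) {
    const variantFirst = rand() < 0.5;
    out.set(sorted[i].url, variantFirst ? "variant" : "control");
    out.set(sorted[i + 1].url, variantFirst ? "control" : "variant");
  }
  // An odd page out is simply left out of the test in this sketch.
  return out;
}
```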