How can I do A/B testing without creating two separate URLs, as Google Analytics Experiments requires?
-
Hello Experts,
I want to run an A/B test on my page. In Google Analytics Experiments you have to create two pages: 1) the original page and 2) Variant 1. I don't want to take that approach, i.e. I don't want to create two pages. Is it possible with only one page but two different events, or something else? If yes, which is the best tool?
Thanks!
Regards!
-
Hi Martijn,
Is it possible via Google Tag Manager?
Thanks!
-
Hi
Yeah, I have used quite a few tools.
My personal favourite is HP Autonomy: you get a full management team who help set up the tests, do the analysis, suggest tests, etc. It's a wonderful service, but it comes at a high price.
Something similar, but slightly cheaper, is www.optimizely.com. Again, this is more at the enterprise end, but they do offer a free 30-day trial. It might be worth looking into these two; they may offer a self-service option.
There are also some more reasonably priced platforms that target mid-size companies.
One of my clients, whom I have just taken on, uses Unbounce. While I haven't yet managed to use their solution myself, I did get one of their reps at MozCon to show me the platform, and I'm looking forward to using it. It seems like a very good platform for the price.
With most of this software, it's all about what you do. You have to find the tests, set them up and analyse them, but as long as you're putting in good tests you will get great results.
If you use the Adobe stack, they offer an A/B testing suite.
Also, there is Maxymiser. It's been a few years since I used it, but it was good at the time and has probably evolved since.
There are loads of options to choose from. If you use a platform like Shopify or WordPress, it might be worthwhile seeing which of these integrate with your current site to make your life easier. While this article from Mashable is pretty old, it does cover some good platforms, including some I've never heard of, so it might be worth checking out: http://mashable.com/2015/01/30/ab-testing-tools/
Thanks
Andy
-
Try this: you can do the same thing with Google Content Experiments without creating multiple pages, handling everything through JavaScript: https://developers.google.com/analytics/devguides/collection/analyticsjs/experiments
It's unfortunately a lesser-known feature of Content Experiments, although I would suggest moving to Google Optimize in the future.
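To make the single-URL idea concrete, here is a minimal sketch of the client-side pattern: one page, with the variant chosen and applied in JavaScript. The experiment ID, the `headline` element ID, and the fallback `chooseVariation` helper are illustrative assumptions, not part of Google's API; in a real Content Experiments setup you would load Google's `cx/api.js` script and let `cxApi.chooseVariation()` pick the variant instead.

```javascript
// Sketch of a single-URL A/B test: pick a variant client-side, swap the
// page content in place, and report the chosen variant to analytics.
// 'YOUR_EXPERIMENT_ID' and the 'headline' element are placeholders.

// Fallback chooser (stands in for cxApi.chooseVariation): reuses a
// previously stored variant so a returning visitor sees the same version,
// otherwise assigns one at random.
function chooseVariation(storedValue, numVariations) {
  var v = parseInt(storedValue, 10);
  if (isNaN(v) || v < 0 || v >= numVariations) {
    v = Math.floor(Math.random() * numVariations); // new visitor: random bucket
  }
  return v;
}

function runExperiment(doc, storage) {
  var NUM_VARIATIONS = 2; // 0 = original, 1 = variant
  var variation = chooseVariation(storage.variation, NUM_VARIATIONS);
  storage.variation = String(variation); // persist (cookie/localStorage in practice)

  if (variation === 1) {
    // Swap content in place: one URL, two experiences.
    var headline = doc.getElementById('headline');
    if (headline) headline.textContent = 'Variant headline';
  }

  // Report the chosen variant to Google Analytics, if it is loaded.
  if (typeof ga === 'function') {
    ga('set', 'expId', 'YOUR_EXPERIMENT_ID'); // placeholder experiment ID
    ga('set', 'expVar', String(variation));
    ga('send', 'pageview');
  }
  return variation;
}
```

Because the variant is applied after the page loads, tools built on this pattern usually hide or delay the swapped element briefly to avoid a visible "flicker" of the original content.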
-
Hi Jaume,
Thanks for your response!
But other than Google, and other than creating two separate pages, is it possible via another method or tool?
Thanks!
-
Google is one of the best tools for what you want to do, but I think you have to create two versions of the page. Greetings