Issue with GA tracking and Native AMP
Hi everyone,
We recently pushed a new version of our site (winefolly.com), which is now fully native AMP on WordPress (using the official AMP for WordPress plugin). As part of the update, we also switched over to HTTPS. In hindsight, we probably should have rolled out the AMP version and the HTTPS change as separate updates.
Since the update, traffic in GA has dropped significantly even though the tracking code appears to be in place correctly. I'm also having a hard time getting our previous GA views working again.
The three views are:
- Sitewide (shop.winefolly.com and winefolly.com)
- Content only (winefolly.com)
- Shop only (shop.winefolly.com)
The sitewide view seems to be working, though it's hard to know for sure: traffic looks very low (around 10 active users at any given time), and I suspect it's mostly just picking up the shop traffic.
The content-only view shows maybe one or two users, and often none at all. I tried a number of different filters to restrict the view to the main site's content pages; in one instance the filter worked, but half an hour later it reverted to showing no traffic. The filter is set to Custom > Exclude > Request URI with the following regex pattern:
^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/.
When I verify the filter, it strips out everything not related to the main site's content, but after I save the filter and view the updated results, the changes aren't reflected. I did read that there's a delay before filters are applied and that verification only runs against a subset of the available data, but I want to be sure I'm adding the filters correctly.
I also tried a predefined filter to exclude hostnames equal to shop.winefolly.com, but that didn't work either.
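One thing worth noting: GA's Request URI field contains only the path (and query string), never the hostname, so hostname alternatives like ^shop.winefolly.com$ can never match in a Request URI filter; hostname exclusions belong in a Hostname filter instead. A minimal sketch for sanity-checking a path-only exclude pattern offline (the simplified pattern and sample URIs below are illustrative, not our actual filter):

```python
import re

# Hypothetical simplified exclude pattern: shop-related pages by path prefix.
# GA's "Request URI" field never includes the hostname, so patterns like
# ^shop.winefolly.com$ cannot match here.
EXCLUDE = re.compile(r"^/(products|account|checkout|collections|cart|pages)(/|$)")

def is_excluded(request_uri: str) -> bool:
    """Return True if the request URI would be filtered out of the content view."""
    return bool(EXCLUDE.search(request_uri))

assert is_excluded("/products/wine-map")
assert is_excluded("/cart")
assert not is_excluded("/blog/wine-basics")
```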
The shop view seems to be working, but the tracking code is added via Shopify, so it makes sense that it would continue working as before.
The first thing I noticed when I checked the views is that they were still set to http, so I updated the URLs to https. I then checked the GA tracking code, which is added as a JSON object in the Analytics settings of the WordPress plugin. Unfortunately, while GA does seem to be recording some traffic, none of the GA validators pick up the AMP tracking code (added via the amp-analytics tag), despite the plugin confirming the JSON is valid.
This morning I decided to try a different approach and added the tracking code via Google Tag Manager, and I also added the new HTTPS domain to Google Search Console, but alas, no change.
I spent the whole day yesterday reading every post I could find on the topic, but wasn't able to find a solution, so I'm really hoping someone on Moz can shed some light on what I'm doing wrong.
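For reference, the plugin's output should be equivalent to the standard amp-analytics snippet for the googleanalytics vendor, along these lines (UA-XXXXX-Y is a placeholder, not our actual property ID):

```html
<amp-analytics type="googleanalytics">
<script type="application/json">
{
  "vars": { "account": "UA-XXXXX-Y" },
  "triggers": {
    "trackPageview": {
      "on": "visible",
      "request": "pageview"
    }
  }
}
</script>
</amp-analytics>
```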
Any suggestions or input would be very much appreciated.
Cheers,
Chris (on behalf of WineFolly.com) -
Lots going on here, so here's a laundry list of follow-up questions and thoughts for you...
Are you seeing AMP results showing up in the Search Console? Are you seeing them indexed as intended?
If you're running native AMP, you won't be able to diagnose pages by /amp URL formatting. It might be worth firing off an event, or a custom dimension in GA, for AMP = Yes/No or something like that.
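One way to set such a dimension (a sketch, assuming custom dimension 1 has been created in GA Admin, e.g. as "ampPage") is via extraUrlParams in the amp-analytics config, which appends cd1=amp to every hit:

```html
<amp-analytics type="googleanalytics">
<script type="application/json">
{
  "vars": { "account": "UA-XXXXX-Y" },
  "extraUrlParams": { "cd1": "amp" },
  "triggers": {
    "trackPageview": { "on": "visible", "request": "pageview" }
  }
}
</script>
</amp-analytics>
```

You could then segment AMP vs. non-AMP traffic in GA by that dimension.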
For the sitewide view, have you tested loading pages in a private browser and an incognito mobile browser to see whether they show up in GA Real-Time in each of the three views when they're supposed to?
It looks like you might be using Cloudflare - I haven't dealt with an AMP site that uses it, but have you checked whether there are compatibility issues or anything you need to activate?
Are any Google Tag Manager tags set to fire on HTTPS pages only?
Are any GA filters in place that specify HTTP/HTTPS that need to be broadened?
Your amp-analytics code seems to match the one on a site that's functioning as intended, so I don't think it's a formatting issue.
For the GA view filter - it seems like you should be able to simply include or exclude traffic by hostname (shop.winefolly.com); why the added complexity beyond that?
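As a sanity check, hostname patterns for the views could be as simple as the following (illustrative only, with the dots escaped so lookalike hostnames can't slip through; these go in a Hostname filter field, not Request URI):

```python
import re

# Hostname patterns for the shop-only and content-only views.
SHOP_HOST = re.compile(r"^shop\.winefolly\.com$")
CONTENT_HOST = re.compile(r"^(www\.)?winefolly\.com$")

assert SHOP_HOST.match("shop.winefolly.com")
assert CONTENT_HOST.match("www.winefolly.com")
assert not CONTENT_HOST.match("shop.winefolly.com")
```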