Page Tracking using Custom URLs - is this viable?
-
Hi Moz community!
I’ll try to make this question as easy to understand as possible, but please excuse me if it isn’t clear.
I joined a new team a few months ago and found out that on some of our most popular pages we use “custom URLs” to track page metrics within Google Analytics. NOTE: I say “custom URLs” because that is the best way for me to describe them.
As an example:
-
This is the page our users see: http://usnews.rankingsandreviews.com/cars-trucks/Ram_HD/2012/photos-interior/
-
But this is the URL we have coded on the page: cars-trucks/used-cars/reviews/2012-Ram-HD/photos-interior/ (within the custom variable script, labeled as “var l_tracker=”)
-
It is this custom URL that we use within GA to look up metrics about this page.
-
This is just one example of many across our site set up to do the same thing.
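For anyone unfamiliar, this pattern usually amounts to a “virtual pageview”: the on-page script overrides the path GA records instead of letting it default to `location.pathname`. A minimal sketch of what our implementation presumably does (the stubbed `ga()` queue and the variable names are illustrative, not our actual code):

```javascript
// Minimal stub of the analytics.js command queue so this runs standalone.
// In production this would be the real ga() function loaded by the GA snippet.
var calls = [];
function ga() { calls.push([].slice.call(arguments)); }

// The path users (and Googlebot) actually see:
var realPath = '/cars-trucks/Ram_HD/2012/photos-interior/';

// The hand-written "custom URL" stored on the page:
var l_tracker = '/cars-trucks/used-cars/reviews/2012-Ram-HD/photos-interior/';

// Override the page field, then send the pageview. GA records l_tracker,
// not realPath, which is why reports have to be looked up by the custom URL.
ga('set', 'page', l_tracker);
ga('send', 'pageview');
```

The net effect: GA's content reports show a URL that never resolves on the live site.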
-
Here is a second example:
- Page available to users: http://usnews.rankingsandreviews.com/cars-trucks/Cadillac_ATS/2015/
- Custom “var l_tracker=” /cars-trucks/2015-Cadillac-ATS/overview/
- NOTE: There is a small amount of fear that the above method was implemented years ago as a workaround for a poorly structured URL architecture. This hasn’t been validated, but it is a question that arose.
Main Questions:
- Is the above implementation a normal and often-used method to track pages in GA? (Coming from an Omniture company before, this is not how we handled page-level tracking.)
- Team members at my current company are divided on this method. Some believe it is not a proper implementation and are concerned that trying to hide these URLs from Google will raise red flags (i.e., fake URLs in general = bad).
- I cannot find any reference to this method anywhere on the InterWebs
- If this method is not normal: any recommendations on a solution to address it?
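For context on what a more conventional setup might look like: the usual Universal Analytics approach is to let GA record the real, crawlable path and attach any friendly label through a custom dimension (or a content grouping) rather than a fake URL. A hedged sketch, with a stubbed `ga()` queue so it runs standalone and a hypothetical `dimension1` slot:

```javascript
// Stubbed analytics.js command queue, standing in for the real ga().
var calls = [];
function ga() { calls.push([].slice.call(arguments)); }

// Report the page's real, crawlable path...
ga('set', 'page', '/cars-trucks/Cadillac_ATS/2015/');

// ...and carry the human-friendly label separately. 'dimension1' is a
// hypothetical custom-dimension slot configured in the GA property settings.
ga('set', 'dimension1', '2015-Cadillac-ATS/overview');

ga('send', 'pageview');
```

This keeps the URL in reports identical to the URL Google crawls, so nothing has to be hidden.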
Potential Problems?
- Google Webmaster Tools is currently cataloguing these tracking URLs in the Crawl Errors report. Any concerns about this?
- The team wants to block the URLs in the robots.txt file, but some team members are concerned this may raise a red flag with Google and hurt us more than help us.
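For reference, blocking them would presumably look something like the fragment below (the path prefix is taken from the first example above and is only illustrative); note that robots.txt only discourages crawling of matching paths, and it would not clear errors already reported:

```text
# Hypothetical robots.txt rule blocking the tracking-only paths
User-agent: *
Disallow: /cars-trucks/used-cars/reviews/
```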
Thank you in advance for any insight and/or advice.
Chris
-
-
Any help?