GA Benchmarking: sum of sessions by channel vs total number provided
-
Hello.
There is a huge difference between the total number of sessions the benchmark report provides and the sum of the sessions across channels (screenshot attached).
If I manually add up the sessions by channel I get 187 + 114 + 59 + 69 + 231 + 121 + 168 + 225 = 1,174. However, the report says 227. That's more than a five-fold difference.
Does anybody know what's happening?
If not, are there other free tools that provide benchmark reports? Thanks!
-
No problem. I looked at Google's documentation for benchmark reporting, but unfortunately it doesn't go into how their benchmarking system works.
-
Thanks! It took me some time to convert your English into my brain's English, but now I understand what's what.
-
Your stats are on the left: 178 vs. 227 (your total sessions are 178), so the total does add up for your own data.
As for the sample data, I think there are two possibilities:
1. It's all samples and averages. Google is grabbing sample data based on your industry category, region, and traffic volume, and it's not necessarily using the exact same data set for Total, Social, Direct, Organic, Referral, etc., which is why the numbers don't add up. Google collects a lot more data on overall traffic than it does on, say, email referrals (different sample sizes, different averages).
2. It's not an average at all. The report is meant to measure your channel volume against a benchmark, and Google never said that benchmark was an average. They could be deriving the benchmark from some other equation entirely.
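Possibility 1 is easy to demonstrate with numbers. Here's a minimal sketch (entirely made-up peer data, not Google's actual method) of how per-channel averages computed over *different* sample pools can sum to far more than the average of total traffic:

```python
# Hypothetical illustration: if each channel benchmark is averaged over a
# different sample of peer sites, the per-channel averages are under no
# obligation to sum to the all-traffic average.
from statistics import mean

# Made-up peer data: sessions per channel; None means this site was not
# included in the sample pool for that channel.
peers = [
    {"total": 200, "organic": 150, "social": None},
    {"total": 250, "organic": None, "social": 400},
    {"total": 230, "organic": 180, "social": 350},
]

def channel_benchmark(channel):
    # Average only over the sites that were sampled for this channel.
    values = [p[channel] for p in peers if p[channel] is not None]
    return mean(values)

total_bm = channel_benchmark("total")      # (200 + 250 + 230) / 3 ≈ 226.7
organic_bm = channel_benchmark("organic")  # (150 + 180) / 2 = 165
social_bm = channel_benchmark("social")    # (400 + 350) / 2 = 375

# The channel benchmarks sum to well above the total benchmark:
print(organic_bm + social_bm)  # 540.0 — more than double total_bm
```

The same effect at a larger scale would explain a per-channel sum of 1,174 against a reported total of 227.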
There are definitely other tools that can give you benchmark reports, like SimilarWeb and Alexa; I'm sure some Mozzers can list others.