Big fan of Ubersuggest as well for related keywords. Basically Google's related searches at the bottom of the page on steroids.
If you're looking for high volume, Grepwords is another good one, but paid like SEMrush.
If you want to make sure Google is crawling your webpages as quickly as possible after making changes, make sure to use Google Webmaster Tools. Either use the Fetch as Google tool and submit individual updated pages, or, if there are many page updates, submit a new sitemap file.
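For reference, a bare-bones sitemap file looks something like this (the URL and date are just placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per updated page -->
  <url>
    <loc>http://www.example.com/updated-page/</loc>
    <lastmod>2014-10-01</lastmod>
  </url>
</urlset>
```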
Are you still seeing the drop in rankings?
No problem. I looked at Google's documentation for benchmark reporting, and unfortunately they didn't go into how their benchmarking system works.
Your stats are on the left. 178 vs 227 (your total sessions are 178). So the total does add up for your data.
As for the sample data, there are two possibilities, I think:
1. It's all samples and averages. It's grabbing sample data based on your industry category, region, and traffic volume. It's not necessarily using the exact same data set for Total, Social, Direct, Organic, Referral, etc., which is why they don't add up. Google collects a lot more data on overall traffic numbers than it does on, say, email referrals (different sample sizes, different averages).
2. It's not an average at all. The report measures your channel volume against a benchmark, and Google never said it was a benchmark average. They could be determining the benchmark with some other equation.
There are definitely some other tools that could give you benchmark reports, like SimilarWeb and Alexa; I'm sure some Mozzers can list others.
Canonical tags would be the most important thing to look at for your dynamic URLs. As long as each of your dynamic pages has a canonical tag pointing to the static version of the page (i.e. package-search/holidays/hotelFilters/), you won't have to worry about duplicate content.
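For example, every dynamic variation of that page would carry the same tag in its <head> (example.com is a placeholder for your domain):

```
<link rel="canonical" href="http://www.example.com/package-search/holidays/hotelFilters/" />
```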
There are other tools out there that will give you some insights into competitor traffic as well.
SimilarWeb (www.similarweb.com) is a great one that will break down an estimate of traffic sources for a domain. They have quite a good supply of data, and I have found their estimates to be pretty accurate.
Agreed. URL A is now a higher authority page because of the proper canonical, which in turn means a link from URL A could have more value. But the equity from that link to URL B is not directly passed to your site.
It would be nice if there was a setting to turn off the Mozbar for search results. I love using it for diagnosing web pages, but when it tries to load and analyze 100 results on a SERP, it can be rather slow/annoying.
Definitely not the Moz Toolbar
It is possible you could have a Chrome extension that is interfering with the Wistia player.
Since it wasn't included in Alick300's answer, I'll just note that Followed Linking Root Domains is the number of different websites that have equity-passing links pointing to your site (whether it's 1 or 1,000 links).
Variety is good: you want lots of sites sending you links, not just one (a lot of backlinks coming from only a couple of sites isn't a good signal).
Of course, in the link graph world, quality is just as important as quantity. Make sure you are taking a look at the quality of links as well (PA, DA, MozTrust, MozRank, etc.). You want high-quality links from relevant websites if you want consistent growth and to exceed your competitors!
Hope this helps clarify things a bit more!
One more point to add here:
If you are looking at "Total Conversion Value" instead of specifically "Transaction Value", you could have other goals set up with some value tied to them.
For instance, you could have a lead page goal set up with a value of 1. Whenever an AdWords user hits that lead page, the Total Conversion Value will go up by 1.
You can review all the conversions you are tracking in your account in the Tools tab under Conversions.
If you are using Google Tag Manager for ecommerce tracking, you need to set up the "Track Type" as Transaction and not Event.
You will only want to fire it on the confirmation/checkout page, so yes, you will need a firing rule like the one you have there.
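For reference, here's a minimal sketch of the data layer push that the Transaction track type reads on the confirmation page. The field names are GTM's standard transaction variables; the values are placeholders:

```
<script>
  dataLayer.push({
    'transactionId': '1234',           // required: a unique order ID
    'transactionAffiliation': 'My Store',
    'transactionTotal': 38.26,         // required: total order value
    'transactionTax': 1.29,
    'transactionShipping': 5.00,
    'transactionProducts': [{
      'sku': 'DD44',                   // one entry per purchased product
      'name': 'T-Shirt',
      'price': 11.99,
      'quantity': 1
    }]
  });
</script>
```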
For more implementation info, refer to the official Google documentation.
Hope this helps!
Pageviews specifically... no. Popularity... yes. User experience is far more important, though; Google's approach is based on sites giving users a great experience and relevant content.
You do not need to code anything; using Tag Manager lets you skip the coding, so you only have to add the container.
Link tracking was meant for external links, but you can set it up to track some internal links as well.
What you are describing sounds more like "In-Page Analytics" or some sort of heat mapping which is available with other software like Crazy Egg or Clicktale or others.
In the new Google Tag Manager:
Make sure there is a sufficient delay put on clicks; this gives the tag time to fire before the next page loads. Once your tag container is live on the site, it should automatically start tracking clicks.
Hi Dan,
In many Google+ cases, your custom URL will need additional characters added to the end of the pre-selected URL. So they have chosen to add a keyword to the end, but it could have been a city, a number, or anything.
There is no benefit to doing this; there is no 'SEO' value passed to your linked website or your Google+ page. Other factors like categorization, reviews, location, and your linked website's value will help Google+ Local pack listings.
Once you have chosen your custom URL, you cannot edit it. You also can't choose whatever URL you want for your page; we are simply given whatever Google thinks is appropriate.
Hope this helps!
I agree with Monica, Google uses a different crawler for mobile (Googlebot-Mobile), so it likely just hasn't indexed the newer pages yet.
Have you referred to the Google Analytics documentation?
https://developers.google.com/analytics/devguides/collection/analyticsjs/custom-dims-mets
Hey Moz Community!
Looking for some input on a site migration.
When redirecting some old pages that aren't going to be moved over to the new site, do you prefer to redirect to a homepage (or similar page) or to throw up a 404/410 on the new site?
What have you found works best?
display: none;
As a CSS style for your header. The header won't be displayed, but it is still there for the links to be crawled.
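Something like this (the class name is just for illustration):

```
<style>
  /* hidden from display, but the links remain in the HTML for crawlers */
  .header-links { display: none; }
</style>
<div class="header-links">
  <a href="/page-a">Page A</a>
  <a href="/page-b">Page B</a>
</div>
```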
Not sure what you mean by "blacklisted" by Google here.
I feel this would detract from Google's entire PageRank algorithm if they were to base popularity on search query data. It would also be really hard to test, because as the popularity of the brand query grows, so will other factors like links and anchor text; it would be really hard to tell what is affecting what.
A related study was done on the impact of brand mentions: http://moz.com/blog/panda-patent-brand-mentions ...But these are not tied to brand query data alone.
Where you might find some interesting insights is in how universal search is affected by query popularity, i.e. the influence of news results or in-depth articles as interest in a particular query grows.
Data from OSE is from the Mozscape Index. It was last updated on October 9th, 2014. The next update will be ~ November 5, 2014.
Moz generally updates the index every ~30 days.
This is the official help page from Google for this issue: https://support.google.com/analytics/answer/1037063
Use the Tag Assistant extension as well to see if your Analytics code is installed correctly.
Depending on how often (or how little) you update the blog, it could technically be seen as duplicate content. Pagination (rel=next/prev) doesn't apply to those category pages, but you definitely do want them indexed, so don't noindex or rel=canonical them.
Your archive pages aren't that important, but you don't really need to noindex them either. If you end up finding that the archive pages are duplicating some of your regular blog pages or category pages, then throw a noindex,follow on them.
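If you do get to that point, the tag goes in the <head> of each archive page:

```
<meta name="robots" content="noindex,follow" />
```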
Hope this helps!
One thing to note that I forgot to mention is that SquareSpace does use rel=canonical tags on its pages. So while your reporting may be off, link juice will not be split between the slash and no-slash versions of a page. This also takes care of the duplicate content problem, as the tag defines the source content.
This is a problem I have encountered with SquareSpace. SS does not automatically redirect non-trailing-slash pages to the trailing-slash version. This creates duplicate content issues and reporting issues, as example.com/page and example.com/page/ will be recorded as two separate pageviews in your reporting.
If you want to know more about why it's recorded as two separate pages, read this blog from the Official Google Webmaster Central Blog: http://googlewebmastercentral.blogspot.ca/2010/04/to-slash-or-not-to-slash.html
Unfortunately, this is just a problem with SS, and since you can't use htaccess to redirect pages or use the built-in Redirect Tool to do this, you are kind of just stuck until SS improves their system.
Thomas has a lovely answer. I'll just throw in that sites on top-level domains like .edu or .gov tend to have very good trust scores. Links from these sites will earn you more MozTrust.
I believe Moz's servers are in Texas, but in any case that wouldn't be it.
Do a landing page report to see what page/pages this direct traffic is landing on. You can usually get some good insight from there.
OSE (Open Site Explorer) is great, but Moz has another tool many people do not know about called Competitive Link Finder: http://moz.com/labs/link-finder
It does exactly what you are looking for.
This has to be done via cross-domain tracking. Depending on your Analytics code (Classic vs Universal), here are your 2 guides to implementing this:
classic: https://developers.google.com/analytics/devguides/collection/gajs/gaTrackingSite
universal: https://developers.google.com/analytics/devguides/collection/analyticsjs/cross-domain
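For the Universal version, the setup boils down to something like this on both sites (the tracking ID and domain are placeholders; see the guide above for the full details):

```
<script>
  // allowLinker lets the linker parameter pass the client ID across domains
  ga('create', 'UA-XXXXX-Y', 'auto', {'allowLinker': true});
  ga('require', 'linker');
  ga('linker:autoLink', ['destination-site.com']); // the other domain(s)
  ga('send', 'pageview');
</script>
```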
By default in your GA reporting, subdomains and domains are not shown in page dimensions, so you might have a hard time differentiating a page on one subdomain from one on another. The bottom of this page will show you how to add this into your reporting: https://support.google.com/analytics/answer/1034342
The Moz Analytics tool will allow you to connect your Facebook page so you can view Facebook Like growth on your account.
OSE does not show the number of Facebook Likes your account has, but rather the number of Likes received by posts that link to www.transformationstreatment.com (or any specific page).
Same goes for Facebook Shares.
Like Kingof5 said, OSE doesn't do as large a crawl as many other tools out there. They pride themselves on accuracy, not volume.
It looks like a large site, probably with many subdomains. OSE likely just hasn't crawled that page, or hasn't in a long time. You might want to take a look at the sitemap and see whether those pages/subdomains are contained in it.
I just did a test with the on-page grader, and the re-grade button did work for me. I changed one element of the page, and the change did appear in the result after re-grading.
It could be accessing a cached version of the page. I'm assuming your changes went live through your server, right?
It looks like some of the light greens didn't capture on your screenshot. Here's what it looks like...
The green (and the check mark) shows the highest number for the given metric among the SERP results; i.e., it looks like result 2 has the highest total links.
The grey (and other green shades) are there to show intensity as a comparison to the other results: the more grey, the higher the value compared to the other results.
It looks like the intensity scale goes from white > light grey > dark grey > light green > dark green (check mark)
It could be in an external JavaScript file. Have you checked the ones referenced on your site?
What Ray said is good for the content side of optimization (which is important), but links are important too. If you want to start ranking higher, you are going to need to start building links to your pages.
Moz has a ton of great resources in their Learn section or in the Blog to help.
Start here: http://moz.com/beginners-guide-to-link-building
Then fire away: http://moz.com/pages/search_results?q=link building
As long as it is in the <head> section of your page. Crawlers look for tagging information in this section first, so it may be missed if it is anywhere else.
If you are concerned about the amount of code in the <head> section, you could move all that JavaScript into an external JS file and reference it.
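Something like this (the file name is just an example):

```
<head>
  <!-- tagging information crawlers look for stays up top -->
  <meta name="description" content="..." />
  <!-- the JavaScript moved out to an external file and referenced instead -->
  <script src="/js/site.js"></script>
</head>
```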
I assume you're using OSE? It looks like rogerbot hasn't crawled the page, which is why it doesn't show PA or links.
Your 404 page looks an awful lot like a regular content page; it is hard to immediately tell that the page doesn't exist. First, check the source of the traffic to "/site/kempruge" and see if it is a bad link (internal or external) that can be fixed.
Also, that page could exist on a subdomain of your site. Do you have any subdomains? Because GA doesn't automatically display the domain name in page URLs, you can't distinguish which subdomain it is coming from under a normal property view (e.g. www.kempruge.com vs blog.kempruge.com).
You can use GWT to check for 404s in your crawl errors; that will help show you some URLs pointing to the page. Or you can use a tool like Screaming Frog SEO Spider to analyze your internal links and see which internal pages are linking to that 404 URL.
Hope this helps
You say the goal is a destination URL - is your content experiment for landing pages? Perhaps the 3 missing goals came from a landing page outside your content experiment and still hit the goal you set up (recording a goal conversion, but not a conversion in your experiment).
Unless I'm misunderstanding your setup, it would be entirely possible to record a goal outside of your experiment. You may want to use a different goal or set up a funneled goal (where users have to follow a path for it to be recorded as a goal conversion).
White-hat SEO (which Moz is completely about) incorporates good general marketing practices (like mobile experience, design, CRO...). Moz Local is a tool that will get and maintain your most important online citations for your business. These citations hold value for your business's local search results.
If you want to learn more about local SEO, check out some of these Moz resources:
The '40-50' is the actual number of links on that specific page (internal and external). The '1150' is the total number of internal links on the website (up to 3000 linked pages).
Can you add the link to the page? Is there anything on the page that could inhibit Moz from crawling (meta robots, meta refresh redirect)? Or anything in the site's robots.txt file that could block Rogerbot?
It is also possible that Moz just hasn't crawled that page, therefore the Mozscape index doesn't contain information on that page in OSE.
Remember that OSE doesn't actively crawl your site when you run a report, it just recalls information from the latest Mozscape index.
How many variations have you set up in your experiment and how many conversions are associated with both the original page and the variation pages?
Again, I would say they both matter (follow and nofollow), but followed links are going to have a lot more value. It is interesting to look at the distribution of anchor text between followed and nofollowed links. If you want to rank high for a competitive keyword, you're going to need some keyword anchor text from followed links - but not an unnatural amount!