Webmaster Tools HTML Improvements Page Blank / Site Not Ranking Well
-
I have an ecommerce site that is not ranking well currently. It has about 1,000 pages indexed in Google but very few appear to be ranking.
I normally find issues in Webmaster Tools HTML Improvements, but for some reason it doesn't see any problems with the site. There are problems, trust me. Moz shows many issues. Google, nothing!
There is a problem somewhere but I am not seeing it. Why are HTML Improvements blank and the site not ranking? Am I in the dreaded sandbox?
Any ideas?
Sean
We didn't detect any content issues with your site. As we crawl your site, we check it to detect any potential issues with content on your pages, including duplicate, missing, or problematic title tags or meta descriptions. These issues won't prevent your site from appearing in Google search results, but paying attention to them can provide Google with more information and even help drive traffic to your site. For example, title and meta description text can appear in search results, and useful, descriptive text is more likely to be clicked on by users.
-
Hi Sean
In my experience there are always many differences between the crawl reports from Moz and Webmaster Tools. Bear in mind that they are all just computer programs running automatically to detect "issues", and that a human eye is the best tool at the end of the day.
To step back a little, your overall ranking will likely have very little to do with these sorts of things (titles, descriptions, etc.). They can have a small effect, but they will usually not move the needle much. So keep that in perspective when fixing things.
In general, I would aim to make your titles and descriptions unique, within length guidelines, compelling for users, and so on. Google has a great guide here: https://support.google.com/webmasters/answer/35624?hl=en
There can also sometimes be a delay in what Google shows you in WMT. They might not have crawled everything as recently as Moz. I tend to find Moz will check everything, whereas WMT will only alert you to the pages it finds problematic. (For example, you might have pages with duplicate titles, but Google has determined those pages are not important - a quality check, if you will - whereas Moz doesn't make this qualitative check; it just crawls and rates everything.)
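Neither tool publishes its exact rules, but the core check they both run - duplicate, missing, or over-length titles and meta descriptions - is simple enough to sketch yourself. Everything below is illustrative (the page URLs are made up, and the 60/160 character limits are common rules of thumb, not documented Google thresholds):

```python
# Rough sketch of a duplicate/missing title and meta description audit,
# the kind of content check WMT's HTML Improvements and Moz both run.
# Limits and sample pages are illustrative assumptions, not either tool's logic.
from collections import defaultdict
from html.parser import HTMLParser

class HeadParser(HTMLParser):
    """Pulls the <title> text and meta description out of an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_pages(pages, max_title=60, max_description=160):
    """pages: {url: html}. Returns a list of (url, issue) tuples."""
    issues = []
    titles = defaultdict(list)
    for url, html in pages.items():
        parser = HeadParser()
        parser.feed(html)
        title = parser.title.strip()
        if not title:
            issues.append((url, "missing title"))
        elif len(title) > max_title:
            issues.append((url, "long title"))
        else:
            titles[title].append(url)  # collect for the duplicate check
        if not parser.description:
            issues.append((url, "missing description"))
        elif len(parser.description) > max_description:
            issues.append((url, "long description"))
    for title, urls in titles.items():
        if len(urls) > 1:
            issues.extend((url, "duplicate title") for url in urls)
    return issues

# Two hypothetical pages sharing one title:
pages = {
    "/small-dog": "<html><head><title>Widget</title></head></html>",
    "/large-dog": '<html><head><title>Widget</title>'
                  '<meta name="description" content="Widgets for large dogs.">'
                  "</head></html>",
}
for url, issue in audit_pages(pages):
    print(url, "->", issue)
```

Run over a real crawl, a report like this will usually flag far more than WMT does - which is exactly the Moz-versus-WMT gap described above: this sketch, like Moz, rates everything and makes no judgement about which pages matter.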
In order of priority I would fix WMT issues first and then move to Moz issues. But again, your overall ranking is likely more due to site authority & trust as measured by links and usage.
Related Questions
-
How to rank a page on established site quickly
Hi, I'm looking for information about how I can rank an e-commerce category page quickly from a link building perspective. It usually takes me 6-12 months to rank these pages within the top 3 spots with link building, but I would like to get results faster. My site has been established for more than 10 years and performs well in Google organic search. Here is what usually works over a 6-12 month time span:
- 15-40 links within articles on DA 15-60 sites, built within 6-12 months
- More than 75% of the links are from blogs
- Variety of anchor text
- Combination of follow/nofollow
- Deep links to product pages within the category we're trying to rank
It might be important to note that it was easy for us to get category pages listed in DMOZ categories when it was still around, but it didn't seem to play any role in getting ranked faster. Note: we only build links on real sites with real traffic and decent performance metrics. No PBNs or other crap sites. I'd sincerely appreciate it if anyone can make any suggestions or point me towards helpful info. Thanks!
Intermediate & Advanced SEO | Choice
Prioritise a page in Google/why is a well-optimised page not ranking
Hello, I'm new to Moz Forums and was wondering if anyone out there could help with a query. My client has an ecommerce site selling a range of pet products, most of which have multiple items in the range for different size animals, i.e.:
[Product name] for small dog
[Product name] for medium dog
[Product name] for large dog
[Product name] for extra large dog
I've got some really great rankings (top 3) for many keyword searches such as '[product name] for dogs' and '[product name]'. But these rankings are for individual product pages, meaning the user is taken to a small dog product page when they might have a large dog, or vice versa. I felt it would be better for users (and for conversions and bounce rates) if there was a group page showing all products in the range, which I could target with the keywords '[product name]' and '[product name] for dogs'. The page would link through to the individual product pages. I created some group pages in autumn last year to trial this and, although they are well optimised (score of 98 on Moz's optimisation tool), they are not ranking well. They are indexed, but way down the SERPs. The same group page format has been used for the PPC campaign and the difference in the retention/conversion of visitors is significant. Why are my group pages not ranking? Is it because my client's site already has good rankings for the target term and Google does not want to show another page of the site and muddy the results? Is there a way to prioritise the group page in Google's eyes, or bring it to Google's attention? Any suggestions/advice welcome. Thanks in advance,
Laura
Intermediate & Advanced SEO | LauraSorrelle
How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
I'm curious how Google evaluates pagespeed for SPAs. Initial payloads are inherently large (resulting in 5+ second load times), but subsequent requests are lightning fast, as these requests are handled by JS fetching data from the backend. Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and "slow"-ish load time) for each? Or do they load the initial JS+HTML and then continue to crawl from there? Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time? Any insight (or speculation) would be much appreciated.
Intermediate & Advanced SEO | mothner
Page A Best for Users, but B Ranks
This is real estate MLS listings related. I have a page "B" with lots of unique content (MLS thumbnails mixed with guide overview writing, pictures, etc.) which outranks "A", a page simply showing MLS thumbnails with a map feature included. I am linking from "B" to "A" with the anchor "KEYWORD for sale" to indicate to search engines that "A" is the page I want to rank, even though "B" has more unique content. It hasn't worked so far. Questions:
1. Should I avoid linking from "B" to "A", as that could impact how well "B" ranks?
2. Should I leave this setup and hope that, over time, search engines will give "A" a chance to rank?
3. Should I include some unique content on "A", mostly not viewable without clicking a "Read more" link? I don't foresee many users clicking "Read more", as they are really just looking for the properties for sale and rarely care about written material when searching for "KEYWORD for sale".
4. Should I "noindex, follow" A, since it has little to no unique content? Could this enhance the chance of "B" ranking better?
5. When I write blog posts that include "KEYWORD for sale", should I link to "A" (best for users) or to "B", since that page has more potential to rank really well and is still fairly good for users?
Ranking "B" is not creating a large bounce rate; it's just that "A" is even better. Thank you,
Kristian
Intermediate & Advanced SEO | khi5
Why do pages with a 404 error drop out of webmaster tools only to reappear again?
I have noticed that a lot of pages which had been 404'ing, and had dropped out of the Webmaster Tools crawl error log, are reappearing. Any suggestions as to why this might be the case? How can I make sure they don't reappear again?
Intermediate & Advanced SEO | Towelsrus
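On the 404s above: Google periodically re-crawls URLs it has seen before, so entries that still return 404 can resurface in the report, whereas a 410 ("gone") is a stronger hint to drop them. A rough sketch of re-checking the reported URLs yourself (the example URL is hypothetical, and the triage rules are common practice rather than anything Google documents):

```python
# Hedged sketch: re-check URLs from the WMT crawl-errors report and
# suggest an action for each based on the status code they return now.
import urllib.error
import urllib.request

def fetch_status(url, timeout=10):
    """Return the HTTP status code for a URL (redirects are followed)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def triage(status):
    """Map a status code to a suggested action for a reported crawl error."""
    if status == 404:
        # Known 404s get re-crawled periodically, which is why they
        # can reappear in the report.
        return "still 404 - consider returning 410 if permanently gone"
    if status == 410:
        return "410 - Google should drop it faster"
    if status == 200 or 300 <= status < 400:
        return "resolves now - error entry should age out"
    return f"unexpected status {status}"

# For a live check you would call triage(fetch_status(url)) on each
# reported URL; here we just demo the triage logic:
print(triage(404))
print(triage(200))
```

This won't stop Google re-listing genuinely dead URLs, but it tells you which entries are stale and which pages to consider switching to 410.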
Our site is receiving traffic for both .com/page and .com/page/ with the trailing slash.
Our site is receiving traffic for both .com/page and .com/page/ with the trailing slash. Should we rewrite to the version with the trailing slash or the version without, because of the duplicates? The other question: if we do a rewrite, Google has indexed some pages with the slash and some without. I am assuming we will lose rank for one of them once we do the rewrite, correct?
Intermediate & Advanced SEO | Profero
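On the trailing-slash question above: either form works as long as exactly one is served and the other 301s to it (plus a matching rel=canonical). A minimal sketch of the decision logic, assuming the no-slash form is chosen as canonical (that choice is arbitrary; the rewrite itself would live in your server config):

```python
# Minimal sketch of trailing-slash canonicalization: pick one form
# (here: no trailing slash, except the root) and compute the 301
# target for the other form. "No slash" is an illustrative choice.
from urllib.parse import urlsplit, urlunsplit

def canonical(url):
    """Return the canonical version of a URL, trailing slash stripped."""
    parts = urlsplit(url)
    path = parts.path
    if len(path) > 1 and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit((parts.scheme, parts.netloc, path,
                       parts.query, parts.fragment))

def redirect_for(url):
    """Return (status, location) if the URL needs a 301, else None."""
    target = canonical(url)
    return (301, target) if target != url else None

print(redirect_for("https://example.com/page/"))  # (301, 'https://example.com/page')
print(redirect_for("https://example.com/page"))   # None
```

Because the non-canonical form 301s rather than 404s, the ranking signals for the slashed URLs should consolidate onto the canonical versions instead of being lost, which addresses the "will we lose rank" worry.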
Keeping the Navigation on the Sitemap HTML Page?
Hey everyone. We are about to create a sitemap.html page, and we have always just kept the site theme in place and put the sitemap in the "content" section of the page, with the header navigation, sidebars, and footer in place. Well, now with the new "only first link counts" Google rule, wouldn't it be better to just have a "plain" HTML sitemap page without any other links on it?
Intermediate & Advanced SEO | JamesO
Pages un-indexed in my site
My current website www.energyacuity.com has had most pages indexed for more than a year. However, I tried to check the cache of a few of the pages, and it looks like the only one now indexed by Google is the homepage. Any thoughts on why this is happening?
Intermediate & Advanced SEO | abernatj