Sitemap issues 19 warnings
-
Hi Guys
I seem to be having a lot of sitemap issues.
1. I have 3 top-level domains, and all of them show the co.nz sitemap that was submitted
2. I'm in the midst of a site re-design so I'm unsure if I should be updating these now or when the new site goes live (in two weeks)
3. I have 19 warnings from GWT for the co.nz site. They gave me 3 examples that look like 404 errors, but I'm not too sure; I'm a bit green at finding where the issues are and how to fix them. (It also shows 95 pages submitted and only 53 indexed.)
4. I recently generated 2 sitemaps for .com and .com.au and submitted both to Google, but when I check I still see the co.nz sitemap
Would love some guidance around this.
Thanks
-
Glad it was useful!
-
Oh, you are a genius yourself, Bob. Thanks for the great information!
I will look into this and let you know how I go. Thanks a bunch, you have really helped me move this along and weed out all the confusion!
-
Hi Justin,
In that case I would ask your developer to make the sitemap on the website update automatically (or generate a new one every day), and submit that link to Webmaster Tools. If he's a real genius he could add your blog pages from Wordpress to this sitemap as well, but I'm not sure if Wordpress has a hook for this.
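To give a rough idea of what "generate a new one every day" can mean in practice: a small script run from cron that rebuilds sitemap.xml from the list of live pages. This is only a minimal sketch; the `pages` list is a placeholder, and in a real setup the URLs would come from the site's database or CMS.

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls, changefreq="daily"):
    """Build a sitemap.xml string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs; in practice these would be queried from the CMS/database.
pages = ["https://www.example.co.nz/", "https://www.example.co.nz/about"]
xml = build_sitemap(pages)
```

A cron job could then write this string to the web root every night, so the submitted sitemap URL always serves fresh content.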
Alternative options:
- Let him build the automatically updated sitemap for the custom part of the website and use it alongside the sitemap from the Yoast plugin. You can upload both separately in Google Webmaster Tools; just make sure each has its own URL. In this case it's all automated and just as good as the previous method.
- Keep updating your sitemap manually. Just make sure you don't use the Yoast sitemap and also include the blog posts in your Screaming Frog sitemap, since that would give duplicate entries. If you choose to refresh your sitemap manually, I would disable the sitemap within the Yoast plugin and use the Screaming Frog sitemap, which should include your blog pages as well.
Good luck and let me know if this works for you!
-
Thanks a lot Dirk, your help has been tremendous to my SEO efforts!!!
-
Hi Bob
Thanks a lot for your response!
That makes a lot of sense. We use Wordpress only for the blog, but the main site is custom built and doesn't have the Yoast plugin.
So I'm not sure how that will work. When I create the sitemap with Screaming Frog, do I need to include the blog pages if I'm also using the Yoast plugin?
Thanks again for your help!
-
Yep - you'll have to upload the file to the server first.
Bob's suggestion to generate the sitemap via the Yoast plugin is an excellent idea.
rgds
Dirk
-
Hi Justin,
Thanks for the screenshots. Dirk's suggestion about Screaming Frog should be really helpful: it will give you insight into the true 404 errors a bot can encounter while crawling your internal site structure.
Based on what I see, I think your main problem is the manually updated sitemap. Whenever you change a page, add a new one, or rearrange categories, those changes don't carry over to your sitemap. The stale sitemap entries then produce 404 errors, even though those pages aren't linked from your website and (without the sitemap) wouldn't trigger any 404 messages in Google Webmaster Tools.
I saw you are already using SEO by Yoast, so I suggest using its sitemap functionality. That should resolve the problem and save you work in the future, since there is no need to manually update your sitemap again.
Let me know if this works!
-
Hi Justin,
Could you post a screenshot of the error message and any links pointing to this URL? That way we can identify which pages return a 404. If these are important pages on your website, I would fix it right now; if, however, they are pages you don't use or your visitors rarely see, I would pick this up with the redesign. There's no point fixing this now if things will change in the near future. Besides that, sitemaps help you get your website indexed; releasing this two weeks earlier won't make a big difference in the number of indexed pages, since you won't change your internal link structure or website authority (both of which help you get more pages indexed).
About your last point, could you provide me with a screenshot of this as well? When I check zenory.com/sitemap.xml I find the .com sitemap, so that part seems fine.
PS. I would suggest you change the update frequency in your sitemap. It now states monthly; it's probably a good idea to set this much higher, since there is a blog on your website as well. At the moment you are giving Google hints to crawl your website only a few times a month. Keep in mind that you can give different parts of your website a different change frequency. For example, I give pages with user-generated content a much higher change frequency than pages we need to update manually.
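For illustration, `changefreq` is set per `<url>` entry, so different sections of the site can carry different hints in the same sitemap (the domain and paths below are just examples):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Blog index: new posts appear often -->
  <url>
    <loc>https://www.example.co.nz/blog/</loc>
    <changefreq>daily</changefreq>
  </url>
  <!-- Static service page: rarely edited -->
  <url>
    <loc>https://www.example.co.nz/about/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```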
-
Hi Justin,
Are the URLs going to change when you update the design? If they are not changing, you can update now.
It's not really abnormal to have only a certain percentage of the sitemap indexed; Google may judge that some pages are too light in content to be worth indexing. That said, 55% of URLs indexed seems rather low.
Sitemap errors: check the URLs that are listed as errors. If I'm not mistaken, you use an external tool to generate the sitemaps. It could be that this tool puts all the internal links in the sitemap regardless of their status (200, 301, 404); normally only URLs with status 200 should be in the sitemap. Check the configuration of the tool you use and see if you can add only URLs with status 200. Alternatively, check the internal linking on your site and make sure no links point to 404 pages (Screaming Frog is the tool to use; it can also generate the sitemap).
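The "only status 200" rule can be sketched as a filter step before the sitemap is written. A minimal sketch, assuming you have crawl results as (URL, status) pairs, e.g. exported from a crawler; the URLs and the function name here are illustrative:

```python
def keep_indexable(crawled):
    """Keep only URLs that returned HTTP 200; drop redirects and errors."""
    return [url for url, status in crawled if status == 200]

# Hypothetical crawl export: (url, http_status) pairs.
crawl_results = [
    ("https://www.example.co.nz/", 200),
    ("https://www.example.co.nz/old-page", 404),  # must not enter the sitemap
    ("https://www.example.co.nz/moved", 301),     # must not enter the sitemap
]
sitemap_urls = keep_indexable(crawl_results)
```

Only the surviving URLs would then be fed into the sitemap generator.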
For the wrong sitemap: as your sites are exact duplicates, probably hosted on the same server, it could be that the .co.nz sitemap overwrites the .com sitemap, since they have the same name. You could rename your sitemaps, e.g. sitemap_au.xml, sitemap_us.xml and sitemap_nz.xml. This way, if you add a new sitemap for .nz, it will not overwrite the .com version. Submit these to Google and delete the old ones (both on the server and in Google WMT).
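As an illustration of the renaming idea, each domain's robots.txt can also point to its own uniquely named sitemap file, so nothing gets overwritten or mixed up (the domain names below are examples, not the actual sites):

```text
# robots.txt on the .co.nz site
Sitemap: https://www.example.co.nz/sitemap_nz.xml

# robots.txt on the .com site
Sitemap: https://www.example.com/sitemap_us.xml

# robots.txt on the .com.au site
Sitemap: https://www.example.com.au/sitemap_au.xml
```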
Hope this helps.
Dirk
PS. If your redesign also changes the URLs, don't forget to put redirects in place from the old URLs to the new ones.