Sitemap issues: 19 warnings
-
Hi Guys
I seem to be having a lot of sitemap issues.
1. I have 3 top-level domains, and all of them are using the .co.nz sitemap that was submitted.
2. I'm in the midst of a site redesign, so I'm unsure whether I should update these now or wait until the new site goes live (in two weeks).
3. I have 19 warnings from GWT for the .co.nz site, and it gave me 3 examples that look like 404 errors. However, I'm not too sure, and I'm a bit green when it comes to finding where the issues are and how to fix them. (It also shows that 95 pages were submitted and only 53 were indexed.)
4. I recently generated 2 sitemaps for the .com and .com.au sites and submitted them both to Google, but when I check I still see the .co.nz sitemap.
Would love some guidance around this.
Thanks
-
Glad it was useful!
-
Oh, you are a genius yourself, Bob. Thanks for the great information!
I will look into this and let you know how I go. Thanks a bunch; you have really helped me move this along and weed out all the confusion!
-
Hi Justin,
In that case I would ask your developer to make the sitemap on the website update automatically (or generate a new one every day), and submit that URL to Webmaster Tools. If he's a real genius, he could add your blog pages from WordPress to this sitemap as well, but I'm not sure if WordPress has a hook for this.
Alternative options:
- Let him make the automatically updated sitemap for the custom part of the website and use it combined with the sitemap from the Yoast plugin. You can submit both separately in Google Webmaster Tools; just make sure each has its own URL. In this case it's all automated and is just as good as the previous method (a rough sketch of such a generator is below the list).
- Keep updating your sitemap manually. Just make sure you don't use the Yoast sitemap and also include the blog posts in your Screaming Frog sitemap, since that would give Google the same URLs twice. If you choose to refresh your sitemap manually, I would disable the sitemap within the Yoast plugin and use the Screaming Frog sitemap, which should include your blog pages as well.
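To make the automated option more concrete, here is a minimal sketch of the kind of script a developer could run on a daily cron job to regenerate the sitemap for the custom part of the site. The page list, domain and output path are placeholders, not your actual setup - in practice the URLs would come straight from the site's own database or CMS.

```python
# Minimal sketch of a daily sitemap generator for the custom part of a site.
# The page list and output filename are placeholders; in a real setup the
# URLs would be pulled from the site's database or CMS.
from datetime import date
from xml.etree import ElementTree as ET

pages = [
    "https://www.example.co.nz/",
    "https://www.example.co.nz/services/",
    "https://www.example.co.nz/contact/",
]

root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(root, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

# Write to the path whose URL is submitted in Google Webmaster Tools,
# e.g. https://www.example.co.nz/sitemap-custom.xml
ET.ElementTree(root).write("sitemap-custom.xml", encoding="utf-8",
                           xml_declaration=True)
```

Run that once a day (cron, scheduled task, whatever the server offers) and the submitted sitemap URL will always reflect the current pages.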
Good luck and let me know if this works for you!
-
Thanks a lot Dirk, your help has been tremendous to my SEO efforts!!!
-
Hi Bob
Thanks a lot for your response!
That makes a lot of sense. We use WordPress only for the blog, but the main site is custom built and doesn't have a Yoast plugin.
So I'm not sure how that will work. When I create the sitemap with Screaming Frog, do I need to include the blog pages in it if I'm using the Yoast plugin?
Thanks again for your help!
-
Yep - you'll have to upload the file to the server first.
Bob's suggestion to generate the sitemap via the Yoast plugin is an excellent idea.
rgds
Dirk
-
Hi Justin,
Thanks for the screenshots. Dirk's suggestion about Screaming Frog should be really helpful. It should give you insight into the true 404 errors that a bot can encounter while crawling your internal site structure.
Based on what I see, I think your main problem is the manually updated sitemap. Whenever you change a page, add a new one or mix up some categories, those changes won't be reflected in your sitemap. This creates 404 errors even though those pages aren't linked to from your website and, without the sitemap, wouldn't trigger any 404 error messages in Google Webmaster Tools.
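If you want to confirm that this is what's happening, a quick check is to fetch the submitted sitemap and request every URL in it, flagging anything that no longer returns a 200. A rough sketch - it assumes the third-party requests library, and the sitemap URL below is a placeholder (use whichever one you submitted in GWT):

```python
# Rough sketch: fetch the submitted sitemap and report every URL in it
# that no longer returns a 200. Requires the third-party "requests" library;
# the sitemap URL below is a placeholder - use the one submitted in GWT.
import requests
from xml.etree import ElementTree as ET

SITEMAP_URL = "https://www.example.co.nz/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
for loc in sitemap.findall(".//sm:loc", NS):
    status = requests.head(loc.text, allow_redirects=False, timeout=30).status_code
    if status != 200:
        print(status, loc.text)
```

Anything it prints with a 404 is a URL that is still in the sitemap but no longer exists on the site.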
I saw you were already using SEO by Yoast, so I suggest using its sitemap functionality. That should resolve the problem and save you work in the future, since there will be no need to manually update your sitemap again.
Let me know if this works!
-
Hi Justin,
Could you post a screenshot of the error messages and any links pointing to those URLs? That way we can identify which pages return a 404. If these are important pages on your website, I would fix them right now; if, however, they are pages you don't use or your visitors rarely see, I would make sure you pick this up with the redesign. There's no point in fixing this now if things will change in the near future. Besides that, sitemaps help you get your website indexed; releasing this two weeks earlier won't make a big difference to the number of indexed pages, since you won't be changing your internal link structure or website authority (both of which help you get more pages indexed).
About your last point, could you provide me with a screenshot of this as well? When I check zenory.com/sitemap.xml I find the .com sitemap, so that part seems fine.
PS. I would suggest you change the update frequency in your sitemap. It currently states monthly; it's probably a good idea to set this much higher, since there is a blog on your website as well. At the moment you are giving Google hints to only crawl your website a few times a month. Keep in mind that you can give different parts of your website a different change frequency. For example, I give pages with user-generated content a much higher change frequency than pages we need to update manually.
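As a small illustration of that last point, you could map sections of the site to different changefreq values and have whatever generates your sitemap write them per URL. The path prefixes and frequencies below are made-up placeholders, not your site's real structure:

```python
# Sketch: choose a <changefreq> hint per section of the site.
# Path prefixes and frequencies are placeholders, not the site's real structure.
def change_frequency(path: str) -> str:
    if path.startswith("/blog/"):
        return "daily"    # blog content changes often
    if path.startswith("/reviews/"):
        return "weekly"   # user-generated content
    return "monthly"      # pages that are only updated manually

print(change_frequency("/blog/my-latest-post/"))  # -> daily
```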
-
Hi Justin,
Are the URLs going to change when you update the design? If they are not changing, you can already update now.
It's not really abnormal to have only a certain percentage of the sitemap indexed - it could be that Google judges a certain number of pages to be too light in content to be indexed. That said, 55% of URLs indexed seems rather low.
Sitemap errors - check the URLs that are listed as errors. If I am not mistaken, you use an external tool to generate the sitemaps. It could be that this tool puts all the internal links in the sitemap regardless of their status (200, 301, 404) - normally only URLs with status 200 should be put in the sitemap. Check the configuration of the tool you use & see if you can only add URLs with status 200. Alternatively, you can check the internal linking on your site & make sure that no links exist to 404 pages (Screaming Frog is the tool to use - it can also generate the sitemap).
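If the tool can't be configured that way, one workaround is to filter the crawl export down to status-200 URLs before the sitemap gets built. A rough sketch, assuming a CSV export (e.g. from Screaming Frog) with "Address" and "Status Code" columns - check the exact column names in your own export:

```python
# Sketch: keep only status-200 URLs from a crawl export before they go into
# the sitemap. Assumes a CSV with "Address" and "Status Code" columns
# (e.g. a Screaming Frog export) - adjust names/paths to your actual file.
import csv

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    live_urls = [row["Address"]
                 for row in csv.DictReader(f)
                 if row["Status Code"] == "200"]

for url in live_urls:
    print(url)  # feed these into whatever generates sitemap.xml
```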
For the wrong sitemap - as your sites are exact duplicates, probably hosted on the same server, it could be that the .co.nz sitemap overwrites the .com sitemap, as they have the same name. You could rename your sitemaps to something like sitemap_au.xml, sitemap_us.xml & sitemap_nz.xml. This way, if you add a new sitemap for .nz, it will not overwrite the .com version. Submit these to Google & delete the old ones (both on the server & in Google WMT).
Hope this helps.
Dirk
PS. If your redesign is also changing the URLs - don't forget to put 301 redirects in place that lead the old URLs to the new ones.
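A common way to keep track of those is a simple old-to-new mapping that the redirect rules get generated from. The sketch below writes Apache-style rules and uses made-up paths - if the site runs on nginx or handles redirects in the application, the output format would differ:

```python
# Sketch: build 301 redirect rules from an old-URL -> new-URL mapping.
# The paths are made up; output is Apache (mod_alias) syntax, so adapt it
# if the site runs on nginx or handles redirects at the application level.
redirects = {
    "/old-services.html": "/services/",
    "/reading-types.php": "/readings/",
}

with open("redirects.conf", "w", encoding="utf-8") as f:
    for old_path, new_path in redirects.items():
        f.write(f"Redirect 301 {old_path} {new_path}\n")
```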