How do I fix my sitemap?
-
I have no idea how this happened, but our sitemap used to be at http://www.kempruge.com/sitemap.xml; now it's http://www.kempruge.com/category/news/feed/ and Google won't index it. It 404s. Obviously I must have done something wrong, but I don't know what, and more importantly, I don't know where to find it in the WordPress backend to change it. I tried a 301 redirect, but Google Webmaster Tools (GWT) still reports a 404. Any ideas? It's been like this for a few weeks and I've just neglected it, so I can't reset the site without losing a lot of work.
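For what it's worth, the 301 I tried was along these lines (Apache .htaccess, recalled from memory, so treat it as a rough sketch rather than exactly what's on the server):

# .htaccess at the site root (mod_alias): point the old sitemap URL at the new location
Redirect 301 /sitemap.xml http://www.kempruge.com/category/news/feed/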
Thanks,
Ruben
-
Hey Paul,
I'm attaching a screenshot of the "Location of your sitemap file." I think these settings - which I didn't change - are correct. But, if I need to change something here, please let me know.
Thanks for the help! I really have no idea what I did to cause this problem, but I guess as long as it gets fixed, that's all that matters.
Best,
Ruben
-
Before you dump the plugin, Ruben, check the plugin's settings to see whether the location has somehow been changed inadvertently.
Under the Settings link in your WP Dashboard sidebar, there should be an entry for XML Sitemap (the plugin's settings).
On the settings page, there should be a spot to define where the sitemap should be located. If there is, and that location has somehow defaulted to the category feed, you should be able to change the location designation back to the correct /sitemap.xml location. (In the XML Sitemap Generator plugin I suspect you're using, the settings are in the fifth section down, titled "Location of your sitemap file.")
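Once it's pointing at /sitemap.xml again, it's worth confirming the URL actually resolves before resubmitting it in GWT. A quick check from a terminal (curl here, but any HTTP client will do):

# Expect "HTTP/1.1 200 OK" on the first line of the response
curl -I http://www.kempruge.com/sitemap.xml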
Does that work?
Paul
-
It's a plugin, but it's not Yoast. We use All in One SEO, and then a separate plugin for the sitemap. However, that does help me: I could always just uninstall the sitemap plugin and redo it. That might work. Thank you all for helping me flush that out. I do appreciate it.
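If I go the reinstall route, I'll probably do it with WP-CLI rather than clicking through the dashboard. A rough sketch, assuming the plugin's slug is google-sitemap-generator (I'd need to double-check that against the plugin directory):

# Deactivate and remove the sitemap plugin, then reinstall it fresh
wp plugin deactivate google-sitemap-generator
wp plugin delete google-sitemap-generator
wp plugin install google-sitemap-generator --activate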
Best,
Ruben
-
The best way to deal with sitemaps on WordPress is simply to use a plugin. I can recommend the Yoast SEO plugin, which has a built-in sitemap feature, or the "Google XML Sitemaps" plugin, which I've been using more recently. Both work fine, and both update automatically, so there's no need to update the sitemap manually. Yoast creates the sitemap at a link like yourwebsite.com/sitemap_index.xml, and the other plugin creates one like yourwebsite.com/sitemap.xml. Once your sitemap is created, both plugins show a link to it that you can copy and submit to GWT.
I don't really know how your sitemap got messed up, but I hope the info above makes your life a little easier down the road. If you decide to use one of these plugins, you won't have to worry about this problem again.
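One extra safeguard, whichever plugin you pick: reference the sitemap in your robots.txt so crawlers can find it no matter where it lives. The Sitemap directive is part of the standard robots.txt format; the URL below is just a placeholder for your own sitemap location:

# robots.txt at the site root
Sitemap: http://yourwebsite.com/sitemap.xml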
Good luck.
-
It looks like your sitemap.xml has been moved from that location or deleted. Did you create it manually, or are you using a plugin to create the sitemap.xml? If so, which one?
-
Hi Guys,
I personally believe in updating a sitemap often, so that any changes made to your website are reflected in your Google sitemap, which keeps it up to date, and you will see Google's spiders hitting your site for the new data.
Use a sitemap generator like Screaming Frog, which can be downloaded for free here: http://www.screamingfrog.co.uk/seo-spider/
Generate your new sitemap.xml file (something you could do on a regular basis), then upload it to a location of your choosing. Then list your new sitemap.xml file in Google Webmaster Tools, and you're off and running again. A very simple procedure.
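For reference, the file a generator like this produces follows the standard sitemaps.org protocol. A minimal sitemap.xml looks roughly like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<!-- One <url> entry per page; <lastmod> is optional but useful -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-06-20</lastmod>
  </url>
</urlset>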