Error reports showing pages that don't exist on website
-
I have a website that is showing lots of errors (pages that cannot be found) in Google Webmaster Tools. I went through the errors and redirected the pages I could. There are a bunch of remaining pages that are not really pages, which is why they are showing errors. What's strange is that some of the URLs are showing feeds that were never created. I went into Google Webmaster Tools and looked at the Remove URL tool. I am using it, but I am confused about whether I should be selecting the "remove page from search results and cache" option or the "remove directory" option. The directory one confuses me, and I don't want to accidentally remove core pages of the site from the search engines.
Can anybody shed some light on this or recommend which I should be selecting?
Thank you
Wendy
-
I would avoid using the "Remove URL" option in GWT. 301s are better in my opinion because, let's say I have that old URL posted on my website somewhere, and right now it leads to a 404 page. When you redirect it, people will be taken to a live page instead, and you don't have to worry about asking me to update the old URL on my site. The link will still work, it will take visitors to an active page, and it can still send you some traffic. The "Remove URL" option won't give you any of these benefits.
Here's a helpful link straight from the source on when NOT to use the Remove URL option: https://support.google.com/webmasters/answer/1269119?hl=en
-
OK, this sounds good. Since it's not exactly duplicate content, I agree it would be better to do the redirect. I downloaded a redirect plugin yesterday that worked pretty well. I noticed that some pages have already redirected visitors (the tool has an area where you can see this).
Final question for you: what are your thoughts on that "Remove URL" option in Google Webmaster Tools? I wasn't sure if that would be better than a 301 redirect for these remaining senseless errors.
Just curious on your thoughts.
Thank you
-
Rel=canonical is used when you have duplicate content. If you have the same post or page in two places, you can use the rel=canonical tag to tell Google which version is the original. It sounds like you don't need rel=canonical in this situation.
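For reference, the tag is a single line placed in the `<head>` of the duplicate page, pointing at the version you want indexed (the URL below is just a made-up example):

```html
<!-- Goes in the <head> of the DUPLICATE page, pointing at the preferred version -->
<link rel="canonical" href="https://example.com/preferred-page/" />
```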
It sounds like you have 80-something 404 Page Not Found errors. I would use the "Redirection" plugin with WordPress. Take each URL that is returning a 404 in your report and redirect it to the most relevant page, based on what was supposed to be on the missing page. If there really is no relevant page at all, I would just redirect it to the homepage; in my opinion, that's better than having the user land on a 404 page. I would do that for every 404 error you are getting. If you do this, I don't think you'll need rel=canonical at all.
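If you'd rather not use a plugin, the same per-page redirects can be done in an .htaccess file on an Apache server. A minimal sketch (the paths and domain below are placeholders, not your actual URLs):

```apache
# Per-page 301 redirects: old path -> most relevant live page (example paths only)
Redirect 301 /old-page.html https://example.com/most-relevant-page/
Redirect 301 /phantom-feed/ https://example.com/
```

The plugin approach is usually easier to maintain for dozens of URLs, since non-technical users can edit the list without touching server files.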
-
Thank you for your reply. You are correct, the site is in WordPress. The long story of this whole situation is this: I had initially built the client a WordPress site, things were fine, and traffic and business were good for him. Then one day one of his employees suggested that her father build him a new site that was more graphically pleasing (rather than asking me to update the graphics on the current site), so the father built an entirely new site on Joomla (I didn't find this out until he was launching it). He also changed the domain to the www version; the original site I had built used no www.
Fast forward: I have rebuilt the site in WordPress and gone back to the non-www version. I have 301 redirected the errors where I could. I have also changed the site settings in Webmaster Tools to the preferred domain (the new site), fetched all the new pages in Google, and submitted new sitemaps. I'm down to 82 errors. The remaining errors are for pages that do not exist, and I don't have pages that would really make sense to redirect them to. I'm wondering if I'm at the point where I need to "remove the URLs" as offered in Google Webmaster Tools. What do you think?
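For what it's worth, the www-to-non-www switch described above is usually handled site-wide with one rewrite rule rather than page by page. A minimal Apache sketch, assuming the domain is example.com (a placeholder):

```apache
# Redirect every www request to the non-www domain with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

Combined with the preferred-domain setting in Webmaster Tools, this tells both users and crawlers that the non-www version is canonical.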
As for rel=canonical: I understand why I would use it, and I see where I can insert it in the Yoast plugin. My question is this: do I insert the rel=canonical tag on the page that is correct? That is, do I go to my correct website (the non-www one), open the page I prefer the engines to index, and insert the rel=canonical there? Or do I go to the non-preferred page and insert the rel=canonical tag there, so that when search engines see the wrong page, the tag points them to the correct page to index? I watched a video by Matt Cutts and I wasn't clear on which page the rel=canonical goes on (the old site or the new one?). I don't have access to the old site, which is why I was thinking maybe I should just "remove the URLs" as offered in Webmaster Tools.
Your input? I really appreciate your help. Thank you
-
For me personally, on WordPress I use the Yoast SEO plugin, and I went through the tutorial on the Yoast website. It shows you how to eliminate a lot of the duplicate content that gets created automatically on every WordPress site. Once you noindex and get rid of all the unnecessary archives, I would recommend going back to the error report to see the difference and whether those pages keep coming up. If they do, simply 301 redirect them to another page on your website. Then check again after you redirect them and see what you're left with; from what I've seen, it sometimes takes a couple of weeks for the report to catch up. I'm not sure if this is the exact issue you're having, or if you're even using WordPress at all, but if you are, this might help you the way it helped me get my errors down to zero.
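For context, the noindexing that plugins like Yoast do behind the scenes amounts to emitting a robots meta tag on the archive pages you've excluded. A rough sketch of what ends up in the page (not the plugin's exact output):

```html
<!-- In the <head> of an archive page excluded from indexing -->
<meta name="robots" content="noindex, follow" />
```

The `follow` part lets crawlers still pass through the links on the page even though the page itself stays out of the index.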