404 Errors generating in WP
-
Our crawl reports are returning several 404 errors for pages with URLs that look like:
/category/consulting/page/5/
The tag changes, the page number changes, but the result is always the same: A big glaring 404. Our sites are built on WordPress Multi-site, and I am fairly certain this issue is on the WP end, but I can't figure out why it is generating pages out to infinity, essentially, from the tags and categories.
It is worse on some sites than others, but is happening across the board (my initial concern was that it might be a theme issue, but that does not seem to be the case).
If anyone has run into this issue and knows a fix, your insight would be greatly appreciated.
Thanks!
-
This is exactly what you want: once you have isolated the problem URLs with Screaming Frog, you can easily pick them out and redirect them, either through your host or through a plugin, depending on where you're hosted and how your site is being managed.
-
Here's what to do:
- Crawl the site with Screaming Frog (you can do up to 500 URLs with the free version).
- Look for the suspect pages in your crawl list.
- Click once on the page in the list.
- Below, click "To Links" - this box will show you the page(s) linking to the bad 404 page.
- View the "From" page (which links to the bad page) in your browser.
- Look at the source code for that page - do a "find" and paste in the bad page's URL. This will show you where it's linking from.
- Fix the spot in your theme or template where it is linking to pages that do not exist.
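The manual "find the linking page" steps above can also be sketched programmatically. Here is a minimal, illustrative Python example using only the standard library; the sample HTML and the URL are made up for demonstration, so substitute your own saved page source and the bad URL from your crawl report:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect every <a href> value on a page so links to a known 404 URL can be spotted."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def links_to(page_html, bad_url):
    """Return True if page_html contains an anchor pointing at bad_url."""
    finder = LinkFinder()
    finder.feed(page_html)
    return any(bad_url in href for href in finder.hrefs)

# The saved source of a "From" page, and the broken URL from the crawl report:
sample = '<p><a href="/category/consulting/page/5/">Older posts</a></p>'
print(links_to(sample, "/category/consulting/page/5/"))  # True
```

If this returns True for a page, that page is one of the "From" pages you need to fix in your theme or template.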
-
I want to first apologize for the wording of my first answer; I was using voice recognition and it made a few mistakes. I also agree with what Dan has said. He really knows WordPress and I would take his advice, and the fact that he's endorsed Lynn's answer makes me believe it is the correct one. Screaming Frog SEO Spider is one of the best tools in the world for any type of website, and the paid version can help you create 301s easily to get rid of your 404s.
Here is a great Screaming Frog guide by Seer:
http://www.seerinteractive.com/blog/screaming-frog-guide
I hope this helps,
Thomas
-
Hi Melissa, you've received some great responses. Did any of them help you resolve your issue?
-
I second Lynn's answer. You need to find where the link is coming from to begin with. You could also use Screaming Frog SEO Spider or Webmaster Tools - they will all get you the same thing. Find out where the bad URLs are linked from, and then you can narrow down the source of the bad code or whatever it may be.
-Dan
-
Yoast's SEO plugin for WordPress will eliminate the /1/ /2/ /3/ page effect.
http://yoast.com/wordpress/seo/
I hope this is of help. You're running multisite through a non-subdomain setup, I'm expecting, correct?
-
Hi Melissa,
It is actually pretty common for WP-based sites to spin out repeated 404 errors (or at least to have the potential to). It could be a theme issue, a plugin issue, a setup issue, or a combination of all three, depending on the site.
First thing to do is figure out where the link is coming from. Download your Moz error report as CSV and filter by 404s. On the left you will have the 404 page, and on the far right you will have the referring page. Go to the referring page, view the source, and try to find that link. Keep an eye out for relative links from deep-structure pages too (say you have page /category/page/5 and the link is a relative href="/" or href="/6", which can have the effect of spawning repeated 404s like /category/page/5/6 etc.).
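The relative-link effect described above is easy to see with Python's `urljoin`, which resolves links the same way a browser or crawler does. The domain and paths here are purely illustrative:

```python
from urllib.parse import urljoin

base = "http://example.com/category/page/5/"

# A root-relative href always resolves against the domain root:
print(urljoin(base, "/6"))       # http://example.com/6

# A bare relative href resolves against the current directory,
# which is how deep archive pages spawn ever-deeper 404s:
step = urljoin(base, "6/")
print(step)                      # http://example.com/category/page/5/6/
print(urljoin(step, "6/"))       # http://example.com/category/page/5/6/6/
```

Each crawl of the spawned page produces a still deeper URL, which is why the crawler seems to generate pages out to infinity.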
Once you have found the link, you should be able to identify the template part which is producing them and then act accordingly to either edit the template or adjust the settings/plugin so that they stop. If you can give us a real example of a 404 and its linking page we should be able to give you more specific info.
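The CSV-filtering step mentioned above can be sketched in a few lines of Python. The column names here are assumptions for illustration, so match them to whatever headers your actual export uses:

```python
import csv
import io

def filter_404s(report, status_field="HTTP Status Code"):
    """Yield rows from a crawl-error CSV whose status code is 404.
    The column name is an assumption; adjust it to your export."""
    for row in csv.DictReader(report):
        if row.get(status_field, "").strip() == "404":
            yield row

# A tiny stand-in for the downloaded report:
report = io.StringIO(
    "URL,HTTP Status Code,Referrer\n"
    "/category/consulting/page/5/,404,/category/consulting/\n"
    "/about/,200,/\n"
)
rows = list(filter_404s(report))
print(len(rows))       # 1
print(rows[0]["URL"])  # /category/consulting/page/5/
```

In practice you would pass an open file handle for the downloaded CSV instead of the in-memory sample.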
-
I seriously doubt the issue is with WP itself. Perhaps a plugin is conflicting with how the page is rendered. Make sure you have a database and file backup. You could FTP in and rename the plugins folder to _plugins (this will disable all plugins) and check whether the issue persists. Rename the folder back to plugins once you finish testing.
NOTE: with some plugins you will have to manually go in and enable and configure them again after you do this.
-
Hi, 404s are extremely common. I would build a custom 404 page, and I would also use this plugin here, but make sure you understand that it is by no means perfect. You must check the logs consistently and make sure the redirects are not mistaken; for instance, I used it once and it had forwarded my feed back to my homepage.
http://wordpress.org/plugins/redirection/
However, this is a very good plugin that has been progressively getting better. It is recommended by some of the best people in WP. I strongly recommend using it, and I hope I've helped you; if you need any more help, please let me know.
sincerely,
Thomas
Yes, this will also work if you're using Nginx, a much faster web server than Apache or even LiteSpeed. It is normally very hard to create 301s in an Nginx environment, but this tool is able to do so, and most managed WordPress hosts (which are the ones actually running Nginx these days) can then serve the correct 301s that the Redirection plugin creates.
The specialty of this Redirection plugin is catching 404s and redirecting them to the page they should logically point to. For instance, if /about/ were returning a 404 and /about-us/ existed, the plugin can figure out that /about-us/ is the correct redirect and make the correction automatically. You can also turn this mode off once you believe 404s are no longer an issue.
You can also use an Apache-only mode which, depending on your web server, might meet your needs better.
I think this is an outstanding plugin and definitely believe it is a huge help if you have any amount of 404s.
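If you would rather create such a 301 at the server level yourself, here is a minimal Nginx config sketch using the hypothetical /about/ to /about-us/ example from above (on Apache, the equivalent is a one-line `Redirect 301 /about/ /about-us/` in `.htaccess`):

```nginx
# Hypothetical example: permanently redirect the retired /about/ URL
# to the renamed /about-us/ page.
location = /about/ {
    return 301 /about-us/;
}
```

On most managed Nginx hosts you cannot edit the server config directly, which is exactly why a plugin that handles the redirects inside WordPress is so useful.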
I hope I've been of help,
Thomas