Increase in 404 errors for pages that don't exist
-
Hi
First of all, I should say that I see these errors in the old version of Webmaster Tools, not the new one.
I have two WordPress blogs, one in the root and one in a subfolder.
Today I checked Webmaster Tools and noticed that I had about 100 errors (404s) that were first found a few days ago.
My root WordPress install is OK, but the subfolder install has errors. Let me show you with an example:
http://example.com/subfolder/article15245
I got an error for this page:
http://example.com/article15245
It looks as if the subfolder part has been deleted from the URL.
I checked my links, but all of them were OK and pointed to the right URLs.
Unfortunately, these errors don't have a "linked from" section.
-
Thanks for your reply
Today I redirected most of these links to the right posts, but it was such a boring task.
-
The page-by-page redirection is necessary to preserve any link juice from incoming links to the pages in question. You can throw the URLs into Ahrefs or look at them in Moz to see whether the pages have any links worth saving. Also, SEOPress has the ability to edit .htaccess built into the plugin, but you can do WordPress-level redirects as well. It's pretty awesome.
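If the only breakage is the missing subfolder segment (as in the example URLs in the question), the whole batch can often be handled with a single pattern rule rather than one line per page. A minimal .htaccess sketch, assuming Apache with mod_alias and that every broken URL is just `/articleNNNNN` missing its `/subfolder` prefix (the path names come from the example above, not from the real site):

```apache
# One pattern-based permanent redirect instead of hundreds of
# individual lines: capture the article slug and re-prefix it.
# Assumes the broken URLs all look like /article12345.
RedirectMatch 301 ^/(article[0-9]+)$ /subfolder/$1
```

If the URLs don't follow one pattern, per-page `Redirect 301 /old-path /new-path` lines do the same job, just more verbosely.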
Hope that helps.
-
You can put the domain here; I'm sure lots of people would like to weigh in on this. It's an interesting problem.
I have replied to your email.
-
Redirect them one by one? It's so boring!
Does the "linked from" section ever update for these links?
I use the Redirection plugin instead of .htaccess; it's safer.
What happens if I don't redirect them until I figure out why these errors occur?
Thanks for your response.
-
Thanks for your excellent answer.
I checked these links in Moz Link Explorer, but no links were found. I think this is an internal problem, because most of my subfolder links (over 70%) have become 404s.
I have the Redirection plugin. It has a 404 section that shows recent visitors who hit 404 pages, but no reports matching these errors were found!
As you said, it seems I should redirect them with .htaccess.
Thanks, I emailed my domain to you.
Can I put my domain here for others to check?
-
I would just 301 all the pages to the final URLs in production, verify that they are working individually, then Fetch & Render. Many plugins like SEOPress or Yoast will let you upload redirects in bulk to help save time, or you can always update your .htaccess file with the redirects. If you are working in Excel or Sheets, using Find/Replace to bulk-edit can be a lifesaver. It is usually pretty boring, but not the worst job in the world. Cheers!
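If the old/new URL pairs are already sitting in a spreadsheet, generating the redirect lines mechanically is less error-prone than hand-editing them. A small sketch of that step; the CSV layout (old path, new path) and the article paths are assumptions for illustration, not taken from the real site:

```python
import csv
import io

def redirects_from_csv(csv_text):
    """Turn rows of 'old-path,new-path' into Apache 'Redirect 301' lines."""
    lines = []
    for old, new in csv.reader(io.StringIO(csv_text)):
        lines.append(f"Redirect 301 {old.strip()} {new.strip()}")
    return "\n".join(lines)

# Example: a two-row export from Sheets (paths are placeholders).
mapping = "/article15245,/subfolder/article15245\n/article15246,/subfolder/article15246"
print(redirects_from_csv(mapping))
# Redirect 301 /article15245 /subfolder/article15245
# Redirect 301 /article15246 /subfolder/article15246
```

The output can be pasted straight into .htaccess, or reshaped for whatever bulk-import format the redirect plugin expects.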
-
It's so annoying when things like that happen! When Google refuses to give the "linked from" data, it's a real puzzle working out where the links are coming from. Did you know that the links could even be coming from other websites, not just your own? When a user follows a link to your site (regardless of where that link is from), Google considers it your error if a valid page isn't returned.
Since this error is only occurring in the old area of WMT, it probably doesn't matter much. That being said, one simple fix would be to 301 redirect all the broken links to the functional article pages. After that you can just bulk-mark them all as fixed.
Usually I tell people to fix the actual link, but if it's an external link which you have no control over (or if Google can't even be bothered to tell you what the linking page is), then 301 and mark as fixed is probably your best bet. Especially since these are only individual article pages (it's not like a malformed version of your homepage or something).
If you email me the domain (check my profile page), then I might be able to crawl your site for you to determine whether there are any obviously broken internal links. Regardless, you'd want the 301s as a back-stop anyway.
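For anyone who wants to do that crawl themselves, the core of it is just fetching each page, extracting the same-host links, and then checking each one's status code. A minimal sketch of the extraction step using only the Python standard library (the page snippet and URLs are placeholders, not from the real site):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags as the parser walks the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(html, base_url):
    """Resolve hrefs against base_url and keep only same-host links."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    resolved = (urljoin(base_url, href) for href in parser.links)
    return sorted({u for u in resolved if urlparse(u).netloc == host})

page = '<a href="/subfolder/article15245">post</a> <a href="https://other.example/x">ext</a>'
print(internal_links(page, "http://example.com/"))
# ['http://example.com/subfolder/article15245']
```

From there, a loop that requests each internal link and flags anything returning 404 would surface broken internal links; dedicated crawlers like Screaming Frog do the same thing at scale.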
Hope that helps
Related Questions
-
Many of my pages are showing up as 4xx errors because they require a login/subscription to view. How can I fix this?
Should these pages be redirecting to a login page? Or does there need to be something other than an Access Denied message? I feel like these are bringing the overall site score down.
On-Page Optimization | CaraMc
-
Getting 404 Errors on Fonts After Website Redesign
Hi, I recently redesigned my website and I changed a few URLs, so I have been keeping an eye on 404 errors and setting up redirects as needed. However, my 404 error log keeps getting filled with font-related URLs like these:
/fonts/AvenirLTStd-Heavy.otf, /fonts/AvenirLTStd-Heavy.woff, /fonts/AvenirLTStd-Heavy.eot
/fonts/AvenirLTStd-Book.otf, /fonts/AvenirLTStd-Book.woff, /fonts/AvenirLTStd-Book.eot
/fonts/FrutigerLTStd-Black.otf, /fonts/FrutigerLTStd-Black.woff, /fonts/FrutigerLTStd-Black.eot
/fonts/FrutigerLTStd-Bold.otf, /fonts/FrutigerLTStd-Bold.woff, /fonts/FrutigerLTStd-Bold.eot
Any idea what is causing these 404 errors? Is this something that might hurt my SERP?
On-Page Optimization | DohenyDrones
-
Website server errors
I launched a new website at www.cheaptubes.com and had recovered my search engine rankings after the Penguin & Panda devastation. I was continuing to improve the site on Sept 26th by adding caching of images and W3 Total Cache, but Moz Analytics is now saying I went from 288 medium issues to over 600, and I see the warning "45% of site pages served 302 redirects during the last crawl". I'm not sure how to fix this. I'm on WP using Yoast SEO, so all the 301s I did are 301s, not 302s. I do have SSL; could it be HTTP vs HTTPS? I've asked this question before, and two very nice people replied with suggestions which I tried to implement but couldn't; I got the WP white screen of death several times. They suggested the code below. Does anyone know how to implement this code, or some other way to reduce the errors I'm getting? I've asked this at Stack Overflow with no responses.
"You have a lot of http & https issues, so you should fix these with a bit of .htaccess code, as below:
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
You also have some non-www to www issues. You can fix these in .htaccess at the same time:
RewriteCond %{HTTP_HOST} !^www.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
You should find this fixes a lot of your issues. Also check in your WordPress general settings that the site is set to www.cheaptubes.com for both instances."
When I tried to do as they suggested, it gave me an internal server error. Please see the code below from .htaccess, and the server error. I took it out for now.
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
RewriteCond %{HTTP_HOST} !^www.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
</IfModule>
# END WordPress
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator, [email protected], and inform them of the time the error occurred, and anything you might have done that may have caused the error. More information about this error may be available in the server error log. Additionally, a 500 Internal Server Error was encountered while trying to use an ErrorDocument to handle the request.
On-Page Optimization | cheaptubes
-
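For the cheaptubes question above, two details are worth flagging. First, `[R,L]` without an explicit status code issues a 302, which by itself would explain a crawler reporting pages "served 302 redirects"; `[R=301,L]` makes the redirect permanent. Second, redirect rules appended after WordPress's catch-all `RewriteRule . /index.php [L]` never fire for page requests, because the `[L]` flag stops rule processing before they are reached. A minimal sketch of an ordering that avoids both problems, assuming Apache with mod_rewrite and a `https://www.` canonical host (this is an assumption about the intended setup, not verified against the live site):

```apache
# Canonicalisation first: send any non-HTTPS or non-www request to
# the https://www host in one hop, as a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.cheaptubes.com/$1 [R=301,L]

# BEGIN WordPress -- the stock front controller, left unchanged.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
```

Keeping the canonicalisation rules outside and above the WordPress block also makes them survive WordPress rewriting its own section of the file.
-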
Internal 404 Error
Hi, sorry for the newbie question. I have a few 404 pages on my Moz crawl report, for example this one: http://www.dwliverpoolphotography.co.uk/blog/www.coraclecomm.wordpress.com. How can I find the page that is linking to it, so I can fix the link or delete it? Best wishes, David.
On-Page Optimization | WallerD
-
Moz not showing blog errors
I have a campaign running in Moz for getpromoted.in. Moz is only crawling and showing errors for the website, not the blog (blog.getpromoted.in). Please advise what the issue is.
On-Page Optimization | zigmund
-
Why so many crawl errors?
Our site is showing a ton of crawl errors in the back end, mostly concerning duplicate content within our blog. The content is unique, however. We know this for certain because it's done in-house or put together by some of the freelance writers we work with. The site is for an RV dealership, and we're using a template-based system from a well-known company. Any ideas on what may be causing this?
On-Page Optimization | BlakeArbogast
-
How do I avoid duplicate content and page title errors when using a single CMS for a website
I am currently hosting a client site on a CMS with both a Canadian and a USA version of the website. We have the .com as the primary domain, and the .ca is redirected from the registrar to the Canadian home page. The problem I am having is that my campaign produces errors for duplicate page content and duplicate page titles. Is there a way to set up the two versions on the CMS so that these errors are not produced? My concern is getting penalized by search engines. Appreciate any help. Mark Palmer
On-Page Optimization | kpreneur
-
Tag clouds: good for internal linking and increase of keyword relevant pages?
As Matt Cutts explained, tag clouds are OK if you're not engaged in keyword stuffing (http://www.youtube.com/watch?v=bYPX_ZmhLqg), i.e. if you're not putting in 500 tags. I'm currently creating tags for an online bookseller; just like Amazon, this e-commerce site has potentially a couple of million books. Tag clouds will be added to each book detail page in order to enrich each of these pages with relevant keywords, both for search engines and users (get a quick overview of the main topics of the book; navigate the site and find other books associated with each tag). Each of these book-specific tag clouds will hold up to 50 tags max, typically in the range of 10-20. From an SEO perspective, my question is twofold: 1. Does the site benefit from these tag clouds by improving the internal linking structure? 2. Does the site benefit from creating lots of additional tag-specific pages (up to 200k different tags), or can these pages become a problem, as they don't contain a lot of rich content as such, but rather lists of books associated with each tag? Thanks in advance!
On-Page Optimization | semantopic