4xx fix
-
Hi
I have quite a lot of 4xx errors on our site. The 4xx errors occurred because I cleaned up poor URLs that had commas etc. in them, so it's the old URLs that now 4xx. There are no links to the URLs that 4xx. What is the best way of rectifying this issue of my own making?!
Thanks
Gavin -
OK, thanks Dean. I'll update the sitemap and look into rectifying the errors identified by Screaming Frog.
Thanks for your assistance!
-
No, I would recommend that you fix the underlying issue. I can see from your sitemap that you still have the URLs with commas in them.
Personally, I would use Screaming Frog (screamingfrog.co.uk) to find your crawl errors, as you will not need to wait a week for the next report.
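While waiting for a crawler, you can also check the sitemap directly for leftover comma URLs. A minimal sketch in Python, assuming a standard sitemaps.org-format sitemap (the sitemap fragment and the "bad characters" list here are hypothetical, for illustration only):

```python
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace, so <loc> elements must be looked up with it.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def find_malformed_urls(sitemap_xml, bad_chars=(",", " ")):
    """Return the sitemap <loc> URLs that still contain unwanted characters."""
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
    return [u for u in urls if any(c in u for c in bad_chars)]

# Hypothetical sitemap fragment standing in for the real file
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/red,widgets</loc></url>
  <url><loc>https://example.com/red-widgets</loc></url>
</urlset>"""

print(find_malformed_urls(sitemap))  # only the comma URL is flagged
```

Any URL this flags should be removed from the sitemap (or redirected) as part of the cleanup.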
-
I was waiting for the next crawl, as I thought the 4xx errors would be removed from the crawl diagnostics; however, I received a new crawl report today and they are still listed in the report.
I think the simplest way to remove the 4xx errors would be to create 301s for the old URLs. Would you agree? -
So, since you tidied the URLs, has Moz crawled your site again, or are you waiting for the next crawl?
-
Where are you seeing the errors being reported? If you have corrected the problem with the error URLs and there are no links to these URLs, then there should not be a problem.
If, however, you are seeing these URLs in the search results, then yes, a 301 redirect would be appropriate.
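If redirects do turn out to be needed, the mapping from old comma URLs to cleaned URLs can be generated rather than written by hand. A rough sketch in Python, assuming the cleanup simply replaced commas with hyphens (the example URLs and the Apache `Redirect 301` output format are assumptions, not taken from the site in question):

```python
from urllib.parse import urlsplit

def clean_path(path):
    """Normalize a messy path the way the site cleanup (assumed) did:
    replace commas with hyphens."""
    return path.replace(",", "-")

def redirect_map(old_urls):
    """Return (old_path, new_path) pairs for URLs whose path actually changed."""
    pairs = []
    for url in old_urls:
        path = urlsplit(url).path
        new = clean_path(path)
        if new != path:
            pairs.append((path, new))
    return pairs

# Hypothetical examples of old URLs
old = ["https://example.com/cakes,chocolate", "https://example.com/pound-cakes"]
for src, dst in redirect_map(old):
    # Apache-style rule shown for illustration; adapt to your server
    print(f"Redirect 301 {src} {dst}")
```

Only the URLs that actually changed get a rule, so already-clean URLs pass through untouched.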
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site. I also removed several ancient sitemaps that listed content deleted years ago that Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | | tif-swedensky0 -
Keyword research, creating copy, fixing on-page optimisation - what next?
Hello - Wondered if I could get people's thoughts. We/I have started working on a client's website to improve everything - a general overhaul across SEO, on-page optimisation etc. I'm relatively new to this although picking things up and learning on the job which is great, and Moz is so helpful! So far we have conducted a review of the website, created a large list of keywords and analysed these, started overhauling the copy and adding the new keywords within this, have plans to overhaul the other elements of the site (headings, tags etc) and improve the design, functionality and customer journey through the website. My question is: where do I go from here in terms of keywords and SEO? Is it a case of plugging in the keywords we've researched, watch how they perform, and then switch things up with different keywords if they aren't performing as well as we expected? Is it really a lot of trial and error or is there an exact science behind it that I'm missing? I just feel a little as though we've pulled these keywords out of thin-air to a degree, and are adding them into our copy because the numbers on Moz show they should perform well, and they are what we are trying to promote on the website. But I don't know if this is right?! Perhaps I'm over-thinking it...
Technical SEO | | WhitewallGlasgow0 -
Site splitting value of our pages with multiple variations. How can I fix this with the least impact?
Just started at a company recently, and there is a preexisting problem that I could use some help with. Somebody please tell me there is a low impact fix for this: My company's website is structured so all of the main links used on the nav are listed as .asp pages. All the canonical stuff. However, for "SEO Purposes," we have a number of similar (not exact) pages in .html on the same topic on our site. So, for example, let's say we're a bakery. The main URL, as linked in the nav, for our Chocolate Cakes, would be http://www.oursite.com/chocolate-cakes.asp. This differentiates the page from our other cake varieties, such as http://www.oursite.com/pound-cakes.asp and http://www.oursite.com/carrot-cakes.asp. Alas, fully indexed in Google with links existing only in our sitemap, we also have: http://www.oursite.com/chocolate-cakes.html http://www.oursite.com/chocolatecakes.html http://www.oursite.com/cakes-chocolate.html This seems CRAZY to me, because wouldn't this split our search results 4 ways? Am I right in assuming this is destroying the rankings of our canonical pages? I want to change this, but problem is, none of the content is the same on any of the variants, and some of these pages rank really well - albeit mostly for long tail keywords instead of the good, solid keywords we're after. So, what I'm asking you guys is: How do I burn these .html pages to the ground without completely destroying our rankings for the other keywords? I want to 301 those pages to our canonical nav URLs but, because of the wildly different content, I'm afraid that we could see a heavy drop in search traffic. Am I just being overly cautious? Thanks in advance!
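Consolidating variants like these is essentially a one-to-many redirect map: every `.html` duplicate points at its canonical `.asp` page. A minimal sketch in Python that emits Apache-style 301 rules (the variant list is taken from the hypothetical bakery example above, and the `Redirect 301` syntax assumes an Apache server):

```python
# Each canonical page maps to the duplicate variants that should 301 to it.
# These paths are illustrative, from the bakery example in the question.
canonical_map = {
    "/chocolate-cakes.asp": [
        "/chocolate-cakes.html",
        "/chocolatecakes.html",
        "/cakes-chocolate.html",
    ],
}

def htaccess_rules(mapping):
    """Emit one redirect rule per duplicate variant."""
    rules = []
    for canonical, variants in mapping.items():
        for variant in variants:
            rules.append(f"Redirect 301 {variant} {canonical}")
    return rules

for rule in htaccess_rules(canonical_map):
    print(rule)
```

Whether to actually pull the trigger on those redirects, given the long-tail rankings the variants hold, is the judgment call the question is really about; the sketch only shows that the mechanical part is cheap.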
Technical SEO | | jdsnyc20 -
How to fix broken links?
Hi, I use the WordPress CMS with the Yoast SEO plugin. I have just found out that my 403 errors increased dramatically. It seems that all my tags below each post are broken for some reason. When I click on the tags I get the following message: **403 Forbidden - Request forbidden by administrative rules.** I assume it has something to do with the configuration within the Yoast SEO plugin. Does anyone know how I should fix that? Thanks, Raviv
Technical SEO | | Indiatravelz0 -
Google WMT continues reporting fixed 404s - why?
I work with a news site that had a heavy restructuring last spring. This involved removing many pages that were duplicates, tags, etc. Since then, we have taken very careful steps to remove all links coming into these deleted pages, but for some reason, WMT continues to report them. By last August, we had cleared over 10k 404s on our site, but this lasted only for about 2 months and they started coming back. The "linked from" field gives no data, and other crawlers like SEOmoz aren't detecting any of these errors. The pages aren't in the sitemap and I've confirmed that they're not really being linked to from anywhere. Why do these pages keep coming back? Should I even bother removing them over and over again? Thanks -Juanita
Technical SEO | | VoxxiVoxxi0 -
We just fixed a Meta refresh, unified our link profile and now our rankings are going crazy
Crazy in a bad way! I am hoping that perhaps some of you have experienced this scenario before and can shed some light on what might be happening.

Here is what happened: We recently fixed a meta refresh that was on our site's homepage. It was completely fragmenting our link profile. All of our external links were being counted towards one URL, and our internal links were counting for the other URL. In addition to that, our most authoritative URL, because it was subject to a meta refresh, was not passing any of its authority to our other pages.

Here is what happened to our link profile:
Total External Links: Before - 2,757 After - 4,311
Total Internal Links: Before - 125 After - 3,221
Total Links: Before - 2,882 After - 7,532
Technical SEO | | danatanseo

Yeah... huge change. Great, right? Well, I have been tracking a set of keywords that were ranking from spots 10-30 in Google. There are about 66 keywords in the set. I started tracking them because at MozCon last July Fabio Riccotta suggested that targeting keywords showing up on page 2 or 3 of the results might be easier to improve than terms that were on the bottom of page 1. So, take a look at this. The first column shows where a particular keyword ranked on 11/8, the second column shows where it is ranking today, and the third column shows the change. For obvious reasons I haven't included the keywords.

11/8 11/14 Change
10 44 -34
10 26 -16
10 28 -18
10 34 -24
10 25 -15
15 29 -14
16 33 -17
16 32 -16
17 24 -7
17 53 -36
17 41 -24
18 27 -9
19 42 -23
19 35 -16
19 - Not in top 200
19 30 -11
19 25 -6
19 43 -24
20 33 -13
20 41 -21
20 34 -14
21 46 -25
21 - Not in top 200
21 33 -12
21 40 -19
21 61 -40
22 46 -24
22 35 -13
22 46 -24
23 51 -28
23 49 -26
24 43 -19
24 47 -23
24 45 -21
24 39 -15
25 45 -20
25 50 -25
26 39 -13
26 118 -92
26 30 -4
26 139 -113
26 57 -31
27 48 -21
27 47 -20
27 47 -20
27 45 -18
27 48 -21
27 59 -32
27 55 -28
27 40 -13
27 48 -21
27 51 -24
27 43 -16
28 66 -38
28 49 -21
28 51 -23
28 58 -30
29 58 -29
29 43 -14
29 41 -12
29 49 -20
29 60 -31
30 42 -12
31 - Not in top 200
31 59 -28
31 68 -37
31 53 -22

Needless to say, this is exactly the opposite of what I expected to see after fixing the meta refresh problem. I wouldn't think anything of normal fluctuation, but every single one of these keywords moved down, almost consistently 20-25 spots. The further down a keyword was to begin with, it seems the further it dropped.

What do you make of this? Could Google be penalizing us because our link profile changed so dramatically in a short period of time? I should say that we have never taken part in spammy link-building schemes, nor have we ever been contacted by Google with any kind of suspicious link warnings. We've been online since 1996 and are an e-commerce site doing #RCS. Thanks all!0 -
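For anyone auditing a site for the kind of meta refresh described above, the tag is straightforward to detect programmatically. A rough sketch in Python (the sample page markup below is hypothetical; a regex check like this is a quick triage, not a full HTML parser):

```python
import re

# Case-insensitive match for <meta http-equiv="refresh" ...> tags.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv\s*=\s*["\']?refresh["\']?[^>]*>', re.IGNORECASE
)

def has_meta_refresh(html):
    """Return True if the page source contains a meta refresh tag."""
    return bool(META_REFRESH.search(html))

# Hypothetical homepage markup with a zero-delay meta refresh
page = ('<html><head><meta http-equiv="refresh" '
        'content="0; url=https://example.com/home"></head></html>')
print(has_meta_refresh(page))  # True
```

Running a check like this across key landing pages makes it easy to confirm a fix actually removed the refresh, rather than inferring it from rank movements.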
Fix duplicate content caused by tags
Hi everyone, TGIF. We are getting hundreds of duplicate content errors on our WP site, and they appear to be caused by our tags. For each tag and each post we are seeing a duplicate content error. I thought I had this fixed, but apparently I do not. We are using the Genesis theme with Yoast's SEO plugin. Does anyone have the solution to what I imagine is an easy fix? Thanks in advance.
Technical SEO | | okuma0 -
How do I fix a multiple domain mess?
I just picked up an account that is a franchisee and they have 6 exact match domains plus their main domain (all exact duplicates, not 301s). GA shows the main domain as getting the lion's share of hits, but for some important local keywords, the exact match domains rank higher. Some pages may have the exact match domain, primary domain and the franchise domain all ranking on the same SERP. Yuk! My strategy is to work on the main domain and as the work progresses, the main site will surpass the exact match domains and main franchise domain for the important searches. For the exact match domains I plan on just leaving them alone. Is this a sound strategy? I could pull the exact match domains down, but since they rank well for their keywords, it seems most sensible to leave them up. What do you think?
Technical SEO | | KristinnD0