What are the solutions for Crawl Diagnostics?
-
Hi Mozers,
I am pretty new to SEO and wanted to know the solutions for the various errors reported in Crawl Diagnostics. If this question has already been asked, please point me in the right direction.
The following queries are specific to my site; I just need help with these two:
1. Error 404 (about 60 errors): These are all PA 1 links that are no longer on the server. What do I do with these?
2. Duplicate Page Content and Title (about 5,000): Most of these are URLs that are generated automatically when someone fills in any info on our website, for example www.abc.fr/signup.php?id=001, www.abc.fr/signup.php?id=002, and so on. What do I need to do with these URLs, and how?
Any help would be highly appreciated.
I have read a lot on the forums about duplicate content, but I don't know how to implement it in my case. Please advise.
Thanks in advance.
CY
-
I am glad I brought it to your notice.
One last question for you: I have pointed the canonical of all these PR 1 pages to my home page. Is that fine?
-
Charu, I went through one of my campaigns and it looks like you're right about that.
-
Thanks, I did what you suggested.
However, after setting the canonical tag, Moz no longer seems to report these as duplicate content or missing/duplicate titles; instead, it lists them under notices as "Rel=Canonical tag".
Please correct me if I am wrong.
-
404 errors usually mean that there are active links on your site pointing to pages that don't exist. You need to find those links using OSE or a tool like Screaming Frog, and then remove or revise them.
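If some of the dead pages have close live equivalents, a 301 redirect can also preserve any link value they had. A minimal sketch in an Apache .htaccess file, using hypothetical old and new paths (your actual URLs will differ):

```apache
# Hypothetical example: send a retired page to its closest live replacement
Redirect 301 /old-page.html /new-page.html

# Or, with mod_rewrite enabled, pattern-match a whole retired section
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

Only redirect where a genuinely equivalent page exists; otherwise removing the stale links is the cleaner fix.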
Duplicate content: add a rel="canonical" tag to the head of the www.abc.fr/signup.php page that marks it as the canonical version. The duplicate versions will then carry the same tag, indicating that the original is the canonical version of the content. While that's good practice for the search engines, it won't stop those pages from showing up in Moz's reports as duplicates; they will continue to show up as dupes if they get crawled.
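As a sketch, the tag goes inside the `<head>` of signup.php so that every parameterized variant points back to one URL (the `id` parameter here follows the example URLs above):

```html
<!-- Served on every variant, e.g. /signup.php?id=001, /signup.php?id=002 -->
<link rel="canonical" href="http://www.abc.fr/signup.php" />
```

Because the page is generated server-side, a single tag in the shared template covers all the auto-generated URLs at once.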