How to find 20 hidden 404s
-
Hello,
We have about twenty 404s left to find. How do you find these when:
1. They don't show up in Google Webmaster Tools.
2. They don't have any other internal or external pages linking to them.
3. They don't show up in site:domain.com (we have 9,000 pages and only 600 show up; I fixed the broken ones among those 600).
4. They are probably causing high bounce rates.
5. They're not in the sitemap.
Thanks!
-
Once you buy the paid version, you should be able to crawl the entire site without increasing the RAM allocation.
-
Right, the paid version won't stop! You can see on their website that the URL limit is removed once you buy the licence: http://www.screamingfrog.co.uk/seo-spider/licence/
It should catch everything, but maybe contact their support just to make sure; they're very responsive over Twitter.
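One caveat worth adding: a crawler, free or paid, can only discover URLs that something links to, so 404s with no inbound links at all (point 2 in the original question) won't show up in any crawl. They do show up in server access logs, though, whenever a visitor or bot requests them. A minimal stdlib sketch of mining logs for 404s (the log pattern and sample lines here are illustrative, not from the actual site):

```python
import re
from collections import Counter

# Matches the request path and status code in a combined-format
# (Apache/Nginx) access log line. Log formats vary by server,
# so adjust the pattern to match your own logs.
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

def find_404s(log_lines):
    """Count how often each requested path returned a 404."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group(2) == "404":
            hits[m.group(1)] += 1
    return hits

# Made-up sample lines for illustration:
sample = [
    '1.2.3.4 - - [10/Oct/2015:13:55:36 +0000] "GET /old-page HTTP/1.1" 404 209 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [10/Oct/2015:13:55:37 +0000] "GET /about HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [10/Oct/2015:13:58:02 +0000] "GET /old-page HTTP/1.1" 404 209 "-" "Googlebot/2.1"',
]
print(find_404s(sample))  # Counter({'/old-page': 2})
```

Because logs record real traffic, this also tells you which 404s are actually hurting visitors (and bounce rates) the most.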
-
Yes, but if it only checks 397 in the free version, is it going to stop there in the paid version as well? Just making sure.
Also, will it catch everything?
-
Yes, I just ran a crawl of 500,000 URLs.
-
The free version checked 397 URLs. Will the purchased version check all 9,000?
-
You can try crawling the site using Screaming Frog; make sure you check the box “Check Links Outside Folder” in the Spider menu.
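If you can export the full list of ~9,000 URLs from the CMS or database, you can also diff it against the crawl export to surface orphan pages the spider never reached, then check those URLs by hand. A rough sketch with made-up URLs (the file/column handling is up to you; this just shows the set difference):

```python
def find_orphans(known_urls, crawled_urls):
    """Return URLs the CMS knows about that no crawl ever reached.

    known_urls:   every URL the site should have (e.g. a CMS/database export)
    crawled_urls: the URL column of the crawler's export
    """
    known = {u.strip() for u in known_urls if u.strip()}
    crawled = {u.strip() for u in crawled_urls if u.strip()}
    return sorted(known - crawled)

# Illustrative, made-up URL lists:
cms_export = ["/products/widget", "/about", "/old-press-release"]
crawl_export = ["/products/widget", "/about"]
print(find_orphans(cms_export, crawl_export))  # ['/old-press-release']
```

Any URL in the difference is either an orphan page or a hidden 404, which is exactly the set the original question is after.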
Related Questions
-
Google Webmaster Tools - fixing over 20,000 crawl errors
Hi, I'm trying to gather all the 404 crawl errors on my website after a recent hacking that I've been trying to rectify and clean up. Webmaster Tools states that I have over 20,000 crawl errors, but I can only download a sample of 1,000. Is there any way to get the full list, instead of correcting 1,000 errors, marking them as fixed, and waiting for the next batch of 1,000 to be listed in Webmaster Tools? The current method is quite time-consuming, and I want to take care of all the errors in one shot instead of over the course of a month.
Intermediate & Advanced SEO | FPK
How to Find Competitors Traffic Sources
Is it possible to find out where traffic is coming from on someone else's website?
Intermediate & Advanced SEO | Hall.Michael
Creating 20+ websites with links back to central site
Hey guys, A client of ours owns an IT company with 20+ locations across the UK. He is looking for a solution to provide each of their 20+ locations with a page or website that they can manage themselves that links directly back to the main site. His idea is to create 20+ one or two page websites that could all link back to the main central site - aiding the possibility of ranking well for locally-based terms. At the moment, we have a page for each of the 20+ locations on the main site. However, the client wants to give his franchisees complete control over their web presence. Would a setup like this work? Would it be logical to have 20+ websites (likely to follow a very similar format) all pointing to one central website? Would we have to "no-follow" links back to main site in order to show we aren't trying to manipulate page rank? Would creating sub folders on the main site be a better option for each of the 20+ locations? Any feedback appreciated!
Intermediate & Advanced SEO | Webrevolve
Empty search results labeled as Soft 404s?
I have a site with faceted search, but sometimes when someone drills down too far it ends up with no results. The page outline and faceted navigation are still there. The site uses dynamic URLs for the faceted navigation, but Google is reporting these no-results pages as soft 404s. How should we handle these? Should we redirect them? Can we return a 404 status code but still show the no-results page they are looking for? Thanks for your responses.
Intermediate & Advanced SEO | MarloSchneider
How does the crawl find duplicate pages that don't exist on the site?
It looks like I have a lot of duplicate pages which are essentially the same URL with some extra ? parameters added, e.g.: http://www.merlin.org.uk/10-facts-about-malnutrition http://www.merlin.org.uk/10-facts-about-malnutrition?page=1 http://www.merlin.org.uk/10-facts-about-malnutrition?page=2 These extra two pages (and there's loads of pages this happens to) are a mystery to me. Not sure why they exist, as there's only one page. Is this a massive issue? It's built on Drupal, so I wonder if it auto-generates these pages for some reason? Any help MUCH appreciated. Thanks
Intermediate & Advanced SEO | Deniz
My warning report says I have too many on-page links - 517! I can't find 50% of them, but my question is about nofollow
If we put 'nofollow' on some of these links, does that mean the search engines won't index the nofollow'd pages even if those pages are linked to from elsewhere? Will no link juice flow from the page the nofollow links are on? Just trying to understand why my rankings have dropped so dramatically in the last six weeks or so since we redesigned the site; it might be that we now have too many links on the homepage. This is the page: http://www.suffolktouristguide.com/ All suggestions appreciated!
Intermediate & Advanced SEO | SarahinSuffolk
Hidden keywords - how many per page?
Hi All, We have a booking website we want to optimize for keywords we cannot really show, because some of our partners wouldn't want it. We figured we can put said keywords or close synonyms on-page in various places that are not too dangerous (e.g. image names, image alt tags, URLs, etc.). The question is how many keywords we can target. We know keyword stuffing is detrimental, and we will not start creating long URLs stuffed with keywords, same for H1 tags or page titles. So how many is acceptable/not counterproductive? Thanks!
Intermediate & Advanced SEO | Philoups
How to find pages hardest hit
I have been hearing that Panda can penalize a website for low-quality pages. I have run a duplicate content check and done my best to go through the whole website. I hear many people talking about deleting or fixing their hardest-hit pages. My question is: how can I find which pages on our website are hardest hit? Is there any way to check a website for pages that might score low? We do have an ecommerce section on the website, which I am concerned might be considered low quality for each product page. Any advice would be a great help.
Intermediate & Advanced SEO | fertilityhealth