How to crawl the whole domain?
-
Hi,
I have an e-commerce website with more than 4,600 products, and I expect SEOmoz to check all of the URLs. I don't know why this isn't happening.
The campaign name is Artigos para festa and it should scan the whole domain festaexpress.com, but it crawls only 100 pages.
I even tried creating a new campaign named Festa Express - Root Domain to see if that would scan, but I had the same problem: it crawled only 199 pages.
Hope to have a solution.
Thanks,
Eduardo -
Hi Keri, thanks, I just sent it to them.
Regards,
Eduardo. -
Hi Eduardo,
I'm sorry you're still having problems. At this point, it'd be best for you to send an email to [email protected] and have our help team look at it for you. They'd be the ones who could give you the most advice for diagnosing this.
Keri
-
Still have the same problem. Isn't that an issue with SEOmoz?
The domain www.festaexpress.com has no Flash and is crawled by Google with no issues. Regards,
Eduardo. -
Hi Eduardo.
The way crawlers work is that they begin on your home page and "crawl": they look at all the links on your home page and follow each one to the next page, then the next, until your whole site has been captured.
Why are only 100 pages being crawled?
Most likely either your site is not very well linked, your navigation system is poor, or your navigation and links are presented in a format such as Flash, which the crawler cannot read.
Another possibility would be if the crawler is being blocked or hindered by your robots.txt file.
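The crawling process described above can be sketched as a simple breadth-first search over links. This is a minimal illustration using only the Python standard library, run against a tiny in-memory "site" (the page contents are made up for the example); it shows why a page no other page links to is never discovered:

```python
# Minimal sketch of how a crawler discovers pages: start at the home
# page, extract its links, and follow each new one until no unvisited
# URLs remain. PAGES is a stand-in for a tiny example site.
from collections import deque
from html.parser import HTMLParser

PAGES = {
    "/": '<a href="/shop">Shop</a> <a href="/about">About</a>',
    "/shop": '<a href="/">Home</a> <a href="/product-1">Product 1</a>',
    "/about": "",
    "/product-1": "",
    "/orphan": "",  # never linked from anywhere, so it is never found
}

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start="/"):
    seen, queue = {start}, deque([start])
    while queue:
        url = queue.popleft()
        extractor = LinkExtractor()
        extractor.feed(PAGES.get(url, ""))
        for link in extractor.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

print(sorted(crawl()))  # ['/', '/about', '/product-1', '/shop']
```

Note that `/orphan` never appears in the result: if your internal linking misses whole sections of the site, a crawler stops short, which is consistent with a crawl ending at 100 pages.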
-
Not sure, but you could try Microsoft's IIS tool to spider your site. It is possible that your site has issues that make it difficult to spider, which may be why SEOmoz's bot isn't working. You could also try something like Xenu Link Sleuth or HTTrack.
Related Questions
-
Unsolved Moz can't crawl my site
Moz is being blocked from crawling the following site - https://www.cleanchain.com. When looking at robots.txt, the following is disallowing access, but I don't know whether this is preventing Moz from crawling too?
User-agent: *
Disallow: /adeci/
Disallow: /core/
Disallow: /connectors/
Disallow: /assets/components/
Could something else be preventing the crawl?
Moz Pro | | danhart2020 -
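One way to sanity-check rules like those quoted in the question is Python's standard-library robots.txt parser. This sketch assumes Moz's crawler identifies itself as "rogerbot" and uses the disallow rules from the question; since none of them cover the site root, they should not block a full crawl:

```python
# Check whether the quoted robots.txt rules block a given crawler.
# The rules only disallow a few subdirectories, so the site root
# (and everything outside those paths) remains fetchable.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /adeci/",
    "Disallow: /core/",
    "Disallow: /connectors/",
    "Disallow: /assets/components/",
])

print(rp.can_fetch("rogerbot", "https://www.cleanchain.com/"))        # True
print(rp.can_fetch("rogerbot", "https://www.cleanchain.com/core/x"))  # False
```

If the root is fetchable, as here, something other than these robots.txt rules is likely stopping the crawl.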
Have a campaign, but it only states 1 page has been crawled by SEOmoz bots. What needs to be done to have all the pages crawled?
We have a campaign running for a client in SEOmoz and only 1 page has been crawled per SEOmoz' data. There are many pages in the site and a new blog with more and more articles posted each month, yet Moz is not crawling anything, aside from maybe the Home page. The odd thing is, Moz is reporting more data on all the other inner pages though for errors, duplicate content, etc... What should we do so all the pages get crawled by Moz? I don't want to delete and start over as we followed all the steps properly when setting up. Thank you for any tips here.
Moz Pro | | WhiteboardCreations0 -
Why does Crawl Diagnostics report this as duplicate content?
Hi guys, we've been addressing a duplicate content problem on our site over the past few weeks. Lately, we've implemented rel canonical tags in various parts of our ecommerce store and observed the effects over time by tracking changes in both SEOmoz and Webmaster Tools. Although our duplicate content errors are definitely decreasing, I can't help but wonder why some URLs are still being flagged with duplicate content by the SEOmoz crawler. Here's an example, taken directly from our Crawl Diagnostics report:
URL with 4 duplicate content errors:
/safety-lights.html
Duplicate content URLs:
/safety-lights.html?cat=78&price=-100
/safety-lights.html?cat=78&dir=desc&order=position
/safety-lights.html?cat=78
/safety-lights.html?manufacturer=514
What I don't understand is that all of the URLs with URL parameters have a rel canonical tag pointing to the 'real' URL:
/safety-lights.html
So why is the SEOmoz crawler still flagging this as duplicate content?
Moz Pro | | yacpro13 -
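The URLs in the question illustrate why parameterized pages get flagged: strip the query string and they all collapse to the same path, i.e. the same underlying content. A small sketch with Python's `urllib.parse` (the URL list is taken from the question; whether a given crawler honors rel canonical is a separate matter):

```python
# Normalize the flagged URLs by dropping their query strings.
# Every variant collapses to one path, which is why a crawler that
# keys on content rather than the canonical tag reports duplicates.
from urllib.parse import urlsplit, urlunsplit

urls = [
    "/safety-lights.html",
    "/safety-lights.html?cat=78&price=-100",
    "/safety-lights.html?cat=78&dir=desc&order=position",
    "/safety-lights.html?cat=78",
    "/safety-lights.html?manufacturer=514",
]

def canonical(url):
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

print({canonical(u) for u in urls})  # {'/safety-lights.html'}
```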
Does crawling help in optimisation?
The website is as it was last week; no optimisation from my side for 10 days now. I was ranked 5 with my keyword (not much competition there). However, 2 days ago I registered at SEOmoz and created a campaign for my website with the keywords that were ranked 5 in search. Today I see that my rank has gone up to 2. I have not done any optimisation, nor have I created any backlinks, so how and why did I climb up? I just created a campaign and let SEOmoz crawl my website for 2 days. Am I to assume the SEOmoz crawl optimises websites? If that is the case, then can I create a campaign, crawl pages, climb up in searches, delete the campaign after a week, create it again, crawl pages, climb up, and so on? Please advise.
Moz Pro | | wahin10 -
Domains and subdomains - newbie
Hi. Recently set up my SEOmoz account, so any help would be great! On the competitive domain analysis page, the subdomain metrics appear as good as (if not better than) the domain's: the MozRank and MozTrust are better on the subdomain. I know the website copy is appearing on both the non-www and the www version (which I assume is what is being referred to as the subdomain on the competitive domain analysis page). Should I now 301 the www site to the non-www, which would concentrate the SEO, so that SEOmoz then shows only the root domain metrics? Many thanks for your help in advance!
Moz Pro | | Richard5550 -
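For the 301 redirect mentioned in the question above, a common approach on Apache is an `.htaccess` rewrite rule. This is a hedged sketch under the assumption that the site runs Apache with `mod_rewrite` enabled; other servers (nginx, IIS) use different syntax:

```
# Redirect all www requests to the non-www host with a permanent 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [R=301,L]
```

With both hosts consolidated onto one, link metrics stop being split between the www and non-www versions.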
Why has Historical Domain Analysis not been updated for almost 2 months?
This is one of the most important areas of SEOmoz for seeing whether you are making any progress. Unfortunately it has not been updated since 2/28/12, and today is 4/18/12. What's happening?
Moz Pro | | Pol3600 -
Is it possible to exclude pages from Crawl Diagnostic?
I like the crawl diagnostic but it shows many errors due to a forum that I have. I don't care about the SEO value of this forum and would like to exclude any pages in the /forum/ directory. Is it possible to add exclusions to the crawl diagnostic tool?
Moz Pro | | wfernley2