Why does my crawl report show just one page result?
-
I just ran a crawl report on my site, http://dozoco.com. The report shows results for just one page - the home page - and no other pages. It doesn't indicate any errors or "do not follows," so I'm unclear on the issue, although I suspect user error - mine.
-
Thanks Sha. The content is "ours" - at least insofar as we've pulled it from retailer sites and/or affiliate networks and modified it to fit our needs...so not entirely ours, but not a pure duplicate either. We also operate a fundraising site which shares the content, which is something I hadn't considered until now...we'll have to decide how to handle the duplication across the two sites. That said, the rest of your points are well taken and appreciated. We'll have to do some further research into the JavaScript points and determine how best to handle them.
-
Thanks Keri - very helpful.
-
Hi William,
As indicated on the help page that Keri provided, the problem is that the page is entirely rendered in JavaScript, and SEOmoz crawlers do not follow JavaScript links or redirects.
Of course, the reason the SEOmoz crawlers do not do this is most likely that, although Google's (and other search engines') stated position is that they are "getting better" at handling JavaScript, the likelihood of trouble-free JavaScript crawling by Googlebot is still low, or at the very least unknown.
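To make the point concrete, here's a rough sketch of why a crawler that doesn't execute JavaScript finds nothing to follow on a fully script-rendered page: it only sees `<a href="...">` tags in the raw HTML it downloads, not links injected later by scripts. The HTML samples below are hypothetical, not taken from the actual site.

```python
# Illustration: a basic (non-JavaScript) crawler only discovers links that
# exist as <a href> tags in the downloaded HTML. The sample pages below are
# made up for demonstration.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, as a simple crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawlable_links(html: str) -> list:
    """Return the links a non-JavaScript crawler would find in raw HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# A static page: the crawler finds the link in the markup.
static_page = '<html><body><a href="/store/us-pets">Pets</a></body></html>'

# A script-rendered page: the downloaded markup contains no links at all;
# they would only exist after a browser runs the script.
js_page = '<html><body><script>renderStore("us-pets");</script></body></html>'
```

Running `crawlable_links` on each sample shows the difference: the static page yields one followable link, while the script-rendered page yields none - which is effectively what happens when a crawler hits a page like this one.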
Bing now has an option in its Webmaster Tools that lets you indicate that JavaScript crawling is required for a site. I haven't seen any information on its effectiveness yet, but you could investigate that by asking in their help forum.
Even if search engines manage to crawl the JavaScript without issue, there are other significant problems with the content on the site. It appears that the site is a multi-affiliate white label? All of the text is actually being pulled in from an external page, and that page contains content that is duplicated across many other websites. This is the case with every "page".
Unfortunately, all of these things add up to a fairly bad SEO situation. Your best option for generating traffic would be to become massively popular through social channels and use them to feed traffic to the site - assuming this white label platform does not give you the option to create your own content (which would be much better).
An alternative would be to create a site on a new domain with awesome, unique, shareable content and links that feed traffic to this site. But if you go that route, making people take an extra click through a second domain on the way to the retailer's site would not be optimal for conversions, so it would be better to add direct affiliate links within the pages.
So, on the whole, I would say that ramping up your social activity is your best approach.
Hope this helps,
Sha
-
Here's a post from the help desk with a couple of reasons for that: http://seomoz.zendesk.com/entries/409821-why-isn-t-my-site-being-crawled-you-only-crawled-one-page. If that doesn't take care of the problem for you, email [email protected] and they'll work with you on getting the rest of the site crawled.
I'm looking at a site:dozoco.com search in Google, and all the URLs I see look like http://dozoco.com/#!/store/us-pets. The #! may be the cause of the problem; I'm not exactly sure how Roger handles crawling URLs like that.
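For background on why those #! URLs are tricky: under Google's AJAX crawling scheme, a crawler that supports it rewrites the hashbang URL into an "ugly" URL with an `_escaped_fragment_` query parameter and expects the server to return a pre-rendered HTML snapshot there; a crawler that doesn't support the scheme simply stops at the #, because fragments aren't sent to the server. A minimal sketch of that rewrite, using the URL from the search result as an example:

```python
# Sketch of the hashbang rewrite from Google's AJAX crawling scheme:
# http://site/#!/path  ->  http://site/?_escaped_fragment_=/path
# Crawlers that don't implement this see only the part before the #.
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Rewrite a #! URL into its _escaped_fragment_ equivalent."""
    if "#!" not in url:
        return url  # nothing to rewrite on a plain URL
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="/")
```

So http://dozoco.com/#!/store/us-pets maps to http://dozoco.com/?_escaped_fragment_=/store/us-pets - and a site using #! URLs is expected to serve a crawlable snapshot at that second address. Whether Roger implements this mapping is what I'm not sure about.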