Crawler doesn't discover the links in the main nav
-
Hi Moz Community,
We have a headless ecommerce (Magento) client whose site I'm trying to crawl. During the crawl, the tool (Screaming Frog) cannot discover the sub-category URLs in the main navigation when I start crawling from the homepage.
Similarly, when I start crawling from one of the sub-category pages, it doesn't crawl any of the product URLs on that page.
When I inspect the product and sub-category URLs through Search Console, they show as indexed, and when I view how Googlebot rendered the sub-category page, I can see the product URLs on it too.
If you have any idea what the issue with Screaming Frog is and would like to help me out, I'd be so grateful!
Thanks in advance
-
Hi Kate,
Thank you! I followed you on Twitter, my user name is @curetuvana
-
Find me on Twitter @katemorris and follow me. Tell me your name and I'll follow you, and we can DM.
-
Hi Kate,
Thank you for taking the time to respond! Is there any way I can contact you directly?
By the way, I've tried crawling again after changing the configuration to render JavaScript; however, it still didn't discover the product URLs.
Thank you!
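A quick way to confirm that the navigation links only exist after JavaScript runs is to compare the raw HTML source with the rendered DOM: if the sub-category URLs are missing from the raw source, a crawler that doesn't execute JavaScript will never discover them. A minimal sketch in Python (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags of the raw (pre-JS) HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical raw page source for a headless storefront: the nav is an
# empty mount point that client-side JavaScript populates after load.
raw_html = """
<html><body>
  <nav id="main-nav"></nav>
  <a href="/about">About</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(raw_html)
print(parser.links)  # only /about; no sub-category URLs in the raw source
```

If the raw source shows an empty nav container like this, the crawler needs JavaScript rendering enabled, with a long enough render timeout for the navigation to be populated before the snapshot is taken.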
-
Ah, I might know your problem. What is your site? We had this issue at my last company; it had to do with crawling using JS. If you send me the site, I can take a look.
Related Questions
-
How Many Links to Disavow at Once When Link Profile is Very Spammy?
We are using Link Detox (Link Research Tools) to evaluate our domain for bad links. We ran a domain-wide Link Detox Risk report, which showed a "High Domain DETOX RISK" with the following results:
- 42% (292) of backlinks with a high or above-average detox risk
- 8% (52) of backlinks with an average or below-average detox risk
- 12% (81) of backlinks with a low or very low detox risk
- 38% (264) of backlinks reported as disavowed
This looks like a pretty bad link profile. Additionally, more than 500 of the 689 backlinks are "404 Not Found", "403 Forbidden", "410 Gone", or "503 Service Unavailable". Is it safe to disavow these? Could Google be penalizing us for them? I would like to disavow the bad links; however, my concern is that there are so few good links that removing the bad ones will kill link juice and really damage our ranking and traffic. The site still ranks for terms that are not very competitive, and we receive about 230 organic visits a week. Assuming we need to disavow about 292 links, would it be safer to disavow 25 per month while we build new links, so we do not radically shift the link profile all at once? Also, many of the bad links are 404 or page-not-found errors. Would it be OK to disavow all of these at once? Any risk to that? Would we be better off just building links and leaving the bad links up? Alternatively, could disavowing the bad links help our traffic? It just seems risky because the overwhelming majority of links are bad.
Intermediate & Advanced SEO | | Kingalan1
-
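For reference, Google's disavow file is a plain text file with one entry per line: a "domain:" prefix disavows every link from that domain, a bare URL disavows a single page, and lines starting with "#" are comments. A sketch with placeholder domains:

```text
# Disavow file sketch; the domains and URLs below are placeholders
# Whole-domain entries catch every link from that site
domain:spammy-directory.example
domain:link-farm.example

# Single-URL entries disavow one page at a time
https://blog.example/some-spammy-post.html
```

The file is uploaded through the disavow links tool for the affected property; entries only take effect the next time Google recrawls the linking pages.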
Is link equity / link juice lost to a blocked URL in the same way that it is lost to a nofollow link?
Hi. If there is a link on a page that goes to a URL that is blocked in robots.txt, is the link juice lost in the same way as when you add nofollow to a link on a page? Any help would be most appreciated.
Intermediate & Advanced SEO | | Andrew-SEO
-
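Incidentally, whether a given URL is blocked by a robots.txt rule can be checked programmatically with Python's standard library, which helps when auditing where a page's outgoing links actually point. A small sketch with made-up rules:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, parsed from a list of lines rather
# than fetched over the network.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch() answers "may this user agent crawl this URL?"
blocked = not rp.can_fetch("*", "https://example.com/private/page.html")
allowed = rp.can_fetch("*", "https://example.com/public/page.html")
print(blocked, allowed)  # True True
```

In a real audit you would call rp.set_url() with the live robots.txt location and rp.read() instead of feeding lines by hand.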
A weird site is linking to our site and the links appear to be broken
I have a lot of weird links indexed from this page: http://kzs.uere.info/files/images/dining-table-and-2-upholstered-chairs.html. When clicking the link, it shows a 404. Also, the spam score is huge. What do you guys suggest we do with this? Could it have been done by somebody to get our rankings down or our domain penalized?
Best Regards,
Mike & Alex
Intermediate & Advanced SEO | | Miniorek
-
Does the navigation bar have an effect on link juice and the number of internal links?
Hi Moz community, I am getting the "Avoid Too Many Internal Links" warning from Moz for most of my pages; Google has stated a maximum of 100 internal links. However, most of my pages can't have fewer than 100 internal links, since it is a commercial website and there are many categories that I have to show visitors in the drop-down navigation bar. Without counting the links in the navigation bar, the number of internal links is below 100. I am wondering whether the navigation bar links affect link juice and are counted as internal links by Google. The same question also applies to the links in the footer. Additionally, what about products? I have hundreds of products in the category pages, and even though I use pagination I still have many links on each category page (probably more than 100 even without counting the navigation bar links). Does Google count the product links as internal links, and what is the effect on link juice? Here is the website if you want to take a look: http://www.goldstore.com.tr. Thank you for your answers.
Intermediate & Advanced SEO | | onurcan-ikiz
-
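As an aside, a rough internal-link count for any page can be produced from its HTML with a few lines of standard-library Python, which makes it easy to see how much of the total the navigation and footer contribute. A sketch (the sample markup is hypothetical; goldstore.com.tr is the domain from the question above):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Count internal vs. external links in an HTML document."""
    def __init__(self, site_netloc):
        super().__init__()
        self.site_netloc = site_netloc
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        netloc = urlparse(href).netloc
        if netloc in ("", self.site_netloc):  # relative URL or same host
            self.internal += 1
        else:
            self.external += 1

# Hypothetical category-page markup
html = """
<a href="/category/rings">Rings</a>
<a href="http://www.goldstore.com.tr/category/necklaces">Necklaces</a>
<a href="https://twitter.com/example">Twitter</a>
"""
counter = LinkCounter("www.goldstore.com.tr")
counter.feed(html)
print(counter.internal, counter.external)  # 2 1
```

Running the same counter once on the full page and once on the page with the nav and footer markup stripped out shows exactly how many of the links come from the chrome rather than the content.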
Can't get auto-generated content de-indexed
Hello, and thanks in advance for any help you can offer me! Customgia.com, a costume jewelry e-commerce site, has two types of product pages: public pages that are internally linked, and private pages that are only accessible by visiting the URL directly. Every item on Customgia is created online using an online design tool. Users can register for a free account and save the designs they create, even if they don't purchase them. Prior to saving a design, the user is required to enter a product name and choose "public" or "private" for it. The page title and product description are auto-generated.

Since launching in October '11, the number of products grew and grew as more users designed jewelry items. Most users chose to show their designs publicly, so the number of products in the store swelled to nearly 3,000. I realized many of these designs were similar to each other and occasionally exact duplicates, so over the past 8 months I've made 2,300 of these designs "private" and no longer accessible unless the designer logs into their account (these pages can also be linked to directly). When I realized that Google had indexed nearly all 3,000 products, I entered URL removal requests in Webmaster Tools for the designs I had changed to "private". I did this starting about 4 months ago. At the time, I did not have NOINDEX meta tags on these product pages (obviously a mistake), so it appears that most of these product pages were never removed from the index, or if they were removed, they were added back after the 90 days were up.

Of the 716 products currently showing (the ones I want Google to know about), 466 have unique, informative descriptions written by humans. The remaining 250 have auto-generated descriptions that read coherently but are somewhat similar to one another. I don't think these 250 descriptions are the big problem right now, and these product pages can be hidden if necessary. I think the big problem is the 2,000 product pages that are still in the Google index but shouldn't be. The following Google query tells me roughly how many product pages are in the index: site:Customgia.com inurl:shop-for. Ideally, it should return just over 716 results, but instead it returns 2,650. Most of these 1,900 product pages have bad product names and highly similar, auto-generated descriptions and page titles. I wish Google had never crawled them.

Last week, NOINDEX tags were added to all 1,900 "private" designs, so currently the only product pages that should be indexed are the 716 showing on the site. Unfortunately, over the past ten days the number of product pages in the Google index hasn't changed. One solution I initially thought might work is to re-enter the removal requests, because now, with the NOINDEX tags, these pages should be removed permanently. But I can't determine which product pages need to be removed, because Google doesn't let me see that deep into the search results. If I look at the removal request history, it says "Expired" or "Removed", but these labels don't seem to correspond in any way to whether or not a page is currently indexed. Additionally, Google is unlikely to crawl these "private" pages because they are orphaned and no longer linked from any public pages of the site (and there are no external links either).

Currently, Customgia.com averages 25 organic visits per month (branded and non-branded) and close to zero sales. Does anyone think de-indexing the entire site would be appropriate here? Start with a clean slate and then let Google re-crawl and index only the public pages? Would that be easier than battling with Webmaster Tools for months on end? Back in August, I posted a similar problem that was solved using NOINDEX tags (de-indexing a different set of pages on Customgia): http://moz.com/community/q/does-this-site-have-a-duplicate-content-issue#reply_176813. Thanks for reading through all this!
Intermediate & Advanced SEO | | rja214
-
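One way to verify that pages flagged as private are actually emitting the tag is to scan each page's HTML for a robots meta directive. A minimal sketch in Python (the sample page is made up):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Detect a <meta name="robots"> tag whose content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = {k.lower(): (v or "") for k, v in attrs}
            if (a.get("name", "").lower() == "robots"
                    and "noindex" in a.get("content", "").lower()):
                self.noindex = True

# Hypothetical head section of a "private" product page
page = ('<html><head>'
        '<meta name="robots" content="noindex, nofollow">'
        '</head></html>')
checker = NoindexChecker()
checker.feed(page)
print(checker.noindex)  # True
```

Looping a checker like this over the private URLs would confirm the tag is present everywhere before concluding that Google is simply slow to recrawl orphaned pages.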
Are navigation links different from static links?
We are trying to reduce the number of links on our homepage. We could remove some fly-out navigation links. We rank 1st on Google for some of these pages; would removing the links hurt our SEO? The links are accessible one level down if we remove them from the homepage.
Intermediate & Advanced SEO | | Archers
-
My New(ish) Site Isn't Ranking Well And Recently Fell
I launched my site (jesfamilylaw.com) at the beginning of January. Since then, I've been trying to build high-quality backlinks. I have a few backlinks with keyword-targeted anchor text from some guest posts I've published (maybe 3 or so), and I have otherwise signed up for business directories and industry-specific directories. I have a few social media profiles and some likes on Facebook, both for the company page and some posts.

Despite this, I've had a lot of trouble cracking Google's top ten for any term, long tail or otherwise. I was starting to climb for "Evanston Family Law", which is the key term I believe I am best optimized for, but took a dive yesterday: I fell from maybe the 14th result to somewhere on the 4th page. For all my other target terms, I don't know if I've gotten into the 20s yet. To further complicate matters, my Google Places listing isn't showing and sits on the second page of results for Places searches, after businesses that aren't located in the same city.

The night before I fell, I resubmitted my site to Google because Webmaster Tools was showing duplicate title tags when I had none. I had also made a couple of changes to some internal links and title tags, but only for a small fraction of the site. Long story short, I don't know what's going on. I don't know why I fell in the rankings or why my site isn't competitive for some of my target key phrases. I've read so many horror stories about Penguin that I fear my on-site optimization may be hurting my rankings or my backlinks are insufficient. I've done plenty of competitor research, and the sites that are beating me have very aggressive on-site optimization and few backlinks. In short, I am very confused. Any help would be immensely appreciated.
Intermediate & Advanced SEO | | JESFamilyLaw
-
Lots of city pages - how do I ensure we don't get penalized?
We are planning to have a job-posting page for each city where we are looking to hire new CFO partners. The problem is, we have LOTS of locations. I was wondering what would be the best way to have similar content on each page (since the job description and requirements are the same for each posting) without being hit by Google for duplicate content.

One of the main reasons we decided to have location-based pages is that we have noticed visitors to our site searching for "cfo job in [location]", but most of these visitors then leave. We believe it's because the pages they land on make no mention of the location they were looking for and are a little incongruent with what they were expecting. We are looking to use the following URL, title, and description as an example:

URL: http://careers.b2bcfo.com/cfo-jobs/Alabama/Birmingham
Title: CFO Careers in Birmingham, AL
Description: Are you looking for a CFO Career in Birmingham, Alabama? We're looking for partners there. Apply today!

Any advice you have for this would be greatly appreciated. Thank you.
Intermediate & Advanced SEO | | B2B.CFO
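Alongside location-specific body copy, each city page can carry its own title, meta description, and a self-referencing canonical tag so the near-duplicate templates don't compete with one another. A sketch of the head section for the Birmingham example above (the markup is illustrative, not a guarantee against duplicate-content filtering):

```html
<!-- Hypothetical <head> for the Birmingham, AL job page -->
<title>CFO Careers in Birmingham, AL</title>
<meta name="description"
      content="Are you looking for a CFO Career in Birmingham, Alabama? We're looking for partners there. Apply today!">
<link rel="canonical"
      href="http://careers.b2bcfo.com/cfo-jobs/Alabama/Birmingham">
```

Each city page points its canonical at itself; pointing them all at one master page would tell Google to ignore the location variants entirely.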