On-page link question: creating an additional 'county' layer between states and zips/cities
-
Question
We have a large site with a page for each of the 50 states. Each of these pages has unique content, but following that content is a massive block of links, one for every zip code and city in that state. I am also in the process of creating unique content for each of these cities and zips. However, I was wondering: would it make sense to create an additional 'county' layer between the states and the zips/cities? Would the additional 'depth' of the links bring down the overall rank of the long-tail city and zip pages, or would cutting the on-page link count from a thousand or so down to a manageable 50-100 substantially improve the overall quality and ranking of the site? To illustrate, I currently have:
State -> city and zip pages (1,200+ links on each state page)
What I want to do instead is:
State -> county (5-300 county links on each state page) -> city + zip (maybe 50-100 links on each county page). What do you guys think? Am I incurring some kind of automatic penalty for having 1,000+ links on a page?
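The arithmetic behind the two structures can be sketched quickly. The figures below are illustrative, taken from the numbers in the question; a 60-county state is a hypothetical example:

```python
# Rough link-count comparison of the flat vs. county-layer architectures.
# All figures are illustrative, based on the numbers in the question.

def flat_links(cities_and_zips: int) -> int:
    """Links on a state page when every city/zip links directly from it."""
    return cities_and_zips

def county_links(cities_and_zips: int, counties: int) -> tuple[int, int]:
    """(links per state page, links per county page) with a county layer."""
    per_county = -(-cities_and_zips // counties)  # ceiling division
    return counties, per_county

# Today: 1,200+ links on one state page.
print(flat_links(1200))                 # 1200
# With a hypothetical 60-county state:
print(county_links(1200, counties=60))  # (60, 20)
```

One extra click of depth, but every page stays well under the old 100-link guideline.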
-
1,000 links per page is probably too many. Google used to recommend 100 or fewer, although its crawlers have improved a lot since then.
It all depends on how high your page authority is, because the PageRank a page can pass is split among each and every link on it. The more links you have on a page, the less PageRank each link receives. If you have a really high-authority page, this is no problem, but if you're a medium-to-small site, I would recommend against it. If a page has low authority, Google may not even bother to crawl all of its links when there are too many.
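To make the "split" concrete, here is a minimal sketch of the simplified flow model being described: the equity a page can pass is divided evenly among its outbound links. This is illustration only; Google's actual calculation involves damping factors and many other signals.

```python
# Simplified link-equity model: a page's passable authority is divided
# evenly among its outbound links. Illustrative only; real ranking is
# far more complex than this.

def equity_per_link(page_authority: float, link_count: int) -> float:
    """Share of passable equity each outbound link receives."""
    if link_count <= 0:
        raise ValueError("page must have at least one outbound link")
    return page_authority / link_count

# The same page authority spread over 1,200 links vs. 60 links:
print(equity_per_link(1.0, 1200))  # tiny share per link
print(equity_per_link(1.0, 60))    # 20x more per link
```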
So yes, splitting those massive link pages into smaller sub-pages makes a lot of sense. Also, think from a user's perspective about the best way to navigate such a large site.
Related Questions
-
Links to Paywall from Content Pages
Hi, my site is funded by subscriptions. We offer lengthy excerpts and then direct people to a single paywall page, something like domain.com/subscribe/. This means that most pages on the site link to /subscribe, including all of the high-value pages that bring people in from Google. This page has an understandably high bounce rate, as most users are not interested in paying for content on the web. My question is: are we being penalized in Google for having so many internal links to a page with a very high bounce rate? If anyone has worked with paywall sites before and knows the best practices for this, I'd be really grateful to learn more.
On-Page Optimization | enotes
-
Too Many On-Page Links
Hello. My SEO team has worked very hard to finally resolve the RogerBot/GoogleBot-specific crawl errors that could be fixed manually or programmatically for our Budget Blinds USA Pro campaign. We've done a good job, even if a lot of it came from robots.txt entries, as this was the most efficient way our client chose to do it. The good news is that most of it was CMS configuration, not bad site architecture. That said, our next big volume of crawl errors is "Too Many On-Page Links". Our Moz DomainRank is 61. On this new version of the website, our client added a large nav-based footer that duplicates links from the header's main navigation. I believe our solution is to put nofollow on the footer links, so we don't zap page authority by over-dividing it, as you recommend. Is this the best way to resolve this? Is there any risk in it? Or is a DomainRank of 61 high enough for RogerBot and GoogleBot to crawl these anyway? Please advise.
On-Page Optimization | Aviatech
-
Keyword repeats/presence in URLs & over-optimisation
Hi, I'm about to launch a redesigned site and am worried about overdoing keyword presence on-page, primarily by using keywords in URLs, since they will already be in titles as well as page content. What's the current thinking on over-optimisation? If the keyword is in the title and the page content, is it best not to repeat it again in the URL structure (i.e. less is more, even though this will cause things like the SEOmoz on-page grade score to fall), or is it better to keep/add it? Personally I think it makes sense to include the keyword in the URL again, since it helps make the page relevant, and as long as it matches the content it should help rather than hinder rankings for the page's target keyword. However, when I look into this, some say not to, since it is over-optimisation. The site is generally ranking quite well for its target keywords, which I obviously don't want to lose after the re-launch (and hopefully want to improve further); in this example they are 'Sports Centre Services' and 'Sports Centre Equipment Rental'. The site's current URL structure is similar to this example: frankssportscentres.com/services/sports-centre-equipment-rental. Would it be better to keep the existing format, or to go with one of the below options, i.e. more or less keyword-rich URLs: frankssportscentres.com/sports-centre-services/sports-centre-equipment-rental, or frankssportscentres.com/sports-centre-services/equipment-rental, or even less, frankssportscentres.com/services/equipment-rental? Many thanks in advance for any helpful comments. Cheers, Dan
On-Page Optimization | Dan-Lawrence
-
Does 'XXX' in Domain get filtered by Google
I have a friend who has 'xxx' in their domain; they are a religious-based sex/porn-addiction recovery company, but they don't show up for the queries they are optimized for. They have a 12+ year old domain and all good health signs: quality links and press from trusted companies. Google sends them adult traffic, mostly 'trolls', and not the users they are looking for. Has anyone experienced domain-word filtering, and do you have a workaround or solution? I posted in the Google Webmaster Help forums, and that community seems a little 'high on their horses' and tries too hard to be cool. I am not too religious and don't necessarily support the views of the website; I'm just trying to help a friend of a friend with a topic I have never encountered. Here is the URL: xxxchurch.com. Thanks, Brian
On-Page Optimization | Add3.com
-
Number IDs in URLs
If you have to place a numeric ID in a URL, does it matter from an SEO perspective whether it is placed in the middle or at the end? Thanks
On-Page Optimization | simmo235
-
Internal Linking - Same page
Is there any benefit in internally linking back to the same page? I have no other relevant pages on the site that I can link to, so I'm wondering if it's worthwhile to use anchor text to link back to the same page.
On-Page Optimization | Will_Craig
-
Why does the on-page report flag a full-path link as a cannibalizing link?
In the SEOmoz on-page report I get a cannibalization error. This is due to a link using a full path; when I change the link to a relative path, the error goes away. Should I change the site's internal links to relative paths? I would appreciate your help.
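For what it's worth, browsers and crawlers resolve relative hrefs against the page's own URL, so a relative link and its full-path equivalent point at the same resource; a tool that flags one and not the other is reacting to the raw href string, not the destination. A minimal sketch (the domain and paths are hypothetical):

```python
# A relative href and its full-path equivalent resolve to the same URL.
# Domain and paths below are hypothetical, for illustration only.
from urllib.parse import urljoin

page = "https://www.example.com/products/index.html"
relative = urljoin(page, "widgets.html")
absolute = urljoin(page, "https://www.example.com/products/widgets.html")

print(relative)              # https://www.example.com/products/widgets.html
print(relative == absolute)  # True
```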
On-Page Optimization | pickaweb
-
http://www.xxxx.com does not redirect to http://xxx.com
When typing in my website URL, www.earthsaverequipment.com successfully redirects to earthsaverequipment.com as specified. However, if you type http://www.earthsaverequipment.com it brings up a 404 error. Is this a potential issue? If so, is there a way to fix it? Thanks
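Yes, every hostname variant should 301-redirect to one canonical address, or link equity and crawling get split between the two hosts. Here is a minimal sketch of the mapping such a redirect should implement, assuming (as the question implies) that the non-www host is the canonical one; the actual fix lives in the web server or DNS configuration, not application code:

```python
# Sketch of the canonical-host mapping a www/non-www redirect should
# implement, assuming the non-www hostname is the canonical one.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, canonical_host: str = "earthsaverequipment.com") -> str:
    """Return the URL every host variant should 301-redirect to."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host in (canonical_host, "www." + canonical_host):
        parts = parts._replace(netloc=canonical_host)
    return urlunsplit(parts)

# Both variants should land on the same canonical address:
print(canonical_url("http://www.earthsaverequipment.com/about"))
print(canonical_url("http://earthsaverequipment.com/about"))
# both -> http://earthsaverequipment.com/about
```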
On-Page Optimization | Earthsaver