What is the best way to find stranded pages?
-
I have a client whose site has been run by a number of different people over time, all with very different opinions about what should be on it.
When I look at the website's files on the server, I see pages that have no obvious navigation leading to them.
What is the best way to find out the internal linking structure of a site and see if these pages truly are stranded?
-
There are many crawl tools out there, but Xenu is the one I would recommend for this. It will crawl your site and then, at the end, ask for your FTP details so it can specifically check for orphaned pages. That should produce the report you need. (If you'd rather script the check yourself, see the sketch below the link.)
*edit - here's the link: http://home.snafu.de/tilman/xenulink.html
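For those who want to script the same check, here is a minimal sketch in Python. It assumes the third-party requests and beautifulsoup4 packages, and the start URL and server page list are hypothetical placeholders: it crawls every internal link reachable from the homepage, then diffs that set against a list of page URLs pulled from the server (for example, built from an FTP listing). Anything in the second set but not the first has no crawl path to it.

```python
# Minimal orphan-page check (a sketch, not a hardened crawler).
# Assumes: pip install requests beautifulsoup4
# START_URL and SERVER_PAGES_FILE are hypothetical placeholders.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"        # the client's homepage
SERVER_PAGES_FILE = "pages_on_server.txt"     # one full URL per line, e.g. from an FTP listing


def crawl(start_url):
    """Breadth-first crawl; returns every internal URL reachable via links."""
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue:
        url = queue.pop(0)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip pages that error out
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # only parse HTML responses
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]  # resolve and drop fragments
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen


linked = crawl(START_URL)
with open(SERVER_PAGES_FILE) as f:
    on_server = {line.strip() for line in f if line.strip()}

# Pages that exist on the server but that no crawl path reaches.
# (URL forms must match between the two sets, e.g. consistent trailing slashes.)
for orphan in sorted(on_server - linked):
    print(orphan)
```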
Related Questions
-
Search Console indexed page count vs. site: search operator page count
We launched a new site, and Google Search Console is showing that 39 pages have been indexed. When I perform a site:myurl.com search, I see over 100 pages that appear to be indexed. Which is correct, and why is there a discrepancy? Also, the Search Console indexed page count started at 39 pages on 5/21 and has not increased, even though we have hundreds of pages to index, yet I do see more results each week from site:psglearning.com (as a cross-check, see the sitemap-count sketch below). My site is https://www.psglearning.com
Technical SEO | pdowling
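As a third number to compare against both counts, it can help to tally how many URLs the sitemap actually lists, if one is live. A minimal Python sketch, assuming a single flat sitemap file rather than a sitemap index, with a placeholder URL:

```python
# Minimal sketch: count the URLs a sitemap lists, as a ground-truth
# figure to compare against Search Console and site: results.
# The sitemap URL is a placeholder; assumes one flat sitemap file.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
```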
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning! So I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries:

1. Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 of the 3,511 indexed?
2. As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted?
3. Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once it's submitted, will this help?
4. I think I know the answer to this, but is there any way to ascertain which pages are being blocked? (See the sketch below.)

Thanks in advance! Lewis
Technical SEO | PeaSoupDigital
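On the last query, one low-tech way to work out which URLs are being blocked is to test a sample of them against robots.txt with Python's standard library. A minimal sketch (the domain and sample paths are placeholders):

```python
# Minimal sketch: check which of a sample of URLs robots.txt blocks.
# Uses only the standard library; domain and paths are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()

sample_urls = [
    "https://www.example.com/",
    "https://www.example.com/private/report.html",
    "https://www.example.com/products?sort=price",  # parameter variations often hide here
]
for url in sample_urls:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}: {url}")
```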
Pages to be indexed in Google
Hi, we have 70K posts on our site, but Google has crawled 500K pages; the extra pages are category pages and user profile pages. Each category has a page and each user has a page, so with 90K users, Google has indexed 90K user pages alone. My question is: should we leave them as they are, or should we block them from being indexed? We get unwanted landings on these pages and a huge bounce rate. If we do need to remove them, what should be done: a robots.txt block, or noindex/nofollow? (See the sketch below.) Regards
Technical SEO | mtthompsons
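On the robots-block-versus-noindex part of the question, the usual trade-off is this: a robots.txt Disallow stops crawling, but the URL can still be indexed from external links, while a meta robots noindex removes the page from the index and, with "follow", still lets crawlers pass through its links to the posts. A minimal sketch of the tag, placed in the head of each category and profile page:

```html
<!-- A sketch of the noindex option: drops the page from the index,
     while "follow" still lets link equity flow to the posts it links to. -->
<meta name="robots" content="noindex, follow">
```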
Product Pages Outranking Category Pages
Hi, we are noticing an issue where some product pages are outranking our relevant category pages for certain keywords. For a made-up example, a "heavy duty widgets" product page might rank for the keyword phrase Heavy Duty Widgets, instead of our Heavy Duty Widgets category page appearing in the SERPs. We've noticed this happening primarily in cases where the name of the product page contains at least a partial match for the keyword phrase we want the category page to rank for. However, we've also found isolated cases where the specified keyword points to a completely irrelevant page instead of the relevant category page. Has anyone encountered a similar issue before, or have any ideas as to what may cause this? Let me know if more clarification of the question is needed. Thanks!
Technical SEO | ShawnHerrick
Why is the Page Authority of my product pages so low?
My domain authority is 35 (homepage Page Authority = 45) and my website has been up for years: www.rainchainsdirect.com Most random pages on my site (like this one) have a Page Authority of around 20. However, my individual product pages as a whole rank exceptionally low. Like these: http://www.rainchainsdirect.com/products/copper-channel-link-rain-chain (Page Authority = 1) http://www.rainchainsdirect.com/collections/todays-deals/products/contempo-chain (Page Authority = 1) I was thinking that, whatever the reason for this low authority, it may explain why these pages rank lower in Google for searches on my exact product names; in other words, other sites piggybacking off my unique products rank higher for a search on a specific product name than the original product page on my site. In any event, I'm trying to get some perspective on why these pages remain stuck with non-existent Page Authority. Can anyone shed some light on why, and on what can be done about it? Thanks!
Technical SEO | csblev
Best way to get SEO-friendly URLs on a huge old website
Hi folks, hope someone may be able to help with this conundrum. A client site runs on old tech (IIS 6) and has circa 300,000 pages indexed in Google. Most pages are dynamic, with a horrible URL structure such as http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888, and I have been trying to implement rewrites and redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/ I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl, and implementing all URLs would mean 10x the volume of config. I am starting to wonder if there is a better way:

1. Upgrade to Windows 2008 / IIS 7 and use the better URL rewrite functionality included (see the sketch below).
2. Rebuild the site entirely (preferably on PHP with a decent URL structure).
3. Accept that the URLs can't be made friendly on a site this size and focus on other aspects.
4. Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live.

None of the options are great, as they either involve lots of work/cost or they mean keeping a site which performs well but could do so much better, with poor URLs. Any thoughts from the great minds in the SEOmoz community appreciated! Cheers, Simon
Technical SEO | SCL-SEO
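On option 1 in the list above: IIS 7's URL Rewrite module declares rules in web.config, so a single pattern can cover a whole class of URLs instead of one IIRF rule per page. A minimal, hypothetical sketch modeled on the results.aspx URL quoted in the question (the friendly pattern and parameter names are assumptions, not the client's actual scheme):

```xml
<!-- Sketch of an IIS 7 URL Rewrite rule in web.config. One regex maps
     friendly /search/19191/56/2888 URLs onto the existing dynamic page;
     the pattern and parameter names are hypothetical. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="FriendlySearchUrls" stopProcessing="true">
        <match url="^search/([0-9]+)/([0-9]+)/([0-9]+)/?$" />
        <action type="Rewrite" url="search/results.aspx?ida={R:1}&amp;idb={R:2}&amp;idc={R:3}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```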
Handling 301s: Multiple pages to a single page (consolidation)
I've been scouring the interwebs and haven't found much information on redirecting two separate pages to a single new page. Here is what it boils down to: let's say a website has two pages, both with good page authority, for products that are being phased out. The products, Widget A and Widget B, are still popular search terms, but they are being combined into ONE product, Widget C. While Widget A and Widget B still have plenty to do with Widget C, Widget C is now the new page, the main focus page, and the page you want everyone to see and Google to recognize.

Now, do I 301 the Widget A and Widget B pages to Widget C, although Widgets A and B previously had nothing to do with one another? (Remember, we want to try and keep some of the authority the two pages have built up; see the sketch below.) OR do we keep the Widget A and Widget B pages alive, take them off the main navigation, and put a disclaimer on each announcing that it is now part of Widget C, with a link to Widget C? OR should Widgets A and B be canonicalized to Widget C?

Again, keep in mind that Widgets A and B were not previously similar, but NOW they are, and they result in Widget C. (If you are confused, we can provide a real-world example of what we are talking about, but we decided not to be specific to our industry here.) Appreciate any and all thoughts on this.
Technical SEO | JU1985
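Whichever answer you settle on, the 301 mechanics themselves are simple. A minimal sketch for an Apache .htaccess (Apache is an assumption, since the question doesn't name a platform, and the paths are hypothetical):

```apache
# Sketch: both retired product URLs 301 to the combined page, so the
# authority each page earned is consolidated onto Widget C.
Redirect 301 /products/widget-a /products/widget-c
Redirect 301 /products/widget-b /products/widget-c
```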
What is the best way to broaden the reach of our website?
We were specifically targeting the Capital Region of New York and environs, and now our goal is to broaden the net to reach potential clients. Should I drop the location terms we already have baked into the copy and add broader location terms, or just add newer terms? We're developing a new design that sharpens our focus, but here is what we have now: http://www.behancommunications.com Thanks for any suggestions
Technical SEO | PatDowd