When to consolidate and when to bid Link Juice farewell?
-
Greetings all!
I've got a couple of questions about when and if it's alright to let accumulated Link Juice (LJ) slip into the depths of oblivion. I arrived 4 years late to the ticketing website that I work for (www.charged.fm), and found the website in a certain state of disarray. For the past 6 months I've been trying to wrap my head around SEO and our 750k+ page site, and lately we've been making good progress cleaning things up and redesigning. I'm at a loss, though, as to what to do with some pages.
Example: The blog director has been using hashtags for years, and each new hashtag created a new page. As a result, there are many instances of two [bytag] pages, for two different hashtags, carrying the same article.
http://www.charged.fm/blog/bytag/31631/steve-masiello-usf
http://www.charged.fm/blog/bytag/31632/steve-masiello-south-florida
We've added 'noindex, follow' to this directory (which is the correct solution, right?), but now I'm wondering whether some of these pages should be 301'd to more relevant sections of the site, or back to the blog homepage. I know this could be bad for UX, but I don't believe they're frequently used pages, and I don't want to let these PA 15 pages go to waste. Any thoughts on this?
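For anyone following along: 'noindex, follow' can be applied either as a robots meta tag in the page head or as an X-Robots-Tag HTTP header. Here's a minimal sketch of the meta-tag approach; the path prefix and helper names are made up for illustration, not taken from charged.fm's actual code:

```python
# "noindex, follow" tells search engines to drop the page from the
# index but still crawl its links, so internal link equity keeps flowing.

BYTAG_PREFIX = "/blog/bytag/"  # hypothetical prefix for the duplicated tag pages

def robots_directive(path: str) -> str:
    """Return the robots directive for a given URL path."""
    if path.startswith(BYTAG_PREFIX):
        return "noindex, follow"
    return "index, follow"

def robots_meta_tag(path: str) -> str:
    """Render the directive as an HTML meta tag for the page <head>."""
    return f'<meta name="robots" content="{robots_directive(path)}">'

print(robots_meta_tag("/blog/bytag/31631/steve-masiello-usf"))
# -> <meta name="robots" content="noindex, follow">
```

The same directive can be sent as an `X-Robots-Tag: noindex, follow` response header if editing templates is harder than editing server config.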
Example 2: A similar situation is that they used 302s to redirect to search results pages instead of using category pages. So now there are hundreds, if not thousands, of search results pages that have a PA of 15 or more.
http://www.charged.fm/search/results/music-tickets
We're working on restructuring the site and removing the 302s, but I'm wondering if it's necessary to 301 all of the search results pages to the new category pages like so:
http://www.charged.fm/search/results/music-tickets >>> http://www.charged.fm/concert-tickets
This would require the programmer to build the new category pages in order to 301 the old ranking search results pages to them, correct? Should I put this in the queue for him, or just leave the search results pages as 'noindex, follow' and let the PA 15 go to waste?
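If the 301 route wins out, the implementation is usually just a one-to-one lookup table from old search URLs to new category URLs. A minimal sketch, reusing the example URLs above (the table and function names are hypothetical):

```python
# Map retired search-results URLs to their replacement category pages,
# returning an HTTP 301 so accumulated link equity passes to the new URL.

REDIRECT_MAP = {
    "/search/results/music-tickets": "/concert-tickets",
    # ...one entry per old search page worth preserving
}

def resolve(path: str) -> tuple[int, str]:
    """Return (status_code, target_path) for a requested path."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]  # permanent redirect: passes link equity
    return 200, path                    # no mapping: serve the page as-is

assert resolve("/search/results/music-tickets") == (301, "/concert-tickets")
```

In practice this would live in server config (e.g. a rewrite map) rather than application code, but the logic is the same.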
There are many other instances like this, such as a login page with PA 20, and I just can't decide what should be redirected and what to leave as dust in the wind. Because all we are is dust in the wind ; )
Thanks for any help,
Luke
-
Thanks Jane! That's the affirmation I was looking for. If I might, one more question:
In your opinion, is PA 15 too valuable to leave on a page with no real purpose? Is it relative to the site?
Thanks again,
Luke Thomas -
Hi Luke,
Noindex, follow will work fine for the duplicated tag pages, although you could also consider canonicalising them or redirecting them to a more useful resource, provided you can do this en masse in an automated way, or are willing to do it manually for tags / topics of high importance.
302ing to the search pages isn't good for two reasons: one, search engines traditionally don't follow or pass PageRank through a 302; two, they prefer that you don't include your own search pages in their indexes. The "easy" way around this is exactly as you describe: produce quality category pages in place of what was a search results page. You can probably get away with having search pages indexed, and many companies get them to rank, but what Google wants to avoid is the indexation of hundreds or thousands of random search results pages from a website, often with complex query strings that can result in an almost infinite number of pages being created.
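The 301-vs-302 distinction can be pictured with a toy model: a permanent redirect consolidates the source page's authority into the target, while a temporary one historically left it stranded on the source. This is a deliberate oversimplification for illustration, not how Google actually computes PageRank:

```python
# Toy consolidation model: given pages with authority scores and
# redirects between them, fold each 301 source's score into its target.

def consolidate(authority: dict[str, float],
                redirects: dict[str, tuple[int, str]]) -> dict[str, float]:
    """Merge authority of 301'd pages into their targets; leave 302s alone."""
    result = dict(authority)
    for source, (status, target) in redirects.items():
        if status == 301 and source in result:
            result[target] = result.get(target, 0.0) + result.pop(source)
    return result

pages = {"/search/results/music-tickets": 15.0, "/concert-tickets": 10.0}
hops = {"/search/results/music-tickets": (301, "/concert-tickets")}
print(consolidate(pages, hops))  # {'/concert-tickets': 25.0}
```

Swap the 301 for a 302 in `hops` and both pages keep their original scores, which is the situation Luke is trying to clean up.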
Cheers,
Jane