When to consolidate and when to bid Link Juice farewell?
-
Greetings all!
I've got a couple of questions about when and if it's alright to let accumulated Link Juice (LJ) slip into the depths of oblivion. I arrived 4 years late to the ticketing website that I work for (www.charged.fm), and found the website in a certain state of disarray. For the past 6 months I've been trying to wrap my head around SEO and our 750k+ page site, and lately we've been making good progress cleaning things up and redesigning. I'm at a loss, though, as to what to do with some pages.
Example: For years the blog director has been using hashtags, and each distinct # created a new page, so there are many instances of two /bytag/ pages for two different hashtags carrying the same article:
http://www.charged.fm/blog/bytag/31631/steve-masiello-usf
http://www.charged.fm/blog/bytag/31632/steve-masiello-south-florida
We've added 'noindex, follow' to this directory (which is the correct solution, riiight??), but now I'm wondering if some of these pages should be 301'd to more relevant sections of the site, or back to the blog homepage. I know this could be bad for UX, but I don't believe they're frequently used pages, and I don't want to let these PA 15 pages go to waste. Any thoughts on this?
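For anyone auditing a directory like /blog/bytag/ at scale, here's a minimal stdlib-only sketch (hypothetical tooling, not part of the site) that checks whether a fetched page actually declares the noindex, follow directives:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag in a page."""

    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content", "")


def robots_directives(html: str) -> set:
    """Return the set of robots directives declared in the page, lowercased."""
    parser = RobotsMetaParser()
    parser.feed(html)
    if parser.robots is None:
        return set()
    return {d.strip().lower() for d in parser.robots.split(",")}


page = '<html><head><meta name="robots" content="noindex, follow"></head><body>...</body></html>'
assert robots_directives(page) == {"noindex", "follow"}
```

Running this over a crawl of the /bytag/ URLs makes it easy to confirm the directive was deployed everywhere rather than spot-checking pages by hand.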
Example 2: A similar situation is that they used 302s to redirect to search results pages instead of using category pages. So now there are hundreds, if not thousands, of search results pages that have a PA of 15 or more.
http://www.charged.fm/search/results/music-tickets
We're working on restructuring the site and removing the 302s, but I'm wondering if it's necessary to 301 all of the search results pages to the new category pages like so:
http://www.charged.fm/search/results/music-tickets >>> http://www.charged.fm/concert-tickets
This would require the programmer to create the new category pages so the old, ranking search results pages can be 301'd to them, correct? Should I put this in the queue for him, or just leave the search results pages with 'noindex, follow' and let the PA 15 go to waste?
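As a sketch of what that 301 job might look like, here's a small Python helper that holds the old-to-new mapping and renders it as Apache mod_alias rules. Only the music-tickets pair comes from the question; the sports-tickets pair is hypothetical:

```python
# Hypothetical mapping from old search-results paths to new category pages.
# Only the music-tickets entry comes from the question above.
REDIRECT_MAP = {
    "/search/results/music-tickets": "/concert-tickets",
    "/search/results/sports-tickets": "/sports-tickets",
}


def redirect_target(path: str):
    """Return the 301 target for an old search URL, or None to leave it
    noindexed in place."""
    return REDIRECT_MAP.get(path)


def apache_rules(mapping):
    """Render the mapping as Apache mod_alias `Redirect 301` lines."""
    return "\n".join(
        f"Redirect 301 {old} https://www.charged.fm{new}"
        for old, new in sorted(mapping.items())
    )


print(apache_rules(REDIRECT_MAP))
```

Keeping the mapping in one data structure means the same source of truth can generate server config, feed a crawler that verifies the redirects, and document which old URLs were deliberately left to 'noindex, follow'.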
There are many other instances like this, such as a Login page with PA 20, and I just can't decide whether everything should be redirected or what to leave as dust in the wind. Because all we are is dust in the wind ; )
Thanks for any help,
Luke
-
Thanks Jane! That's the affirmation I was looking for. If I might, one more question:
In your opinion, is PA 15 too valuable to leave on a page with no real purpose? Is it relative to the site?
Thanks again,
Luke Thomas
-
Hi Luke,
Noindex, follow will work fine for the duplicated tag pages, although you could also consider canonicalising them or redirecting them to a more useful resource, either en masse in an automated way, or manually for just the tags/topics of high importance.
302ing to the search pages isn't good for two reasons: first, search engines traditionally don't follow or pass PageRank through a 302; second, they prefer that you don't include your own search results pages in their indexes. The "easy" way around this is exactly as you describe: produce quality category pages in place of what was a search results page. You can probably get away with having search pages indexed, and many companies get them to rank, but what Google wants to avoid is the indexation of hundreds or thousands of random search results pages from a website, often with complex query strings that can result in an almost infinite number of pages being created.
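The 301/302 distinction above can be reduced to a tiny lookup. This is a rule-of-thumb sketch of the traditional behaviour, not official Google documentation (Google has since said long-standing 302s may eventually be treated as permanent, but a 301 states your intent unambiguously):

```python
# Rule-of-thumb classification of redirect status codes for SEO purposes.
PERMANENT = {301, 308}       # "moved for good": consolidate signals on the target
TEMPORARY = {302, 303, 307}  # "moved for now": traditionally weaker at passing equity


def describe_redirect(status: int) -> str:
    """Summarise what a 3xx status code signals to a crawler."""
    if status in PERMANENT:
        return "permanent: link equity should consolidate on the target"
    if status in TEMPORARY:
        return "temporary: don't rely on equity passing to the target"
    return "not a redirect"


assert describe_redirect(301).startswith("permanent")
assert describe_redirect(302).startswith("temporary")
```

A quick crawl that records each old URL's status code against this table will flag any search-results pages still answering with a 302 after the restructure.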
Cheers,
Jane
Related Questions
-
GWT app and link emulation
Hi, I have a GWT site - https://www.whatiswhere.com. I have a tab control which emulates the menu. I am planning to put links instead of text into the tab labels to create internal links. I am thinking of adding JavaScript to stop the onclick event of the link, otherwise I will get a new session of the GWT site. What I want is to just change the tab, but at the same time have a link for the crawler. Would my approach work? Would it be equivalent to a nofollow link? Would it improve ranking compared to the 'no link at all' case? Thanks, Andrei.
Intermediate & Advanced SEO | Anazar_2001
-
Linking to URLs With Hash (#) in Them
How does link juice flow when linking to URLs with a hash in them? If I link to this page, which generates a pop-over on my homepage with info about my special offer, where will the link juice go? homepage.com/#specialoffer Will the link juice go to the homepage? Will it go nowhere? Will it go to the hash URL above? I'd like to publish an annual/evergreen sort of offer that will generate lots of links, and instead of driving those links to homepage.com/offer, I was hoping to get that link juice to flow to the homepage, or maybe even a product page, and just update the pop-over information each year as the offer changes. I've seen competitors do it this way, but wanted to see what the community here thinks about linking to URLs with a hash in them. Could this also be a use case for using hashes in URLs for tracking purposes?
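One way to see why links to homepage.com/#specialoffer are effectively links to the homepage: the fragment is a client-side construct and is never sent in the HTTP request. A stdlib sketch of what a crawler's URL parser sees:

```python
from urllib.parse import urlsplit

url = "https://homepage.com/#specialoffer"
parts = urlsplit(url)

# The fragment never leaves the browser: the HTTP request a server
# (or a crawler fetching the URL) receives asks only for this path.
assert parts.path == "/"
assert parts.fragment == "specialoffer"

# So homepage.com/#specialoffer and homepage.com resolve to the same
# resource; a link to either is a link to the homepage.
```

(This is plain fragment behaviour; it says nothing about the old "hashbang" escaped-fragment crawling scheme, which worked differently.)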
Intermediate & Advanced SEO | MiguelSalcido
-
Should I NoFollow Links Between Our Company Websites?
The company I work for owns and operates hundreds of websites throughout the United States. Each is tied to a legitimate local business, often with specific regional branding and mostly unique content. All of our domains are in pretty good shape and have never participated in any shady link building/SEO. These sites often link to the other sites within their market. That makes perfect sense from a user standpoint, since someone interested in one business's offering would likely be interested in the others. My question is whether or not we should nofollow the links to our other sites. Nothing has happened from Google in terms of penalties, and the links don't seem to be hurting our sites now as they are all currently followed, but I also don't want to be on the false-positive side of any future algorithm updates surrounding link quality. What do you think? Keep them followed, or introduce nofollow?
Intermediate & Advanced SEO | MJTrevens
-
Link Juice + Site Structure
Hi All, I have attached a simple website model.
Page A is the home page, attracting 1000 visitors per month. One click away is Page B with 400 visitors per month, and so on; you get an idea of the flow and the clicks required to get to various pages. I have purposely placed Pages E-G three clicks away, as they yield very little traffic.
1] Is this the best way to distribute link juice?
2] Should I point Pages C + D back to Page A to influence its Page Rank (PA)?
Any other useful advice would be appreciated. Thanks, Mark
Intermediate & Advanced SEO | Mark_Ch
-
Do I even bother to remove links
Hi, I'm noticing increasing numbers of scraped directory links pointing back to the websites I manage. Much of this info appears to be scraped from a well-known (and respected) directory. I don't build links to any of the websites I manage, and none have more than 200 linking root domains currently - not that many. The problem is that I focus on quality links, and the scraped links, which are incredibly weak on the whole, dilute them. I've noticed a certain paranoia in the SEO community about removing/disavowing links, and yet I'm tempted to ignore the rubbish (unless it's part of a major negative SEO push) and just get on with the job, focusing on quality content that drives natural links, and on social media work.
Intermediate & Advanced SEO | McTaggart
-
Domain Links or SubDomain Links, which is better?
Hi, I only now found out that www.domain.com and www.domain.com/ are different. Most of my external links are directed to www.domain.com/, which I understand is considered the subdomain and not the domain. Should I redirect? (And if so, how?) Should I post new links only to my domain?
Intermediate & Advanced SEO | BeytzNet
-
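On the www.domain.com vs www.domain.com/ question above: the trailing slash on a bare domain does not create a different resource, and neither form is a subdomain (blog.domain.com would be). A stdlib sketch of that equivalence:

```python
from urllib.parse import urlsplit


def normalized(url: str) -> str:
    """Root URLs with and without the trailing slash name the same resource;
    browsers send 'GET /' either way, so normalise the empty path to '/'."""
    parts = urlsplit(url)
    path = parts.path or "/"
    return f"{parts.scheme}://{parts.netloc}{path}"


# Both forms normalise to the same URL, so no redirect is needed between them.
assert normalized("http://www.domain.com") == normalized("http://www.domain.com/")
```

(Trailing slashes on deeper paths, e.g. /page vs /page/, genuinely are different URLs and may need a redirect; the equivalence above holds only for the root.)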
Google is not Indicating any Links to my site
We built a new store on another ccTLD and linked to it from some of our other domains in a few locations. Using the Google operator command "link:", I am seeing nothing linking to our site anywhere. Some things to clarify:
- These are not nofollow links
- The pages linking to our new domain are indexed
- The pages being linked to on our new domain are indexed
- This is not a Flash site, nor is it heavy in JavaScript
- The links existed the day the site was launched, so they were there when the new pages were crawled
The "site:" command in Google shows me that my new site is indexed. What could potentially be causing this? I am trying to get these newer ccTLDs to begin ranking, and I understand that I need to get links going to these pages since they are fairly new (2.5 months), so I can outrank the .com in the search engines in those locales (like Google.co.uk).
Intermediate & Advanced SEO | DRSearchEngOpt
-
Google consolidating link juice on duplicate content pages
I've observed some strange findings on a website I am diagnosing, and they have led me to a theory that seems to fly in the face of a lot of thinking. My theory is:
When Google sees several duplicate content pages on a website and decides to show just one version of the page, it at the same time aggregates the link juice pointing to all the duplicate pages, and ranks the one page it decides to show as if all the link juice pointing to the duplicate versions were pointing to that one version. E.g.:
Link X -> Duplicate Page A
Link Y -> Duplicate Page B
Google decides Duplicate Page A is the most important and applies the following formula to decide its rank: Link X + Link Y (minus some dampening factor) -> Page A.
I came up with this idea after I seem to have reverse-engineered it: the website I was trying to sort out for a client had this duplicate content issue, so we decided to put unique content on Page A and Page B (not just one pair like this, but many). Bizarrely, after about a week all the Page A's dropped in rankings - suggesting that the old consolidated link value had been re-associated with the two now-distinct pages, so Page A would only be getting Link Value X. Has anyone got any tests/analysis to support or refute this?
Intermediate & Advanced SEO | James77