Canonicalize vs Link Juice
-
I recently wrote (but have not published) a very comprehensive original article for my new website (which has pretty much no domain authority). I've been talking to the publisher of a very high Domain Authority site, and they are interested in publishing it. The article will include 2-3 dofollow backlinks to my website.
My question is should I:
- Post the article on my own site as well, and request a "rel=canonical" from the high-authority site
- Not post the article on my own site and just collect the link juice from the high-authority site
Which would be better for my overall SEO? Assume in case 1 that the high-authority site would add the rel=canonical if I asked for it.
-
Great - very helpful, thanks!
-
If you use rel=canonical, the page on the publishing site should not be indexed by Google and other search engines that recognize rel=canonical. The page on your site remains in the index, appears in the SERPs, and attracts traffic. Any links that point to your article on the publisher's site will appear in Google Webmaster Tools as links to the page where the article lives on your site.
So it "appears" that your page (the original article page) gets all of the link equity that flows to the page on the publisher's site where your article is displayed - even links in their own navigation.
I said "appears" above because we do not know exactly how Google counts them. Most people believe that Google passes link equity through rel=canonical, based on what Googlers have said and published about it, but we do not know for sure. We also know for a fact that Google sometimes changes its mind about this kind of thing without telling anybody.
I can say that I have a few pages that receive rel=canonical attribution from other websites and the results have been kickass, from what I can tell.
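For what it's worth, the cross-domain canonical is just a link element in the publisher's page head. Once the article is live, you can confirm the tag actually points at your original URL with a few lines of standard-library Python - a minimal sketch, with placeholder URLs:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of the first <link rel="canonical"> in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attr = dict(attrs)
            if attr.get("rel", "").lower() == "canonical":
                self.canonical = attr.get("href")

def find_canonical(html: str):
    """Return the canonical URL declared in the HTML, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# The publisher's page <head> should carry a tag pointing at YOUR article:
publisher_page = (
    '<html><head>'
    '<link rel="canonical" href="https://yoursite.example/original-article">'
    '</head><body>...</body></html>'
)
print(find_canonical(publisher_page))
# -> https://yoursite.example/original-article
```

In practice you would fetch the publisher's live page (e.g. with `urllib.request`) and feed its HTML to `find_canonical`; if the returned URL is not your article's URL, the canonical was never added or was added incorrectly.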
-
Is that better than getting link juice for SEO?
-
Post the article on your site first. After it has been there long enough to be stable in the index, then seek an agreement that another site can publish it with rel=canonical.
I normally don't give my content away under any circumstances, but if the right major website would add a rel=canonical, I would likely allow them to use it.