Sitewide vs. Homepage Links for a Network of Sites
-
I want to link a few sites together sitewide, as they are all under the same network of ownership, and I'd like some advice. The sites are:
1x PR1
2x PR2
2x PR3

Would it be best to:
- Get homepage links placed just above the footer, with the links sitting within a paragraph of text,
OR
- Simply link them sitewide in the footer under a heading of "Our Shopping Network"?
-
You are welcome to do both, but I would only do so if you felt the links would actually be used by visitors or otherwise add value to your site.
The SEO value of footer links is quite small. It is my belief that Google understands footer links, understands sitewide links, and assigns a value to each accordingly.
-
Would it be wise to combine the two and have both homepage in-text links and footer links? Wouldn't sitewide footer links also pass a significant amount of link juice?
-
Homepage links are normally the most valuable links a site can offer. Links within a paragraph of text (i.e. in-content links) are the most valuable links a page can offer. Footer links are the least valuable links a page can offer.
Option #1 is definitely preferred.
Related Questions
-
Google WMT/Search Console showing thousands of links in "Internal Links"
Hi, one of our blog posts shows thousands of internal links in Search Console, yet the report lists only 2 pages it is actually linked from. How can it have been linked internally so many times when I can't find any of those links? Thanks, Satish
Intermediate & Advanced SEO | vtmoz
-
Possible problem with new site (GWT no queries/very low index vs. submitted)
Hi everyone, I recently launched a new website for a small business loan company in the Dallas area. The site has been live for roughly a month and a half, and I submitted everything to GWT as usual, including my sitemap.

I am not sure what's going on with the site, as there is no activity in the GWT impressions or queries reports. The submitted vs. indexed count is 24/3 (and hasn't moved), and the queries graph on the overview page stops at 3/18/2015. On another note, under Crawl > Sitemaps it shows pages being indexed during the month of March, and then on April 3 the count drops from 17 to 2 and never recovers. Google says there are no errors or issues found, but I feel like there's something wrong.

When I do a site: search, my URLs do show up, which makes me believe the problem is just with my GWT data. With that being said, I'm not happy just THINKING there's something wrong; I need to actually know what the problem is. The only change I can think of is that I purchased SSL for the site, but when I check which pages are indexed using www., it shows all the HTTPS URLs, so the site does seem to be getting indexed without a problem. Does anyone have a clue as to what might be happening? I will attach some screenshots so that you can get a better idea.
Intermediate & Advanced SEO | jameswesleyhunt
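One quick way to sanity-check the HTTP-vs-HTTPS question described above is to inspect the redirect chain programmatically. A minimal sketch, assuming the third-party `requests` library and a placeholder domain:

```python
# Minimal sketch: confirm that the HTTP version of the site 301-redirects
# to HTTPS. Assumes the third-party `requests` package; example.com is a
# placeholder for the real domain.
import requests

response = requests.get("http://www.example.com/", allow_redirects=True, timeout=10)

# Walk the redirect chain; a clean SSL setup shows a single 301 hop to HTTPS.
for hop in response.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", response.status_code, response.url)
```

If the HTTP URLs 301 cleanly to their HTTPS counterparts, the HTTPS results showing up in a site: search are expected rather than a symptom.
-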
Internal Link Analysis (Site Wide)
Hi, I'm currently doing an internal link analysis for one of my clients and want to pull internal link data for the entire website, so I can look at the distribution of internal anchor text and identify ways in which we can optimize internal linking. I have had a look at Screaming Frog; the trouble is, this data is only exportable one page at a time, meaning you can't export "In Link" data for an entire site. The site has 200+ pages, so pulling in-link data for each page would take quite a while! Can anyone recommend any ways or tools to look at the entire internal link profile for a website? I have checked OSE, but there's not much data because the site is relatively new. Cheers, RM
Intermediate & Advanced SEO | MBASydney
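When the export tooling falls short, a small crawler can collect sitewide in-link data directly. A rough sketch, assuming the third-party `requests` and `beautifulsoup4` packages and a placeholder start URL, that tallies internal anchor text across the whole site:

```python
# Rough sketch of a sitewide internal-link crawl: visits every internal
# page and tallies the anchor text of every internal <a> tag it finds.
# Assumes `requests` and `beautifulsoup4`; example.com is a placeholder.
from collections import Counter, deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://example.com/"
HOST = urlparse(START).netloc
MAX_PAGES = 500  # safety cap for the sketch

seen, queue = {START}, deque([START])
anchors = Counter()  # anchor text -> sitewide count

while queue and len(seen) <= MAX_PAGES:
    page = queue.popleft()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        target = urljoin(page, a["href"]).split("#")[0]
        if urlparse(target).netloc != HOST:
            continue  # external link; skip
        anchors[a.get_text(strip=True)] += 1
        if target not in seen:
            seen.add(target)
            queue.append(target)

for text, count in anchors.most_common(20):
    print(count, text or "(empty anchor)")
```

On a 200-page site this finishes quickly and gives the internal anchor-text distribution directly.
-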
Disavow Links & Paid Link Removal (discussion)
Hey everyone, we've been talking about this issue in our office over the last week, and I wanted to extend it to the Moz community to see if anyone has additional perspective. Let me break down the scenario:

We're in the process of cleaning up the link profile for a new client, which contains many low-quality SEO-directory links placed by a previous vendor. Recently, we made contact with a webmaster who controls a huge directory network. This person found 100+ links to our client's site on their network and wants $5/link to have them removed. The client was not hit with a manual penalty, so this clean-up could be considered proactive, but an algorithmic 'penalty' is suspected based on historical keyword rankings.

**The Issue:** We can pay this ninja $800+ to have him/her remove the links from the directory network and hope it does the trick. When we talk about scaling this tactic, we run into ridiculously high numbers if we were to provide this service to multiple clients.

**The Silver Lining:** The disavow links file. I'm curious how effective building one around these 100+ directory links could be, especially since the client hasn't been slapped with a manual penalty.

**The Debate:** Is putting a disavow file together a better alternative to paying for crappy links to be removed? Are we actually solving the bad-link problem by disavowing, or just patching it? Would choosing not to pay ridiculous fees and submitting a disavow file for these links be considered a "good faith effort" in Google's eyes (especially considering there has been no manual penalty assessed)?
Intermediate & Advanced SEO | Etna
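On the disavow side, the file format itself is simple and documented by Google: plain UTF-8 text, one URL or domain: entry per line, with # comments. A minimal sketch of generating one for a batch of directory domains (the domain names are placeholders):

```python
# Minimal sketch: build a disavow file for a batch of directory domains.
# Format per Google's documentation: plain UTF-8 text, one entry per line,
# "domain:" to disavow an entire domain, "#" for comments.
bad_domains = ["spammy-directory-1.com", "spammy-directory-2.net"]  # placeholders

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Low-quality directory links placed by a previous vendor\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
```
-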
What happens when I redirect an entire site to an established page on another site?
Hi there, I have a website which is dedicated to selling ONE product (in different forms) from my main brand site. It is branded similarly, targets similar keywords, and gets some traffic which converts to leads. Additionally, this auxiliary site has a Google PageRank of 2 in its own right.

I am thinking of consolidating this auxiliary site into the specific product page on my main site. The reason I am considering this is to give a "boost" to the main product page, which has many core keywords sitting at SERP rankings of between 11-20 (so not in the first 10).

Because the auxiliary site gets traffic and leads in its own right, I don't want this to be to the detriment of my leads overall. The question is: if I 301-redirect the entire domain of my auxiliary site to the equivalent product page on my main site, am I likely to see a large "boost" to that product page (i.e., will my rankings likely rise significantly from 11-20)?
Intermediate & Advanced SEO | love-seo-goodness
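Mechanically, 301-redirecting an entire domain means every URL on the auxiliary site answers with a 301 whose Location header points at the target page. In practice this is a one-line rule in the web server configuration, but here is a self-contained, hedged illustration (the target URL is a placeholder):

```python
# Illustration only: what "301 the entire domain to one page" means.
# Every request, regardless of path, answers 301 with the same Location.
# In production this would be a web-server rewrite rule, not Python.
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "https://www.main-site.com/product-page/"  # placeholder URL

class RedirectAll(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", TARGET)
        self.end_headers()

HTTPServer(("", 8080), RedirectAll).serve_forever()
```

A page-to-page mapping (each old URL to its closest equivalent) generally preserves more value than pointing everything at one URL, though for a single-product site the single target is the simplest form.
-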
Network Of Sites...
Hi guys, just wondering if anyone can help me out. We have recently been hit by the Google Penguin update, and I'm currently working through all the bad/spammy backlinks that previous SEO companies have built for us. I have come across one particular domain, www.justgoodcars.com; they seem to have a lot of different domain names:

http://www.justpulsarcars.com/nissan-pulsar-warranties/1/United_Kingdom/all.html
http://www.justpumacars.com/ford-puma-warranties/1/United_Kingdom/all.html
http://www.justpuntocars.com/dutch-site/fiat-punto-warranties/1/United_Kingdom/all.html?selectcountry1=United_Kingdom
http://www.justpuntocars.com/fiat-punto-warranties/1/United_Kingdom/all.html?selectcountry1=United_Kingdom

Now, all of these domain names have exactly the same IP address. The above are just a few; I would say there are hundreds of them. Do you think this could have an effect on us? Thanks, Scott
Intermediate & Advanced SEO | ScottBaxterWW
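For what it's worth, the shared-IP observation is easy to verify at scale. A quick sketch using only the Python standard library, with the domains taken from the examples above:

```python
# Quick sketch: resolve a batch of domains and group them by IP address
# to confirm they sit on the same server. Standard library only.
import socket
from collections import defaultdict

domains = [
    "www.justgoodcars.com",
    "www.justpulsarcars.com",
    "www.justpumacars.com",
    "www.justpuntocars.com",
]

by_ip = defaultdict(list)
for domain in domains:
    try:
        by_ip[socket.gethostbyname(domain)].append(domain)
    except socket.gaierror:
        by_ip["unresolved"].append(domain)

for ip, hosts in by_ip.items():
    print(ip, "->", ", ".join(hosts))
```
-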
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed up the rate at which Google could get our millions of pages back in the index, by focusing attention/resources on the right pages.

The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:

http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions

Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'. Thoughts? Kurus
Intermediate & Advanced SEO | kurus
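For reference, restrictions like those described above can be tested offline before deciding whether to keep them. A rough sketch using the standard library's robots.txt parser; the rules and URLs are placeholders, and note that RobotFileParser does plain path-prefix matching rather than Google's wildcard extensions:

```python
# Rough sketch: test which result-page URLs a set of robots.txt rules
# blocks. Rules and URLs are placeholders; RobotFileParser matches plain
# path prefixes only (no * wildcards).
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /search/page/
Disallow: /search/sort/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

for url in ("https://example.com/search/results",
            "https://example.com/search/page/2",
            "https://example.com/search/sort/price"):
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```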