Putting blog excerpts in the footer of every page?
-
I have roughly 150 non-blog pages and 500 articles in my blog.
The footer of every non-blog page includes excerpts from 3 blog posts selected at random from the inventory of 500.
The posts in the footer of each page change with every page refresh.
So, if you scroll to the bottom of any non-blog page, you'll see about 85 words for each of 3 randomly selected blog posts, with a link to the source article in the blog section of my site. Each page will link to 3 different posts.
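A minimal sketch of the random-excerpt footer described above (the post inventory, URLs, and helper names here are hypothetical, just to illustrate the mechanics):

```python
import random

# Hypothetical inventory of 500 posts: (title, url, full_text).
posts = [
    ("Post %d" % i,
     "https://example.com/blog/post-%d" % i,
     " ".join("word%d" % w for w in range(300)))
    for i in range(500)
]

def footer_excerpts(posts, count=3, max_words=85):
    """Pick `count` distinct random posts and truncate each
    excerpt to `max_words` words, keeping a link to the source."""
    picked = random.sample(posts, count)  # re-drawn on every page load
    return [
        {"title": title,
         "url": url,
         "excerpt": " ".join(text.split()[:max_words])}
        for title, url, text in picked
    ]

for item in footer_excerpts(posts):
    print(item["title"], "->", item["url"])
```

Because `random.sample` draws without replacement, the three posts on any single page load are always distinct.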
One of my objectives is to drive visitors to some older blog content that has become buried deep in the archives over the years.
Question 1: In a post-Panda/Penguin world, is this a good or bad technique?
Question 2: Should the links to the full content in the blog use rel="nofollow"? Without it, the internal link structure for this part of the site looks pretty crazy and random - I assume nofollow would help make things look more orderly (and prevent my main non-blog pages from passing excess link juice to my blog).
Thoughts or comments?
-
On my site, I have a sidebar widget that links to 10 random pages on my site. I would love to have it link to 10 relevant pages instead, as EGOL suggests... it's on my list of things to program one day!
I do feel that these links are helpful to me. The main reason I like this widget is that I think it helps keep my pages in Google's index. My site has over 3,000 pages, and I suspect many of them would end up in the supplemental index if they were not regularly linked to internally like they are.
Also, I have it set so that they link using my desired anchor text for the page. This may help somewhat in my SERPs for each page.
Oh, and all of these links are followed on my site.
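A minimal sketch of a widget like the one described above (page URLs and anchor texts are made up for illustration): pick 10 random pages and render each link with that page's designated anchor text, with no nofollow attribute.

```python
import random

# Hypothetical page inventory: URL -> desired anchor text for that page.
pages = {"/page-%d" % i: "Anchor text %d" % i for i in range(3000)}

def random_links_widget(pages, count=10):
    """Return followed <a> tags for `count` random pages,
    each using that page's own preferred anchor text."""
    picked = random.sample(sorted(pages), count)
    return ['<a href="%s">%s</a>' % (url, pages[url]) for url in picked]

for link in random_links_widget(pages):
    print(link)
```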
-
I think that 45 words is too much. I would use a title. Or a title and a few words. Or a title and a sexy sentence.
Last, a new question: Can you think of ANY good reason to put blog excerpts in a site-wide footer the way http://springboard.com/ and http://seoaware.com do?
Yes, but only until I had time to do something better.
-
Thanks for a very thoughtful reply; I agree that it's important to promote the best content, and I definitely do that.
I misspoke earlier - the excerpts are just the title and 45 words, not 85. Maybe 3 sentences. Do you still think this is a duplicate content risk?
Last, a new question: Can you think of ANY good reason to put blog excerpts in a site-wide footer the way http://springboard.com/ and http://seoaware.com do? I currently do it randomly; these other sites excerpt the latest posts.
-
These are in the footer? How many people are going all the way down there and clicking on them? I bet nobody. If nobody is clicking them, then why do this?
I would run Crazy Egg on a few pages to see if anybody is clicking these links.
The footer of every non-blog page includes excerpts from 3 blog posts selected at random from the inventory of 500.
If you really want to get people into these posts, your best bet would be to link to relevant posts rather than random ones, don't you think?
Almost any blog that has 500 posts is going to have some really good ones and some real sleepers. I would focus on promoting the really good ones if you want people to click these.
Promote your best. Not random.
So, if you scroll to the bottom of any non-blog page, you'll see about 85 words for each of 3 randomly selected blog posts, with a link to the source article in the blog section of my site. Each page will link to 3 different posts.
Eighty-five words for each? Wow... that is a lot... way more than I would use. How about a title and ten words? Nobody is going to read these.
I think that you could run into duplicate content issues with this - even though they are shuffled randomly.
Question 1: In a post-Panda/Penguin world, is this a good or bad technique?
You can see my answers above. I think that the footer is a bad location, and I think that you should promote your best instead of random.
Question 2: Should the links to the full content in the blog use rel="nofollow"? Without it, the internal link structure for this part of the site looks pretty crazy and random - I assume nofollow would help make things look more orderly (and prevent my main non-blog pages from passing excess link juice to my blog).
I would not use nofollow on links within my own website. When you use nofollow, the PageRank that would have flowed into those links evaporates. It is lost. Poof! Instead, allow the PageRank to flow into these pages and out through their links. If you nofollow, you cut off the power.
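As a toy illustration of that "evaporation" claim (a deliberately simplified model of Google's post-2009 treatment of nofollow, not how PageRank is actually computed; all numbers are hypothetical): link equity is divided across all outlinks, and the share assigned to nofollowed links is lost rather than redistributed to the followed ones.

```python
def distributed_equity(page_rank, total_links, nofollowed):
    """Simplified model: equity is split over ALL outlinks, and
    the portion assigned to nofollowed links evaporates instead
    of being redistributed to the followed links."""
    per_link = page_rank / total_links
    followed = total_links - nofollowed
    return {
        "per_followed_link": per_link,
        "passed_on": per_link * followed,
        "evaporated": per_link * nofollowed,
    }

# A page with equity 1.0 and 10 outlinks, 3 of them nofollowed:
result = distributed_equity(1.0, 10, 3)
print(result)  # 0.1 per followed link; 0.7 passed on; 0.3 lost
```

Under this model, nofollowing internal links does not concentrate equity on the remaining links; it only shrinks the total amount passed on.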
-
Ben,
Thanks for the reply. I didn't know about the nofollow on the internal links.
Yes, the body of every page has unique content.
My entire site revolves around serving the needs of a specific type of individual, and the content in the body and the content in the blog work together to fill the information needs of the visitor - so hopefully yes, the blog content is relevant.
I see lots of websites incorporating a blog excerpt in the footers or sidebars of their pages (e.g., http://springboard.com/ and http://seoaware.com). The problem I have is that they almost always link to the most recent blog post(s), concentrating attention on a small subset of the blog inventory. That means older, still-relevant, still-useful content is getting neglected. That's the main thing I'm trying to overcome.
I do have good site search so people who know what they are looking for can find it - the problem is that many people don't know what they want until they see it.
Thanks again!
-
Question 2 first - I don't think there's any need for the nofollow at all. In fact, I'm pretty certain Google has gone on record saying that you'll never need to nofollow internal links.
Question 1 - If you've got some unique content in the main body of the page, I wouldn't foresee it being a problem. However, you should probably ask yourself whether it's actually of any use to the user. Is the content you're sending them to relevant to what they're looking for, and is it improving their overall experience on your site?