Varying Internal Link Anchor Text with Each New Page Load
-
I'm asking for people's opinions on varying internal anchor text. Before you jump in and say, "Oh yes, varying your anchor text is always a good idea", let me explain.
I'm not talking about varying anchor text on different links scattered throughout a site. We all know that is a wise thing to do for a variety of reasons that have been covered in many places. What I'm talking about is including semi-useful links below the fold and then varying the anchor text with each page load. Each time Googlebot crawls a page, it sees different anchor text for each link. That way, Googlebot is seeing, for example, 'san diego bars', 'taverns in san diego', 'san diego clubs', and 'pubs in san diego' all pointing to a San Diego bar/tavern/club/pub page.
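Mechanically, the technique is trivial to implement server-side: the template just picks a new anchor at random on every render. A minimal sketch in Python (the variant list and URL are illustrative, not taken from any real site):

```python
import random

# Hypothetical anchor-text variants, all pointing at the same target page.
ANCHOR_VARIANTS = [
    "san diego bars",
    "taverns in san diego",
    "san diego clubs",
    "pubs in san diego",
]

def render_internal_link(url):
    """Return an <a> tag whose anchor text changes on each page load."""
    anchor = random.choice(ANCHOR_VARIANTS)
    return '<a href="{0}">{1}</a>'.format(url, anchor)
```

Each render (whether for a visitor or for Googlebot) gets one of the variants, which is exactly the behaviour described above.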
I'm wondering if there is value in this approach. Will it help a site rank well for multiple search queries? Could it potentially be better than static anchor text as it may help Google better understand the targeted page? Is it a good way to protect a large site with a huge number of internal links from Penguin?
To summarize, we're talking about the impact of varying the anchor text on a single page with each page load as opposed to varying the anchor text on different pages.
Thoughts?
-
Thanks for everyone's input!
Without pointing any fingers, let's just say this is happening in the wild right now. It came as a bit of a surprise to me as I wouldn't expect Google to be fooled into ranking a site better for multiple keywords based on dynamic internal anchor text. To be clear, I have no evidence this technique is helping or that the motivation is to game Google for better rankings, but I haven't come up with any other reason.
If it is working, I must admit, it's pretty clever...
-
I would say test it out and see what happens. I would love to know the result (a YouMoz post, perhaps?).
What I assume would happen:
The new link only counts when Googlebot crawls the page (and obviously not on each page load), and each time Googlebot crawls the page it will see that an old link has been dropped and a new one added. So whatever value you gain from the new link, you will lose from the old one which is no longer there. So I really don't see the value to be had from an SEO point of view. But repeat visitors to your page may click through to those pages. (Again, testing it will give you solid proof.)
-
What comes to mind is this: I don't think you'll get the value out of links with dynamic anchor text that you would get with static anchor text. A page's overall value, and the value it passes on to other pages via links, is iterative--it's not assigned after just a single pass of the bot. The dynamism would devalue the links, if not render them worthless altogether.
And even if you had a thousand variations of anchor text for each link and they did pass some sort of value, what do you think that footprint would look like after a year or two of Google crawls? Upon a manual review, someone there would say, "Huh, look at this, their links change all the time and each one is focused around a specific money term--I think it's obvious that they're trying to manipulate their rankings. Smack--here's a penalty for you."
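Nobody outside Google knows how such a footprint would actually be detected, but as a thought experiment, a crawler only needs to diff anchors across its own crawl snapshots to spot the churn. A purely hypothetical heuristic, for illustration only:

```python
from collections import defaultdict

def find_churning_anchors(crawls, threshold=3):
    """Flag target URLs whose internal-link anchor text keeps changing.

    `crawls` is a list of {target_url: anchor_text} snapshots, one per crawl.
    Returns the URLs seen with more than `threshold` distinct anchors
    (a made-up heuristic, not Google's actual method).
    """
    seen = defaultdict(set)
    for snapshot in crawls:
        for url, anchor in snapshot.items():
            seen[url].add(anchor)
    return {url for url, anchors in seen.items() if len(anchors) > threshold}
```

A link whose anchor is stable never accumulates distinct values, while a per-page-load rotation lights up after a handful of crawls.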
-
Oh yes, varying your...oh wait sorry you didn't want that haha.
Erm, this is an interesting idea. On first read, my thought was that you're trying to game the system, and that's never a good idea.
Then I thought a little more, and I suppose it is very similar to dynamic content, such as offers on your linking page, although here the link always points at one location.
I suppose it is also similar to changing your anchor text manually to see what works best, but I think that such frequent changes could end up getting noticed: a link anchor changing every time Google visits. Surely Google is clever enough to notice this pattern, and doesn't it smack of over-optimisation?
I bet others have already tried this - have you done any digging to see if you can find out what the impact was?
Related Questions
-
Pull multiple link data for multiple pages at once?
Hi guys, I was wondering if there is a tool or way to pull link data for a list of URLs/pages at once into one single file with Ahrefs or Majestic. I know Scrapebox can do this with OSE, but I'm looking for a way to do this with the other backlink databases. Any ideas? Cheers.
Intermediate & Advanced SEO | jayoliverwright
-
Does Disavowing Links Negate Anchor Text, or Just Negate Link Juice?
I'm not so sure that disavowing links also discounts the anchor text from those links, because nofollow links absolutely still pass anchor text values, and disavowing links is supposed to be akin to nofollowing them. I ask because there's a potential client I'm working on an RFP for: they have tons of spammy directory links, all using keyword-rich anchor text, and they lost 98% of their traffic in Penguin 1.0 and haven't recovered. I want to know what I'm getting into, and if I just disavow those links, I'm thinking that it won't help the anchor text ratio issues. Can anyone confirm?
Intermediate & Advanced SEO | MiguelSalcido
-
New site, new URL, lots of custom content. Load it all or "trickle" it over time?
Would it make a difference in terms of ranking the site? Interested in your thoughts. Thanks! BBuck
Intermediate & Advanced SEO | BBuck
-
To index or de-index internal search results pages?
Hi there. My client uses a CMS/e-commerce platform that is automatically set up to index every single internal search results page on search engines. This was supposedly built as an "SEO friendly" feature, in the sense that it creates hundreds of new indexed pages reflecting the various terminology used by existing visitors of the site. In many cases, these pages have proven to outperform our optimized static pages, but there are multiple issues with them:
The CMS does not allow us to add any static content to these pages, including titles, headers, metas, or copy on the page.
The query typed in by the site visitor always becomes part of the title tag / meta description on Google, so if a visitor's internal search contains any less-than-ideal terminology, their phrasing is out there for the whole world to see, leaving lots of ugly terminology floating around on Google that we can't affect.
I am scared to do a blanket de-indexation of all /search/ results pages, because we would lose the majority of our rankings and traffic in the short term while trying to improve the ranks of our optimized static pages. The ideal is to move our static pages up in Google's index and, when their performance is strong enough, to de-index all of the internal search results pages. But for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!
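If the platform ever allows it, the staged de-indexation described above usually comes down to emitting a robots meta tag conditionally per URL. A sketch of the idea (the /search/ prefix is an assumption about the site's URL structure, not something stated in the question):

```python
def robots_meta(path):
    """Return the robots meta tag for a page: noindex internal search
    results, index everything else. `noindex, follow` keeps link equity
    flowing even while the page drops out of the index."""
    if path.startswith("/search/"):
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

Run behind a feature flag, this would let you de-index the /search/ pages gradually rather than all at once.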
Intermediate & Advanced SEO | FPD_NYC
-
Page loads fine for users but returns a 404 for Google & Moz
I have an e-commerce website built using Wordpress and the WP e-Commerce plug-in. The products have always worked fine: the pages display correctly when viewed in a browser, and people can purchase the products with no problems. However, in the Google Merchant feed and in the Moz crawl diagnostics, certain product pages are returning a 404 error message and I can't work out why, especially as the pages load fine in the browser. I had a look at the page headers and can see that when the page loads, the initial request returns a 404, then every other request goes through and loads fine. Can anyone help me figure out why this is happening? A link to the product I have been using to test is: http://earthkindoriginals.co.uk/organic-clothing/lounge-wear/organic-tunic-top/
Here is part of the header dump that I did:
GET /organic-clothing/lounge-wear/organic-tunic-top/ HTTP/1.1
Host: earthkindoriginals.co.uk
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:21.0) Gecko/20100101 Firefox/21.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-gb,en;q=0.5
Accept-Encoding: gzip, deflate
Cookie: __utma=159840937.1804930013.1369831087.1373619597.1373622660.4; __utmz=159840937.1369831087.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); wp-settings-1=imgsize%3Dmedium%26hidetb%3D1%26editor%3Dhtml%26urlbutton%3Dnone%26mfold%3Do%26align%3Dcenter%26ed_size%3D160%26libraryContent%3Dbrowse; wp-settings-time-1=1370438004; __utmb=159840937.3.10.1373622660; PHPSESSID=e6f3b379d54c1471a8c662bf52c24543; __utmc=159840937
Connection: keep-alive
HTTP/1.1 404 Not Found
Date: Fri, 12 Jul 2013 09:58:33 GMT
Server: Apache
X-Powered-By: PHP/5.2.17
X-Pingback: http://earthkindoriginals.co.uk/xmlrpc.php
Expires: Wed, 11 Jan 1984 05:00:00 GMT
Cache-Control: no-cache, must-revalidate, max-age=0
Pragma: no-cache
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 6653
Connection: close
Content-Type: text/html; charset=UTF-8
Intermediate & Advanced SEO | leapSEO
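A quick way to confirm what a crawler sees, independent of how the page renders in a browser, is to read the status line of the raw response, as in the dump above. A minimal helper (illustrative only):

```python
def response_status(raw_response):
    """Extract the numeric status code from a raw HTTP response.

    The status line is the first line, e.g. "HTTP/1.1 404 Not Found",
    and the code is its second whitespace-separated field.
    """
    status_line = raw_response.lstrip().splitlines()[0]
    return int(status_line.split()[1])
```

A page can perfectly well send a 404 status yet still ship a full HTML body, which is why the browser view and the crawler's verdict can disagree, as they do here.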
E-commerce Adding New Content - Blog vs New Page
I have an ecommerce site (www.brick-anew.com) focused on Fireplace products and we also have a separate blog (fireplacedecorating.com) focused on fireplace decorating. My ecommerce site needs new content, pages, internal links, etc... for more Google love, attention, and rankings. My question is this: Should I add a blog to the ecommerce site for creating new content or should I just add and create new pages? I have lots of ideas for relevant new content related to fireplaces. Are there any SEO benefits to a blog over new static pages? Thanks! SAM
Intermediate & Advanced SEO | SammyT
-
How should I best structure my internal links?
I am new to SEO and looking to employ a logical but effective internal linking strategy. Are there any easy ways to keep track of which page links to which page? I am a little confused about anchor text and how I should use it. For example, for a category page "Towels", I was going to link to another page we want to build PA for, such as "Bath Sheets". What should I put in as the anchor text: keep it simple and just use "Bath Sheets", or make it more direct, like "Buy Bath Sheets"? Should I also vary the anchor text if I have another 10 pages internally linking to this, or keep it the same? Any advice would be really helpful. Thanks, Craig
Intermediate & Advanced SEO | Towelsrus
-
Does having multiple links to the same page influence the link juice this page is able to pass?
Say you have a page and it has 4 outgoing links to the same internal page. In the original PageRank algorithm, if these were links to a page outside your own domain, the link juice this page is able to pass would be divided by 4. The thing is, I'm not sure whether this is also the case when the outgoing links point to a page on your own domain. I would say that outgoing links (whatever the destination) will use some of your link juice, so it would be better to have 1 outgoing link instead of 4 to the same destination, and then the destination will profit more from that link. What are your thoughts?
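In the original PageRank formulation the question refers to, each outgoing link carries an equal share of the page's passable value, so duplicate links to the same target each carry a share. A toy illustration (0.85 is the conventional damping factor; this ignores everything Google has layered on top since):

```python
def juice_passed(page_rank, outgoing_links, target, damping=0.85):
    """Value reaching `target` under the classic PageRank split:
    (damping * PR / total outgoing links) per link, times the number
    of links pointing at `target`."""
    share = damping * page_rank / len(outgoing_links)
    return share * outgoing_links.count(target)
```

Under this split, four links to one target out of four total pass exactly as much as a single link out of one: the division by 4 and the four shares cancel. The difference only appears when other links compete, e.g. four duplicates out of five links pass 4/5 of the passable value.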
Intermediate & Advanced SEO | TjeerdvZ