How authentic is a dynamic footer from bots' perspective?
-
I have a fairly meta-level question. I've been working on a dynamic footer for the website http://www.askme.com/ (you can see it in the footer of that page). If you refresh the page and check the content, you'll see a different combination of links in every section. I'm calling it a dynamic footer because the values change on every load.
**Why are we doing this?** Every section in the footer has X links available, but we can show only 25 links per section, and X can be larger than 25 (let's say X = 50). So I randomize the list of entries for a section and pick 25 elements from it, i.e. a random 25 entries every time you refresh the page.
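The selection described above amounts to sampling without replacement. A minimal sketch (the link names and pool size are illustrative, not taken from the actual site):

```python
import random

# Hypothetical pool of X = 50 entries for one footer section.
all_links = [f"/category/item-{i}" for i in range(50)]

def footer_links(pool, count=25):
    """Return `count` links sampled at random (without replacement)
    on every page load."""
    return random.sample(pool, count)

links = footer_links(all_links)
```

`random.sample` guarantees no duplicates within a single selection, so each render shows 25 distinct links from the pool.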
**Benefits from an SEO perspective?** This helps me expose all the URLs to bots (across multiple crawls) and adds a page-freshness element as well.
**What's the problem, if there is one?** I'm wondering how bots will treat this, since at any point a bot might see us showing one set of content to bots and something else to users. Will bots consider this cloaking (a black-hat technique)? Or will they not, given that I'm refreshing the data on every single load, even if it's a bot hitting me twice in a row to check what I'm doing?
-
Thank you so much, Sir Alan. I really appreciate the effort you put into compiling this detailed response to my questions. I've noted down all the points along with how I can handle them better, and will soon come up with a better fat footer.
-
Nitin
You're dealing with several distinct considerations and issues in this setup.
First, there's the matter of link distribution. When you link to X pages from page 1, you're telling search engines "we think these are important destination pages." If you change those links every day, or on every refresh, and crawlers encounter those changes, that communication is strained.
This is something that happens naturally on news sites, where content changes on a regular basis. So it's not completely invalid or alien for search algorithms to see or deal with, and thus it's not likely their systems would consider this black hat.
The scale and frequency of the changes is more of a concern because of that constantly changing link value distribution issue.
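One way to reduce that change frequency, sketched here as my own suggestion rather than anything the answer prescribes: seed the random selection with the current date, so the footer rotates once per day instead of on every request. The pool and link names below are hypothetical.

```python
import datetime
import random

def daily_footer_links(pool, count=25):
    """Return a random sample of `count` links that stays stable for a
    full day: the RNG is seeded with today's date, so every request on
    the same day (including consecutive bot hits) sees the same links."""
    rng = random.Random(datetime.date.today().isoformat())
    return rng.sample(pool, count)

pool = [f"/top-cities/city-{i}" for i in range(50)]  # illustrative data
# Two calls on the same day produce an identical selection.
assert daily_footer_links(pool) == daily_footer_links(pool)
```

All link value still rotates through the full pool over time, but bots see a consistent page within any given crawl window.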
Either X cities are really "top" cities, or they are not.
Next, that link value distribution is further weakened by the sheer volume of links. Twenty-five links per section across three sections is 75 links. Add the links at the top of the page, the "scrolling" links in the main content area of the home page, and the actual "footer" links (black background), and link equity is diluted even further. (Think "spreading it too thin" with too many links.)
On category pages it's "only" 50 links in two sub-footer sections. Yet the total number of links even on a category page is a concern.
And on category pages, all those links dilute the primary focus of any main category page. If a category page is "Cell Phone Accessories in Bangalore", then all of those links in the "Top Cities" section dilute the location. All the links in the "Trending Searches" section dilute the non-geo focus.
What we end up with here then is an attempt to "link to all the things". This is never a best practice strategy.
Best practice strategies require a refined experience across the board. Consistency of signals, combined with not over-straining link equity distribution, and combined with refined, non-diluted topical focus are the best path to the most success long-term.
So, returning to my earlier point about news sites changing the actual links shown when new news comes along: the best news sites do that without constantly changing the primary categories featured, and without diluting the overwhelming majority of links on a single category page with lots of links to other categories. Consistency is critical.
So, while any one of these issues, or even a handful of them, might not be a critical problem on its own, the cumulative negative impact harms the site's ability to communicate a consistent, quality message.
The combined problem then needs to be recognized as exponentially more serious because of the scale at which you're doing this across the entire site.