How authentic is a dynamic footer from bots' perspective?
-
I have a fairly meta question. I've been working on a dynamic footer for the website http://www.askme.com/ - you can check it at the bottom of that page. If you refresh the page, you'll see a different combination of links in every footer section. I'm calling it a dynamic footer because the values are entirely dynamic in this case.
**Why are we doing this?** For every section in the footer we have X links, but we can show only 25 links in each section, and X can be greater than 25 (let's say X = 50). So I'm shuffling the list of entries for a section and then picking 25 of them, i.e. a random 25 elements from the list every time you refresh the page.
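A minimal sketch of the selection logic described above - shuffle a candidate list and take 25 - in Python. The daily-seed option is my own suggestion, not something from the thread: seeding the generator with the date keeps the set stable between crawls on the same day, which reduces the churn discussed later in this thread. The `/city/...` URLs are hypothetical placeholders.

```python
import random

def pick_footer_links(all_links, n=25, seed=None):
    """Return n randomly chosen links for one footer section.

    With seed=None every call (page load) gets a fresh random set.
    Passing a deterministic seed (e.g. today's date) rotates the set
    daily instead of per request, so crawlers see less churn.
    """
    if len(all_links) <= n:
        return list(all_links)
    rng = random.Random(seed)          # independent generator; seed is optional
    return rng.sample(all_links, n)    # n distinct picks, no repeats

# Hypothetical example: X = 50 candidate links, 25 shown per section.
links = [f"/city/{i}" for i in range(50)]
daily = pick_footer_links(links, n=25, seed="2015-06-01")
print(len(daily))  # 25
```

The same seed always yields the same 25 links, so a crawler hitting the page twice in one day sees identical content - one way to soften the "different content on every fetch" concern raised below.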
**Benefits from an SEO perspective?** This should expose all the URLs to bots (over multiple crawls) and add a page-freshness element as well.
**What's the problem, if there is one?** I'm wondering how bots will treat this, since at any given time a bot might see us showing one set of content to bots and something else to users. Will a bot consider this cloaking (a black-hat technique)? Or will bots not treat it as black hat, since I'm refreshing the data on every single load, even when it's a bot hitting me twice in a row to check what I'm doing?
-
Thank you so much, Sir Alan. I really appreciate the effort you put into compiling this detailed response to my questions. I've noted down all the points, along with how I can handle them better, and will soon come up with a better fat footer.
-
Nitin
You're dealing with multiple considerations and multiple issues in this setup.
First, it's a matter of link distribution. When you link to X pages from page 1, you're telling search engines, "we think these are important destination pages." If you change those links every day, or on every refresh, and crawlers encounter those changes, that communication is strained.
This happens naturally on news sites - news changes on a regular basis - so it's not completely invalid or alien for search algorithms to see or deal with, and it's therefore unlikely their systems would consider it black hat.
The scale and frequency of the changes is the bigger concern, precisely because of that constantly shifting link-value distribution.
Either X cities are really "top" cities, or they are not.
Next, that link-value distribution is further weakened by the sheer volume of links: 25 links per section, three sections - that's 75 links. Add the links at the top of the page, the "scrolling" links in the main content area of the home page, and the actual footer links (black background), and link equity is diluted even further. (Think "spreading it too thin" across too many links.)
On category pages it's "only" 50 links across two sub-footer sections, yet the total number of links even on a category page is a concern.
And on category pages, all those links dilute the primary focus of any main category page. If a category page is "Cell Phone Accessories in Bangalore", then all of those links in the "Top Cities" section dilute the location. All the links in the "Trending Searches" section dilute the non-geo focus.
What we end up with here, then, is an attempt to "link to all the things." That is never a best-practice strategy.
Best-practice strategies require a refined experience across the board. Consistency of signals, combined with not overstraining link-equity distribution and a refined, non-diluted topical focus, is the best path to long-term success.
So, to return to my earlier point about news sites changing the actual links shown when new news comes along: the best news sites do that without constantly changing the primary categories featured, and without diluting the overwhelming majority of links on a single category page with lots of links to other categories. Consistency is critical.
So, where any one of these issues (or a handful of them) might not be a critical flaw on its own, their cumulative negative impact harms the site's ability to communicate a quality, consistent message.
The combined problem then needs to be recognized as exponentially more serious because of the scale at which you're doing this across the entire site.