What's the best practice for implementing a "content disclaimer" that doesn't block search robots?
-
Our client needs a content disclaimer on their site: a simple "If you agree to these rules, click YES; if not, click NO" prompt, where clicking NO pushes you back to the home page.
I have a gut feeling this may cause problems with the search robots.
Any advice?
R/
John
-
Hi John. I've seen some websites that use a simple box that is "lightboxed" on top of the content. When you click Yes, the lightbox disappears and the content underneath is shown as normal. Since the content is present in the page the whole time, this looks like a perfectly normal website to a search engine.
However, if your "click Yes or click No" prompt sends the end-user to the actual content on a separate page ONLY AFTER they click Yes, then this would be a huge issue with search engines, because a crawler never clicks Yes.
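The overlay approach can be sketched in a few lines of client-side logic. This is a minimal, hypothetical sketch (the function and flag names are mine, not from any library): the page content is always in the HTML, and only the overlay's visibility and the visitor's stored choice are toggled.

```javascript
// Sketch of an overlay-style disclaimer that never hides content from
// crawlers: the content ships in the HTML, and only the overlay is
// toggled client-side. Names here are hypothetical, not a real API.

// Show the overlay unless the visitor has already accepted.
function shouldShowDisclaimer(storage) {
  return storage["disclaimerAccepted"] !== "yes";
}

// Record the visitor's choice; clicking No sends them to the home page.
function handleChoice(storage, choice) {
  if (choice === "yes") {
    storage["disclaimerAccepted"] = "yes";
    return { showOverlay: false, redirectTo: null };
  }
  return { showOverlay: true, redirectTo: "/" };
}

// Example run (a plain object stands in for localStorage/cookies):
const storage = {};
console.log(shouldShowDisclaimer(storage)); // true: first visit, show overlay
const result = handleChoice(storage, "yes");
console.log(result.showOverlay, shouldShowDisclaimer(storage)); // false false
```

In a real page you'd wire `handleChoice` to the Yes/No buttons and back `storage` with `localStorage` or a cookie; the key point is that nothing here removes content from the served HTML, so crawlers see the full page.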
I'd recommend using the "User Agent Switcher" extension in Firefox to view your site as Googlebot. This should tell you whether it's seeing the entire site or just a portion of it:
https://addons.mozilla.org/en-US/firefox/addon/user-agent-switcher/
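You can run the same check from the command line with curl. The URL below is a placeholder; swap in one of your own pages and compare the returned HTML against what a normal browser gets.

```shell
# Googlebot's documented desktop user-agent string.
GOOGLEBOT_UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Fetch a page as Googlebot would see it (placeholder URL, uncomment to use):
# curl -s -A "$GOOGLEBOT_UA" "https://www.example.com/your-page"

echo "$GOOGLEBOT_UA"
```

If the Googlebot fetch is missing content that the browser version shows, the disclaimer is blocking crawlers.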