Hiding content or links in responsive design
-
Hi,
I found a lot of information about responsive design and SEO, but it's mostly theory with no real experiments, and I'd like a clear answer if someone has actually tested this.
Google says:
Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons you sometimes need to hide content or links completely (not accessible at all to the visitor) at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none").
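For example, a typical media-query rule that does this (the class name here is just illustrative) looks like:

```css
/* Hide a sidebar element entirely on small screens */
@media (max-width: 480px) {
  .sidebar-promo {
    display: none; /* removed from rendering, but still present in the served HTML */
  }
}
```

Note that with this approach the same HTML is served to every device; only the CSS rendering changes, which is exactly the responsive setup Google describes.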
Is this counted as hidden content, and could it penalize your site?
What do you guys do when you create responsive design websites?
Thanks!
GaB
-
Hi,
Saijo and Bradley are right in saying that hiding elements on a smaller screen should not be an issue (as it's a correct implementation of responsive design). Bear in mind as well that there is a Googlebot and a Smartphone Googlebot, so as long as the Googlebot is seeing what desktop users see and the Smartphone Googlebot (which uses an iPhone5 user agent) is seeing what mobile users see, it shouldn't be a problem.
The only thing I would add:
If you are going to use display:none to prevent a user from seeing something when they view your site, it's good to include a 'view full site' or 'view desktop site' option. In that case, though, I would question whether you actually need that content on the desktop site at all, because best practice is to serve the same content regardless of device.
If it's hidden but still accessible to the mobile user (in a collapsible div for instance) there's no cloaking involved so it shouldn't cause a problem.
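A minimal sketch of that collapsible pattern (the markup and wording are illustrative, not from the original post) — the content stays in the page for every device and the visitor can expand it, so nothing different is served to Googlebot:

```html
<details>
  <summary>Read the full product description</summary>
  <p>The full text lives here in the HTML on every device;
     it is collapsed by default but one tap away for the visitor.</p>
</details>
```

The same effect can be achieved with a JavaScript toggle on a div for older browsers; the point is that the hidden text remains reachable by the user, not just by search engines.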
As a side note: the Vary HTTP header is really for a dynamically served website (that is, a single URL which checks user agent and then serves the desktop HTML to desktop devices and mobile HTML to mobile devices).
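For reference, on a dynamically serving site you would emit that header with every response — for instance in Apache, assuming mod_headers is enabled:

```apacheconf
# Signal to caches and crawlers that the response body
# differs depending on the requesting User-Agent
Header append Vary User-Agent
```

A purely responsive site doesn't need this, since the same HTML goes to every device.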
Hope that helps!
-
The way I see it:
Google does not have a problem with proper use of things like media queries (more info: https://developers.google.com/webmasters/smartphone-sites/details ). They ONLY have a problem when hidden text is made available solely to search engines for SERP manipulation.
Read more into the "Vary HTTP header" bit in the link above, and see this from Matt Cutts: http://www.youtube.com/watch?v=va6qtaiZRHg&feature=player_detailpage#t=219
-
I understand what you are referring to about having to hide certain elements on smaller screens. Sometimes not everything fits or flows correctly.
When this happens, however, I try to hide design elements as opposed to text or links. I'm also OK with hiding images. If a block of text or a link seems out of place or doesn't flow properly, I will build a dropdown for it. I'm sure you've seen mobile sites with dropdown navigation menus.
I wouldn't leave it up to Google to interpret what you are doing. Don't hide any links.