Is Content in Inline JavaScript and Collapsible Sections Considered Cloaking by Google?
-
Hi,
I would like to save space on my website and do not want my other products to be pushed down below the first fold.
To do that, I have decided to add content inside inline JavaScript or in collapsible sections. For the collapsible sections, I may use a "show/hide" button or a "read more" button to reveal the whole content.
So is content in JavaScript or collapsible sections considered hidden from Google? If it is, then I have to think of other options. Thanks.
-
So is content in JavaScript or collapsible sections considered hidden from Google?
No, that is fine.
Cloaking is specifically showing different content to Google than you would to other users.
Honest presentation of a "read more" link, or not openly displaying content until a trigger button is pressed, is fine. Deceptively hiding content behind a button that would normally not be discovered or pressed would be an issue.
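For the "show/hide" case, the safest pattern is to keep the full text in the HTML that every visitor (and Googlebot) receives, and only toggle its visual display. A minimal sketch, with illustrative element names:

```html
<!-- Native HTML collapsible: the full text is in the initial HTML,
     so the crawler sees exactly what users can expand. -->
<details>
  <summary>Read more about this product</summary>
  <p>
    The full description lives here in the page source. It is hidden
    visually until clicked, but it is still in the DOM to be indexed.
  </p>
</details>

<!-- Equivalent button-driven version; the id "more" is illustrative -->
<button type="button"
        onclick="document.getElementById('more').hidden = !document.getElementById('more').hidden">
  Show/Hide details
</button>
<div id="more" hidden>
  <p>More details, also present in the initial HTML.</p>
</div>
```

Both variants serve the same markup to users and crawlers; only the on-screen state differs, which is what keeps this on the right side of the cloaking line.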
Related Questions
-
Does Google Understand H2 As Subtitle?
I use some HTML5 tags on my custom template. I implement this HTML: <header class="entry-header-outer"> Flavour & Chidinma – 40 Yrs 40 Yrs by Flavour & Chidinma </header>. The h1 tag serves as the title, while the h2 tag serves as the subtitle of the post. Take a look at it here: https://xclusiveloaded.com/flavour-chidinma-40-yrs/ I want to know if it's OK or whether I should remove the h2 tag. Guys, what are your thoughts?
On-Page Optimization | Kingsmart
-
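For what it's worth, search engines treat h2 as an ordinary section heading rather than a semantic "subtitle"; the HTML spec's own suggestion for subtitles is a paragraph grouped with the heading. A sketch reusing the class name from the question, with illustrative text:

```html
<header class="entry-header-outer">
  <h1>Flavour &amp; Chidinma – 40 Yrs</h1>
  <!-- Subtitle as a paragraph, not a heading: this avoids giving
       the tagline the weight of a document-outline section. -->
  <p>40 Yrs by Flavour &amp; Chidinma</p>
</header>
```

Keeping the h2 is unlikely to hurt, but the pattern above is closer to what the spec recommends.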
Duplicate content from page links
So for the last month or so I have been going through fixing SEO content issues on our site. One of the biggest issues has been duplicate content with WHMCS. Some instances have been easy to fix and others have been a nightmare. Some of the duplicate content has been the login page shown when a page requires a login, for example knowledge base articles that are only viewable by clients. That was easily fixed, as I don't really need them locked down like that. However, I am unsure how to handle pages like affiliate.php and pwreset.php that are only linked off of a single page. Here are some pages that are being listed as duplicate:
https://www.bluerayconcepts.com/brcl...art.php?a=view
https://www.bluerayconcepts.com/brcl...php?a=checkout
Should this type of stuff be a 301 redirect to cart.php, or would that break something? I am guessing that everything should point back to cart.php. These are the ones that are really weird to me. They are showing as duplicate content, but pwreset is only a link off the KB category. It shows up as duplicate many times, as does affiliate.php:
https://www.bluerayconcepts.com/brcl...ebase/16/Email
https://www.bluerayconcepts.com/brcl...16/pwreset.php
Any help is welcome.
On-Page Optimization | blueray
-
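A common alternative to a blanket 301 for duplicate cart/login variants like these is a canonical tag on each variant pointing at the preferred URL, so the pages keep working for users while search engines consolidate them. A sketch only; the href is a hypothetical example, since the poster's full URLs are truncated above:

```html
<!-- Placed in the <head> of each duplicate variant. The target URL
     below is illustrative, not the site's actual preferred page. -->
<link rel="canonical" href="https://www.example.com/clients/cart.php" />
```

Unlike a 301, this does not break pages such as pwreset.php that users still need to reach directly.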
Delete or not delete outdated content
Hi there!
We run a website about a region in Italy, the Langhe area, where we write about wine, food and local culture, and provide tourist information. The website also sports a nice events calendar: in 4 years we (and our users) have loaded more than 5,700 events. Now we're starting to have trouble managing this database. The events database is huge, both in file size and in number of rows. There are a lot of images that eat up disk space, and it's becoming difficult to manage all the data in our backend. Also, a lot of users are entering the website by landing on outdated events. I was wondering if it could be a good idea to delete events older than 6 months: the idea is to keep only the most important and yearly recurring events (which we can update each year with fresh information) and trash everything else. This of course means that 404 errors will increase and that our content will get thinner, but at the same time we'll have a more manageable database, and the content will be more relevant and "clean". What do you think? Thank you 🙂 Best
On-Page Optimization | Enrico_Cassinelli
-
OMG! Does Google really consider text-decoration:none a hidden link?
So I was reading this article today https://www.mattcutts.com/blog/hidden-links/ Can setting a link to the same color as regular text and applying text-decoration:none really be considered a 'hidden link'?
On-Page Optimization | cbielich
-
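The pattern being asked about looks like the sketch below; whether it counts as "hidden" comes down to intent and context (a styled navigation menu is normal, while a keyword link blended invisibly into body text is what the post warns against). Colors and class names are illustrative:

```html
<style>
  /* Body text and the link render identically: same color, no underline */
  p        { color: #333; }
  p a.flat { color: #333; text-decoration: none; }
</style>
<p>
  Regular paragraph text with a
  <a class="flat" href="/some-page">link that looks like plain text</a>,
  which is the styling the Matt Cutts post discusses.
</p>
```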
What Should I Do With Low Quality Content?
As my site has definitely been hit by Panda, I am in the process of cleaning my website of low-quality content. Needless to say, shitty articles are being removed completely, but I think a lot of this content is now low quality because it is obsolete and dated. So what should I do with this content? Should I rewrite those articles as completely new posts and link from the old posts to the new ones? Or should I delete the old posts and do a 301 redirect to the new post? Or should I rewrite the content of these articles in place so I can keep the old URL and backlinks? One thing to note is that I've got a lot more followers than I used to, so publishing a new post gets a lot more views, likes, shares and whatnot from social networks.
On-Page Optimization | sbrault74
-
Logged In Only Content Made Available to Googlebot
Hi guys, On this page, http://www.jobiness.sg/changi-airport-group/work-reviews/id-18180200170/?page=2, I require my users to sign up to be able to view the content. I would like to make this content available to search engine crawlers. Are there any general guidelines on this type of optimization? Is it considered acceptable within Google's guidelines? From my research, there seem to be 3 ways to go about doing this: creating an account for the bots so that they are considered "logged-in users"; adding checks to my HTML to detect the HTTP user agent; Google First Click Free (haven't done much research into this yet).
On-Page Optimization | adminjob
-
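Worth noting: serving crawlers the full content while users hit a signup wall is exactly the scenario Google later formalized with paywalled-content structured data, which replaced First Click Free. It lets you serve the gated text to everyone and declare the gating explicitly so it is not read as cloaking. A sketch with hypothetical values and selector:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example gated review (hypothetical)",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled-section"
  }
}
</script>
<!-- The element matched by .paywalled-section holds the gated text,
     served identically to users and Googlebot; the markup declares
     that it is intentionally gated rather than cloaked. -->
```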
Duplicate content list by SEOMOZ
Hi Friends, I am seeing a lot of duplicates (about 10%) in the crawl report from SEOMOZ. The report says "Duplicate Page Content", but the URLs it lists have different titles, different URLs and also different content. I am not sure how to fix this issue. My site has both Indian cinema news and a photo gallery. The problem comes mainly from photo gallery posts. For example, this is the main URL of a post: apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos. But in this post, each image is a link to its enlarged image (default WordPress behavior). The problem is coming from each individual image within this post. Examples from the SEOMOZ report, 3 individual URLs flagged as duplicate content from the same post above:
http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-4
http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-3
http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-2
Somebody please advise me. Appreciate your help.
On-Page Optimization | ksnath
-
Static content vs. dynamically changing content: which is best?
We have collected a lot of reviews and we want to use them on our category pages. We are going to be updating the top 6 reviews per category every 4 days. There will be another page where all of the reviews can be seen. Is there any advantage to keeping the reviews static for 1 or 2 weeks vs. having unique new ones pulled from the database every time the page is refreshed? We know there is an advantage to keeping them on the page forever for long-tail traffic; however, we have created a new page with all of the reviews that visitors can go to.
On-Page Optimization | DoRM