Is using JavaScript injected text in line with best practice on making blocks of text non-crawlable?
-
I have an ecommerce website that has common text on all the product pages, e.g. delivery and returns information.
Is it ok to use non-crawlable JavaScript injected text as a method to make this content invisible to search engines? Or is this method frowned upon by Google?
By way of background: I'm concerned about duplicate/thin content, so I want to tackle this by reducing this 'common text' as well as boosting unique content on these pages.
Any advice would be much appreciated.
-
I haven't found standard shipping-info text to be much of a problem in terms of duplicate content on product pages, but if you're concerned about it, I would consider an iframe.
While hiding the content in a non-crawlable .js file might work, it isn't advisable. First, Google's preview bot can execute JavaScript and see the content in many cases, as it must in order to generate the site preview and visual cache of a page. Second, anything that keeps Google from properly rendering the page may be considered suspect.
Please view this video by Matt Cutts: http://www.youtube.com/watch?v=B9BWbruCiDc
Essentially, he says: "Don't block Googlebot from CSS or JavaScript."
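For reference, the kind of client-side injection the question describes would look something like this. It's a minimal sketch: the element id and the idea of serving the common copy from a robots.txt-blocked script are hypothetical, and as noted above, Google may well execute the script anyway.

```javascript
// Sketch of JavaScript-injected common text. The delivery/returns copy
// lives in a script file that robots.txt disallows, so it never appears
// in the static HTML. Element id and markup here are hypothetical.
function injectCommonText(doc, targetId, html) {
  var target = doc.getElementById(targetId);
  if (target) {
    target.innerHTML = html; // content exists only after script execution
  }
  return target; // null when the placeholder element is missing
}

// In the product page, something like:
//   <div id="delivery-info"></div>
//   <script src="/js/common-text.js"></script>   (disallowed in robots.txt)
// where common-text.js calls:
//   injectCommonText(document, 'delivery-info', '<p>Free delivery over $50</p>');
```

Keep in mind this only hides the text from crawlers that don't run JavaScript; per the answer above, it does not reliably hide it from Google and may hurt how Google renders the page.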
Related Questions
-
Hidden text and mobile indexing
Hello, I believe mobile-first indexing is now in place. Since then, does Google give the same value to content that is hidden behind a tab (for example, a question where you need to click on the + to see the answer) as content that is directly visible? Thank you,
Technical SEO | seoanalytics -
Using hreflang tags properly.
On my site "example.com" I have set up the following in the header: The problem is that the tags are universal across the site, so every page has these same tags, leading obviously to no-return-tag errors. I.e., the page www.example.ca/testing.html still has those tags, not tags with "testing.html" in them. How bad is this? Does it matter?
Technical SEO | absoauto -
How can I best handle parameters?
Thank you for your help in advance! I've read a ton of posts on this forum on this subject, and while they've been super helpful, I still don't feel entirely confident in what the right approach is. Forgive my very obvious noob questions; I'm still learning!

The problem: I am launching a site (coursereport.com) which will feature a directory of schools. The URL for the schools directory will be coursereport.com/schools. The directory can be filtered by a handful of fields listed here:
Focus (ex: "Data Science")
Cost (ex: "$<5000")
City (ex: "Chicago")
State/Province (ex: "Illinois")
Country (ex: "Canada")

When a filter is applied to the directory page, the CMS produces a new page with URLs like these:
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork

My questions:
1) Is the above parameter-based approach appropriate? I've seen other directory sites take a different approach that would transform my examples into more "normal" URLs:
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
VERSUS
coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all)

2) Assuming I use either approach above, isn't it likely that I will have duplicate-content issues? Each filter does change on-page content, but there could be instances where two different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions. For example, would I just take all of the /schools?focus=X combinations and call those the canonical versions within any filtered page that contains other additional parameters like cost or city? Should I be changing page titles for the unique filtered URLs?

I read through a few Google resources to try to better understand how to configure URL parameters via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about using rel=canonical?
https://support.google.com/webmasters/answer/1235687

An assortment of the other stuff I've read, for reference:
http://www.wordtracker.com/academy/seo-clean-urls
http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail
http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/
http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
Technical SEO | alovallo -
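One concrete way to reason about the canonical question above is to express the policy as code: pick which filter parameters define a "preferred" page and drop the rest when building the rel=canonical URL. This is only a sketch of one possible policy (treating "focus" as the sole canonical-defining parameter is an assumption taken from the question, not guidance from Google):

```javascript
// Build a canonical URL for a filtered directory page by keeping only
// the primary filter ("focus", per the question's example) and dropping
// cost/city/state/country, so every filtered variation points at one
// preferred version. The policy itself is an assumption for illustration.
function canonicalFor(basePath, params) {
  var primary = ['focus']; // parameters allowed in the canonical URL
  var kept = [];
  primary.forEach(function (key) {
    if (params[key]) {
      kept.push(key + '=' + encodeURIComponent(params[key]));
    }
  });
  return kept.length ? basePath + '?' + kept.join('&') : basePath;
}

// canonicalFor('/schools', { focus: 'datascience', cost: '$<5000', city: 'chicago' })
//   -> '/schools?focus=datascience'
// canonicalFor('/schools', { cost: '$>5000', city: 'buffalo', state: 'newyork' })
//   -> '/schools'
```

The design choice this encodes: every cost/city/state combination canonicalizes up to its focus page (or the bare directory), which avoids the identical-content case the question describes at the price of telling Google not to rank the narrower filtered pages.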
Internal anchor text links
If your site has, say, 50 pages, and you have an anchor text link from the home page to that page, what should you do in response to last Friday? I had 60 keywords in the top 10, and now they are all in the top 30 at best. PageRank is still 5s and 6s on all of these pages... No problem on this site until last Friday!
Technical SEO | jdcline -
Redirect non-www if using canonical url?
I have set up my website to use canonical URLs on each page, pointing to the page I wish Google to refer to. At the moment, my non-www domain name is not redirected to the www domain. Is this required if I have set up the canonical URLs? This is the tag I have on my index.php page: rel="canonical" href="http://www.mydomain.com.au" /> If I browse to http://mydomain.com.au, should the link juice pass to http://www.armourbackups.com.au? Will this solve duplicate content problems? Thanks
Technical SEO | blakadz -
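The redirect the question above is missing can be sketched as a small piece of server logic: a canonical tag is a hint, but a 301 from the bare domain to www removes the duplicate host outright. This is a minimal sketch using the question's own hostnames; how you wire it into your server (.htaccess, middleware, etc.) is up to your stack:

```javascript
// Decide whether a request needs a 301 to the canonical www host.
// Returns the Location header value to redirect to, or null when the
// request is already on the canonical host. Hostnames come from the
// question; the http scheme matches its examples.
function canonicalLocation(host, path) {
  if (host === 'mydomain.com.au') {
    return 'http://www.' + host + path; // respond with 301 and this Location
  }
  return null; // already canonical, serve the page normally
}

// canonicalLocation('mydomain.com.au', '/page.html')
//   -> 'http://www.mydomain.com.au/page.html'
// canonicalLocation('www.mydomain.com.au', '/page.html')
//   -> null
```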
Best XML Sitemap generator
Do you guys have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying. I am using a Mac, so I would prefer an online or Mac version.
Technical SEO | kevin4803 -
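If none of the off-the-shelf generators fits, a basic sitemap is simple enough to emit directly. The sketch below (plain JavaScript, assumed to run under Node with your own URL list) produces the minimal sitemaps.org format; real generators add optional fields like lastmod and changefreq:

```javascript
// Emit a minimal XML sitemap per the sitemaps.org protocol from a list
// of absolute URLs. Only the required <loc> element is included here.
function buildSitemap(urls) {
  var entries = urls.map(function (u) {
    return '  <url><loc>' + u + '</loc></url>';
  });
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries.join('\n') + '\n' +
    '</urlset>';
}

// buildSitemap(['http://www.example.com/', 'http://www.example.com/about'])
// returns an XML string ready to save as sitemap.xml
```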
Is Adobe Acrobat the best for making PDF documents in terms of seo and price?
As we add PDF documents to our website, I want to take it up a notch. In terms of SEO and software price, is Adobe Acrobat the only choice? Thanks! No Mac here. I should clarify that I can convert files to PDFs with Microsoft Word and add some basic info for the search engines, such as title, keywords, author, and links. This article inspired me: www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search. I can add links back to the page when I create the PDF, but we also have specific product PDFs that suppliers let us copy and serve from our server (why use their bandwidth?). Much as you would stamp your name on a hard-copy brochure the vendor supplies, I want to add a link to our page from those PDFs. That makes me think I should ask our supplier to give me a version with a link to our page. Then there is the question: is that OK to do? In the meantime, I will check TriviaChicken's suggestions and dream about a Mac, Allan. Thanks
Technical SEO | zharriet -
Internal anchor text
I designed my website with the "one keyword, one page" adage in mind. I'm wondering if I am creating an issue with internal anchor text and the use of plurals for keywords.

For instance, say I want my index page to rank for the keyword "exotic vacations" and an inner page to rank for "exotic vacation". I do this because I notice a major discrepancy in Google between the singular and plural forms of certain keywords (like the example above). Yahoo seems to treat singular and plural as essentially the same word, but Google appears to rank them separately. Since Google is where the majority of my search traffic comes from, I separated my most competitive keywords into singular and plural forms and created external links with anchor text that reflects this separation.

I am concerned, though, that I may not be handling the internal anchor text properly. What I have done is take a keyword I want to rank for (for example, "exotic vacations"), attach it to a page (for example, the index page), use the anchor text "exotic vacations" on that page, and link it to the inner page targeting "exotic vacation". Reason: I want to rank for the term "exotic vacations" on the main page, but have a relevant page to link that term to, and the closest would be the keyword "exotic vacation" on an inner page.

I would appreciate any feedback on this. I think I am running into a problem with this strategy, especially with the main index page/inner page keywords (plural to singular). I also notice Google will find an inner page for a time, then switch to the default domain-name index page when searching for a keyword; it keeps going back and forth. I never see any indented search results.
Technical SEO | oxygenretreat