How much of an issue is JS?
-
Hey folks,
So, I've got two pages. Page A has a lot more content, but it's in a tabbed format that relies on JavaScript, and its title tag is a synonym for our keyword rather than the actual keyword. Page B has less content, but its title tag is the exact keyword phrase we want to rank for. Page A also has the bigger backlink profile (though not an enormous one by any means).
Page A ranks 30th. Page B ranks 7th.
Importance of Title tag? Importance of JS? Both?
Discuss!
Cheers,
Rhys
-
Hi SwanseaMedicine,
Have a read of this hidden content experiment by Reboot Online: https://www.rebootonline.com/blog/hidden-text-experiment/
It was a very well-run experiment and, in summary, they found that visible content outperformed hidden content.
However, this should change once Google's mobile-first index rolls out (sometime in 2018?), at which point hidden content will be given full weight (source).
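To make the distinction concrete, here's a rough sketch (my own illustration with made-up markup, not from the Reboot experiment): content that is pre-rendered but CSS-hidden is still present in the HTML a crawler downloads, whereas content only fetched by JS after a click is not.

```javascript
// What a crawler sees in the initial HTML download.
// Case 1: tab content is pre-rendered but hidden with CSS --
// it's in the source, so it can be indexed (though it may carry
// less weight while hidden, per the experiment above).
const preRenderedPage = `
  <div class="tab" style="display:none">Full spec table for keyword...</div>
`;

// Case 2: tab content is fetched with JS on click -- the initial
// HTML contains only an empty container, so a crawler that doesn't
// execute JS sees nothing.
const jsLoadedPage = `
  <div class="tab" id="specs"></div>
  <script>/* fetch panel content on click ... */</script>
`;

// Naive "crawler view": strip scripts and tags, keep the text.
const textOf = (html) =>
  html.replace(/<script[\s\S]*?<\/script>/g, '')
      .replace(/<[^>]+>/g, ' ');

console.log(textOf(preRenderedPage).includes('Full spec table')); // true
console.log(textOf(jsLoadedPage).includes('Full spec table'));    // false
```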
Cheers,
David
-
Google is all about serving the best experience with the best content. When you put tabbed content on a page, especially if the tabs cover multiple topics, you are watering that page down. And because a portion of the content starts out hidden, it's a worse user experience: visitors can't get straight to your content, because they have to click first.
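If you do want tabs, one way to avoid hiding content from crawlers (a minimal sketch with hypothetical names, not a reference implementation) is to render every panel server-side so all the text ships in the initial HTML, and let JS do nothing more than toggle a visibility class:

```javascript
// Sketch: render all tab panels into the initial HTML; JS/CSS only
// toggle which one is shown. A crawler downloading the source sees
// every panel's text, hidden or not.
function renderTabs(panels) {
  return panels.map((p, i) =>
    `<section class="tab-panel${i === 0 ? '' : ' is-hidden'}">` +
    `<h2>${p.title}</h2>${p.body}</section>`
  ).join('\n');
}

const html = renderTabs([
  { title: 'Overview', body: '<p>Short intro copy.</p>' },
  { title: 'Specifications', body: '<p>Full spec table...</p>' },
]);

// All panel text is in the source, even though a stylesheet rule like
// .is-hidden { display: none } hides every panel but the first.
console.log(html.includes('Full spec table')); // true
```

Whether hidden panels get full weight is a separate question, but at least Google can fetch the content without executing your JS.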
-
Hey Michael,
Thanks for your response. The question is, I suppose, why doesn't it rank as well? Does Google not value the content as highly, or does it struggle to fetch and render it because it's tabbed? That seems to me to be the biggest factor in the difference between the two pages.
Cheers,
Rhys
-
Without knowing more, I would guess the issue is that tabbed content does not perform as well as content that is always displayed on the page. Always look to your content first, then worry about things like title tags.