Would using JavaScript onclick functions to override the href target be OK?
-
Hi all,
I am currently working on a new search facility for my ecommerce site... it has very quickly dawned on me that this new facility is far better than my standard product pages from a user point of view, i.e. lots of product attributes so customers can find what they need faster, the ability to compare products, etc. All in all, just better. BUT NO SEO VALUE!!!
I want to use this search facility instead of my category/product pages... however, as they are search pages, I have robots "noindex" on them and don't think it's wise to change that...
I have spoken to the developers of this software and they suggested I could use some JavaScript in the navigation to change the onclick function to take the user to the search equivalent of the page...
They said that this way my normal pages are still the ones indexed by Google etc., but the user gets the benefit of the improved search pages...
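From what they described, I believe the navigation links would end up looking something like this (an illustrative sketch only; the URLs are made up):

<!-- The href points at the normal, indexable category page, -->
<!-- but the onclick sends real visitors to the search version instead. -->
<a href="/category/widgets.html"
   onclick="window.location='/search.html?category=widgets'; return false;">
  Widgets
</a>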
This sounds perfect, however it also sounds a little deceptive... and I know Google has lots of rules about these kinds of things. The last thing I want is to get any kind of penalty or negative reaction from an SEO point of view... I am only considering this because it will improve the user experience on my website...
Can anyone advise whether this is OK, or a "no-no"?
P.S. For those wondering, I use an "off the shelf" cart system and it would cost me an arm and a leg to have these features built into my actual category/product pages.
-
Hello James,
Why do these pages have "no SEO value"? Is it because they are AJAX pages or because you have them noindexed? Or both?
To answer your original question, using an onclick JavaScript event to send a user to a page other than the URL listed in the href attribute is borderline. It goes beyond the risk level I would feel comfortable with on an eCommerce site, but a lot of affiliate sites do this. For instance, all of their links out to merchant sites may go through a directory called /outlink/, so the href might look like .../outlink/link1234 and appear to send the user to another page on their domain, when actually the user gets redirected to the merchant's (e.g. Amazon.com, Best Buy...) website. Sometimes the user is redirected from the /outlink/... URL, and sometimes they never even get that far because the JavaScript sends them to the merchant's URL first.
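A rough sketch of that affiliate pattern (the /outlink/ path and merchant URL here are hypothetical):

<!-- The href shows an internal redirect URL. When JavaScript runs, the -->
<!-- onclick sends the user straight to the merchant; when it doesn't, -->
<!-- the server 301-redirects /outlink/link1234 to the same merchant page. -->
<a href="/outlink/link1234"
   onclick="window.location='http://www.merchant-example.com/product/1234'; return false;">
  Check the price at the merchant
</a>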
It is not cloaking unless you are specifically treating Google differently. If Google doesn't understand your site, that is their problem. If you have code that essentially says "IF Google THEN do this, ELSE do that," it is your problem, because you are cloaking. Make sense? There is a very distinct line there.
The bottom line is, if you want to show users a certain page, then you should be showing that page to Google as well. If the problem is that the content on that page doesn't appear for Google (e.g. AJAX), then you should look into optimizing that type of content to the best of your ability. For example, look into the use of hashbangs (#!), as in:
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
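Under that AJAX crawling scheme, a hashbang URL maps onto an "_escaped_fragment_" URL that Googlebot requests instead, and your server is expected to return an HTML snapshot of the AJAX-rendered content at that address. Roughly like this (example.com and the parameter are placeholders):

What the user sees:      http://www.example.com/search#!category=widgets
What Googlebot fetches:  http://www.example.com/search?_escaped_fragment_=category=widgets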
-
1. Google understands simple JS that is inline with your HTML, so Google understands that an inline onclick link (see the sketch after this list) is a link to domain.com.
2. You can obfuscate this further and Google might not understand it. I've not seen Google try to parse or execute JS, but that doesn't mean they can't or won't in the future.
3. Google is very unlikely to spider AJAX. Many AJAX pages don't return any user-readable content (most of mine return things like JSON, which is not for end-user consumption) and, as such, are beyond the scope of indexation. Again, as in #2, you might want this content to be shown elsewhere if you want it indexed. https://developers.google.com/webmasters/ajax-crawling/
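For instance, something along these lines (domain.com stands in for any real address):

<!-- Google can read the domain.com URL straight out of this simple inline handler -->
<a href="#" onclick="window.location='http://domain.com/';">domain.com</a>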
-
OK, I am not keen on this approach. The developers have offered an alternative... but again, I'm not sure about it. They have said they can use AJAX to force their search results / navigation over my current navigation / products on the category / product pages...
This gets rid of having to use JavaScript to send users to a different URL... but above, Alan mentions cloaking, which to my understanding is basically serving a search engine anything different from what a person sees... and that's what this will do... it serves up a different navigation to people... and the products could be listed in a different order, etc... search engines do not see the AJAX...
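As I understand their suggestion, it would be something along these lines (a rough sketch; the endpoint and element ID are my guesses):

<script>
// After the static category page loads, fetch the search version
// of the navigation and swap it in over the static markup.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/search-nav.html?category=widgets');
xhr.onload = function () {
  document.getElementById('navigation').innerHTML = xhr.responseText;
};
xhr.send();
</script>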
Is this any better, or just as negative?
-
Are they identical? You say "the search equivalent"; I just wouldn't treat search engines any differently.
-
Even though the content is identical?
It is only the way that content can then be navigated that is different...
-
Well then, yes, I would be concerned: you are serving up different content to users, and that is cloaking.
-
Hi Alan,
I think I may have explained this incorrectly: my search page does have the meta tag noindex,follow, and it also has a canonical link back to the main search page (i.e. search.html), so I do not think any of the search results will be indexed. So my concern is not duplicate content; that should not happen...
My concern is the fact that I am using JavaScript to literally divert customers from one page to another... it's almost like the static pages are there only for the benefit of Google... and that's concerning me...
-
Google can follow JavaScript links, unless you are very good at hiding them.
I would not worry too much about the duplicate content. Don't expect the duplicates to rank, but you're not likely to be penalized for them. You can use a canonical tag to point all search results back to the one page.
I would not noindex any pages outright: any links pointing to a noindexed page are pouring their link juice away. If you do want to noindex a page, use the meta tag noindex,follow; this way the search engine will still follow the links on that page, and the link juice can flow back out to your site.
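For the search page you described, that combination would look something like this in the head of the page (example.com and search.html are placeholders):

<!-- Keep the search results out of the index, but let crawlers follow the links -->
<meta name="robots" content="noindex,follow">
<!-- Consolidate all search-result variations onto the main search page -->
<link rel="canonical" href="http://www.example.com/search.html">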
Read about PageRank and how link juice flows:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank