Would using JavaScript onclick functions to override the href target be OK?
-
Hi all,
I am currently working on a new search facility for my ecommerce site... it has very quickly dawned on me that this new facility is far better than my standard product pages from a user's point of view, i.e. lots of product attributes to help customers find what they need faster, the ability to compare products, etc. All in all, just better. BUT NO SEO VALUE!!!
I want to use this search facility instead of my category/product pages... however, as they are search pages, I have robots "noindex" on them and don't think it's wise to change that...
I have spoken to the developers of this software and they suggested I could use some JavaScript in the navigation to change the onclick function to take the user to the search equivalent of the page...
They said this way my normal pages are still the ones indexed by Google etc., but the user gets the benefit of using the improved search pages...
This sounds perfect, but it also sounds a little deceptive... and I know Google has loads of rules about these kinds of things. The last thing I want is any kind of penalty or negative reaction from an SEO point of view... I am only considering this because it will improve the user experience on my website...
Can anyone advise if this is OK, or a "no-no"?
P.S. For those wondering, I use an "off the shelf" cart system and it would cost me an arm and a leg to have these features built into my actual category/product pages.
-
Hello James,
Why do these pages have "no SEO value"? Is it because they are AJAX pages or because you have them noindexed? Or both?
To answer your original question: using an onclick JavaScript event to send a user to a page other than the URL in the href attribute is borderline. It goes beyond the risk level I would feel comfortable with on an eCommerce site, but a lot of affiliate sites do this. For instance, all of their links out to merchant sites may go through a directory called /outlink/, so the href might look like .../outlink/link1234 and appear to send the user to another page on their domain, when actually the user gets redirected to the merchant's (e.g. Amazon.com, Best Buy...) website. Sometimes the user is redirected from the /outlink/... URL, and sometimes they never even get that far because the JavaScript sends them to the merchant's URL first.
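To make the pattern concrete, a link like the one described — an href pointing at an internal /outlink/ URL while an onclick handler sends the browser somewhere else — might look something like this (all URLs are illustrative, not from any real site):

```html
<!-- The href suggests an internal page; the onclick handler overrides it. -->
<!-- Illustrative sketch of the affiliate pattern described above. -->
<a href="/outlink/link1234"
   onclick="window.location = 'http://www.merchant-example.com/product/1234'; return false;">
  View this product at the merchant
</a>
```

The `return false;` is what stops the browser from following the href, so the user only ever sees the merchant URL.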
It is not cloaking unless you are specifically treating Google differently. If Google doesn't understand your site, that is Google's problem. If you have code that essentially says "IF Google, THEN do this, ELSE do that," it is your problem, because you are cloaking. Make sense? There is a very distinct line there.
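That line can be sketched in code. Below is a hedged illustration (hypothetical server-side logic; the function and file names are made up) of the user-agent branching that crosses into cloaking, next to the safe alternative:

```javascript
// WARNING: the first function is an example of what NOT to do.
// Branching on the crawler's user agent to serve different content is cloaking.
function chooseContent(userAgent) {
  if (/Googlebot/i.test(userAgent)) {
    return 'static-product-page.html'; // shown only to Google -- cloaking
  }
  return 'ajax-search-page.html';      // shown to everyone else
}

// A non-cloaking setup returns the same page to every visitor, crawler or not:
function chooseContentSafely(userAgent) {
  return 'ajax-search-page.html';      // identical for users and search engines
}
```

If Google happens to handle that page poorly, that is a technical limitation, not deception — the distinction the reply above is drawing.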
The bottom line is: if you want to show users a certain page, then you should be showing that page to Google as well. If the problem is that the content on that page doesn't appear for Google (e.g. AJAX), then you should look into optimizing that type of content to the best of your ability. For example, look into the use of hashbangs (#!) as described in:
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
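Under that AJAX-crawling scheme, a "pretty" hashbang URL is rewritten by the crawler into an "ugly" `_escaped_fragment_` URL, which your server must answer with a static HTML snapshot of the AJAX page. A rough sketch of the URL mapping (illustrative URLs; the real scheme percent-escapes only certain characters, while `encodeURIComponent` is used here for simplicity):

```javascript
// Sketch of the hashbang -> _escaped_fragment_ rewrite from the scheme above.
// Given a URL like http://example.com/search#!q=shoes the crawler requests
// http://example.com/search?_escaped_fragment_=q%3Dshoes instead.
function toEscapedFragmentUrl(prettyUrl) {
  const i = prettyUrl.indexOf('#!');
  if (i === -1) return prettyUrl;              // no hashbang: nothing to rewrite
  const base = prettyUrl.slice(0, i);          // everything before the #!
  const fragment = prettyUrl.slice(i + 2);     // everything after the #!
  const sep = base.includes('?') ? '&' : '?';  // append to any existing query string
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

The point of the scheme is that the snapshot served at the `_escaped_fragment_` URL contains the same content users see after the AJAX runs — same content to users and Google, so no cloaking.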
-
1. Google understands simple JS that is inline with your HTML. So Google understands that something like `<a href="#" onclick="window.location='http://domain.com'">` is a link to domain.com. You can obfuscate this further and Google might not understand it.
2. I've not seen Google try to parse or execute JS, but that doesn't mean they can't or won't in the future.
3. Google is very unlikely to spider AJAX. Many AJAX pages don't return any user-readable content (most of mine return things like JSON, which is not for end-user consumption) and, as such, are beyond the scope of indexation. Again, as in #2, you might want this content to be shown elsewhere if you want it indexed. https://developers.google.com/webmasters/ajax-crawling/
-
OK, I am not keen on this approach. The developers have offered an alternative... but again, I'm not sure about it. They have said they can use AJAX to force their search results/navigation over my current navigation/products on my category/product pages...
This gets rid of having to use JavaScript to send users to a different URL... but above, Alan mentions cloaking, which to my understanding is basically serving anything different to a search engine vs. a person... and that's what this will do... it serves up a different navigation to people... and the products could be listed in a different order, etc... search engines do not see the AJAX...
Is this any better? Or just as negative?
-
Are they identical? You say "the search equivalent"; I just wouldn't treat search engines any differently.
-
Even though the content is identical?
It is only the way that content can then be navigated that is different...
-
Well then, yes, I would be concerned. You are serving up different content to users; that is cloaking.
-
Hi Alan,
I think I may have explained incorrectly: my search page does have the meta tag noindex,follow, and it also has a canonical link back to the main search page (i.e. search.html), so I do not think any of the search results will be indexed. So my concern is not duplicate content; this should not happen...
My concern is the fact that I am using JavaScript to literally divert customers from one page to another... it's almost like the static pages are there only for the benefit of Google... and that's what is concerning me...
-
Google can follow JavaScript links unless you are very good at hiding them.
I would not worry too much about the duplicate content; don't expect the duplicates to rank, but you're not likely to be penalized for them. You can use a canonical tag to point all search results back to the one page.
I would not plain-noindex any pages: any links pointing to a noindexed page are pouring their link juice away. If you want to noindex a page, use the meta tag noindex,follow; this way the search engine will still follow the links, and the link juice flows back out to your site.
Read about PageRank and how link juice flows:
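For reference, the two tags being discussed would look something like this in the search page's head (the domain and search.html are placeholders for your own URLs):

```html
<head>
  <!-- Keep this search page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex,follow">
  <!-- Point duplicate search-result URLs back at the one main search page -->
  <link rel="canonical" href="http://www.example.com/search.html">
</head>
```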
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank