Technical SEO question re: java
-
Hi,
I have an SEO question that came my way, but it's a bit too technical for me to handle. Our entire ecom site is in java, which apparently writes to a page after it has loaded and is not SEO-friendly.
I was presented with a workaround that would basically consist of us pre-rendering an HTML page for search engines and leaving the java page for the customer. It sounds like G's definition of "cloaking" to me, but I wanted to know if anyone has any other ideas or workarounds (if there are any) on how we can make the java-based site more SEO-friendly.
Any thoughts/comments you have would be much appreciated. Thanks!!
-
Oooh no thank you - I'm not a big risk-taker when it comes to SEO. he-he. Thanks again for your help!
-
With the AJAX crawlability guide implementation, Google knows they're requesting a different page than the one being shown to users, so it's not quite the same as cloaking. That being said, you could go black hat and return a completely different page, but Google has their ways of finding these things out.
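To make that concrete, here's a minimal sketch (in Python) of the server-side decision the scheme implies: when a request carries the `_escaped_fragment_` query parameter (the form Googlebot uses under the AJAX crawling scheme), serve the pre-rendered HTML snapshot; otherwise serve the normal JavaScript-driven page. The `snapshots` mapping and the `"serve-js-app"` sentinel are illustrative placeholders, not part of any real framework.

```python
from urllib.parse import urlparse, parse_qs

def choose_response(url, snapshots):
    """Decide what to serve for a request under the AJAX crawling scheme.

    If the URL contains _escaped_fragment_, the request came from a
    crawler, so return the matching pre-rendered HTML snapshot.
    Otherwise (or if no snapshot exists), signal that the regular
    JavaScript page should be served.
    """
    params = parse_qs(urlparse(url).query)
    if "_escaped_fragment_" in params:
        fragment = params["_escaped_fragment_"][0]
        # Fall back to the JS app if no snapshot was pre-rendered.
        return snapshots.get(fragment, "serve-js-app")
    return "serve-js-app"
```

Because the crawler's request is visibly different (it includes `_escaped_fragment_`), Google knows it is being served a snapshot, which is why this isn't treated as cloaking.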
-
Hi John, One more question for you if you don't mind... Creating an HTML snapshot (as noted in the AJAX link) is different from "serving up an HTML page" as described here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 ? Is that true?
-
I thought you were talking about "Java". Needless to say, JavaScript can cause all sorts of issues with SEO.
-
Hi John,
You're right, I meant JavaScript. Thank you so much for the response. This definitely helps!
-
Hi, Thanks for the reply. To begin, if you turn off JavaScript, you can't see any of the content on our pages: no text, navigation, etc. I'm trying to figure out how to make the content displayable without having to re-do the entire system (which isn't feasible).
Does that make sense? The site is improvementscatalog.com if you want to see it. We're in the process of building content for it, but we just recently switched platforms and these new issues popped up.
-
I think you mean JavaScript and not Java. What you're suggesting is what Google recommends in their AJAX crawling guide here http://code.google.com/web/ajaxcrawling/. They want you to create a static HTML page to serve to Googlebot instead of your regular page.
Google is getting better at crawling JavaScript content that's loaded asynchronously, so you might want to dedicate your resources elsewhere. On one of my sites, Google is indexing text that's loaded asynchronously (Bing isn't yet), and Matt Cutts has said that Google is crawling some comments that are loaded asynchronously, like Facebook comments (see http://www.searchenginejournal.com/google-indexing-facebook-comments/35594/)
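For anyone curious about the mechanics of that guide: a "pretty" URL like `www.example.com/page#!key=value` is requested by the crawler as `www.example.com/page?_escaped_fragment_=key=value`, with the fragment URL-encoded. A rough sketch of the two-way mapping in Python; the example URLs are just placeholders:

```python
from urllib.parse import quote, unquote

def to_escaped_fragment(pretty_url):
    """Convert a #! "pretty" URL into the _escaped_fragment_ URL
    that the crawler requests under the AJAX crawling scheme."""
    if "#!" not in pretty_url:
        return pretty_url  # not an AJAX-crawlable URL; leave untouched
    base, fragment = pretty_url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"

def from_escaped_fragment(crawler_url):
    """Reverse mapping: recover the #! URL from the crawler's request."""
    marker = "_escaped_fragment_="
    if marker not in crawler_url:
        return crawler_url
    base, fragment = crawler_url.split(marker, 1)
    base = base.rstrip("?&")
    return f"{base}#!{unquote(fragment)}"
```

The server recognizes the `_escaped_fragment_` form, serves the static HTML snapshot for it, and users never see those URLs.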
-
Yeah, I agree the workaround sounds like it may be interpreted as black-hat cloaking and get you in trouble.
Can you explain further how your application works and why it's not SEO-friendly?
Cheers