How to make AJAX content crawlable from a specific section of a webpage?
-
The content is located in a specific section of the webpage and is loaded via AJAX.
-
Thanks Paddy! We'll definitely try these solutions.
-
Hi there,
There are plenty of really good resources online that cover this area, so I'd like to point you towards them rather than copy and paste their guidelines here!
Google has a good guide here with lots of visuals on how they crawl AJAX -
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
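As a quick illustration of the scheme that guide describes: when Googlebot encounters a hash-bang URL, it requests an "ugly" equivalent with the fragment moved into a query parameter (example.com is just a placeholder here), e.g.

http://example.com/page#!section=reviews

is fetched by Googlebot as

http://example.com/page?_escaped_fragment_=section=reviews

and your server is expected to respond to that second URL with an HTML snapshot of the content.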
They also have a short video here covering some of the basics of Google crawling AJAX and JavaScript:
https://www.youtube.com/watch?v=_6mtiwQ3nvw
You should also become familiar with pushState, which is covered in lots of detail, with an example implementation, in this blog post:
http://moz.com/blog/create-crawlable-link-friendly-ajax-websites-using-pushstate
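To give a flavour of the pattern, here is a minimal sketch; the .ajax-link class, #content container and the server behaviour are placeholder assumptions of mine, not taken from the Moz post:

document.querySelectorAll('a.ajax-link').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    var url = link.getAttribute('href'); // a normal, crawlable URL

    // Load just the content fragment for this URL (assumes the server
    // can return the fragment on its own).
    fetch(url, { headers: { 'X-Requested-With': 'XMLHttpRequest' } })
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.querySelector('#content').innerHTML = html;
        // Update the address bar without a full page reload; crawlers and
        // users who share the link still land on a real URL.
        history.pushState({ url: url }, '', url);
      });
  });
});

// Keep the back/forward buttons working; reloading is the simplest approach.
window.addEventListener('popstate', function () {
  location.reload();
});

The key point is that every href is a real URL the server can render on its own, so Googlebot can crawl the site even without executing the JavaScript.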
The guys at Builtvisible have also put together a few good blog posts on this topic which are worth a read:
http://builtvisible.com/javascript-framework-seo/
http://builtvisible.com/on-infinite-scroll-pushstate/
Essentially, you need to make sure that Googlebot is able to render your content as you intended and that it looks the same to them as it does to users. You can often test how well they can render your content by checking the cache of your page or by using the Fetch as Google feature in Google Webmaster Tools.
I hope that helps!
Paddy
-
Hi,
Making AJAX-loaded content crawlable by Google involves serving Googlebot a static HTML snapshot of that content. You should make sure that the HTML snapshot is an exact copy of what visitors see once the AJAX content has loaded.
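As a rough sketch of how a server might implement this (the Express setup and the renderSnapshot() helper below are my own illustration for the sake of example, not code from Google's guide):

var express = require('express');
var app = express();

// Hypothetical helper: in a real setup this would return the fully rendered
// HTML, e.g. produced by a headless browser running the page's JavaScript.
function renderSnapshot(fragment) {
  return '<html><body><h1>Snapshot for ' + fragment + '</h1></body></html>';
}

app.get('/products', function (req, res) {
  // For #! URLs (or pages carrying <meta name="fragment" content="!">),
  // Googlebot requests the page with an _escaped_fragment_ query parameter.
  if (req.query._escaped_fragment_ !== undefined) {
    // Serve the static snapshot; it must match what visitors see once the
    // AJAX content has loaded.
    res.send(renderSnapshot(req.query._escaped_fragment_));
  } else {
    // Regular visitors get the normal page, which loads content via AJAX.
    res.sendFile(__dirname + '/products.html');
  }
});

app.listen(3000);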
For more information, see Google's guide:
https://support.google.com/webmasters/answer/174992?hl=en
Best regards,
Devanur Rafi
Related Questions
-
Duplicate Content with URL Parameters
Moz is picking up a large quantity of duplicate content, consisting mainly of URL parameters like ,pricehigh and ,pricelow (used for page sorting). Google has indexed a large number of the pages (not sure how many), and I'm not sure how many of them are ranking for search terms we need. I have added the parameters in Google Webmaster Tools and set them to 'let Google decide', but Google still sees the pages as duplicate content. Is this a problem that we need to address? Or could trying to fix it do more harm than good? Has anyone had any experience? Thanks
Intermediate & Advanced SEO | seoman10
-
How to do country-specific indexing?
We are a business that operates in South East Asian countries, with medical professionals listed in Thailand, the Philippines and Indonesia. When I go to Google Philippines and check, I can see pages from all countries indexed, but no Philippines pages. The Philippines is where we launched recently. How can I tell Google Philippines to give more priority to pages from the Philippines and not from other countries? Can someone help?
Intermediate & Advanced SEO | ozil
-
Menu & sub-menu structure - section-specific sub-menu?
Hi, At present I have a set of regions across the nav bar, with drop-downs for towns within those regions. This totals about 60 links. This setup is pretty much site-wide, and the site ranks well for many of these towns. However, this is quite a lot of links per page (without the page-specific/content links), and I am wondering about having a horizontal sub-menu of towns appear below the nav bar, specific to the selected region only. Do you think cutting down the number of menu links will be beneficial to the content pages' rankings? Sorry if that isn't very clear; I can't think of a better way of putting it. TIA,
Chris
Intermediate & Advanced SEO | Cornwall
-
Does this make sense to recover from Panda?
Hello guys, our website was pandalized on 9/27/2012 and we haven't been able to recover since. I've fixed as much as possible when it comes to poor content, and we have been getting high-quality links consistently for the past 3-4 months. Our blog had some duplicate content issues due to categories, tags, feeds, etc. I solved those problems before the past 2 refreshes, without success. I'm considering moving the blog to a subdomain; more than PR, I'm interested in recovering from Panda and letting the blog grow on its own. What do you think about that?
Intermediate & Advanced SEO | DaveMri
-
About robots.txt to resolve duplicate content
I have trouble with duplicate content and titles; I have tried many ways to resolve them, but because of the web code I am still having problems. I have decided to use robots.txt to block the content that is duplicated. The first question: how do I write a robots.txt rule to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
.......
User-agent: *
Disallow: /foodcourses
(Is that right?)
And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the modules; could I use Disallow: /module/* ?)
The second question is: which takes priority, robots.txt or the meta robots tag? What if I use robots.txt to block a URL, but in that URL my meta robots tag is "index, follow"?
Intermediate & Advanced SEO | magician
-
How to promote good content?
Our team just finished a massive piece of content, very similar to the SEOmoz Beginner's Guide to SEO, but for the salon/aesthetics industry. We have a beautifully designed 10-chapter, 50-page PDF which will require an email form submission to download. Each chapter is optimized for specific phrases and will be a separate HTML page that is publicly available, very much like how this is set up: http://www.seomoz.org/beginners-guide-to-seo My question is, what's the best way to promote this thing? Any specific examples would be ideal. I think blogger outreach would likely be the best approach, but is there any specific way that I should be doing this? Again, a specific start-to-finish example is what I'm looking for here. (I've read almost every outreach post on Moz, so no need to reference them.) Anyone care to rattle off a list of ideas with accompanying examples? (Even if they seem like no-brainers, I'm all ears.)
Intermediate & Advanced SEO | ATMOSMarketing56
-
Google Translate for Unique Content
We are considering using the Google Translate tool to translate customer reviews into various languages, for publication as indexable content both for users and for long-tail search engine visibility and rankings. Does anyone have any experience, insights or caveats to share?
Intermediate & Advanced SEO | edreamsbcn
-
Duplicate Page Content
There are over 300 pages on our client's site with duplicate page content. Before we embark on a programming solution to this with canonical tags, our developers are requesting a list of the originating sites/links/sources for these odd URLs. How can we find a list of the originating URLs? If you can provide a list of originating sources, that would be helpful. For example, the following pages are showing (as a sample) as duplicate content:
www.crittenton.com/Video/View.aspx?id=87&VideoID=11
www.crittenton.com/Video/View.aspx?id=87&VideoID=12
www.crittenton.com/Video/View.aspx?id=87&VideoID=15
www.crittenton.com/Video/View.aspx?id=87&VideoID=2
"How did you get all those duplicate urls? I have tried to google the "contact us", "news", "video" pages. I didn't get all those duplicate pages. The page id=87 on most of the duplicate pages is not supposed to be there. I was wondering how the visitors got to all those duplicate pages. Please advise."
Note: the CMS does not create this type of hybrid URL. We are as curious as you as to where/why/how these are being created. Thanks.
Intermediate & Advanced SEO | dlemieux