Use of AJAX to fetch data for a section
-
Hi,
Is it okay to fetch a section of a page using AJAX? Will it be crawlable by Google?
I have already seen Google's directions for getting a complete AJAX-fetched page crawled by Google. Is there a way to get a particular section of a page, fetched through AJAX, indexed by Google?
Regards
-
Hi Anirban!
It looked like you asked a very similar question here:
http://moz.com/community/q/fetch-data-for-users-with-ajax-but-show-it-without-ajax-for-google
What additional information are you looking for?
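As the linked thread's title suggests, the usual answer is progressive enhancement: render the section's default content server-side so crawlers see it in the initial HTML, then let JavaScript replace or refresh it for users. A minimal sketch, with a hypothetical element id and endpoint:

```html
<!-- The section ships with server-rendered content, so Google can
     index it without executing any JavaScript. -->
<section id="latest-items">
  <h2>Latest items</h2>
  <p>Server-rendered fallback content goes here.</p>
</section>

<script>
  // For users, optionally swap in fresher data after the page loads.
  // Crawlers have already seen the fallback markup above.
  fetch('/api/latest-items')  // hypothetical endpoint returning an HTML fragment
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.getElementById('latest-items').innerHTML = html;
    });
</script>
```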
Related Questions
-
Author Credit when Using Existing Article
Hello, We have received permission from a consultant we partner with to publish one of his articles on our site (listing him as the author, of course). However, he currently has the article published on his own site, so if I put it on my site, will I get penalized for duplicate content? Is there some sort of tagging that will credit him and his site? Maybe a canonical tag?
Intermediate & Advanced SEO | AliMac26
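For a syndicated article like the one described above, the usual answer is indeed a cross-domain canonical in the head of your copy, pointing at the consultant's original; the visible byline handles the human-facing credit. A minimal sketch with hypothetical URLs:

```html
<!-- In the <head> of the republished copy on your site, pointing
     at the consultant's original article (hypothetical URL): -->
<link rel="canonical" href="https://consultant.example.com/original-article/" />
```
-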
Can cross-domain canonicals help with international SEO when using ccTLDs?
Hello. My question is: Can cross-domain canonicals help with international SEO when using ccTLDs and a gTLD, where the gTLD is much more authoritative to begin with? I appreciate this is a very nuanced subject, so below is a detailed explanation of my current approach, the problem, and the proposed solutions I am considering testing. Thanks for taking the time to read this far!
The current setup
Multiple ccTLDs, such as mysite.com (US), mysite.fr (FR), mysite.de (DE). Each TLD can have multiple languages; indeed, each site has content in English as well as the native language, so mysite.fr (defaults to French) and mysite.fr/en-fr is the same page but in English. Mysite.com is an older and more established domain with existing organic traffic.
Each language variant of each domain has a sitemap that is individually submitted to Google Search Console and is linked from the <head> of each page. So: mysite.fr/a-propos (about us) links to mysite.com/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in French. Each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de, etc.). Likewise, mysite.fr/en-fr/about-us links to mysite.com/en-fr/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in English, each with the same hreflang info. There is more English content on the site as a whole, so the English version of the sitemap is always bigger at the moment.
Every page on every site has two lists of links in the footer. The first is a list of links to every other ccTLD available, so a user can easily switch between the French site and the German site if they want to. Where possible this links directly to the corresponding piece of content on the alternative ccTLD; where that isn't possible, it just links to the homepage. The second list is essentially just links to the same piece of content in the other languages available on that domain. Mysite.com has its international targeting in Google Search Console set to the US.
The problems
The biggest problem is that we didn't properly consider how we would need to start from scratch with each new ccTLD, so although each domain has a reasonable amount of content, they only receive a tiny proportion of the traffic that mysite.com achieves. Presumably this is because of a standing start with regards to domain authority. The second problem is that, despite hreflang, mysite.com still outranks the other ccTLDs for brand-name keywords. I guess this is understandable given the mismatch in DA. This is based on looking at search results via the Google AdWords Ad Preview tool and changing language, location, and domain.
Solutions
The first solution is probably the most obvious: move all the ccTLDs into a subfolder structure on mysite.com and 301 all the old ccTLD links. This isn't really an ideal solution for a number of reasons, so I'm trying to explore some alternative routes that might help the situation. The first thing that came to mind was cross-domain canonicals. Essentially this would mean creating locale-specific subfolders on mysite.com and duplicating the ccTLD sites in there, but using a cross-domain canonical to tell Google to index the ccTLD URL instead of the locale-subfolder URL. For example:
mysite.com/fr-fr has a canonical of mysite.fr
mysite.com/fr-fr/a-propos has a canonical of mysite.fr/a-propos
Then I would change the links in the mysite.com footer so that they wouldn't point at the ccTLD URL but at the sub-folder URL, so that Google would crawl the content on the stronger domain before indexing the ccTLD version of the URL. Is this worth exploring with a test, or am I mad for even considering it? The alternative that came to my mind was to do essentially the same thing but use a 301 to redirect from mysite.com/fr-fr to mysite.fr. My question is whether either of these suggestions might be worth testing, or am I completely barking up the wrong tree and liable to do more harm than good?
Intermediate & Advanced SEO | danatello
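To make the proposal above concrete, here is a sketch of what the cross-domain canonical could look like in the head of the duplicated locale subfolder, alongside head-based hreflang annotations (all URLs follow the hypothetical mysite examples):

```html
<!-- In the <head> of mysite.com/fr-fr/a-propos (the duplicate hosted
     on the stronger gTLD), telling Google to index the ccTLD URL: -->
<link rel="canonical" href="https://mysite.fr/a-propos" />

<!-- hreflang annotations, shown here as <link> tags rather than
     sitemap entries; each variant lists all the others: -->
<link rel="alternate" hreflang="fr-fr" href="https://mysite.fr/a-propos" />
<link rel="alternate" hreflang="en-fr" href="https://mysite.fr/en-fr/about-us" />
<link rel="alternate" hreflang="en-us" href="https://mysite.com/about-us" />
```

One caution on this sketch: hreflang is generally expected to point at indexable, canonical URLs, so canonicalising the subfolder copies away to the ccTLDs is exactly the kind of mixed signal worth testing very carefully before rolling out.
-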
How can I use AMP HTML on a CMS?
I have been trying to research using AMP to improve our mobile speed. We have a whole lot of sites on the same platform, managed by a CMS. From what I have read, AMP HTML can only be used on static pages. Does that mean we would not be able to incorporate it into the HTML through our CMS? I would like to implement this across all our homepages to test its effectiveness if possible, but there is no way to rebuild all our homepages statically. Any advice is much appreciated!
Intermediate & Advanced SEO | chrisvogel
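On the "static pages only" point in the question above: an AMP document is ordinary HTML that follows the AMP spec, so a CMS template can generate it dynamically like any other page; nothing requires a static file. A minimal skeleton (the required AMP boilerplate CSS is abbreviated, and the canonical URL is hypothetical):

```html
<!doctype html>
<html amp lang="en">
  <head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width" />
    <!-- Pairs this AMP page with the regular CMS-rendered page -->
    <link rel="canonical" href="https://www.example.com/" />
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <style amp-boilerplate>/* required AMP boilerplate CSS, omitted here for brevity */</style>
    <title>Homepage</title>
  </head>
  <body>
    <h1>A CMS template can emit this markup for every homepage.</h1>
  </body>
</html>
```
-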
Dilemma: Should we use pagination or a 'Load More' function?
In the interest of pleasing Google, with its recent updates clamping down on duplicate content and giving higher preference to pages with rich data, we have a small dilemma that might help others too. We have a directory-like site, very similar to TripAdvisor or Yelp. Would it be best to: A) have paginated content almost 40 pages deep, OR B) display 20 results per page with a "Load More" button at the bottom that fetches more data only once it's clicked? The problem we are having now is that deep pages are getting indexed and it's doing us no good; most of the juice and page value is on the first page, not the inner pages. Wondering what the schools of thought are on this one. Thanks
Intermediate & Advanced SEO | danialniazi
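One common middle ground for the dilemma above is to progressively enhance an ordinary pagination link: crawlers follow the plain href to page 2, while JavaScript turns the same link into an in-place "Load More" for users. A sketch, with hypothetical URLs and ids:

```html
<div id="results">
  <!-- first 20 server-rendered results -->
</div>
<a id="load-more" href="/listings?page=2">Load more results</a>

<script>
  // Upgrade the pagination link: users append results in place,
  // while crawlers simply follow the href above.
  var link = document.getElementById('load-more');
  link.addEventListener('click', function (event) {
    event.preventDefault();
    fetch(link.href)  // assumes the endpoint returns an HTML fragment of results
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.getElementById('results')
          .insertAdjacentHTML('beforeend', html);
        // a real build would advance link.href to the next page here
      });
  });
</script>
```

Separately, whichever layout you choose, the low-value deep pages that are already indexed could carry a noindex robots meta tag.
-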
Canonical & noindex: use them together?
For duplicate pages created by the "print" function, SEOmoz says it's better to use noindex (http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not) while Google's JohnMu says it's better to use canonical (http://www.google.com/support/forum/p/Webmasters/thread?tid=6c18b666a552585d&hl=en). What do you think?
Intermediate & Advanced SEO | nicole.healthline
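Whichever camp you side with, the widely repeated caution is not to combine the two signals on the same page: noindex says "drop this URL" while rel=canonical says "this URL is a copy of one you should index". A sketch of each option for a print variant, with hypothetical URLs:

```html
<!-- Option A: in the <head> of the print version, consolidate
     signals to the main article (hypothetical URL): -->
<link rel="canonical" href="https://www.example.com/article/" />

<!-- Option B: keep the print version out of the index entirely.
     Use one option or the other, not both on the same page. -->
<meta name="robots" content="noindex, follow" />
```
-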
Do any of you regularly use expired domains?
I know there has been discussion on using expired domains in the past. This is not so much a question of how to do it or whether it works; rather, I would love to see how many of you use this in your backlink strategy. I have a domain in a low-to-moderately competitive niche that ranks really well, mostly on the power of a couple of expired domains. I bought the domains, created a quick WordPress site, and pointed some anchor-text links to the site. It took some time for the expired domains to regain their PR, but when they did, the benefit was great. I'm considering whether I want to do this with another domain of mine. On one hand, it's a relatively inexpensive way to get some good-quality anchor-text links. On the other hand, something about it feels "immoral" or "sneaky" to me. What do you think?
Intermediate & Advanced SEO | MarieHaynes