Is it OK to have Search Engines Skip Ajax Content Execution?
-
I recently added some Ajax calls that automatically fill in small areas of my site when a page loads; the user doesn't have to click anything. That means when Google and Bing crawl the site, the Ajax is executed too. However, my understanding is that this does not mean Google and Bing are also crawling the Ajax content.
I would actually prefer that the content be neither executed nor crawled by them. In Bing's case I'd prefer it not even be executed: indications are that the program exits the Ajax page for Bing because Bing isn't retaining the session variables that page depends on, and I'm concerned that when this happens Bing may not be able to crawl the main content either. So Ajax execution seems potentially risky for normal crawling in this case.
I would like to simply have my program skip the Ajax execution for Google and Bing by recognizing them in the user agent, using an "if robot, skip Ajax" approach. I assume I could disallow the Ajax endpoint in robots.txt, but that wouldn't keep Bing from executing it (and hitting the exit problem mentioned above). It would be simpler to just have them skip the Ajax execution altogether.
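A minimal sketch of that user-agent check, assuming a server-side render step. The crawler token list, function names, and the `/fill-areas.js` script are illustrative assumptions, not a vetted or complete bot list:

```python
# Hypothetical sketch: skip emitting the Ajax bootstrap script for known
# crawlers. The token list below is an assumption for illustration only,
# not an exhaustive or official list of search engine user agents.
CRAWLER_TOKENS = ("googlebot", "bingbot", "msnbot", "slurp", "duckduckbot")

def is_known_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header contains a known crawler token."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def render_page(user_agent: str) -> str:
    """Emit the page; include the Ajax auto-fill script only for real users."""
    html = "<html><body>Main content...</body></html>"
    if not is_known_crawler(user_agent):
        # Real visitor: inject the script that fills in the small page areas.
        html = html.replace(
            "</body>",
            '<script src="/fill-areas.js"></script></body>'
        )
    return html
```

Note this is exactly the kind of user-agent branching the question is asking about, so whether it is safe to deploy is the open issue, not the mechanics. Also be aware that user-agent strings can be spoofed, so this check identifies self-declared bots only.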
Is that OK, or is there a chance the search engines will penalize my site if they somehow detect that I serve different logic to them than to actual users? In the past this surely wasn't a concern, but I understand Google is increasingly trying to behave like a browser, so it may increasingly take issue with this approach.
Thoughts?