Iframes, AJAX, JS, Etc.
-
Just started SEO on some legacy sites running JS navigation. Are there any proven ways to stop Google from parsing links and passing internal link juice (e.g., iframes, AJAX, JS, etc.)? Google is parsing some JS links on a couple of our legacy sites. The problem is that some pages are getting link juice and others aren't, and it's unpredictable which links are parsed and which aren't. The choice is to rebuild the navigation (ouch), or to figure out a way to block JS links entirely and build a simple text-based secondary nav for link juice distribution. I definitely don't want to use nofollow.
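For illustration, here's a rough sketch of the JS-only pattern I have in mind: nav buttons wired up at runtime, with no <a href> markup for a parser to find. All IDs and URLs below are placeholders, and since Google can execute JS, this is discouragement rather than a guarantee.

<nav id="js-nav"></nav>
<script>
  // Build the nav without <a href> anchors, so no link elements
  // exist in the markup for a parser to extract. Google may still
  // execute this script, so nothing here is guaranteed.
  var items = [
    { label: 'Widgets', url: '/widgets/' }, // placeholder URLs
    { label: 'Gadgets', url: '/gadgets/' }
  ];
  var nav = document.getElementById('js-nav');
  items.forEach(function (item) {
    var btn = document.createElement('button');
    btn.textContent = item.label;
    btn.addEventListener('click', function () {
      window.location.href = item.url; // navigate only on a real click
    });
    nav.appendChild(btn);
  });
</script>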
Any thoughts?
Related Questions
-
IFrame Persistent Music Player - SEO impact
Hi, I'd like to know if using an iframe for a persistent music player on a website (e.g. https://www.sampletoolsbycr2.com) would have any negative impact on SEO. For example, does it cause any issues that prevent the site from being crawled? Thanks, Joe
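(For reference, the embed in question is roughly the tag below; the src and sizing are illustrative, not the live code.)

<iframe src="/player.html" title="Persistent music player"
        width="100%" height="80" frameborder="0"></iframe>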
Technical SEO | iweb_agency
-
Can an AJAX framework (using HTML5 + pushState) on your site impact your ranking?
Hello everybody, I am currently investigating a website which is rendered by an AJAX framework (AngularJS) using the HTML5 History API pushState method.
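In HTML5 mode the framework swaps content in place and rewrites the clean URL with pushState, roughly like this hand-rolled sketch (the element ID and URLs are placeholders):

<script>
  // Fetch a page fragment and swap it into the shell without a full load.
  function load(url) {
    return fetch(url).then(function (res) { return res.text(); })
      .then(function (html) {
        document.getElementById('content').innerHTML = html;
      });
  }
  // Called from in-app link clicks: render the fragment,
  // then rewrite the address bar.
  function navigate(url) {
    load(url).then(function () {
      history.pushState({ url: url }, '', url); // clean URL, no hashbang
    });
  }
  // Back/forward: re-render without pushing a new history entry.
  window.addEventListener('popstate', function (e) {
    if (e.state && e.state.url) load(e.state.url);
  });
</script>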
Recently Google announced that it is able to execute JavaScript and can therefore see the content and links needed to discover all pages in the structure. However, it seems that it doesn't run the JavaScript at ALL times (based on some internal testing). So technically Googlebot can arrive on a page without seeing any content or links, while another time it can arrive, run the JavaScript, and read/discover the content and links generated by AJAX.
Could the fact that Google can't always interpret or read the website correctly therefore have a negative SEO impact (not on the indexation process, but on ranking)? We are aware that it is better to create a snapshot of the page, but in its announcement Google states that the method currently used should be sufficient. Does anybody have any experience with this, AND what is the impact on the ranking process? Thanks!
Technical SEO | Netsociety
-
Google's ability to crawl AJAX-rendered content
I would like to make a change to the way our main navigation is rendered on our e-commerce site. Currently, all of the content that appears when you click a navigation category is rendered on page load. This accounts for a large portion of every page visit's bandwidth, and the images are downloaded even if a user doesn't choose to use the navigation. I'd like to change it so the content appears and is downloaded only IF the user clicks on it; I'm planning on using AJAX. In that case the content wouldn't automatically be in the page source (which may or may not mean Google would crawl it). As we already provide a sitemap.xml for Google, I want to make sure this change would not adversely affect our SEO. As of October this year, the Webmaster AJAX crawling doc's suggestions have been deprecated. While the new version does say that Google's crawlers are smart enough to render AJAX content (something I've tested), I'm not sure if that only applies to content injected on page load, as opposed to on click like I'm planning to do.
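The rough shape of what I'm planning looks like the sketch below (the /nav/ endpoint, class names, and data attributes are made up):

<script>
  // Fetch each category's flyout markup (and its images) only when the
  // user first opens it, so none of it is in the initial HTML payload.
  var loaded = {};
  document.querySelectorAll('.nav-category').forEach(function (cat) {
    cat.addEventListener('click', function () {
      var slug = cat.getAttribute('data-slug');
      if (loaded[slug]) return; // fetch each flyout only once
      loaded[slug] = true;
      fetch('/nav/' + slug + '.html')
        .then(function (res) { return res.text(); })
        .then(function (html) {
          cat.querySelector('.flyout').innerHTML = html;
        });
    });
  });
</script>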
Technical SEO | znotes
-
How does your crawler treat AJAX links?
Hello! It looks like the SEOmoz crawler (and Google) follows AJAX links. Is this normal behavior? We have implemented the canonical element, and that seems to resolve most of the duplicate content issues. Anything else we can do? Example: Krom
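(For context, the canonical element in question is the standard link tag below; example.com stands in for the real domain.)

<link rel="canonical" href="http://www.example.com/page/" />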
Technical SEO | AJPro
-
.com domain is an iframe copy of a .net domain?
Hey folks, this one is over my head. I'm helping out a friend's dental office website (www.capitolperiodontal.com), and their home page source code apparently points to the .net TLD for its content:

<title>http://www.capitolperiodontal.com/</title>
<meta http-equiv="content-type" content="text/html" />
<frameset rows="100%" id="dd_frameset_0001">
  <frame src="http://www.capitolperiodontal.net/" name="dd_content_0001" framespacing="0" frameborder="0" noresize="noresize" title="capitolperiodontal.com" />
  <noframes></noframes>
</frameset>

My idea was to load all the content from the .net onto the .com, then redirect the .net to the .com, as it has better domain authority and is, well, a .com. Any insights on what this iframe biz is all about and whether my strategy is OK? Many thanks, folks! john
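If they do consolidate on the .com, the redirect half of the plan is usually a site-wide 301 on the .net host. A minimal sketch, assuming an Apache .htaccess with mod_rewrite enabled (untested; adjust for their actual server):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?capitolperiodontal\.net$ [NC]
RewriteRule ^(.*)$ http://www.capitolperiodontal.com/$1 [R=301,L]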
Technical SEO | juanzo007
-
Ajax #! URLs, Linking & Meta Refresh
Hi, We recently underwent a platform change, and unfortunately our updated ecommerce site was coded using JavaScript. The top navigation is uncrawlable, the pertinent product copy is undetectable and duplicated throughout the code, etc. It needs a lot of work to make it (even somewhat) SEO-friendly. We're in the process of implementing AJAX #! URLs on our site, and I've been tasked with creating a document of items that I will test to see if this solution will help our rankings, indexing, etc. (on Google; I've read about the issues with Bing). I have 2 questions: 1. Do I need to notify our content team, who works on our linking strategy, about the new URLs? Would we use the #! URL (for SEO), or would we continue to use the clean URL (without the #!) for inbound links? 2. When our site transferred over, we used a meta refresh on all of the pages instead of 301s for some reason. Instead of going to a clean URL, our meta refresh says this: . Would I update it to have the #! in the URL? Should I try to clean up the meta refresh so it goes to an actual www. URL and not this browsererrorview page? Or just push for the 301? I have read a ton of articles, including GWT docs, but I can't seem to find any solid information on these specific questions, so any help I can get would be greatly appreciated. Thanks!
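For anyone comparing notes, the (now-deprecated) AJAX crawling scheme maps hashbang URLs to _escaped_fragment_ requests, and a meta refresh is a page-level tag rather than a server redirect. The example.com paths below are illustrative:

<!-- What the user sees:       http://www.example.com/#!/widgets -->
<!-- What Googlebot requests:  http://www.example.com/?_escaped_fragment_=/widgets -->

<!-- A meta refresh, by contrast, looks like this and is not a clean
     substitute for a 301: -->
<meta http-equiv="refresh" content="0; url=http://www.example.com/widgets/" />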
Technical SEO | Improvements
-
Dealing with indexable Ajax
Hello there, My site is basically an Ajax application. We assume lots of people link into deep pages on the site, but bots won't be able to read past the hash marks, meaning all links appear to go to our home page. So, we have decided to set up our Ajax for indexing. And so many questions remain. First, only Google handles indexable Ajax, so we need to keep our static "SEO" pages up for Bing and Yahoo. Bummer, dude, more to manage. 1. How do others deal with the differences here? 2. If we have indexable Ajax and static pages, can these be perceived as duplicate content? Maybe the answer is to disallow Googlebot from indexing the static pages we made. 3. What does your canonical URL become? Can you tell different search engines to read different canonical URLs? So many more questions, but I'll stop there. Curious if anyone here has thoughts (or experience) on the matter. Erin
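On point 2, blocking only Googlebot from the static fallback pages would be a robots.txt rule along these lines (the /static/ path is hypothetical, and note that a disallow prevents crawling rather than guaranteeing de-indexing):

User-agent: Googlebot
Disallow: /static/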
Technical SEO | ErinTM
-
Are Google now indexing iFrames?
A client is pulling content through an iFrame, and when searching for a snippet of that exact content the page that is pulling the data is being indexed and not the iFrame page. Seen this before?
Technical SEO | White.net