Converse.com - Flash and HTML versions of the site... bad idea?
-
I have a question regarding Converse.com. I realize this ecommerce site needs a lot of SEO help, and there's plenty of obvious low-hanging SEO fruit. At a high level, though, I see a very large SEO issue with the site architecture.
The site is a full-page Flash experience that uses a # in the URL, so the search engines pretty much see every Flash page as the home page. To help with this issue, an HTML version of the site was created; the HTML pages are what Google crawls. For reference:
Home page: Converse.com
Marimekko category page (Flash version): http://www.converse.com/#/products/featured/marimekko
Marimekko category page (HTML version, requires Flash to be disabled): http://www.converse.com/products/featured/marimekko
Here is an example of the issue. This site has a great post featuring the Helen Marimekko shoes:
http://www.coolmompicks.com/2011/03/finnish_foot_prints.php
The post links to the Flash Marimekko category page (http://www.converse.com/#/products/featured/marimekko), as I would expect (ninety-something percent of visitors to converse.com have the required Flash plug-in). So the Flash page is getting the link juice, but the Flash page is invisible to Google.
When I search for “converse marimekko” in Google, the Marimekko landing page is not in the top 500 results. So I then searched for “converse.com marimekko” and saw the HTML version of the landing page listed as the 4th organic result. When I click the link I get redirected to the Flash Marimekko category page, but if I do not have Flash I go to the HTML category page.
-----
Marimekko - Converse
All Star Marimekko Price: $85, Jack Purcell Helen Marimekko Price: $75 ...
www.converse.com/products/featured/marimekko - Cached
So my questions are…
Is Converse skating on thin SEO ice by having both an HTML and a Flash version of their site/product pages?
Do you think it's a huge drag on SEO rankings to have a large percentage of backlinks pointing to Flash pages when Google is crawling the HTML pages?
Any recommendations on what to do about this?
Thanks,
SEOsurfer
-
Tom,
Thank you for taking the time to look at the site and give a detailed response. I’ve been doing some research myself and my findings mirror your assessment. Thank you for the recommended action items, too. Converse uses http://www.asual.com/swfaddress/, which makes for a good site experience but, as you pointed out, is not so hot for SEO.
--SEOsurfer
-
Great question!
Firstly, unfortunately, Steve's suggestion isn't going to be viable for you. The # portion of the URL is not available to your code server-side, so you won't be able to determine where the rel=canonical should point.
Furthermore, if they are committed to keeping the Flash site for now, all as a single unit under one URL (the homepage), then you are going to have to accept that some juice intended for subpages is going to go to the homepage. You cannot do anything about that aspect, so you need to focus on the rest of the problem. However, whilst far from ideal, at least the juice is hitting the site somehow.
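To see why that first point holds: the browser never sends the # fragment to the server, so a deep link such as http://www.converse.com/#/products/featured/marimekko arrives server-side as a plain request for the homepage. A minimal sketch, assuming a bare Node HTTP server purely for illustration:

```typescript
import { createServer } from "http";

// A visit to http://www.converse.com/#/products/featured/marimekko reaches
// the server as a request for "/" only; the fragment stays in the browser.
const server = createServer((req, res) => {
  console.log(req.url); // logs "/" for any "#/..." deep link
  res.end("The server cannot tell which Flash 'page' was requested.");
});

server.listen(3000);
```

That is why a page-specific rel=canonical cannot be generated server-side for the Flash URLs.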
So… what to do?
Firstly, I'd get into the mindset of treating the HTML site as the main/canonical site and the Flash site as the 'enhanced experience' version. Seen this way, the HTML version is the one that should be crawled by Google, and the one that should be linked to.
Actions:
1. Set up detection for mobile user-agents (out of preference I'd say all of them, but at least those known not to support Flash, such as iPhone/iPad) and for search engine bots, and ensure they get served the HTML version. Currently your homepage requires a click-through on iPad offering an impossible Flash download; why not serve them the HTML page off the bat? (A server-side sketch of this detection follows the action list below.)
Is this cloaking? No! The HTML version is the main version, remember? It's no more cloaking than if you detected the user agent and then chose to serve the Flash version to Googlebot. I actually discussed this with Jane Copeland at the fantastic Distilled link building event a couple of weeks back, and she agreed with me, saying that if it would stand up to a manual inspection then it is the right course of action.
2. Get all links in articles, press releases, directories or whatever else that point to specific pages and originate in-house (or from any source you control) to link to the HTML pages.
3. If the user arrives on an HTML link and has Flash, you can redirect them to the Flash link for that page so they get the 'enhanced experience'. Don't use a 301 redirect -- remember, the HTML version is the main version!
4. If the user arrives via a Flash link but doesn't have Flash, yet does have JavaScript, you can detect the # fragment and redirect them to the HTML page to help them along. (A client-side sketch covering points 3 and 4 appears after the summary list below.)
5. Educate the relevant stakeholders regarding point 2. I see you have a 'flashmode=0' option; tell them about it and how to use it to get the URLs they need.
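For point 1, here is a minimal sketch of the user-agent check, assuming a bare Node HTTP front end; the patterns and header name are illustrative, not Converse's actual setup:

```typescript
import { createServer } from "http";

// Illustrative patterns only: devices known to lack Flash plus the major search bots.
const WANTS_HTML_VERSION = /iPhone|iPad|iPod|Googlebot|bingbot|Slurp/i;

const server = createServer((req, res) => {
  const userAgent = req.headers["user-agent"] ?? "";

  if (WANTS_HTML_VERSION.test(userAgent)) {
    // Bots and Flash-less devices get the HTML (canonical) version straight away,
    // e.g. by rendering the HTML templates or proxying to the HTML front end.
    res.setHeader("X-Served-Version", "html"); // placeholder for the real rendering step
  } else {
    // Everyone else falls through to the default Flash experience.
    res.setHeader("X-Served-Version", "flash");
  }
  res.end("ok");
});

server.listen(3000);
```

The same check could live in whatever server or CDN layer already fronts the site; the point is simply that the decision happens before the Flash shell is sent.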
So where does this leave us?
- The search engines can crawl all your lovely content, and they can ignore the Flash version completely.
- You are getting inbound links to specific pages. These pages have their own titles and meta descriptions… and content! Because they are the real site!
- Users with Flash arriving via these links are landing on the correct Flash page of the site and are experiencing the rich site that you want them to.
- Users arriving without Flash are getting the correct page if they arrive via an HTML URL. If they arrive via a Flash URL, they get the correct page if they have JavaScript on (e.g. iPad users), or they get the fallback of the homepage (rare).
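For points 3 and 4 above, a minimal client-side sketch (TypeScript compiled to a plain script); the Flash check and URL handling are illustrative, not Converse's actual code:

```typescript
// Runs in the browser on every page load.

function hasFlash(): boolean {
  // Plugin-based check works in Firefox/Chrome/Safari; older IE would also
  // need an ActiveXObject("ShockwaveFlash.ShockwaveFlash") probe.
  return Array.from(navigator.plugins).some(p => /Shockwave Flash/i.test(p.name));
}

const hash = window.location.hash;      // e.g. "#/products/featured/marimekko"
const path = window.location.pathname;  // e.g. "/products/featured/marimekko"

if (hash.startsWith("#/") && !hasFlash()) {
  // Flash deep link, no Flash: send the visitor to the equivalent HTML page.
  window.location.replace(hash.slice(1));
} else if (path !== "/" && hasFlash()) {
  // HTML deep link, Flash available: upgrade to the 'enhanced experience'.
  // A plain client-side redirect, deliberately not a 301.
  window.location.replace("/#" + path);
}
```

Visitors without Flash and without JavaScript simply stay on whichever version they landed on, which matches the homepage fallback described above.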
I had a client in an almost identical situation and rolled out an almost identical solution; they got crawled very quickly, shot up in Google, and have stayed there for months.
Hope it helps. Let us know how you get on!
-
It's definitely a drag to have your links diluted between two versions of the site. There are a few solutions you could use, but the easiest would probably be to start using the rel=canonical tag on the Flash version, pointing back to the same or similar page on the HTML site. That way, the engines know that the version you want indexed is the HTML version.