-
Dear all,
Early 2015 I launched www.taobao.nl. It's a platform that shows products from Chinese marketplaces in our local language (auto-translated) and offers payment in our local currency and with local payment methods.
(I blocked all the auto-translated product pages via http://www.taobao.nl/robots.txt.)
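For reference, a robots.txt rule doing what I describe would look roughly like this (the /product/ path is just an illustration, not necessarily the site's actual URL structure):

```
User-agent: *
Disallow: /product/
```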
All content (category pages, general pages, etc.) is unique and written by us.
In the past I've set up several websites and shops, and I know it always takes time and effort to get a site into good positions, but I had never experienced my efforts not paying off. Until now.
The URL is not blocked by Google and I see a (very) limited amount of traffic in WMT & Analytics, but the site sits beyond page 30 for its most important keyword, 'taobao'. This, while competition for this keyword is low in my country (google.nl).
My main competitor (taobao.com.nl) seems to do worse on most SEO elements I was able to check, but they rank very well for 'taobao' and all related keywords.
What could be the reason for this very poor ranking, and what should I adjust or improve?
(Besides backlinks: I'm working hard on that.) Your input is deeply appreciated!
Sander
-
Update: the bad links have been disavowed and WMT no longer mentions them (https://www.google.com/webmasters/tools/external-links).
The nofollow tags have also been removed.
Still working on high-quality links (:-)), and I'm trying to add content on a daily basis.
So far, no results.
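As an aside, for anyone following along: the disavow file uploaded via Google's disavow tool is plain text, one entry per line, with `#` for comments. The domains below are placeholders, not the actual links in question:

```
# Links from sites we could not get removed
domain:spammy-example.nl
http://another-example.com/bad-page.html
```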
-
Thanks & bedankt
I'll disavow that link in WMT. Useful info!
-
Most of your incoming links are from palminvest.nl. That business went under, in a very public and ugly way. I would start by fixing that and getting yourself some good links.
https://mza.bundledseo.com/researchtools/ose/links?site=http%3A%2F%2Fwww.taobao.nl
(Nice idea otherwise, by the way.)
-
I wouldn't take too much notice of that checker, as it claims my site is penalised or sandboxed, and it is neither.
It seems to look only at your homepage. I tested a couple of other sites, and you definitely can't take anything away from it.
A penalty is difficult to spot if you haven't noticed a sudden drop in traffic; it takes quite a bit of digging.
-Andy
-
Hi All,
I just tried this checker: http://pixelgroove.com/serp/sandbox_checker/ and it seems there might be a penalty on the domain. Does anyone know how I can be sure this is the case, and how I can ask Google to re-evaluate?
(There is no warning in Webmaster Tools, so I can't submit a reconsideration request that way.)
Regards,
Sander
-
Hi Andy,
"Have you performed an audit to see if there are issues lurking anywhere that could be causing problems?"
Is there any specific tool you can recommend?
Thanks!
Sander
-
Hi Mike,
The thing is that the original product content is Chinese. English would be readable for my Dutch audience, but Chinese is not an option.
All content on category pages and general pages is normal, unique content in my language.
The site is about 6 months old and the domain has no history.
Regards,
Sander
-
Johannes,
Auto-translated content does not have a negative effect on your SEO, just on user readability. Using Google Translate across your entire website will make it hard to read in those translated languages. Take a translated page and translate it back to English to see for yourself. It is the quickest fix, yes, but more work needs to be done with the Translator Toolkit.
Thanks,
- Mike
-
To be appearing so far down the SERPs, it sounds like a trust issue with Google.
How old is the site? Have you performed an audit to see if there are issues lurking anywhere that could be causing problems? As Hector said, you are nofollowing links from the menu, but based on what you said above, this is because that information is pulled in from other sites? I suspect the problems might start around here.
To Google, this might seem like a strange way of doing things. There is no harm in allowing the links (I would not advise nofollowing them), but you can noindex the pages you worry might get caught up in duplication.
Let Google crawl your site, and noindex the pages you don't want indexed.
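As a sketch of what I mean (this is the standard robots meta directive, nothing specific to your CMS): keep the links followable, but keep the worrying pages out of the index:

```html
<!-- On pages that might be seen as duplicate or auto-translated content: -->
<!-- stays crawlable and passes link equity, but is kept out of the index -->
<meta name="robots" content="noindex, follow">
```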
-Andy
-
Hi Hector,
Because the product pages have auto-translated content. I understood this has a negative effect on a site's ranking.
Sander
-
Without looking too deep into the site, I see on ahrefs.com that you only have 10 referring domains, and only 7 of them are not nofollow. You should focus on a link-building strategy first.
Also, minify and combine your .js files. You load far too many external files for a site like that.
Why do you block the product pages? They are the most important pages on an ecommerce site.
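For anyone who wants to check their own pages for nofollow links, here is a rough sketch using only Python's standard library. The markup and URLs are made up for illustration; in practice you would feed it fetched HTML:

```python
# Hypothetical sketch: list which links on a page carry rel="nofollow".
from html.parser import HTMLParser

class NofollowAuditor(HTMLParser):
    """Collect (href, is_nofollow) pairs for every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold multiple space-separated tokens, e.g. "nofollow noopener"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href"), "nofollow" in rel_tokens))

# Example markup (placeholder, not from the site under discussion)
html = '''
<a href="/category/shoes">Shoes</a>
<a href="http://example.com/out" rel="nofollow">Partner</a>
'''
auditor = NofollowAuditor()
auditor.feed(html)
for href, nofollow in auditor.links:
    print(href, "nofollow" if nofollow else "followed")
```

This only inspects outgoing links in markup you already have; to see which of your backlinks are nofollow you still need a crawler or a tool like Ahrefs or OSE.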