Copied Content / Copied Website
-
Hello guys,
I was checking my product descriptions and found a website that is using my descriptions word for word. They also use my company name and product images, they have a link that sends you to my site, and a contact form.
I tried to purchase something and the order came through to our email, but I submitted an inquiry and it didn't come through. They also have a sub-folder with my company name.
They also have URLs with my company name, and that isn't right, is it?
I am confused and honestly I don't know what to do. We don't take part in any affiliate program or anything like that, and we don't ship outside Europe. This is a Chinese website.
Out of curiosity, I noticed that one of our competitors is listed there as well, which does seem weird.
Here is the link:
www.everychina.com/company/repsole_limited-hz1405d06.html
-
I checked analytics, backlinks, everything, but it doesn't benefit us in any way. The links don't even point to my site.
So upset! What should I do? Contact Google and report the site, or should I contact the company?
Thanks
-
This happens to us once in a while. A Chinese firm simply dupes our site and changes the logo, company info and such to peddle their items (which are poor replicas). It seems like in your case they didn't replace the logo and such, but expect that to happen. Often these sites do not get indexed (or get indexed poorly) and fall into the abyss.
Always make sure to take action as Carla said (and also notify Bing about it using their support web form).
-
Hi,
We had something similar happen to us. I will tell you the story because I really do think it will help.
One of our clients liked our original website so much that he wanted to copy it. We gave him the OK but asked him to modify the content 100% so that Google would not penalize us, and he gave us access to his Google Analytics. The client did not modify the content for about six months. Google recognized his site as a duplicate, and his website received absolutely no traffic at all for keywords; the only way to reach it was by searching his brand name. This was a great relief to us: Google penalized him, not us. After about six months the client finally got around to changing his content, and when he did, he started ranking for keywords.
In your case, I would disavow the Chinese website and notify the Google spam team.
Good luck.
Carla
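As a concrete illustration of Carla's disavow suggestion: the disavow file Google accepts is plain text (UTF-8), one URL or `domain:` entry per line, with `#` comments, uploaded through Search Console's disavow links tool. A minimal sketch for this thread's situation; the `spam-example` domains are placeholders:

```text
# Links I want Google to ignore when assessing my site.
# "domain:" disavows every link from that host.
domain:everychina.com
http://spam-example-1.com/copied-page.html
domain:spam-example-2.net
```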
Related Questions
-
Advice needed! How to clear a website of a WordPress spam link injection Google penalty?
Hi Guys, I am currently working on a website that has been penalised by Google for a spam link injection. The website was hacked and 17,000 hidden links were injected. All the links have been removed and the site has subsequently been redesigned and re-built. That was the easy part 🙂 The problem comes when I look in Webmaster Tools. Google is showing thousands of internal spam links to the homepage and other pages within the site. These pages do not actually exist, as they were cleared along with all the other spam links. I do believe, though, that this is causing problems with the website's rankings. Certain pages are not ranking on Google and the homepage keyword rankings are fluctuating massively. I have reviewed the website's external links and these are all fine. Does anyone have any experience of this and can provide any recommendations / advice for clearing the site of the Google penalty? Thanks, Duncan
White Hat / Black Hat SEO | CayenneRed890
-
Can I leave off HTTP/HTTPS in a canonical tag?
We are working on moving our site to HTTPS, and I was asked by my dev team whether it is required to declare HTTP or HTTPS in the canonical tag. I know that relative URLs are acceptable, but I cannot find anything about HTTP/HTTPS. Example of what they would like to do. Has anyone done this? Any reason not to leave off the protocol?
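One wrinkle worth noting here: a protocol-relative reference (`//host/path`) inherits the scheme of whatever page it appears on, so during an HTTP-to-HTTPS migration the same canonical tag resolves to two different URLs. A small Python sketch of the standard URL-resolution behavior (`example.com` is a placeholder, not from the question):

```python
from urllib.parse import urljoin

# A protocol-relative reference inherits the scheme of the page it sits on,
# so identical canonical markup resolves differently on the HTTP and HTTPS
# versions of the same page.
canonical = "//www.example.com/widgets"

print(urljoin("http://www.example.com/widgets", canonical))   # http version
print(urljoin("https://www.example.com/widgets", canonical))  # https version
```

Declaring the absolute `https://` URL once the migration target is known avoids that ambiguity entirely.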
White Hat / Black Hat SEO | Shawn_Huber0
-
Recovering from Google Penguin/algorithm penalty?
Anyone think recovery is possible? My site has been in Google limbo for the past 8 months to a year or so. Like a lot of sites, we had SEO work done a while ago and had tons of links that Google now looks down on. I have worked with an SEO company for a few months now and they seem to agree Penguin is the likely culprit; we are on page 8-10 for keywords that we used to be on page 1 for. Our site is informative and has everything intact. We deleted whatever links we could; contact information for some sites is hard to find, and some sites want money, so I paid a few of them a couple of bucks in hopes it might help the process. Anyway, we now have around 600-something domains in the disavow file we put up in March-April, with another 100 or 200 added recently as well. If need be, a new site could be an option, but we will wait and see if the site can improve on Google with a refresh. Anyone think recovery is possible in a situation like this? Thanks
White Hat / Black Hat SEO | xelaetaks0
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS has a way to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I'll invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
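The idea in the question can be sketched as a small load-scaled throttling rule: below a low-load threshold no bot request is rejected, above a high-load threshold every bot request gets a 503, and in between a proportional fraction is rejected. This is a hypothetical illustration, not the poster's actual code; the bot list, thresholds, and Retry-After value are made-up parameters:

```python
import random

# Illustrative user-agent substrings; a real deployment would need a
# maintained list (and reverse-DNS checks to catch spoofed agents).
BOT_SIGNATURES = ("googlebot", "bingbot", "ahrefsbot")

def is_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def throttle_probability(server_load: float) -> float:
    """Fraction of bot requests to reject, scaled by server load average.
    Below 2.0 nothing is throttled; at 8.0 or above, every bot gets a 503."""
    if server_load <= 2.0:
        return 0.0
    if server_load >= 8.0:
        return 1.0
    return (server_load - 2.0) / 6.0

def handle_request(user_agent: str, server_load: float, rng=random.random):
    """Return (status, extra_headers) for one request. Human visitors are
    never throttled, which matches the poster's priority on user traffic."""
    if is_bot(user_agent) and rng() < throttle_probability(server_load):
        # Retry-After tells well-behaved crawlers when to come back.
        return 503, {"Retry-After": "120"}
    return 200, {}
```

Pairing the 503 with a Retry-After header is the conventional way to signal a temporary condition, so crawlers back off instead of treating the error as a lasting site problem.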
White Hat / Black Hat SEO | internetwerkNU1
-
Is linking out to different websites with the same C-Block IP bad for SEO?
Many SEOs state that getting (too many) links from the same C-Block IP is bad practice and should be avoided. Is this also applicable if one website links out to different websites on the same C-Block IP? That is, websites A, B and C (on the same server) linking to website D (on a different server) could be seen as spam, but is it the same when website D links to websites A, B and C?
White Hat / Black Hat SEO | TT_Vakantiehuizen0
-
Indexing content behind a login
Hi, I manage a website within the pharmaceutical industry where only healthcare professionals are allowed to access the content. For this reason most of the content is behind a login. My challenge is that we have a massive amount of interesting and unique content available on the site, and I want healthcare professionals to find it via Google! At the moment, if a user tries to access this content they are prompted to register / login. My question is: if I look for the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming that it will. If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can have on those pages! I look forward to all of your suggestions, as I'm struggling for ideas now! Thanks Steve
White Hat / Black Hat SEO | stever9990
-
Powered by/Credit backlinks and nofollow
Pseudo question: I have a website that has 100K pages. On about 50K of those pages I have information that is fed to me by an outside 3rd-party website. Now, I like to give credit where credit is due, so I add a backlink to the website that is feeding me this content. A simple backlink like so: Information provided by: Company ABC. Now, this 3rd-party website wants me to remove the nofollow tags from the backlink, but I am very, very skeptical, because to me, sending ~50K dofollow backlinks to a single site might make the Google monster upset with me. This 3rd-party site is being very hard-headed about this, to the point where I am thinking of terminating the relationship altogether. I digress. I scoured the net before writing this but couldn't really find anything directly related to my issue. Thoughts? Is a nofollow required here? We're not talking 1 or 2 links here; we're talking tens of thousands (50K is low; it will probably be upwards of 100K when all is said and done, as my site has many, many pages). Thanks in advance.
White Hat / Black Hat SEO | THB0
-
Buying a website and redirecting everything
We are considering purchasing an existing website in our industry with a domain authority of 52 and 20K inlinks, and redirecting it to our new website with a domain authority of 26 and 1,000 inlinks. Would this be the best way to improve our new site's authority and inlinks? Would Google penalize us for doing that, or would it effectively transfer the old site's authority to us?
White Hat / Black Hat SEO | pbhatt0