Competitors and Duplicate Content
-
I'm curious to get people's opinions on this.
One of our clients (Company A) has a competitor that's using duplicate sites to rank. They're using "www.companyA.com" and "www.CompanyAIndustryTown.com" (actually, several such variations). It's basically duplicate content, with maybe a town name inserted or changed somewhere on the page. I was always told this is not a wise idea. They started doing this in the past month or so, when they had a site redesign. So far, it's working pretty well for them. So, here are my questions:
-Would you address this directly (report to Google, etc.)?
-Would you ignore this?
-Do you think it's going to backfire soon?
There's another company (Company B) that's using another practice: using separate pages on their domain to address different towns, and using those as landing pages. Similar, in that a lot of the content is the same, with just some town names and minor details changed. All on the same domain, though. Would the same apply to that?
Thanks for your insight!
-
The only long-lasting way to rank for location-specific pages is to offer truly unique content on those pages, and to build unique links to those pages.
The two methods you mentioned here, using near-duplicate sites and pages, may work for a short time or in non-competitive niches. They may also work somewhat if a very strong link profile is backing them up... but in general these sorts of tricks usually result in a drop in rankings, if not now, then during an upcoming algorithm change.
Oftentimes, misguided webmasters think they are doing the right thing in launching these sites and pages, with no ill intent. Unless the pages are obviously spam or doorway pages, in my opinion it's probably not worth reporting them to Google, but that decision is of course best left to each individual.
Read more about doorway pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Consider how Yelp has hundreds of pages about dentists, at least one for every major city in America. Although the pages are similar, each is filled with unique content and has unique links pointing to it. Each delivers a similar message, but provides unique value based on that particular location.
Add unique value to each location specific page, and you're doing great.
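As a rough illustration of why "a town name inserted or changed somewhere on the page" doesn't make content unique, here's a toy near-duplicate check using word-shingle overlap. This is a hypothetical sketch, not Google's actual algorithm, and the sample pages are invented:

```python
# Toy near-duplicate detection: compare pages by their overlapping
# k-word "shingles" (Jaccard similarity). Pages differing only by a
# swapped town name keep most shingles in common.

def shingles(text, k=3):
    """Return the set of k-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two pages' shingle sets, 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

page_a = "We offer expert plumbing services in Springfield with 24 hour support"
page_b = "We offer expert plumbing services in Shelbyville with 24 hour support"
page_c = "Our family-run bakery has served fresh bread downtown since 1982"

print(jaccard(page_a, page_b))  # 0.5 - half the shingles survive one word swap
print(jaccard(page_a, page_c))  # 0.0 - genuinely different content
```

On these one-sentence samples a single word swap still leaves half the shingles shared; on a full-length page where only the town name changes, the score approaches 1.0, which is exactly the signature a duplicate-content filter looks for.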
-
Unfortunately, this isn't a method likely to work.
Most of the time, if you insert canonical tags on near-identical pages, and Google interprets those canonicals correctly, it tends to index and rank the page that the canonical points to. So all of those other pages would have little or no search engine visibility.
Not a good technique if you're trying to rank individual pages.
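For reference, the canonical tag discussed here is a `<link rel="canonical">` element in the page's `<head>`. Below is a minimal sketch of auditing what a page's canonical actually points to, using only Python's standard library; the sample HTML and URLs are hypothetical:

```python
# Sketch: find which URL a page's rel=canonical points to, so you can
# see where Google would consolidate indexing and ranking signals.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# Hypothetical town landing page that canonicalizes to the home page.
town_page = """
<html><head>
  <title>Plumbing in Springfield</title>
  <link rel="canonical" href="https://www.example.com/">
</head><body>...</body></html>
"""

finder = CanonicalFinder()
finder.feed(town_page)
print(finder.canonical)  # prints https://www.example.com/
```

If every town page canonicalizes to the home page like this, only the home page remains a candidate to rank, which is the loss of visibility described above.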
-
So ARE you suggesting that for local city pages that you add the canonical tag to point to the home page?
I guess I'm a little confused on this, as Adam is.
Can you explain your thoughts behind this?
-
So let me clarify then: if they have (on the same domain) multiple pages with near-duplicate content, mostly changing the names of cities, but use rel=canonical, will they still get the SEO benefit of ranking for different towns without it being seen as duplicate content?
And then the multiple domain situation...that's just a wait and see.
-
Pages with city-specific information but similar content are pretty much the perfect case for a canonical tag. If you feel they haven't been penalized, this is probably the method they are using to host the same content.
-
Here is an example of sites that have been using duplicate content with a few word changes:
http://www.seomoz.org/q/duplicate-exact-match-domains-flagged-by-google-need-help-reinclusion
-
Having multiple sites with duplicate content is a bad idea, as it affects your search engine rankings. The company is likely using bad SEO practices, and sooner or later Google's bots will pick this up and the domains will get penalised.
You can report them to Google, but in most cases Google picks up sites that are using bad SEO techniques on its own.
There is no harm in using separate pages on one domain to address the different towns a business operates in, as this helps the site get found in local searches. But if that content is again duplicated with only a few words changed, Google will pick it up too.
Always remember: Content is KING!