Semi-duplicate content yet authoritative site
-
So I have 5 real estate sites.
One of those sites is of course the original, and it has more and better content on most pages than the others. I used to be top ranked for all of the subdivision names in my town. Then, when I built the next few sites, some did better than others for certain keywords, and three of them share essentially the same URL structure (apart from the actual domain) and aren't getting many visits. I loaned a couple of those sites to agents who work with me, to see whether a different name would help. The same YouTube video appears on the respective subdivision pages of my site and theirs, and their content is just my content rewritten, at about the same length.
I have also looked at a few competitors who only have one site. Their URL structures aren't good at all, and neither is their content, yet a good number of their pages rank higher than my main site, which is frustrating to say the least, since they are essentially copying me. I more or less set the precedent here: writing the content, mapping each neighborhood, noting how far each subdivision is from certain landmarks, and shooting a video of each. They have done much the same thing and are now ahead of me.
What advice could you give me? Right now I have two sites that are almost duplicates in terms of template and subdivisions covered, although I changed the content as best I could, and that second site is still getting pretty good visits. I originally did it to try to dominate the first page of the SERPs, and then Penguin and Panda came out and seemed to figure that game out. I would still like to keep all the sites, but I assume that would mean making each one unique, which seems tough given that my town has the same subdivisions either way.
Curious as to what the suggestions would be, as I have put a lot of time into these sites.
If I post my site here, will it show up in the SERPs?
Thanks in advance
-
Thanks Andy,
My thought here, as I just answered Russ, is to keep the two best-performing sites and add great content to one of them; that should help me, right? I ask because the other, similar site still gets moderately good traffic, and it belongs to one of my other agents, so it isn't really tied to me. By making my own site better and better, I'll make hers less and less similar, right?
-
Thanks Russ,
That is pretty much what I ended up doing with my site: I made it a bit overwhelming for anyone to compete with me. Then, as I got super busy (because of the work I had put into the site), others copied my sites and started doing what I was doing. I still have higher Domain Authority than they do, and as I get more time, like I have now, I will be doing more writing, researching, etc. That has been my goal all along. The one problem is that one of my agents has a site pretty similar to mine, and she still gets great leads and makes money from it, so I don't want to quickly can that site even though it's very similar to mine. Thoughts on that?
-
Hi,
I would strongly urge you not to run multiple sites. If Google catches on to this, you will end up with a penalty on all of them. It would be seen as an attempt to 'game' the search engine by finding ways to get more exposure for keywords. You will also find that you are competing against your own sites for the same keywords, which is never a position you want to be in.
Look at it through Google's eyes: if they were to ask you why you have five sites, what would your answer be? I promise you they wouldn't let you get away with it.
The best thing to do here is to ditch all but one site and concentrate, as Russ suggests, on amazing content. Pull everything back into one domain and create pages that target specific keywords, but don't make it spammy. Well-researched content wins every time and can earn great SERP placements.
-Andy
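For reference, "pulling everything back to within one domain" is usually implemented with permanent (301) redirects, mapping each page on the retired domains to its counterpart on the main site so existing links and rankings are consolidated. A minimal sketch using nginx, with entirely hypothetical domain names:

```nginx
# Hypothetical config: retire a secondary domain and send every one of
# its URLs (path and query string included) to the matching page on the
# main site with a permanent 301 redirect.
server {
    server_name secondary-site.example www.secondary-site.example;
    return 301 https://www.main-site.example$request_uri;
}
```

This assumes the retired site's URL paths match the main site's; if they differ, individual `rewrite` or `location` rules per page would be needed instead of the blanket `$request_uri` mapping.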
-
Focus on one site for the time being and make it, without question, objectively the best site in your area. Blow the competition out of the water. I'm talking thousands of words of content, interviews with neighborhood HOA members, guides to neighborhood schools, recommendations of local restaurants and businesses (great backlink opportunities), syndicated content from GreatSchools or other sites, etc. Literally make it so no other website even wants to compete with yours. Be the best. Don't divide and conquer; just conquer.
Related Questions
-
Would this case be considered duplicate content, and what would the consequences be?
At https://authland.com/, a food & wine tours and activities booking platform, the primary content consists of service thumbnails showing the destination, title, and price of each service, and these can be found on several sub-pages/URLs. For example, the thumbnail/card for the service https://authland.com/zadar/zadar-region-food-and-wine-tour/1/ appears on multiple pages (Categories, Destinations, All services, Most recent services, and so on). Is this considered duplicate content, given that every service thumbnail on the platform appears on multiple pages? If it is, what would be the best way to keep Google's bots from perceiving the content that way? Thank you very much!
Intermediate & Advanced SEO | ZD2020
Are backlinks within duplicate content ignored or devalued?
From what I understand, Google no longer has a "duplicate content penalty"; instead, duplicate content simply isn't shown in the search results. Does that mean any links within the duplicate content are completely ignored, or are they devalued as far as the backlink profile of the site they link to? An example would be an article published on two or three major industry websites. Are only the links from the first website Googlebot discovers the article on counted, or are all the links counted and you just won't see the article itself come up in search results for the second and third websites?
Intermediate & Advanced SEO | Consult1901
SEMrush & Duplicate Content
Hi, SEMrush is flagging these pages as having duplicate content, but we have rel="next" etc. implemented: https://www.key.co.uk/en/key/brand/bott https://www.key.co.uk/en/key/brand/bott?page=2 Or is it being flagged because they're just really similar pages?
Intermediate & Advanced SEO | BeckyKey
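For context on the question above: pagination annotations like the ones mentioned are `<link>` elements in each page's `<head>`, pointing at the neighboring pages in the series. A minimal sketch (the `page=3` URL is hypothetical; note also that Google announced in 2019 that it no longer uses rel="next"/"prev" as an indexing signal, so a crawler like SEMrush may still judge the pages on content similarity alone):

```html
<!-- On page 1: https://www.key.co.uk/en/key/brand/bott -->
<link rel="next" href="https://www.key.co.uk/en/key/brand/bott?page=2">

<!-- On page 2: https://www.key.co.uk/en/key/brand/bott?page=2 -->
<link rel="prev" href="https://www.key.co.uk/en/key/brand/bott">
<link rel="next" href="https://www.key.co.uk/en/key/brand/bott?page=3">
```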
Best method for blocking a subdomain with duplicated content
Hello Moz Community, hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google:
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The pages are identical, so we can't add a noindex or nofollow to the subdomain alone. I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update robots.txt with a user-agent disallow for the subdomain, but the robots.txt file is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to that file? It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but that does not look possible. What approach would you recommend?
Intermediate & Advanced SEO | KateWaite
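Worth noting for questions like the one above: robots.txt is fetched per host, so a disallow rule in the www domain's robots.txt will not stop crawling of the admin subdomain. The admin host needs its own robots.txt response (and/or an `X-Robots-Tag` header), which can be generated at the web-server layer even when the CMS can't serve a separate file. A minimal sketch, assuming a hypothetical nginx server block for the subdomain:

```nginx
# Hypothetical server block for the admin subdomain only.
server {
    server_name admin.naturalworldsafaris.com;

    # Serve a subdomain-specific robots.txt that blocks all crawlers,
    # even though the application code is shared with the www site.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Also ask engines to drop any pages they have already crawled.
    add_header X-Robots-Tag "noindex, nofollow" always;
}
```

One caveat: a robots.txt disallow prevents crawling but not necessarily deindexing of already-known URLs, which is why the `X-Robots-Tag` header is included as well.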
Duplicate Content For E-commerce
On our e-commerce site, we have multiple stores, and products are shown across them, which has created a duplicate content problem. Basically, if we list a product, say a shoe, that listing will show up on our multiple stores. I assumed the solution would be to redirect the pages, use nofollow tags, or use the rel=canonical tag. Are there any other options? I think my best bet is to use a mixture of 301 redirects and canonical tags. What do you recommend? I have 5,000+ pages of duplicate content, so the problem is big. Thanks in advance for your help!
Intermediate & Advanced SEO | pinksgreens
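For reference on the canonical-tag option mentioned above: rel=canonical is a single `<link>` element in each duplicate page's `<head>` pointing at the preferred URL, so it scales to thousands of pages if the template can emit it. A minimal sketch with made-up store URLs:

```html
<!-- On the duplicate listing, e.g. https://store-two.example.com/shoes/runner-x -->
<head>
  <!-- Tell search engines which copy of this product page is preferred -->
  <link rel="canonical" href="https://store-one.example.com/shoes/runner-x">
</head>
```

Unlike a 301 redirect, this keeps the duplicate page usable by visitors while consolidating ranking signals onto the canonical URL.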
Ezine Articles - Copied Content on Site
One of my clients has a bunch of articles on the usual article syndication sites, and they have duplicated these articles on their own site. My instinct is to implement on the offending pages of the client's site. Anyone with any experience of something similar? Is this the right way to go? Thanks in advance. Justin
Intermediate & Advanced SEO | GrouchyKids
Diagnosing duplicate content issues
We recently made some updates to our site, one of which involved launching a bunch of new pages. Shortly afterwards we saw a significant drop in organic traffic. Some of the new pages list content similar to what previously existed on our site, but in different orders. So our question is: what's the best way to diagnose whether this was the cause of our ranking drop? My current thought is to block the new directories via robots.txt for a couple of days and see if traffic improves. Is this a good approach? Any other suggestions?
Intermediate & Advanced SEO | jamesti
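For what it's worth, blocking a set of new directories for a test like the one described above is only a few lines in the site's robots.txt (the directory names here are made up for illustration):

```robots.txt
# Temporarily block crawling of the newly launched sections
User-agent: *
Disallow: /new-section-a/
Disallow: /new-section-b/
```

One caution with this approach: robots.txt stops crawling, not ranking, so any traffic change over just a couple of days may reflect crawl behavior rather than a clean answer about the duplicate content.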
Duplicate content ramifications for country TLDs
We have a .com site here in the US that ranks well for targeted phrases. The client is expanding its sales force into India and South Africa, and they want to duplicate the site entirely, twice, once for each country. I'm not well-versed in international SEO. Will this trigger a duplicate content filter? Would google.co.in and google.co.za look at google.com's index for duplication? Thanks. Long-time lurker, first-time question poster.
Intermediate & Advanced SEO | Alter_Imaging
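For reference on the country-TLD question above: the usual way to run near-identical sites targeted at different countries is hreflang annotations, which tell search engines the pages are localized alternates rather than duplicates. A minimal sketch with hypothetical domains; each page in the set carries the full list, including a self-reference:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/">
<link rel="alternate" hreflang="en-in" href="https://www.example.co.in/">
<link rel="alternate" hreflang="en-za" href="https://www.example.co.za/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```

The `x-default` entry names the fallback page for users who match none of the listed locales.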