Should I do something about this duplicate content? If so, what?
-
On our real estate site we have our office listings displayed. The listings are generated from a scraping script that I wrote. As such, all of our listings have the exact same description snippet as every other agent in our office. The rest of the page consists of site-wide sidebars and a contact form. The title of the page is the address of the house and so is the H1 tag.
Manually changing the descriptions is not an option.
Do you think it would help to have some randomly generated stuff on the page such as "similar listings"?
Any other ideas?
Thanks!
-
Until your site is the KickAss site in your SERPs, just add something catchy to the title tag like "Schedule a Tour!"... or... "Free Beer"... or... "See it Today!"
-
Right... after your site is established this might not be a problem. I know that your site is relatively new and that it will become the KickAss site in your SERPs.
Don't do obsessive SEO if you can do efficient SEO.
-
Thank you! You've got some great points!
I like the idea of having both the address and the MLS in the title and then reversing them for the H1.
For the photos I have the address as my alt tag. I could certainly add the MLS too.
-
Oooh. I like this thought. Right now for most of these searches we are on the front page but not #1. However, this is a brand new site and I haven't built any links to it. So, perhaps, once I've got links and my site is viewed as the "kickass site in the niche" then the duplication will only be a problem for the other realtors?
-
The property address is most important, and I would definitely use that in the title. You'll find the MLS # to be almost as important. Why not include both in the title? Then reverse the order for the H1?
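Since the listings already come out of a scraping script, the title/H1 pair above could be generated automatically. A minimal sketch, assuming the scraper produces address and MLS fields (the dict keys and sample values here are hypothetical placeholders, not the OP's actual data):

```python
# Build a <title> and an <h1> from scraped listing data, with the
# order reversed so the two tags never match exactly.
# The dict keys below are placeholders for whatever the scraper outputs.
listing = {"address": "123 Main St, Springfield", "mls": "123456"}

title = f"{listing['address']} | MLS #{listing['mls']}"
h1 = f"MLS #{listing['mls']} | {listing['address']}"

print(title)  # 123 Main St, Springfield | MLS #123456
print(h1)     # MLS #123456 | 123 Main St, Springfield
```

Both tags carry the two identifiers people actually search for, while the reversed order keeps the title and H1 from being identical strings.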
I wouldn't be too concerned about duplicate content. I'm not sure about your area but most areas have an MLS that is syndicating the listings to hundreds, if not thousands, of sites which all use the same description.
In working with real estate sites I also found that "house for sale on {street name}" or "home for sale on {street name}" tended to drive traffic to the individual property pages.
What are you doing with the property photos? I'd optimize those as well for the property address and MLS number.
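Optimizing the photos could be scripted the same way: generate an alt attribute per image that combines the address and MLS number. A hedged sketch (the function name and field values are again placeholders, not anything from the OP's site):

```python
# Generate alt text for each property photo, combining the address
# and MLS number so every image carries both identifiers.
def photo_alt(address, mls, index):
    return f"{address} - MLS #{mls} - photo {index}"

# Example: alt text for the first two photos of a hypothetical listing.
alts = [photo_alt("123 Main St", "123456", i) for i in range(1, 3)]
for alt in alts:
    print(alt)
# 123 Main St - MLS #123456 - photo 1
# 123 Main St - MLS #123456 - photo 2
```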
-
Go out into the SERPs. See what's happening.
If you have the kickass site in the niche, your page for this home might rank well.
Other guy's problem, not yours.
-
LOL...this is why I was asking the question. Is there anything I can do to help other than manually changing the descriptions?
-
That's even worse.
-
Whoah! You definitely don't want that...
-
Oh...I may have worded my question incorrectly! The content is not duplicated across my site. Rather, the home description is the exact same content as on several other realtors' sites.
-
You can always just have the content indexable on one page and add it to an image for all the other pages.
-
I'd love to discuss this...in fact, I'm going to start a new discussion on it!
-
It's not that, it's just that it's potentially damaging (sorry, I'm quoting that Market Motive seminar again... been doing that a lot lately lol) to have an H1 and title tag that match.
-
Interesting idea. We do get hits because of the content in the description though. For example, we get a lot of hits for "in law suite".
-
Good idea, or have it in an iframe!
-
Is it possible for you to put that listing content in an image? This would allow you to continue using identical content on all pages. However, the content in the image would not be searchable. If you are just using this content for the user experience, that's fine. If you want it indexed to add quality to the page, you will instead want to make each listing unique.
-
I guess it makes sense to have a different H1. What do you think would be most effective? I think the title should be the house address as this is most likely to be searched. Perhaps the H1 could be "MLS #123456"?
-
I don't know the answer to the actual question, but I do know that you should never have the title and H1 match, or have duplicate meta descriptions... but you already know that.