Duplicate content issue
-
I have recently built a site with a main page intended to rank for national coverage. The site also has a number of pages targeted at local searches; these pages are slight variations of each other with town-specific keywords. Does anyone know if Google will treat this as spam and keep my site from ranking? Thanks
-
You want to rank for local searches, right? The question is whether you have a physical presence in those places. If not, making city-specific pages just to rank for those terms will sooner or later invite a penalty. Think about the customers first, not the search engines.
Now, if you do have branches in those cities, you can create a Google Local listing for each and build separate landing pages for them, provided those pages say something unique about the business in each location. Do not add rehashed content that no one is going to read. Focus on adding value to the user's experience.
-
Creating a site with multiple landing pages targeted at different regions is nothing new, so Google has made updates to try to stop low-quality sites from capitalizing on localized keywords (Miami x, Tucson x, San Diego x, etc., where x is your main keyword).
This means you need to do more than simply duplicate your pages, swap the local terms, and create new URLs, titles, and descriptions. Instead, write completely unique copy for each page, add dynamic content and/or user engagement where you can, build local citations for each landing page, and earn local backlinks to each landing page.
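If you want a rough, do-it-yourself check on how similar those town pages really are before publishing, something like the sketch below can help. It is only an illustration, not anything Google uses: the folder name and the 0.9 threshold are assumptions, and Python's standard-library difflib is a crude text-similarity heuristic.

```python
# Rough duplicate-content check across local landing pages.
# Assumptions: pages are HTML files in a local folder, and a 0.9 similarity
# ratio is an arbitrary "too similar" threshold -- tune both to taste.
import difflib
import pathlib
import re
from itertools import combinations

def visible_text(path: pathlib.Path) -> str:
    """Crudely strip tags and collapse whitespace so only the copy is compared."""
    html = path.read_text(encoding="utf-8", errors="ignore")
    text = re.sub(r"<[^>]+>", " ", html)        # drop markup
    return re.sub(r"\s+", " ", text).strip().lower()

def flag_near_duplicates(folder: str, threshold: float = 0.9) -> None:
    pages = sorted(pathlib.Path(folder).glob("*.html"))
    for a, b in combinations(pages, 2):
        ratio = difflib.SequenceMatcher(None, visible_text(a), visible_text(b)).ratio()
        if ratio >= threshold:
            print(f"{a.name} vs {b.name}: {ratio:.0%} similar -- rewrite one of them")

if __name__ == "__main__":
    flag_near_duplicates("local-landing-pages")  # hypothetical folder of town pages
```

Anything a check like this flags as nearly identical is a good candidate for a genuine rewrite rather than another keyword swap.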
-
While I certainly don't want to pretend I can predict anything Google might do, the fact that you already see this as a potential problem should be enough to make you consider your options. Depending on how many pages you have, it may not be that difficult to get genuinely original content produced for those other pages.
Will Google choose not to index you? I have no idea.
My guess is that you will get indexed but may not rank very high if the content is substantially similar across all of those pages. You might get stuck in the proverbial "sandbox" (ranked so low that no one can find you).
My gut says that if you have to ask "is this duplicate content?", it probably is, so make it unique.
Related Questions
-
Regarding rel=canonical on duplicate pages on a shopping site... some direction, please.
Good morning, Moz community: My name is David, and I'm currently doing internet marketing for an online retailer of marine accessories. While many product pages and descriptions are unique, some have their descriptions duplicated across many products. The advice commonly given is to leave one page as is / crawlable (probably best for the one that is already ranking/indexed) and use rel=canonical on all the duplicates. Any direction on this? Do you think it is necessary? It will be a massive task. (Also, one of the products we rank highest for has tons of duplicate descriptions... so that is sort of evidence against the idea?) Thanks!
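Not an official recommendation, just a rough way to scope that "massive task": a short script could group product URLs by a fingerprint of their description and print which pages would carry a rel=canonical pointing at a chosen primary. The URL list, the .product-description selector, and the "first URL in the group wins" rule below are all assumptions for illustration, and the sketch relies on the third-party requests and beautifulsoup4 packages.

```python
# Sketch: group product pages that share a description so rel=canonical
# targets can be planned. Assumes a .product-description element and that the
# first (e.g. best-ranking) URL in each group becomes the canonical target.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def description_fingerprint(url: str) -> str | None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    node = soup.select_one(".product-description")  # assumed selector
    if node is None:
        return None
    text = " ".join(node.get_text().lower().split())
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

def plan_canonicals(product_urls: list[str]) -> None:
    groups: dict[str, list[str]] = defaultdict(list)
    for url in product_urls:
        fp = description_fingerprint(url)
        if fp:
            groups[fp].append(url)
    for urls in groups.values():
        if len(urls) > 1:
            primary, *duplicates = urls
            for dup in duplicates:
                print(f'{dup}: add <link rel="canonical" href="{primary}">')

if __name__ == "__main__":
    plan_canonicals([
        "https://example.com/products/anchor-a",  # hypothetical URLs
        "https://example.com/products/anchor-b",
    ])
```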
Web Design | DavidCiti
-
Redirects Not Working / Issue with Duplicate Page Titles
Hi all, we are being penalised in Webmaster Tools and Crawl Diagnostics for duplicate page titles and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so (!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to HTTPS. Unfortunately, it seems that because of this short period with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise the new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
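One way to reassure yourself that the corrected redirect is what crawlers now see is to walk the redirect chain for a few URLs and confirm each HTTP address answers with a single 301 to its HTTPS equivalent. The sketch below is only an illustration: the domain and paths are placeholders and it uses the third-party requests package; it verifies what the server sends today, it cannot make Google forget the old redirect any faster.

```python
# Sketch: verify that http:// URLs now answer with a single 301 to their
# https:// equivalents. The domain and paths are hypothetical placeholders.
from urllib.parse import urljoin

import requests

def check_redirect(http_url: str) -> None:
    url, hops = http_url, []
    while len(hops) < 10:  # follow hops manually so the whole chain is visible
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break
        target = urljoin(url, resp.headers["Location"])
        hops.append((resp.status_code, target))
        url = target
    expected = http_url.replace("http://", "https://", 1)
    ok = len(hops) == 1 and hops[0] == (301, expected)
    print(f"{http_url}: {'OK, single 301 to HTTPS' if ok else 'check chain: ' + repr(hops)}")

if __name__ == "__main__":
    for path in ("/", "/about", "/contact"):  # hypothetical paths to spot-check
        check_redirect("http://www.example.com" + path)
```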
Web Design | HireSpace
-
Any alternative techniques to display tabbed content without using JavaScript / JSON and still be SEO-friendly?
John Mueller's input in the EGWMH hangout suggests that Google MAY ignore expandable content served by JavaScript. Are there any alternative techniques for displaying tabbed content without using JavaScript / JSON that are still SEO-friendly? I do, however, view these techniques as good for website interactivity and UX, and I see many examples of websites performing well and ranking highly whilst using them. Are there any Google-friendly ways to serve content on a page so that search bots can recognise and choose to crawl / consume the content as legitimate fodder?
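A quick sanity check along those lines is to fetch the raw, server-rendered HTML (what a crawler that does not execute JavaScript receives) and confirm the tabbed copy is actually in the source. The sketch below is just an illustration: the URL and sample phrases are placeholders, and it depends on the third-party requests and beautifulsoup4 packages.

```python
# Sketch: check whether "hidden" tab copy is present in the server-rendered
# HTML, i.e. visible to crawlers that don't execute JavaScript.
import requests
from bs4 import BeautifulSoup

def tab_copy_in_source(url: str, phrases: list[str]) -> None:
    html = requests.get(url, timeout=10).text
    text = " ".join(BeautifulSoup(html, "html.parser").get_text().split()).lower()
    for phrase in phrases:
        found = phrase.lower() in text
        print(f'"{phrase}": {"in raw HTML" if found else "NOT in raw HTML (likely injected by JS)"}')

if __name__ == "__main__":
    tab_copy_in_source(
        "https://www.example.com/product",            # hypothetical page
        ["full specification", "delivery and returns"],  # copy that lives in tabs
    )
```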
Web Design | Fergclaw
-
How to add SEO Content to this site
Hi, great community, and I hope you guys can help! I have just started on an SEO project for http://bit.ly/clientsite ; the client's required initial KPI is search engine rankings at a fairly low budget. The term I use for the site is a "blurb site": the content is thin, and the initial strategy I want to employ to get the keyword rankings is to utilize content. The plan is to add targeted, quality (user experience & useful) SEO content on the page itself by adding a "read more" link/button to the "blurb" on the right of the page (see pink text in image). When someone clicks "read more", a box of content will slide out, styled much the same as the blurb itself, and appear next to and/or overlay the blurb and most of the page (see pink rectangle in image). Question: is this layer of targeted, quality (user experience & useful) SEO content (which requires an extra click to get to) going to get the same SEO power/value as if it were displayed traditionally on the initial load? If not, would it be better to create a second page (2nd layer), have the read-more link to that, and then rel-canonical the blurb to that 2nd page, so that all the SEO value passes to this expanded content and the second page/layer is what shows up in the rankings? Thanks in advance
Web Design | Torean
-
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, so far I have found this one: http://unused-css.com/ . It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages... so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they only do one page at a time? Thanks!
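For what it's worth, a rough home-grown pass is possible too: hash each rule's declaration block to spot duplicates, and compare class selectors against the classes actually used in a sample of downloaded pages. The sketch below is only a naive illustration over local files; its regex parsing ignores media queries, JavaScript-added classes, and plenty of other cases, so treat its output as candidates to review rather than a verdict.

```python
# Sketch: naive duplicate / unused CSS finder over local files.
# Regex-based parsing only -- no media queries, nesting, or JS-added classes.
import pathlib
import re
from collections import defaultdict

def find_duplicate_and_unused(css_dir: str, html_dir: str) -> None:
    rule_re = re.compile(r"([^{}]+)\{([^{}]*)\}")
    bodies: dict[str, list[str]] = defaultdict(list)
    class_selectors: set[str] = set()

    for css in pathlib.Path(css_dir).glob("*.css"):
        for selector, body in rule_re.findall(css.read_text(errors="ignore")):
            normalized = ";".join(sorted(p.strip() for p in body.split(";") if p.strip()))
            bodies[normalized].append(selector.strip())
            class_selectors.update(re.findall(r"\.([\w-]+)", selector))

    used_classes: set[str] = set()
    for page in pathlib.Path(html_dir).glob("*.html"):
        for attr in re.findall(r'class="([^"]*)"', page.read_text(errors="ignore")):
            used_classes.update(attr.split())

    for body, selectors in bodies.items():
        if body and len(selectors) > 1:
            print("Duplicate declarations shared by:", ", ".join(selectors))
    for cls in sorted(class_selectors - used_classes):
        print(f"Possibly unused class: .{cls}")

if __name__ == "__main__":
    find_duplicate_and_unused("css", "crawled-pages")  # hypothetical local folders
```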
Web Design | danatanseo
-
Pagination - Crawl Issue
Hi, we have a site with a large number of products (6,000+) under each category, so we have made a page under each category that lists out all the products (a "view all" page), with pagination built on Ajax. The problem is that only our first page is crawlable; all the pages beyond the first remain hidden. We need to make all our pagination URLs crawlable, but we never want the URL to change as the user goes to the next page; we want to show the user the same URL for all the pagination numbers. Is there a perfect solution?
Web Design | semvibe
-
How can I write content-rich descriptions?
We have recently started using SEOmoz. How can I make descriptions more content-rich?
Web Design | WCGAdmin
-
Custom URLs with Bigcommerce Issue (Is it worth it?)
We're building out a store in Bigcommerce, which for all intents and purposes is perfect for SEO, except that you cannot change the URLs to be custom. My question is: does this kill the SEO value of Bigcommerce, despite everything else being great? For example, the URL for a category page would be something like www.mysite.com/categories/keyword, and product URLs are pulled from the product name, so a product URL could be something like www.mysite.com/products/Product-Description-Long-223.html (notice the words are capitalized and there is no way to remove the trailing .html). I could go with Interspire (the licensed version of Bigcommerce) or Magento so I can custom-edit this stuff, but then it's a lot more work for my employees on the buildout.
Web Design | iAnalyst.com