Duplicate content issue
-
I recently built a site whose main page is intended to rank for national coverage. The site also has a number of pages targeted at local searches; these pages are slight variations of each other with town-specific keywords. Does anyone know if Google will see this as spam and penalize my site so it can't rank? Thanks
-
You want to rank for local searches, right? The question is: do you have a physical presence in those places? If not, making city-specific pages just to rank for those terms will sooner or later invite a penalty. Think about the customers first, not the search engines.
Now, if you do have branches in those cities, you can create Google Local listings and separate landing pages for them, provided those pages say something unique about the business in each location. Do not add rehashed content that no one is going to read. Focus on adding value to users' experience.
-
Creating a site with multiple landing pages targeted at different regions is nothing new, so Google has made updates to try to stop low-quality sites from capitalizing on localized keywords (Miami x, Tucson x, San Diego x, etc.), where x is your main keyword.
This means you need to do more than simply duplicate your pages, swap in the local terms, and create new URLs, titles, and descriptions. Instead: write completely unique copy for each page; add dynamic content and/or user engagement; build local citations for each landing page; and make sure to get local backlinks to each landing page.
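To make that concrete, here is a minimal sketch (not a tool Google or Moz provides) of how you might flag landing pages that are "slight variations of each other" before publishing them. The sample copy and the 0.8 threshold are my own illustrative assumptions:

```python
# Rough pre-publish check: flag city pages that differ only by a swapped
# location keyword. Uses Python's stdlib difflib; the 0.8 threshold and
# the sample page copy are arbitrary examples, not a Google rule.
import difflib

def similarity(page_a: str, page_b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two blocks of page copy."""
    return difflib.SequenceMatcher(None, page_a.lower(), page_b.lower()).ratio()

miami = "We repair roofs in Miami with a licensed local crew and fast quotes."
tucson = "We repair roofs in Tucson with a licensed local crew and fast quotes."

score = similarity(miami, tucson)
print(f"Pages are {score:.0%} similar")
if score > 0.8:
    print("Near-duplicate copy: rewrite before publishing.")
```

Swapping only the city name leaves the two pages scoring well above the threshold, which is exactly the pattern the updates described above are meant to catch.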
-
While I certainly don't want to pretend I can predict anything Google might do, the fact that you are already thinking about this as a potential problem should be enough to make you consider some options. Depending on how many pages you have, it may not be that difficult to get truly original content produced for those other pages.
Will Google choose not to index you? I have no idea.
My guess is that you will get indexed, but may not rank very high if the content is substantially similar across all of those pages. You might get stuck in the proverbial "sandbox" (ranked so low that no one can find you).
My gut says: if you have to ask "is this duplicate content?", it probably is, so make it unique.