Duplicate Content
-
Hi, I have a website with over 500 pages.
The website is a home service website that services clients in different areas of the UK.
My question is: can I take the pages down from my current URL and leave them down for, say, a week, so that when Google's bots crawl them, the pages no longer exist?
Can I then re-upload them to a different website URL without Google penalising me for duplicate content?
I know I would have lost link juice and PageRank, but that doesn't really matter, because the site has taken a knock since the Google update.
Thanks for your help.
Chris,
-
Just to second what Mike said - it's always tough to speak in generalities, but I can't think of any benefit to this approach. Typically, 301s are the preferred method for changing URLs. If you just kill the old pages and introduce new ones with the same content, you may not only experience some short-term duplicate content issues, but also lose the inbound links and ranking signals pointing to those old URLs.
Are you concerned about transferring a penalty via 301s? I'm just not clear on what the goal is here.
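If you do decide to move the pages properly, it's worth spot-checking that each old URL really returns a 301 to its new home rather than a 404 or a duplicate 200. A minimal sketch, assuming the Python requests library and hypothetical URLs:

```python
import requests

# Hypothetical old -> new URL pairs; swap in your own mapping.
REDIRECT_MAP = {
    "https://old-site.example/london-plumbing": "https://new-site.example/london-plumbing",
    "https://old-site.example/leeds-plumbing": "https://new-site.example/leeds-plumbing",
}

for old_url, expected in REDIRECT_MAP.items():
    # allow_redirects=False lets us inspect the first response directly.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected
    print(f"{'OK' if ok else 'CHECK'}: {old_url} -> {resp.status_code} {location}")
```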
-
In good conscience, I can't see how that would be a good ongoing process to move forward with. What I would do is start an overhaul-and-merge project: identify the most valuable unique content within the site and start merging the less valuable content pages into the stronger ones. Also 301 any old pages to like items or the home page, and make sure 404s and 410s are in place as well.
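To keep an overhaul like that honest, a quick audit script can confirm each retired URL returns what you intended: a 301 for pages merged into stronger ones, a 410 for pages removed for good. A minimal sketch, assuming the Python requests library and hypothetical URLs:

```python
import requests

# Hypothetical audit list: each retired URL and the status it should return.
RETIRED_PAGES = {
    "https://example.co.uk/boiler-repair-old": 301,     # merged into a like item
    "https://example.co.uk/discontinued-service": 410,  # removed permanently
}

for url, expected in RETIRED_PAGES.items():
    code = requests.head(url, allow_redirects=False, timeout=10).status_code
    flag = "OK" if code == expected else "CHECK"
    print(f"{flag}: {url} returned {code}, expected {expected}")
```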
With that many pages, I know it can be a daunting task to think about, but if you truly care for the site (and it seems that you do), I would not take your proposed approach, as it can lead to further damage down the road. I would be focused on making sure I don't get hit again by Panda and Penguin.
In the end, if it seems like a spammy or fishy tactic, it probably is, or will be treated as one at some point.
Good luck my friend
Related Questions
-
Moving content from non-performing site to performing site - without 301 redirection
I have two different websites: one has a good amount of traffic, and the other has no traffic at all but lots of valuable content. I want to move the content from the non-performing site to the performing site (I don't want to redirect). My only concern is duplicate content. I was thinking of setting the pages to "noindex" on the original website and waiting until they no longer appear in Google's index. Then I'd move them over to the performing domain to be indexed again. Will this create any copied-content issue? What should I take care of when moving content from one site to another?
White Hat / Black Hat SEO | HuptechWebseo
-
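On the mechanics of that plan, it helps to confirm the noindex directive is actually being served before waiting for the pages to drop out of the index. A rough sketch, assuming the Python requests library and a hypothetical URL (the regex only catches the common attribute ordering of the meta tag):

```python
import re
import requests

# Hypothetical URL on the non-performing site.
url = "https://non-performing.example/valuable-article"
html = requests.get(url, timeout=10).text

# Look for a robots meta tag whose content includes "noindex".
noindexed = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    html,
    re.IGNORECASE,
)
print("noindex is being served" if noindexed else "noindex NOT found")
```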
Would it be a good idea to duplicate a website?
Hello, here is the situation: let's say we have a website, www.company1.com, which is one of three main online stores catering to a specific market. In an attempt to capture a larger market share, we are considering opening a second website, say www.company2.com. The two websites would have different URLs but offer the same products for sale to the same clientele. With this second website, the theory is that instead of operating one of three stores, we would now operate two of four. We see two ways of doing this: launch www.company2.com as a copy of www.company1.com, or launch www.company2.com as a completely different website. The problem I see with either approach is duplicate content. I think the duplicate content issue would be even more of a problem with the first approach, where the entire site is mostly a duplicate. With the second approach, I think it can be worked around by having completely different product pages and an overall different website structure. Do you think either of these approaches could result in penalties from the search engines? Furthermore, we all know that higher rankings and increased traffic can be achieved through high-quality unique content, a social media presence, ongoing link building, and so on. Now, assuming we have a fixed amount of manpower for these tasks, do you think we have better odds of increasing our overall traffic by splitting the manpower across two websites, or putting it all behind a single one? Thanks for your help!
White Hat / Black Hat SEO | yacpro13
-
Content website of the year 2009 ....
I own a network of travel sites. After all the changes that have happened over the past 12 months or so, I am really starting to wonder whether my sites are worthless. I mean, let's be honest here: I understand what Google is doing. So I ask myself, if I wasn't trying to make a living with Google AdSense and affiliate sites, would I still have these travel sites? The truth is no. So should I forget about my content sites? They're a bunch of mostly useless content - well, some interesting information, but a travel guide like many others online. What do you think? Is it better now to focus on a product site, or to create one good website rather than a network of sites that worked very, very well for the past 10 years?
White Hat / Black Hat SEO | sandyallain
-
Noindexing Thin Content Pages: Good or Bad?
If you have masses of pages with super-thin content (such as pagination pages) and you noindex them, then once they are removed from Google's index (and given these pages aren't viewable to the user and/or don't get any traffic), is it smart to remove them completely (404), or is there any valid reason to keep them? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl them and notice the noindex tag? If you noindex them and then remove them from the sitemap, can Google still recrawl and recognize the noindex tag on its own?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
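On the sitemap point: one common tactic is to keep the noindexed URLs in a temporary sitemap so crawlers revisit them and see the noindex sooner, then retire that sitemap once the pages have dropped out. A minimal sketch (hypothetical URLs) that writes such a sitemap with Python's standard library:

```python
from xml.sax.saxutils import escape

# Hypothetical noindexed pagination URLs to keep temporarily discoverable.
noindexed_urls = [
    "https://example.com/category?page=2",
    "https://example.com/category?page=3",
]

entries = "\n".join(
    f"  <url><loc>{escape(u)}</loc></url>" for u in noindexed_urls
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("noindex-sweep-sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```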
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi all, I'll premise this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). We are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each location has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, our pages are dynamic, being crawled, and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, H1, H2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, if we made the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location. As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
-
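The engineer's randomized-template idea can be made deterministic, so each page keeps a stable variant between crawls instead of shuffling on every request. A minimal Python sketch; the template wording is illustrative, not the poster's actual copy:

```python
import random

# Illustrative standardized paragraph formats with varying components.
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(topic: str, city: str, state: str) -> str:
    # Seed on the page's identity so the same page always gets the
    # same variant, rather than changing on every request.
    rng = random.Random(f"{topic}|{city}|{state}")
    return rng.choice(TEMPLATES).format(topic=topic, city=city, state=state)

print(intro_paragraph("SharePoint", "New York", "NY"))
print(intro_paragraph("SharePoint", "Washington", "DC"))
```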
Schema.org tricking and duplicate content across domains
I've found the following abuse, and I'm curious what I can do about it. Basically the scheme is:
1. own some content only once (pictures, descriptions, reviews, etc.)
2. use different domain names (no problem if you use the same IP or IP C-class)
3. have a different layout (this is basically the key)
4. use schema.org tricking, meaning show (the very same) reviews on a different scale, and show slightly fewer reviews on one site than on another
Quick example: http://bit.ly/18rKd2Q
#2: budapesthotelstart.com/budapest-hotels/hotel-erkel/szalloda-attekintes.hu.html (217.113.62.21), 328 reviews, 8.6 / 10
#6: szallasvadasz.hu/hotel-erkel/ (217.113.62.201), 323 reviews, 4.29 / 5
#7: xn--szlls-gyula-l7ac.hu/szallodak/erkel-hotel/ (217.113.62.201), no reviews shown
It turns out that this tactic, even without the 4th step, can be quite beneficial for ranking with several domains. Here is a little investigation I've done (not really extensive, it took around an hour and a half, but quite shocking nonetheless): https://docs.google.com/spreadsheet/ccc?key=0Aqbt1cVFlhXbdENGenFsME5vSldldTl3WWh4cVVHQXc#gid=0
Kaspar Szymanski from the Google webspam team said that they have looked into it and will do something, but honestly I don't know whether to believe it or not. What do you suggest? Should I leave it, and try to copy this tactic to rank with the very same content multiple times? Should I deliberately cheat with markup? Should I play nice and hope that these guys will sooner or later be dealt with? (Honestly, I can't see that one working out.) Should I write a case study for this, so that if the tactic gets bigger attention, Google will deal with it? Can anybody push this towards Matt Cutts, or anybody else who is responsible for these things?
White Hat / Black Hat SEO | Sved
-
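For background on the scale trick in step 4: schema.org lets a site declare its own scale via bestRating, so the same underlying score can be rendered as 8.6/10 on one domain and 4.3/5 on another. A minimal sketch with illustrative values (not data pulled from the sites in question):

```python
import json

def aggregate_rating(value: float, best: int, count: int) -> str:
    # schema.org AggregateRating JSON-LD, with the scale declared
    # explicitly via bestRating.
    data = {
        "@context": "https://schema.org",
        "@type": "AggregateRating",
        "ratingValue": round(value, 2),
        "bestRating": best,
        "reviewCount": count,
    }
    return json.dumps(data, indent=2)

score = 8.6  # illustrative score echoing the example above
print(aggregate_rating(score, 10, 328))     # shown as 8.6 / 10 on one site
print(aggregate_rating(score / 2, 5, 323))  # shown as 4.3 / 5 on another
```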
How do I report a set of duplicated websites designed to manipulate the SERPs?
OK, so within one of my client's sectors, it has become clear that someone is trying to manipulate the SERPs by registering tons of keyword-targeted domains. All of the websites are simply duplications of one another and are merely set up to dominate the SERP listings - which, at the moment, they are beginning to do. None of the sites have any real authority (in some cases a PA and DA of 1), and yet they're ranking above much more established websites. The only backlinks they have are dodgy-looking forum ones. It's all a bit crazy, and it shouldn't be happening. Anyway, all of the domains have been registered by the same person, within a two-month period of each other. What do you think is the best way to report these websites to Google?
White Hat / Black Hat SEO | Webrevolve
-
Duplicate Articles
We submit articles to a magazine, where they get posted either as text or in a Flash container. Management would like to post them to our site as well. I'm sure this has been asked a million times, but is this a bad thing to do? Do I need to add a rel=canonical tag to the articles? Most of the articles posted to the magazine's site do not contain a link back to our site.
White Hat / Black Hat SEO | SirSud
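A canonical tag on your own copies, pointing at whichever version you want to rank, is the usual answer here. A rough sketch (hypothetical URL, Python requests library) for checking which canonical a page currently declares; the regex only catches the common attribute ordering:

```python
import re
import requests

# Hypothetical article URL on your own site.
html = requests.get("https://example.com/articles/some-article", timeout=10).text

match = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
    html,
    re.IGNORECASE,
)
print(match.group(1) if match else "no canonical tag found")
```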