Moving Content To Another Website With No Redirect?
-
I've got a website that has lots of valuable content and tools, but it's been hit too hard by both Panda and Penguin. I've come to the conclusion that I'd be better off with a new website, as this one is going to hell no matter how much time and money I put into it. Had I started a new website the first time it got hit by Penguin, I'd be profitable today.
I'd like to move some of that content to this other domain but I don't want to do 301 redirects as I don't want to pass bad link juice. I know I'll lose all links and visitors to the original website but I don't care.
My only concern is duplicate content. I was thinking of setting the pages to noindex on the original website and waiting until they disappear from Google's index. Then I'd move them over to the new domain to be indexed again.
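(For anyone following along, a noindex is usually applied with a robots meta tag in each page's `<head>` - a minimal sketch:)

```html
<!-- On each old page you want dropped from the index -->
<!-- "follow" keeps the page's links crawlable while the page itself is removed -->
<meta name="robots" content="noindex, follow">
```

One gotcha: keep these URLs crawlable (i.e., don't block them in robots.txt yet) until Google has re-crawled the pages and seen the tag, otherwise the noindex will never be processed.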
Do you see any problem with this? Should I rewrite everything instead? I hate spinning content...!
-
If we're understanding the situation correctly, I'd say this sums it up pretty well.
-
It sounds to me as though most of the content from the old site is staying, but that 3 enigmatic 'tools' are being moved to a new domain.
In which case, I would want to be sure that the functionality being moved wasn't the cause of the previously lifted penalty, especially from a Panda perspective - a penalty would be re-applied if the tools are not Panda-friendly. (Given that the tools on the new domain presumably won't have any links pointing to them, Penguin shouldn't be an issue.)
So:
- If you want to have the tools on both sites, I'm with Pete - noindex the tools on the old site.
- If you are permanently moving the tools, review them for Panda-friendliness and then noindex the old site's URLs; it's probably worth blocking the old URLs in robots.txt as well.
- If your previous penalty was nothing to do with the tools at all, and the link profile of those pages is good (or there aren't any links), then 301 the old URLs to the new ones.
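If you do go the selective 301 route, the redirects can be done per-URL in .htaccess on Apache - a sketch, with placeholder paths and domain:

```apache
# Selectively 301 only the moved tool URLs (paths and domain are placeholders)
RewriteEngine On
RewriteRule ^tools/tool-one/?$ http://www.newdomain.com/tools/tool-one [R=301,L]
RewriteRule ^tools/tool-two/?$ http://www.newdomain.com/tools/tool-two [R=301,L]
```

Everything else on the old site stays untouched, so only the link equity (good or bad) of those specific pages is passed.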
That's if between Pete and myself we've understood correctly what you're trying to achieve.
Good Luck!
-
So, I'm confused - are you looking to keep both sites active? If you're just moving the tools to a new domain, you could NOINDEX the old pages. If the link-based penalty isn't too severe, you might try a cross-domain rel=canonical on the old site. Unfortunately, without understanding the penalty profile, it's a bit tricky to advise. It's really a cost/benefit trade-off - how much risk of carrying the penalty are you willing to accept vs. the alternative of cutting off all authority and starting over on the new site.
If you've had Panda-related problems, though, I wouldn't keep the tools crawlable on both sites. That seems more likely to prolong your problems than it is to solve them.
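For reference, a cross-domain rel=canonical is just the standard canonical tag on the old page pointing at the new domain - a sketch with a placeholder URL:

```html
<!-- In the <head> of the old site's tool page -->
<link rel="canonical" href="http://www.newdomain.com/tools/tool-one">
```

Worth remembering that it's a hint rather than a directive, so Google may ignore it if the two pages aren't near-duplicates.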
-
In fact, I am not moving any content from the old website to the new one. It's just the 3 online tools that I wanted to keep for the new website. The two sites have different content, but the functionality of the tools is the same. I've noindexed the tools on the old website.
By the way, the manual penalty on the old website was revoked a few weeks ago.
-
I tend to agree with Martin - it seems like there's probably a way to preserve some of the power of the old site and 301-redirect selectively (or potentially use cross-domain rel=canonical tags), but it would take a much deeper understanding of the site than Q&A allows.
If you rebuild the site from scratch, you'd almost always want to de-index the old site. I'd flat out remove it via Google Webmaster Tools - it's the fastest method. Leaving both sites crawlable is only going to compound your problems and haunt the new site.
I'd warn, though, that if this is Panda-related, just moving the content won't solve your problems. You have to sort out why they happened in the first place, or the same algorithmic issues will just come back. In other words, if the problems are content-related, then it doesn't really matter where the content lives. If the problems are link-related, then moving will remove them. Of course, moving will also remove any advantages you currently have based on good links.
Unfortunately, this isn't a problem that can be addressed without a pretty deep audit. My gut feeling is that there may be a way to preserve some of the authority of the old site, but you really need to pin down the problems. "Panda + Penguin" covers a wide swath of potential problems, and that just isn't enough information to do this right.
-
Some of this "content" is in fact online tools and the tutorials that accompany them.
-
Hi Stephane,
All the below assumes you feel there is some value in keeping the original website live at all.
My first reaction would be to do a full review of all your old content and carefully consider which pieces may have been hit by Panda - is there keyword stuffing, content duplicated from other sites, thin content, etc.? Then either fix or completely rewrite those.
After that, you should avoid publishing duplicated content, so my view would be:
1. Remove the rewritten/fixed articles completely from the old site
2. Don't implement the 301 so you don't get any redirected bad Penguin vibe
3. Put a block on those URLs using robots.txt
4. Remove the URLs from Google's index in Webmaster Tools.
Then you are free to publish your new, Panda-friendly content to your new website.
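A minimal robots.txt block for step 3, assuming (as a placeholder) the old articles live under an /articles/ directory:

```text
# robots.txt on the old site - stop crawling of the removed articles
User-agent: *
Disallow: /articles/
```

One caveat: a robots.txt block stops crawling, not indexing, so combine it with the URL removal in Webmaster Tools (step 4) to actually get the pages out of the index.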
Not sure what other mozzers would say, but that's my view. This is not about 'spinning content' but removing poor content and republishing great content. Hope it makes sense.