Bringing a large news site back online - anything to look out for?
-
Hi,
I'm advising an online news site that has been completely offline for almost six months and is now looking to start back up again. The site seems to be completely gone from Google's cache.
This might mean moving to new hosting, but keeping the same domain and URLs.
The archive has about 7000 original articles. Most of these are date-specific news, although there are some longer investigative pieces that are more timeless.
Is there any difference (from an SEO/digital marketing perspective) between putting the whole archive online at once and gradually republishing the old articles?
Is there anything I should be aware of when restarting a website of this size?
Thanks - Chris
-
Hi Chris,
First, I'd look at the old site in the Wayback Machine, make sure the relaunch follows the same URL structure, and look for any oddities.
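One way to pull the old URL inventory for that comparison is the Wayback Machine's public CDX API. A minimal sketch (the domain and the canned sample response below are made up for illustration; a real run would fetch the query URL over HTTP):

```python
import json
from urllib.parse import urlencode

def cdx_query_url(domain):
    """Build a Wayback Machine CDX API query listing every captured URL
    for a domain, one row per unique URL."""
    params = {
        "url": f"{domain}/*",
        "output": "json",
        "fl": "original,statuscode",
        "collapse": "urlkey",  # de-duplicate repeated captures of one URL
    }
    return "http://web.archive.org/cdx/search/cdx?" + urlencode(params)

def archived_urls(cdx_json_text):
    """Extract the original URLs from a CDX JSON response.
    The response is a list of rows; the first row is the header."""
    rows = json.loads(cdx_json_text)
    header, data = rows[0], rows[1:]
    idx = header.index("original")
    return [row[idx] for row in data]

# Canned response standing in for the live API (no network call here):
sample = json.dumps([
    ["original", "statuscode"],
    ["http://example.com/2013/01/story-one", "200"],
    ["http://example.com/2013/02/story-two", "200"],
])
print(archived_urls(sample))
```

Diffing that list against the relaunched site's URLs is a quick way to catch structure drift before Google recrawls.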
Does the site have an old Google Webmaster Tools account? You might want to poke around in there and make sure there aren't any old URL removal requests or anything else that could potentially cause problems with the relaunch. If you don't have a GWT account, be sure to set one up.
Quite honestly, if these articles existed before, I don't see any advantage to releasing them slowly versus all at once. In fact, I'd prefer all at once, especially if the articles link to one another.
Finally, I'd make sure there were no penalty issues on the old site, and address them if there were. You don't want to bring any of those back to life in the process.
Related Questions
-
HTML Site for Speed
I have a few small sites and landing pages on WordPress that I want to load a lot quicker than they do. It occurred to me that if there is not a lot of content management necessary, I should simply build the pages as static HTML instead of attempting all the modifications needed to make some WordPress sites and themes load quicker. I have noticed that the HTML sites I have load lightning fast even on a slow hosting service. Is this a good idea, and can anyone think of drawbacks? Security? Responsiveness? SEO? And what about making a company's main pages static HTML so the home page loads quickly, then using WordPress just for the blog?
Technical SEO | phogan0
Dev Site Was Indexed By Google
Two of our dev sites (subdomains) were indexed by Google. They have since been made private now that we've found the problem. Should we take another step to remove the subdomains, or just let it ride out? From what I understand, to remove a subdomain from Google we would verify the subdomain in GWT, give the subdomain its own robots.txt, and disallow everything. Any advice is welcome; I just wanted to discuss this before making a decision.
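For reference, the robots.txt part of the approach described above is tiny: a file served at the root of the dev subdomain (hostname below is hypothetical) that disallows all crawling:

```
# Served at https://dev.example.com/robots.txt
User-agent: *
Disallow: /
```

Worth noting: Disallow stops future crawling but does not by itself remove URLs that are already indexed; the URL removal request in GWT, as mentioned, handles that part.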
Technical SEO | ntsupply0
Moved a site and changed URL structures: Looking for help with pay
Hi Gents and Ladies. Before I get started, here is the website in question: www.moldinspectiontesting.ca. I apologize in advance if I miss any important or necessary details; this might read like several disjointed thoughts. It is very late where I am and I am very exhausted. Now, on to this monster of a post.

**The background story:** My programmer and I recently moved the website from a standalone CMS to WordPress. The owners of the site/company were having major issues with their old SEO/designer at the time. They felt very abused and taken by this person (which I agree they were, financially, emotionally and more) and wanted to wash their hands of the old SEO/designer completely. They sought someone out to do a minor redesign (the old site did look very dated) and transfer all of their copy as affordably as possible. We took the job on. I have my own strengths with SEO, but on this one I am a little out of my element. Read on to find out why.

**Here are some of the issues, what we did, and a little more history:** The old site had a terribly unclean URL structure, as most of it was machine-written. The owners would make changes in one central location and the old CMS would then generate hundreds of service-area pages with long, parameter-heavy URLs (along with duplicate content). We could not duplicate this URL structure during the transfer, so we went with a simple, clean structure. Here is an example of how we modified the URLs...

Old: http://www.moldinspectiontesting.ca/service_area/index.cfm?for=Greater Toronto Area
New: http://www.moldinspectiontesting.ca/toronto

My programmer wrote 301 redirects and URL rewrites (.htaccess) for all their service-area pages (which tally in the hundreds). As I hinted above, the site also suffers from an overwhelming amount of duplicate copy, which we are slowly modifying so that it becomes unique, and from a tremendous amount of keyword cannibalization. This is also a result of the old SEO's work, which we had to transfer without fixing first (a hosting renewal deadline with the old SEO/designer forced us to get the site up and running in a very short window). We are working on both of these issues now.

SERPs have been swinging violently since the transfer, and understandably so; changes have cause and effect. I am a bit perplexed, though. Pages are indexed one day and ranking very well locally, then apparently de-indexed the next. It might be worth noting that the site had some de-indexing problems in the months prior to meeting us; I suspect this was due in large part to the duplicate copy. The ranking pages (on a URL basis) are also changing up: we will see a clean URL rank and then drop off one week, and then an unclean version rank and drop off the next (for the same city and the same search). Sometimes they rank alongside each other. The terms they want to rank for are very easy to rank on because they are so geographically targeted; the competition is slim in many cases. This time last year, they were having one of the best years in the company's 20+ year history (prior to being de-indexed).

**On to the questions:** What should we do to reduce the loss in these ranked pages? With the actions we took, can I expect the old unclean URLs to drop off over time and the clean URLs to pick up the ranks? Where would you start in helping this site? Is there anything obvious we have missed? I planned on starting with new keyword research to diversify what they rank on, and following that up with fresh copy across the board. If you are well versed in this type of problem (URL changes, index/de-index status, analyzing these things, etc.), I would love to pick your brain or even bring you on board to work with us (paid).
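For what it's worth, the redirect mapping described above can be expressed in .htaccess roughly like this (a sketch only; the exact parameter values and their encodings on the old site would need checking):

```apache
RewriteEngine On

# Old: /service_area/index.cfm?for=Greater Toronto Area
# New: /toronto
# The space in the old query string may arrive as %20 or +, so match both.
RewriteCond %{QUERY_STRING} ^for=Greater(%20|\+)Toronto(%20|\+)Area$ [NC]
RewriteRule ^service_area/index\.cfm$ /toronto? [R=301,L]
```

The trailing `?` on the target drops the old query string from the redirect, so the clean URL is all Google sees.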
Technical SEO | mattylac0
For large sites, best practices for pages hidden behind internal search?
If a website has 1M+ pages, with most of them hidden behind an internal search, what's the best way to get those pages included in an engine's index? Does a direct click path to those pages need to exist from the homepage or other major hub pages on the site? Is submitting an XML sitemap enough?
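On the sitemap part of the question: the sitemap protocol caps each file at 50,000 URLs, so a site of this size needs a sitemap index pointing at multiple child sitemaps. A sketch (domain and file names are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemaps/pages-0001.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemaps/pages-0002.xml</loc></sitemap>
  <!-- ...one child sitemap per 50,000 URLs... -->
</sitemapindex>
```

A sitemap helps discovery, but pages with no internal click path tend to be crawled less and rank poorly, so hub or category pages linking down into the deep content are still worth building.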
Technical SEO | vlevit0
Site Disappeared from Search
A friend of mine has a site (http://bit.ly/q4iWkM) that was ranking number one for its keyword (Drimnagh) and has now completely disappeared from the rankings. I did some checking and can't see a problem. She does have duplicate meta descriptions and titles throughout, but that shouldn't be a punishable offence as far as I know, and it's something I am going to correct with a quick plugin install. I couldn't see any redirects or code blocking search either, and when you do site:URL the site shows up OK. She is a client of mine (for the website, not for SEO) and she is really upset about it, so any help from the forum would be appreciated. This isn't even a site I built, but you couldn't get a better person to work with, so I am eager to help where and if possible. Guinness all round if someone solves it, next time you are in Ireland.
Technical SEO | kdaly1000
Google not visiting my site
Hi, my site www.in2town.co.uk, which is a lifestyle magazine, has undergone a major refit. I am still working on it, but it should be ready by the end of this week or sooner. One problem I have is that Google is not visiting the site. I took a huge gamble redoing the site: even though before the refit I was getting a few thousand visitors a day, I wanted to make the site better, as I was getting Google Webmaster Tools errors. But now it seems Google is not visiting the site. For example, I am using sh404SEF and I have put friendly URLs on the site, and the home page has its name and meta tags, but Google is not showing a name for the site. It also has not visited the site since October 13th. Can anyone advise how to encourage Google to visit the site, please?
Technical SEO | ClaireH-1848860
Site Architecture Trade Off
Hi All. I'm looking for some feedback regarding a site architecture issue I'm having with a client. They are about to go through a redesign, and as part of it we're restructuring the site URLs and amending/adding pages. At the moment they rank well off the back of original PPC landing pages that were added to the site, such as www.company.com/service1, www.company.com/service2, etc. The developer, from a development point of view, wishes to create a logical site architecture with multiple levels of directories. I've suggested this probably isn't the best way to go, especially as the site isn't that large (200-300 pages); the key pages we're looking to rank should be as high up the architecture as we can make them, and this change could hurt their current high rankings. It looks like the trade-off may be that the client is willing to let some pages be restructured, so, for example, www.company.com/category/sub-category/service would become www.company.com/service. However, although this might be a solution on a per-page basis, is there a drawback to having it in place for only a few pages rather than sitewide? I'm just wondering if these pages might stick out like a sore thumb to Google.
Technical SEO | PerchDigital1
UK and USA site versions
We have a UK site selling our product, and we are due to appoint a reseller in the USA. They require a .com domain, which makes sense, and they would also like to see American spellings and currency. We also feature heavily in pubs, and they want these referred to as "bars", so there are a few tweaks here and there, but mainly just slight variations in spelling and terminology. These are only minor adjustments to our current site; what is the best way of achieving this without falling foul of duplicate content issues?
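One common mechanism for exactly this situation is hreflang annotations, which tell Google that the two pages are regional variants of each other rather than duplicates. A sketch (domains and paths are illustrative):

```html
<!-- In the <head> of both the UK and US versions of a page -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/product/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/product/" />
```

Each page lists every variant including itself, and the annotations must be reciprocal across both sites for Google to honour them.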
Technical SEO | IPIM0