Site down for more than a month - lost rankings
-
Hello,
We have run into a situtation where we had multiple pages setup for different keywords but didn't realize that we had a name server issue that has caused the pages to be down for the last month or so (2-3 weeks on the low side.) The rank finder was still working fine, but the offline page was never reported. We realized the situation recently and have since gotten the sites back online under the new nameservers.
Most of these sites were ranking in the #1 and #2 spots for their keywords, and now they are nowhere to be found in the Google index. Should I do anything differently, or just put the sites back online and wait it out? I have read in different places that it may only take two weeks to come back, but it's also possible that Google has marked the sites as low quality because of the downtime, which would make it even harder to get them to rank again.
Can anyone shed any light on this situation? Any information is appreciated.
Thanks in advance.
-
Submit to The GOLDEN " G " and be patient...
-
I would submit a sitemap to Google to get those pages re-indexed, and maybe build a few links to them; you should be more than fine!
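To expand on the sitemap suggestion: a minimal XML sitemap is just a list of your page URLs, which you can then submit through Google Webmaster Tools. A sketch below (example.com and the paths are placeholders; list your actual pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

Upload it to your site root as sitemap.xml and submit the URL in Webmaster Tools so Google recrawls the pages sooner.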
-
Waiting for Google to re-index those pages should be just fine.
Related Questions
-
Website Redesign and Migration to Squarespace killed my Ranking
My old website was dated, ugly, impossible to update, and a mess split between hard-coded pages and WP, but we were ranking #1 in organic search for our keywords. I just redesigned the website using Squarespace. I kept most of the same text on the pages (for keywords) and kept the same meta tags and title tags for each page as much as possible. Once I was satisfied that I had done as much on-page optimization as I could, I changed the IP in our domain name registry so that it pointed to our new website on the Squarespace host. And our new website was live! ...Then I watched in dismay as our ranking fell into oblivion. I think this might have something to do with not doing any 301 redirects from the old website and losing all of my link juice. Is this the case? And if so, how do I fix it? Our website URL is www.kanataskinclinic.ca. Thanks
Web Design | | StillLearning1 -
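On the 301 question above: Squarespace does support 301 redirects through its URL Mappings panel (under Settings → Advanced → URL Mappings), so old URLs from the previous site can be mapped to their new equivalents without server access. A sketch of the mapping format, with hypothetical paths:

```
/old-treatments-page.html -> /treatments 301
/about-us.html -> /about 301
```

Each line maps an old path to its new path with the redirect type; one line per retired URL should recover most of the link equity pointing at the old pages.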
Can a cloud based firewall affect my search ranking?
Hi, I recently implemented a firewall on my website to prevent hacking attacks. We were getting a crazy number of people per day trying to brute-force our website. I used the Sucuri cloud proxy firewall service, which they claim actually helps SEO because of its super-fast caching. I was just wondering: is this true? Because we're slowly falling further and further down the SERPs, and I really don't know why. If not, is there any major Google update recently I don't know about? Thanks, Robert
Web Design | | BearPaw880 -
Best Approach to Rank For Multiple Locations With Similar Targeted Keywords
I'm trying to determine the best way to set up a website to rank for a similar set of keyword phrases in three different cities. The keyword phrases I want to rank for are all pretty much the same, with the only difference being the city associated with each phrase. For example, "Austin water restoration" vs. "San Antonio water restoration" vs. "Houston water restoration". Each city needs about 7 or 8 pages of unique content to accurately target the group of keywords I'm trying to rank for.
My initial thought was to write unique content for each city and have each city act as a site within the main site. For example, the main navigation for xyz.com/austin would be Austin-specific, so when you land on xyz.com/austin and go to Services > Water Restoration, it would be all Austin-specific content. The same would be true for San Antonio and Houston. The only problem with this approach is that I have to build up the page authority for a lot of different pages.
It would be much easier to build up the page authority for one Water Restoration page and just insert a little "Areas we serve" section on the page that includes "Austin, San Antonio, and Houston", and maybe work the coverage area in again at the bottom of the page somewhere. However, it would be much more difficult to work "Austin, San Antonio, and Houston" into the title tags and H1s, and I couldn't logically work the cities into the content as much either. That would be a downside to this approach.
Any thoughts on this? I also wonder how large companies with hundreds of locations typically approach this. I'd really appreciate your input.
Web Design | | shaycw0 -
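For what it's worth, multi-location sites often pair city landing pages like xyz.com/austin with LocalBusiness structured data so each page's service area is explicit to search engines. A minimal sketch (the business name and URL are hypothetical placeholders drawn from the question):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "XYZ Water Restoration - Austin",
  "url": "https://xyz.com/austin/",
  "areaServed": {
    "@type": "City",
    "name": "Austin"
  }
}
```

The snippet would be embedded on the Austin page inside a `<script type="application/ld+json">` tag, with a parallel version on each city's page.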
Managing international sites
Hi all, I am trying to figure out the best way to manage our international sites. We have two locations: one in the UK and one in the USA. I currently use GeoIP to identify the location of the browser and redirect visitors, using a cookie, to index.php?country=uk or index.php?country=usa. Once the cookie is set I use a 301 redirect to send them to index.php, so that Google doesn't see each URL as duplicate content, which Webmaster Tools was complaining about. This has been working wonderfully for about a year. It means I have a single PHP language include file, and depending on the browser location I will display $ or £, change the odd -ise to -ize, etc.
The problem I am starting to notice is that we are ranking better and better in the USA search results. I am guessing this is because the crawlers must be based in the USA. This is great, but my concern is that I am losing rank in the UK, which is currently where most of our business comes from.
So I have done my research, and because I have a .net I will go for a /uk/ or /us/ subfolder, create two separate Webmaster Tools sites, and set them up to target each geographic location. Is this okay? http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#2
HERE IS THE PROBLEM: I don't want to have to run two separate websites with two separate sets of copy. Also, I don't want to lose all the rank data on URLs like http://www.mysite.net/great-rank-result.html, which now becomes http://www.mysite.net/uk/great-rank-result.html. On top of this I will have two pages, the one just mentioned plus http://www.mysite.net/us/great-rank-result.html, which I presume would be seen as duplicate copy? (Y/n) Can I use rel canonical to overcome this? How can I do this without actually running two separate pages?
Could you actually have one site in the root folder and just use the same GeoIP technology to do a smart mod_rewrite, adding either /uk/ or /us/ to the URL, and therefore be able to create two Webmaster Tools accounts targeting each geographic location? Any advice is most welcome.
Web Design | | Mediatomcat0 -
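A note on the duplicate-content worry in the question above: rel=canonical would consolidate the /uk/ and /us/ versions into one, which defeats the geotargeting. The usual alternative is rel="alternate" hreflang annotations, which tell Google the two URLs are regional variants of the same page rather than duplicates. A sketch using the URLs from the question (each page lists both variants):

```html
<!-- Placed in the <head> of both the /uk/ and /us/ versions of the page -->
<link rel="alternate" hreflang="en-gb" href="http://www.mysite.net/uk/great-rank-result.html" />
<link rel="alternate" hreflang="en-us" href="http://www.mysite.net/us/great-rank-result.html" />
```

Combined with the geographic targeting settings in Webmaster Tools, this lets each variant rank in its own country's results.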
Empirical Data on the effect of embedded Google Maps on Search Ranking
Does anyone have any empirical data on the effect of an embedded map on SERPs? Please understand that I already have anecdotal info and a personal opinion. I am looking for data. Thanks
Web Design | | RobertFisher0 -
Aged .com domain or brand new .co.uk for UK site?
Should I buy a two-year-old .com domain or a brand new .co.uk domain for a site I am making for the UK (google.co.uk optimisation)? I am struggling to find good aged .co.uk domains; there are loads of nice .coms that are old. Any thoughts? Thanks
Web Design | | SamBuck0 -
Optimizing a new ecommerce site, need help with URLs
Hi, We are putting up a new ecommerce website, and for product pages our tech team says the URLs must contain the SKU numbers. Which of the following URL structures do you find most SEO-friendly? 1. http://www.Site.com/SKUNumber/ProductDescription/ or 2. http://www.Site.com/ProductDescription/SKUNumber/ My personal opinion is that the most relevant content should come first in the URL, so I like option 1. Thanks
Web Design | | CookingCom0 -
Old SEO keyword "articles", are they hurting rankings?
Hello, About two years ago, the company I work for hired an SEO firm to improve organic rankings on our site. The SEO company's primary method for doing this was producing "articles" that are not really articles but keyword-stuffed pages with lots of hidden internal links to other legitimate pages on our site. Examples: http://www.creamright.com/Isi-Chargers-articles.html http://www.creamright.com/How-To-Make-Whipped-Cream-article.html http://www.creamright.com/Cream-Whipper-articles.html Obviously, this strategy wasn't greatly successful, and we cancelled our work with the firm. However, we still have all of the "articles" on the site (about 50-60 pages total), and each page is navigable from the HTML and XML sitemaps. Additionally, the SEO firm we used built a lot of useless links to these pages from BS directory sites, all of which are still active. The question I have is whether we should remove these "article" pages or leave them alone. Although I'm sure they aren't helping any of our SEO efforts, could deleting the pages after two years negatively impact our search rankings? Thanks in advance for any help on this, Doug M.
Web Design | | Loganshark1
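If those article pages do get removed, one common approach is to 301 each retired URL to the most relevant live page so any remaining link equity is preserved. A quick sketch for generating Apache `Redirect 301` rules from a mapping (the old-to-new mapping below is hypothetical; adjust it to the actual site):

```python
# Generate Apache .htaccess "Redirect 301" rules for retired article pages.
# The mapping of old paths to replacement pages is hypothetical.

def make_redirect_rules(mapping):
    """Return .htaccess lines that 301 each old path to its replacement."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in mapping.items()
    )

rules = make_redirect_rules({
    "/Isi-Chargers-articles.html": "/isi-chargers/",
    "/Cream-Whipper-articles.html": "/cream-whippers/",
})
print(rules)
```

The printed lines can be pasted into the site's .htaccess; pages with no sensible destination can instead be left to return 404/410 so they simply drop out of the index.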