Magento Help - Server Reset
-
Good Morning,
After rebooting a server, a Magento-based website reset itself, reverting to its December 2013 state. All changes to the site, and all orders placed up until yesterday (6/19/14), have disappeared.
There are several folders in the root of the server containing files dated yesterday, but we don't know how to use them to bring everything back and restore the site.
Have any Magento or server experts out there faced this issue before, or have any ideas or potential solutions?
Thanks
-
Thanks, Prestashop.
I'll look into that and let you know. Appreciate the advice.
-
In a situation like this, my first guess would be that someone changed the document root in Apache but did not restart the server. I'm not a Magento person, so you might want to have what I'm saying checked out, because I don't know where things are located. What I would do is look at the databases on the server and see if one of them holds the most recent customer information. Then look at the settings file Magento uses to specify which database the application connects to, and check whether the two match. If they don't, look around the server for another directory that holds the most recent version of the site.
If someone changed the document root without restarting Apache, the change would not take effect until the server was rebooted, which would swap out the whole configuration of the site in one go. One place to have someone look, to be sure, is the Apache logs: the error log records the complete filesystem path of the resources the server tried to serve, so check whether those paths changed at the time of the reboot.
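A minimal sketch of the checks described above. Magento 1.x keeps its database connection settings in app/etc/local.xml; the heredoc below is a hypothetical sample of that file's connection block so the extraction can be tried anywhere — point the `sed` at your real file (typically /var/www/html/app/etc/local.xml), and treat all paths as assumptions about your layout:

```shell
# First, see where Apache's document root currently points
# (Debian/Ubuntu uses /etc/apache2, RHEL/CentOS uses /etc/httpd):
#   grep -Rni "DocumentRoot" /etc/apache2/ /etc/httpd/ 2>/dev/null

# Hypothetical sample of the connection block in Magento 1.x's
# app/etc/local.xml -- replace /tmp/local.xml with your real file.
cat > /tmp/local.xml <<'EOF'
<connection>
  <host><![CDATA[localhost]]></host>
  <username><![CDATA[magento_user]]></username>
  <dbname><![CDATA[magento_db]]></dbname>
</connection>
EOF

# Pull out the configured database name
dbname=$(sed -n 's/.*<dbname><!\[CDATA\[\(.*\)\]\]><\/dbname>.*/\1/p' /tmp/local.xml)
echo "Magento is configured to use database: $dbname"

# Then compare against what is actually on the MySQL server, e.g.:
#   mysql -u root -p -e "SHOW DATABASES;"
#   mysql -u root -p "$dbname" -e "SELECT MAX(created_at) FROM sales_flat_order;"
```

If the most recent order dates live in a database other than the one local.xml names, that mismatch is your smoking gun.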
-
Yikes! Not much help to you now, but you should really do at least weekly backups of your files (via FTP) and nightly backups of your database. And definitely always take a backup before you do anything major (like a reboot).
You should also go to your host; they may well have a backup of your server. Fingers crossed.
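For reference once things are recovered, a routine along these lines covers both pieces. This is a hedged sketch, not a drop-in script: every database name, user, and path below is a placeholder to substitute with your own.

```shell
# Hypothetical backup sketch -- all names and paths are placeholders.
BACKUP_DIR=/var/backups/magento
STAMP=$(date +%F)   # e.g. 2014-06-20
mkdir -p "$BACKUP_DIR"

# Nightly: database dump (--single-transaction keeps InnoDB tables consistent)
mysqldump --single-transaction -u magento_user -p magento_db \
  | gzip > "$BACKUP_DIR/db-$STAMP.sql.gz"

# Weekly: file archive (media uploads, themes, app/etc/local.xml, .htaccess)
tar -czf "$BACKUP_DIR/files-$STAMP.tar.gz" /var/www/html

# Cron entries (add via 'crontab -e'):
# 30 2 * * * /usr/local/bin/magento-db-backup.sh     # nightly at 2:30
# 0  3 * * 0 /usr/local/bin/magento-file-backup.sh   # weekly on Sunday
```

Copying the resulting archives off the server (or to the host's backup space) matters as much as making them.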
-
We have no backup; that's the issue. The last backup is from 2013.
-
I agree with Paddy. Did you take a backup before the changes? If so, getting your site back to how it was will be very easy, especially if you are using cPanel. When you do your server reboots, make sure you use the "graceful" method.
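For what it's worth: if the box runs WHM/cPanel, "graceful" is an option on its reboot screen; on a bare server, the closest analogue at the web-server level is Apache's graceful restart, which lets in-flight requests finish before workers reload. A sketch of the commands (exact service names vary by distro):

```shell
# Graceful: finish serving current requests, then reload configuration
apachectl -k graceful
# or, on systemd-based distros:
#   systemctl reload apache2    # Debian/Ubuntu
#   systemctl reload httpd      # RHEL/CentOS

# Contrast with a hard restart, which drops active connections:
#   apachectl -k restart
```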
Here is a video explaining this:
https://www.youtube.com/watch?v=PddZ3FwHFZw
-
This isn't really the right place to ask about this; you would get a better response on the Magento forums, or you could look on something like Freelancer or PPH and find a Magento expert with a good review history.
Do you have a backup of your database? If so, you should be able to restore it; if not, check whether your hosting company has a backup.
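If a dump does turn up, restoring it is short work. A sketch, with the filename, credentials, and paths all being placeholders:

```shell
# Restore a gzipped MySQL dump into the Magento database
# (placeholder names throughout -- adjust to your setup).
gunzip < db-backup-2014-06-19.sql.gz | mysql -u magento_user -p magento_db

# For a plain .sql file:
#   mysql -u magento_user -p magento_db < db-backup-2014-06-19.sql

# Clear Magento's cache afterwards so it doesn't serve stale data:
rm -rf /var/www/html/var/cache/*
```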