How do I fix a fatal error message?
-
I am trying to remove a robots.txt file I put in my root domain a while back, because I didn't know what I was doing at the time. Every time I request the file (domain.com/robots.txt) I get a fatal error message. How do I fix this fatal error?
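For context, a static robots.txt is just a plain text file served from the site root; if requesting it triggers a fatal error, the file is most likely being generated dynamically by a CMS plugin or rewrite rule rather than served from disk. As a harmless sketch, a minimal static robots.txt that allows all crawling looks like this:

```text
# Minimal robots.txt - an empty Disallow permits everything
User-agent: *
Disallow:
```

Replacing a broken dynamic handler with a plain file like this (or removing the rewrite that intercepts /robots.txt) is the usual way such errors get resolved.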
-
Wish we were able to give instant 24/7 support, but alas we're just volunteers ¯\_(ツ)_/¯
If you have time, would you mind explaining how this issue was resolved?
-
Never mind. The Moz community was so slow helping out that I ended up calling my hosting provider, and they helped me fix the problem. LOL SMH.
Related Questions
-
Sitemap error in Webmaster tools - 409 error (conflict)
Hey guys, I'm getting this weird error when I submit my sitemap to Google. It says I'm getting a 409 error (conflict) on my post-sitemap.xml file (https://cleargear.com/post-sitemap.xml), but when I check it, it looks totally fine. I am using Yoast SEO to generate the sitemap.xml file. Has anyone else experienced this? Is this a big deal? If so, does anyone know how to fix it? Thanks
Technical SEO | Extima-Christian
-
"Moz encountered an error on one or more pages on your site" Error
I have been receiving this error for a while: "Moz encountered an error on one or more pages on your site." It's a multilingual WordPress website, the robots.txt is set to allow crawlers on all links, and I have followed the same process as for other websites I've done, yet I'm receiving this error for this site.
Technical SEO | JustinZimri
-
Webmaster Crawl errors caused by Joomla menu structure.
Webmaster Tools is reporting crawl errors for pages that do not exist, due to how my Joomla menu system works. For example, I have a menu item named "Service Area" that holds 3 sub-items but has no actual page of its own. This results in URLs like domainDOTcom/service-area/service-page.html. Because the Service Area menu item is constructed in a way that looks like a link to the bot, I am getting a 404 error saying it can't find domainDOTcom/service-area/ (the link is to "javascript:;"). Note, the error doesn't say domainDOTcom/service-area/javascript:;, it just says /service-area/. What is the best way to handle this? Can I do something in robots.txt to tell the bot that /service-area/ should be ignored, but any page after /service-area/ is good to go? Should I just mark them as fixed, since it's really not a 404 a human will encounter, or is it best to somehow explain this to the bot? I was advised on the Google forums to try this, but I'm nervous about it:
Disallow: /service-area/*
Allow: /service-area/summerlin-pool-service
Allow: /service-area/north-las-vegas
Allow: /service-area/centennial-hills-pool-service
I tried a 301 redirect of /service-area to the home page, but then it pulls that out of the URL and my landing pages become 404s. http://www.lvpoolcleaners.com/ Thanks for any advice! Derrick
Technical SEO | dwallner
-
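As a hedged alternative to the Disallow/Allow list quoted above (assuming the goal is to block only the bare /service-area/ URL, which has no page behind it): major crawlers such as Googlebot support the $ end-of-URL anchor in robots.txt, so the phantom index URL can be excluded without listing every real page beneath it:

```text
User-agent: *
# $ anchors the match to the end of the URL, so only the bare
# /service-area/ is blocked; /service-area/north-las-vegas etc.
# remain crawlable without needing individual Allow lines
Disallow: /service-area/$
```

Note that $ is a widely supported extension rather than part of the original robots.txt standard, so crawlers other than the major search engines may ignore it.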
Salvaging links from WMT “Crawl Errors” list?
When someone links to your website but makes a typo while doing it, those broken inbound links show up in Google Webmaster Tools' Crawl Errors section as "Not Found". Often they are easy to salvage by just adding a 301 redirect in the .htaccess file. But sometimes the typo is really weird, or the link source looks a little scary, and that's what I need your help with.
First, the weird typo problem. If it is something easy, like they just lost the last part of the URL (such as www.mydomain.com/pagenam), then I fix it in .htaccess this way:
RewriteCond %{HTTP_HOST} ^mydomain.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.mydomain.com$
RewriteRule ^pagenam$ "http://www.mydomain.com/pagename.html" [R=301,L]
But what about when the last part of the URL is really screwed up, especially with non-text characters, like these?
www.mydomain.com/pagename1.htmlsale
www.mydomain.com/pagename2.htmlhttp://
www.mydomain.com/pagename3.html"
www.mydomain.com/pagename4.html/
How is the .htaccess RewriteRule written to send these oddballs to the individual pages they were supposed to go to without the typo?
Second, is there a quick and easy method or tool to tell us whether a linking domain is good or spammy? I have incoming broken links from sites like these:
www.webutation.net
titlesaurus.com
www.webstatsdomain.com
www.ericksontribune.com
www.addondashboard.com
search.wiki.gov.cn
www.mixeet.com
dinasdesignsgraphics.com
Your help is greatly appreciated. Thanks! Greg
Technical SEO | GregB123
-
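One hedged sketch of handling such oddball endings in .htaccess, using the placeholder page names from the question (not verified against a live server): regex metacharacters like the dot must be backslash-escaped, and a pattern containing a quote character needs the whole argument wrapped in quotes so Apache parses the directive correctly.

```apache
RewriteEngine On

# Trailing junk glued onto the real filename (placeholder names)
RewriteRule ^pagename1\.htmlsale$ http://www.mydomain.com/pagename1.html [R=301,L]

# Stray trailing quote: quote the whole pattern and escape the " inside it
RewriteRule "^pagename3\.html\"$" http://www.mydomain.com/pagename3.html [R=301,L]

# Stray trailing slash
RewriteRule ^pagename4\.html/$ http://www.mydomain.com/pagename4.html [R=301,L]
```

These are illustrative only; each broken inbound URL should be tested individually, since RewriteRule matches against the decoded URL path and some mangled characters may arrive percent-encoded.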
Increase in Not Found Errors
Hello all, looking for input on an issue I am having. We used to have a website, www.gazaro.com, a price comparison engine for consumers. A shift in the focus of the business resulted in www.360pi.com, a price intelligence tool for retailers. The two websites have similar themes, so I thought it would be valuable to pass SEO juice from the old domain to the new domain. Back in August, I noticed that Gazaro was redirected to 360pi with a meta refresh. I know a 301 redirect is preferable to a meta refresh, so we switched to a 301 redirect. Since that happened, there has been a spike in 404 errors in Webmaster Tools. If you hover over the URL, it is actually www.360pi.com/deal/amazon etc. It is looking for Gazaro URLs on the 360pi domain, which don't exist. I think this is hurting our homepage ranking. Our homepage no longer ranks for "price intelligence" when it used to be in position 4 or 5. As it turns out, we are ranking #1 for "price intelligence", but with our product page. I'm wondering why the 404s are happening. Is something set up incorrectly? Or should I have them switch back to a meta refresh? Thoughts? Thanks for your help.
Technical SEO | AmandaHorne
-
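A hedged sketch of the kind of host-based 301 being described, for the old domain's .htaccess, assuming no path-for-path equivalents exist on the new site so everything lands on the new homepage. Redirecting old paths like /deal/amazon onto a domain that never had them is exactly what produces those 404 reports:

```apache
RewriteEngine On
# Match gazaro.com with or without www (hypothetical rule, not the site's actual config)
RewriteCond %{HTTP_HOST} ^(www\.)?gazaro\.com$ [NC]
# Send every old URL to the new homepage; ^ matches any path
RewriteRule ^ http://www.360pi.com/ [R=301,L]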
GWT crawl errors: How big a ranking issue?
For family reasons (a child to look after) I can't keep a close eye on my SEO and SERPs. But from top-10 rankings in January for a dozen keywords, I'm now not in the top 80 results, save for one keyword for which I'm around 18-20. It's not a sitewide penalty: some of my internal pages are still ranking top 3 or so. In GWT, in late March I received warning of a rise in server errors: 17 server errors / 575 soft 404s / 17 not found / 1 access denied / 4 others. I've also got 2 very old sitemaps (from two different ex-SEO firms), and I'm guessing about 75% of the links on there no longer exist. Q: Could all this be behind my calamitous SERP drop? Or should I be devoting my (limited) time to improving my links?
Technical SEO | Jeepster
-
Pagination/Crawl Errors
Hi, I've only just joined Moz, and after they crawled my site they came up with 3,600 crawl errors, mostly duplicate content and duplicate URLs. After researching this, it soon became clear it was due to on-page pagination, and after speaking with Abe from Moz he advised me to take action by getting our developers to implement rel="next" and rel="prev". Soon after our developers implemented this code (I have no understanding of this whatsoever), 90% of the keywords I had been ranking in the top 10 for dropped out of the top 50! Can anyone explain this or help me with it? Thanks, Andy
Technical SEO | beck398
-
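For reference, rel="next"/rel="prev" annotations are plain link tags in the head of each paginated page, pointing at the previous and next pages in the series. A hedged sketch with placeholder URLs (a mistake here, such as every page pointing at the same URLs, could plausibly cause the kind of indexing trouble described above):

```html
<!-- In the <head> of page 2 of a paginated series (placeholder URLs) -->
<link rel="prev" href="http://www.example.com/category?page=1">
<link rel="next" href="http://www.example.com/category?page=3">
```

The first page of a series carries only rel="next", and the last page only rel="prev"; it is worth verifying the generated tags page by page rather than assuming the plugin got them right.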
404 Error
Hello, SEOmoz flagged a URL as having a 404 client error. The reason the link doesn't return a proper content page is that the URL name was changed. What should we do? Will this error disappear when Google indexes our site again? Or is there some way to manually eliminate it? Thanks!
Technical SEO | OTSEO
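The usual fix for a renamed URL is a one-line 301 from the old path to the new one, for example in .htaccess (placeholder paths; the crawl error then clears on a later recrawl rather than instantly):

```apache
# Old page was renamed; send visitors and bots to the new location
# (hypothetical paths for illustration)
Redirect 301 /old-page-name.html /new-page-name.html
```

With the redirect in place, crawlers follow the old link to the new page and the 404 stops being reported once the URL is recrawled.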