Server Connectivity
-
Hey there
When we go to our Webmaster Tools there is an orange triangle. The issue is that Google's robot cannot access our site.
Does anyone know why this could be?
Thanks!
-
No problem - PM means private message!
-
Thanks, guys! We are checking it.
Kevin, please forgive my ignorance - what is PM?
Cheers!
-
After the first step and Gerd's second step, I would check whether the DNS and/or firewall settings are the issue (I would also assume that your hosting account is already verified).
If you PM me with the site, I will gladly take a look. Good luck!
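If you want to rule out DNS quickly before PMing anyone, a minimal check from the command line works too. This is just a sketch using Python's standard library; "www.example.com" is a placeholder for the affected site, not the poster's actual domain:

```python
# Quick DNS sanity check: if this fails, Googlebot can't reach you either.
import socket

def resolve(hostname):
    """Return the resolved IP address, or None if the DNS lookup fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

if __name__ == "__main__":
    ip = resolve("www.example.com")  # placeholder hostname
    print(ip if ip else "DNS lookup failed - check your DNS records")
```

If this returns an address but GWMT still can't fetch the site, the problem is more likely a firewall or server-side block than DNS.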
-
First stop would be to go to GWMT, then to "Health -> Fetch as Google", and test the homepage and some public pages.
Have a look at the results provided by GWMT.
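You can also roughly approximate Fetch as Google from your own machine. This is only a sketch: it sends Googlebot's User-Agent string from your IP, so a firewall that blocks Google's IP ranges specifically can still fail in GWMT even when this succeeds ("https://www.example.com/" is a placeholder):

```python
# Fetch a page the way Googlebot identifies itself, and report the HTTP status.
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url, timeout=10):
    """Return the HTTP status code for `url`, or None if unreachable."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code          # server answered with an error status
    except urllib.error.URLError:
        return None              # DNS failure, refused connection, timeout

if __name__ == "__main__":
    print("HTTP status:", fetch_status("https://www.example.com/"))
```

A `None` result here points at connectivity (DNS/firewall); a 403 or 5xx points at server configuration.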
-
Hey Kevin,
No, we don't... What would be the next step?
Thanks again
-
Do you have a robots.txt file prohibiting crawling? (We can start here.)
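As a quick way to test that theory locally, Python's standard library can evaluate a robots.txt against Googlebot. The rules below are an illustrative blocking file, not the poster's actual one:

```python
# Check whether a robots.txt would block Googlebot, entirely offline.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
print(parser.can_fetch("Googlebot", "/"))  # False: this file blocks everything
```

Paste your real robots.txt into `robots_txt` and test the URLs GWMT is complaining about.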
Related Questions
-
If some root domains providing backlinks to a website are from the same server, would it cause an issue?
My client with alliedautotransport.com has a brother who owns hundreds of relevant websites with great content on them. However, if we have him do some backlinking from those pages on the same server, would it hurt the rankings or make a difference?
Technical SEO | SeobyKP1
-
One server, two domains - robots.txt allow for one domain but not the other?
Hello, I would like to create a single server with two domains pointing to it, e.g. domain1.com -> myserver.com/ and domain2.com -> myserver.com/subfolder. The goal is to create two separate sites on one server. I would like the second domain (/subfolder) to be fully indexed / SEO friendly, with a robots.txt file that allows search bots to crawl. However, the first domain (server root) I would like to keep non-indexed, with a robots.txt file disallowing any bots / indexing. Does anyone have any suggestions for the best way to tackle this one? Thanks!
Technical SEO | Dave1000
-
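One common way to handle the two-domain question above (sketched here for nginx, using the asker's placeholder hostnames; the filesystem paths are hypothetical) is to give each hostname its own server block and serve a different robots.txt from each:

```nginx
# domain1.com - keep out of the index: block crawlers and send noindex
server {
    listen 80;
    server_name domain1.com;
    root /var/www/myserver;

    add_header X-Robots-Tag "noindex" always;

    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }
}

# domain2.com - crawlable: serve a permissive robots.txt
server {
    listen 80;
    server_name domain2.com;
    root /var/www/myserver/subfolder;

    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow:\n";
    }
}
```

Note that robots.txt only stops crawling, not indexing; the `X-Robots-Tag: noindex` header on the root domain covers the case where it picks up links anyway.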
We are using hotlink protection on our server, mostly for JPGs. What is the moz.com address to allow crawl access?
We are using hotlink protection on our server, mostly for JPGs. What is moz.com's crawler address, so we can add it to the list of allowed domains? The reason is that the crawl statistics give us a ton of 403 Forbidden errors. Thanks.
Technical SEO | sergeywin10
-
Best way to fix a whole bunch of 500 server errors that Google has indexed?
I got a notification from Google Webmaster Tools saying that they've found a whole bunch of server errors. It looks like it is because an earlier version of the site I'm doing some work for had those URLs, but the new site does not. In any case, there are now thousands of these pages in their index that error out. If I wanted to simply remove them all from the index, which is my best option?
Disallow all 1,000 or so pages in the robots.txt?
Put the meta noindex in the headers of each of those pages?
Rel canonical to a relevant page?
Redirect to a relevant page?
Wait for Google to just figure it out and remove them naturally?
Submit each URL to the GWT removal tool?
Something else?
Thanks a lot for the help...
Technical SEO | jim_shook0
-
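If the dead URLs from the old site share a recognizable pattern, one lightweight server-side option is to return 410 Gone, which Google treats as a stronger removal signal than a 404. This is an Apache sketch; the `/old-site/` prefix is hypothetical:

```apache
# .htaccess - former URLs from the old site now return "410 Gone",
# telling Google the pages were intentionally removed
RewriteEngine On
RewriteRule ^old-site/ - [G,L]
```

Unlike a robots.txt disallow, this lets Googlebot actually see the removal status, so the pages drop out of the index rather than lingering as blocked URLs.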
Does server (host) location affect local search results?
Hey, I was wondering if the location of your server (host) affects your local search engine results. Suppose I have an e-commerce website in the Netherlands and I want to host my website in the USA or UK; does this affect my search engine results in the Netherlands?
Technical SEO | kevba0
-
Getting Errors On Server Connectivity?
Hi guys, I am getting massive crawl errors in Google Webmaster Tools, stating there are over 2,162 "connect timeout" errors. Does anyone know where I can see exactly where the timeouts are coming from? I have browsed through my site and have not seen any connection timeouts occur. Thanks, Cary
Technical SEO | ilovebodykits1
-
Best Google Practice for Hacked Site: Shift Servers/IP or Disavow?
Hi - over the past few months, I've identified multiple sites which are linking into my site and creating fake pages (below is an example, and there are over 500K similar links from various sites). I've attempted to contact the hosting companies, etc. with little success. I was wondering what my best course of action might be at this point: A) shift servers (or IP address), B) use the Google Disavow tool, or C) both. Example: http://aryafar.com/crossings/200-krsn-team-part19.html Thanks!!
Technical SEO | hhdentist0
-
Is there an easier way, from the server, to prevent duplicate page content?
I know that using either a 301 or 302 will fix the problem of duplicate page content. My question would be: is there an easier way of preventing duplicate page content when it's an issue with the URL? For example: URL: http://example.com URL: http://www.example.com My guess would be, like it says here, that it's a settings issue with the server. If anyone has some pointers on how to prevent this from occurring, it would be greatly appreciated.
Technical SEO | brianhughes2
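For the www vs. non-www case in the question above, the usual server-side fix is a single sitewide 301 rule. This is an Apache .htaccess sketch using the question's own example.com; swap in the real hostname:

```apache
# .htaccess - 301-redirect bare-domain requests to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With this in place every non-www URL permanently redirects to its www equivalent, so search engines consolidate the two versions onto one canonical host.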