Website not listed in Google - Screaming Frog shows a 500 error? What could the issue be?
-
Hey,
http://www.interconnect.org.uk/ - the site seems to load fine in a browser, but for some reason it is not getting indexed. I tried crawling the site with Screaming Frog, and it reports a 500 error code, which suggests it can't access the site. I'm guessing Google is running into the same problem. Do you have any ideas as to why this may be and how I can rectify it?
Thanks,
Andrew
-
Thanks - I'll get in touch with their hosting provider asap.
-
Hi Andrew,
The server on which your website is hosted is throwing a 500 server-side error. You should contact the hosting provider and get the issue fixed. You can confirm this with an online HTTP header status checker such as web-sniffer.net: enter the URL http://www.interconnect.org.uk and check the response code.
Other tools, such as the W3C validator and http://builtwith.com/interconnect.org.uk, gave the same result:
500 Internal Server Error
Best,
Devanur Rafi
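The header check Devanur describes can also be done locally. A minimal sketch using only Python's standard library; the URL is from the thread, everything else (function names, the 10-second timeout) is illustrative:

```python
# Check what status code a server actually returns, the same thing
# web-sniffer.net and similar online tools do in the browser.
from urllib import request
from urllib.error import HTTPError

def fetch_status(url, user_agent="Mozilla/5.0"):
    """Return the HTTP status code the server sends for `url`.

    urllib raises HTTPError for 4xx/5xx responses, so the error
    object itself carries the code we want. Note some servers reject
    HEAD requests; switch to GET if the result looks suspicious.
    """
    req = request.Request(url, method="HEAD",
                          headers={"User-Agent": user_agent})
    try:
        with request.urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code  # e.g. 500 Internal Server Error

def classify(code):
    """Rough meaning of a status code for crawlability."""
    if 200 <= code < 300:
        return "OK - crawlable"
    if 300 <= code < 400:
        return "redirect - check the target"
    if 400 <= code < 500:
        return "client error - page blocked or missing"
    return "server error - fix with the hosting provider"

# print(classify(fetch_status("http://www.interconnect.org.uk")))
```

Running this with a browser User-Agent versus a crawler User-Agent can also reveal whether the server treats bots differently, which would explain a site that loads fine in a browser but 500s for Screaming Frog and Googlebot.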
-
Have you looked in Webmaster Tools? It may tell you what is wrong under Crawl Errors.
Related Questions
-
Malicious links on our site indexed by Google but only visible to bots
We've been suffering from some very nasty black hat SEO. In Google's index, our pages show external links to various pharmaceutical websites, but our actual live pages don't show them. It seems as though only certain user agents see the malicious links: crawling with Screaming Frog set to the Googlebot user agent also shows them. Any idea what could have caused this or how it can be stopped? We scanned all files on our web server and couldn't find any of the malicious links. We've changed our FTP and CMS passwords; is there anything else we can do? Thanks in advance!
White Hat / Black Hat SEO | SEO-Bas
-
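The symptom described above (the bot sees links the browser doesn't) can be detected by diffing the links extracted from the two versions of the page. A minimal sketch, assuming you have saved both copies locally, e.g. one fetched with a normal browser User-Agent and one fetched as Googlebot:

```python
# Compare the <a href> targets in two copies of the same page to find
# links that are served only to the bot - the injected spam links.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href from <a> tags as the parser walks the HTML."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

def extract_links(html):
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

def cloaked_links(browser_html, bot_html):
    """Links present in the bot's copy but not the browser's copy."""
    return extract_links(bot_html) - extract_links(browser_html)
```

Because the links exist only in the bot-facing response and not in the files on disk, the injection is usually happening at request time (a compromised server module, .htaccess rewrite, or CMS plugin), which is why a file scan comes up clean.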
URL suddenly disappeared from Google search results
Hi, I am facing a big problem: Google stopped showing a basic URL of my site. It was ranked well for more than 35 keywords, in positions 1 to 8, and suddenly I can't find it indexed in Google. This is the URL: http://tv1.alarab.com/view-8/مسلسلات-عربية Thanks
White Hat / Black Hat SEO | alarab.net
-
More sitemap issues: help
Hey guys, it seems I'm having more sitemap issues. I just checked my WMT for my .com.au and .com sites. The .com.au site shows only 2 pages indexed out of 72 web pages submitted. For the .com site, under Sitemaps it doesn't show how many pages have been indexed; instead it gives this error warning: "Your Sitemap appears to be an HTML page. Please use a supported sitemap format instead." All 3 sites are listed here: http://bit.ly/1KTbWg0 http://bit.ly/1AU0f5k http://bit.ly/1yhz96v Any advice would be much appreciated! Thanks guys
White Hat / Black Hat SEO | edward-may
-
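The "appears to be an HTML page" warning usually means the sitemap URL is returning an HTML error or placeholder page instead of XML. Per the sitemaps.org protocol, a valid sitemap's root element is `<urlset>` (or `<sitemapindex>` for an index file), so a quick local check is possible; a sketch assuming the sitemap has been downloaded into a string:

```python
# Check whether downloaded sitemap text is what Webmaster Tools expects:
# well-formed XML whose root is <urlset> or <sitemapindex>.
import xml.etree.ElementTree as ET

def sitemap_problem(text):
    """Return None if `text` looks like a valid sitemap, else a message."""
    try:
        root = ET.fromstring(text)
    except ET.ParseError:
        return "not well-formed XML (probably an HTML error page)"
    tag = root.tag.split("}")[-1]  # strip the XML namespace, if any
    if tag not in ("urlset", "sitemapindex"):
        return f"root element is <{tag}>, expected <urlset> or <sitemapindex>"
    return None
```

If this reports an HTML page, fetch the sitemap URL in a browser: a soft 404, a redirect to the homepage, or a CMS "page not found" template are the common culprits.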
Is it possible to use Google Authorship in an online shop?
Today I set up Google Authorship on my WordPress blog and I would like to know if it's possible to implement it in my OpenCart online shop. I am not interested in rich snippets because I have 9k products and 90% of them don't have sales or reviews.
White Hat / Black Hat SEO | mozismoz
-
Is there a danger linking to and from one website too many times?
Basically my web developer has suggested that instead of using a subfolder to create English and Korean versions of the site, I should create two different websites and then link them together, page by page, so that each page points to its English or Korean counterpart. My immediate reaction is that search engines may perceive this kind of linking as manipulative; as you can imagine, there will be a lot of links (one for every page). Do you think it is OK to create two websites and link them together page by page? Or do you think the sites will get penalized by search engines for link farming or link exchanging? Regards, Tom
White Hat / Black Hat SEO | CoGri
-
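One way to make the page-to-page relationship explicit to search engines, rather than relying only on visible cross-links, is hreflang annotation: each language version of a page declares all its alternates, including itself. A hypothetical helper that builds the tags (the URLs below are placeholders, not from the question):

```python
# Build the rel="alternate" hreflang <link> elements that every
# language version of a page should carry in its <head>.
def hreflang_tags(translations):
    """Build hreflang <link> tags from a {lang_code: url} mapping.

    The full set - including a self-referencing tag - goes on every
    version of the page, so the annotations are reciprocal.
    """
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(translations.items())
    )

tags = hreflang_tags({
    "en": "http://example.com/about",     # hypothetical English URL
    "ko": "http://example.co.kr/about",   # hypothetical Korean URL
})
```

With hreflang in place, the cross-site links read as translation signals rather than a link exchange, which addresses the manipulation worry directly.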
What does Google consider to be a branded keyword?
We can set our own keywords as branded in an SEOmoz campaign, but Google would not necessarily see them as branded. After reading the blog post at http://www.seomoz.org/blog/how-wpmuorg-recovered-from-the-penguin-update I had a question: are there known rules (or at least guesses) about what Google considers a branded keyword/anchor text? I guess the first one would be your website domain, so bluewidget.com would be a branded keyword for the bluewidget.com website. How about Blue Widget, or Blue Widget Company?
White Hat / Black Hat SEO | SirMax
-
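Google's actual rules are not published, but the guess in the question (domain name first, then spaced and suffixed variants of it) can be written down as a heuristic. A rough sketch, not a known Google rule; the example domain is taken from the question:

```python
# Heuristic: treat a keyword as branded if, with spaces and punctuation
# removed, it contains the domain's brand token ("bluewidget.com" ->
# "bluewidget"). This catches "Blue Widget" and "Blue Widget Company".
import re

def looks_branded(keyword, domain="bluewidget.com"):
    """Guess whether `keyword` is branded for `domain`."""
    brand = domain.split(".")[0]  # "bluewidget"
    squashed = re.sub(r"[^a-z0-9]", "", keyword.lower())
    return brand in squashed
```

Under this heuristic "Blue Widget" and "Blue Widget Company" both count as branded for bluewidget.com, while generic queries like "cheap widgets" do not, which matches the intuition in the question.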
Should we add our site to Google Webmaster Tools?
Hello, should we add our site nlpca(dot)com to Google Webmaster Tools? Everything's very white hat, but we do have a section on each of our 4 sites for "Our Other Sites" that links to the others. It's been there for many years. We're looking for clues as to why we've dropped in rank. Thanks!
White Hat / Black Hat SEO | BobGW
-
Google Penalising Pages?
We run an e-commerce website that has been online since 2004. For some of our older brands we are getting good rankings for the brand category pages and also for their model numbers. For newer brands, the category pages aren't getting rankings and neither are the products; even when we search for specific unique content on those pages, Google does not return results containing them. The real kicker is that the pages are clearly indexed: searching for the page itself by URL, or restricting the same search with the site: modifier, brings the page up straight away! Sometimes the home page will appear on page 3 or 4 of the rankings for a keyword even though there is a much more relevant page from our site in Google's index - and Google knows it, because restricting the same keywords with a site: modifier shows the obviously relevant page first, followed by many other pages before, say, the home page. This leads me to the conclusion that something on certain pages is tripping Google's algorithms, or worse, that there has been manual intervention by somebody. There are literally thousands of products affected. We worried about duplicate content, but we have rich product reviews and videos all over these pages that aren't showing anywhere; they look very much singled out. Has anybody experienced a situation like this before and managed to turn it around? Link - removed Try a page in, for instance, the D&G section and you will find it easily on Google most of the time. Try a page in the Diesel section and you probably won't; applying -removed and you will. Thanks, Scott
White Hat / Black Hat SEO | scottlucas