Google indexed a version of my site with an MX-record subdomain
-
We're doing a site audit and found "internal" links in Search Console that appear to come from a subdomain of our site based on our MX record (we use Google Mail internally). The links ultimately redirect to our preferred "www" subdomain, but I'm concerned about why this is happening and whether it can have any negative SEO implications.
Example of one of the links:
aspmx3.googlemail.com.sullivansolarpower.com/about/solar-power-blog/daniel-sullivan/renewable-energy-and-electric-cars-are-not-political-footballs
I also did a site: operator search on Google, site:aspmx3.googlemail.com.sullivansolarpower.com, and it returns several results.
-
You appear to have the MX sub-domain also set up as an A record.
If you have a Mac or Linux machine, you can run the command: host aspmx3.googlemail.com.sullivansolarpower.com
You currently get the result "aspmx3.googlemail.com.sullivansolarpower.com has address 72.10.48.198", whereas you should get "not found".
I think you want to delete that A record (though check your email provider's documentation first). The mail hosts only need to exist as MX records; they shouldn't need A records on your domain.
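If you want to script that check (or test whether a wildcard record is in play), here is a minimal sketch in Python 3; the second hostname is a made-up label used purely as a test, and ordinary system DNS resolution is assumed:

```python
import socket

# Hostnames to test: the accidental MX-based name from the question, plus a
# random label that should not exist. If the random one resolves too, the
# zone most likely has a wildcard A record rather than one stray entry.
hosts = [
    "aspmx3.googlemail.com.sullivansolarpower.com",
    "no-such-label-x1y2z3.sullivansolarpower.com",
]

for host in hosts:
    try:
        address = socket.gethostbyname(host)
        print(f"{host} -> {address}  (an A or wildcard record answers for this name)")
    except socket.gaierror:
        print(f"{host} -> not found  (this is the result you want)")
```

If both names resolve to the same address, deleting a single A record probably won't be enough; you'd want to look for (and remove or restrict) a wildcard entry in the DNS zone.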
You've done the right thing by setting up the redirect, which should mean the pages drop out of the index and those links disappear. (Note that there is also an HTTPS error on the aspmx3 subdomain, but given that you don't actually want it, I don't suppose that matters much.)
Hope that helps.
-
I did not explain the problem thoroughly. The link does not actually exist anywhere. To make a very long story short, there was a server configuration issue for a couple of months, and during that time an unknown number of non-existent subdomains got indexed. Basically, if anyone made a typo in the subdomain when accessing our site, that typo'd hostname would get cached, and if Google crawled it before we cleared the cache, the typo subdomain would get indexed. Over those couple of months, many bad subdomains were accidentally created and indexed by Google, and we have no way of getting a comprehensive list of them all. The configuration problem has been resolved, so no new bad subdomains are being created and indexed, but the damage has been done.
The way our site is currently set up, any attempt to reach it on any subdomain other than "www" gets redirected to "www.sullivan...", and any non-secure request gets redirected to https://.
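You can see this behaviour with a quick check. Here is a rough sketch using only the Python 3 standard library (the path is just the example URL from my first post); it prints the status code and Location header of the first response from one of the bad hostnames:

```python
import http.client

# Request a page on one of the accidental subdomains over plain HTTP and look
# at the first response only: it should be a 301 whose Location header points
# straight at the canonical https://www. URL.
host = "aspmx3.googlemail.com.sullivansolarpower.com"
path = ("/about/solar-power-blog/daniel-sullivan/"
        "renewable-energy-and-electric-cars-are-not-political-footballs")

conn = http.client.HTTPConnection(host, timeout=10)
conn.request("HEAD", path)
response = conn.getresponse()
print(response.status, response.getheader("Location"))
conn.close()
```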
The actual problem, simply put, is this: Google's index includes some unknown number of non-existent subdomains of our site. We need to get them removed and cannot figure out how.
Example: copy and paste the following into Google and search it:
site:aspmx3.googlemail.com.sullivansolarpower.com
Google will return two results. If you click on either, it resolves to the https://www. version of the page.
I know it is confusing, but does that make sense? I have searched everywhere; this happened because of a perfect storm of server configuration issues, and I cannot find anyone else who has had the same problem.
If it were only one or two bad subdomains, we would just verify them in Search Console and use "Remove URL" for each entire subdomain. But it is not one or two; it is at least ten that I know of, and it could be hundreds for all I know.
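One way to build a fuller list (assuming the web server access logs record the requested Host header, which they may not) is to pull out every hostname Googlebot has requested and flag anything that isn't "www". A rough, hypothetical sketch in Python 3 (the access.log path is a placeholder):

```python
import re
from collections import Counter

# Placeholder path; point this at the real access log. The regex assumes the
# requested Host header appears somewhere on each log line (e.g. via %{Host}i
# in Apache or $host in nginx) - adjust it to match the actual log format.
LOG_PATH = "access.log"
HOST_RE = re.compile(r"([a-z0-9.-]+\.sullivansolarpower\.com)", re.IGNORECASE)

hosts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = HOST_RE.search(line)
        if match:
            hosts[match.group(1).lower()] += 1

for host, hits in hosts.most_common():
    if host != "www.sullivansolarpower.com":
        print(f"{host}\t{hits} Googlebot requests")
```

Anything that turns up could then be verified in Search Console and submitted for removal, or at least confirmed to be redirecting correctly.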
Does anyone have any ideas? Any and all would be welcome.
Thank you.
-
You should find the locations of those links and correct them to point to the proper URL. I find Screaming Frog's crawl the easiest way to do this: it finds every link and shows you where each one is located.
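If you'd rather script it than use Screaming Frog, a small crawler can do a rough version of the same thing. Here is a sketch (Python 3 standard library only; the start URL and canonical host are taken from this thread, and the page cap just keeps the example small):

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

START = "https://www.sullivansolarpower.com/"
CANONICAL_HOST = "www.sullivansolarpower.com"
MAX_PAGES = 200  # small cap for the sketch; raise it for a fuller crawl


class LinkCollector(HTMLParser):
    """Collects every href found in <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


seen, queue = set(), [START]
while queue and len(seen) < MAX_PAGES:
    page = queue.pop(0)
    if page in seen:
        continue
    seen.add(page)
    try:
        with urllib.request.urlopen(page, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except Exception as exc:
        print(f"skipping {page}: {exc}")
        continue

    collector = LinkCollector()
    collector.feed(html)
    for href in collector.links:
        absolute = urljoin(page, href)
        host = (urlparse(absolute).hostname or "").lower()
        if host.endswith("sullivansolarpower.com") and host != CANONICAL_HOST:
            # A link on the www site still pointing at a stray subdomain.
            print(f"stray link on {page} -> {absolute}")
        elif host == CANONICAL_HOST:
            queue.append(absolute.split("#")[0])
```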