Removing site subdomains from Google search
-
Hi everyone,
I hope you are having a good week.
My website has several subdomains that I shut down some time back, but pages from these subdomains are still appearing in Google's search results.
I want all the URLs from these subdomains to stop appearing in Google's search results, and I was hoping someone could help me with this.
The subdomains are no longer under my control, as I don't have web hosting for these sites (they just show a default hosting server page). Because of this, I cannot verify them in Search Console and submit a URL/site removal request to Google.
In total, about 70 pages from these subdomains are showing up in Google at the moment, and I'm concerned that they may have a negative impact on my SEO.
Thanks for taking the time to read my post.
-
Hi Andrew,
This can be solved; you have a couple of options:
- Do you have Google Analytics accounts for these subdomains? If so, you could verify them in Search Console via GA and request removal of the subdomains from there.
- Even if you don't have web hosting for these subdomains, you can still take control of them as long as you control the main domain. Is the core domain still under your control? If so, you can verify it in Search Console via the 'domain name provider' option.
Thanks,
David
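P.S. For reference, the verification record is a plain TXT entry on the root of the domain, with a value of the form google-site-verification=<token>. A hypothetical zone-file line (the token below is made up) would look like this:

example.com. 3600 IN TXT "google-site-verification=AbC123dEf456gHi789jKl"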
-
Hi David,
Thank you very much for sharing these options with me. I don't have Google Analytics for the subdomains, so I tried the second option.
I selected 'domain name provider', went with 'other', and added a TXT record to the DNS configuration for the main domain yesterday, following Google's instructions. I just checked the status a minute ago and I'm still seeing the verification failure message below:
"Your verification DNS TXT record was not found. You might need to wait a few minutes before Google sees your changes to the TXT records."
How long does it normally take to verify a property this way?
Thanks again, David, for your kind help.
-
Hi Andrew! How is this going? We'd love an update!
-
Hi Andrew,
Is it possible for you to have your hosting provider or domain registrar set up a redirect from the old subdomains to the domain currently in use?
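If the old subdomains can be pointed at a web server you control, a minimal Apache .htaccess sketch would be something like the following (hostnames are hypothetical, and it assumes mod_rewrite is enabled):

# Hypothetical .htaccess on whatever server the old subdomain points to;
# it 301-redirects every request to the equivalent path on the main site.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

A permanent (301) redirect also tells Google to drop the old URLs from its index in favour of the new ones over time.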
Best regards,
Anders
-
Hi Andrew,
If you can send me the domain, I'll have a look at the DNS and see if there's anything that looks unusual. Drop me a message on here, or email me at [email protected]
Cheers
David
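P.S. While you wait on the verification, you can also check whether your TXT record has propagated with a quick lookup of your own. Here's a minimal sketch in Python using the dnspython package (my assumption; example.com stands in for your real root domain):

import dns.resolver  # pip install dnspython

# Fetch all TXT records on the root domain and look for Google's
# verification value among them.
answers = dns.resolver.resolve("example.com", "TXT")
for rdata in answers:
    for chunk in rdata.strings:
        value = chunk.decode()
        print(value)
        if value.startswith("google-site-verification="):
            print("Verification record is visible in DNS.")

If the record doesn't show up here either, the change hasn't propagated yet (or was added at the wrong level), and Google won't be able to see it.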
-
Related Questions
-
Is there an easy way to hide one of your URLs in Google search, rather than redirecting?
We don't want to redirect to a different page, as some people still use it; we just don't want it to appear in search.
Technical SEO | TheIDCo
-
How to ask Google to deindex scraper sites?
While doing Google text searches for various keywords, I have found two sites that have scraped pages from my site, which goes by an old URL of www.tpxcnex.com and a new URL of www.tpxonline.com. www.folder.com is one of the sites, and if you try to visit that site or any of its scraped listings in Google's index, Chrome warns you not to. How can I ask Google to deindex www.folder.com or the other scraper site, or at least deindex the URLs that have clearly scraped my content?
Technical SEO | DougHartline
-
Google not indexing/showing my site in search results...
Hi there, I know there are answers all over the web to this type of question (and in Webmaster Tools); however, I think I have a specific problem that I can't really find an answer to online. The site is: www.lizlinkleter.com
Firstly, the site has been live for over two weeks. I have done everything from adding Analytics, to submitting a sitemap, to adding the site to Webmaster Tools, to fetching each individual page as Googlebot and then submitting it to the index via Webmaster Tools. I've checked my robots files and code elsewhere on the site, and the site is not blocking search engines (as far as I can see). There are no security issues in Webmaster Tools or Moz.
Google says it has indexed 31 pages in the 'Index Status' section, but on the site dashboard it says only 2 URLs are indexed. When I do a site:www.lizlinketer.com search, the only results I get are pages that are excluded in the robots file: /xmlrpc.php & /admin-ajax.php.
Now, here's where I think the issue stems from: I developed the site myself for my wife, and I am new to doing this, so I developed it on the live URL (I now know this was silly). I did block the content from search engines and have the site password-protected, but I think Google must have crawled the site before I did this. The problem was that I had pulled in the WordPress theme's dummy content to make the site easier to build, so there was lots of nasty duplicate content.
The site took me a couple of months to construct (working on it on and off), and I eventually pushed it live and submitted it to Analytics and Webmaster Tools (obviously it was all original content at this stage). But this is where I made another mistake: I submitted an old sitemap that had quite a few old dummy-content URLs in it. I corrected this almost immediately, but it probably did not look good to Google.
My guess is that Google is punishing me for having the dummy content on the site when it first went live. Fair enough, I was stupid, but how can I get it to index the real site?! My question is, with no tech issues to clear up (I can't resubmit the site through Webmaster Tools), how can I get Google to take notice of the site and have it show up in search results?
Your help would be massively appreciated! Regards, Fraser
Technical SEO | valdarama
-
How to get found in local Google search?
Hey, when looking for particular local businesses on Google, "Manchester Hotels" for example, all the businesses that are marked on the city map (with A, B, C...) come up at the top of the Google page, and then all the others follow. So my question is: "How can I get my business on that map?" Thank you! Ve
Technical SEO | MissVe
-
Penalties on a brand new site: sandbox time, or rather a problem with the site?
Hi guys, four weeks ago we launched the site www.adsl-test.it. We just did some article marketing and developed a lot of functionality to run speed tests and share the results through the site. We had been on the ninth page of the Google SERPs for weeks; then suddenly, for one day (29 February), we were on the second page, and the next day the website's home page disappeared, even for brand searches like "adsl-test". The actual situation is: it looks like we are not banned (site:www.adsl-test.it is still listed); GWT doesn't show any suggestions, and everything looks good there; and we are quite high on bing.it and yahoo.it (fourth place on the first page) for an "adsl test" search. Can anybody help us understand?
Another thing I thought of: we create a single ID for each test that we run, and these tests are indexed by Google. For example: www.adsl-test.it/speedtest/w08ZMPKl3R or www.adsl-test.it/speedtest/P87t7Z7cd9. The content of these URLs is actually somewhat different (because the speed measured is different), but beyond the badge, the other content on each page is pretty much the same. Could that be a possible reason? I mean, might Google think we are creating duplicate content, even though the pages are not effectively duplicates but just the results of different speed tests?
Technical SEO | codicemigrazione
-
Directory indexed in Google that I don't want. How to remove it?
Hi, one of my own websites is having a slight issue: Google has indexed over 500 pages and files from a template directory on my eCommerce website, and in Google Webmaster Tools I'm getting over 580 crawl errors, mostly the ones below. I went into my robots.txt file and added:
Disallow: /skins*
Disallow: /skin1*
Will this block Google from crawling them again? And how do I go about getting the 500 pages that are already indexed taken out? Any help would be great.
http://www.rfkprintsolutions.co.uk/skin1/modules/Subscriptions/subscription_priceincart.tpl | 403 error | Jan 15, 2012
http://www.rfkprintsolutions.co.uk/skin1/modules/Subscriptions/subscription_info_inlist.tpl | 403 error | Jan 15, 2012
http://www.rfkprintsolutions.co.uk/skin1/modules/Subscriptions/subscriptions_admin.tpl | 403 error | Jan 15, 2012
Technical SEO | rfksolutionsltd
-
Will frequently adding and removing pages from my site hinder my SEO?
Hi guys, just looking through our crawl diagnostics, we have a ton of errors (well over 5,000, actually) on 404 pages that cannot be accessed. Our website runs a lot of "Hot Offers" that are time-bound, so they expire at the end of each month and we remove the pages via our CMS. It's making the crawl diagnostics look bad, but will it hinder our SEO and Google 'stuff', given that they are finding thousands of 404 errors? Any advice would be greatly appreciated! Website: www.vospers.com Lee Greenhill
Technical SEO | lee_greenhill
-
How to push down outdated images in Google image search
When you do a Google image search for one of my client's products, you see a lot of first-generation hardware (the product is now in its third generation). The client wants to know what they can do to push those images down so that current product images rise to the top. FYI: the client's own image files on their site aren't very well optimized with keywords. My thinking is to have the client optimize their own images, and the ones they give to the media, with relevant keywords in file names, alt text, etc.; eventually, this should help push down the outdated images. Any other suggestions? Thanks so much.
Technical SEO | jimmartin_zoho.com