How did my dev site end up in the search results?
-
We use a subdomain for our dev site. I never thought anything of it because the only way you can reach the dev site is through a VPN. Google has somehow indexed it. Any ideas on how that happened? I am adding the noindex tag; should I also use a canonical? Or is there anything else you can think of?
-
Personally, I'd still recommend using robots.txt to disallow all crawlers, even if more steps are taken.
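For reference, a disallow-all robots.txt served at the root of the dev subdomain is just two lines (a minimal sketch; note that this only blocks crawling, it does not by itself remove URLs that are already indexed):

```text
User-agent: *
Disallow: /
```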
-
Don't use the removal tool; it can indeed go wrong. Now, are you sure that there are no external links coming in from anywhere?
For now I'd recommend putting noindex, nofollow on that dev subdomain and requesting a manual recrawl through Google Webmaster Tools (GWT).
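For the noindex, nofollow suggestion above, the tag goes in the `<head>` of every page on the dev subdomain (a sketch, not the poster's exact markup):

```html
<!-- per-page, inside <head> on the dev subdomain -->
<meta name="robots" content="noindex, nofollow">
```

The same directive can be sent site-wide as an HTTP header, `X-Robots-Tag: noindex, nofollow`, which also covers non-HTML files such as PDFs and images.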
-
It just uses internal links. Do you think I should try the Webmaster Tools removal? That seems like it could go wrong.
-
I've never used Screaming Frog; does it check both external and internal links?
-
I have run Screaming Frog to see if there are any links to those pages, but couldn't find any. Even if Google did try to follow one, the firewall would stop it. It is so strange.
-
Then my first assumption is that it's linked from somewhere - read my comment a little above.
-
Then there is a leak somewhere, and Google's bots can "see" your subdomain.
Or it's simply been linked from somewhere. Google will then try to follow the link, and that is what gets it indexed.
-
They are telling me that there are no holes, and I have tried getting to the pages but cannot do it unless I am on my VPN.
-
We never updated the robots.txt because the site was behind a firewall. If you click on any of the results, the page will not load unless you are on the VPN.
-
Robots.txt won't help here anyway. Bots can still see that such a directory exists; they just won't see what's inside those directories/subdomains.
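To make the "won't help" point concrete, Python's stdlib robots.txt parser shows what a disallow-all file does and doesn't do (a sketch; dev.example.com is a placeholder, not the poster's site):

```python
from urllib import robotparser

# Hypothetical disallow-all robots.txt for the dev subdomain.
rules = """User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Crawlers that obey robots.txt may not fetch any page...
print(rp.can_fetch("Googlebot", "https://dev.example.com/any-page"))  # False

# ...but nothing here removes a URL that is already indexed; Google can
# still list the bare URL from links alone. Deindexing needs noindex, and
# the bot must be ALLOWED to crawl the page to see that tag, so a
# disallow-all can actually keep the noindex from being discovered.
```

This is why the noindex tag and a disallow-all robots.txt work against each other during the cleanup phase.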
-
Hi there.
If what you say is true, then there are only two answers: you have a leak somewhere, or your settings/configuration is messed up. I'd say go talk to your system admin and make sure that everything that's supposed to be closed is closed, and that the IPs which are supposed to be open for use are open, and those IPs only.
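One common way to enforce "those IPs only" at the web-server level is an allow/deny block. An nginx sketch with a placeholder VPN range, not the poster's actual config:

```nginx
server {
    server_name dev.example.com;   # placeholder dev subdomain

    # Allow only the VPN's egress range (placeholder CIDR);
    # everyone else, including Googlebot, gets a 403.
    allow 10.8.0.0/24;
    deny  all;
}
```

With a rule like this in place, even a leaked link only exposes the URL itself, never the page content.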
-
Have you updated the dev site's robots.txt to disallow everything? It is up to the bot to honor it, but that, combined with removing all of the dev URLs via Google Webmaster Tools, should do the trick.