How can I prevent Google and other search engines from crawling my secure (https:) pages?
-
Let me know your thoughts guys. Thanks in advance!
-
Your best bet is to place a meta noindex tag on each secure page. If it's only a few pages, you could just add it by hand. If it's many, you should be able to detect each page's protocol with whatever server-side language you're using, and dynamically add the tag on all secure pages.
If you use robots.txt to exclude the pages, Google can still show them in search results, with the description below them that reads, "A description for this result is not available because of this site's robots.txt – learn more." Personally, I don't care for that.
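To illustrate the dynamic approach, here's a minimal sketch in Python (the function name is illustrative; adapt the same idea to whatever server-side language your site actually runs):

```python
def robots_meta_tag(scheme: str) -> str:
    """Return a noindex/nofollow meta tag for HTTPS requests, nothing for HTTP."""
    if scheme.lower() == "https":
        return '<meta name="robots" content="noindex, nofollow">'
    return ""

# Emit this inside <head> only when the page was requested over HTTPS;
# plain HTTP pages get an empty string and stay indexable.
print(robots_meta_tag("https"))
print(robots_meta_tag("http"))
```

The point is that the tag is generated per request from the protocol, so you never have to hand-edit hundreds of templates.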
-
Hi there, blocking the HTTPS version of your pages from being crawled by the search engines is a bit tricky. You may need to set up a separate robots.txt file to handle the HTTPS requests.
This article explains the process in more detail:
http://www.seoworkers.com/seo-articles-tutorials/robots-and-https.html
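The gist of the technique is serving one robots.txt to the HTTP host and a blocking one to the HTTPS host. A rough sketch in Python (the file contents below are my assumption of a typical setup, not copied from the article, which walks through a server-level rewrite version of the same idea):

```python
# robots.txt served on the plain-HTTP site: no restrictions.
ROBOTS_HTTP = "User-agent: *\nDisallow:\n"
# robots.txt served on the HTTPS site: block everything.
ROBOTS_HTTPS = "User-agent: *\nDisallow: /\n"

def robots_txt_for(scheme: str) -> str:
    """Pick which robots.txt body to serve based on the request protocol."""
    return ROBOTS_HTTPS if scheme == "https" else ROBOTS_HTTP
```

Keep in mind the caveat from the answer above: robots.txt only blocks crawling, so a blocked URL can still appear in results with no description.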
Hope it helps.
Best,
Devanur Rafi
-
Hi esiow
You have a choice of placing a robots.txt file in the root folder of your website or, if blocking individual pages, using the meta robots tag. See these pages for more information: http://moz.com/learn/seo/robotstxt and https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?csw=1
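The Google document above also covers the X-Robots-Tag HTTP response header, which does the same job as the meta robots tag but works for non-HTML files too. A hedged sketch of setting it (the helper function below is hypothetical):

```python
def response_headers(is_secure: bool) -> dict:
    """Build response headers, adding X-Robots-Tag only for secure pages."""
    headers = {"Content-Type": "text/html; charset=utf-8"}
    if is_secure:
        # Equivalent to <meta name="robots" content="noindex, nofollow">,
        # but applied at the HTTP level, so it also covers PDFs, images, etc.
        headers["X-Robots-Tag"] = "noindex, nofollow"
    return headers
```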
I hope that helps,
Peter