How to Disallow Crawling of Specific Folders and Subfolders?
-
Today I checked the indexing of my website in Google and found a very interesting result: Google has indexed my open /images folder (http://www.lampslightingandmore.com/images/) as a regular search result.
I am aware of the robots.txt file and know I could disallow the images folder to solve this issue.
But that may block my images from appearing in Google Image Search.
So, how can I fix this issue?
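For clarity, this is the kind of robots.txt rule I mean (just a sketch, assuming my images live under /images/):
User-agent: *
# Blocks the folder for all crawlers, which would also keep Googlebot-Image out
Disallow: /images/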
-
You can, but then the content will be removed from Google's index for 90 days. I am not sure what effect this would have on the pages containing the images. It shouldn't have any effect, but I would hate for your rankings to be affected in any way for 90 days.
I have no experience with images indexed in this manner. Perhaps someone else has more knowledge to share on this topic.
-
Can I use the Remove URL facility in Google Webmaster Tools?
-
I checked your URL: http://www.lampslightingandmore.com/images/. The folder is now properly restricted and the images can no longer be seen using this method. Going forward, Google will not be able to index new images in the same manner your other images were indexed.
With respect to the images which have already been indexed, I am not certain how Google will respond. The image URLs themselves still work, so Google may keep them; on the other hand, the directory links to them are gone, so Google may remove them. If it were my site, I would wait 30 days to see if Google removes the results.
Another way you can resolve the issue is to change the file path of your images from /images to /image. This will immediately break all of the old links, so you would need to ensure every link on your site is updated properly. It may still take Google a month to de-index those results, but in that case it would certainly happen.
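If you do rename the folder, one optional extra step (a sketch of a mod_alias rule, not something you must do) is to explicitly tell Google that the old URLs are gone:
# Return "410 Gone" for anything under the old /images/ path
RedirectMatch gone ^/images/
A 410 is a stronger removal signal than letting the old links fall to 404s, so Google should drop those results sooner.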
-
I have added Options -Indexes for the images folder in my .htaccess file.
But I am still able to find the images folder in Google's index.
How can I check whether it is working properly or not? I don't want the images folder to be indexed or displayed in web search any more.
-
I am going to add the following code to my .htaccess file.
Options -Indexes
Will it work for me or not?
-
If you have a development team, they should instantly understand the problem.
A simple e-mail to any developer will do:
E-mail title: Please fix
http://www.lampslightingandmore.com/images/
That's it. No other text should be needed. A developer should be able to look at the page, understand the index was left open, and know how to fix it. If you wish to be nicer, then a simple "my index is open for the world to see, please don't allow public access to my server folders" should suffice.
-
Yes, I have a similar problem with my code structure. Yesterday I set relative paths for all URLs, but I am not sure that every image reference in the code would be updated correctly after changing the folder name.
So, I don't want to go that route. I also discussed it with my development team and recommended going with the .htaccess method.
But I should be careful to follow the method exactly, otherwise it may create big issues for crawling or indexing. Right?
-
The link you shared is perfect. Near the top there is a link for OPTIONS. Click on it and you will be on this page: http://httpd.apache.org/docs/1.3/mod/core.html#options
I want to very clearly state you should not make changes to your .htaccess file unless you are comfortable working with code. The slightest mistake and your entire site becomes unavailable. You can also damage the security of your site.
With that said, if you decide to proceed anyway you can add the text I shared to the top of your .htaccess file. You definitely should BACK UP the file before making any changes.
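If you would rather not touch your main .htaccess file at all, a lower-risk alternative (assuming your host's AllowOverride setting permits Options) is to create a separate .htaccess file inside the images folder itself, so the change is scoped to that one directory:
# Contents of /images/.htaccess: turns off directory listings for this folder only
Options -Indexes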
The suggestion vishalkialani made was to rename your /images folder to something else, perhaps /image. The problem is that if your site was not dynamically coded, you would break your image links.
-
In addition to what Ryan mentioned, I would rename that folder on your server. That will make Google's index outdated, and the stale results won't send any visitors to the folder.
-
Sorry, I am not following you. Can you explain?
-
You can also rename it, so that when Google's index shows the old results, those URLs won't get any hits.
If that's what you want.
-
Yes, I checked this article to learn more about it:
http://httpd.apache.org/docs/1.3/howto/htaccess.html
But I am not able to find my solution there. Can you suggest a specific article that would help me further in the same direction?
-
Hello.
You have left your site open in a manner which is not recommended. Please take a look at the following URL: http://www.lampslightingandmore.com/images/. On a properly secured server, you should receive a 404 Page Not Found or Access Denied type of error. Since the folder is left open, a Google crawler found it and you are seeing the results.
The means to secure your site vary based on your software configuration. If you are on an Apache web server (the most common setup), then these settings are controlled by your .htaccess file. I am not an expert in .htaccess, but I believe adding the following code at the top of your .htaccess file will fix the issue:
Options -Indexes
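Once that directive is in place, requesting http://www.lampslightingandmore.com/images/ should return a 403 Forbidden error instead of the file listing, which you can use to confirm it is working.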