Will a robots.txt disallow apply to a 301ed URL?
-
Hi there,
I have a robots.txt query which I haven't tried before and as we're nearing a big time for sales, I'm hesitant to just roll out to live!
Say for example, in my robots.txt I disallow the URL 'example1.html'.
In reality, 'example1.html' 301s/302s to 'example2.html'. Would the robots.txt disallow also apply to 'example2.html', or, since it's a separate URL, would the directive simply not cover it?
I have a feeling that as it's a separate URL, the robots disallow directive won't apply. However, just thought I'd sense-check with the community.
-
I would say that example2.html wouldn't be affected by the robots.txt, since a bot can visit example2.html directly without ever visiting example1.html. That's especially likely if the URL was picked up through another route after the first time a bot encountered example1.html.
-
I would have to agree. Using your example: if example1.html is blocked via robots.txt and it 301s to example2.html, then example2.html is not blocked by robots.txt, and it remains crawlable as long as it's discoverable via other indexed pages linking to it. Remember too that a compliant crawler never requests a disallowed URL in the first place, so it never even sees the 301.
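To make that concrete, here is a minimal robots.txt sketch using the hypothetical filenames from the question; the Disallow rule matches only the URL path it names, so the redirect target is untouched:

User-agent: *
# Matches only /example1.html. A compliant crawler never fetches this URL,
# never sees the 301, and /example2.html stays crawlable if it is
# discovered through any other link.
Disallow: /example1.html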
Related Questions
-
Robots.txt Syntax for Dynamic URLs
I want to Disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string '?Page='. Which is the proper syntax?
Disallow: ?Page=
Disallow: ?Page=*
Or something else?
Technical SEO | btreloar
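For what it's worth, a hedged sketch of the wildcard syntax that Google and Bing document (smaller crawlers may ignore wildcards). A rule has to start from the path, so a leading /* is used to match any page:

User-agent: *
# Blocks any URL where ?Page= opens the query string, e.g. /results?Page=2
Disallow: /*?Page=
# Also catches it as a later parameter, e.g. /results?sort=a&Page=2
Disallow: /*&Page=

-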
What to do about removing pages for the 'offseason' (IE the same URL will be brought back in 6-7 months)?
I manage a site for an event that runs annually, and now that the event has concluded we would like to remove some of the pages (schedule, event info, TV schedule, etc.) that won't be relevant again until next year's event. That said, if we simply remove those pages from the web, I'm afraid that we'll lose out on valuable backlinks that already exist, and when those pages return they will have the same URLs as before. Is there a best course of action here? Should I redirect the removed pages to the homepage for the time being using a 302? Is there any risk there if the 'temporary' period is ~7 months? Thanks in advance.
Technical SEO | KTY55
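If the pages are coming back at the same URLs, a temporary (302) redirect to the most relevant live page keeps the backlinks resolving without telling search engines the move is permanent. A minimal sketch, assuming an Apache server and hypothetical event paths:

# .htaccess: temporary redirects while the event pages are offseason;
# remove these when the pages return at the same URLs
Redirect 302 /schedule /event
Redirect 302 /tv-schedule /event

-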
Capital URLs to Non-Capitals...
Hi, I am working on a website which has capital URLs and non-capital URLs, which will be generating duplicate content, and I know it is better to use all lower case. The problem is that the page authority is better for the capital versions, and I was wondering: will it negatively impact the SEO if we 301 redirect the uppercase URLs to the lowercase counterparts? Thanks.
Technical SEO | J_Sinclair
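A 301 from uppercase to lowercase is the standard fix here, and authority should consolidate onto the lowercase URLs. A hedged Apache sketch (RewriteMap has to be declared in the server or vhost config, not in .htaccess):

# server/vhost config: declare a lowercasing map
RewriteMap lc int:tolower

# vhost or .htaccess (with the map declared above):
RewriteEngine On
# only act when the requested path contains an uppercase letter
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]

-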
No Keyword in URL
SEOMoz (and other platforms) advise that I need to add my keyword to the page URL; however, as far as I'm concerned it has been, so why don't these platforms see it? My home page URL is www.salesandinternetmarketing.com, but apparently I haven't added the keyword 'internet marketing' to the URL. What advice can you give me please? Lindsay
Technical SEO | lindsayjhopkins
-
Robots.txt and Joomla
Hello, I use Joomla for my website, and all of these folders are blocked automatically. Is that good or bad? Should I remove anything, and if so, why?
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /images/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/

I also added my email address to my robots.txt file (is that useful? I am afraid Google passes PR to the email address), plus a javascript:void(0) entry because I have tabs on my webpage (is that useful?), as well as a .pdf (is that also useful?). Any comments? Does anything need to be changed, or is it OK? Thank you.
Technical SEO | seoanalytics
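One commonly suggested tweak (a hedged sketch, not official Joomla guidance) is to stop blocking the folders that hold images, CSS, and JS, since search engines want to fetch those to render your pages:

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /tmp/
Disallow: /xmlrpc/
# /images/, /media/, /templates/, /components/, /modules/ and /plugins/ are
# deliberately left crawlable so bots can render pages properly.

As for the other entries: robots.txt rules only match URL paths, so an email address or a javascript:void(0) line does nothing there and can be removed; no PR is passed either way.

-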
Duplicate Content and URL Capitalization
I have multiple URLs that SEOMoz is reporting as duplicate content. The reason is that there are characters in the URL that may, or may not, be capitalized depending on user input. A couple of examples:
www.househitz.com/Pennsylvania/Houses-for-sale
www.househitz.com/Pennsylvania/houses-for-sale
www.househitz.com/Pennsylvania/Houses-for-rent
www.househitz.com/Pennsylvania/houses-for-rent
There are currently thousands of instances of this on the site. Is this something I should spend effort to try and resolve (may not be minor effort), or should I just ignore it and move on?
Technical SEO | Jom
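If a sitewide lowercase 301 (as in the question above) is too much effort, a rel=canonical pointing each variant at the lowercase version also resolves the duplicate-content signal. A sketch using one of the example URLs:

<link rel="canonical" href="http://www.househitz.com/Pennsylvania/houses-for-sale" />

-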
Robots.txt blocking site or not?
Here is the robots.txt from a client site. Am I reading this right --
that the robots.txt is saying to ignore the entire site, but the #'s are saying to ignore the robots.txt command?

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /

Technical SEO | 540SEO
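Reading it that way looks right: every line starts with #, so nothing is actually disallowed. Only the uncommented form would block the whole site:

User-Agent: *
Disallow: /
# This is the pair with the # markers removed; it bans all compliant
# crawlers from every URL on the site.

-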
Re-write of URL
Hi, I would like your input on the following dilemma. I am wanting to target the keyword "download xml". At the moment Google indexes us on page 2 and indexes the page www.ourdomain.com/download.aspx. I would like to rewrite the URL to be /download-xml-editor.aspx. The current page is a PR 5 and is our most trafficked and externally linked-to page. My thoughts are quite mixed on how to do this.
Approach 1: rewrite the URL of download.aspx and set up a permanent 301 redirect from download.aspx to download-xml-editor.aspx.
Approach 2: create a new page called download-xml-editor and 301 redirect that to the current stronger page, which is download.aspx.
Approach 3: create a new page called download-xml-editor with unique content and try to get that page to rank over time, allowing it to build up links without compromising the current page, then later 301 redirect.
How would you deal with this, and what are your recommendations?
Technical SEO | LiquidTech
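If approach 1 wins out, the redirect itself is a one-liner. A hedged sketch for Apache (an IIS site serving .aspx pages would do the equivalent with the URL Rewrite module in web.config):

# .htaccess: permanent redirect so existing links and PR flow to the new URL
Redirect 301 /download.aspx /download-xml-editor.aspx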