Help with htaccess
-
I just set up a WP install in a subfolder: domain.com/development/
However, there is an existing .htaccess file in the root, which contains the following:
RewriteRule ^([A-Za-z_0-9-]+)$ /index.php?page=$1 [QSA]
RewriteRule ^([A-Za-z_0-9-]+)/$ /index.php?page=$1 [QSA]
RewriteRule ^([A-Za-z_0-9-]+)/([a-z]+)$ /index.php?page=$1&comp=$2 [QSA]
RewriteRule ^([A-Za-z_0-9-]+)/([a-z]+)/$ /index.php?page=$1&comp=$2 [QSA]
I need to leave these rules as-is due to the nature of the CMS (not WP) running under the root domain.
Is it possible to include an exception or condition which allows URL requests containing /development/ to resolve to that folder?
I tried to add:
RewriteRule ^development/([A-Za-z_0-9-]+)$ /development/index.php?page=$1 [QSA]
but this seems to send it in a loop back to the root.
Thanks!!!
-
Hi there,
To be able to give you an answer, could you please confirm: do you want to apply specific rules to the pages inside the /development/ subdirectory that override the ones already included in the root .htaccess file, or something else?
Thanks for the confirmation!
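In the meantime, here is a rough sketch of the most common way to carve a subdirectory out of root rewrite rules; treat it as a sketch only, assuming Apache with mod_rewrite, that the rules above live in the root .htaccess, and that the WordPress install in /development/ has its own standard .htaccess:

RewriteEngine On

# Placed above the existing CMS rules: stop rewriting anything under /development/
# so the request falls through to the WordPress .htaccess inside that folder.
RewriteRule ^development(/.*)?$ - [L]

# ...the existing CMS RewriteRule lines stay below, unchanged...

Whether that is the right approach depends on your answer to the question above, so please do confirm.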
Related Questions
-
Video SERP Help
Intermediate & Advanced SEO | KINQDOM
Hello Friends,
I am trying to appear in search results for property-related search terms with my property videos. Here is a sample property video: http://www.antalyahomes.com/videositemap.asp Could you please check it and tell me what I am doing wrong? Thanks in advance for your time.
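For reference, a single entry in a Google video sitemap normally looks roughly like the sketch below (every URL and value here is a made-up placeholder for illustration, not taken from the site above):

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/property-page.asp</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/property.jpg</video:thumbnail_loc>
      <video:title>Example property video</video:title>
      <video:description>Short description of the property shown in the video.</video:description>
      <video:content_loc>http://www.example.com/videos/property.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>

Comparing the linked file against a minimal entry like this is usually the quickest way to spot missing required elements.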
-
Need some help/input about my Joomla sitemap created by XMap
Intermediate & Advanced SEO | mr_w
Here is my current sitemap for my site: http://www.yakangler.com/index.php?option=com_xmap&view=xml&tmpl=component&id=1 I have some questions about its current settings. I have a component called JReviews, and Xmap produces a separate link for each review, e.g.:
http://www.yakangler.com/fishing-kayak-review/265-2013-hobie-mirage-adventure-island 2014-09-03T20:46:25Z monthly 0.4
http://www.yakangler.com/fishing-kayak-review/266-2012-wilderness-systems-tarpon-140 2014-06-03T15:49:00Z monthly 0.4
http://www.yakangler.com/fishing-kayak-review/343-wilderness-systems-tarpon-120-ultralite 2013-11-25T06:39:05Z monthly 0.4
Whereas my other articles are only linked by their content category, e.g.:
http://www.yakangler.com/news monthly 0.4
http://www.yakangler.com/tournaments monthly 0.4
http://www.yakangler.com/kayak-events monthly 0.4
http://www.yakangler.com/spotlight monthly 0.4
Which option is better?
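For reference, the flattened lines above correspond to standard sitemap-protocol entries along these lines (a sketch using values copied from the question; the <urlset> wrapper is the standard declaration):

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yakangler.com/fishing-kayak-review/265-2013-hobie-mirage-adventure-island</loc>
    <lastmod>2014-09-03T20:46:25Z</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.4</priority>
  </url>
  <url>
    <!-- Category-level entry with no lastmod, as in the question -->
    <loc>http://www.yakangler.com/news</loc>
    <changefreq>monthly</changefreq>
    <priority>0.4</priority>
  </url>
</urlset>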
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Intermediate & Advanced SEO | browndoginteractive
Hi Guys,
We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings pages: the page where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details pages: the page where the user actually views the details about a given vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would put 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages:
- Does prevent vehicle details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).
Noindex disadvantages:
- Difficult to implement (vehicle details pages are served using Ajax, so they have no tag). The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex tag based on querystring variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindex pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) links on Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?).
- Does not require complex Apache stuff.
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt -- the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured like this.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
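As a side note on the X-Robots-Tag idea mentioned under the noindex option, a minimal sketch looks something like the following, assuming Apache 2.4+ with mod_headers enabled and assuming (purely for illustration) that vehicle details requests carry a made-up vehicle_id querystring parameter:

# Sketch only: send a noindex header on responses whose querystring contains
# the hypothetical vehicle_id parameter used by the Ajax detail requests.
<If "%{QUERY_STRING} =~ /(^|&)vehicle_id=/">
    Header set X-Robots-Tag "noindex, nofollow"
</If>

As noted above, this only works if the pages are not also blocked in robots.txt, since a blocked crawler never sees the header.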
-
Please help with creation of slideshare
Intermediate & Advanced SEO | BobAnderson
Just wondering how I would go about creating something like this: http://www.slideshare.net/coolstuff/the-brand-gap?from_search=1
-
301 redirect on Windows IIS. HELP!
Intermediate & Advanced SEO | Jeepster
Hi,
My six-year-old domain has always existed in four forms:
http://www.mydomain.com/index.html
http://mydomain.com/index.html
http://mydomain.com/
http://www.mydomain.com
My webmaster claims it's "impossible" to do a 301 redirect from the first three to the fourth. I need simple instructions to guide him. The site's hosted on Windows running IIS. Here's his rationale: These are all the same page, so they can't redirect to themselves. Index.html is the default page that loads automatically if you don't specify a page. If I put a redirect into index.html it would just run an infinite redirect loop. As you can see from the IIS setup, both www.mydomain.com and mydomain.com point to the same location (view image here). Both of these use index.html as the default document (view image 2 here).
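For what it's worth, this kind of canonical 301 is normally handled at the server level rather than inside index.html, so no loop is involved; below is a sketch of the usual approach, assuming IIS 7 or later with the URL Rewrite module installed (the domain is the placeholder from the question):

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 the bare domain to the www form -->
        <rule name="Canonical host name" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^mydomain\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.mydomain.com/{R:1}" redirectType="Permanent" />
        </rule>
        <!-- 301 explicit index.html requests to the folder root -->
        <rule name="Strip index.html" stopProcessing="true">
          <match url="^(.*/)?index\.html$" />
          <action type="Redirect" url="http://www.mydomain.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Because the rules only fire for the non-canonical forms, a request for http://www.mydomain.com/ passes straight through untouched, which is why there is no redirect loop.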
-
Adding Meta Language tag to xhtml site - coding help needed
Intermediate & Advanced SEO | mlm12
I've had my site dinged by Google and feel it's likely several quality issues, and I'm hunting down these issues. One of Bing's Webmaster SEO tools said my xhtml pages (which were built in 2007) are missing a meta language and suggested adding a tag in the head or on the html tag. Wanting to "not mess anything up" and validate correctly, I read on W3C's site and it said: "Always add a lang attribute to the html tag to set the default language of your page. If this is XHTML 1.x you should also use the xml:lang attribute (with the same value). Do not use the meta element with http-equiv set to Content-Language." My current html reads like: ...
QUESTION:
I'm confused about how to add the meta language to my website given my current coding, as I'm not a coder. Can you suggest whether I should add this content-language info, and if so, what is the best way to do so, considering valid W3C markup for my document type? Thank you!!!
Michelle
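For reference, the W3C advice quoted above corresponds to markup along these lines (a sketch only, assuming an XHTML 1.0 Strict document whose default language is English; substitute the real language code for "en"):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
  <head>
    <title>Example page</title>
    <!-- Per the quoted advice, no meta http-equiv="Content-Language" element is added -->
  </head>
  <body>
    <p>Page content in the site's default language.</p>
  </body>
</html>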
-
Rankings Nose Diving Help Needed
Intermediate & Advanced SEO | siteoptimized
Hey There SEO Community,
I am trying to help these people: http://goo.gl/B1smo They once ranked in the top 10 for "lifewave" and "lifewave patches" but have disappeared. Any idea why and what I can do to help? Thanks!
-
Need help identifying why my content-rich site was hurt by the Panda update
Intermediate & Advanced SEO | MikeATL
Hi - I run a hobby-related niche news/article/resource site (http://tinyurl.com/4eavaj4) which has been heavily impacted by the Panda update. Honestly, I have no idea why my Google rankings dropped off. I've hired two different SEO experts to look into it and no one has been able to figure it out. My link profile is totally white hat and stronger than the majority of my competitors, I have 4,000 or so pages of unique, high-quality content, am a Google News source, and publish about 5 new unique articles every day. I ended up deleting a hundred or so thin video pages on my site, did some URL reorganization (using 301s), and fixed all the broken links. That appeared to be helping, as my traffic was returning to normal. Then the bottom dropped out again. Since Saturday my daily traffic has dropped by 50%. I am really baffled at this point as to what to do, so any help would be sincerely appreciated.
Thanks,
Mike [email protected]