Should we block URLs like domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 in robots.txt?
-
I've recently added a campaign in the SEOmoz interface and received an alarming number of errors (~9,000) on our eCommerce website. The site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content / duplicate titles caused by URLs like domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4.
Is this hurting us in the search engines? Is rogerbot too good?
What can we do to cut off bots after the ".html?" part of the URL? Any help would be much appreciated.
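For the "cut off bots after the .html?" part, one hedged sketch (a generic suggestion, not Magento-specific advice): the major crawlers honor wildcard patterns in robots.txt, so a single rule can block any URL that carries a query string. Note that the wildcards are a widely supported extension rather than part of the original robots.txt standard, and disallowing crawling does not remove URLs that are already indexed.

```text
# Sketch: block crawling of any URL containing a query string.
# The * and ? here are pattern characters honored by Googlebot and Bingbot.
User-agent: *
Disallow: /*?
```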
-
I had the same problem on http://www.tokenrock.com because I was doing a lot of URL rewriting; it's a CMS I wrote myself, but the same issue applies. I went from 7,000+ errors according to SEOmoz down to about 700. Here are a few things I did:
Use canonicals on everything you possibly can.
301 redirect the items in the SERPs that are identical.
I'm not familiar enough with Magento to help you work through that side of it.
A link like domainname/leather-chairs-244-16-price-1.html would work much better.
The URLs you have listed show up because somewhere on the site there is a link to each of them.
Unfortunately, some CMSs are written by developers who don't fully understand SEO and why the ? is a bad thing.
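To illustrate the canonical suggestion above (a generic sketch; in Magento this tag is typically enabled via a configuration setting rather than written by hand): each filtered or sorted variant of a page declares the one preferred, parameter-free URL.

```html
<!-- On every parameterized variant of the category page, -->
<!-- declare the parameter-free URL as the canonical one. -->
<link rel="canonical" href="http://domainname/shop/leather-chairs.html" />
```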
Related Questions
-
Will URLs With Existing 301 Redirects Be as Powerful as New URLs in SERPs?
Most products on our site have redirects pointing to them from years of switching platforms and trying to get a great, optimised URL for SEO purposes. My question is this: if a product URL has a lot of 301 redirects pointing at it, would it be more beneficial to create a duplicate version of the product and start fresh with a new URL? I am not on here trying to gain backlinks, but my site is tn nursery dot net.
I need some quality help figuring out what to do.
Intermediate & Advanced SEO | tammysons
Tammy
-
Mass URL changes and redirecting the old URLs to the new: what is the SEO risk, and what are best practices?
Hello good people of the Moz community, I am looking to do a mass edit of URLs on content pages within our sites. These were initially set up a few years ago with the date in the URL to keep each one unique, which can now make evergreen content seem dated. The new URLs would follow a better folder-path naming convention and would be much better URLs overall. Some examples of the old URLs:
https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
The new URLs would look like this, which would be a great improvement:
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
My worry is that we rank fairly well organically for some of this content, and I don't want to anger the Google machine. The process would be to edit the URLs to the new layout, then set up the 301 redirects and push live. Is there a great SEO risk in doing this?
Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do one URL at a time in the Webmaster Tools backend.
Is there anything else I am missing? I believe this change would be good in the long run, but I do not want to take a huge hit initially by doing something incorrectly. This would be done on anywhere from five to a couple hundred links across the various sites I manage. Thanks in advance,
Intermediate & Advanced SEO | kirin44355
Chris Gorski
-
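An editorial aside on the question above: the old-to-new mapping is regular enough that the redirect list could be generated programmatically rather than by hand. A minimal sketch, assuming the date always appears as M-D-YYYY immediately before ",default,pg.html" (which matches all the examples given):

```python
# Hypothetical sketch: derive the new /Learn/ URL from an old dated URL.
# The date-stripping pattern is an assumption based on the examples above.
import re

OLD_PATTERN = re.compile(
    r"https://www\.inlineskates\.com/([^/]+)/"
    r"buying-guide-\d{1,2}-\d{1,2}-\d{4},default,pg\.html"
)

def old_to_new(url: str) -> str:
    """Map an old dated buying-guide URL to the new /Learn/ path."""
    m = OLD_PATTERN.match(url)
    if not m:
        return url  # leave unrecognized URLs untouched
    return f"https://www.inlineskates.com/Learn/{m.group(1)},default,pg.html"

old = "https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html"
print(old_to_new(old))
# -> https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
```

A loop over the full URL list could then emit one 301 rule per page for the server config.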
AMP - The what, why and how of it
Hi there, our company has never implemented AMP on the sites that we do SEO for. We want to start doing that now. I know the basics of AMP and its requirements; however, I need to know a lot more about it from an SEO perspective before I actually get developers onto it. I want to know: 1) the risks involved, 2) the best ways to implement it (plugin, etc.), and 3) why it is worth it. I also want to know how to tell whether a developer knows what he is talking about and will really do it the right way without messing me up. Are there any specific questions I should ask, or information he should be aware of? Also, is there a way for it to be done for pages that have more than just text, like quote forms, sliding headers, etc.? Should we only do it for the blog section of our site? I would greatly appreciate any links to additional resources on this topic (not why to use AMP, but everything else), and I greatly appreciate you taking the time to answer my question.
Intermediate & Advanced SEO | Ruchy
-
6 .htaccess Rewrites: Remove index.html, Remove .html, Force non-www, Force Trailing Slash
I have to give some information about my website environment:
1. I have static webpages in the root.
2. Wordpress is installed in the sub-directory www.domain.com/blog/
3. I have two .htaccess files, one in the root and one in the Wordpress folder.
I want to: redirect www to non-www on all URLs; remove index.html from the URL; remove all .html extensions (301 redirect to the URL without .html); add a trailing slash to the static webpages (301 redirect from the non-trailing-slash version); and force a trailing slash on the Wordpress pages (301 redirect from the non-trailing-slash version). Some examples:
domain.tld/index.html >> domain.tld/
domain.tld/file.html >> domain.tld/file/
domain.tld/file.html/ >> domain.tld/file/
domain.tld/wordpress/post-name >> domain.tld/wordpress/post-name/
My code in the ROOT .htaccess is:
<IfModule mod_rewrite.c>
Options +FollowSymLinks -MultiViews
RewriteEngine On
RewriteBase /
# remove the trailing slash (only for requests that are not directories)
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ $1 [R=301,L]
# www to non-www
RewriteCond %{HTTP_HOST} ^www\.(([a-z0-9_]+\.)?domain\.com)$ [NC]
RewriteRule .? http://%1%{REQUEST_URI} [R=301,L]
# internally map extension-less requests to the .html file
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.html [NC,L]
# redirect /index.html to the root
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://domain.com/ [R=301,L]
# strip the .html extension from external requests
RewriteCond %{THE_REQUEST} \.html
RewriteRule ^(.*)\.html$ /$1 [R=301,L]
</IfModule>
The above code does:
1. redirect www to non-www
2. remove the trailing slash at the end (if it exists)
3. remove index.html
4. remove all .html extensions
5. redirect 301 to the filename, but doesn't add a trailing slash at the end
Intermediate & Advanced SEO | NeatIT
-
Outside the Top 10 Even Though We Have Higher Domain/Page Authority and a Higher On-Page Grade
Hi, note: this is for Australian search results, for people in Perth. The website is http://thedj.com.au. I am trying to optimise for the keyword 'perth wedding dj', but also 'wedding dj perth', and for some reason my website isn't even in the top 10 results. Here is what's weird, though:
My on-page grade with the On-Page Grader for the keyword 'wedding DJ perth' is an 'A' for http://thedj.com.au (http://awesomescreenshot.com/0135135hca).
When checking the Keyword Difficulty in the Google Australia search engine for 'wedding DJ perth', there are 4 results which have a domain authority lower than 15 (in fact, one result has a domain authority of 1) - http://awesomescreenshot.com/03f5134zd1
http://thedj.com.au has a Domain Authority of 23/100 and a Page Authority of 34/100 (http://awesomescreenshot.com/0bb5134tb8).
So seeing as the page got an A for on-page optimisation for the keyword 'wedding DJ Perth' and has a higher domain authority than many results in the top 10... why isn't it in the top 10?
Bonus question: why is DJ Avi showing up at the top of search results (local listing) despite the fact that:
a) He has no website to link to
b) There are no reviews for his listing
c) There are no keywords that I can see (other than the fact that he's a DJ)
Screenshot: http://awesomescreenshot.com/05151349cb Meanwhile our Local Places - Thanks,
Kosta
http://www.headstudios.com.au
Intermediate & Advanced SEO | HeadStud
-
Redirecting site from html/php to wordpress
I've never come across this and haven't been able to find anything that explains it very well, so I want to get opinions before we make a definitive decision. Here's the scenario... I am working on a site that was built in HTML/PHP, and some of the pages are ranking pretty well (some on page 1, but not number 1). We are going to start using the Wordpress platform by year's end. The pages that were built in HTML are a little spammy, but they still rank; I think they are keyword-stuffed and not very reader-friendly (I think the last person was spinning content). So we've built completely new content for the new pages, and we've commissioned really good content writers for them. I will be handling the on-page SEO going forward, so I know what to do there. My questions are these: Should I 301 the old pages to the new pages with the better content? (The old pages have .html or .php extensions, so www.example.com/keyword.php will become www.example.com/keyword-keyword.) Is there any negative side to doing this, since the content will be completely different from the old pages being 301ed? (Keywords are pretty much staying the same, with the exception of minor variations, e.g. www.example.com/red-cashmere-sweater.php to www.example.com/cashmere-sweater.) I ask this because I've moved sites before where I've just changed the location of the same content; I've never done it where the content is changing and so is the URL extension. Thank you in advance for your help and guidance.
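The extension-to-slug redirects described above could be sketched with mod_alias rules in the root .htaccess; a hedged sketch, using the question's own illustrative paths rather than real URLs:

```text
# Sketch (Apache mod_alias): 301 old .php pages to the new extension-less slugs.
# Paths are the illustrative examples from the question, not real URLs.
Redirect 301 /keyword.php /keyword-keyword
Redirect 301 /red-cashmere-sweater.php /cashmere-sweater
```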
Intermediate & Advanced SEO | DarinPirkey
-
Anchor Text Diversification – Branded vs. Non-Branded – What is the best approach... if any?
Our organization competes in the drug & alcohol treatment category... very competitively, I must say. While we create content for long-tail keywords, we focus on link building (blogging + press releases + acquisition, etc.) as the main strategy to increase relevancy for 4 major keywords (alcohol rehab, drug rehab, alcohol treatment, and drug treatment). All these terms have their respective landing pages, and we try to provide a good flow of new links coming to these pages on a weekly basis. Lately we have been acquiring more links than we anticipated... not a bad thing, since they are from reputable websites; however, I am a bit concerned about the anchor text distribution of these links. Example: let's say I get 100 links to my 'Alcohol Rehab' page - what is an appropriate percentage for the anchor text distribution? For example:
Branded links: 20 - keyword: St Jude Retreats
Exact-match links: 70 - keyword: Alcohol Rehab
Broad links: 10 - keyword: Rehab
Is this an OK distribution, or should I change things around? Hope you guys can help! Thanks!
Intermediate & Advanced SEO | dhidalgo1
-
Blocking Pages Via Robots.txt: Can Images On Those Pages Be Included In Image Search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo, but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image Search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, so that googlebot-image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while getting the images picked up... is this possible? Thanks!
Leona
Intermediate & Advanced SEO | HD_Leona
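A hedged sketch of the directives in question: Google documents that each of its crawlers obeys the most specific user-agent group that matches it, so with a dedicated group present, Googlebot-Image would follow its own rules instead of the generic Googlebot group (this group-selection behavior is Google-specific and may differ for other search engines).

```text
# Keep the low-quality photo pages out of the main web crawl,
# while still letting Google's image crawler fetch the photos.
User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Allow: /community/photos/
```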