Robots.txt question
-
I noticed something weird in the Google robots.txt Tester.
I have this line
Disallow: display=
in my robots.txt, but whatever URL I give it to test, it reports the URL as blocked and points to this line in robots.txt.
For example, this line is meant to block pages like
http://www.abc.com/lamps/floorlamps?display=table
but if I test
http://www.abc.com/lamps/floorlamps or any page
it shows as blocked due to Disallow: display=
Am I doing something wrong, or is Google just acting strange? I don't think pages without display= are actually blocked.
-
Yes, there is a bug in your robots.txt: a Disallow value needs to start with a /. Because display= has no leading slash, the tester's behaviour is unpredictable. To block every URL whose query string starts with a display= parameter, you should write something like:
Disallow: /*?display=
or, to block only that specific value:
Disallow: /*?display=table
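For context on why the leading slash matters: Google-style matching treats the Disallow value as a prefix of the URL path, with * as a wildcard and $ as an end anchor. A rough sketch of that matching logic (my own illustration, not Google's implementation):

```python
import re

def rule_matches(rule_path: str, url_path: str) -> bool:
    """Google-style robots.txt matching: the rule value is a prefix of
    the URL path, '*' matches any run of characters, '$' anchors the end."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The malformed rule never prefix-matches a path, since paths start with '/':
print(rule_matches("display=", "/lamps/floorlamps?display=table"))     # False
# A wildcard rule blocks any URL where display= immediately follows the '?':
print(rule_matches("/*?display=", "/lamps/floorlamps?display=table"))  # True
print(rule_matches("/*?display=", "/lamps/floorlamps"))                # False
```

Under plain prefix matching, a value with no leading slash should match nothing at all, which suggests the "everything blocked" result is the tester handling the malformed value in its own way rather than real-world blocking.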
Related Questions
-
What do you add to your robots.txt on your ecommerce sites?
We're looking at expanding our robots.txt; we currently don't have the ability to noindex/nofollow. We're thinking about adding the following: checkout, basket. Then possibly: price, theme, sortby and other misc filters. What do you include?
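For what it's worth, the additions listed might look something like this (illustrative only; the paths and parameter names would need to match the store's actual URL structure, and parameters that can appear in any position need broader wildcards):

```
User-agent: *
Disallow: /checkout/
Disallow: /basket/
Disallow: /*?price=
Disallow: /*?theme=
Disallow: /*?sortby=
```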
Intermediate & Advanced SEO | ThomasHarvey
-
Tricky 301 question
A friend has relaunched a website but his web guys (he didn't consult me!) didn't do any 301s, and now traffic has unsurprisingly tanked. The old site and database no longer exist and there are now 2,000+ 404s. Any ideas how to do the 301s from old URLs to new product URLs WITHOUT it being a massive manual job?
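One way to avoid a fully manual job: pull the 404'd URLs from the server logs or GWT's crawl-error report, export the new site's URLs from its sitemap, fuzzy-match each old path to the closest new one, and hand-review the result before generating the 301s. A rough sketch (hypothetical paths, similarity cutoff picked arbitrarily):

```python
import difflib

def build_redirect_map(old_paths, new_paths, cutoff=0.5):
    """Map each old (404'd) path to the most similar new path;
    anything below the similarity cutoff is left for manual review."""
    mapping = {}
    for old in old_paths:
        best = difflib.get_close_matches(old, new_paths, n=1, cutoff=cutoff)
        if best:
            mapping[old] = best[0]
    return mapping

old_urls = ["/products/blue-widget-123", "/about-us.html"]
new_urls = ["/shop/blue-widget", "/about", "/contact"]
print(build_redirect_map(old_urls, new_urls))
```

Anything that falls below the cutoff stays unmapped and gets reviewed by hand rather than mass-redirected to the homepage.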
Intermediate & Advanced SEO | AndyMacLean
-
Merging 4 websites into one for a new site release (301 question)
Hi guys and girls, I have a client that has 4 very outdated websites with about 50 pages on each. They are made up like: 1 brand group site and 3 for each individual key service they offer, so let's call them: brand.com (A), brand-service-1.com (B), brand-service-2.com (C), brand-service-3.com (D). We've rebuilt the main site and aggregated all the content from the others (99% re-written). Am I correct in thinking the process for the new launch would be:
1. Launch the new site on brand.com (A) and 301 all the old brand.com (A) pages to the related pages on the new site.
2. Redirect the other websites (B, C, D) at the domain level to the new site on the brand.com (A) domain.
3. Clean up the old URLs, sitemaps and errors in Google WMT.
Is this right? Anything I missed / better practices? I was also wondering if I should redirect B, C and D in stages, or use page-level redirects.
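For step 2, a domain-level 301 is usually a couple of lines of server config on each old domain. One common way, assuming the old sites run on Apache (a sketch using the question's hypothetical domains), is to send every path on the old domain to the same path on the new one:

```
# .htaccess on brand-service-1.com (repeat for C and D)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?brand-service-1\.com$ [NC]
RewriteRule ^(.*)$ http://brand.com/$1 [R=301,L]
```

Whether to preserve paths like this or point old pages at the closest relevant new page is a judgment call; page-level redirects pass more relevance when the content was merged rather than moved.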
Intermediate & Advanced SEO | shloy23-294584
-
Question regarding geo-targeting in Google Webmaster Tools.
I understand that it's possible to target both domains/subdomains and subfolders to different geographical regions in GWT. However, I was wondering about the effect of targeting the domain to a single country, say the UK, then targeting subfolders to other regions (say the US and France), e.g.:
www.domain.com -> UK
www.domain.com/us -> US
www.domain.com/fr -> France
etc. Would it be better to leave the main domain without a geographical target but set geo-targeting for the subfolders? Or would it be best to set geo-targeting for both the domain and subfolders?
Intermediate & Advanced SEO | TranslateMediaLtd
-
High level rel=canonical conceptual question
Hi community. Your advice and perspective are greatly appreciated. We are doing a site replatform and I fear that serious SEO fundamentals were overlooked, and I am not getting straight answers to a simple question: how are we communicating to search engines the single URL we want indexed? Backstory: the current site has major duplicate content issues. Rel=canonical is not used. There are currently 2 versions of every category and product detail page, and both are indexed in certain instances. A 60-page audit recommends rel=canonical at least 10 times for the similar situations an ecommerce site has with duplicate URLs/content. New site: we are rolling out 2 URLs AGAIN!!! URL A is an internal URL generated by the system. We have developed a fancy dynamic sitemap generator which looks/maps to URL A and creates an SEO-optimized URL that I call URL B. URL B is then inserted into the sitemap, and the sitemap is communicated externally to Google. URL B does an internal 301 redirect back to URL A... so in essence, the URL a customer sees is not the same as what we want Google to see. I still think there is potential for duplicate indexing. What do you think? Is rel=canonical the answer? From my research on this site, past projects and Google, I think the correct solution on each customer-facing category and PDP is this: the head section (with the optimized meta title and meta description) needs to have the rel=canonical pointing to URL B.
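A minimal sketch of the head section being described, assuming a hypothetical example.com URL B (not the poster's actual markup):

```html
<head>
  <title>Optimized Meta Title</title>
  <meta name="description" content="Optimized meta description.">
  <!-- rel=canonical on URL A points at the customer-facing URL B -->
  <link rel="canonical" href="http://www.example.com/url-b">
</head>
```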
Example of the meta area of URL A: What do you think? I am open to all ideas and I can provide more details if needed.
Intermediate & Advanced SEO | mm916157
-
Issue with Robots.txt file blocking meta description
Hi, can you please tell me why the following error is showing up in the SERPs for a website that was just re-launched 7 days ago with new pages (301 redirects are built in)? "A description for this result is not available because of this site's robots.txt - learn more." Once we noticed it yesterday, we made some changes to the file and reduced the number of items in the disallow list. Here is the current robots.txt file:
# XML Sitemap & Google News Feeds version 4.2 - http://status301.net/wordpress-plugins/xml-sitemap-feed/
Sitemap: http://www.website.com/sitemap.xml
Sitemap: http://www.website.com/sitemap-news.xml
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Other notes... the site was developed in WordPress and uses the following plugins: WooCommerce, All-in-One SEO Pack, Google Analytics for WordPress, XML Sitemap & Google News Feeds. Currently, in the SERPs, it keeps jumping back and forth between showing the meta description for the www domain and showing the error message (above). Originally, WP Super Cache was installed; it has since been deactivated, removed from wp-config.php and deleted permanently. One other thing to note: we noticed yesterday that there was an old XML sitemap still on file, which we have since removed, and we resubmitted a new one via WMT. Also, the old pages are still showing up in the SERPs. Could it just be that it will take time to review the new sitemap and re-index the new site? If so, what kind of timeframes are you seeing these days for new pages to show up in SERPs? Days, weeks? Thanks, Erin
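As a sanity check, Disallow rules for /wp-admin/ and /wp-includes/ alone shouldn't block ordinary pages. A quick sketch with Python's standard-library parser (URLs are the placeholder ones from the post):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /wp-includes/",
])

# An ordinary content page is not blocked by these rules:
print(rp.can_fetch("Googlebot", "http://www.website.com/sample-page/"))        # True
# The admin area is blocked for all user agents:
print(rp.can_fetch("Googlebot", "http://www.website.com/wp-admin/admin.php"))  # False
```

If the live file matches the one quoted, the lingering "blocked" descriptions are more likely stale snippets from the old file waiting to be re-crawled than an active block.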
Intermediate & Advanced SEO | HiddenPeak
-
Multiple Keyword Research Questions, Help
Hello, I've been trying for several days to understand how keyword research works for a multi-purpose website. I've read guides, articles, even some chapters from the book "The Art of SEO" by O'Reilly, and still no luck; it seems I can't wrap my head around keyword research. Let's say I have a social gaming community website and I'm trying to rank it first on some low-competition keywords + some long-tail keywords. The website has functions like leaderboards, profiles, events, competitions, etc., so it's not actually a news-related website, but it will have a blog. My website being in the games niche would imply that I should target words that contain "games", but this word generates millions of searches globally, so ranking first is nearly impossible if the website is brand new. This made me pursue generic keywords formed of 2/3 words, like fresh games, new games, mmorpg games, fps games, etc., which still generate, let's say, 30,000 searches globally each. Due to the different areas of the website, like latest game events, latest games competitions, etc., I'm confused if I should pursue website-specific keywords like latest games events, fresh games events, latest games competitions, upcoming games competitions, but these too generate 30,000 global searches each, so... 0. Should I use generic keywords or keywords that include site features? So let's say I decide to pursue generic "games" keywords. Due to high competition on a keyword, I go a layer deeper, and for "fresh games" I obtain keywords like "fresh games 2011, top fresh games 2011, upcoming fresh games", thus building a list of 30 keywords that contain "fresh games". If I do this for the rest of the keywords (new games, mmorpg games, fps games, etc.) I end up with a list of 10,000 keywords or more, since each keyword generates other keywords. Is this the correct approach? Generating 10,000 keywords sounds like a lot, and I'm getting the feeling that it's not how it's supposed to be done; where would I even insert 10,000 keywords? So how do I know which keywords to pick and aim for in order to try to get the no. 1 ranking, and why those? How many keywords should I use, and where should I put them, given that it's not a news website, so writing a lot of articles isn't an option? Should I focus on 2-word keywords with around 10,000-30,000 searches, or 2-word keywords + long-tail keywords with less traffic, like 100-5,000? Is there a guide for the Keyword Analysis Tool? If I enter "fresh new games" I get a 39% keyword difficulty; is that hard to rank for? I don't know what all those colors mean, since some of them have higher numbers than others found at the top, and how can I beat a website that has rank 10? So hopefully, with your help and by some miracle, I will finally be able to build a keyword list. Thank you!
Intermediate & Advanced SEO | arching
-
SEO question
Hi, I changed my page titles for a competitive keyword last week and noticed the page has dropped 9 search engine ranking positions: it was ranking 37 and is now 46. Would you guys leave it and see if it starts creeping back up, or change it again? The page title format I used across my pages was, for example: Primary keyword | Secondary keyword | Heading on page. Thanks for your help.
Intermediate & Advanced SEO | wazza1985