Partial Match or RegEx in Search Console's URL Parameters Tool?
-
So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them.
Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789=
All the indexed URLs follow that same kind of format, but I only want to index the URLs where par1 starts with ABC (that could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, maybe using regex?
Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
-
No problem
Hope you get it sorted!
-Andy
-
Thank you!
-
Haha, I think the train passed the station on that one. I would have realised eventually... XD
Thanks for your help!
-
Don't forget that . and ? have a specific meaning within regex - if you want to match them literally you will have to escape them. Also be aware that not all bots are capable of interpreting these patterns in robots.txt - you might want to be more explicit about the user agent, only using the patterns for Googlebot:
User-agent: Googlebot
# disallowing page.php and any parameters after it
Disallow: /page.php
# but leaving anything that starts with par1=ABC
Allow: /page.php?par1=ABC
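If you want to sanity-check how Allow and Disallow rules like these interact before going live, Google's precedence is roughly "longest matching pattern wins". Here is a minimal Python simulation of that logic - the paths and rule set are just illustrative, and real Googlebot matching has more edge cases:

```python
# Rough simulation of Google's robots.txt rule precedence: among all
# matching Allow/Disallow patterns, the longest pattern wins.
# Example rules and paths are hypothetical.
import re

rules = [
    ("disallow", "/page.php"),
    ("allow", "/page.php?par1=ABC"),
]

def rule_to_regex(pattern):
    # Google treats * as a wildcard and a trailing $ as an end anchor;
    # every other character (including . and ?) is matched literally.
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return regex

def is_allowed(path):
    best = ("allow", "")  # no matching rule means the URL is allowed
    for kind, pattern in rules:
        if re.match(rule_to_regex(pattern), path) and len(pattern) > len(best[1]):
            best = (kind, pattern)
    return best[0] == "allow"

print(is_allowed("/page.php?par1=ABC123&par2=DEF456"))  # True
print(is_allowed("/page.php?par1=XYZ999"))              # False
```

The longer Allow pattern outranks the shorter Disallow for the ABC URLs, which is why the exception works at all.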
Dirk
-
Ah sorry I missed that bit!
-Andy
-
Disallowing them would be my first priority really, before removing from index.
The trouble with this is that if you disallow first, Google won't be able to crawl the pages to act on the noindex. If you add a noindex tag first, Google will drop them from the index the next time it comes a-crawling, and then you will be good to disallow.
I'm not actually sure of the best way for you to get the noindex in to the page header of those pages though.
-Andy
-
Yep, have done. (Briefly mentioned in my previous response.) Doesn't pass
-
I thought so too, but according to Google the trailing wildcard is completely unnecessary, and only needs to be used mid-URL.
-
Hi Andy,
Disallowing them would be my first priority really, before removing from index. Didn't want to remove them before I've blocked Google from crawling them in case they get added back again next time Google comes a-crawling, as has happened before when I've simply removed a URL here and there. Does that make sense or am I getting myself mixed up here?
My other hack of a solution would be to check the URL in page.php, and if the URL doesn't include par1=ABC then insert a noindex meta tag. (Not sure if that would work well or not...)
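The check behind that hack is simple enough. A sketch of the conditional-noindex logic, shown in Python for illustration - page.php would do the equivalent server-side, and the example URLs are made up:

```python
# Flag any page.php URL whose par1 value does not start with ABC, so
# those pages can emit a <meta name="robots" content="noindex"> tag.
# This mirrors what the PHP check in page.php would do server-side.
from urllib.parse import urlparse, parse_qs

def needs_noindex(url):
    query = parse_qs(urlparse(url).query)
    par1 = query.get("par1", [""])[0]
    return not par1.startswith("ABC")

print(needs_noindex("http://www.example.com/page.php?par1=ABC123&par2=DEF456"))  # False
print(needs_noindex("http://www.example.com/page.php?par1=XYZ999"))              # True
```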
-
My guess would be that this line needs an * at the end.
Allow: /page.php?par1=ABC*
-
Sorry Martijn, just to jump in here for a second - Ria, you can test this via the robots.txt Tester in Search Console before going live, to make sure it works.
-Andy
-
Hi Martijn, thanks for your response!
I'm currently looking at something like this...
user-agent: *
#disallowing page.php and any parameters after it
disallow: /page.php
#but leaving anything that starts with par1=ABC
allow: /page.php?par1=ABC
I would have thought that you could disallow things broadly like that and give an exception, as you can with files in disallowed folders. But it's not passing Google's robots.txt Tester.
One thing that's probably worth mentioning is that there are only two values of the par1 parameter that I want to allow. For example's sake, ABC123 and ABC456. So it would need to be either a partial match or a "this or that" kind of deal, disallowing everything else.
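If those really are the only two values, one way round the partial-match problem entirely (just a sketch, assuming the value list stays fixed) would be an explicit Allow line per value rather than any pattern matching:

User-agent: Googlebot
Disallow: /page.php
Allow: /page.php?par1=ABC123
Allow: /page.php?par1=ABC456

Less elegant than a wildcard, but there's nothing for the robots.txt Tester to misinterpret.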
-
Hi Ria,
I have never tried regular expressions in this way, so I can't tell you if this would work or not.
However, if all 1000 of these URLs are already indexed, just disallowing access won't remove them from Google. Ideally you would place a noindex tag on those pages and let Google act on it; then you will be good to disallow. I am pretty sure there is no option to noindex under the URL Parameters tool.
I hope that makes sense?
-Andy
-
Hi Ria,
What you could do - though it also depends on the rest of your structure - is disallow these URLs based on the parameters. In a worst-case scenario you could disallow all of these URLs and then add an Allow exception as well, to make sure you still have the right URLs being indexed.
Martijn.