How can I remove parameters from the GSC URL blocking tool?
-
Hello Mozzers
My client's previous SEO company blindly blocked a number of parameters using the GSC URL blocking tool. Google has now stopped crawling many pages on my client's website, and I am not sure how to remove these blocked parameters so the pages can be crawled and reindexed by Google.
The crawl setting is set to "Let Googlebot decide," but there has still been a drop in the number of pages being crawled. Can someone please share their experience and help me delete these blocked parameters from GSC's URL blocking tool?
Thank you Mozzers!
-
Hi Vincent,
My short answer is: don't let Googlebot decide. Tell Googlebot explicitly which parameters should and should not create new pages. That is what you should do whenever you run into indexation problems with parameters.
Do a site: search for a handful of these URLs with parameters to double-check that the drop in crawled pages is actually caused by these pages and not by something else. If it is, you can quickly get them back into the index with the "Fetch as Googlebot" tool: once you have Google fetch a URL, you have the option of submitting it to the index.
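If you are spot-checking more than a few URLs, it can help to generate the site: queries programmatically and paste them into a browser. A minimal sketch, assuming a hypothetical helper name and example.com URLs (there is no official API for this; it just builds search URLs):

```python
# Hypothetical helper: turn a handful of parameter URLs into Google
# searches using the site: operator, for manual indexation spot-checks.
from urllib.parse import quote

def spot_check_queries(urls):
    """Build one Google search URL per page, using the site: operator."""
    return [
        # safe="" percent-encodes every reserved character in the query.
        "https://www.google.com/search?q=" + quote(f"site:{u}", safe="")
        for u in urls
    ]
```

Open each resulting URL in a browser; a page that returns no result is a candidate for re-fetching and resubmission.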
(If it turns out the drop in crawled pages is from something else, a good way to figure out which pages are being affected is by creating multiple XML sitemaps and organizing them by site section, so when Google reports on how many of your URLs are in its index, you quickly know which section of the site is being affected. This post is really old, but still incredibly useful here.)
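The per-section sitemap idea above can be sketched in a few lines. This is an illustrative sketch only: the function name, the rule of grouping by the first path segment, and the example.com base URL are all assumptions, not part of any tool's API.

```python
# Hypothetical sketch: split a URL list into per-section XML sitemaps
# plus a sitemap index, so GSC's per-sitemap indexed counts reveal
# which section of the site is losing pages.
from collections import defaultdict
from urllib.parse import urlparse
from xml.sax.saxutils import escape

def build_section_sitemaps(urls, base="https://www.example.com"):
    """Group URLs by their first path segment; emit one sitemap per group."""
    sections = defaultdict(list)
    for url in urls:
        path = urlparse(url).path.strip("/")
        section = path.split("/")[0] if path else "root"
        sections[section].append(url)

    sitemaps = {}
    for section, section_urls in sections.items():
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in section_urls
        )
        sitemaps[f"sitemap-{section}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )

    # Index file referencing each per-section sitemap.
    index_entries = "\n".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>"
        for name in sorted(sitemaps)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n</sitemapindex>"
    )
    return sitemaps, index
```

Submit the index file in GSC and each child sitemap reports its own indexed count, which is what makes the affected section easy to spot.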
Also double-check that these URLs with parameters are in the XML sitemap, and that a number of internal links on prominent pages point to them. Even if those links can only be temporary, they will really help the process.
Hope this helps!
Kristina