Search Console validation taking a long time?
-
Hello! I did something dumb back in the beginning of September. I updated Yoast and somehow noindexed a whole set of custom taxonomies on my site. I fixed this and then asked Google to validate the fixes on September 20. Since then, they have gotten through only 5 of the 64 URLs... is this normal? Just want to make sure I'm not missing something that I should be doing.
Thank you! ^_^
-
You're welcome.
We as a community are here to help. If your issue is now fixed, you could mark this question as answered.
Best of luck.
GR -
Cool! Thanks Gaston! I'm glad I asked about this! ^_^
-
What sometimes happens is that when some URLs are marked as noindex, Googlebot reduces its crawling frequency for them, as it interprets that you really don't want those pages indexed and that they hold little value.
What you just did is tell Googlebot to crawl that specific page, "forcing" it to analyze and render it. So Googlebot now understands that the noindex is no longer set and that the page should be indexed.
I'd wait a few days so that Googlebot naturally crawls your whole site again and eventually indexes every page that deserves to be indexed. If that doesn't happen within about two weeks, there is a tool in the old Search Console where you can tell Googlebot to crawl a single page and its links. It's under Crawl -> Fetch as Google. Request a URL to be fetched; after a few minutes a _Request indexing_ button will appear, and there you'll have the option to "Crawl this URL and its direct links". This image might come in handy: https://imgur.com/a/y5DbUVw
I'm glad it helped previously, and I hope this last tip helps you even more.
Best of luck.
GR -
Whoooooooaaaaahhhhhh! That fixed it! What's the deal!? lol. Why is this method instantaneous while the other method Google pointed me to is taking months? ...Do I have to do this with each individual URL?
-
....or maybe that's what it found the last time it was crawled? I clicked the "request indexing" button.....we'll see what happens.
-
Hmmm. It says:
Indexing allowed? No: 'noindex' detected in 'robots' meta tag. ...But I have the settings in Yoast set up to allow indexing... do you think maybe changes in Yoast settings aren't applying retroactively?
-
Sorry to hear that.
It's possible that Googlebot still hasn't noticed that you've changed the noindex tag.
Would you mind checking what the _Inspect URL_ tool reports?
To find it, go to the new version of Search Console and enter one of the URLs that should be indexed in the textbox.
Then click on the "Test live URL" button. This image could be helpful: https://imgur.com/a/CPvfwif
There you might get a hint of what is going on.
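If you want to double-check outside Search Console what a plain HTTP client actually sees, a small sketch like this can pull the robots directives out of a page's HTML. The sample HTML and the function name here are just for illustration; in practice you'd fetch one of your taxonomy URLs and also look at the X-Robots-Tag response header, which can carry noindex independently of the HTML:

```python
import re

def robots_directives(html, x_robots_header=""):
    """Return all robots directives found in meta tags plus the header."""
    metas = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html,
        flags=re.IGNORECASE,
    )
    directives = [d.strip().lower() for m in metas for d in m.split(",")]
    if x_robots_header:
        directives += [d.strip().lower() for d in x_robots_header.split(",")]
    return directives

# A page Yoast has noindexed would typically contain a tag like this:
sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(robots_directives(sample))  # -> ['noindex', 'follow']
print("noindex" in robots_directives(sample))  # -> True
```

If "noindex" still shows up when you fetch the live page, the Yoast change hasn't actually reached the served HTML (a caching plugin is a common culprit); if it's gone, you're just waiting on Googlebot to recrawl.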
-
They're in Google Search Console, but I have tried searching for a couple of them and they don't appear to be indexed :-(. I tried the method you suggested and that didn't bring up anything either.
-
Hi Angela,
Those 5 out of 64 URLs... is that a report in Search Console, or do only 5 URLs appear when searching in Google?
Search Console usually takes a little longer to update its index status reports. Have you tried a site: search? Also try the _inurl:_ operator.
For example: site:domain.com inurl:/category-noindexed/
Hope it helps.
Best of luck.
GR