No description on Google/Yahoo/Bing, updated robots.txt - what is the turnaround time or next step for visible results?
-
Hello,
New to the Moz community and thrilled to be learning alongside all of you! One of our clients' sites is currently showing a 'blocked' meta description due to an old robots.txt file (e.g., "A description for this result is not available because of this site's robots.txt").
We have updated the site's robots.txt to allow all bots. The meta tag has also been updated in WordPress (via the Yoast SEO plugin).
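For reference, an allow-all robots.txt is only a couple of lines; a sketch of what the updated file would typically contain (the Sitemap line is optional but helps crawlers find the XML sitemap):

```text
# Allow all crawlers access to the entire site
User-agent: *
Disallow:

# Optional: point crawlers at the XML sitemap
Sitemap: http://www.altaspartners.com/sitemap_index.xml
```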
See image here of Google listing and site URL: http://imgur.com/46wajJw
I have also ensured that the most recent robots.txt has been submitted via Google Webmaster Tools.
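As a quick sanity check that the new rules really do allow crawling, Python's standard-library robots.txt parser can be run against the rules directly; a minimal sketch (the rules are parsed inline rather than fetched, so the example is self-contained, and the allow-all content mirrors the update described above):

```python
import urllib.robotparser

# Hypothetical allow-all rules, mirroring the updated robots.txt
rules = """\
User-agent: *
Disallow:
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# An empty Disallow means everything is crawlable, so Googlebot is allowed
print(parser.can_fetch("Googlebot", "http://www.altaspartners.com/"))  # prints: True
```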
When can we expect these results to update? Is there a step I may have overlooked?
Thank you,
Adam -
Great, the good news is that following submission of a sitemap via Webmaster Tools, things appear to be remedied on Google! It does seem, however, that the issue persists on Bing/Yahoo.
Some of the 404s are links from an old site that weren't carried over in my redesign, so those will be handled shortly as well.
I've submitted the sitemap via Bing Webmaster Tools, so I presume it's a similar matter of simply 'waiting on Bing'?
Many thanks for your valuable insight!
-
Hi There
It seems like there are some other issues tangled up in this.
- First off, it looks like some non-www URLs indexed in Google are 301 redirecting to www but then 404'ing. It's good that they redirect to www, but they should end up on active pages.
- The non-www homepage is the one showing the robots.txt message. This should hopefully resolve in a week or two, once Google re-crawls the non-www URL and sees the 301. The actual solution is getting the non-www URL out of the index and having Google rank the www homepage instead. The www homepage's description shows up just fine.
- You may want to register the non-www version of the domain in webmaster tools, and make sure to clean up any errors that pop up there as well.
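On an Apache/WordPress setup, the non-www to www redirect is usually handled in .htaccess; a minimal sketch, assuming Apache with mod_rewrite and substituting this site's domain (adjust for your actual host and protocol):

```apache
RewriteEngine On
# Permanently (301) redirect any non-www request to the www hostname
RewriteCond %{HTTP_HOST} ^altaspartners\.com$ [NC]
RewriteRule ^(.*)$ http://www.altaspartners.com/$1 [R=301,L]
```

The key is that the redirect target resolves to a live page (200), not a 404 - otherwise Google keeps the non-www URL in its index.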
-
I just got this figured out, let's try dropping this into Google!
-
The 404 error could stem from a common issue with Yoast sitemaps: http://kb.yoast.com/article/77-my-sitemap-index-is-giving-a-404-error-what-should-i-do
The first step is to reset the permalink structure; that may resolve the 404 error you're seeing. You definitely want to fix the sitemap 404 so you can submit a crawlable sitemap to Google.
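A quick way to confirm whether the sitemap 404 has been fixed is to request the sitemap URL and look at the status code; a small sketch in Python (the URL shown is the one from this thread):

```python
import urllib.error
import urllib.request

def sitemap_status(url):
    """Return the HTTP status code for a sitemap URL; 404 means it still needs fixing."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Example: check the sitemap before submitting it to Webmaster Tools
# print(sitemap_status("http://www.altaspartners.com/sitemap_index.xml"))
```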
-
Thanks! It would seem that the sitemap URL http://www.altaspartners.com/sitemap_index.xml brings up a 404 page, so I'm a bit confused by that step - but otherwise it appears to be very clear!
-
In WordPress, go to the Yoast plugin and locate the sitemap URL / settings. Plug the sitemap URL into your browser and make sure that it renders properly.
Once you have that exact URL, drop it into Google Webmaster Tools and let it process. Google will let you know if they found any errors that need correcting. Once submitted, you just need to wait for Google to update its index and reflect your site's meta description.
Yoast has a great blog that goes in depth about its sitemap features: https://yoast.com/xml-sitemap-in-the-wordpress-seo-plugin/
-
Sounds great, Ray - how would I go about checking these URLs for the Yoast sitemap?
-
Yoast sets up a pretty efficient sitemap. Make sure the sitemap URL settings are correct, load it up in the browser to confirm, and submit your sitemap through GWT - that will help get a new crawl of the site and hopefully an update to the index so your meta descriptions begin to show in the SERPs.
-
Hi Ray,
With Fetch as Googlebot, I see a redirection for the non-www and a correct fetch for the www. Using Yoast SEO, it would seem the sitemap link leads to a 404?
-
Ha, that's exactly what I did.
I'm not showing any restrictions in your robots.txt file and the meta tag is assigned appropriately.
Have you tried to fetch the site with the Webmaster Tools 'fetch as googlebot' tool? If there is an issue, it should be apparent there. Doing this may also help get your page re-crawled more quickly and the index updated.
If everything is as it should be and you're only waiting on a re-index, that usually takes no longer than two weeks (for very infrequently indexed websites). Fetching as Googlebot may speed things up, and getting an external link on a higher-trafficked page could help as well.
Have you tried resubmitting a sitemap through GWT as well? That could be another trick to getting the page re-crawled more quickly.
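Outside of Webmaster Tools, the same redirect behaviour can be spot-checked from a script by sending a request with a Googlebot User-Agent and not following redirects, so the 301 itself is visible; a rough sketch (the User-Agent string is Google's commonly published one, and this only approximates what the real crawler does):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the 301 itself is observable."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_status(url, user_agent=GOOGLEBOT_UA):
    """Return (status_code, Location header or None) for a single request."""
    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with opener.open(req) as resp:
            return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as err:
        # With redirects disabled, a 301/302 surfaces here as an HTTPError
        return err.code, err.headers.get("Location")
```

Checking both the non-www and www homepage this way shows whether the non-www 301 lands on a live page or dead-ends in a 404.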
-
Hello Ray,
Specifically, the firm name, which is spelled a-l-t-a-s p-a-r-t-n-e-r-s (it is easy to confuse with "Atlas Partners," which is another company altogether).
-
What was the exact search term you used to bring up those SERPs?
When I search 'altaspartners' and 'altaspartners.com', it brings up your site with a meta description.