What is your experience so far with Google's new meta description length of up to 320 characters?
-
I updated a few home pages and some landing pages, and so far so good!
I'd like to hear about other people's experiences before continuing to update, though. Thanks for your comments!
-
Hi Brooks,
Thanks for the input. It is great to know that it is also working in your ecosystem.
-
Agreed, and sometimes there can be power in a one-word title - both for optimization and communication.
-
Agreed.
I don't have any data to prove its usefulness, but there's something really nice and satisfying about a solid, short, and effective meta description.
-
"really just didn't need to be any longer than 160ish characters"
Thanks!
That's a very interesting thought, especially if you can write effective "short and punchy" descriptions. If you ramble on too long, it stinks it up.
-
My agency just launched a new website and in the process updated many of our meta descriptions. In optimizing them, though, we realized some really just didn't need to be any longer than 160ish characters.
However, for other pages, such as our services pages, the additional characters gave us a chance to introduce the page, detail what the user can find, and sort of "preview" the call to action.
We've already seen a little increase in CTR for some of our services pages.
Best of luck!
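For anyone running a similar audit before deciding which descriptions are worth expanding, here's a minimal sketch of the length check. It assumes the requests library; the URLs are placeholders, and the regex is a simplification (a real parser would handle attribute order and quoting more robustly):

```python
import re
import requests

# Placeholder URLs -- swap in your own pages to audit.
URLS = [
    "https://example.com/",
    "https://example.com/services/",
]

# Naive pattern: assumes name comes before content in the tag.
META_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

for url in URLS:
    html = requests.get(url, timeout=10).text
    match = META_RE.search(html)
    if not match:
        print(f"{url}: no meta description found")
        continue
    length = len(match.group(1).strip())
    if length <= 160:
        verdict = "fits the old ~160 character display"
    elif length <= 320:
        verdict = "uses the expanded ~320 character space"
    else:
        verdict = "likely to be truncated in the SERP"
    print(f"{url}: {length} chars ({verdict})")
```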
-
Hello William,
Thanks for the input.
I noticed that sometimes Google seems to choose text at random. So far I can't find a pattern: sometimes it pulls from the first paragraph, other times from the middle of the page. However, for pages that are already SEO optimized, meaning with an adequate page title, meta description, and h1, it is showing the written meta description.
Best regards
-
Hello,
I am in the process of updating meta tags and top-of-page content to try to get relevant text and description tags to show in Google listings. On some of my pages Google uses my meta "Description" tags, and on many others it uses the content from the top of the page. I am not sure how or why one gets used over the other, so I am working on both the tags and the top-of-page content.
Best Regards
Related Questions
-
Syndicated content with meta robots 'noindex, nofollow': safe?
Hello, I manage, with a dedicated team, the development of a big news portal with thousands of unique articles. To expand our audiences, we syndicate content to a number of partner websites. They can publish some of our articles, as long as (1) they put a rel=canonical in their duplicated article, pointing to our original article, OR (2) they put a meta robots 'noindex, follow' in their duplicated article plus a dofollow link to our original article. A new prospect looking to partner with us wants to follow a different path: republish the articles with a meta robots 'noindex, nofollow' in each duplicated article plus a dofollow link to our original article. This is because he doesn't want to pass pagerank/link authority to our website (as it is not explicitly included in the contract). In terms of visibility we'd have some advantages with this partnership (even without link authority to our site), so I would accept. My question is: considering that the partner website is much more authoritative than ours, could this approach damage the ranking of our articles in some way? I know that the duplicated articles published on the partner website wouldn't be indexed (because of the meta robots noindex, nofollow), but Google's crawler could still reach them. And since they have no rel=canonical and the link to our original article wouldn't be followed, I don't know if this may cause confusion about the original source of the articles. In your opinion, is this approach safe from an SEO point of view? Do we have to take some measures to protect our content? I hope I explained myself well; any help would be very appreciated. Thank you,
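Whatever the ranking risk, if the partnership goes ahead it's worth verifying what each partner actually ships. A rough sketch of such a check, assuming the requests library and hypothetical URLs; the regexes expect conventional attribute order, so treat it as a starting point:

```python
import re
import requests

def check_syndicated_copy(copy_url, original_url):
    """Report which duplicate-handling signals a republished article carries."""
    html = requests.get(copy_url, timeout=10).text

    canonical = re.search(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    robots = re.search(
        r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)

    if canonical:
        points_home = canonical.group(1).rstrip("/") == original_url.rstrip("/")
        print("canonical ->", canonical.group(1),
              "(points at the original)" if points_home else "(WRONG target)")
    if robots:
        print("meta robots:", robots.group(1).lower())
    if not canonical and not robots:
        print("no canonical, no robots meta: a plain duplicate")

# Hypothetical URLs for illustration only.
check_syndicated_copy(
    "https://partner.example.com/republished-article/",
    "https://news.example.com/original-article/",
)
```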
Intermediate & Advanced SEO | Fabio80
-
404's after pruning old posts
Hey all, So after reading about the benefits of pruning old content, I decided to give it a try on our blog. After reviewing thousands of posts, I found around 2,500 that were simply not getting any traffic, or, if they were, had a 100% bounce and exit rate. Many of these posts also had content whose relevance had long ago expired. After deleting these old posts, I am now seeing them reported as 404's in Google Search Console, but most of them are the old URL with "trashed" appended to it. My questions are: are these 404's normal? Do I now have to go through and set up 301's for all of them? Is it enough to simply add the lot to my robots.txt file? Are these 404's going to hurt my blog? Thanks, Roman
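On the robots.txt idea: blocking the URLs only stops crawling, it doesn't remove the 404 reports, and a batch of 404s for deliberately deleted posts won't hurt the blog by itself. For posts with no close replacement, a 410 is the clearest signal. A sketch of generating the rules, assuming you export the error URLs from Search Console to a plain text file (one URL per line; the file name is made up) and serve Apache:

```python
from urllib.parse import urlparse

# Hypothetical export file: one 404 URL per line, as reported by GSC.
with open("gsc_404_export.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    path = urlparse(url).path
    # WordPress appends a "trashed" suffix to the slug when a post is
    # trashed, which is why these URLs show up as new 404s.
    if path.rstrip("/").endswith("trashed"):
        # mod_alias syntax: "gone" serves a 410. Swap in
        # "Redirect 301 <path> <target>" for any post that has a
        # genuinely relevant replacement.
        print(f"Redirect gone {path}")
```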
Intermediate & Advanced SEO | Dynata_panel_marketing
-
404's and Ecommerce - Products no longer for sale
Hi, We regularly have products which are discontinued and no longer sold. As we have such a large site, Webmaster Tools regularly picks up new 404's. These 404 pages are no longer linked to from anywhere on the site; however, WMT will still report them as errors. Does this affect site authority? Thank you
Intermediate & Advanced SEO | BeckyKey
-
Site's disappearance in web rankings
I'm currently doing some work on a website: http://www.abetterdriveway.com.au. Upon starting, I detected a lot of spammy links going to this website and sought to remove them before submitting a disavow report. A few months later, this site completely disappeared from the rankings, with all keywords suddenly unranked. I realised that the test website (which was put up to view before the new site went live) was still up on another URL, and Google was suddenly ranking that site instead. Hence, I ensured the test site was completely removed. Three weeks later, however, the site (www.abetterdriveway.com.au) still remains unranked for its keywords. Upon checking Webmaster Tools, I cannot see anything that stands out: there is no manual action or crawling issue that I can detect. Would anyone know the reason for this persistent disappearance? Is it something I will just have to wait out until ranking results come back, or is there something I am missing? Help here would be much appreciated.
Intermediate & Advanced SEO | Gavo
-
WMT Showing Duplicate Meta Description Issues Altough Posts Were Redirected
Dear Moz Community, Some time ago we changed the structure of our website and redirected the old URLs to the new ones. About 2,000 posts were redirected at that time. While checking Webmaster Tools a few days ago, I discovered about 500 duplicate meta description issues in the "HTML Improvements" area. To my surprise, although the old posts were redirected to the new paths, WMT sees the descriptions of the old posts as duplicates of those of the new posts. Moreover, after changing the structure, all meta descriptions were modified and weren't the same as those used before the restructure. For example, I redirected /blog/taxi-transfer-from-merton-sw19-to-london-city-airport/ to /destinations/greater-london/merton-sw19/taxi-transfer-to-london-city-airport-from-merton/. Now they are shown as having duplicate content. I've checked the redirects and they are working. I get the same error from the redirected pages for about 150 titles. Did anyone else get these errors, or can you offer me some suggestions about how I can fix this? Thank you in advance! Tiberiu
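The HTML Improvements report often lags weeks behind, so the duplicates may simply be stale. To rule out a redirect problem, though, you can trace each hop the way a crawler would. A minimal sketch, assuming the requests library; the domain is a placeholder around the paths from the question:

```python
import requests
from urllib.parse import urljoin

def trace_redirect(old_url, expected_url, max_hops=5):
    """Follow the redirect chain hop by hop and report where it ends."""
    url = old_url
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break
        # A 302 here is worth fixing: a permanent move should be a 301.
        url = urljoin(url, resp.headers["Location"])
        print(f"  {resp.status_code} -> {url}")
    if url.rstrip("/") == expected_url.rstrip("/"):
        print("OK: chain ends at the new post")
    else:
        print(f"ends at {url}, expected {expected_url}")

# Placeholder domain; the paths are the ones from the question.
trace_redirect(
    "https://example.com/blog/taxi-transfer-from-merton-sw19-to-london-city-airport/",
    "https://example.com/destinations/greater-london/merton-sw19/taxi-transfer-to-london-city-airport-from-merton/",
)
```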
Intermediate & Advanced SEO | Tiberiu
-
Effect of internal 301 redirects on SERPs?
I'm considering installing WordPress for my website, so I have to change the static URLs from /webpage.html to /webpage/. I don't want to lose ground in the SERPs, though. What should I expect?
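Expect stable rankings as long as every old URL 301s to its new counterpart. The mapping is mechanical, so the rules can be generated rather than written by hand; a minimal sketch with placeholder paths, emitting Apache-style directives:

```python
# Placeholder paths -- in practice, crawl or export the full list.
old_paths = ["/about.html", "/services.html", "/contact.html"]

for old in old_paths:
    new = old[: -len(".html")] + "/"
    print(f"Redirect 301 {old} {new}")

# On Apache, a single pattern rule can replace the whole list:
#   RewriteRule ^(.*)\.html$ /$1/ [R=301,L]
```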
Intermediate & Advanced SEO | wellnesswooz
-
How can Google index a page that it can't crawl completely?
I recently posted a question regarding a product page that appeared to have no content: http://www.seomoz.org/q/why-is-ose-showing-now-data-for-this-url. What puzzles me is that this page got indexed anyway. Was it indexed based on Google knowing that there was once content on the page? Was it indexed based on the trust level of our root domain? What are your thoughts? I'm asking not only because I don't know the answer, but because I know the argument is going to be made that if Google indexed the page then it must have been crawlable... therefore we didn't really have a crawlability problem. Why would Google index a page it can't crawl?
Intermediate & Advanced SEO | danatanseo
-
Creating 100,000's of pages, good or bad idea
Hi Folks, Over the last 10 months we have focused on quality pages, but we have been frustrated by competitor websites outranking us because they have bigger sites. Should we focus on the long tail again? One option for us is to take every town across the UK and create pages for our activities, e.g. for Stirling: Stirling Paintball, Stirling Go Karting, Stirling Clay Shooting. We are not going to link to these pages directly from our main menus, but from the sitemap. These pages would then show activities within a 50 mile radius of the town. At the moment we have focused our efforts on regions, e.g. Paintball Scotland and Paintball Yorkshire, funnelling all the internal link juice to these regional pages, but we don't rank highly for the towns that the activity sites are close to. With 45,000 towns and 250 activities we could create over a million pages, which seems very excessive! Would creating 500,000 of these types of pages damage our site? This is my main worry. Or would it make our site rank even higher for the tougher keywords and also get lots of traffic from the long tail, like we used to? Is there a limit to how big a site should be?
Intermediate & Advanced SEO | PottyScotty
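Whatever you decide on scale, the 50-mile-radius filter mentioned in the question is straightforward: great-circle distance between the town and each activity venue. A sketch using the haversine formula, with made-up venue coordinates (a real implementation would read from your venue database):

```python
from math import asin, cos, radians, sin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles via the haversine formula."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # 3959 = Earth's radius in miles

# Made-up venue coordinates, for illustration only.
town_lat, town_lon = 56.1165, -3.9369  # Stirling
venues = [
    ("Paintball near Glasgow", 55.8642, -4.2518),
    ("Go Karting near Edinburgh", 55.9533, -3.1883),
    ("Clay Shooting near Aberdeen", 57.1497, -2.0943),
]

for name, lat, lon in venues:
    d = miles_between(town_lat, town_lon, lat, lon)
    if d <= 50:
        print(f"{name}: {d:.0f} miles -- show on the Stirling page")
    else:
        print(f"{name}: {d:.0f} miles -- out of range")
```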