How Many Words in Content for Good SEO?
-
I have heard it's best to have 400+ words of content per page for strong SEO, and I believe this is true for the most part.
I have a project in mind, however, where I am considering 100-200 words of content per page. It's a glossary of terms for my industry, with a unique page for each term that describes what the term means, along with one image and a few links to related products.
Is having just 100-200 words going to be enough? Each page will still be unique, original content.
Or is it best to really try for longer articles?
In other words, is there a general rule for the number of words per page for search engines to see a page as valuable and unique and to give it a good ranking?
-
This site gets a lot of image search traffic. I cannot tell you whether the captions assist with that traffic, but I do know that kickass images on a same-topic page will pull traffic.
-
I have a hunch that part of the Google algorithm is how long people spend on a page; lots of big, pretty pictures alongside the text entice people to look at them and thus stay on each page longer while reading.
-
Very informative on the added content pulling long-tail traffic. How did the images help, though? Was it just the caption content?
Utah Tiger
-
Thanks, EGOL. That confirms what I'm thinking.
You mentioned image captions. I don't normally use them - do they help? I'm assuming the captions are for blog posts?
-
Just saying what happened here...
I had about 80 pages about topics that are similar to glossary entries.
I tossed them up with an image and two sentences each. That was 3 or 4 years ago. Some of them ranked on the second or third page of Google and pulled a little traffic, but not much: a few visitors per day for most of the pages.
Then, after a year or two, I started upgrading them to a couple of paragraphs and one or two nicer, larger images with captions. Within a couple of months, rankings went up (I don't build links on this site). Some of the pages reached the first page of Google, but most did not. Traffic shot up like a rocket, not so much from the ranking increase but because there were now many more words on those pages, and those words matched queries and brought in lots of long-tail visitors. The result was that each of these pages started pulling a few dozen visitors per day.
Now I am slowly upgrading these pages to articles of 500 to 3,000 words with three to twelve images and generous captions, some with data tables and some with a video. Rankings on the ones I have improved moved up significantly and are now in Google's top 5. Traffic on the improved pages now hits a few hundred visitors per day per page. I am working on another one today.
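For anyone unsure what a captioned image looks like in markup: here is a minimal sketch using the standard HTML5 `figure`/`figcaption` elements. The file name, alt text, and caption wording are made-up examples, not from this site.

```html
<!-- A captioned image: the caption is crawlable text that sits
     directly next to the image in the markup. The src, alt, and
     caption below are hypothetical. -->
<figure>
  <img src="/images/example-term.jpg"
       alt="Illustration of the glossary term being defined">
  <figcaption>
    A descriptive caption adds a sentence or two of indexable text
    tied directly to the image.
  </figcaption>
</figure>
```

The point of this structure is that the caption is ordinary on-page text, so it can match long-tail queries the same way body copy does.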
Bottom line... Compared to trivial content, substantive content pulls in far more long-tail traffic and will sometimes rank a little better on the strength of the richer content alone. An investment in richer content will immediately increase your traffic.