Question About Thin Content
-
Hello,
We have an encyclopedia-type page on our e-commerce site. Basically, it's a page with a list of terms related to our niche: product definitions, slang terms, etc.
Each term on the encyclopedia page links to its own page, which contains the term and a very short definition (one or two sentences).
The purpose of these pages is to link to them from product pages when a product has a feature or function that may be new to our customers.
We have about 82 of these pages. Are they more likely to help us because they provide information to visitors, or to hurt us because of the very small amount of content on each page?
Thanks for the help!
-
-
I would be afraid of 82 pages with only a sentence or two each. I would take one of two routes if this were my site:
A) Beef up each of these pages with one or two photos and one or two paragraphs of text, at least 100 words in total.
B) Place all 82 definitions on a single large page with on-page anchors, so that links from other parts of your site can point straight to each definition.
We have an industry glossary on one of our sites with a few thousand terms. Most terms are accompanied by one photo and 50 to 200 words of text. We don't have individual pages for each term; instead we have 26 pages, one for each letter of the alphabet, and some of these pages hold over 100 terms. For several hundred of the terms we also have a substantive article of 500 to 5,000 words with numerous photos, and the bolded term in the glossary links to that article page.
Back in the early 2000s the glossary was a good source of links and it got a lot of traffic. Its value for attracting links and traffic has declined over time, while the value of the article collection has grown.
-
Thank you, EGOL!
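For the single-page glossary route, the on-page anchors are plain HTML fragment identifiers: give each definition an `id`, and any page on the site can link straight to it. A minimal sketch (the term names here are made up for illustration):

```html
<!-- Index of terms at the top of the glossary page -->
<ul>
  <li><a href="#widget">Widget</a></li>
  <li><a href="#gizmo">Gizmo</a></li>
</ul>

<!-- Each definition carries an id that the index above
     (and links from product pages) can target -->
<h2 id="widget">Widget</h2>
<p>A paragraph or two defining the term, ideally with a photo.</p>

<h2 id="gizmo">Gizmo</h2>
<p>Another definition...</p>
```

A product page elsewhere on the site would then link to something like `/glossary#widget`, landing the visitor directly on that definition.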