Pros & Cons Of Closing Forum Discussions To New Comments
-
I work with a site that gets a lot of members and conversions from organic search traffic to its specialty-interest community forum. The site has been around for a number of years and still has many forum pages, some 5+ years old, that do well in search. Some of these discussions have dozens of comments, and a few run to thousands of words.
The option has been floated to stop accepting new comments on specific discussions at some point. I can see the negatives: if we closed a discussion to new comments, people arriving from search who wanted to register and comment wouldn't be able to.
I've always assumed that new comments freshen up a page a little in organic search terms, but honestly I don't know how much that matters.
Are there any good reasons to close a discussion to new comments? And a related question: what do you think is the optimum quantity of unique text before paginating?
Thanks!
-
Hi Oleg & Chris,
Thanks for the answers. This is exactly what I've been thinking, but I wanted to sanity-check it... to make sure I wasn't just having a knee-jerk negative reaction to someone else's idea.
Best... Mike
-
It can depend on the type of site you're running, I suppose, but generally I see too many benefits to fresh user-generated content to justify closing discussions.
If you had topics that were particularly seasonal, with no benefit to be had by you or your users after a certain date (say, an obsolete product you don't want new buyers to confuse with the latest version), it might be worth closing comments and providing a link to a discussion of the new product.
You're on the right track: user-generated content is great for your brand/product, user engagement, and SEO in general, so the more legitimate interaction you have, the better. Fresh content is what search engines look for, and genuine discussion gives users a reason to return to your site frequently, letting you build that rapport.
As for the quantity of unique text, Oleg is right that this is almost purely a UX question. Whatever works best for user intent is what you should do here. I personally prefer minimal pagination, especially on mobile, but I'm just one user and may not even be your target audience.
-
Are there any good reasons to close a discussion to new comments?
Comments going off topic; a topic that could be split into several threads instead of one; low-quality comments; information that's hard to find (poor UX). But overall, I think it would be great to just keep updating a strong thread: keep building its authority, hit more long-tail keywords, and get the QDF (query deserves freshness) bump.
what do you think is the optimum qty of unique text before paginating?
I think this is more of a UX issue, as paginated pages are treated as part of a series.
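To illustrate the "part of a series" point: the conventional way to mark paginated thread pages as a series is rel="prev"/rel="next" link tags in the page head (note that Google has since stated it no longer uses these as an indexing signal, though they remain valid HTML). A minimal sketch of page 2 of a thread, with hypothetical URLs:

```html
<!-- Page 2 of a paginated forum discussion (all URLs are hypothetical examples) -->
<head>
  <title>Best trail cameras? - Page 2 - Example Forum</title>
  <!-- Point to the adjacent pages in the series -->
  <link rel="prev" href="https://www.example.com/forum/best-trail-cameras/">
  <link rel="next" href="https://www.example.com/forum/best-trail-cameras/page/3/">
  <!-- Each page keeps a self-referencing canonical; don't point every page's canonical at page 1 -->
  <link rel="canonical" href="https://www.example.com/forum/best-trail-cameras/page/2/">
</head>
```

Whatever markup you use, the UX question (how much content per page) can be decided independently of it.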