Best practices for retiring 100s of blog posts?
-
Hi. I wanted to get best practices for retiring an enterprise blog with hundreds of old posts with subject matter that won't be repurposed. What would be the best course of action to retire and maintain the value of any SEO authority from those old blog pages?
Is it enough to move those old posts into an archive subdirectory and Google would deprioritize those posts over time?
Or would a mass redirect of old blog posts to the new blog's home page be allowed (even though the old blog post content isn't being specifically replaced)? Or would Google basically say that if there aren't 1:1 replacement URLs, that would be seen as soft-404s and treated like a 404?
-
Retiring a large number of blog posts can be a significant task, and it's important to handle it thoughtfully to maintain the integrity of your website and its content. Here are some best practices for retiring hundreds of blog posts:
1. Assessment and Planning:
- Evaluate each blog post individually to determine its relevance, traffic, and importance.
2. Communication:
- Inform your audience about the changes in advance. Create a blog post or announcement explaining the decision to retire certain content.
3. Maintain SEO:
- Update your sitemap to reflect the changes.
4. Content Audit:
- Use the retirement as an opportunity to conduct a broader content audit. Assess the overall quality and relevance of your remaining content.
5. Learn from Analytics:
- Analyze website analytics to understand the impact of retiring specific posts on traffic and user engagement.
-
Best Practices for Retiring Hundreds of Blog Posts
Are you contemplating retiring hundreds of old blog posts? It's a significant decision, but fear not! Here are the best practices for a smooth transition.
Content Audit: Analyze traffic, engagement, and relevance to identify which posts to retire. Preserve high-performing and evergreen content for repurposing.
301 Redirects: Redirect retired blog URLs to related or updated content using 301 redirects. This preserves SEO value and prevents broken links.
Inform Your Audience: Notify your readers in advance about the changes. Explain why certain posts are being retired and assure them of fresh, valuable content to come.
Archiving: Consider archiving the retired posts on your website for reference purposes. This maintains historical context and might still attract occasional visits.
Promotion of New Content: Emphasize your latest and most relevant content. Utilize newsletters, social media, and email campaigns to highlight new posts and offerings.
Monitor Analytics: Keep a close eye on post-retirement metrics to gauge the impact on traffic and user behavior. Adjust your strategy as needed.
Remember, retiring old blogs opens up opportunities for fresh, engaging content that resonates with your audience. Embrace the change and watch your website flourish!
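The 301-redirect step above can be sketched in Apache .htaccess terms. This is a minimal example, not a drop-in ruleset: the paths are hypothetical, and it assumes your server runs Apache with mod_alias enabled.

```apache
# One-to-one redirects for retired posts that have a close replacement
Redirect 301 /blog/old-widget-guide /blog/widget-guide-2024

# Move an entire legacy section into an archive subdirectory in one rule
RedirectMatch 301 ^/blog/legacy/(.*)$ /archive/$1
```

Server-side 301s like these pass most link equity and prevent broken links; test on staging first, since mod_alias applies its directives in order of appearance.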
-
Redirecting them in bulk might cause some loss of equity yes - are any of them particularly noteworthy or well linked to? Perhaps just those ones could be left up.
That said, if you have the option to leave these posts live on an archived subdirectory, why is it that you want to take them down at all? Usually the answer would be because they are duplicate or thin content, but clearly that is not the case.
-
@David_Fisher When retiring an old enterprise blog with many outdated posts, simply archiving them in a subdirectory may not be enough to prevent Google from indexing them. Redirecting all the old posts to the new blog's homepage without any relevant content could be seen as a soft-404 by Google.
The best approach would be to repurpose or update any relevant posts for the new blog and redirect only those specific posts. For the rest, create a custom 404 page that provides links to the new blog's homepage and other relevant content. This approach ensures a positive user experience and maintains SEO authority.
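This selective approach can be roughed out in .htaccess as well. The slugs and paths below are hypothetical, and it assumes Apache with mod_alias; one variation on the advice above is to return an explicit 410 (Gone) for the posts with no replacement, rather than a plain 404.

```apache
# Redirect only the posts that have a genuinely relevant replacement
Redirect 301 /old-blog/keyword-research-basics /blog/keyword-research
Redirect 301 /old-blog/on-page-checklist /blog/on-page-seo

# Mark the remaining retired posts as permanently gone (HTTP 410)
# rather than letting them soft-404 against the homepage
RedirectMatch gone ^/old-blog/.*$

# Serve custom error pages that link to the new blog's key content
ErrorDocument 410 /gone.html
ErrorDocument 404 /not-found.html
```

Google treats 404 and 410 similarly, but a 410 is the more explicit signal that the removal is intentional, and the custom error page preserves the user experience.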
Related Questions
-
Understanding Redirects and Canonical Tags in SEO: A Complex Case
Hi everyone, nothing serious here, I'm just playing around doing my experiments 🙂
Technical SEO | | chueneke
But if any of you understand this chaos and what the issue was, I'd appreciate it if you could try to explain it to me.

I had a page "Linkaufbau" on my website at https://chriseo.de/linkaufbau. My .htaccess file contains only basic SEO stuff:

# removed ".html" using htaccess
RewriteCond %{THE_REQUEST} ^GET\ (.*)\.html\ HTTP
RewriteRule (.*)\.html$ $1 [R=301,L]

# internally added .html if necessary
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteCond %{REQUEST_URI} !/$
RewriteRule (.*) $1\.html [L]

# removed "index" from directory index pages
RewriteRule (.*)/index$ $1/ [R=301,L]

# removed trailing "/" if not a directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} /$
RewriteRule (.*)/ $1 [R=301,L]

Here's the first redirect:

RedirectPermanent /index /

My first three questions: Why do I need this rule? Why must this rule be at the top? Why isn't this handled by mod_rewrite?

Now to the interesting part: I moved the Linkaufbau page to the SEO folder, https://chriseo.de/seo/linkaufbau, and set up the redirect accordingly:

RedirectPermanent /linkaufbau /seo/linkaufbau.html

I deleted the old /linkaufbau page and requested indexing for /seo/linkaufbau in Google Search Console. Once the page was indexed, I set a canonical to the old URL:

<link rel="canonical" href="https://chriseo.de/linkaufbau">

Then I resubmitted the sitemap and requested indexing for /seo/linkaufbau again, even though it was already indexed. Due to the canonical tag, the page quickly disappeared. I then requested indexing for /linkaufbau and /linkaufbau.html in GSC (the old, deleted page). After two days, both URLs were back in the SERPs:

https://chriseo.de/linkaufbau
https://chriseo.de/linkaufbau.html

(screenshot: the new page /seo/linkaufbau)
(screenshot: the old page /linkaufbau)

Both URLs are now in the search results, and all rankings are significantly better than before for keywords like: organic linkbuilding, linkaufbau kosten, linkaufbau service, natürlicher linkaufbau, hochwertiger linkaufbau, organische backlinks, linkaufbau strategie, linkaufbau agentur.

Interestingly, both URLs (with and without .html) redirect to the new URL https://chriseo.de/seo/linkaufbau, which in turn has a canonical pointing to https://chriseo.de/linkaufbau (without .html). In the SERPs, when https://chriseo.de/linkaufbau is shown, my new, updated snippet is displayed. When /linkaufbau.html is shown, it displays the old, deleted page that had already disappeared from the index. I have now removed the canonical tag.

I don't fully understand what happened and why. If anyone has any ideas, I would be very grateful.

Best regards,
Chris
-
Blog article cannibalizes our home page
Hello there,

We're having a rather big SEO issue that I'm hoping someone here can help us with, perhaps having experienced the same thing or simply understanding what's going on.

Since around June, our website's home page has lost the majority of its most important rankings. Not just dropping, but losing them entirely and all at once. We think it was self-inflicted: almost at the same time, a blog article of ours (which we had recently updated) started ranking for almost all the same keywords. While our home page is a commercial page highlighting only our own product, the article that usurped its position is a comparison article, comparing our own solution to competitors. We created that article because we noticed a trend of Google increasingly favoring such comparison articles over dedicated product pages. But of course we didn't plan to cannibalize our own home page with it.

My question is whether anyone has experience with such a case. Is there a way to "tell"/influence Google to rank our home page again, instead of ranking that article?

Thanks a lot,
Pascal
Technical SEO | | Maximuxxx1 -
Guest blogging penalty
We would like to publish a guest post on our blog that links to the guest's website, and vice versa: a link from their blog to our website. Does this affect us under Google's "guest blogging" guidelines? We already have natural link exchanges with our partners: website-to-website links from our partners page.
White Hat / Black Hat SEO | | vtmoz0 -
Do any sub-domains act as Private Blog Networks?
Hi All, We can see that Google rolled out two unconfirmed algorithm updates this month, and according to some SEO experts they penalise spam links and sites that use private blog networks. Can sub-domains act as PBNs because of too much linking, such as linking to website pages from every page of the sub-domains?
White Hat / Black Hat SEO | | vtmoz0 -
Local Map Pack: What's the best way to handle twin cities?
Google is increasingly cracking down on bad local results. However, in many regions of the US there are twin cities, or cities that sit right next to each other, like Minneapolis-Saint Paul or Kansas City. According to Google's guidelines, your business should only be listed in the city in which it is physically located. However, we've noticed that results just outside of the local map pack will still rank, especially for businesses that service the home.

For example, let's say you have ACME Plumbing in Saint Paul, MN. If you perform a search for "Plumbing Minneapolis" you typically see local Minneapolis plumbers, then Saint Paul outliers. Usually the outliers are in the next city or just outside the Google map centroid. Are there any successful strategies to increase the rank of these "Saint Paul outliers" that compete with local Minneapolis results, or will the results always lag behind in favor of perceived accuracy?

We're having to compete against some local competitors that are using some very black-hat techniques to rank multiple sites locally (in the map results). They rank multiple sites for the same company, under different company names and UPS Store addresses. It's pretty obvious, especially when you see a UPS Store in the street view of the address! We're not looking to bend the rules, but rather to compete safely. Can anything be done in this service-based scenario?
White Hat / Black Hat SEO | | AaronHenry0 -
Does posting a source to the original content avoid duplicate content risk?
A site I work with allows registered users to post blog posts (longer articles). Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source a sufficient safeguard against getting dinged for duplicate content? Thanks!
White Hat / Black Hat SEO | | 945010 -
Title Tag - Best Practices
I'm pretty new to SEO but think I'm starting to get a decent grasp on it. One thing I'm really struggling with is how to organize the meta title tags on my website. I work in real estate, and I'm noticing that a lot of my local competitors ranking for the top keywords seem to be using that particular keyword in every title tag within their website. An example would be www.paranych.com. Many of his internal pages have "Edmonton Real Estate" in the meta title tag, yet his home page is the page that ranks for that particular keyword. It doesn't seem logical to have every one of my pages featuring the same keyword, but there are many examples within my industry of this working. Is the best practice to have your keyword in every title tag of your site, or just on the home page? Thx, Barry
White Hat / Black Hat SEO | | patrickmilligan0 -
Best way to handle SEO error, linking from one site to another same IP
We committed an SEO sin and created a site with links back to our primary website. Although it doesn't matter, the site was not created for that purpose; it is actually a "directory" with categorized links to thousands of culinary sites, and ours are some of the links. This occurred back in May 2010. Starting in April 2011 we saw a large drop in page views, and it dropped again in October 2011. At this point our traffic is down over 40%. Although we don't know for sure whether this has anything to do with it, we know it is best to remove the links.

The question is: given that it's a bad practice, what is the best fix? Should we redirect the second domain to the main one, or just take it down? The second domain does not have much PageRank, and I don't think there are many, if any, back-links to it. Will it hurt us more to lose the 1,600 or so back-links? I would think keeping the links is a bad idea. Thanks for your advice!
White Hat / Black Hat SEO | | foodsleuth0