Is Having Broken Outbound Links on Old Blog Posts an Issue?
-
Please note that these old posts hardly get any traffic. I've heard both sides on this.
thanks,
Chris
-
Great advice, love it. Thx!
-
This is on a small business website in one city.
Please note that these old posts hardly get any traffic.
I assume that the real issue with broken links is when they occur on one of your main navigational links that drives most of your traffic?
I just view this as a minor issue, whereas broken links on main pages with a lot of traffic are much more alarming or pressing.
First, it sounds like you are making excuses for not fixing them.
If these posts get hardly any traffic then maybe they are not very good posts. If they are worthless then dump them. If they have potential then improve them.
No matter where these broken links are, they are a sign of poor form, of a website that is not tended, of a website that Google might view as low quality.
Today, more than ever, the game on the web is about quality. The most important thing that you can do is buy into that.
-
Thanks for the info, appreciate it and totally get it. It just seems like a very low priority when these links are on old posts (2013/14) that very few people visit at most. Additionally, I read that the Google bot simply moves on. I just view this as a minor issue, whereas broken links on main pages with a lot of traffic are much more alarming or pressing. Thx
-
My view is that you should fix all broken links, and also investigate why they broke.
The primary reason is the "Google bot". In short, Mr Google Bot hates broken links: they mean he cannot crawl your site properly, which could lead to a decrease in rankings. Mr Bot stumbles into a dead end (a broken link) and thinks the rest of your site has closed as well. Hasta la vista...
It is simply not worth the risk.
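If you want to see what the crawler sees, checking for broken outbound links is easy to script. The sketch below is not from the thread itself, just an illustration using only the Python standard library: it pulls `href`s out of a page's HTML, keeps the ones pointing off your own domain, and probes each with a HEAD request. The user-agent string and function names are made up for the example.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag it sees."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def outbound_links(html, own_domain):
    """Return absolute links in `html` that point away from own_domain."""
    parser = LinkExtractor()
    parser.feed(html)
    outbound = []
    for href in parser.links:
        host = urlparse(href).netloc
        if host and host != own_domain:  # empty netloc = relative/internal link
            outbound.append(href)
    return outbound

def check_link(url, timeout=10):
    """Return (url, status): an HTTP status code, or an error string if unreachable."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-audit/0.1"})
        with urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except HTTPError as e:
        return url, e.code        # e.g. 404, 410
    except URLError as e:
        return url, str(e.reason)  # DNS failure, timeout, etc.
```

Run `check_link` over the result of `outbound_links` for each old post and anything returning 404/410 (or a connection error) is a candidate for fixing or removal. Note some servers reject HEAD requests, so a 405 may warrant a follow-up GET.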
-
Hi there
I would fix them. Beyond that, if the articles aren't getting any traffic or rankings, or are out of date, I would see what I can do to improve those old posts, or remove them. There may be opportunities to update; you never know.
It may be a good time to perform a content audit.
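The fix/improve/remove decision that keeps coming up in this thread can be sketched as a simple triage rule. This is just an illustration, not anything from the thread: the function name and every threshold (minimum visits, staleness cutoff) are arbitrary placeholders you would tune to your own site.

```python
def triage_post(monthly_visits, broken_links, last_updated_year,
                min_visits=10, stale_after_years=3, current_year=2024):
    """Rough keep/improve/remove call for one old post.

    All thresholds are illustrative; adjust for your own traffic levels.
    """
    stale = (current_year - last_updated_year) >= stale_after_years

    # No traffic, out of date, and broken: probably not worth keeping as-is.
    if monthly_visits < min_visits and stale and broken_links > 0:
        return "remove or consolidate"

    # Has some value but needs attention.
    if broken_links > 0 or stale:
        return "improve: fix links, refresh content"

    return "keep"
```

Feeding each post's analytics numbers and broken-link count through something like this gives you a first-pass audit sheet to review by hand.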
Broken outbound links are frustrating for users, especially when the link points to an article backing up your points or data. Always make sure they work. Remember - at the end of the day you want a great user experience, and outbound links are part of it.
Plus, a lot of broken outbound links may show crawlers that you're not paying attention to your site, or the site isn't being kept up to date.
Hope this helps! Good luck!