Eliminate all comment handle links to avoid even the appearance of comment spam?
-
I've stopped putting my URL behind my handle on the blogs in which I participate, for fear of creating the appearance of comment spam. It's not comment spam; we're talking about real interactions on a few blogs and forums.
What do you think? If it is limited to a handful of domains on which I am active, and there are no indications of comment spam in my overall link profile, is a handle link a bad idea?
The real purpose of the link is not to gain any link juice, but to direct the people I interact with in these comment conversations to my site if they'd like to visit. But it's not worth the risk of a Google slap.
-
Zachary: I believe Penguin attacks for a lot LESS than the volume you're describing, especially if there are few powerful links to begin with.
-
Thanks guys, of course it makes sense to do things the natural way. I was just overreacting, I suppose.
-
Google didn't condemn comments. Or comment link building.
Fire and brimstone come when you use the likes of ScrapeBox to submit to hundreds or thousands of blogs in a small timeframe, with little variation in your phrases.
You're fine adding your domain. I'd even consider it an opportunity lost if you didn't.
-
On the contrary, some consider a certain amount of comment links to be part of a "natural link profile". Put your efforts and worries toward getting high-quality links; the bad ones don't really matter much unless that's all you have.
Also, it sounds like what you're doing is beneficial to the humans reading your comments, so weigh that benefit against any potential downside with the bots that are reading your comments.
Related Questions
-
How can I best handle parameters?
Thank you in advance for your help! I've read a ton of posts on this forum on the subject, and while they've been super helpful, I still don't feel entirely confident about the right approach to take. Forgive my very obvious noob questions - I'm still learning!

The problem: I am launching a site (coursereport.com) which will feature a directory of schools at coursereport.com/schools. The directory can be filtered by a handful of fields:

- Focus (ex: "Data Science")
- Cost (ex: "$<5000")
- City (ex: "Chicago")
- State/Province (ex: "Illinois")
- Country (ex: "Canada")

When a filter is applied to the directory page, the CMS produces a new page with URLs like these:

coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork

My questions:

1) Is the above parameter-based approach appropriate? I've seen other directory sites take a different approach that would transform my examples into more "normal" URLs:

coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
VERSUS
coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all)

2) Assuming I use either approach above, isn't it likely that I will have duplicate content issues? Each filter does change on-page content, but there could be instances where two different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve for that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions. For example, would I just take all of the /schools?focus=X combinations and call those the canonical versions within any filtered page that contains additional parameters like cost or city? Should I be changing page titles for the unique filtered URLs?

I read through a few Google resources to try to better understand how to configure URL params via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about using rel=canonical?
https://support.google.com/webmasters/answer/1235687

An assortment of the other stuff I've read, for reference:
http://www.wordtracker.com/academy/seo-clean-urls
http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail
http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/
http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
Technical SEO | alovallo
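One common way to handle the duplicate-filter problem in the question above is to emit a rel=canonical that points at a normalized version of the URL: parameters restricted to a known whitelist, empty values dropped, and parameter order fixed. A minimal sketch (the whitelist and function name are assumptions for illustration, not coursereport.com's actual implementation):

```python
from urllib.parse import urlencode, urlsplit, parse_qsl, urlunsplit

# Assumed whitelist of filter parameters, in the fixed canonical order.
CANONICAL_PARAMS = ("focus", "cost", "city", "state", "country")

def canonical_url(url: str) -> str:
    """Map every equivalent filter combination to one canonical URL:
    keep only whitelisted, non-empty params, in a fixed order."""
    parts = urlsplit(url)
    params = dict(parse_qsl(parts.query))
    kept = [(k, params[k]) for k in CANONICAL_PARAMS if params.get(k)]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonical_url("http://coursereport.com/schools?city=chicago&focus=datascience"))
```

The page's `<link rel="canonical" href="...">` would then carry the normalized URL, so a chicago+datascience page and a datascience+chicago page both declare the same preferred version.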
Unnatural links to your site--impacts links
Hi, I just received a "nice" message in my Webmaster Tools: Unnatural links to your site—impacts links.

_Google has detected a pattern of unnatural artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more._

Has anyone here come across a message like this before? If so, any suggestions on what to do next? Would love some help! Thanks
Technical SEO | Tit
Is it possible to export Inbound Links in a CSV file categorized by Linking Root Domains ?
Hi, I am performing an analysis of the total inbound links to my homepage, and I would like the total amount of inbound links categorized by linking root domain. For example, Open Site Explorer offers a feature that shows you the linking root domains to your page. When you click on the first linking root domain, it also shows you the top linking pages (which means all the pages that link to your page from that particular root domain). Now I would like to export this data to a CSV file, but Open Site Explorer only exports the total amount of linking root domains. Does anyone have a solution to this problem? Thank you very much in advance for your help!
Technical SEO | Feweb
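Open Site Explorer's export won't produce this grouping directly, but if you can export a flat list of linking-page URLs from any tool, a short script can do the categorization itself. A sketch, assuming a plain list of URLs as input (note the naive root-domain extraction ignores the Public Suffix List, so multi-part TLDs like .co.uk will be grouped imperfectly):

```python
import csv
from collections import defaultdict
from urllib.parse import urlsplit

def root_domain(url: str) -> str:
    """Naive root-domain extraction: the last two host labels.
    (A real implementation should consult the Public Suffix List.)"""
    host = urlsplit(url).netloc.lower().split(":")[0]
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

def group_links(linking_pages):
    """Group linking-page URLs by their root domain."""
    groups = defaultdict(list)
    for page in linking_pages:
        groups[root_domain(page)].append(page)
    return groups

def write_csv(linking_pages, path):
    """Write one row per linking page, labeled with its root domain."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["root_domain", "linking_page"])
        for domain, pages in sorted(group_links(linking_pages).items()):
            for page in pages:
                writer.writerow([domain, page])
```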
Too many links?
Hello! I've just started with SEOmoz and am getting an error about too many links on a few of my blog posts. It's pages with high numbers of comments, and the links are coming from each commenter's profile (hopefully that makes sense; they're not just randomly stuffed links). Is there a way to keep this from causing a problem? Thanks!
Technical SEO | PaulineMagnusson
Removal of all low PR links
I have a lot of old directory links which were done years ago, and I think most will be affecting my site. Is there a way to find them all, e.g. through Open Site Explorer, and then remove them in one go?
Technical SEO | Cocoonfxmedia
Ratio of linking C-blocks to Linking domains
Hi, Our link building efforts have resulted in acquiring a high number of backlinks from domains within a C-block. We all know Google issues penalties whenever someone's link profile looks unnatural, and a high number of backlinks but a low number of linking C-blocks would seem to be one reason to get penalized. Example: we have 6,000 links from 200 linking root domains coming in from 100 C-blocks. At what point should we start to worry about being penalized/giving off an unnatural look to Mr. G?
Technical SEO | waidohuy
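For anyone measuring this themselves: a C-block in link-profile parlance is the first three octets of the linking server's IPv4 address (a /24 network). Assuming you have already resolved each linking root domain to an IP (e.g. via a DNS lookup), the ratio in the example above can be computed like this (a sketch; the function names are made up):

```python
def c_block(ip: str) -> str:
    """A 'C-block' is the first three octets of an IPv4 address (a /24)."""
    return ".".join(ip.split(".")[:3])

def c_block_ratio(domain_ips):
    """domain_ips: mapping of linking root domain -> IPv4 address.
    Returns (number of unique C-blocks, ratio of C-blocks to domains)."""
    blocks = {c_block(ip) for ip in domain_ips.values()}
    return len(blocks), len(blocks) / len(domain_ips)
```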
Nofollow links appear to be still included in SEOMOZ crawl and Google
I have added the nofollow tag to links throughout my site to hide duplicate content from Google, but these pages are still being shown in my SEOmoz crawl. I also fetched an example page with the Googlebot within Webmaster Tools, and it showed all nofollow links. An example is http://www.adventurepeaks.com/news. All news tags have nofollow, but each tag is appearing in my SEOmoz crawl report as duplicate content. Any suggestions on whether this is a problem or whether I have applied the tag incorrectly? Many thanks in advance
Technical SEO | adventure34
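One detail worth separating out here: rel="nofollow" only asks engines not to pass link equity through a link; it does not hide the target page from crawlers or remove it from a crawl report, so a meta robots noindex tag on the duplicate pages themselves is the usual fix for this situation. To at least verify the attribute is present in the served HTML, a quick stdlib checker (a hypothetical audit script, not part of any Moz tool):

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collects every <a href> and whether it carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []  # (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        if "href" in a:
            rel = (a.get("rel") or "").lower().split()
            self.links.append((a["href"], "nofollow" in rel))

parser = NofollowAudit()
parser.feed('<a href="/news/tag/alps" rel="nofollow">Alps</a> <a href="/home">Home</a>')
```

Feeding it a page's HTML leaves `parser.links` listing each link with a flag showing whether nofollow was actually applied.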
Spam Backlinks to My Website
Today I created an inbound link report using the Link Research & Analysis tool, and I found a number of spam inbound links to my website from lots of blogs and other sites whose anchor text is not relevant to my site. Some contain abusive words in the anchor text, like "viagra expiration date" and others. I want to remove these irrelevant backlinks. As there is a very high number of links, approx. 9,000, it's almost impossible to remove them manually. Is there any way to remove and restrict those backlinks? What steps are required to protect my website from any negative effects? Please advise ASAP.
Technical SEO | saupari
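With roughly 9,000 links, manual removal requests will not scale; Google's disavow tool is the standard fallback once removal outreach fails. Its file format is simple enough to generate from a list of offending domains. A sketch (the domain list and function name are illustrative, and a real disavow file should only be uploaded after careful review):

```python
def disavow_file(spam_domains, comment="Spam links with irrelevant anchor text"):
    """Build the contents of a disavow file in Google's format:
    '#' lines are comments; 'domain:example.com' disavows every link
    from that domain (usually safer here than per-URL lines)."""
    lines = [f"# {comment}"]
    lines += [f"domain:{d}" for d in sorted(set(spam_domains))]
    return "\n".join(lines) + "\n"

print(disavow_file(["spamblog.example", "pillsite.example"]))
```

The resulting text file is then uploaded through the disavow links tool in Webmaster Tools for the affected site.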