Agency VS Freelancer: Industry rates for a GOOD link builder
-
Hello Moz Community,
I am looking to hire a link builder for SEO purposes. What are the normal rates, and what should I expect for my money? Can you guys please share your knowledge with me so I am better prepared to hire the right person!
Bonus points for tips on what to look for when hiring someone, and also the best places to look for an SEO specialist!
I look forward to all your replies!
-
Thank you Marcus !
-
Hey. I have followed and read Eric's advice for years. He is an old hand. I am too, unfortunately. His Link Moses newsletter is well worth the subscription fee. So much of this comes down to strategy, and I am sure a consultation with Eric would give you enough direction for 12 months.
Hope that helps.
-
I can completely relate. Do you have Skype please? I'd love to know your process as I feel I am exactly in the same boat!
-
Hey Marcus,
I couldn't have hoped for a better reply. You really have helped me out a ton here! I will 100% talk to Eric Ward - have you used his services?
L
-
Hey
I think there are only a handful of really good, all-round link builders out there. Sure, some folks may get good at building links in a specific niche or sector, but it can be tough to know every potential avenue.
Another problem we tend to see at the agency I run in the UK is that folks want absolutes:
- we want 10 x links from DA 30 domains per month for £1000 (or whatever)
But, the best link building is often a form of digital PR. So it can be a bit lumpy. Certainly, you can go out and find 10 sites that accept guest posts and hit the proposed metric, but are you getting any real big wins like that? How valuable are those links that are just so easy to get?
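To see why those "absolute" retainers can be misleading, it helps to put rough numbers on them. This is a minimal sketch with purely hypothetical figures (the £1000/month and link counts are illustrative, not real rates): a fixed guest-post package and a lumpy digital-PR campaign can land at a similar average cost per link, even though the quality of what you're buying differs enormously.

```python
# Hypothetical, purely illustrative numbers: comparing a fixed
# "10 links for £1000/month" retainer against a lumpier digital-PR campaign.

def cost_per_link(total_cost, links_earned):
    """Average spend per link earned; returns None if no links landed."""
    if links_earned == 0:
        return None
    return total_cost / links_earned

# Guest-post retainer: predictable, easy-to-get links every month.
retainer = cost_per_link(1000, 10)  # £100 per link

# Digital PR: two months land nothing, then one campaign lands 25 links.
pr_months = [0, 0, 25]  # links earned per £1000 month
pr = cost_per_link(1000 * len(pr_months), sum(pr_months))  # £120 per link

print(retainer, pr)  # similar cost per link, very different link quality
```

The point of the sketch is that cost per link alone tells you little; the lumpy campaign's links may come from sites the guest-post package could never reach.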
Likewise, you really want to get your strategy dialled in first. That is, have something that really deserves to be linked to. If you can create that linkable asset, it makes the link building process so much easier. If you don't have linkable assets, creating them could be the first step here, so make sure the agency or freelancer can help in that regard (or go back to the drawing board).
There is a good overview here of how to start with the value side of your link building efforts (by me - as a disclaimer):
http://searchengineland.com/organic-traffic-link-building-small-businesses-269353
If you want the best check out Eric Ward:
http://www.ericward.com/evaluation.html
He can put a strategy together for you and give you direction. With the strategic elements in place, you can then better understand what component parts you need to run that strategy. Not cheap, but if you want to win big then... start with a solid strategy and layer your link building tactics over that.
So much of this depends on the current situation and marketplace. So you need some form of situation analysis. Are you in a crazy competitive space? Is the competition low end? The price you pay will need to consider the difficulty of the task at hand.
So... lots of moving parts but I certainly hope that helps.
Marcus
-
Very useful. Thanks for your input!
-
A lot of the time it's all about what you can do for them too; writing an article for a website on a subject and linking back to your own site as a form of payment (though you may actually ask for $$ too), especially early on in your 'outreach' campaign. Building a reputation is important.
Over time you may be asked for articles etc, or be able to show off your work in order to gain more exposure.
-
Thanks for the reply - fascinating for me to read as I'm new to SEO and haven't heard of these commissioned articles. Is there a marketplace for these deals or is it all about having the right contacts?
-
Sure.
In a broad sense, you get what you pay for. If you buy 10,000 links from Fiverr, expect it to be picked up almost instantly by Google - and watch your rankings fall.
I know of people who commission articles for $300-500 and ask for a link back to a certain product or page as part of the deal, usually around 800-1000 words and on a reputable website, rather than some MIT personal student blog.
When it comes to link quality, best practice is usually fewer links from high-quality sources over many links from low-quality sources. That said, links built naturally are the truest recognition of quality.
-
I would still like to know what to expect from the industry in regards to this question but thank you for the reply!
-
I'd suggest doing the work yourself with the assistance of tools such as BuzzSumo.
Doing the work yourself will help keep you in touch with how your brand/business is being talked about on social platforms, on blogs, and everywhere in between.