Why does everyone use bitly?
-
Why do people use bitly? I thought it was just a way to share a link on Twitter when the URL was too long.
I see SEOmoz shares all their content with a bitly link, even when they share it on Google+. Why?
-
Sometimes you need to not worry about if/then SEO factors, because that's how we end up falling into magic-trick traps. Sometimes it's truly more valuable to let other considerations be the decider.
-
I'm a bit skeptical of using bit.ly. When someone does a 301 redirect, some of the link juice gets lost (not much, but some), and using a shortener will 301 the user. As Alan stated, bit.ly is great for tracking and for keeping your tweets within the character limit.
Please don't quote me on this, but Google has its own shortener, goo.gl; perhaps Google's shortener will pass all the juice...
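The point about the extra 301 can be pictured as an added hop between the click and the destination. Here is a toy sketch of that hop (the shortened URL and mapping below are made up for illustration; a real shortener resolves the hop with an HTTP 301 response, not a lookup table):

```python
# Toy model: a shortener inserts one redirect hop before the destination.
# SHORTENER_MAP is hypothetical; a real service answers with an HTTP 301.
SHORTENER_MAP = {
    "https://bit.ly/3abc": "https://example.com/long-article-url",
}

def resolve(url, redirect_map, max_hops=5):
    """Follow redirect mappings until a URL with no further hop, counting hops."""
    hops = 0
    while url in redirect_map and hops < max_hops:
        url = redirect_map[url]
        hops += 1
    return url, hops

final_url, hops = resolve("https://bit.ly/3abc", SHORTENER_MAP)
```

Every shortened link resolves to its destination with exactly one extra hop, which is the 301 the answer above is worried about.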
-
There are many ways to track traffic. However, if you use a shortener service like bitly, you get a shorter URL for easier, cleaner distribution; you can rapidly evaluate a specific URL's performance tied directly to a specific marketing campaign; and when other people spread that shortened link, it's just as effortless to track click-through for that specific initiative, regardless of how many times it is shared. That's much easier than digging into analytics and setting up unique campaigns.
It is, of course, far from perfect, and there are many ways to get the data. Bitly is just one, and it's handy, especially with its ability to generate a unique shortened URL on the fly, which is much faster than setting up custom tracking campaigns in GA.
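For comparison, the "manual" GA route mentioned above means tagging each link with UTM campaign parameters yourself. A minimal sketch of what that tagging involves (the URL and campaign names are placeholders, not from the thread):

```python
from urllib.parse import urlencode

def build_campaign_url(base_url, source, medium, campaign):
    """Append Google Analytics UTM parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

url = build_campaign_url("https://example.com/post", "twitter", "social", "launch")
```

Doing this for every link and channel is the setup overhead that makes an on-the-fly shortener feel faster.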
-
Can't you use Google Analytics to determine where people are coming from?
-
Bitly is one of the most well-established shortener services available. It provides click-through data across several dimensions, which helps site owners track the relative success or shortcomings of the content they want to promote, and it can be used to determine what works and what doesn't from a marketing and promotion perspective.
Related Questions
-
Using 2 cache plugins: good or not?
Hi, can anyone tell me whether using two cache plugins helps or causes issues? When I used the W3 cache plugin in WordPress, I found an inline CSS issue that needed clearing, so I tried Autoptimize, but my website, Soc Prollect, crashed while using it. Is there any solution, and can anyone tell me which plugin is best for speeding up the site by removing JavaScript and inline CSS at the same time?
Technical SEO | nazfazy
-
Using 410 To Remove URLs Starting With Same Word
We had a spam injection a few months ago. We successfully cleaned up the site and resubmitted to Google. I recently received a notification showing a spike in 404 errors. All of the URLs have a common word at the beginning, injected via the spam:
sitename.com/mono
sitename.com/mono.php?buy-good-essays
sitename.com/mono.php?professional-paper-writer
There are about 100 total URLs with the same syntax, with the word "mono" in them. Based on my research, it seems it would be best to serve a 410. I wanted to know what the line of .htaccess code would be to do that in bulk for any URL that has the word "mono" after sitename.com/.
Technical SEO | vikasnwu
-
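For the bulk 410 asked about in the question above, a minimal `.htaccess` sketch would look like the following (assuming an Apache server with mod_alias enabled; test on a staging copy before deploying):

```apache
# Return "410 Gone" for any request whose path begins with /mono
RedirectMatch 410 ^/mono
```

`RedirectMatch` tests only the URL path, so `sitename.com/mono.php?buy-good-essays` is caught because its path, `/mono.php`, starts with `/mono`; the query strings don't need separate handling.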
Using 302s to redirect pages returning in 6 months
We are doing a 2-phase site redesign (in order to meet a deadline). An entire section of the site will not be available in the first phase but will come back in 6 months. The question is: do we use 301s or 302s for those pages that will be coming back? Is there a time limit on what is considered "temporary"? Thanks in advance!
Technical SEO | Max_B
-
Why use noindex, follow vs rel next/prev
Look at what www.shutterstock.com/cat-26p3-Abstract.html does with its search results page 3 for 'Abstract' (and the same for pages 2-N in the paginated series): `<meta name="robots" content="NOINDEX, FOLLOW">`
Why is this a better alternative than using next/prev, per Google's official statement on pagination (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744), which doesn't even mention this as an option? Any ideas? Does this improve the odds of the first page in the paginated series ranking for the target term? There can't be a 'view all' page because there are simply too many items. Jeff
Technical SEO | jrjames83
-
Using a Feedburner RSS link in your blog's header tag
It was suggested in Quick Sprout's Advanced SEO guide that it's good form to place your Feedburner RSS link in the header tag of your blog. Does anyone know if this needs to be done in every page header of the blog, or just the home/main/index page? Thanks
Technical SEO | Martin_S
-
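For reference on the Feedburner question above, the link in question is an RSS autodiscovery tag in the page `<head>`; it is typically placed in the site-wide header template so it appears on every page. A sketch (the feed URL and title are placeholders):

```html
<!-- RSS autodiscovery link; browsers and feed readers look for this tag -->
<link rel="alternate" type="application/rss+xml"
      title="Example Blog Feed"
      href="http://feeds.feedburner.com/example-blog" />
```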
Has anyone used a company to help promote their site?
Hi, I receive around ten emails a day claiming they can get your site into the top ten in Google. I know most are a load of rubbish, but I'm wondering if anyone has used any of these companies for a new or an old site. I am about to launch a new site after Christmas, and I'm wondering if any of these companies are worth looking at to help promote the new site instead of doing all the groundwork myself. Would love to know your thoughts.
Technical SEO | ClaireH-184886
-
Using Rel=Author with Multiple Contributors
I have multiple contributors who provide content on our page. I have created an authors page that shows the picture and bio of each author along with their Google+ profile link. Each profile link goes to the author's respective profile, where I have had them verify themselves as contributors. My question is: will Google see each of these authors and attribute the rel=author tag correctly (even though they are listed on the same profile page), or will Google only take the first person I point to for rel=author?
Technical SEO | PLEsearch
-
Should we use Google's crawl delay setting?
We’ve been noticing a huge uptick in Google’s spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (google spider vs. organic traffic.) So in other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an avg. of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us we were seeing closer to one second average render times, and often half of that. A year ago, the ratio of Spider to Organic was between 6:1 and 10:1. Is requesting a crawl-delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
Technical SEO | lzhao
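A note on the mechanics behind the crawl-delay question above: a crawl delay can be expressed in robots.txt as below, but Googlebot does not honor the `Crawl-delay` directive (Bing and Yandex do); for Google, crawl rate is adjusted through the Search Console crawl-rate setting instead. The user agent and value here are illustrative:

```
User-agent: bingbot
Crawl-delay: 5
```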