Why doesn't everyone just purchase a .org TLD?
-
Hi,
I am new-ish to SEO, and something just dawned on me today. I have read in many places that .org domains rank higher (even if slightly) than .coms.
Then why doesn't everyone just purchase .org TLDs?
For example, in my industry, most good .com domain names are taken, but .orgs are almost all available. Why not purchase a .org and capitalize on exact-match search results?
SEOmoz is a .org and it's far from being a non-profit.
-
vishalkhialani, a quick note that only higher education institutions can get a .edu extension, it's not available to just anyone who wants to pick one up.
-
Thank you for your detailed reply Ryan.
I see what you're saying. I was thinking of .org domains mainly for the exact search term match domain names. If .org had any other inherent advantages, it would be a small bonus.
Since on-page is a very strong factor for my target keywords, I think this might give me some advantages.
-
Hi Elad.
Alan's answer is 100% correct. A .org site has absolutely no inherent value greater or less than a .com site with respect to search engine ranking. In fact, the domains .net, .info, .edu, .gov, etc. all have the same value: zero. The value they gain comes from building your site, adding content, and earning links.
Where a particular domain has increased value is in public perception. A .com is seen as the legitimate business domain, which is how the domain was intended. Think of any major business such as McDonalds, Walmart, Facebook, Google, or AT&T, and simply add a .com to it: you will land on the company's site. That is not the case with any other domain.
In that sense, .org is seen as for non-profits, .edu for educational institutions, and so forth. This is the public perception, and it is by design. If you attempt to run a .org as a commercial site, you are likely to lose some traffic because some people are unwilling to conduct normal commercial business (i.e. shop online) with a .org site. SEOmoz pulls it off nicely, in large part because of all the free SEO offerings: blog articles, Q&A, tools, etc. The basic services are offered for free, and users can pay for upgrades. This business model, combined with an exceptionally friendly organization and customer service, works, but most businesses would not be able to pull it off.
With respect to exact matches, the Exact Match Domain (EMD) has been devalued by Google, yet it remains ridiculously overvalued by people who do not understand SEO. The domain name is one of over 200 ranking factors. You will find all the best names, such as "insurance.org", have been taken. If a name is left, it is because no one else wants it. The bottom line: the amount of traffic you can obtain with an EMD is not worth the effort it takes to provide the content and backlinks to make it work. You will receive a ranking boost for the exact-match search, but not for the rest of the searches for your site.
You clearly have a firm belief that a .org site is advantageous. I am certain it is not, but feel free to purchase the domain and prove us all wrong. You will clearly get a bargain, as there are plenty of domains available.
-
Another viewpoint: why don't you think about the end user?
What are you selling, or what service are you providing?
Example: if you are selling consultancy services, I would go for .com; for educational content, .edu.
Why? Because of linkbait. Other .edu sites will link to a .edu, but a .com or .org might not get those links.
-
That's just not true, as sad as it may be. $6.99 on GoDaddy, no questions asked. Even cheaper than a .com.
-
I know, my question is why not use it anyway?
-
Well... if two sites have more or less the same level of trust in the eyes of Google, I am betting the .org will get a little nudge.
-
Even if that is true (and I'm not sure it is), I was thinking of getting .org for the exact search match, more than for the .org-ness of them.
-
Because you can't get a .org TLD easily.
-
Whatever is said and done, .org is considered more for non-profits and .com more for commercial sites, even if seomoz.org has it the other way around.
-
It's not true that .org gets higher rankings.
-
.orgs and .edus do not rank higher just because they're .orgs or .edus. They rank high when they are truly worthy sites with content of tremendous value, and when they earn trust signals on a large scale naturally as a result of the quality they offer.
.orgs rank when they're laser-focused on the topic central to their non-profit mission. They earn links simply by offering some significant positive contribution to the world, and they earn social media mentions for the same reason. People who care about the topic the site focuses on naturally want to share it, point to it, and discuss it.