Subdomain Research Tool
-
Does anybody know of a research tool that can track the number of subdomains on a root domain?
Maybe there is a way to manipulate a Google search to display the different subdomains that are indexed?
-
Cool, looking forward to the ad-hoc query tool!
-
Thanks for the update Jon! Looking forward to the updates to come and to being able to manipulate the data in new ways. Should be exciting trying to figure out new ways to explore things like subdomain counts.
Thanks again!
-
Hey guys!
So I spoke to the MozScape team and right now this is not possible. Although in theory we have the data to answer this question, due to the sheer size of the dataset the engineering teams have to make data structure and optimization decisions that favor certain use cases (e.g. 'show me all of the external followed links for a root domain'). Currently MozScape is not optimized to answer the use case 'show me all of the subdomains on a given root domain'.
However, as you may know, we are working on index updates that are going to change the way we store data. This is a huge project, but once it is completed we will be able to run ad-hoc queries against our data and solve use cases like this.
Hope this helps!
Jon
-
This would be a very handy addition. As Michael has said, using search qualifiers is no quick (or absolute) solution. This would be a great tool for site audits, which is exactly what I was looking for it for. It would also be useful for checking other sites (that have been crawled by Moz) to quickly find blog subdomains or other-language subdomains.
Anyway, thanks for the response - looking forward to seeing if this becomes an available tool!
-
Thanks Michael - great idea! I could see this fitting in the MozBar, maybe in OSE. Let me do some digging into how our index is structured and I will get an update back up here on feasibility.
-
That's a good question... I've not seen a tool that magically does all of that, but Moz could certainly get it from the data they gather when they crawl. I'll pass that idea along to Jon White.
I would use this tool myself during site audits, when I'm looking to see if the client's site has subdomains other than www that might be worth consolidating onto the www subdomain.
Today, I do it arduously with Google site: -inurl queries, e.g.
site:acme.com -inurl:www -inurl:blog
and when I see a new subdomain appear, e.g. news.acme.com, I append -inurl:news to the site: search.
This doesn't work if the client has decided that the www-less version of their domain is their preferred one (there's no 'www' in those URLs to filter out with -inurl, so the main site's own pages swamp the results)... in this case, I'm totally SOL.
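For anyone who wants to script the query-building step described above, here is a minimal sketch. It only assembles the Google query string; acme.com and the subdomain list are placeholder examples, and you would still run the searches (and spot new subdomains) by hand or through whatever search API you have access to.

```python
# Minimal sketch: build the next "site: -inurl:" query from the subdomains
# found so far. The domain and subdomains below are example values only.

def build_subdomain_query(root_domain, known_subdomains):
    """Return a Google query that hides already-discovered subdomains."""
    parts = ["site:" + root_domain]
    parts += ["-inurl:" + sub for sub in known_subdomains]
    return " ".join(parts)

# After spotting www, blog and news, the next query to paste into Google is:
# site:acme.com -inurl:www -inurl:blog -inurl:news
print(build_subdomain_query("acme.com", ["www", "blog", "news"]))
```

Each time a new subdomain shows up in the results, add it to the list and re-run the search until nothing new appears.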