Which tools are better: SEOmoz's tools or Bruce Clay's?
-
I've ALWAYS wanted to hear some discussion on this. Please give me your honest opinions so I can make the right decision.
-
I've used both Bruce Clay's and SEOmoz's toolsets. The one thing I really liked about Bruce Clay's tools is that I can punch in a URL and get a full analysis report within minutes; SEOmoz takes a week, which is a major drawback in my opinion. On the other hand, SEOmoz has a great Q&A forum, so that's a major plus.
-
You are most welcome, Aspirant!
In my case, the parameters are disregarded because the same page is listed 15 times, once for each sort order. Examples of the parameters involved are ?order=asc and type=authorname.
In the above case, once Google has evaluated the main page, there is no reason for them to look any further. The other pages offer the identical content sorted differently, so I instruct Google to disregard those pages.
If you wish to tell Google to ignore parameter pages, take the following steps: log into Google WMT, select your domain, then go to Site Configuration > Settings > Parameter Handling. Choose your parameter from the list (in your case ?from is not on the list, so you would add it), then change the Action to Ignore.
Without seeing your pages I am not sure if the above is the best approach for you, but if you want to use the process, here is the information. The process for Bing is very similar to Google.
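For what it's worth, the collapsing that the Ignore setting asks Google to do can be sketched in a few lines of Python. The URLs below are illustrative (not from any real site), using the sort parameters mentioned above:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters a crawler should disregard (mirrors the WMT "Ignore" setting).
IGNORED_PARAMS = {"order", "type", "from"}

def canonicalize(url):
    """Drop ignored query parameters so sorted variants collapse to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

variants = [
    "http://example.com/articles/?order=asc&type=authorname",
    "http://example.com/articles/?order=desc&type=authorname",
    "http://example.com/articles/",
]
# All three variants collapse to a single canonical URL.
print({canonicalize(u) for u in variants})
```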
-
Thank you for your reply Journeyman! Right now I have an article directory on one of my sites. SEOMoz Crawl errors are reporting the following:
www.sitename.com/articles/?from=0
www.sitename.com/articles/?from=10
www.sitename.com/articles/?from=100
www.sitename.com/articles/?from=105
etc.... www.sitename.com/articles/?from=225
The crawl errors are for duplicate page title and duplicate page content. You said, "In Google WMT and Bing I have provided instructions to disregard the parameter pages and only index the main page."
Do I need to be doing the same thing you're doing with the parameters in WMT? If so, do you have a link to Google that will show me how and explain things a little bit further about parameters?
-
Specifically in regard to crawl errors: I try to use SEO tools from a multitude of different sources to see whether the errors (if any) they return are comparable. If I see the same errors across multiple tools, I usually drill down and fix the problem at that point.
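That cross-checking step boils down to a set comparison. A tiny illustrative sketch, with made-up URL lists standing in for two tools' error exports:

```python
tool_a_errors = {"/about", "/contact", "/articles/?from=10"}
tool_b_errors = {"/contact", "/articles/?from=10", "/pricing"}

confirmed = tool_a_errors & tool_b_errors  # flagged by both tools: fix first
disputed = tool_a_errors ^ tool_b_errors   # flagged by only one: verify manually

print(sorted(confirmed))
print(sorted(disputed))
```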
-
I used some of Bruce Clay's tools a while ago; there weren't as many as you get here, though he might have more now. I found that I preferred the SEOmoz tools anyway. You still get great, valuable info from Bruce Clay Inc., but overall I'd say this is the best place for tools and data. Just my opinion.
-
How often are the tools the deciding factor in what you do?
-
My feelings are the same as Richard's.
I only use the SEOmoz tools but would like to learn about other tools which overcome some limitations of the SEOmoz tools.
For example, I don't like the Moz crawl reports; they offer no way to customize. I have category pages that can be sorted via order parameters. In Google WMT and Bing I have provided instructions to disregard the parameter pages and only index the main page. I need a crawl tool that can either pull these settings or allow me to set them in the tool.
A good crawl report is one that provides actionable data. To see the same 3000 error records every week is annoying, and it makes it more difficult to drill down to the few pages which actually require attention. It would be nice to share the crawl report with a client without having to say "just disregard those 3000 error messages as they are not important".
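Until a crawl tool supports pulling those WMT settings, one stopgap is to post-filter the exported report yourself. A minimal sketch; the row format and parameter names here are assumptions for illustration, not the actual Moz export layout:

```python
from urllib.parse import urlparse, parse_qs

# Parameters already set to "Ignore" in Google WMT / Bing.
IGNORED = {"order"}

def is_ignored_variant(url):
    """True if the URL carries a parameter the search engines already ignore."""
    params = parse_qs(urlparse(url).query)
    return bool(IGNORED.intersection(params))

def filter_report(rows):
    """Keep only crawl-error rows that still need attention."""
    return [r for r in rows if not is_ignored_variant(r["url"])]

rows = [
    {"url": "http://example.com/category/", "error": "duplicate title"},
    {"url": "http://example.com/category/?order=asc", "error": "duplicate title"},
]
# The parameter variant is dropped; only the main page remains to review.
print(filter_report(rows))
```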
-
I am giving a thumbs up to this post as I am also interested in the findings. I have only used, and plan to keep using, SEOmoz, as their tools are very good and getting better.
The biggest issue I have with SEOmoz is the lack of incorporating old tools into the new campaigns. I am sure that is being worked on.
Related Questions
-
Large site with content silos: best practices for deep-indexing silo content
Thanks in advance for any advice, links, or discussion. This honestly might be a scenario where we need to do some A/B testing.

We have a massive (5 million page) content silo that is the basis for our long-tail search strategy. Organic search traffic hits our individual "product" pages, and we've divided our silo with a parent category and then secondarily with a field (so we can cross-link to other content silos using the same parent/field categorizations). We don't anticipate or expect top-level category pages to receive organic traffic; most people search for the individual, specific product (long tail). We're not trying to rank or get traffic for searches of all products in "category X", where others are competing and spending a lot (the head). The intent of the site structure and taxonomy is to make it easier for bots and crawlers to get deeper into our content silos. We've built the pages for humans but included link structure and taxonomy to assist crawlers.

So here's my question on best practices: how do you handle categories with 1,000+ pages of pagination? In our most popular product categories there might be hundreds of thousands of products in one category. My top-level hub page for a category looks like www.mysite/categoryA; the page shows 50 products, with pagination from 1 to 1,000+. Currently we're using rel=next for pagination, and pages like www.mysite/categoryA?page=6 reference themselves as canonical (not the first/top page, www.mysite/categoryA). Our goal is deep crawl and indexation of our silo. I use Screaming Frog and the SEOmoz campaign crawl to sample (the site takes a week or more to fully crawl), and in both tools it looks like crawlers get a bit bogged down in large categories with lots of pagination. For example, rather than crawl multiple categories or fields to reach more product pages, some bots will hit all 1,000 (rel=next) pages of a single category.

I don't want to waste crawl budget going through 1,000 pages of a single category instead of discovering and crawling more categories, and I can't find a consensus on how to approach the issue. I can't have a page that lists "all"; there's just too much, so we're going to need pagination. I'm not worried about category pagination pages cannibalizing traffic, since I don't expect any. Should I make pages 2-1,000 noindex and have them canonically reference the main/first page in the category? Should I worry about crawlers going deep into pagination within one category versus getting to more top-level categories? Thanks!
Moz Pro | DrewProZ1
Tool to find competitor keyword overlap?
I want to know which competitors rank for the same things I do. Is there a tool that can give me this info?
Moz Pro | ShearingsGroup0
Is there anything SEMRush Guru or Enterprise level has that I can't do with SEOMoz Pro?
Okay, fellow in-house SEOs, I need some advice please. We are considering an upgrade from SEOmoz Pro to Pro Elite. For a business our size, this is a major investment. I need some compelling reasons why we should go with this upgrade in lieu of simply adding an upgraded membership at SEMrush, aside from more keywords tracked and more pages crawled (we only have about 5,000 pages, so I'm not sure more pages crawled is even something we need). Right now, it seems the only benefit to us would be the ability to track 2,500 more keywords. We don't need more campaigns and we don't need more pages crawled; an extra $300/month is a lot to pay for the sole benefit of adding more keywords. The reason we are considering this is that we have 6-7 product lines that are all quite different from each other, with different keywords and different competitors. All of these product lines are enormously competitive, so of course we want to track each (with competitors) separately, so that when I build a Ranking Index (which I learned how to do beautifully thanks to this post by AJ Kohn a few days ago), my Ranking Index is a truer picture of how we are doing for keywords related to specific product lines, against the right specific competitors. Because of the keyword limitation in SEOmoz Pro (1,000 max), we can't come close to covering everything. Is increasing our level at SEOmoz really the best, most affordable option to do what we want to do? I realize I might get some bias here, but we are really trying to research cost-effective options, as $4,790 is a huge investment for us. Thoughts?
Moz Pro | danatanseo0
Links don't add up
Sorry if this is obvious, but I'm new to SEOmoz. I've run an analysis for one of my pages and it's showing 624 total links, with 118 internal and 6 external. Why doesn't the sum of the internal and external links equal the total links?
Moz Pro | landmark10
Batch lookup domain authority on a list of URLs?
I found a site that describes how to use Excel to batch-look-up URLs using the SEOmoz API. The only problem is that the API times out and returns 1 if I try dragging the formula down the cells, which leaves me copying, waiting 5 seconds, and copying again. This is basically as slow as manually looking up each URL. Does anyone know a workaround?
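One workaround is to script the lookups outside Excel with an explicit pause between calls, so each request waits out the rate limit instead of timing out. A generic sketch: `fetch_authority` is a placeholder you would wire to the actual API request, and the delay should be set to whatever window the API's rate limit actually requires:

```python
import time

def fetch_authority(url):
    # Placeholder: replace with a real API request for `url`.
    return {"url": url, "domain_authority": None}

def batch_lookup(urls, delay=10.0, fetch=fetch_authority, sleep=time.sleep):
    """Look up each URL, pausing `delay` seconds between calls to stay
    under the rate limit rather than retrying after timeouts."""
    results = []
    for i, url in enumerate(urls):
        if i:
            sleep(delay)  # wait *between* requests, not after the last one
        results.append(fetch(url))
    return results

print(batch_lookup(["http://example.com", "http://example.org"], delay=0))
```

Injecting `fetch` and `sleep` as parameters keeps the pacing logic testable without real network calls.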
Moz Pro | SirSud1
404 errors in SEOMoz crawl tool
I currently have several 404 errors in the latest crawls from SEOmoz. Here is an example of the error: http://dealerplatform.com/blog/2011/10/23/videos-for-auto-dealers/www.dealerplatform.com/ In all cases the error is a result of www.dealerplatform.com being appended to the real URL. Has anyone seen this before? The site is a WordPress multisite. I don't see where this incorrect link is showing up anywhere on the website. Any advice would be helpful. Thanks
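That URL shape (a page URL with www.dealerplatform.com/ appended) is likely a link written without http://, which browsers and crawlers then resolve relative to the current page. A quick stdlib-only sketch for spotting such hrefs in a page's HTML; the sample markup is illustrative:

```python
import re

def suspicious_hrefs(html):
    """Find href values that start with 'www.' -- these resolve relative
    to the current page, producing URLs like /blog/.../www.example.com/."""
    return [h for h in re.findall(r'href="([^"]+)"', html)
            if h.startswith("www.")]

html = ('<a href="http://ok.example.com/">ok</a> '
        '<a href="www.dealerplatform.com/">broken</a>')
print(suspicious_hrefs(html))  # → ['www.dealerplatform.com/']
```

Running this over theme templates and recent posts can narrow down which template or post contains the scheme-less link.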
Moz Pro | Chris_Gregory0
Keyword Difficulty Tool
Is there a way to use the Keyword Difficulty Tool and include my own URL in the process, so that I can see (and show my client) how things look competitively across all these nice dimensions? All is well if my client's site is in the top 10, but if it isn't, how can I get the same set of metrics for a specific URL as it pertains to a specific keyword? I seem to remember it used to do this, or am I imagining things? I can't seem to get it to work this way. Thanks,
Moz Pro | seo_plus0
SEOmoz keyword rankings in campaign report
Hi. Does anybody know where the rankings summary has moved to in the campaign reports? I want to know how many keywords have moved up or down in the last week, and I can't find it anywhere!
Moz Pro | neooptic1