How do I disallow a subdirectory in my reports?
-
My SEOmoz reports include thousands of pages from a "localmarket" subdirectory licensed to an ad packager that I am soon turning off. How do I remove this subdirectory from my SEOmoz reports now?
-
I'll give that a try. Thanks!
-
Hi Dan,
I would suggest adding the following to the robots.txt file in your site root. After Roger's next crawl, those pages shouldn't appear in your reports anymore:
User-agent: rogerbot
Disallow: /localmarket*
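Once the rule is in place, you can sanity-check it locally with Python's standard-library robots.txt parser instead of waiting for the next crawl. This is just a sketch: the domain and paths are placeholders, and note that this parser (like many crawlers) does simple prefix matching, so a plain `/localmarket/` prefix is used here rather than the trailing wildcard.

```python
from urllib import robotparser

# Assumed robots.txt content; /localmarket/ is a prefix match,
# so it covers everything under that subdirectory.
robots_txt = """\
User-agent: rogerbot
Disallow: /localmarket/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# rogerbot is blocked from the subdirectory...
print(rp.can_fetch("rogerbot", "https://example.com/localmarket/listing.html"))  # False
# ...but can still fetch the rest of the site.
print(rp.can_fetch("rogerbot", "https://example.com/about.html"))  # True
# Other crawlers are unaffected, since the rule names rogerbot only.
print(rp.can_fetch("googlebot", "https://example.com/localmarket/listing.html"))  # True
```

Because the rule is scoped to `User-agent: rogerbot`, it only hides the pages from Moz's crawler; search engines will keep indexing the subdirectory as before.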
Related Questions
-
Unsolved: Why is Moz reporting 308s as 302s?
Got a looooong list of redirect issues in my crawl for a new client, all reported as 302s but as far as I can see they are all 308s... which is perfectly fine, right, or have I missed a memo? They even confirm the 308 status in the moz detail.
Moz Pro | Algorhythm_jT
-
Why does page report still have "Too Many Links" alert?
I'm a new Moz Pro user and see that the on-page optimization report includes "too many links" on page and references the "100 links" guideline that Google no longer endorses: http://www.youtube.com/watch?feature=player_embedded&v=l6g5hoBYlf0 Too many links is still an issue in terms of passing link juice, but not much else. Matt Cutts has stated that it is fine to have many internal links in this regard. Please correct me if I am mistaken.
Moz Pro | Shannon-PayLoadz
-
The report I tried to run on followerwonk is returning no results
http://wonk.ly/giQl this report is returning no results, even though an identical report for a competitor of ours is returning results just fine. Please help.
Moz Pro | BeatportRDP
-
Using the On-Page optimization Report Card
I am curious if there is a way to get the on page optimization report card to not show a grade of "F" for a page that I'm not targeting a particular keyword for? For example my home page has a lot of grade "F's" for keywords that are targeted on different pages.
Moz Pro | kadesmith
-
Will Social be Incorporated into the New Reports Interface
Just wondering if there are plans to integrate the social media reporting section with the new "Reports" section. Having a social section in the new Reports interface that could show weekly/monthly/yearly Facebook and Twitter interactions would be awesome to have all in one PDF report with the rest of the reporting options.
Moz Pro | noBulMedia
-
Duplicate Content and Titles in SEOMoz reports
I've had to rename some of the pages on my site and also move them to different locations. I placed a rel="canonical" on the old page pointing to the new one. The reports on my PRO Dashboard are telling me that I have Duplicate Content and Page Title errors. Do the SEOMoz automated reports take the rel="canonical" link into consideration or do I need to remove these pages and do a 301 redirect from the old to the new page?
Moz Pro | TRICORSystems
-
SEOmoz reports the statistics
SEOmoz reports the statistics, but where do I manage and improve them? Is SEOmoz simply all about statistics?
Moz Pro | webicers
-
Why is Roger crawling pages that are disallowed in my robots.txt file?
I have specified the following in my robots.txt file: Disallow: /catalog/product_compare/ Yet Roger is crawling these pages = 1,357 errors. Is this a bug or am I missing something in my robots.txt file? Here's one of the URLs that Roger pulled: example.com/catalog/product_compare/add/product/19241/uenc/aHR0cDovL2ZyZXNocHJvZHVjZWNsb3RoZXMuY29tL3RvcHMvYWxsLXRvcHM_cD02/ Please let me know if my problem is in robots.txt or if Roger spaced this one. Thanks!
Moz Pro | MeltButterySpread
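For a question like this, one quick sanity check is to test the rule against the offending URL with Python's standard-library parser. The `User-agent: *` line and the shortened URL below are assumptions for illustration (the original question doesn't show a user-agent line, and the real URL carries a long encoded segment):

```python
from urllib import robotparser

# Assumed rule set: a catch-all user-agent with the Disallow from the question.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /catalog/product_compare/",
])

# Shortened, hypothetical version of the reported URL.
url = "http://example.com/catalog/product_compare/add/product/19241/"

# Prefix matching means everything under /catalog/product_compare/ is blocked.
print(rp.can_fetch("rogerbot", url))  # False
```

If the rule does match (as it does here), errors in the crawl report usually reflect pages fetched before the rule was added, or URLs discovered via links, rather than a malformed robots.txt.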