SEOBook RankChecker Works at one location but not another
-
I use SEOBook's RankChecker to spot-check keywords for clients and prospects. I like the tool quite a bit, but it recently stopped working at our office (instead of rankings, I just get a series of dashes), while it works fine from my home computer. I suspect it may have to do with our company's firewall.
Anyone have any thoughts?
-
- Do you have the most recent version of the extension installed at your work?
- Have you turned off Google Instant?
- Do Google results render fine, or is there a captcha on them? If there's a captcha, have you increased the delay between queries?
- It could also be an issue with the firewall or security software, but these are far less common culprits than having no delay between searches or conflicts with Google Instant.
- Another possibility is a conflicting extension in Firefox.
-
Aaron,
Thanks for your response. I have installed what I believe is the latest version (1.8.21). I have tried to turn off Google Instant, although this isn't as easy as I thought: they removed the ability to turn it off and replaced it with a "never show instant results" checkbox, and even when you check this, you still see instant results.
I have the delay between queries set to 12 seconds, so I don't think that is the issue. My next steps will be to check the firewall and then remove my current Firefox extensions to see if any conflict.
Eric
Related Questions
-
Same URL names in one domain
Hi All,
I have 9 different subdirectories for languages in the same domain, for example:
www.example.com/page.html
www.example.com/uk/page-uk.html
www.example.com/es/page-es.html
We are implementing hreflang tags for the languages. I know it is better to translate URLs, but we won't for now because of all the non-ASCII characters. But we are thinking of getting rid of the language suffixes (-uk, -es) in the URLs, so it would be:
www.example.com/page.html
www.example.com/uk/page.html
www.example.com/es/page.html
Would this be a problem, having the same page names even though they are in different subdirectories? Would we need to add canonical tags, at least for the main domain URLs (www.example.com/page.html)?
Thank you, Rachel
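For reference, a minimal hreflang sketch for the structure described, using the example URLs from the question (the language/region codes here are assumptions; adjust them to the actual target markets). Each language version carries the full set of alternates, including a reference to itself:

```html
<!-- Placed in the <head> of every language version of the page -->
<link rel="alternate" hreflang="en" href="http://www.example.com/page.html" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/page.html" />
<link rel="alternate" hreflang="es" href="http://www.example.com/es/page.html" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/page.html" />
```

Note that identical file names in different subdirectories are still distinct URLs, so the annotations above are unambiguous.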
Technical SEO | RaquelSaiz
-
Do 301s still work after hosting is discontinued?
I am in the process of phasing out a website that has been acquired by another company. Its web pages are being 301 redirected to their counterparts on the website of the company that has acquired them. How long should I maintain the hosting of the phased out website? Technically, do 301s still work after the hosting has been discontinued? Thanks, Caro
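On the technical side: a 301 is just an HTTP response the old server sends for each request, so it cannot be served once that hosting stops answering. As a sketch of what has to stay live, here is what such redirects typically look like in an Apache .htaccess on the phased-out site (assuming Apache; the paths and acquirer domain are placeholders):

```apache
# These rules only work while this server is still up and answering requests.
Redirect 301 /old-page.html https://acquirer.example.com/new-page.html
RedirectMatch 301 ^/blog/(.*)$ https://acquirer.example.com/blog/$1
```

Once the hosting (or the DNS pointing at it) is discontinued, requests to the old URLs simply fail rather than redirect.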
Technical SEO | Caro-O
-
John Mueller says don't use Schema as it's not working yet, but I get markup conflicts using Google's markup tool
I recently watched John Mueller's Google Webmaster Hangout [Dec 5th]. In it he mentions to a member not to use Schema.org as it's not working quite yet, but to use Google's own markup tool, the Structured Data Markup Helper. This I have done, and one of the tags I've used is 'author'. However, if you use Google's Structured Data Testing Tool in GWMT you get an error saying the following: Error: Page contains property "author" which is not part of the schema. Yet this is the tag generated by their own tool. Has anyone experienced this before? And if so, what action did you take to rectify it and make it work? As it stands I'm considering just removing this tag altogether. Thanks, David
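One common cause of that error is the "author" property being declared outside an item of a type that actually defines it (author belongs to CreativeWork types such as Article, not to the bare page). A minimal microdata sketch of the nesting the testing tool expects (names and headline are placeholders):

```html
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Example headline</h1>
  <!-- author must be nested inside the Article item, not declared at page level -->
  <span itemprop="author" itemscope itemtype="https://schema.org/Person">
    <span itemprop="name">David</span>
  </span>
</article>
```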
Technical SEO | David-E-Carey
-
Cannot work out why a bunch of urls are giving a 404 error
I have used the Crawl Diagnostic reports to greatly reduce the number of 404 errors, but there is a bunch of 16 URLs that were all published on the same date and share the same referrer URL, and I cannot see the wood for the trees as to what is causing the error.
**The 404 error links have the structure:** http://www.domainname.com/category/thiscategory/page/thiscategory/this-is-a-post
**The referrer structure is:** http://www.domainname.com/category/thiscategory/page/2/
Any suggestions as to how to unravel this would be appreciated.
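A URL pattern like this, where the category segment repeats after /page/, often comes from a relative href (missing its leading slash or domain) being resolved against a paginated URL. Assuming that is what happened here, a quick sketch to flag URLs matching the pattern from the question:

```python
import re

def looks_like_relative_link_bug(url):
    """Flag URLs where the category segment is repeated after /page/.

    This shape usually means a relative link was resolved against a
    paginated category URL instead of the site root.
    """
    # \1 back-references the captured category segment
    return re.match(r"https?://[^/]+/category/([^/]+)/page/\1/", url) is not None
```

Running this over the crawl export should separate the 16 broken URLs from their valid referrers.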
Technical SEO | Niamh2
-
Can You 301 Unwanted Links to Another Site?
I am trying to clean up my link profile and have noticed that I have a lot of crappy inbound links pointing to some of my old pages, and those old pages have since been 301'ed to current pages. My question is: is it worth trying to 301 those old pages, and thus those crappy links, to another website? Would this do anything to clean up my link profile?
Technical SEO | red6marketing
-
Web config redirects not working where a trailing slash is involved
I'm having real trouble getting working redirects in place for a site we're re-launching with a modified URL structure.
Old URL: http://www.example.com/example_folder/
New URL: http://www.example.com/example-of-new-folder/
Now, where the old URLs have a trailing slash, the web.config simply will not accept it. It says the URL can start with a slash but not end with one. However, many of my URLs do end with a slash, so I need a workaround. These are the rules I'm putting in place: <location path="example_folder/"></location> Thanks
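One common workaround, sketched here on the assumption that the IIS URL Rewrite module is installed: unlike `<location path>`, a rewrite rule matches on a regex, so a single rule can cover the URL both with and without the trailing slash (folder names taken from the question):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Matches both /example_folder and /example_folder/ -->
      <rule name="example_folder-redirect" stopProcessing="true">
        <match url="^example_folder/?$" />
        <action type="Redirect" url="/example-of-new-folder/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```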
Technical SEO | AndrewAkesson
-
Duplicate content on the one domain (related to country targeting)
Hi - We have a client with a TLD that they wish to use to target several markets via .com/au, .com/us and .com/sg. Each will use duplicate content with slight variations to cater for the local market (spelling, industry jargon). They seem reluctant to register a TLD for each target market (which was our suggestion), and I am wondering what SEO penalties would apply for having a majority of duplicate content on the same domain – perhaps using subdomains would be better? Thanks!
Technical SEO | E2E