I keep getting Authentication Failed on the API
-
I have the credentials in the URL correctly, but it continues to fail authentication. I will not post them here, obviously, but is there a problem with the API currently? I tried creating new credentials. Also, I have used this before, so I am sure it is not a problem with the credentials.
I somehow managed to get Chrome to show the data. Firefox will not, and the code I have written also returns Authentication Failed. This is a bug on your end. Please fix it ASAP.
-
Hey Carmelo,
It looks like you are encountering a bit of a different issue. It sounds like you are receiving the error when you try to connect with Google Analytics rather than when you try to make an API call. I'm really sorry about that.
Google is currently having some authentication issues on their end. The issue seems to be clearing up, and we are monitoring it as it resolves. If you are still encountering the issue, please contact our Help Team here: http://moz.com/help/contact for further assistance.
Chiaryn
-
Hi, I'm having the same issue. I just signed up for chessmazeinternational and am getting the same error back from Analytics:
Remote-authentication.failure
Can you assist?
-
Colum,
You should not be regenerating your secret key so often; it's not doing you any favors, and it's just making it more difficult to help you.
The issue you are having is with how you are generating your expires and signature values. I tested the call you posted in the forum that was giving you a 401, and the signature and expires did not line up. I'm not going to put your information here, for your privacy, but the problem is definitely with your code and not with your API key or the API itself.
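For context, the API's signed authentication derives the signature from the access ID and an expiry timestamp, so the two values must be generated together from the same inputs. A minimal sketch of that step in Python (the helper name is illustrative, and the exact scheme should be double-checked against the sample code on your account):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def moz_signature(access_id: str, secret_key: str, expires: int) -> str:
    # The string to sign is the access ID and the Unix expiry timestamp,
    # joined by a newline. The signature is the HMAC-SHA1 of that string
    # keyed by the secret key, base64-encoded and URL-escaped.
    string_to_sign = f"{access_id}\n{expires}"
    digest = hmac.new(secret_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha1).digest()
    return quote(base64.b64encode(digest).decode("ascii"))
```

The same access ID and expires values that were signed must appear verbatim in the request URL; if either differs from what went into the signature, the API returns a 401.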
I hope that makes sense.
Thanks,
Joel
-
I tried the sample code and it worked; tried mine and it broke; changed back to the sample code and it broke again. I also generated a new key (I'll try that in 30 minutes) in case it was just not updating in the DB. As for waiting 30 minutes: I previously waited 48 hours and it still did not work.
-
Hey Colum,
That forum you are looking at is a bit old and I believe we have resolved the authentication issue, so it isn't an overarching problem. I also was able to grab the sample code on your account and get data back when I ran the code in my browser, so I would recommend trying to run a call again. If you regenerated your credential and still received the error, it may be that the new credentials hadn't updated yet in our database, since it can take up to 30 minutes for the database to recognize the new credentials.
I would suggest trying both the sample code on your account and a regular call to see if either are still giving you the error message. If you are no longer getting the error message on the sample call, but you are still getting it on the other call, it may be a formatting issue. I'm definitely seeing data for the sample call now, so it shouldn't be any issue with the actual credentials.
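Because regenerated credentials can take up to 30 minutes to propagate, a simple retry loop helps separate a propagation delay from a real formatting bug. A sketch, assuming `make_call` performs the signed request and returns the HTTP status code (the helper and the timings are illustrative):

```python
import time

def retry_until_authorized(make_call, max_wait=1800, interval=300):
    # Retry on 401 (Unauthorized) for up to max_wait seconds, in case
    # newly regenerated credentials have not propagated yet. Any other
    # status (success or a different error) is returned immediately.
    waited = 0
    while True:
        status = make_call()
        if status != 401 or waited >= max_wait:
            return status
        time.sleep(interval)
        waited += interval
```

If the call still returns 401 after the full wait, the credentials are almost certainly fine, and the request itself (signature, expires, or URL formatting) is the likely culprit.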
I hope this helps. Please let me know if you are still seeing any issues.
-
I did not know that was there. There seem to be others with the same problem, so it's definitely a bug: https://seomoz.zendesk.com/entries/21886401-Another-401-Unauthorized
-
Good morning!
While I don't know the answer, maybe I can help! Have you tried posting this over at the API Forum? https://seomoz.zendesk.com/categories/6328-SEOmoz-APIs-and-Extras. That is usually the best place for API issues and questions. If you leave this question open too, we can go at this from two angles.
Related Questions
-
Getting a URL Unaccessible on the page grader
I'm optimizing a site for a financial advisor; here is the site: http://www.mattkeenancfp.com. I am getting the message "that URL is unaccessible" when I try to use the on-page grader. This is an emerald website too; I'm not sure if that has any effect on anything, though.
Moz Pro | ryanbilak
-
How to get organized for Freelance SEO
Hi there, I'm fairly new to SEO but have learned a substantial amount in the last 8 months or so. I currently run my own web design company but want to add a new SEO service to the list. My question to you all is this: how do other SEOers organise their week for clients with SEO needs? Do you dedicate a day per client, or divide your day up into hourly segments per client? What I'm looking for is a rough guide as to how I should get organized for the new service that I want to provide. How do you monitor what you have been doing with a client? There are so many questions that I want to ask, but I feel I may end up driving people crazy. Any help/advice will be welcomed with open arms! Many thanks, Alex
Moz Pro | SeoSheikh
-
Roger keeps telling me my canonical pages are duplicates
I've got a site that's brand spanking new that I'm trying to get the error count down to zero on, and I'm basically there except for this odd problem. Roger got into the site like a naughty puppy a bit too early, before I'd put the canonical tags in, so there were a couple thousand 'duplicate content' errors. I put canonicals in (programmatically, so they appear on every page), waited a week, and sure enough 99% of them went away. However, there are about 50 that are still lingering, and I'm not sure why they're being detected as duplicates. It's an ecommerce site, and the duplicates are being detected on the product pages, but why these 50? (There are hundreds of other products that aren't being detected.) The URLs that are 'duplicates' look like this according to the crawl report: http://www.site.com/Product-1.aspx and http://www.site.com/product-1.aspx, and so on. Canonicals are in place, and have been for weeks, and as I said there are hundreds of other pages just like this not having this problem, so I'm finding it odd that these ones won't go away. All I can think of is that Roger is somehow caching stuff from previous crawls? According to the crawl report these duplicates were discovered '1 day ago', but that simply doesn't make sense. It's not a matter of messing up one or two pages on my part either; we made this site to be dynamically generated, and all of the SEO stuff (canonical, etc.) is applied to every single page regardless of what's on it. If anyone can give some insight I'd appreciate it!
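Since the flagged pairs differ only in letter case, one quick sanity check is to group the crawl report's URLs case-insensitively and see exactly which ones collide. A sketch, assuming you've exported the crawled URLs to a list:

```python
from collections import defaultdict

def case_duplicates(urls):
    # Group URLs that are identical apart from letter case. Each group
    # with more than one member is a likely duplicate-content pair that
    # a canonical tag (or a redirect to one casing) should collapse.
    groups = defaultdict(set)
    for url in urls:
        groups[url.lower()].add(url)
    return {key: members for key, members in groups.items()
            if len(members) > 1}
```

A 301 redirect from the mixed-case variant to a single canonical casing also removes the duplicate at the source, rather than relying on the crawler honoring the canonical tag.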
Moz Pro | icecarats
-
What do you get with mozpoints?
What is the point of collecting MozPoints? I read that you are able to purchase features, but what other perks are there to collecting MozPoints?
Moz Pro | ReadyArtwork
-
How do I get the Page Authority of individual URLs in my exported (CSV) crawl reports?
I need to prioritize fixes somehow. It seems the best way to do this would be to filter my exported crawl report by the Page Authority of each URL with an error/issue. However, Page Authority doesn't seem to be included in the crawl report's CSV file. Am I missing something?
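One workaround is to fetch Page Authority separately (for example, via the URL metrics endpoint) and join it onto the exported rows yourself. A sketch of the join step, assuming the crawl CSV has a `URL` column and `pa_by_url` was fetched separately (the fetch itself is not shown):

```python
def merge_page_authority(crawl_rows, pa_by_url):
    # crawl_rows: list of dicts parsed from the exported crawl CSV
    # pa_by_url: {url: page_authority} fetched separately for those URLs
    merged = [dict(row, page_authority=pa_by_url.get(row["URL"], 0))
              for row in crawl_rows]
    # Highest-authority pages first, so their errors get fixed first.
    return sorted(merged, key=lambda row: row["page_authority"],
                  reverse=True)
```

Writing the merged rows back out with `csv.DictWriter` then gives a prioritized spreadsheet with the missing column filled in.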
Moz Pro | Twilio
-
How to get seomoz to re-crawl a site?
I had a lot of duplicate content issues and have fixed all the other warnings. I want to check the site again.
Moz Pro | adamzski
-
API Key
I am a Pro user and I am trying to find a way to create an SEOmoz API key but cannot find how to do...
Moz Pro | netbuilder