403 Forbidden Crawl report
-
Hi,
I am getting 403 Forbidden errors in my crawl report for some of my pages, even though the pages load fine in a browser. My web developer told me that reports sometimes show errors when nothing is actually wrong. Will these errors affect SEO/ranking?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
-
I have a locksmith business website serving Tampa, Florida. We are facing the same issue on the main page.
A 403 Forbidden error means that the server denied access to the requested page. This can happen for a few reasons, such as:
- The server's file or directory permissions deny the request.
- A firewall, CDN, or security plugin is blocking the request, often by user agent or IP address.
- There is a misconfiguration on the server (for example, a deny rule in .htaccess).
If you are getting 403 Forbidden errors on your website, first confirm that the pages really do load for users. Visit the pages yourself in a browser, and use the URL Inspection tool in Google Search Console to see exactly what Googlebot receives when it fetches the page.
If the pages load fine in a browser but Googlebot keeps reporting 403s, the server is probably treating bot traffic differently from browser traffic, for example a security plugin or firewall blocking Googlebot's user agent or IP range. A one-off 403 can be a transient server hiccup and will usually clear on the next crawl, but persistent 403s should be investigated rather than dismissed as false positives.
If the pages are not loading for users either, the errors are real, and you need to fix the underlying issue that is causing the server to respond with 403 Forbidden.
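One quick way to see whether the server treats Googlebot differently from a browser is to request the same page with two different user-agent strings and compare the status codes. This is just a sketch using Python's standard library; the example URL is one of the pages reported above, and actually running the commented-out calls requires network access:

```python
import urllib.request
import urllib.error

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url, user_agent):
    """Return the HTTP status code the server sends for this user agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 403, 404, etc. are raised as HTTPError

# Example usage (needs network access, so not run here):
# fetch_status("https://www.medistaff24.co.uk/contact-us/", "Mozilla/5.0")
# fetch_status("https://www.medistaff24.co.uk/contact-us/", GOOGLEBOT_UA)
```

If the browser-like request returns 200 while the Googlebot request returns 403, something on the server (a firewall or security plugin) is filtering by user agent. Note that this does not reproduce Googlebot's IP addresses, so an IP-based block can still pass this test and fail for the real crawler.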
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and folders that contain the affected pages. The web server process must be able to read them; on Linux hosts this typically means world-readable files (644), traversable directories (755), or correct ownership by the web server user.
- Check your firewall, CDN, and any security plugins. These often block requests by user agent or IP address, and an overly aggressive rule can catch Googlebot. (A robots.txt Disallow does not produce a 403; Search Console reports that separately as "Blocked by robots.txt", but it is still worth reviewing the file.)
- Check the server configuration, such as deny rules in .htaccess or the virtual-host config, for anything that could be returning 403 for these URLs.
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
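Python's standard library can evaluate robots.txt rules offline, which makes it easy to confirm that Googlebot is not disallowed before you look elsewhere. The rules below are an illustration; paste in the contents of your own robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Example rules; replace with the contents of your own robots.txt.
example_rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(example_rules.splitlines())

# can_fetch() answers: may this user agent crawl this URL under these rules?
print(parser.can_fetch("Googlebot", "https://www.example.com/contact-us/"))   # True
print(parser.can_fetch("Googlebot", "https://www.example.com/private/page"))  # False
```

Remember that even a `True` here only means robots.txt permits the crawl; it says nothing about whether the server will then answer with 200 or 403.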
Will the 403 errors affect SEO/ranking?
Whether 403 Forbidden errors affect your SEO and ranking depends on which pages are involved.
If the pages returning 403s are important pages on your site, the errors can hurt: Googlebot cannot crawl them, and pages that stay inaccessible will eventually drop out of the index.
If the affected pages are unimportant, or are deliberately restricted (such as admin or login pages), the errors are unlikely to have a significant impact.
Either way, it is best to fix unintended 403 errors as soon as possible, so that Googlebot can reach every page you want crawled and indexed.
Warm Regards
Rahul Gupta
Suvidit Academy