403 Forbidden Crawl report
-
Hi,
I am getting 403 Forbidden errors in the crawl report for some of my pages, even though the pages load fine when I visit them. My web developer told me that reports sometimes show errors when nothing is actually wrong. Also, will these errors affect SEO/rankings?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
-
I have a locksmith business website (locksmith Tampa Florida). We are facing the same issue on the main page.
A 403 Forbidden error means that the server denied access to the requested page. This can happen for a few reasons, such as:
- The user does not have permission to access the page.
- The page is not published yet.
- There is a misconfiguration on the server.
If you are getting 403 Forbidden errors on your website, first check that the pages are actually loading fine for users. You can do this by visiting the pages yourself or by inspecting them with the URL Inspection tool in Google Search Console.
If the pages load fine for users, the errors in the crawl report may be false positives caused by a temporary error while Googlebot was crawling your website, in which case they should eventually go away on their own. More often, though, a page that works in a browser but returns 403 to Googlebot is being blocked by a firewall, CDN, or security plugin that filters requests by user agent or IP range, so it is worth ruling that out before ignoring the report.
However, if the pages are not loading fine for users, then the errors in the crawl report are likely real. In this case, you need to fix the underlying issue that is causing the 403 Forbidden errors.
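A quick way to see whether the server treats Googlebot differently from a regular visitor is to request the same page twice with different User-Agent headers and compare the status codes. This is a minimal sketch using only the Python standard library; the `fetch_status` helper and the user-agent strings are illustrative, not part of any Moz or Google tool:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

# Googlebot's published desktop user-agent string and a generic browser one.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_status(url: str, user_agent: str) -> int:
    """Return the HTTP status code the server sends for this User-Agent."""
    req = Request(url, headers={"User-Agent": user_agent})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        # 403, 404, etc. are raised as exceptions but still carry a code.
        return err.code

# Example (requires network; substitute a page from your crawl report):
# print(fetch_status("https://www.medistaff24.co.uk/contact-us/", BROWSER_UA))
# print(fetch_status("https://www.medistaff24.co.uk/contact-us/", GOOGLEBOT_UA))
```

If the browser user agent gets a 200 and the Googlebot one gets a 403, the block is almost certainly in a firewall or security plugin rather than in the page itself. Note that some firewalls verify Googlebot by IP as well, so a 403 from your own machine with a spoofed user agent is suggestive but not conclusive.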
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and folders that serve the affected pages. The web server's user account (not Googlebot itself) must have read access; on Linux hosts that typically means 644 for files and 755 for directories.
- Check the robots.txt file to make sure Googlebot is not being denied access to the affected URLs. Note that a robots.txt Disallow is reported in Search Console as "Blocked by robots.txt" rather than as a 403, but it is still worth ruling out.
- Check the server configuration for rules that could be returning the 403s, such as deny directives in .htaccess or firewall/security plugins that block Googlebot's user agent or IP addresses.
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
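For the robots.txt check above, Python's standard `urllib.robotparser` module can evaluate a set of rules against a specific path without touching the server. The sample rules and the `googlebot_allowed` helper below are illustrative; paste the contents of your own /robots.txt in their place:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt contents; replace with your site's actual file.
SAMPLE_ROBOTS = """\
User-agent: *
Disallow: /wp-admin/

User-agent: Googlebot
Disallow: /private/
"""

def googlebot_allowed(robots_txt: str, path: str) -> bool:
    """Return True if the given robots.txt rules let Googlebot fetch path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", path)

print(googlebot_allowed(SAMPLE_ROBOTS, "/contact-us/"))   # True
print(googlebot_allowed(SAMPLE_ROBOTS, "/private/page"))  # False
```

One detail worth knowing: when a specific `User-agent: Googlebot` group exists, Googlebot obeys only that group and ignores the `User-agent: *` rules, which is why `/wp-admin/` would still be crawlable by Googlebot under the sample rules above.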
Will 403 errors affect SEO/rankings?
Whether 403 Forbidden errors will affect your SEO and rankings depends on which pages are involved.
If the affected pages are important pages on your website, the errors can have a negative impact: Google cannot crawl a page it is denied access to, and a page that persistently returns 403 will eventually be dropped from the index.
If the affected pages are not important, the errors are unlikely to have a significant impact on your rankings.
Either way, it is best to fix 403 Forbidden errors as soon as possible, so that Googlebot can access all of the pages on your website and the site remains crawlable and indexable.
Warm Regards
Rahul Gupta
Suvidit Academy