404 error only for PNG and BMP files
-
Hey all,
I get a 404 error only for PNG and BMP files, for example this link:
http://www.binaryoptionstrategy.com/guide/weekly-news/page/wp-content/uploads/2010/07/12.png
Any ideas?
-
Shay,
You can use a tool like Screaming Frog or Xenu to crawl your site and see which pages contain links to those images.
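If you'd rather script it, here's a minimal sketch in Python (standard library only; the sample HTML is made up for illustration) that pulls the `src` of every `<img>` tag out of a page's HTML, so you can see which page references the broken image:

```python
from html.parser import HTMLParser


class ImgSrcCollector(HTMLParser):
    """Collect the src attribute of every <img> tag on a page."""

    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)


def img_sources(html):
    """Return the list of <img> src values found in an HTML string."""
    parser = ImgSrcCollector()
    parser.feed(html)
    return parser.sources


# Sample page body (made up); in practice you would fetch each page
# with urllib.request and feed its HTML here.
sample = '<p>News</p><img src="/wp-content/uploads/2010/07/12.png" alt="">'
print(img_sources(sample))  # ['/wp-content/uploads/2010/07/12.png']
```

Run it over each page your crawler reports, and any page whose output contains the broken image path is a page you need to fix.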
-
Hey,
Yes, this is the page, and I get the 404 error after the Moz crawl...
I don't know where the link is coming from.
Thanks
-
Is this the page you are looking for?
http://www.binaryoptionstrategy.com/wp-content/uploads/2010/07/12.png
What are you trying to do? Where is the link coming from?
Related Questions
-
How can I check which inbound links to my site go to 404 pages?
I have external links coming into my site that go to 404 pages, but I can't seem to find a way to list all the broken links pointed at my website.
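One route is to export the target URLs from a backlink report (Moz, Ahrefs, or Search Console's Links report) and check each target's status code. A minimal sketch, with the status lookup injected as a function so the checker itself can be tested without the network (the URLs below are made up):

```python
import urllib.error
import urllib.request


def find_broken(targets, get_status):
    """Return the target URLs whose HTTP status is 400 or above.

    targets    -- iterable of URLs, e.g. the "Target URL" column of a
                  backlink export
    get_status -- callable url -> int HTTP status, injected so the
                  checker can be exercised without hitting the network
    """
    return [url for url in targets if get_status(url) >= 400]


def http_status(url):
    """Real status fetcher (assumes HEAD is enough; some servers need GET)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code


# Usage with real URLs: find_broken(urls_from_export, http_status)
```

Every URL the function returns is an inbound-link target worth 301-redirecting to a live page so the link equity isn't lost.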
On-Page Optimization | NickJPearse
-
Will a page be marked as 404 if you replace country-specific letters in the URL?
What I'm referring to is the replacement of Polish characters, e.g. "ł" with "l" or "ę" with "e". I believe the same applies to other similar Slavic languages.
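Changing "ł" to "l" produces a different URL string, so the old URL will 404 unless you 301-redirect it to the new spelling. If you need to generate the ASCII versions, a sketch of an explicit transliteration table (the mapping below covers Polish only and is my own assumption; note that Unicode NFKD normalisation does not decompose "ł", so a lookup table is the reliable route):

```python
# Explicit Polish -> ASCII map; str.translate applies it per character.
PL_MAP = str.maketrans({
    "ą": "a", "ć": "c", "ę": "e", "ł": "l", "ń": "n",
    "ó": "o", "ś": "s", "ź": "z", "ż": "z",
    "Ą": "A", "Ć": "C", "Ę": "E", "Ł": "L", "Ń": "N",
    "Ó": "O", "Ś": "S", "Ź": "Z", "Ż": "Z",
})


def ascii_slug(text):
    """Transliterate Polish letters to their ASCII counterparts."""
    return text.translate(PL_MAP)


print(ascii_slug("żółć"))  # zolc
```

Pair each old-slug/new-slug result into a 301 redirect rule and the old URLs won't be left returning 404.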
On-Page Optimization | Optimal_Strategies
-
Help recover lost traffic (70%) from a robots.txt error.
Our site is a company information site with 15 million indexed pages (mostly company profiles). Recently we had an issue with a server that we replaced, and in the process mistakenly copied the robots.txt block from the staging server to a live server. By the time we realized the error, we had lost 2/3 of our indexed pages and a comparable amount of traffic. This error took place on 4/7/19 and was corrected two weeks later. We submitted new sitemaps to Google and asked them to validate the fix approximately a week ago. Given the close to 10 million pages that need to be validated, so far we have not seen any meaningful change. Will we ever get this traffic back? How long will it take? Any assistance will be greatly appreciated.

On another note, these indexed pages were never migrated to SSL for fear of losing traffic. If we have already lost the traffic and/or if it is going to take a long time to recover, should we migrate these pages to SSL? Thanks,
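To keep a staging block from ever shipping to production again, it's worth adding a robots.txt check to the deploy process. A minimal sketch using Python's standard `urllib.robotparser` (the rules below are the classic staging-server block, shown for illustration):

```python
from urllib.robotparser import RobotFileParser


def blocks_everything(robots_lines):
    """True if these robots.txt rules deny crawling of / to all agents."""
    rp = RobotFileParser()
    rp.parse(robots_lines)          # accepts an iterable of rule lines
    return not rp.can_fetch("*", "/")


# The staging-server block that must never reach production:
staging = ["User-agent: *", "Disallow: /"]
print(blocks_everything(staging))  # True
```

Running a check like this against the live robots.txt right after each deploy (and failing the deploy if it returns True) turns a two-week outage into an immediate alert.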
On-Page Optimization | akin67
-
Fixing Index Errors in the new Google Search Console - Help
Hi, so I have started using the new Search Console, and for one of my clients there are a few 'Index Coverage Errors'. In the old version you could simply analyse, test, and then mark any URLs as fixed. Does anyone know if that is possible in the new version? There are options to validate errors but no 'mark as fixed' option. Do you need to validate the errors before you can fix them?
On-Page Optimization | daniel-brooks
-
Thousands of 404s showing up from a WordPress blog!?!?
Hey guys, I have recently seen thousands of 404 errors thrown up from my WordPress blog in Google Search Console. These are URLs trying to link (I'm not sure from where) to other parts of my site, but they are not relative to the site root; in fact they are a mix of random folders/subfolders and pages on my site. E.g.:

http://www.MYSITE.co.uk/blog/how-to/driving-to-the-alps/attachment/2013-land-rover-range-rover-evoque-front-snow-1/st-martin-de-belleville/chalet-st-martin-de-belleville/ski-holidays/ski-holidays/summer/st-martin-de-belleville/summer/your-stay-st-martin-de-belleville.html

This is a link to a picture on the blog:

http://www.MYSITE.co.uk/blog/how-to/driving-to-the-alps/attachment/2013-land-rover-range-rover-evoque-front-snow-1/

And the rest of it is finding its own way there! Any ideas? This is WordPress, by the way. Cheers, Paul.

p.s. I got no help from the WordPress community so am posting here!
p.p.s. I forgot to mention that Moz is reporting these issues too, but running Screaming Frog does NOT show any 404s at all on my site...
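Those ever-growing URLs are the classic symptom of links written as relative paths without a leading slash: resolved against a page that lives at a deep path, each one appends to the current directory instead of starting from the root. A sketch of the mechanism with `urllib.parse.urljoin` (the URLs are simplified stand-ins for the ones in the question):

```python
from urllib.parse import urljoin

# A page served at a deep path...
base = "http://www.example.co.uk/blog/how-to/driving-to-the-alps/"

# ...containing a link written WITHOUT a leading slash resolves
# relative to the current directory, so the path keeps growing:
print(urljoin(base, "ski-holidays/summer/your-stay.html"))
# http://www.example.co.uk/blog/how-to/driving-to-the-alps/ski-holidays/summer/your-stay.html

# The same link written root-relative resolves correctly from anywhere:
print(urljoin(base, "/ski-holidays/summer/your-stay.html"))
# http://www.example.co.uk/ski-holidays/summer/your-stay.html
```

So the fix is usually in the theme or plugin emitting the links: make them root-relative (leading `/`) or absolute, and the chained 404s stop being generated.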
On-Page Optimization | SnowTrippin
-
Internal Links and Link Juice Analysis - Too Many Links Error
Howdy! I have an analysis question related to internal links/link juice. Here is the general link setup of our site:

1. All site pages (including the home page): We have drop-down "mega" menus in the header of every page linking to various sub-categories on the site. Because of this, the header of every page contains a few hundred links to various pages on our site.
2. Product pages: The header links mentioned above, plus we list the keywords for that particular product, with each keyword linked back to our search results page for that keyword.

In general, Moz is telling us we have between 200-300 links on each product page. Currently, our search results pages are ranking higher and showing up in search more than our actual product pages. Based on the above, here are some thoughts:

1. Should we AJAX in the header links so that they aren't visible to search engines? Or should we AJAX them in only on pages other than the home page?
2. Should we get rid of the keyword links back to the search results pages that are on the product pages?

What effect would these changes actually have? Do they just improve crawling, or are there other benefits? We have hundreds of thousands of products, so if we were to make changes like these, could we experience negative results? Thanks for your help! Craig
On-Page Optimization | TheCraig
-
Deleted pages still registering as 404 pages.
I have an all-HTML site that I can only work on through FTP. The previous marketing company ran a script that built thousands of location landing pages, but all they did was change the tags, headers, and keywords on each page; other than that, they are all duplicate pages. I removed them, but Google is still reporting them as 404 pages. How do I tell Google those pages don't exist? Or do I just need to let the bots crawl the site a few times until they see that eventually?
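Since the pages are gone for good, one option is to answer 410 Gone instead of 404: a 410 tells Google the removal is deliberate and permanent, and the URLs tend to be dropped from the index faster. Assuming the site runs on Apache (common for FTP-managed HTML hosting), a fragment like this in `.htaccess` would do it; the `/locations/` pattern is hypothetical and needs to be adjusted to match the generated landing pages:

```apache
# Hypothetical pattern - adjust to match the generated landing pages.
# "gone" makes Apache answer 410 Gone rather than 404 Not Found.
RedirectMatch gone ^/locations/.*$
```

If there's no URL pattern to match, letting Googlebot keep seeing the 404s also works; the pages just linger in the index somewhat longer before being dropped.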
On-Page Optimization | SwanJob
-
404 crawl errors: domain appended to every URL
We have 187 404 crawl errors. Every URL on the site produces a 404 error with the domain appended to the end of the URL, like this: http://www.domain.com/[.....]/www.domain.com. For example:

gestoriabarcelona.com/www.gestoriabarcelona.com
gestoriabarcelona.com/tarifas/www.gestoriabarcelona.com
gestoriabarcelona.com/category/noticias/page/7/www.gestoriabarcelona.com
gestoriabarcelona.com/2012/08/amortizacion-de-unaconstruccion/www.gestoriabarcelona.com
[..]

I don't know where to look to solve these errors. Can anyone help me? Thanks
On-Page Optimization | promonet
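This pattern usually means a template somewhere contains a link written as `href="www.gestoriabarcelona.com"` with the `http://` missing. Without a scheme, crawlers treat it as a relative path and resolve it against whatever page they're on, which produces exactly the URLs above. A sketch of the mechanism with `urllib.parse.urljoin`:

```python
from urllib.parse import urljoin

# A link written as href="www.gestoriabarcelona.com" has no scheme,
# so it is resolved as a path relative to the current page:
print(urljoin("http://gestoriabarcelona.com/tarifas/",
              "www.gestoriabarcelona.com"))
# http://gestoriabarcelona.com/tarifas/www.gestoriabarcelona.com

# With the scheme present, it is treated as an absolute URL:
print(urljoin("http://gestoriabarcelona.com/tarifas/",
              "http://www.gestoriabarcelona.com"))
# http://www.gestoriabarcelona.com
```

So the fix is to search the theme/templates (footer and header are the usual suspects, since the bad URL appears on every page) for a link starting with `www.` and add the missing `http://`.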