HTML snapshot creating soft 404
-
Does anyone have experience with HTML snapshots? We have a recruitment client that serves HTML snapshots for all job pages, as the pages are built with AJAX.
The pages naturally die after around four weeks (when the job vacancy runs out), and while the AJAX version of the page returns a hard 404, the HTML snapshot version returns a soft 404. How can we get the snapshot to mirror the dead page with a 404 status?
-
A side note first. Something to consider for transient content like job listings, which I have used on job sites I've worked on and which worked pretty well: the unavailable_after meta tag.
http://searchengineland.com/googles-matt-cutts-seo-advice-unavailable-e-commerce-products-186882
"The “unavailable_after” meta tag will allow you to tell Google that a page should expire from the search results at a specific time."
This way your pages would be removed from the index on the date you list, and if you have also removed the links from your sitemap etc., Google may not need to crawl them and encounter the 404 and/or soft 404 in the first place.
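For reference, the tag goes in the page's head and is usually targeted at Googlebot. A sketch only: the date format follows Google's original announcement, and the date here is made up, so substitute each vacancy's actual closing date.

```html
<!-- Tell Google to drop this job listing from the index once the vacancy closes -->
<meta name="googlebot" content="unavailable_after: 25-Aug-2014 15:00:00 GMT">
```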
A soft 404 (according to Google) means your server is not returning a 404 status code for the HTML snapshot version. I would try Fetch as Google on those pages to see what Google is seeing; that may help you diagnose the situation. It may be that your server is giving a response other than a 404 and Google is questioning it.
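A common cause is that whatever serves the snapshots keeps returning 200 with an "expired" page, while the AJAX route 404s properly. The snapshot handler needs the same expiry check. A minimal sketch of that decision in Python (illustrative only, not the client's actual stack):

```python
from datetime import date

def snapshot_status(vacancy_expiry: date, today: date) -> int:
    """Return the HTTP status the snapshot endpoint should send.

    Live vacancy -> 200 with the rendered snapshot.
    Expired      -> 404, mirroring the hard 404 of the AJAX page.
    """
    return 200 if today <= vacancy_expiry else 404

# The handler sets this status explicitly instead of always sending 200
# with an "expired" message (the classic soft 404).
print(snapshot_status(date(2014, 6, 1), date(2014, 6, 20)))  # -> 404
```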
Related Questions
-
Duplicates - How to know if trailing slashes are creating duplicate pages?
Hi, how do you determine whether trailing slashes are creating duplicate pages? Search Console is showing both /about and /about/, for example, but how do I know whether this is a problem? Thanks, James
Technical SEO | CamperConnect14
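On many servers /about and /about/ resolve to the same content, which search engines can treat as two URLs. The usual fix is to pick one convention sitewide and 301-redirect the other variant to it (and/or use rel=canonical). An illustrative helper for choosing the canonical form, not tied to any particular server or framework:

```python
def canonicalize(path: str, trailing_slash: bool = False) -> str:
    """Normalize a URL path to one canonical form.

    Pick one convention sitewide and 301-redirect the other variant to it.
    """
    if path == "/":
        return path  # the site root is always just "/"
    stripped = path.rstrip("/")
    return stripped + "/" if trailing_slash else stripped

print(canonicalize("/about/"))  # -> /about
print(canonicalize("/about"))   # -> /about
```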
Are W3C Validators too strict? Do errors create SEO problems?
I ran an HTML markup validation tool (http://validator.w3.org) on a website. There were 140+ errors and 40+ warnings. IT says "W3C validators are overly strict and would deny many modern constructs that browsers and search engines understand." What a browser can understand and display to visitors is one thing, but what search engines can read has everything to do with the code. So I ask: if a search engine crawler is reading through the code and comes upon an error like this:

…ext/javascript" src="javaScript/mainNavMenuTime-ios.js"> </script>');}

The element named above was found in a context where it is not allowed. This could mean that you have incorrectly nested elements -- such as a "style" element in the "body" section instead of inside "head" -- or two elements that overlap (which is not allowed). One common cause for this error is the use of XHTML syntax in HTML documents. Due to HTML's rules of implicitly closed elements, this error can create cascading effects. For instance, using XHTML's "self-closing" tags for "meta" and "link" in the "head" section of an HTML document may cause the parser to infer the end of the "head" section and the beginning of the "body" section (where "link" and "meta" are not allowed; hence the reported error).

and this one, which produced the same validator message:

…t("?");document.write('>');}

Does this mean that the crawlers don't know where the code ends and the body text begins, and what they should and shouldn't be focusing on?
Technical SEO | INCart
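The "cascading" failure the validator describes comes from void elements written with XHTML-style self-closing slashes in a document served as plain HTML. A minimal illustration (the attribute values are made up):

```html
<!-- XHTML syntax: valid when served as application/xhtml+xml,
     but flagged by the validator in an HTML document -->
<meta charset="utf-8" />
<link rel="stylesheet" href="style.css" />

<!-- HTML syntax: void elements take no trailing slash -->
<meta charset="utf-8">
<link rel="stylesheet" href="style.css">
```

Browsers (and Googlebot) recover from this without difficulty, which is why such errors rarely have SEO impact on their own.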
How to properly remove 404 errors
Hi, according to the SEOmoz report I have two 404 errors on my site (http://screencast.com/t/2FG8fA1dvGB). I removed them from Google Webmaster Central about two weeks ago (http://screencast.com/t/MQ8XBvrFm), but they're still showing as errors in the next report (weekly update). Is there anything else you can do about a 404, or do you just remove the URLs through GWC? Or maybe the SEOmoz data is delayed? Thanks in advance, JJ
Technical SEO | jjtech
What if the meta description tag comes before the meta title tag?
Do the search engines disregard or penalize a page if the order in the HTML is not title then description? A client's webmaster is a newbie to SEO and did just this. Suggestions?
Technical SEO | alankoen123
So I created a site for the purpose of testing SEOMOZ
The site is built in WordPress and only has one post and no other pages. Nonetheless, SEOmoz tells me I have several duplicate pages. How do I fix this in WordPress?
Permission Marketing Dentistry
http://permissionmarketingdentistry.com
Permission Marketing Dentistry
http://permissionmarketingdentistry.com/
admin | Permission Marketing Dentistry
http://permissionmarketingdentistry.com/author/admin/
Uncategorized | Permission Marketing Dentistry
http://permissionmarketingdentistry.com/category/uncategorized/
Technical SEO | dad7more
Wordpress creates duplicate Title Tags.
Pasted from GWT, the duplicate title tags:

Gold News | FalkosGold - Page 2 (2 URLs)
/category/scrap-gold-news/page/2/
/tag/scrap-gold-news/page/2/

Gold News | FalkosGold (2 URLs)
/category/scrap-gold-news/
/tag/scrap-gold-news/

Scrap Platinum | FalkosGold (2 URLs)
/gold-platinum-prices/scrap-platinum/
/tag/scrap-platinum/

Any idea how I fix this? Screenshot attached (bVJeI.png). Many thanks
Technical SEO | MangoMM
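These duplicates are the classic WordPress pattern of the same posts appearing under both /category/ and /tag/ archives. A common fix (usually via an SEO plugin rather than hand-edited templates) is to noindex one archive type and let the other rank; an illustrative snippet only:

```html
<!-- Emitted in the <head> of tag-archive pages only -->
<meta name="robots" content="noindex, follow">
```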
HTML Forms Dilute Pagerank?
Today, we have way too many links on our homepage. About 30 of them are add-to-basket links (regular HTML links) pointing to a separate application. This application 302-redirects the client back to the referring page. I have two questions: 1. Does the current implementation of our buttons dilute PageRank? Bear in mind the 302 redirect. 2. If the answer to the first question is yes, would transforming the buttons into form buttons change anything for the better? We would still 302 back to the referring page. I know Googlebot follows GET forms and even POST forms, but does it pass on PageRank to the form URL?
Technical SEO | TalkInThePark
404-like content
A site that I look after is getting lots of soft-404 responses for pages that are not 404s at all but unique content pages. The following page is an example: http://www.professionalindemnitynow.com/medical-malpractice-insurance-clinics This page returns a 200 response code and has unique content, but is not getting indexed. Any ideas? To add further information that may well impact your answer, let me explain how this "classic ASP" website performs the SEO-friendly URL mapping: all pages within the custom CMS have a unique ID, which is referenced with an ?intID=xx parameter. The custom 404.asp file receives a request, looks up the ID to find matching content in the CMS, and then Server.Transfers the visitor to the correct page. Like I said, the response codes are set up correctly, as far as Firebug can tell me. Any thoughts would be most appreciated.
Technical SEO | eseyo2
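In a Server.Transfer setup like the one described above, soft-404 flags often come down to the custom 404.asp not setting the status explicitly on every branch: a matched page must send 200 and a genuinely missing ID must send 404. A rough sketch of that routing decision in Python, purely to illustrate the logic (the CMS lookup and data are hypothetical, not the actual site's code):

```python
def route(int_id, cms):
    """Mimic the custom-404 router: look up the CMS ID behind a friendly URL.

    Returns (status, body). The crucial part is setting the status
    explicitly on both branches, so a miss is a real 404 and a hit a real 200.
    """
    page = cms.get(int_id)
    if page is None:
        return 404, "Not found"
    return 200, page

cms = {"42": "Medical malpractice insurance for clinics"}
print(route("42", cms))  # hit: status 200
print(route("99", cms))  # miss: status 404
```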