HTML and CSS errors: what do SE spiders do if they come across coding errors? Do they stop crawling the rest of the code below the error?
-
I have a client who uses a template to build their websites (no problem with that). When I ran the site through the W3C validator it threw up a number of errors, most of which were minor, e.g. missing close tags, and I suggested they fix them before I start their off-site SEO campaigns.
When I spoke to their web designer about the issues, I was told that some of the errors were "just how it's done." So if that's the case, but the validator still registers the error, do the SE spiders ignore them and move on, or does it penalize the site in some way?
-
Ryan, thanks so much for taking the time to answer, and so comprehensively too. I really appreciate it.
My client came around after I suggested that getting quality backlinks to a website full of coding errors was like hanging a crystal chandelier in a toilet, and that they were tying one of my hands behind my back by not sorting it out. Perhaps not the most expert answer, but they got the point.
Thanks for some great information and a great answer all round.
-
**When I spoke to their web designer about the issues I was told that some of the errors were "just how it's done"**
Are you OK with that response? If your client asked you why you took a course of action on their site, would you expect the client to accept "it's just how things are done"?
Generally speaking, sites should use valid code. The W3C is the international body that establishes coding standards. It is made up of a group of people including representatives from Microsoft (IE), Google (Chrome), Mozilla (Firefox), Apple (Safari), etc. Valid code should appear correctly in all browsers.
Generally speaking again, a developer who writes valid code is following best coding practices. The code can be more easily reviewed by other developers. When invalid code is used, it is often due to sloppy coding practices such as not closing tags, using deprecated tags, not being familiar with the particular encoding of the language in use, etc. When I ask a developer why the code is not valid and the response is "it's just how things are done," the translation is often "I lack the knowledge / training / experience to write valid code."
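To make those sloppy-coding examples concrete, here is a short, hypothetical snippet (not taken from the asker's site) showing an unclosed tag and deprecated tags next to a valid equivalent:

```html
<!-- Invalid: the <p> is never closed, and <center> / <font> are deprecated -->
<center>
  <p>Welcome to our site
  <font color="red">Special offer!</font>
</center>

<!-- Valid: every tag is closed and the presentation moves into CSS -->
<p style="text-align: center;">Welcome to our site</p>
<p style="text-align: center; color: red;">Special offer!</p>
```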
OK, now that I've angered many developers, let me take the flip side of the coin. Google.com does not validate. What's up with that? Well, you know the development team at Google is among the best in the world. Their project leaders likely have doctorates or at least master's degrees. Many of them are authors of books on best coding practices. These guys clearly understand all the rules and are able to go past them to achieve better results in a given area, such as speed optimization, which Google treasures.
In summary, leading companies can often employ the upper echelon of developers who thoroughly understand the rules and can break them for their benefit. Unfortunately, that does not trickle down to everyday developers. Most of them do not have the knowledge / training / experience to make those calls and are either using sloppy coding practices or not taking the time to research alternatives. They have deadlines and they jump on whatever works.
**what do SE spiders do if they come across coding errors? Do they stop crawling the rest of the code below the error?**
The results vary based on the Search Engine and the type of error. Here are some examples:
1. There are some errors due to a raw "&" being used where the character entity `&amp;` is required. The & character can have another purpose in various code, and an interpreter may try to perform an operation with it, such as concatenation, rather than simply reading the & as a literal character.
2. In HTML, `<br>` is a perfectly valid tag. In XHTML, there is a rule that any tag which is not used in a pair should end in `/>`. In other words, the correct form of the `<br>` tag in XHTML is `<br />`. If you have an XHTML document which generates 20 errors, and all of those errors are due to the developer using `<br>` instead of `<br />`, then a crawler should handle that issue very well. The crawler recognizes and understands the `<br>` tag even though it is technically invalid code.
3. An open div tag can cause a variety of issues. It all depends on what operation the div is performing. It could be very minor or a major issue. All three cases are illustrated in the snippet after this list.
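For illustration, here is a small, hypothetical fragment showing each of the three error types next to its valid counterpart (the self-closing rule in case 2 applies under an XHTML doctype; none of this markup comes from the asker's site):

```html
<!-- 1. Invalid: a raw & in a URL; the validator expects the entity &amp; -->
<a href="page.php?a=1&b=2">link</a>
<!-- Valid -->
<a href="page.php?a=1&amp;b=2">link</a>

<!-- 2. Invalid in XHTML: an unpaired tag must be self-closed -->
<br>
<!-- Valid XHTML form -->
<br />

<!-- 3. Invalid: the div is opened but never closed, which can break
     the layout of everything that follows it -->
<div class="sidebar">
  <p>Content</p>
<!-- a closing </div> is missing here -->
```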
Google does a great job of handling invalid code. Bing seems less tolerant of coding errors and much more selective.
A video you will likely enjoy: http://www.youtube.com/watch?v=FPBACTS-tyg
Summary
You should strive for valid code on your site. Coding errors can cause a variety of issues: making it harder for other developers to work on the site, causing the site to appear incorrectly in various browsers or devices, negatively impacting page load times, and impeding search engine crawlers. Whether a specific error penalizes you is impossible to say without a review of that error.

While I do not develop websites, I do project manage the development of many sites. When a site is complete, the goal is to have no validation errors. If a handful of errors exist, I ask the developer to try to eliminate them. If they cannot, I request an error-by-error explanation of why each error exists and why it cannot be eliminated. The result is a site which appears correctly in all browsers, is correctly crawled and interpreted by search engines, and is easily maintained by various developers.
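As a reference point for "no validation errors," here is a minimal, hypothetical HTML5 skeleton that should pass the W3C validator cleanly; a real page carries far more, but every element here is properly declared and closed:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <title>Example page</title>
  </head>
  <body>
    <p>A page this simple should validate with no errors.</p>
  </body>
</html>
```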
A final note: just because a page validates does not mean it is well developed, and the reverse is also true. That said, with the exception of the top 1% of sites, which are developed by teams of very well trained and experienced web professionals, sites which validate are likely better designed and maintained than sites which do not.