Is an Improvement in Page Speed Worth a Compromise on HTML Validation?
-
Our developer has improved page speed, particularly for mobile. However, the price for this improvement has been an HTML validation error that cannot be removed without compromising page load speed. Is the improvement in speed worth living with the validation error? The concern is paying a high SEO price for this "fatal error". Or perhaps this error is in fact not serious?
-
Fatal Error: Cannot recover after last error. Any further errors will be ignored.
From line 699, column 9; to line 699, column 319
OUR DEVELOPER'S COMMENT: "This was made following Google PageSpeed Insights recommendations. If we remove that, we will lose page load performance."
The domain URL is www.nyc-officespace-leader.com
-
-
Yes, the sequence of loading is also important when it's time to go granular and find the true opportunities. The up-front evaluation time spent identifying issues can often lead to faster, easier, more template-driven ways to speed up everything on a larger scale with less effort.
That doesn't mean it's okay to ignore other bottlenecks. Just that the clearer your understanding, the more likely you are to achieve real, sustainable success.
-
I agree with Alan's points. I have also found WebSiteTest.com really useful. It allows for multiple runs on multiple devices, and you can download the results in CSV. Expanding on Alan's point about bottlenecks: when you use these tools, take time to understand the waterfall chart, as that is where you can see how the browser interacts with all of the page's files (HTML, CSS, JS, images, etc.).
I have been doing a ton of reading on front-end optimization recently. Aside from all of the above, you could have issues with the critical rendering path (great resources here and here). Many times folks look at a single asset and say, "This JavaScript file is too big, let's minify it and get faster!" That is a good thing and will help you. That said, you have to look at the render path, as you may find that same smaller JS file blocking other downloads that are needed first to render the page. Optimizing the render path can give you some additional gains.
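As a quick illustration of render blocking (a minimal sketch; the file names are placeholders, not anything from your site):

```html
<head>
  <!-- Default behavior: HTML parsing stops here until this file downloads and runs -->
  <script src="/js/blocking-example.js"></script>

  <!-- defer: downloads in parallel, runs after parsing finishes, in document order -->
  <script src="/js/app.js" defer></script>

  <!-- async: downloads in parallel, runs as soon as it arrives (order not guaranteed) -->
  <script src="/js/analytics.js" async></script>
</head>
```

Even a small script can hold up everything after it if it loads the first way, which is why the waterfall chart matters more than any single file's size.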
Good luck!
-
Kingalan1
I'm not a programmer by trade - the way I begin even considering these things is by running tests on various tool platforms.
For example, put a page you think is slow into URIValet.com and test as Googlebot. The resulting report has a block of information regarding the total size of files processed, broken down by file type. Look at the CSS/JS lines - if either CSS or JS totals more than 50k to 100k, there is almost certainly inefficiency in there, and likely unnecessary bloat.
Go to WebPageTest.org and do the same - put in the URL you want to check, choose a server location, DSL (which gives a fair mid-range speed evaluation), and Chrome as the browser emulator. The resulting report gives you a lot of information; the page that may be most helpful in this situation is the "Details" report. If you go there and scroll down, you'll reach a section that lists, line by line, every single file, script, image, and asset processed for that page, with timing data for each step of the way (such as First Byte Time, DNS lookup, SSL lookup, and more). Those can reveal several individual bottleneck points.
-
Thanks for your excellent, highly detailed response!!
Is there a way to test the CSS files that my developer has created to see if they are coded in an efficient and concise manner?
We use a virtual private server at InMotion Hosting and an Amazon CDN for images, so I would think the hosting service is adequate. Traffic does not exceed 3,000 unique visitors a month, so the load on the server is minimal.
-
1. Taking shortcuts that are not sound, sustainable methods in order to gain value somewhere else is almost certainly going to become a problem when you least expect it, and this is a great example. Moving CSS and/or JS below their proper location is a recipe for complete page display failure on any number of devices that may or may not currently exist.
Have you tested your pages with Google's Fetch and Render to ensure they load properly, or to see whether they get a "partial" result? If they get a "partial" result, that's a red-flag warning you ignore at your own peril.
2. You haven't provided numbers - is the page speed improvement a case of going from 20 seconds down to 5 seconds? Or from 8 seconds to 6 seconds? Or what? This matters when evaluating what to care about and where to expend resources.
3. If just moving those files to their proper place in the page header section causes speeds to slow down dramatically, you have bigger problems. The first that comes to mind: why do those scripts / CSS files cause so much slowdown? It's likely they're bloated and need to be reduced in size, or they're housed on a pathetic cloud server that is itself doing you more harm than good.
-
I'm not sure whether it would affect the current page speed, but it would fix the invalid HTML error from the validator. If the validation errors concern you, it might be worth giving it a try and testing the result. It's good to make sure pages validate on at least the high-severity issues, to be sure of no possible display or rendering problems in different browsers now or in the future.
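If moving everything back into place does cost speed, one valid-HTML pattern worth testing is asynchronous CSS loading from within the head. This is a sketch of the common preload technique, with a placeholder stylesheet name - an assumption about the setup, not the site's actual code:

```html
<head>
  <!-- Inline just the small amount of CSS needed for above-the-fold content -->
  <style>
    /* critical styles here */
  </style>

  <!-- Fetch the full stylesheet without blocking render; apply it once loaded -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null; this.rel='stylesheet'">

  <!-- Fallback for visitors with JavaScript disabled -->
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

Everything stays inside the head, so the document validates, while the stylesheet download no longer blocks first paint.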
-
Would correcting the code in this manner, so the HTML validates, result in a slower page load time?
-
That error is coming up from the validator because the links to your stylesheets sit outside the closing body and html tags. Stylesheet links normally go within the head tags at the top; I understand from what you've said that, for page speed, these have been moved to the bottom of the page. However, nothing - HTML, stylesheets, JavaScript, etc. - should appear outside the closing body and html tags.
If you move the CSS stylesheet references and the comments so they are where the JavaScript files are (just before the closing tags), that would fix the fatal error you are seeing.
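As a rough sketch of that fix (the file paths are placeholders, not your actual files):

```html
<!-- Invalid: the stylesheet link sits after the document has closed -->
    <script src="/js/app.js"></script>
  </body>
</html>
<link rel="stylesheet" href="/css/styles.css">

<!-- Valid: the same reference moved just inside the closing body tag -->
    <link rel="stylesheet" href="/css/styles.css">
    <script src="/js/app.js"></script>
  </body>
</html>
```

The validator gives up at the first stray tag after the closing html tag, which is exactly the "Cannot recover after last error" message you are seeing.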
Hope that helps!
-
Thanks so much. I understand most errors are not too important. However, I wonder whether a "fatal" error should be of greater concern.
Thanks, Alan
-
I am not a developer, so any developer with an SEO background can tell you better, but in general page load speed is important both from the user's point of view and for search engine rankings. As far as W3C validation is concerned, there are quite a few errors you can ignore in order to stick with your page load speed.
Hope this helps!