Is an Improvement in Page Speed Worth a Compromise on HTML Validation?
-
Our developer has improved page speed, particularly for mobile. However, the price for this improvement has been an HTML validation error that cannot be removed without compromising the page load speed. Is the improvement in speed worth living with the validation error? The concern is paying a high SEO price for this "fatal error". Or perhaps this error is in fact not serious?
-
Fatal Error: Cannot recover after last error. Any further errors will be ignored.
From line 699, column 9; to line 699, column 319
OUR DEVELOPER'S COMMENT: "This was made following Google PageSpeed Insights recommendations. If we remove that, we will lose page load performance."
The domain URL is www.nyc-officespace-leader.com
-
-
Yeah, the sequence of loading is also important when it's time to go granular and find the true opportunities. The up-front evaluation time spent identifying issues can often lead to faster, easier, more template-driven ways to speed everything up on a larger scale with less effort.
That doesn't mean it's okay to ignore other bottlenecks. Just that the more clarity of understanding you have, the more likely real, sustainable success can be achieved.
-
I agree with Alan's points. I have also found WebSiteTest.com really useful. It allows for multiple runs on multiple devices, and you can download the results as CSV. Expanding on Alan's point about looking at bottleneck points: when you use these tools, take the time to understand the waterfall chart, as that is where you can see how the browser interacts with all of these files (HTML, CSS, JS, images, etc.).
I have been doing a ton of reading on front-end optimization recently. Aside from all of the above, you could have issues with the critical rendering path (great resources here and here). Many times folks look at a single asset and say, "This JavaScript file is too big, let's minify it and get faster!" That is a good thing and will help you. That said, you have to look at the render path, as you may have that same smaller JS file blocking other downloads that need to arrive first for the page to render faster. Optimizing the render path can give you some additional gains.
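To illustrate the render-blocking point, here is a minimal sketch (the file names are purely illustrative, not this site's actual assets): a stylesheet in the head blocks painting until it arrives, while scripts marked defer or async no longer block HTML parsing.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Render path example</title>

  <!-- Render-blocking: the browser waits for this stylesheet before painting,
       so keep it as small as possible (critical CSS only). -->
  <link rel="stylesheet" href="/css/critical.css">

  <!-- defer: downloads in parallel and executes only after the HTML is parsed. -->
  <script src="/js/app.js" defer></script>

  <!-- async: executes as soon as it arrives; suitable only for scripts that do
       not depend on the DOM or on other scripts (e.g. analytics). -->
  <script src="/js/analytics.js" async></script>
</head>
<body>
  <p>The page can render without waiting for the deferred scripts above.</p>
</body>
</html>
```

The point is less the specific attributes than the ordering: whatever must be downloaded before the first paint should be small and early, and everything else should be kept out of its way.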
Good luck!
-
Kingalan1
I'm not a programmer by trade - the way I begin even considering these things is by running tests on various tool platforms.
For example, put a page you think is slow into URIValet.com - test as Googlebot. The resulting report has a block of information in it regarding the total size of files processed. It breaks that data down by file type. Look at the CSS/JS lines - if they total more than 50k to 100k for either CSS or JS, there is almost certainly inefficiency in there, and likely unnecessary bloat.
Go to WebPageTest.org and do the same - put in the URL you want to check, choose a server location, DSL (which gives a fair mid-range speed evaluation), and Chrome as the browser emulator. The resulting report gives you a lot of information; the one page in that report that may be most helpful in this situation is the "Details" report. If you go there and scroll down, you'll get to the section that lists, line by line, every single file, script, image and asset processed for that page, along with the data on the speed of processing each step of the way (such as First Byte Time, DNS lookup, SSL lookup, and more). Those can reveal several individual bottleneck points.
-
Thanks for your excellent, highly detailed response!!
Is there a way to test the CSS files that my developer has created to see if they are coded in an efficient and concise manner?
We use a virtual private server at InMotion Hosting and Amazon's CDN for images. So I would think that the hosting service is adequate. Traffic does not exceed 3,000 unique visitors a month, so the load on the server is minimal.
-
1. Taking shortcuts that are not sound, sustainable methods in order to gain value somewhere else is almost certainly going to become a problem when you least expect it at some future date, and this is a great example. Moving CSS and/or JS below their proper location is a recipe for complete page display failure on any number of devices that may or may not currently exist.
Have you tested your pages with Google's Fetch and Render to ensure they load properly, or to see where they may get a "partial" result? If they get a "partial" result, that's a red flag warning that you ignore at your own peril.
2. You haven't provided numbers - is the page speed improvement a case of going from 20 seconds down to 5 seconds? Or from 8 seconds to 6 seconds? Or what? This matters when evaluating what to care about and where to expend resources.
3. If just moving those to their proper place in the page's head section causes speeds to slow down dramatically, you have bigger problems. The first one that comes to mind is: why do those scripts / CSS files cause so much of a slowdown? It's likely they're bloated and need to be reduced in size, or they're hosted on a poor cloud server that is itself doing you more harm than good.
-
I'm not sure if it would affect the current page speed, but it would fix the invalid HTML error from the validator. If the validation errors concern you, it might be worth giving it a try and testing the result. It's good to make sure that pages clear at least the high-severity validation issues, to be sure there are no possible display or rendering issues in different browsers now or in the future.
-
Would correcting the code in this manner so the HTML validates result in a slower page load time?
-
That error is coming up from the validator because the links to your stylesheets sit outside the closing body and html tags. Stylesheet links normally go within the head tags at the top; I understand from what you've said that they have been moved to the bottom of the page for page speed, but no tags, stylesheets, JavaScript, etc. should appear after those closing tags.
If you move the CSS stylesheet references and the comments so they sit where the JavaScript files are (before the closing tags), that would fix the fatal error you are seeing.
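As a simplified sketch of that suggestion (the file names are illustrative, not the site's actual assets), the fatal error comes from markup placed after the document has ended, and the fix is to keep those references inside the body:

```html
<!-- Before (invalid): stylesheet links placed after the closing tags,
     which triggers the validator's fatal error. -->
    <script src="/js/app.js"></script>
  </body>
</html>
<link rel="stylesheet" href="/css/styles.css">

<!-- After (fixed): the stylesheet references moved up so they sit with the
     scripts, just before the closing body and html tags. -->
    <link rel="stylesheet" href="/css/styles.css">
    <script src="/js/app.js"></script>
  </body>
</html>
```

It is worth re-running the speed tests after a change like this, since the open question in this thread is whether the corrected placement keeps the same performance benefit.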
Hope that helps!
-
Thanks so much. I understand most errors are not too important. However, I wonder whether a "fatal" error should be of greater concern.
Thanks, Alan
-
I am not a developer, so any developer with an SEO background can tell you better, but in general page load speed is important both from a user point of view and for search engine rankings. As far as W3C validation is concerned, there are quite a few errors that you can ignore in order to keep your page load speed.
Hope this helps!