Page speed in relation to SEO
-
I cannot seem to find any information about this, so I thought I would try to get a few people's opinions. How do you think page speed is measured when Google uses it as a ranking factor? Do you think they use their own PageSpeed tool? Something measured during the crawl? Your GA site-speed data?
-
Right, I am familiar with it. Actually, what I have done is kind of interesting in that regard. The product I work with is a CMS that produces large CSS and JS files. So I have profiled the CSS to render only the above-the-fold content, then after the page is loaded I append the full CSS file to style the rest. The same with the JS. The reason it can't go into production like that is the unsightly flash of unstyled content. But it seems the crawlers are picking the pages up fine for the cached page images.
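For anyone curious, here is a minimal sketch of the deferred-stylesheet approach described above. The file path and the `load`-event trigger are my assumptions, not the actual CMS implementation:

```javascript
// Append the full stylesheet only after the page has loaded,
// assuming a small critical-CSS block is already inlined in <head>.
// "/css/site.full.css" is a hypothetical path.
function appendStylesheet(doc, href) {
  var link = doc.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  doc.head.appendChild(link);
  return link;
}

// In a browser, defer the full CSS until the load event fires.
if (typeof window !== 'undefined' && typeof document !== 'undefined') {
  window.addEventListener('load', function () {
    appendStylesheet(document, '/css/site.full.css');
  });
}
```

The flash of unstyled content happens in the window between first paint and the load event, so the smaller the inlined critical CSS, the more noticeable it is.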
-
You should definitely check out our search engine ranking factors, which include a look at speed relative to other factors.
-
It does, quite a bit better actually: http://screencast.com/t/nZtKg3CwttJy and desktop http://screencast.com/t/xuMKM1KN. Almost all of the problems come from third-party integrations. For example, GA alone throws a defer-parsing warning, a cache-lifetime warning, and a minify-resources warning.
-
I am hoping the risk is going to be low considering what is being blocked. In all reality, what I am blocking is what Google gripes about, mainly JavaScript files with short cache lives: a Facebook remarketing code, a Norton SSL script, a Freshdesk chat app, a MailChimp JS file, and their fonts. So I don't think crawlability will be affected, and it looks like the page render is not affected either.
-
I am going to reply to the body since my follow-up is relevant to all of the replies. One thing I have been trying is hiding elements from Googlebot, basically the things that slow the site down that a lot of sites have, such as tracking pixels, analytics, stats programs, chats, and things like that. I have even noticed that whatever Google uses to render the previews seems to have their whole font library installed on it too. Any thoughts on this practice?
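For reference, the mechanism being described boils down to a user-agent check before loading third-party scripts. This sketch is mine (the function name and the bot pattern are illustrative, not exhaustive); serving crawlers a slimmer page than users is the cloaking risk being weighed here:

```javascript
// Simplified sketch of the practice described above: skip loading
// third-party scripts when the user agent looks like a crawler.
// The bot pattern is illustrative only.
function shouldLoadThirdPartyScripts(userAgent) {
  return !/Googlebot|bingbot/i.test(userAgent || '');
}

// Example: only inject tracking pixels, chat widgets, etc. for
// ordinary visitors.
if (typeof navigator !== 'undefined' &&
    shouldLoadThirdPartyScripts(navigator.userAgent)) {
  // load GA, Facebook remarketing, chat app, etc. here
}
```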
-
I would assume their PageSpeed Insights (https://developers.google.com/speed/docs/insights/about), as that's the measurement tool they've freely given us. But who knows with Google...
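That tool is also scriptable; here is a small sketch of building a request to its public API (the endpoint version and parameters are to the best of my knowledge, so treat it as illustrative rather than definitive):

```javascript
// Build a PageSpeed Insights API request URL. The v5 endpoint shown
// here is the current public one; older versions used a different
// path, so this is illustrative.
function pagespeedApiUrl(pageUrl, strategy) {
  var base = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
  return base +
    '?url=' + encodeURIComponent(pageUrl) +
    '&strategy=' + encodeURIComponent(strategy || 'mobile');
}

// In a browser or Node script, fetch(pagespeedApiUrl('https://example.com'))
// returns the same audit data the web UI shows.
```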
-
I'm pretty sure they're using something like their internal PageSpeed tool, but probably with way more data points to look at. I'm also pretty confident that they use another way of capturing the time it takes to load the first things on your screen, so they can check how good or bad the user experience is.
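Browsers do expose that "first things on your screen" timing directly, via the Paint Timing API. A small sketch of reading it (the helper function is mine; whether Google uses anything like this signal is, as noted, speculation):

```javascript
// Pull the first-contentful-paint time out of the browser's paint
// timing entries; returns null if the entry is unavailable.
function firstContentfulPaint(entries) {
  var hit = (entries || []).filter(function (e) {
    return e.name === 'first-contentful-paint';
  })[0];
  return hit ? hit.startTime : null;
}

// In a browser:
// firstContentfulPaint(performance.getEntriesByType('paint'));
```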
Related Questions
-
Page titles in browser not matching WP page title
I have an issue with a few page titles not matching the title I have in WordPress. I have two pages, blog & creative gallery, that show the homepage title, which is causing duplicate-title errors. This has been going on for 5 weeks, so it's not a crawl issue. Any ideas what could cause this? To clarify, I have the page title set in WP, and I checked "Disable PSP title format on this page/post:"... but this page is still showing the homepage title. Is there an additional title setting for a page in WP?
Technical SEO | Branden_S
-
Types of SEO Help
I have a web site that is going well, but I think it could be better as far as usability and design. Also, I am sure an SEO professional would have some things to do to optimize it. It seems, though, that all the SEO companies either want a long-term contract or don't work with my technology. Does anyone know of a company that would take my Visual Studio/C# project and tweak it for usability, design, and SEO features for an hourly or set price?
Technical SEO | Banknotes
-
Find where the "not selected" pages are from
Hi all. Can anyone suggest how I can find where Google is getting the approx. 1,000 pages it is choosing not to select? In round numbers I have 110 pages on the site: a site: search shows all pages, and index status shows 110 selected and 1,000 not selected. For the life of me I cannot figure out where these pages are coming from. I have set my preferred domain to www. and set up 301s to www. as per below:
RewriteCond %{HTTP_HOST} ^growingyourownveg.com$
RewriteRule ^(.*)$ "http://www.growingyourownveg.com/$1" [R=301,L]
The site is www.growingyourownveg.com. Any suggestions much appreciated. Simon
Technical SEO | spes123
-
Can you 301 redirect a page to an already existing/old page?
If you delete a page (say a sub-department/category page on an ecommerce store), should you 301 redirect its URL to the nearest equivalent page still on the site, or just delete it and forget about it? Generally, should you try to 301 redirect any old pages you're deleting if you can find a suitable page with similar content to redirect to? Won't Google consider it weird if you say a page has moved permanently to such-and-such an address if that page/address existed before? I presume it's fine, since in the scenario of consolidating departments on your store you want to redirect the department page you're going to delete to the existing department you are consolidating the old departments' products into?
Technical SEO | Dan-Lawrence
-
Determining When to Break a Page Into Multiple Pages?
Suppose you have a page on your site that is a couple of thousand words long. How would you determine when to split the page into two, and are there any SEO advantages to doing this, like being more focused on a specific topic? I noticed the Beginner's Guide to SEO is split into several pages, although it would concentrate the link juice if it was all on one page. Suppose you have a lot of comments. Is it better to move comments to a second page at a certain point? Sometimes the comments are not super focused on the topic of the page compared to the main text.
Technical SEO | ProjectLabs
-
Duplicates on the page
Hello SEOmoz, I've one big question about one project. We have a page http://eb5info.com/eb5-attorneys and a lot of other similar pages, and we got a big list of errors and warnings saying that we have duplicate pages. But in reality not all of them are the same; they have small differences. For example, you select "State" in the left sidebar and you see a list on the right. The list on the right panel changes depending on what you select on the left, but in the report the pages are marked as duplicates. Maybe you can give some advice on how to improve the quality of the pages and make the SEO better? Thanks, Igor
Technical SEO | usadvisors
-
Are there negative SEO implications to pages without any images?
Hi Mozzers, do you think there are any negative effects of having no images on a page but several hundred words of text? (There is a logo image and call-to-action buttons.) Thanks!
Technical SEO | Charlessipe
-
Page MozRank and MozTrust 0 for Home Page, Makes No Sense?
Hey Mozzers! I'm a bit confused by a site that is showing a 0 for home-page MozRank and MozTrust, while its subdomain and root-domain metrics look decent (relatively). I am posting images of the page metrics and subdomain metrics to show the disparity: http://i.imgur.com/3i0jq.png http://i.imgur.com/ydfme.png Is it normal to see this type of disparity? The home page has very few inbound links, but the big goose egg has me wondering if there is something else going on. Has anyone else experienced this? Or does anyone have speculation as to why a home page would have a 0 MozRank while the subdomain metrics look much better? Thanks!
Technical SEO | ClarityVentures