Page speed in relation to SEO
-
I cannot seem to find any information about this, so I thought I would try to get a few people's opinions. How do you think page speed is measured when Google uses it as a ranking factor? Do you think they use their own PageSpeed tool? Something measured during the crawl? Your GA site speed data?
-
Right, I am familiar with it. Actually, what I have done is kind of interesting in that regard. The product I work with is a CMS that produces large CSS and JS files, so I have profiled the CSS to extract just the rules needed to render the above-the-fold content, then, after the page has loaded, append the full CSS file to style the rest. The same with the JS. The reason it can't go into production like that is the unsightly flash of unstyled content, but the crawlers seem to be picking the pages up fine for the cached page images.
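A minimal sketch of the deferred-CSS half of that approach (the helper name and the /assets/full.css path are hypothetical): the critical above-the-fold rules get inlined in a style tag by the CMS, and the full stylesheet is only appended after the load event so it never blocks first paint.

```javascript
// Hypothetical helper: add the full stylesheet after the page has rendered.
// Takes the document as an argument so the DOM wiring is explicit (and testable).
function appendFullStylesheet(doc, href) {
  const link = doc.createElement("link");
  link.rel = "stylesheet";
  link.href = href;
  doc.head.appendChild(link);
  return link;
}

// In the browser, run it once everything above the fold has painted:
// window.addEventListener("load", () =>
//   appendFullStylesheet(document, "/assets/full.css"));
```

The flash of unstyled content mentioned above is exactly the cost of this pattern: everything below the fold is unstyled until that load handler fires.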
-
You should definitely check out our search engine ranking factors study, which includes looking at speed relative to other factors.
-
It does, quite a bit better actually: mobile http://screencast.com/t/nZtKg3CwttJy and desktop http://screencast.com/t/xuMKM1KN. Almost all of the remaining problems come from third-party integrations. For example, GA alone triggers a defer-parsing-of-JavaScript warning, a cache-lifetime warning, and a minify-resources warning.
-
I am hoping the risk is low considering what is being blocked. In all reality, what I am blocking is exactly what Google gripes about: mainly JavaScript files with short cache lifetimes. A Facebook remarketing tag, a Norton SSL script, a Freshdesk chat app, a Mailchimp JS file, and their fonts. So I don't think crawlability will be affected, and it looks like the page render is not affected either.
-
I am going to reply to the thread since my follow-up is relevant to all of the replies. One thing I have been trying is hiding elements from Googlebot: basically the things that slow the site down that a lot of sites have, such as tracking pixels, analytics, stats programs, and chat widgets. I have even noticed that whatever Google uses to render the previews seems to have their whole font library installed on it. Any thoughts on this practice?
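For what it's worth, serving different resources to Googlebot than to users is user-agent cloaking, which Google's guidelines warn against. A hedged sketch of the practice being described (the helper is illustrative; Googlebot and bingbot are the documented crawler user-agent tokens):

```javascript
// Illustrative only: skip heavy third-party tags when the request looks
// like a known crawler. This is a form of cloaking; use at your own risk.
function isKnownCrawler(userAgent) {
  return /Googlebot|bingbot/i.test(userAgent || "");
}

// e.g. in an Express-style handler (hypothetical):
// if (!isKnownCrawler(req.headers["user-agent"])) {
//   // render GA, chat widget, remarketing pixels, etc.
// }
```

Note that user-agent sniffing is easy for a search engine to spot (they also crawl with non-bot user agents from time to time), which is part of why the practice is risky.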
-
I would assume their PageSpeed Insights (https://developers.google.com/speed/docs/insights/about), as that's the measurement tool they've freely given us. But who knows with Google...
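The same measurements are also exposed programmatically through the PageSpeed Insights API (v5). A hedged sketch of building a request against that endpoint (the endpoint and the url/strategy parameters are from Google's public docs; the helper name is made up):

```javascript
// Build a PageSpeed Insights v5 request URL for a page and a strategy
// ("mobile" or "desktop").
function psiRequestUrl(pageUrl, strategy) {
  const base = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";
  return `${base}?url=${encodeURIComponent(pageUrl)}&strategy=${strategy}`;
}

// In Node 18+ or a browser you could then do:
// fetch(psiRequestUrl("https://example.com/", "mobile"))
//   .then((r) => r.json());
```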
-
I'm pretty sure they're using something like the internal PageSpeed tool, but probably with many more data points. I'm also fairly confident they capture the time it takes to load the first things on your screen in some other way, so they can gauge how good or bad the user experience is.
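Browsers do expose that "first things on your screen" timing via the Paint Timing API; a small sketch (the helper name is hypothetical) of pulling first contentful paint out of the performance entries:

```javascript
// Return the first-contentful-paint start time (ms) from a list of paint
// timing entries, or null if it has not been recorded yet.
function firstContentfulPaint(entries) {
  const entry = entries.find((e) => e.name === "first-contentful-paint");
  return entry ? entry.startTime : null;
}

// In the browser:
// firstContentfulPaint(performance.getEntriesByType("paint"));
```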
Related Questions
-
Coming soon SEO
Hi, I was wondering what the best practice is for preserving link juice when redirecting all the pages of your website to a coming-soon page. The coming-soon page will live at domain.com, not in a subfolder. Should I move the entire website to a subfolder and redirect that folder to the coming-soon page? Thanks
Technical SEO | | bigrat950 -
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and of high quality, but it isn't seeing the desired results. It is an ecommerce site with a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops. The website has many individual products which share the same fabric and size options but have different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each design has its own product page. We have a dilemma, because our options are:
A. Combine similar designs into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate it on other pages, or pad out products there is nothing else to say about. However, this will remove between 50% and 70% of the pages on the website. We know the number of indexed pages matters to search engines, and if they suddenly see that half of our pages are gone, we may cause more harm than good, even though we are aiming to provide more value to the user, not less.
B. Leave the product pages alone and try to write more valuable content for each one, which will be difficult because there really isn't much more to say, or more valuable ways to say it. This is the "safe" option: the potential negative impact is reduced, but we won't necessarily see much positive movement either.
C. Test solution A on a small percentage of the product categories and watch the impact over the next several months before making sitewide updates, or revert if we see a negative impact.
Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effect and rankings keep dropping. Any information would be greatly appreciated. Thank you,
Technical SEO | | Ed-iOVA0 -
Page disappeared from Google index. Google cache shows page is being redirected.
My URL is: http://shop.nordstrom.com/c/converse Hi. The week before last, my top Converse page went missing from the Google index. When I "fetch as Googlebot" I am able to get the page and "submit" it to the index. I have done this several times and still cannot get the page to show up. When I look at the Google cache of the page, it comes up with a different page. http://webcache.googleusercontent.com/search?q=cache:http://shop.nordstrom.com/c/converse shows: http://shop.nordstrom.com/c/pop-in-olivia-kim Back story: As far as I know we have never redirected the Converse page to the Pop-In page. However the reverse may be true. We ran a Converse based Pop-In campaign but that used the Converse page and not the regular Pop-In page. Though the page comes back with a 200 status, it looks like Google thinks the page is being redirected. We were ranking #4 for "converse" - monthly searches = 550,000. My SEO traffic for the page has tanked since it has gone missing. Any help would be much appreciated. Stephan
Technical SEO | | shop.nordstrom0 -
Is the Authority of Individual Pages Diluted When You Add New Pages?
I was wondering if the authority of individual pages is diluted when you add new pages (in Google's view). Suppose your site had 100 pages and you added 100 new pages (without getting any new links). Would the average authority of the original pages significantly decrease and result in a drop in search traffic to the original pages? Do you worry that adding more pages will hurt pages that were previously published?
Technical SEO | | Charlessipe0 -
Help! Pages not being indexed
Hi Mozzers, I need your help. Our website (www.barnettcapitaladvisors.com) stopped being indexed in search engines following a round of major changes to URLs and content. There were a number of dead links for a few days before 301 redirects were properly put in place. Now only 3 pages show up in Bing when I search "site:barnettcapitaladvisors.com". A bunch of pages show up in Google for that search, but they're not any of the pages we want to show up. Our home page and most important services pages are nowhere in the search results. What's going on here?
Our sitemap is at http://www.barnettcapitaladvisors.com/sites/default/files/users/AndrewCarrillo/sitemap/sitemap.xml
Robots.txt is at: http://www.barnettcapitaladvisors.com/robots.txt Thanks!
Technical SEO | | bshanahan -
If a permanent redirect is supposed to transfer SEO from the old page to the new page, why has my domain authority been impacted?
For example, we redirected our old domain to a new one (leaving no duplicate content on the old domain) and saw a 40% decrease in domain authority. Isn't a permanent redirect supposed to transfer link authority to the place it is redirecting to? Did I do something wrong?
Technical SEO | | BlueLinkERP0 -
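For what it's worth, the mechanics of that kind of move usually look something like this hedged Apache sketch on the old domain (olddomain.com and newdomain.com are placeholders). A 301 passes link equity page-to-page, but a domain-wide move can still shake rankings, and Moz's Domain Authority is computed from its own link index, so it can lag well behind a redirect even when Google follows it.

```apache
# On the old domain: 301 every path to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```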
Pager + SEO - Is it possible?
Hi, I am having this issue. I know that pagers are not friends with SEO, but I want to know what is best to do in this situation. For example, I work at a news company, and many of our news articles are very long, so I use a pager. Here is the problem. Suppose the URL is www.mysite.com/news/id/here-comes-the-title. When you enter that URL you are viewing the first page, which has its own meta title, keywords, and description. The problem comes when the user goes to page 2 of the article. What should I do? 1 - Change the URL to one of: www.mysite.com/news/id/here-comes-the-title-PAGE2, www.mysite.com/news/PAGE2-id/here-comes-the-title, or www.mysite.com/news/id/PAGE2/here-comes-the-title. 2 - On pages 2, 3, 4, 5... add a meta robots noindex? With option 2, I think I am losing the opportunity to index the body of my article. Is that correct? Thanks
Technical SEO | | informatica8100 -
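On option 1, any of those URL patterns can work so long as every page has its own unique, crawlable URL; what matters more is marking the pages as a sequence. A hedged sketch of the head markup for page 2 (URLs follow the question's example; page numbering pattern is an assumption). And yes, the asker's suspicion about option 2 is right: a blanket noindex on pages 2+ keeps that body text out of the index.

```html
<!-- On page 2 of the article (hypothetical /2 URL pattern) -->
<link rel="prev" href="http://www.mysite.com/news/id/here-comes-the-title">
<link rel="next" href="http://www.mysite.com/news/id/here-comes-the-title/3">
```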
How can I prevent duplicate content between www.page.com/ and www.page.com
SEOmoz's recent crawl showed me that I had an error for duplicate content and duplicate page titles. This is a problem because it found the same page twice due to a trailing '/' on one URL, e.g. www.page.com/ vs. www.page.com. My question is: do I need to be concerned about this? And is there anything I should put in my .htaccess file to prevent this from happening? Thanks!
Karl
Technical SEO | | onlineexpression
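One hedged .htaccess sketch (assumes Apache with mod_rewrite enabled): pick the trailing-slash form as canonical and 301 everything else to it, so crawlers only ever see one version of each URL.

```apache
RewriteEngine On
# Skip real files (images, CSS, etc.), then force a trailing slash.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]
```

A rel="canonical" tag pointing at the preferred version of each page is a common belt-and-braces addition alongside the redirect.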