Moving Code for Faster Crawl Through?
-
What are best practices for moving code into other folders to help speed up crawling for bots? We once moved some JavaScript at an SEO's suggestion and the site suddenly looked like crap until we undid the changes. How do you figure out what code should be consolidated? And what code do you use to indicate what has been moved, and to where?
-
Yes, I misread your post.
Best,
Christopher -
But isn't this really for things like Google Analytics code? The code I am talking about is stuff like CSS and Javascript that could be in external files but are not currently.
-
Have you considered using Google Tag Manager? It's much cleaner, and the scripts run asynchronously to speed up page load times.
Best,
Christopher
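For the original question about moving inline CSS and JavaScript into external files, the usual pattern is a sketch like the following (the file paths here are hypothetical):

```html
<head>
  <!-- External stylesheet: downloaded and cached once, instead of repeated in every page's HTML -->
  <link rel="stylesheet" href="/assets/site.css">
  <!-- defer lets the HTML parse without blocking on the script -->
  <script src="/assets/site.js" defer></script>
</head>
```

Moving code out this way shouldn't change how the site renders; if a site "looked like crap" after a similar move, the likely culprit is a broken path to the external CSS file rather than the technique itself.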
Related Questions
-
Moved company 'Help Center' from Zendesk to Intercom, got lots of 404 errors. What now?
Howdy folks, excited to be part of the Moz community after lurking for years! I'm a few weeks into my new job (Digital Marketing at Rewind), and about 10 days ago the product team moved our Help Center from Zendesk to Intercom. Apparently the import went smoothly, but it's caused one problem I'm not really sure how to go about solving:

https://help.rewind.io/hc/en-us/articles/*** is where all our articles used to sit
https://help.rewind.io/*** is where all our articles now are

So, for example, the following article has now moved as such:
https://help.rewind.io/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind-
https://help.rewind.io/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind

This has created a bunch of broken URLs in places like our Shopify/BigCommerce app listings, in our email drips, in external resources, etc. I've played whack-a-mole cleaning many of these up, but these old URLs are still indexed by Google; we're up to 475 crawl errors in Search Console over the past week, all of which are 404s.

I reached out to Intercom about this to see if they had something in place to help, but they just said my "best option is tracking down old links and setting up 301 redirects for those particular addressed". Browsing the Zendesk forums turned up some relevant-ish results, with the leading recommendation being to configure JavaScript redirects in the Zendesk document head (thread 1, thread 2, thread 3) of individual articles.

I'm comfortable setting up 301 redirects on our website, but I'm in a bit over my head trying to determine how I could do this with content that's hosted externally and sitting on a subdomain. I have access to our Zendesk admin, so I can go in and edit stuff there, but I don't have experience with JavaScript redirects and have read that they might not be great for such a large-scale redirection.
Hopefully this is enough context for someone to provide guidance on how you think I should go about fixing things (or if there's even anything for me to do) but please let me know if there's more info I can provide. Thanks!
Intermediate & Advanced SEO | | henrycabrown1 -
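Since the old and new URL patterns in the question are systematic, one hedged sketch for the Zendesk document head is a small lookup that maps old article slugs to new paths. The mapping below contains only the single example URL from the question; a real list would need to be exported from Zendesk/Intercom, and server-level 301s remain preferable where possible, since JavaScript redirects pass less signal.

```javascript
// Hypothetical slug-to-path map built from an export of old article URLs.
const redirects = {
  "115001902152-Can-I-fast-forward-my-store-after-a-rewind-":
    "/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind",
};

// Given a pathname, return the new Intercom path, or null if it is not
// an old-style Zendesk article URL (or has no mapping).
function newPathFor(pathname) {
  const match = pathname.match(/^\/hc\/en-us\/articles\/(.+)$/);
  if (!match) return null;
  return redirects[match[1]] || null;
}

// In the Zendesk document head, one would then do something like:
// const target = newPathFor(window.location.pathname);
// if (target) window.location.replace(target);
```

`location.replace` (rather than assigning `location.href`) keeps the dead URL out of the visitor's back-button history.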
Crawl Budget and Faceted Navigation
Hi, we have an ecommerce website with faceted navigation for the various options available. Google has 3.4 million webpages indexed, many of which are over 90% duplicates. Due to the low domain authority (15/100), Google is only crawling around 4,500 webpages per day, which we would like to improve/increase. We know that, in order not to waste crawl budget, we should use robots.txt to disallow parameter URLs (e.g. ?option=, ?search=, etc.). This makes sense, as it would resolve many of the duplicate content issues and force Google to only crawl the main category, product pages etc. However, having looked at Google Search Console, these pages are getting a significant amount of organic traffic on a monthly basis. Is it worth disallowing these parameter URLs in robots.txt and hoping that this solves our crawl budget issues, thus helping to index and rank the most important webpages in less time? Or is there a better solution? Many thanks in advance. Lee.
Intermediate & Advanced SEO | | Webpresence0 -
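A minimal robots.txt sketch for the parameter URLs mentioned in the question (?option=, ?search=), assuming they appear as query strings anywhere on the site:

```
User-agent: *
# Block crawling of faceted parameter URLs (Googlebot supports * wildcards)
Disallow: /*?option=
Disallow: /*?search=
```

One caveat worth weighing: robots.txt stops crawling, not indexing, and it also cuts off the organic traffic those parameter pages currently earn. For pages that already rank, rel=canonical or a noindex directive may be the safer tool.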
Text to Code Ratio & SEO
Hi, has anyone had experience of updating their text-to-code ratio if it's too high, and whether this has much impact on SEO performance? I am trying to prioritise tasks & wondered if this is something which should be higher on my list. Thank you 🙂
Intermediate & Advanced SEO | | BeckyKey0 -
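For reference, text-to-code ratio is usually computed as visible text length divided by total HTML length. A rough sketch of the calculation, using a naive regex-based strip rather than a full HTML parser:

```javascript
// Approximate the text-to-code ratio of an HTML document:
// (visible text characters) / (total HTML characters).
function textToCodeRatio(html) {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // drop script bodies
    .replace(/<style[\s\S]*?<\/style>/gi, "")   // drop style bodies
    .replace(/<[^>]+>/g, "")                    // strip remaining tags
    .replace(/\s+/g, " ")                       // collapse whitespace
    .trim();
  return text.length / html.length;
}
```

Thresholds vary by tool, so the number is best treated as a relative signal across your own pages rather than an absolute target.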
Moving a lot of pdfs to main site. Worth trying to get them indexed?
On my main site we link to PDFs that are located on another one of our domains. The only thing on that other domain is the PDFs. It was set up really poorly, so I am going to redesign everything and probably move it. Is it worthwhile trying to add these PDFs to our sitemap and to try to get them indexed? They are all connected to a current item, but the content is original.
Intermediate & Advanced SEO | | EcommerceSite0 -
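Since the PDFs are original content that is already linked from the main site, listing them in the sitemap is generally worthwhile; Google indexes PDF content like any other page. A sitemap entry sketch (the URL below is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- PDFs are listed exactly like HTML URLs -->
  <url>
    <loc>https://www.example.com/guides/product-guide.pdf</loc>
  </url>
</urlset>
```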
Importance of Unique Content Location in Source Code
How much does Google value placement of unique content in the source code vs. where it is visually displayed? I have a case where my unique content displays high on the page for the user, but in the source code the unique, quality content sits below duplicate-type content that appears across many other domains (think e-commerce category thumbs on the left side of the screen, with the unique stuff on the right 80% of the screen). I have the impression I am at a disadvantage because these pages have the unique/quality content lower in the source code. Any thoughts on this?
Intermediate & Advanced SEO | | khi50 -
Does Google read code as is or as rendered?
Question: does Google read code as-is or as rendered? For example, a Facebook Like box contains the profile pictures of people... will Google see these as separate links, or ignore them?
Intermediate & Advanced SEO | | jhinchcliffe0 -
Best way to move from mixed case url to all lowercase?
We are currently in the process of moving our site from a mixed-case URL structure, e.g. /franchise/childrens-child-care/party/Bricks-4-Kidz/company-information.cfm, to all lowercase, e.g. /franchise/childrens-child-care/party/bricks-4-kidz/company-information.cfm. In order to maintain as much link juice as possible, should we be using 301 redirects to point from the old URLs to the new? Or would it be more advantageous to wait for the next crawl, on the assumption that the link juice would be somewhat maintained even though all the uppercase letters have been converted to lowercase?
Intermediate & Advanced SEO | | franchisesolutions0
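301 redirects are the safer route here: waiting for a recrawl would leave the mixed-case URLs as duplicates or 404s and forfeit their accumulated link equity. On Apache, a hedged sketch using mod_rewrite's built-in tolower map (note that RewriteMap must live in the server or virtual-host config, not in .htaccess):

```apache
RewriteEngine On
# Define a lowercasing map via mod_rewrite's internal tolower function
RewriteMap lc int:tolower
# If the requested path contains any uppercase letter...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the lowercased version of the same path
RewriteRule (.*) ${lc:$1} [R=301,L]
```

With this in place, every mixed-case URL answers with a single 301 to its lowercase twin, so there is no per-URL mapping to maintain.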