The Web Developer's SEO Cheat Sheet [Free Download]
You're an SEO. You've mastered Moz's starter guide to Search Engine Optimization. You deal with botched site migrations, messy title tags across thousands of pages, earth-shaking Google algorithm changes, and more on the daily. Who would have thought that one of the toughest parts of your job would be getting your recommendations actually implemented?
That's why your professional relationship with your web developers is so darn important. They hold the lock and key to almost everything you need to do to optimize a site's on-page and technical SEO — and more often than not, they don't understand why you're making so many requests of them when they've already got their hands full maintaining a website.
Getting your developers to care about and prioritize your SEO fixes isn't easy, and that's exactly why we created the Web Developer's SEO Cheat Sheet, a free download you can share with anyone on your team to get the rundown on technical and on-page best practices.
Download the free SEO Cheat Sheet
Ever since then-Mozzer Danny Dover created the original version in 2008, the SEO Cheat Sheet has been downloaded tens of thousands of times by developers and marketers alike.
Countless beginner and advanced SEOs have printed it out, laminated it, and hung it on their walls as a quick reference to the most impactful best practices in search engine optimization. Web developers and software engineers also find it handy to easily reference SEO technical standards.
Fully updated for 2020 by senior SEO scientist Britney Muller, the Web Dev's SEO Cheat Sheet breaks down all of the overlapping pieces of SEO and web development so your teams can live in perfect harmony — or close to it.
But before you call your first meeting, make sure you have all of the data! Going in armed with numbers and real examples paints a clearer picture and allows your two teams to start working on a plan of action.
Run a Full Site Audit Report to get started
If you already have a Moz Pro account set up, simply go to your Campaign and select Custom Reports from the left-hand navigation. Choose Full Site Audit from the list of templates and hit Create Report.
This report will lay out many of the technical and backend SEO errors that your web development team can help you with.
If you don’t have a Moz Pro account, you can sign up for a free trial and start gathering data for your site. Once the crawl is finished, follow the steps outlined above.
Critical Crawler Issues
The first thing to tackle with your buddies on the web dev team should be your Critical Crawler Issues. These issues include 400- and 500-level HTTP status errors.
400-level errors mean that the content can’t be found or it is gone altogether, while 500-level errors indicate an issue with the server.
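If you'd like to spot-check a few high-value URLs yourself before that meeting, a short script can surface 400- and 500-level responses. Below is a minimal sketch using Python's third-party requests library; the URLs are placeholders, and it's no substitute for a full crawl:

```python
# A minimal sketch (not Moz's crawler): spot-check a handful of high-value URLs
# for 400- and 500-level responses. The URLs below are placeholders.
import requests

HIGH_VALUE_URLS = [
    "https://example.com/",
    "https://example.com/pricing",
    "https://example.com/blog/top-post",
]

for url in HIGH_VALUE_URLS:
    try:
        status = requests.get(url, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
        continue
    # Anything in the 4xx/5xx range is a critical crawler issue worth flagging
    flag = "needs attention" if status >= 400 else "OK"
    print(f"{url} -> {status} ({flag})")
```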
Why does it matter for SEO?
As SEOs, our worst nightmare is discovering that a high-value, high-traffic page is returning a 400- or 500-level error. Users hitting an error page don't wait around for it to be fixed — at best, they try to find what they're looking for elsewhere on your site. At worst, they head to your competitor.
Explaining the importance of this to your devs should be fairly straightforward — these are errors that make pages entirely inaccessible to site visitors. The web devs on your team have put long hours into creating and maintaining your site; they don't want that hard work to be for nothing.
Getting the dev perspective
Critical issues on important pages should be avoided at all costs. Part of prevention lies in understanding both why something went wrong in the first place and how you can support your devs in the future. After uncovering what happened, work with them to make a plan to keep you informed of any big site changes that might occur and offer them support. Is there a list of high-value pages you can provide? Can you help with prioritization? Ask your devs the best way you can help and cultivate a positive working relationship even when urgent errors arise.
Fixing critical crawler errors with your web developer
Together, your teams can identify and solve these issues by navigating to the Critical Crawler Issues section of Site Crawl in Moz Pro.
Download the CSV file, prioritize your needs, and outline a plan to get fixes implemented.
Crawler Warnings
Next up, let’s talk about your Crawler Warnings. These may be a little trickier to explain to your web dev team in terms of impact, but they're worth the effort.
Crawler warnings include:
- Meta Noindex: This means there is a “noindex” value in the robots tag and search engines will not index the content. As a result, this page isn't going to show up when people search.
- X-Robots Nofollow: This means search engines aren’t going to crawl (follow) any of the links on the page because of the “nofollow” value in the x-robots response header. It also means that if you have any internal links on that page, Google isn’t going to crawl them, even if the link is the only way Google could get to a page.
- Meta Nofollow: This means there is a “nofollow” value in the robots tag and search engines will not follow any of the links on the page. It also means that if you have any internal links on that page, Google isn’t going to crawl them, even if the link is the only way Google could get to a page.
- X-Robots Noindex: This means search engines can’t index the content because there is a “noindex” value in the x-robots response header. It also means that this page is not going to show up in search.
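To see which of these directives a given page is actually sending, you can check both places they live: the X-Robots-Tag response header and the on-page robots meta tag. Here's a minimal sketch using Python's requests and beautifulsoup4 libraries (the URL is a placeholder):

```python
# A minimal sketch: report the two places these directives live for a single
# page, the X-Robots-Tag response header and the <meta name="robots"> tag.
# Assumes the requests and beautifulsoup4 libraries; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def robots_directives(url):
    response = requests.get(url, timeout=10)
    header_value = response.headers.get("X-Robots-Tag", "")
    meta_tag = BeautifulSoup(response.text, "html.parser").find(
        "meta", attrs={"name": "robots"}
    )
    meta_value = meta_tag.get("content", "") if meta_tag else ""
    return {"X-Robots-Tag header": header_value, "meta robots tag": meta_value}

for source, value in robots_directives("https://example.com/important-page").items():
    if "noindex" in value.lower() or "nofollow" in value.lower():
        print(f"Warning ({source}): {value}")
    else:
        print(f"OK ({source}): {value or '(not set)'}")
```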
Why does it matter for SEO?
Some pages are intentionally hidden from search engines, such as staging sites or those hidden behind a login screen, so not every crawler warning is going to be something your web dev must fix. Double-check whether URLs coming back with a crawler warning are truly pages you want to hide. If important pages are lost behind incorrect noindex/nofollow tags, you're missing out on valuable traffic and ranking opportunities.
Getting the dev perspective
Try to understand why certain pages were tagged with noindex/nofollow, then work with your developers to create a strong communication plan when site changes are made. If you're able to have a view into code deploys, you may catch mistakes before they launch.
Fixing crawler warnings with your web developer
When you see noindex/nofollow applied to pages you intend to rank, it can be a massive hindrance to organic traffic. While there are instances in which many of these tags could be used appropriately, a mistaken application of them can really cost a business.
Help your web devs understand the value of these tags by showing them how much traffic certain keywords on affected pages could bring in. Tie it to revenue and show them how much money you make on organic traffic alone! They'll get a sense of how important accuracy is when it comes to noindex/nofollow, as well as an understanding of how their decisions can help your site's most important pages boost the bottom line.
To get started on these Crawler Warnings, navigate to the Crawler Warning section of your Site Crawl, download the CSV file, determine what needs fixing, and lay out a plan of action.
For more information on noindex/nofollow usage, refer to the Robots Exclusion Standard section of the Web Developer's SEO Cheat Sheet.
Redirect Issues
Redirect issues include temporary redirects, redirect chains, and meta refreshes. These all have an impact on user experience (and crawlability), and therefore also impact SEO.
Why does it matter for SEO?
Temporary redirects, or 302s/307s, divert users from one URL to another. Think of these redirects like a road construction detour — the user can’t go this route today, but eventually, they will be able to use it again. That’s how Google sees these temporary redirects, and because of that, they don’t pass as much link equity (ranking power) to the page — which is not ideal! Especially if you're planning on never opening that old road (or URL) back up and the “detour” is actually the new route.
Redirect chains are exactly what they sound like — a redirect from one page to another that redirects to another page and so on. The problem with this is that it takes a few seconds for every redirect to load on the user side. Oh, and let’s not forget that, again, Google is dropping link equity at every stop, so by the time there've been a few redirects, a good amount of equity has been lost. Plus, when a chain is too long, Google's crawler will no longer attempt to reach the final page. That means your page won't make it into the index, and you've lost an opportunity to rank.
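If you want to see a chain for yourself, you can follow it hop by hop and note the status code at each stop, which also makes lingering 302s and 307s easy to spot. A minimal sketch using Python's requests library (the starting URL is a placeholder):

```python
# A minimal sketch: follow a redirect chain by hand, one hop at a time, and
# report each status code so lingering 302s/307s and long chains stand out.
# Assumes the requests library; the starting URL is a placeholder.
import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=10):
    hops = []
    current = url
    for _ in range(max_hops):
        # allow_redirects=False lets us inspect each hop ourselves
        response = requests.get(current, allow_redirects=False, timeout=10)
        hops.append((response.status_code, current))
        if response.status_code in (301, 302, 303, 307, 308):
            # Location may be relative, so resolve it against the current URL
            current = urljoin(current, response.headers["Location"])
        else:
            break
    return hops

for status, hop_url in trace_redirects("https://example.com/old-page"):
    print(status, hop_url)
```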
Meta refreshes are set in the HTML code and tell the browser to redirect the user after a certain amount of time. This can be very confusing for the user and lead them to leave the site. What’s more, these redirects pass no link equity.
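Meta refreshes are easy to spot in a page's HTML, where they look something like `<meta http-equiv="refresh" content="5; url=https://example.com/new-page">`. Here's a rough sketch for flagging them, again assuming requests and beautifulsoup4 with a placeholder URL:

```python
# A minimal sketch: flag pages that redirect via a tag like
# <meta http-equiv="refresh" content="5; url=https://example.com/new-page">.
# Assumes requests and beautifulsoup4; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def find_meta_refresh(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find(
        "meta", attrs={"http-equiv": lambda value: value and value.lower() == "refresh"}
    )
    return tag.get("content") if tag else None

refresh = find_meta_refresh("https://example.com/some-page")
print(f"Meta refresh found: {refresh}" if refresh else "No meta refresh on this page")
```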
Getting the dev perspective
As we've said before, it's important to understand why a decision was made before you can find an effective way to fix it. Figure out why your developers chose to use certain redirects, and make sure to communicate with them why one decision is better than another. If they have a regular process in place for implementing redirects, see where you can fit into that process and provide insight.
Fixing redirect issues with your web developer
When working with your web devs on redirect issues, there are a few key points to get across:
- Load times. Page speed is a ranking factor, and the longer a user (and a search engine) waits for a page to load, the more ranking power that page loses.
- User experience. The longer a user waits for the target page to appear, the more likely they'll bounce away and head to a competitor. That affects traffic, engagement metrics, and eventually revenue.
- Crawling and indexing. Your devs have worked hard to support and maintain the website, and the last thing they want is for that work to be for nothing. Redirect issues can keep pages from being crawled and indexed, which means they might as well not exist at all.
To identify the redirects on your site, navigate to the Redirect Issues section of your Site Crawl, download the CSV file, determine and prioritize what needs fixing, and get to work.
The HTTP Status Code and Performance and Page Speed sections of the Web Dev SEO Cheat Sheet will be helpful here, so make sure to share it with your devs!
Metadata Issues
The list of issues that this section covers is rather long, so bear with us!
- Missing Title
- Title Too Long
- Title Too Short
- Multiple Titles
- Missing Canonical
- Missing Description
- Description Too Long
- Description Too Short
- URL Too Long
- Overly Dynamic URL
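If you'd like a rough sense of how a specific page stacks up before the meeting, you can pull its title, meta description, and canonical tag and check them against the issue types listed above. Below is a quick sketch in Python using requests and beautifulsoup4; the length thresholds are common rules of thumb rather than official limits, and the URL is a placeholder.

```python
# A minimal sketch: pull one page's title, meta description, and canonical tag
# and flag the issue types listed above. The length thresholds are common
# rules of thumb, not official limits. Assumes requests and beautifulsoup4.
import requests
from bs4 import BeautifulSoup

def audit_metadata(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    issues = []

    titles = soup.find_all("title")
    if not titles:
        issues.append("Missing title")
    elif len(titles) > 1:
        issues.append("Multiple titles")
    elif not 30 <= len(titles[0].get_text(strip=True)) <= 60:
        issues.append("Title too short or too long")

    description = soup.find("meta", attrs={"name": "description"})
    if not description or not description.get("content", "").strip():
        issues.append("Missing description")
    elif not 70 <= len(description["content"]) <= 155:
        issues.append("Description too short or too long")

    if not soup.find("link", rel="canonical"):
        issues.append("Missing canonical")

    return issues

# The URL is a placeholder; swap in a page from your own site
print(audit_metadata("https://example.com/landing-page") or ["No metadata issues found"])
```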
Why does it matter for SEO?
While much of this metadata isn’t directly linked to ranking factors, all of these elements do affect how your content looks in the search engine results pages (SERPs). Every one of them can affect the clickability of your ranking pages, and if a page doesn't draw clicks, you lose both traffic and valuable user engagement signals that help power rankings.
Getting the dev perspective
How have your web devs viewed metadata in the past? It's entirely possible that it wasn't on their radar or considered very important. Try to figure out where they're coming from, then make the case for why you ought to be included in decisions around the site's metadata.
Fixing metadata issues with your web developer
Show your team the differences between what you consider good and bad when it comes to metadata. If you have the time, pull click-through rate data for different examples of each and compare them, then determine the amount of traffic lost from unoptimized metadata.
To find metadata issues on your site and begin correcting them, navigate to the Metadata Issues section of your Site Crawl, download the CSV file, determine and prioritize what needs fixing, and get started.
We highly encourage you to share the HTML Elements section of the SEO Cheat Sheet with your devs. There you'll find common best practices for each of these elements.
Content Issues
Content issues can mean many things. The ones we focus on in this section of the site audit are duplicate content, duplicate titles, thin content, slow load time, and missing H1s. Each of these issues can negatively affect the way search engines see your content.
Why does it matter for SEO?
If your content and code look too similar across pages (duplicate content), Google may not know which page you want to rank, which can cause the wrong page to rank or keep both out of the rankings altogether.
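If you suspect a set of near-duplicate URLs, one quick check is whether they all declare the same canonical URL. Here's a small sketch using requests and beautifulsoup4 with a hypothetical set of duplicates:

```python
# A minimal sketch: for a hypothetical set of near-duplicate URLs, check whether
# they all point to a single canonical URL. Assumes requests and beautifulsoup4.
import requests
from bs4 import BeautifulSoup

DUPLICATE_URLS = [
    "https://example.com/shoes",
    "https://example.com/shoes?sort=price",
    "https://example.com/shoes?utm_source=newsletter",
]

for url in DUPLICATE_URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    target = canonical.get("href") if canonical else "(no canonical tag)"
    print(f"{url} -> canonical: {target}")
```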
On the other hand, duplicate title tags can really confuse a user. If a user gets to a SERP and you have two listings with the same title, which are they supposed to click? If you’re lucky, they may read the meta description to decide, but realistically, they’ll either skip over it, choose the wrong link, or head to a competitor.
Thin content can also hurt your rankings. Often, thin content fails to fulfill searcher intent; since Google's main goal is to satisfy the searcher, you can see how this might hurt rankings and traffic.
Slow load times are a surefire way to drive searchers away to your competitors. Google knows this, which is why it puts an emphasis on page speed in its ranking algorithms.
Lastly, headlines (H1s) ought to be present to tell Google what your page is about. If a page doesn’t have one, or heading tags aren’t used correctly, Google may not get a clear understanding of that page. Proper use of heading tags also plays a big role in site accessibility, helping screen readers and nonsighted visitors parse content correctly.
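A quick way to audit that last point is to check each page for exactly one H1. Here's one more sketch using requests and beautifulsoup4 (the URLs are placeholders):

```python
# A minimal sketch: flag pages with no H1 or more than one H1.
# Assumes requests and beautifulsoup4; the URLs are placeholders.
import requests
from bs4 import BeautifulSoup

for url in ["https://example.com/", "https://example.com/blog/some-post"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    h1_tags = soup.find_all("h1")
    if len(h1_tags) != 1:
        print(f"{url}: found {len(h1_tags)} H1 tags (worth a closer look)")
    else:
        print(f"{url}: H1 = {h1_tags[0].get_text(strip=True)!r}")
```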
For more information on content issues and their solutions, refer to the Canonicalization and Performance section of the Web Dev SEO Cheat Sheet.
Getting the dev perspective
How have your devs viewed content issues in the past? What's a heavy lift for them when it comes to implementing your fixes, and how can you create a path forward that makes it easiest on both of you? Ask questions to help you understand where they're coming from, and create replicable processes to get your fixes live on the site.
Fixing content issues with your web developer
What all of this comes down to is user experience. If a searcher is looking to find a page to fulfill a need, they need to easily determine what to click in the SERPs and to access good content quickly. If the right page is hard to find or doesn’t have enough information, the user will bounce and not convert.
Every session has the opportunity to earn you money, but you have to earn it by fulfilling the searcher's need. When talking to your pals on the web development team, be sure to explain that to them — and show them the numbers! For example, if your bounce rate is too high, it could be due to slow load times or a page with no headline. Do your research, and use that research to power your conversations.
To get more information on what Content Issues we’ve identified on your site, navigate to the Content Issues section of your Site Crawl, download the CSV file, prioritize what needs fixing first, and get cookin'.
Learn about how SEOs and developers can work better together in this Whiteboard Friday with Helen Pollitt:
You’ve got this!
You're well-equipped with everything you need to work effectively and positively with your web dev team. You know exactly what to look for and what information to bring to the table.
If you haven’t gotten Moz Pro yet, don’t forget to sign up for your free trial so you can pull all of that data you need to get started.