Basic Technical SEO Checklist for Startups
After running into the same technical requirements during the initial SEO setup on several projects, I thought it would be useful to make a checklist to ensure the ‘basics’ are covered each time. A version of the checklist I’m using on new projects is below. The focus is on the initial technical setup rather than the ongoing link-building/content-marketing side of SEO, so if you’re not comfortable working with code you may need some help from your development team.
Basic Technical SEO Checklist
Objective
Create a solid foundation of SEO best practices on your website to allow search engines to correctly index your site.
Technical Checklist:
- Ensure the overall structure of your website’s HTML and content conforms to current SEO best practice (see schema.org).
- Check that you are using heading tags (H1, H2, etc.) correctly.
- Use descriptive file names and img alt attributes for photos (eg: ‘your-brand-name.jpg’ rather than ‘qzrt1w.jpg’).
- Use correct <title> tags for your site, limited to 70 characters. It’s usually good to have ‘Your Brand’ at the end of the tag.
- Provide a meta description for each page to display on SERPs (Search Engine Results Pages – the results that appear when you search on Google/Bing).
- Use rich snippet markup where possible – for products, mark up the name, description, price, creator, etc. (see schema.org/Product for example markup).
- Set Facebook Open Graph and Twitter Card meta tags.
- Create a Google Business Profile for your business to produce more informative local SERPs.
- Use canonical tags to point search engines to the ‘authoritative’ version of each page.
- Make sure URL slugs are descriptive of products/services
- Create an XML sitemap and make sure it is updated automatically as pages change (see: XML-sitemaps.org).
- Create accounts with Google Search Console and Bing Webmaster Tools. Submit sitemaps, request crawls, and run their SEO checks.
- Ping Google/Bing etc. with RSS updates.
- Test your page load speed on an ongoing basis and keep it consistently low – consider caching, serving assets from a CDN, or minifying assets to decrease load time.
- Make sure your robots.txt is correctly set to allow web crawlers access.
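To make several of the items above concrete, here is a minimal sketch of a page <head> combining a title tag, meta description, canonical tag, and Open Graph/Twitter Card tags. All URLs, names, and content values are placeholders – adapt them to your own pages:

```html
<head>
  <meta charset="utf-8">
  <!-- Title under 70 characters, with the brand name at the end -->
  <title>Blue Widgets for Startups | Your Brand</title>
  <!-- Meta description shown on SERPs -->
  <meta name="description" content="Hand-built blue widgets, shipped worldwide. Free returns within 30 days.">
  <!-- Canonical tag pointing at the authoritative version of this page -->
  <link rel="canonical" href="https://www.example.com/widgets/blue">
  <!-- Facebook Open Graph tags -->
  <meta property="og:title" content="Blue Widgets for Startups">
  <meta property="og:description" content="Hand-built blue widgets, shipped worldwide.">
  <meta property="og:image" content="https://www.example.com/images/blue-widget.jpg">
  <meta property="og:url" content="https://www.example.com/widgets/blue">
  <!-- Twitter Card tags -->
  <meta name="twitter:card" content="summary_large_image">
  <meta name="twitter:title" content="Blue Widgets for Startups">
</head>
```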
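For product rich snippets, schema.org markup can be embedded as JSON-LD in the page. A sketch with placeholder product details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "description": "A hand-built blue widget.",
  "image": "https://www.example.com/images/blue-widget.jpg",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```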
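An XML sitemap is a plain list of URLs with optional change metadata. A minimal example (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/widgets/blue</loc>
    <lastmod>2023-12-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```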
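And a simple robots.txt that allows crawlers in while keeping them out of an (assumed, for illustration) admin area, with a pointer to the sitemap:

```
User-agent: *
Allow: /
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```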
About David Turnbull
I'm a New Zealander in Sweden, building web-apps with Ruby on Rails.