Yeah… I know… Our job (and what we love) is to code. And SEO is marketing, no code at all. But for some reason, you need a bit of it, so let me tell you the minimum you need to have on your website. Nothing less, nothing more.
Page Title
Here, I'm not talking about the H1 of your website (patience!) but about the title of your HTML page. Yes, the one that is displayed in your browser's tab.
Don’t
- put just the name of your brand, unless you're Nike or Airbnb.
- put “Home page” on the home page.
Do
- let the user know what your website is about just by reading it
- add location data if it’s accurate. E.g. you are a company called “Kitten” that offers pet sitting services only in Hamburg, then you should have something like “Kitten – Pet Sitting in Hamburg”.
In any case, try providing the practical information your potential customer is likely looking for.
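For instance, sticking with the hypothetical "Kitten" company from above, the title tag in your <head> could look like this:

```html
<head>
  <!-- Shown in the browser tab and used as the clickable title in search results -->
  <title>Kitten – Pet Sitting in Hamburg</title>
</head>
```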
H1 please!
I know, sometimes, because it doesn't fit with the designs, you're tempted to use an H4 directly, or no title at all. But don't!!! Never! Use CSS rules to make your H1 look the way you want. Think of the purpose of your page, extract keywords and build your H1 with those, build your H2s with secondary keywords and so on… Good "H"s come with common sense regarding the purpose of your website.
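As a sketch, still with the hypothetical pet sitting example, a sensible heading hierarchy could look like this:

```html
<!-- One H1 carrying the main keywords of the page, styled with CSS as needed -->
<h1>Pet Sitting in Hamburg</h1>

<!-- H2s built from secondary keywords -->
<h2>Cat Sitting at Your Home</h2>
<h2>Prices and Availability</h2>
```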
robots.txt
You know that many robots are visiting the whole web, nothing new here. In most cases, you don't care about them as long as they are not seeking security breaches. But for "obvious reasons" you want to interact with Google's or Facebook's in an SEO context. This file should be at the root of your project or of your base domain, must be named "robots.txt", and defines the way robots can interact with your website. As there are many ways to handle robots, and this is not an article about that precisely, have a look at Mozilla's documentation or the robotstxt website for extended documentation and tools on the subject. Anyway, a robots.txt file must reference the locations of all the sitemaps you may have on your website, which leads us to the next section.
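Before moving on, here is what a minimal robots.txt could look like (the disallowed path and the sitemap URL are placeholders):

```text
# https://example.com/robots.txt
User-agent: *
Disallow: /admin/

# Reference every sitemap your website provides
Sitemap: https://example.com/sitemap.xml
```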
Sitemap XML
I won't digress on the subject as the name is explicit enough, but basically this is an XML document where you provide a set of crawlable URLs, mostly for search engines. It is a widely used protocol, and having it helps a lot with natural referencing. Just keep in mind that only non-malicious crawlers will respect what you put in this file and not go further; malicious ones will still try to find something they can exploit. Documentation: standard-sitemap.org or sitemaps.org.
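Following the sitemaps.org protocol, a minimal sitemap could look like the sketch below (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-05-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://example.com/pricing</loc>
  </url>
</urlset>
```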
Language
The language of your website is generated for you most of the time. I believe that if you have an international website you use i18n solutions, which already handle it for you. If you don't, it's as easy as <html lang="fr"> with the correct language code. Even if you're not planning on translating your website, it's important to specify the language: people might set their search engine to only show results in a specific language, and they won't be able to find you easily otherwise. Here is some more information on how to set your website language properly.
$ man ascii
You must specify your website's character encoding, or character set. The most used set is UTF-8, which is HTML5's default. HTML5 also handles UTF-16, which HTML4 does not. Find an extended guide to character encoding. I'd recommend reading it all: the article covers the whole history of character encoding, and I love knowing the history of the tools we use. Do you like it too?
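In practice, declaring it is a single meta tag near the top of your <head>:

```html
<head>
  <!-- Declare the character encoding as early as possible in the head -->
  <meta charset="utf-8">
</head>
```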
Keyword density
For that one, I use SEO Quake, a free browser extension that lets me check everything in this list, but I use it mostly for checking keyword density in my pages. It's useful because you get a great overview of which keywords you have the best natural referencing on, and sometimes you will get surprises. With this you will know if you need to refactor your text and titles to put more weight on certain words. The advice I would give is pretty much the same as for the page title: the purpose of your website or product should be at the top of that list, and location if your services are localized.
HTTPS
Tim Berners-Lee, aka TimBL, created HTTP thirty years ago and it was great just like that for a while. But as there are many attacks on the internet, the community created encryption and was looking for a way to apply it to the world wide web's communication protocol. So HTTPS came in. Then, in October 2018, Google decided to favour only websites that can be browsed through HTTPS in its search results, and it became mandatory to have it on our websites as this is still the most used search engine. For the record, there are many other search engines that value your privacy, so don't hesitate to check this article that provides a list of options to try.
Good news, Clever Cloud sponsors Let’s Encrypt and our collaboration allows us to automatically provide you with SSL certificates. Have a look at the blog post that explains how it works. If you already have your certificates or want to use others, find the process for setting them up in our documentation.
Social sharing metas
I hesitated before adding this one to the list, but think about a link sent to you on IM with neither picture nor description: not very clickable, or you will at least ask yourself some questions before clicking on it. So I would recommend having meta tags generated at least for the dominant social networks. Facebook created a standard, the Open Graph protocol, which is used by them and many others like Instagram. As the standard is being pushed by a lot of big players, there are also several implementations in several languages that you can find by yourself. As I want to make your life easier, here is a tool that will help you generate the right metas for your pages, including Twitter cards (yeah, they don't use OG).
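As a sketch, still with the hypothetical pet sitting page (titles, descriptions, URLs and image paths are placeholders), the basic tags could look like this:

```html
<head>
  <!-- Open Graph, read by Facebook, Instagram and many others -->
  <meta property="og:type" content="website">
  <meta property="og:title" content="Kitten – Pet Sitting in Hamburg">
  <meta property="og:description" content="Cat and dog sitting at your home, seven days a week.">
  <meta property="og:image" content="https://example.com/images/share.jpg">
  <meta property="og:url" content="https://example.com/">

  <!-- Twitter cards use their own namespace -->
  <meta name="twitter:card" content="summary_large_image">
  <meta name="twitter:title" content="Kitten – Pet Sitting in Hamburg">
  <meta name="twitter:description" content="Cat and dog sitting at your home, seven days a week.">
  <meta name="twitter:image" content="https://example.com/images/share.jpg">
</head>
```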
Common sense generalities
Don't mess with your URLs
Try to keep them short and simple. Long URLs full of characters that look weird to the non-developer eye are not well referenced.
Be inclusive with alts
Every <img/> you provide must have an alt attribute. This increases the accessibility of the internet for people with visual impairments.
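A minimal example, with a placeholder image path and description:

```html
<!-- The alt text is read out by screen readers and indexed by search engines -->
<img src="/images/cat-on-sofa.jpg" alt="A grey cat sleeping on a sofa" />
```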
Use iframes with parsimony
Iframes are not part of the standard web page layout and, IMO, if poorly used they just give your website a Frankensteinian look. Use them when you need them, and whenever possible use APIs or RSS to gather content from other websites instead.
Avoid using flash
Adobe Flash is deprecated and has always been a great point of entry for attackers, so it was never super popular among search engines, even though it gave us lots of fun with in-browser games.
Favicon
On top of the fact that it helps with natural referencing, it helps your users identify you and is a good opportunity to put your logo somewhere one more time. Here is a favicon generator and here is a how-to.
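Once you have the files, referencing them is just a couple of link tags in your <head> (the paths are placeholders):

```html
<!-- Classic favicon, plus a PNG variant for modern browsers -->
<link rel="icon" href="/favicon.ico">
<link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">
```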
<!DOCTYPE>
A simple instruction that tells browsers which version of HTML you are using. For HTML5, define it as the very first line of your document:
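```html
<!-- The HTML5 doctype, always the first line of the page -->
<!DOCTYPE html>
```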
Bonus: Use the schema vocabulary
This one is not obvious, but schema.org's purpose is to help get more structured data on the web by adding vocabulary that acts as tags for your content. No obligation here, but it is becoming a standard, so it's worth looking into.
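As a sketch, a JSON-LD block for the hypothetical pet sitting business could look like this (every value is a placeholder):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Kitten",
  "description": "Pet sitting services in Hamburg",
  "url": "https://example.com/",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Hamburg",
    "addressCountry": "DE"
  }
}
</script>
```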
Roll up to top
With this checklist, you have the minimal requirements to get natural referencing from search engines, but don't forget that I hate doing SEO as much as you do, so this is absolutely not an extended guide to SEO. If you feel like going further, have a look at this article; it's great and the design of the website is super cool.