
Search engine optimization (SEO) tips

As many of you already know, SEO has been an important topic for every web developer for more than a decade now. It is not enough to just build a website; you also need to put it on the map, so to speak. There are, and were in the past, techniques for doing that effectively, some of which no longer apply to modern search engines, mostly because people exploited them too aggressively to gain rankings.

I was first introduced to SEO in 2008 and it has been a long ride since then. So here I'm going to tell you about things that were, things that have been and things that are new. Some things are well known and can be learned and implemented in a fairly short amount of time, while other things, such as content and external link building, take time. You can think of it more as a never-ending process. Know that the list of things here isn't complete, but it is a blueprint, schematics if you will, that will produce good results if you can follow it and stay on track during the process.

Now, there is one thing you need to know about SEO before diving into the topic. The exact ranking algorithms used in popular search engines such as Google, Bing and Yahoo are kept private, for a good reason: so that there is no cheating. Over the years SEO has therefore become more of an overall project, meaning that you cannot just focus on one thing and spam the life out of it simply because search engines allow it. Instead, everything has to be approached carefully to achieve the best results. I will explain what works and what does not, as well as what to avoid, to the best of my knowledge.

But before doing any of that, it must be emphasized that without prior research of the market and the targeted public, your efforts may not bring the desired effect. Therefore, a plan has to be made beforehand. Now the question arises: how do you make a plan without knowing the basics of SEO? You can't, at least not a proper one, and that is why this content was created: to help you understand the whole picture.

One more thing: when thinking of SEO, always keep in mind that the visitor is the king and you are a servant. Do what's best for the king and search engines will like you; that is how you should approach any project from the start. Quality matters, whether it be performance, usability, layout or content.

PART 1 - Onsite optimization

The things we will be looking at are the page title, meta description and meta keywords, all found in the HTML head section; SEO-friendly URLs; and content, where we will discuss using proper HTML tags, from headings to image titles, and keeping the content original with a natural keyword density and internal links. But the first thing to look at is the site structure and layout, which, if only indirectly, does have an impact on the final product and rankings.

Structure and layout

Some people might ask: how is this even relevant to SEO? Technically it is not, but most things that are depend on it. Let's consider a couple of examples. If your site is poorly structured, you might have problems implementing headings or content correctly. The same goes for layout and design. If visitors have trouble navigating or reading the content, they will leave your site and forget about it, let alone recommend it on social networks, which provide the most natural backlinks that improve rankings. This also applies to mobile devices, so make sure your page works well on both desktop and mobile, because search engines do care about it.

Things should be kept simple so that everyone can understand where elements such as navigation are and where the content is. Also make sure to style the headings, text and images correctly so that everything is in place and easily readable. One more thing to note: every time you make a layout, check that the page looks the same in different browsers. The most used are Chrome, Internet Explorer, Safari and Firefox. Different screen resolutions also have to be considered. So, again, make sure these things are not overlooked when making a plan.
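
On the mobile side, one small but concrete step, assuming your stylesheet is already responsive, is the viewport meta tag in the <head> section. It tells mobile browsers to render the page at the device's width instead of a zoomed-out desktop view:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```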

Page title

This is one of the most important things when optimizing your site. It is the first thing shown in the hits when a search query is made, and it says what the page is all about, so this is where your primary keywords should appear. Why am I saying this in the plural? Because a title can be specific and can target different queries that point to the same result. Of course, the keywords should be related; for example, if we are discussing fruits, then you could go with both apples and oranges in the title. Or, even better, expand the title and say "Healthiest fruits. Where to buy fresh organic apples and oranges?" It all depends on the context of your content and what your research has shown people are entering into search engines.

However, it is not a good idea to stuff too many keywords in the title or make the title too long or too short. Don't just go with "Fruits" or "Apples" or "Oranges". This way, you will make it too generic and your site might get lost in the swarm of internet sites. The title needs to be as specific as possible.

How to implement:

When building static pages, never treat the title as one of your include files that is the same for all pages. Each page should have its own title, placed like this:


<title>A long-tailed title about search engine optimization</title>

in the <head> section of the HTML file. For dynamic pages, make sure the title is saved in the database along with the other elements of the content, and then call it back into the title tag. Once you have the row elements in an array, it would look like this:


// PHP example - escape the value so stray characters cannot break the markup
<title><?php echo htmlspecialchars($row['title']); ?></title>
Meta description and keywords

In the past, both were important parts of any SEO strategy, but more recently meta keywords have been abandoned because people abused this particular tag. Still, it is not against policy, at least for Google, to keep using them.

For the meta description, it is recommended to use somewhere between 130 and 160 characters to describe what the specific page is all about. It should be unique and should use keywords to draw the attention of people looking for what you offer, and consequently improve rankings. Just remember that the meta description is shown just below the title in the hits of a search query. As for meta keywords, there is no need to include them, but it doesn't hurt if you do. Just avoid stuffing in as many as you can, and avoid repeating words. For example, if you were planning to use both "apple" and "apple sauce", it would be better to keep just the latter, since it already contains the former.

How to implement:

It works the same as for the title: on static pages, each page should have its own unique tags in the <head> section of the HTML file.


<meta name="description" content="Unique meta description about search engine optimization that will help you improve rankings of your sites">

<meta name="keywords" content="seo, search engine, rankings, websites, static, dynamic, pages">

As for dynamic pages, make sure to add columns for the meta description and meta keywords and save them to the database, so that when the row is called upon you can do this:


// PHP example - escape the values before printing them into the attributes
<meta name="description" content="<?php echo htmlspecialchars($row['meta_desc']); ?>">

<meta name="keywords" content="<?php echo htmlspecialchars($row['meta_keys']); ?>">
URLs

Uniform resource locators (URLs) are, as the term states, an important part of SEO, because not only does a URL enable the retrieval of the content, it also states what the content is all about. Not to mention that it appears, along with the title and meta description, in the hits of a search query. It is not as important as those two, but it is nonetheless relevant. A URL should contain about three words, and no more than four, to still be considered friendly and acceptable. However, even these few words should not promote several unrelated keywords. Going back to "fruits", a bad example would be "domain.com/oranges-lemons-apples.php", while a good example would be "domain.com/fresh-organic-apples.php". In other words, find out the most important query you are after and put it in the URL.

How to implement:

For static pages this is easy. Just name the file the way you want the URL to look and then hyperlink to it. You can also put the file in a folder and add the folder to the link. If we have a folder named "fruits" and inside it a file named "fresh-organic-apples.php", the link would be:


<a href="/fruits/fresh-organic-apples.php">Link to fresh organic apples</a>

//The URL would look like this:
https://domain.com/fruits/fresh-organic-apples.php

But with dynamic pages things are not so easy, since there is usually only one file serving multiple pieces of content from the database. I will explain this in a future tutorial.
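
As a rough preview, the usual approach on Apache servers is URL rewriting: a rule in an .htaccess file silently maps the friendly URL onto the single dynamic file. The file and parameter names below are placeholders, not something from an actual site:

```apache
RewriteEngine On
# Map /fruits/fresh-organic-apples to the real dynamic script
RewriteRule ^fruits/([a-z0-9-]+)$ /content.php?slug=$1 [L,QSA]
```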

Headings

Headings are another part of the puzzle when it comes to SEO. When working on the content, you should include them in a vertically hierarchical order, which means that <h1> always comes before <h2>, which comes before <h3>, and so on. They should include your keywords, but the keywords should not be repeated across them, meaning that <h1> should be different from <h2> and so on. Every heading can contain different keywords related to the content of the page. You may think of it as optimizing your page for multiple search queries, again in hierarchical order: <h3> should contain keywords less important than <h2> and <h1>. It is also important to note that <h2> and <h3> should appear lower on the page than <h1>. If you can follow this scheme, you'll be fine.

Now, one question that arises is whether to go on forever when you have more than three headings per page. You could, but I haven't noticed any considerable difference between, for example, using <h4>, <h5> and <h6> or just repeating <h3>. It is up to you how to handle it. The thing to avoid here is masking headings to be invisible, for example by giving them the same color as the background and moving them away from the content. This is considered a black hat technique to improve rankings; it won't work and may get your site(s) banned from search engines.

How to implement:

Since HTML tags can be included in the database, there is no real difference between static and dynamic pages. Everything else should be self-explanatory, but just in case, here is an example:


<h1>The most important and valued heading</h1>
<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nam quis sollicitudin leo. Integer eu justo pellentesque, pretium leo a, rhoncus orci. Phasellus mollis pretium accumsan.</p> 

<h2>The second most important and valued heading</h2>
<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nam quis sollicitudin leo. Integer eu justo pellentesque, pretium leo a, rhoncus orci. Phasellus mollis pretium accumsan.</p>

<h3>Third most important and valued heading</h3>
<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nam quis sollicitudin leo. Integer eu justo pellentesque, pretium leo a, rhoncus orci. Phasellus mollis pretium accumsan.</p>
Image names and tags

When dealing with images, it helps if you include keywords in the image names and in the title and alt attributes. The names, as opposed to the mentioned attributes, should not be more than three words long, separated from each other by a - or _ sign. Also make sure to avoid repeating keywords across different images on the same page. Additionally, you may want to compress the image files; the resulting performance gain will, on a grand scale, improve your rankings.

How to implement:

<img src="/images/seo_for_beginners.jpg" title="Search engine optimization for beginners" alt="SEO basics for websites"/>
Content and keyword density

This is the most important aspect and the toughest thing to get right on your website. It requires concentration, dedication, determination, knowledge and time. This is where you engage with visitors, and that in turn determines the rankings. It is not so much about keyword density as it is about quality, originality and the very essence of what the site offers. You should definitely not mislead people into clicking on your website only to offer them something else instead. The content you offer is the source of your keyword research; it is the very beginning of a plan.

If you don't know where to even begin with this, you can always write the articles beforehand and then run them through a keyword density tool to see what your content is emphasizing, that is, which keywords you used the most. Then you can either begin researching the market for them or change your articles, if you feel you are way off course from what you are trying to accomplish. As for checking the density of keywords, or of accumulated keyword queries of two or more words, don't beat yourself up when preparing the content for publishing, mostly because no one knows what the correct density is for search engines. In theory it should be from 1% to 2%, but that's a guess at best, if you ask me. It should just feel natural to anyone reading your content, so don't worry too much about it. Unless you've made a serious effort to consciously add the keywords of your interest to the content, you will stay below 3% naturally, on any given search query.
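
If you would rather script the check than rely on an online tool, the math is simple: occurrences of the keyword divided by the total word count. The helper below is only an illustration, not a tool this article depends on:

```php
<?php
// Hypothetical helper: rough keyword density check for a draft article.
// Density = occurrences of the keyword / total word count, in percent.
function keyword_density(string $text, string $keyword): float
{
    // Split on any non-word characters and drop empty pieces
    $words = preg_split('/\W+/', strtolower($text), -1, PREG_SPLIT_NO_EMPTY);
    if (count($words) === 0) {
        return 0.0;
    }
    // Count exact, case-insensitive matches of the keyword
    $hits = count(array_keys($words, strtolower($keyword)));
    return round($hits / count($words) * 100, 2);
}

echo keyword_density('Fresh organic apples are great. Buy apples today.', 'apples'); // prints 25
```

On a toy sentence like the one above the value comes out absurdly high; run over a full-length article, the same calculation settles into the low single digits.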

Another thing you should keep in mind is originality. If you've been honest with yourself and wrote everything in your own words, there should be no duplicates before publishing. Just in case, you can always check on Copyscape; the first several checks are free. It is also good for discovering people stealing your content. Not that you can do much about that, but thankfully they will be ranked below you, since you were the author of the original content. Well, at least on Google, where the crawler seems to register the date and time correctly.

Internal backlinks

Take Wikipedia, for example. How many times have you ended up reading an entirely different page than the one you first landed on? Everything on their website is linked to a related page that already exists on the same domain. And if you look closely, each topic is only linked once, usually the first time it appears. This is very important, not only for SEO but also for your visitors. They should be able to click on a topic that is mentioned on the current page but described in detail on another page of your website. This way you promote your other pages, and a crawler visiting your site will also note the links pointing to a particular page. Just don't overdo it. It should be done once or, for a large amount of text, twice, depending on where on the page a visitor might end up. The latter mostly applies to hyperlinks pointing to a specific part of a page.

How to implement:

There are two ways to link to a different page: absolute links or relative links. I would recommend the latter, because while absolute links may work in any case, they become a pain in the ass if you have to migrate the site to a different domain or server, or add SSL to it. So using relative paths is much, much better, but there is a catch: relative paths have their own quirks. At this point, though, you should already know the basics of HTML.


//Absolute path
<a href="http://www.domain.com/apples.php">Absolute link to internal site</a>

//Relative path; note the leading /, which means the path starts from the root directory of the site.
<a href="/apples.php">Relative link to internal site</a>

PART 2 - Offsite optimization

Here we will look at how to pick an SEO-friendly domain, how to get quality external backlinks, how to reduce the response time between a client and the server, and how to make things friendly for SEO bots visiting your website. The final thing we will discuss in this group is SSL encryption which, believe it or not, gives a small boost to SEO, at least on Google.

Picking a domain

This is quite important when it comes to SEO. The top-ranking query from your research should definitely be in the domain if possible, but not at the cost of ruining the domain. There should be no more than three words in the domain. The best way is to run the words together, like "freshorganicapples". If that is not possible, use the - sign to separate them. The top-ranking extensions are .com, .net and .org, and you should always go for .com whenever possible. However, as some of you have already found out, these domains are hard to come by, so sometimes you can also go with .info, .biz, .blog or even local endings such as .us or .co.uk, depending on the content you have and the market you are actually targeting. Remember, the keyword query always wins over the domain extension; just don't use risky extensions, ever. Also keep in mind that the domain should be easy for people to remember, to boost direct traffic. It's not going to be easy, it never is, to select a proper SEO-friendly domain for your website.

External backlinks

What are external backlinks and how do you get them? Well, in the past you had all these free directories published solely for the purpose of indexing your website and linking back to it, so you could basically spend days just registering your website. It helped somewhat, but not much, since most of these directories were ranked low in the engines. Anyway, people started abusing this and other services. Links were exchanged between unrelated but nonetheless respected websites in order to gain rankings, and it got to the point where people started buying and selling external backlinks in bulk from high- and mid-ranking websites. Those days are long gone, and nowadays buying backlinks is considered a black hat technique by most known and respected search engines. In other words, don't do this; it could get you banned from the top search engines.

So the question is: how do you get quality links that are registered by search engines? One simple thing you can always do is add social network links, for Facebook and Twitter for example, so your visitors can do it for you if they find your site relevant. This should be done for each page separately, so that only the page in question gets shared, mostly to avoid confusion among people clicking on the links. You can use a social share link generator for this. Another way is to promote the pages yourself on social networks by providing a short summary of the content and a link to the page. You may also do that on forums, public sites such as wikis, and other platforms, if the rules allow it.
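
If you prefer to build the share links yourself instead of using a generator, both Facebook and Twitter expose public share endpoints that take the target address as a URL-encoded query parameter. The page address below is just a placeholder:

```html
<!-- Replace the encoded URL with the address of the page being shared -->
<a href="https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fdomain.com%2Ffruits%2Ffresh-organic-apples.php">Share on Facebook</a>

<a href="https://twitter.com/intent/tweet?url=https%3A%2F%2Fdomain.com%2Ffruits%2Ffresh-organic-apples.php">Share on Twitter</a>
```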

Another option is to link together all of your existing websites that have related content. This still works if done right; just remember to link your website about fruits to your other website about food, not the one about construction. Yet another solution, depending on your expertise and content, is to write an article for a respected news or science website and provide a backlink to the site where the content originated. Speaking of which, you may want to copyright all of your material and make it clear to your visitors that they may publish your content elsewhere on the web, but only if a backlink to the original page is provided. This pretty much sums it up, and now you can see why this process can go on indefinitely.

Response time

From my experience, this is very important. Why? Well, it's the same as with structure and layout: the king is unhappy if he has to wait, right? I mean, do you personally like waiting for a page to load so you can read or do whatever the site is about? Statistics show that more than 50% of people lose patience and leave before the page finishes loading, which adds to the unfriendliness of the site. So what can we do to speed things up?

First, identify where your main traffic comes from. Is it the US or Japan, Brazil or France? This is important because it helps us choose where the remote server should be located for better and faster service. It doesn't have to be in exactly the same country, but it should at least be on the same continent. Next, purchase a good server with a good connection to the internet and good technical support. You may want to do your own research on that, but personally, I recommend Webfaction.

Then comes the part that requires a bit of technical skill. Note that when you, as a client or visitor, request a page, the server has to open the file and include all the files needed for the page to load properly. The bigger the files are, the longer it takes to load. It also depends on the functions used for communication between different applications, so make sure to optimize your code. Also avoid heavy scripting and being overly dependent on different libraries; this mostly goes for JavaScript. If you really have to use them, save the libraries to your server and serve them locally, to cut the time needed for making an external connection to those libraries. Another thing is to make sure all images and videos are compressed. Those can be quite heavy, especially for mobile users or people with a bad internet connection.
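
To illustrate the point about libraries, here is the difference for a script include. The jQuery file is just an example of a typical library; any other would do:

```html
<!-- External copy: every page load may need a connection to a third-party host -->
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>

<!-- Local copy, downloaded once and served from your own server as suggested above -->
<script src="/scripts/jquery-3.6.0.min.js"></script>
```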

For more tweaks and for testing your sites, I recommend Google's PageSpeed Insights, a completely free service that you will learn a lot from.

Make site friendly for SEO bots

This is not just about being friendly; the point here is to tell SEO bots exactly where to crawl and which folders and files to index. The first thing we will look at is robots.txt, which has to be created and put in the root folder. With robots.txt we disallow any files and folders we don't want indexed, such as the admin folder, stylesheet files, scripts, images and non-content include files. The other thing to look at is sitemap.xml, which enables search engines to crawl your website more efficiently. This is important if your pages are not well linked to each other or through navigation, for example if you use archives, or if you have a really large website with an abundance of unrelated pages. It is also always a good idea to provide a sitemap file when launching a new site on a new domain.

How to implement:

Create a file and name it robots.txt. Now let's say we want to exclude the admin, script and image folders and a couple of files. It would look like this:


User-agent: *
Disallow: /admin/ 
Disallow: /scripts/
Disallow: /images/
Disallow: /headtags.php
Disallow: /style.css

Alternatively, you may use the robots meta tag in the head section to tell the crawler what to do:


//Crawler will index this page and follow its links
<meta name="robots" content="index, follow">

//Crawler will index this page but will not follow its links
<meta name="robots" content="index, nofollow">

//Crawler will not index this page but will follow its links
<meta name="robots" content="noindex, follow">

//Crawler will not index this page and will not follow its links
<meta name="robots" content="noindex, nofollow">

As for sitemap.xml, you can generate it online, download it and put it in the root folder of the website.
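
If you would rather write it by hand, the format itself is simple. A minimal sitemap.xml, with placeholder URLs and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://domain.com/</loc>
    <lastmod>2018-01-01</lastmod>
  </url>
  <url>
    <loc>https://domain.com/fruits/fresh-organic-apples.php</loc>
  </url>
</urlset>
```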

TLS/SSL encryption

Now that this is becoming a standard for every website and web application, search engines provide a small ranking bonus for secure connections. Not that long ago you would only see it implemented on login pages or pages dealing with credit cards and the like, but these days you see it elsewhere as well. What transport layer security (TLS), or by its older name secure sockets layer (SSL), basically does is encrypt the traffic between the server and a client. So every form you use as a client, such as login, search or contact, is encrypted and cannot be intercepted in plain text. It provides anonymity and prevents the theft of personal information and interests, which is evidently and, in my opinion, rightfully being promoted by the big companies. Whether this small ranking bonus will hold in the future I don't know, but right now it will give you an edge over non-secured sites.
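
Once a certificate is installed, it also helps to redirect all plain HTTP traffic to HTTPS, so that search engines see only one, secure, version of every page. Assuming an Apache server with mod_rewrite enabled, a common .htaccess sketch is:

```apache
RewriteEngine On
# If the request did not arrive over HTTPS, redirect it permanently
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```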
