SEO Basics: Introduction To Search Engine Optimisation

Why optimise your web site?

Success is heavily biased towards the top of the rankings. The top-ranked site for a search does better than #2, and the top few results get many times more visits than those at the bottom of the first page, which in turn get many times more clicks than results on page 2, and so on.

The skew is so great that the benefits of getting higher up the rankings far outweigh the costs. You should be optimising your web sites to get high rankings.

Aim to get in the top 5 or 10 results for one or more specific search terms.

Marketing preparation

The vital first part of optimising your web site for search engines is simply to decide what key words or phrases you want to do well on. (Later, we’ll look at how you implement these phrases – to make your page seem to be more about them.)

The fact is that there’s no such thing as a generally optimised web site. Sure, you can make a site search-engine “friendly”, which means making it clean, transparent and easy for search engine spiders to navigate. That should be taken for granted. But you can only optimise for specific key words that you decide in advance.

Before you decide the key words, you have to think about your market. Who do you want to visit your web site? Only then can you address those people by the search terms they’re typing in.

Try to be as specific as you can be when defining your market.

Let’s take an example: You’re a self-employed web designer. What terms do you select for your SEO exercise?

Well, “web design” would seem to be a good start. But if you optimise your site for such a generic term, you come up immediately against several problems:

SEO Problem #1 – Competition

There’s an awful lot of competition for the term “web design”, so it’s extremely difficult to get high up the rankings. You’ll be lucky to get into the top few dozen results, which is hardly worth doing.

You’ll actually find more success aiming to get in the top few search results for a much narrower market.

Problem #2 – Dilution

As well as the sheer amount of competition, a search for “web design” will return a wide range of results. You’ll get people talking about web design, people doing it, and people selling related products and services.

How valuable would it be for your site to feature as part of that crowd? Even when you do succeed, how valuable is each “eyeball” that sees your link on the search engine results pages (SERPs)? For each 100 people that see your link and description on the results, how many are likely to be looking for what your site offers? If it’s only a handful, those eyeballs had better be highly valuable. Most sites benefit from more focused results.

When considering how much dilution you can handle, also consider how many visits you actually need. If you’re a web designer, you don’t need 1000 visits from prospective clients per day, because you’ll never be able to deal with the contact requests coming from those visitors! Surely it would be much better to get 20 visits per day from highly-qualified prospects.

Plan your SEO Campaign

When you’re clear about your market – who they are, what they want, and what they may be looking for – the next step is to select the combination of search terms that will deliver you:

The right number
of the right visitors
who are looking for what you offer

Formula for Search Engine Optimisation Success

These two considerations (competition and dilution) point to a simple formula for success:
Success = Term popularity × Value of a view ÷ Competition

Now the term “web design” may match a lot of searches, but it will match “web design tutorials”, “web designers boulder colorado”, and a plethora of other things. So while the popularity may be high, the value of showing up on a search is comparatively low, and there are a lot of competing web sites out there.

A term more focused on the site’s content and goals – like “Web Designer small business Oregon” – will be less frequently searched for, but the value of each view will be higher and the competition smaller, which should result in more success.

It stands to reason that there will be a point at which the search term becomes too specific and simply doesn’t generate enough views to deliver enough value to the business, no matter how accurately qualified those views may be. And logically, somewhere in between lies an optimal point, where the value is high, the competition low, and the term generates enough views to support the site with an appropriate number of qualified leads.

In reality, the only way to decide how specific to go is to apply a combination of educated guesswork and trial and error. But a handy rule of thumb is to ask, “How specific does the search term need to be to reach just the people we need?” It sounds simplistic, but it implies being general enough not to exclude the people you want to reach, while being specific enough to exclude the people you don’t need to reach.

Always look to exclude first the people who won’t end up paying the wages. For example, if you’re a plumber who’s based and works only in Littletown, definitely consider adding “Littletown” to your search terms.

You can fine-tune things to get the right overall amount of traffic to the site, both before implementing an SEO campaign, and throughout. If there’s too much general traffic, generating too few conversions, you have the option to go more specific. If there’s not enough traffic, you can experiment with broadening your search terms to reach more eyeballs.

Recommended SEO Keyword Search Tool

I’d recommend as a tool to help you figure out the ideal search terms.

It has a free trial, which lets you enter a bunch of possible terms and then indicates how much traffic each term is likely to attract, plus the competition that’s out there for the term, producing an overall success indicator. (You just have to apply the “value of a view” part of the formula.)

Typos and mis-spellings

It can be a productive tactic to select mis-spellings of search terms in order to compete for a slice of normally competitive search traffic.

Spread your Bets with Multiple Pages for Search Terms

The good news is that you don’t really have to choose just one search term. Many sites specifically feature a range of pages, each targeting a different search term.

There are two very good reasons for applying multiple terms on the same site.

One reason is that you may have several valid search terms, for example covering a range of products or services.

Secondly, you can’t predict how well you’ll do on each particular term until you try it, so it makes sense to test several combinations to find out which performs better. The best statistics packages (like Google Analytics) will help you track how traffic comes into your site.

If you’re not trying a range of search terms through different pages on your site, perhaps you should consider it, as you could be missing out on more successful tactics. A professional SEO agent will continually test different combinations of key words and phrases to find the best approach for particular organisational goals.

It’s common to find a combination of more generic pages that include a rich variety of keywords, pointing to a number of more specific pages that focus on more detailed terms. In fact, this is a natural structure for many sites, where you have menus that list and point to groups of content, products, or services.

The menu, bridge, or index pages will naturally reflect the breadth of important words in the pages they link to, whereas the specific (“leaf”) pages should obviously feature a narrower range of specific key terms, repeated more frequently.

How Search Engines Work

When you get to the really sharp end of SEO, with lots of sites vying for the top places on highly competitive terms, SEO professionals invest a great deal of time running tests and analysing data to try and work out how the search engine algorithms work (from week to week). That specific time-based intelligence doesn’t concern us here, but obviously it pays to have some insight into how search engines in general work, and it’s actually fairly straightforward!

The important thing is to understand the goal of a search engine, which is simply to provide the best possible match of results for a search term. And imagine that there’s a team of people somewhere trying to tweak the system to produce better results than yesterday.

They want their algorithm to promote the best content that’s most likely to be what you or I are looking for when we type in a particular term. Part of that is defending the search engine’s algorithm from being tricked.

I find it helps to imagine that I’m one of these people, faced with two similar but not identical web pages: how would I figure out the most likely best match, going only on the raw information available as a computer sees it (i.e. without being able to understand intelligently what the page is about)? And matching a search term to known pages really pivots on establishing what a page is about, something SEO people refer to as “aboutness” – the subject, topic, or scope of a page.

It doesn’t matter what issue you’re considering – the approach of putting yourself in the algorithm designer’s shoes is universally applicable. A form I find helpful for anticipating specific questions on how search engines work is “All other things being equal…”. When put in this way, the answer is usually straightforward.

Try these questions for starters, and think how you’d respond:

“All other things being equal, is a page that features the exact search phrase in its title a better match than one that contains all the words in the search term but in a different order?”
“All other things being equal, is a page that has been updated in the last week likely to be more relevant than one that has not been updated for two years?”
“All other things being equal, is a page that features the key phrase 5 times in 50K of markup likely to be more relevant than one that features the key phrase 5 times in 30K of markup?”
“All other things being equal, is a page with 100 inbound links from a variety of unrelated sites likely to be more relevant than a page with 30 links from relevant, respected web pages?”

The people who have to write search engine logic are just doing the same as you, but they have to analyse the thought processes and turn them into a set of rules, and assign relative importance to all the various rules.

There are dozens of factors at play that you can use to test the “aboutness” of a web page, but these generally fall into two groups: “on-page” and “off-page” factors.

Off-page Factors

Before Google, all search engines trusted the page content itself to say what it was about (on-page), even trusting the keywords in the page’s metatags. Google’s biggest innovation was to include information from pages that linked to the page in question.

So, if your page links to my page with the words “Great site for learning web design skills”, Google would count those words as meaningful in establishing the “aboutness” of my page. The more people that link to the page using similar terms, and the higher proportion of links that use them, the more relevant those keywords will seem.
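To make the idea concrete, here’s a sketch of what such an inbound link looks like in HTML (the URL is illustrative, not a real site):

```html
<!-- A link on someone else's page. The anchor text, not just the link itself,
     tells the search engine what the target page is about. -->
<a href="http://example.com/learn-web-design/">Great site for learning web design skills</a>
```

The words between the opening and closing tags are the anchor text; the more links that share similar anchor text, the stronger the signal.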

This web-like democratising of description (”off-page” factors) made it harder to trick the search engines, and Google gained massive market share simply by being a bit more accurate than its competitors. Nowadays, all the main search engines use both on-page and off-page factors, in different combinations and in different ways, to establish what a web page is about.

A case in point, which demonstrates the power of off-page factors, particularly to Google, is an article I wrote called “Current Style in Web Design”. It has held the #1 spot on Google search for “Web 2.0 design” for well over a year. Not because I’ve loaded the article with those keywords – in fact, I didn’t originally mention “Web 2.0” anywhere in the article!

It got to that position because so many people linked to the page using those key words in their links. The position is due almost entirely to off-page factors!

Obviously, it’s much harder to control what happens on other sites than what’s contained on your own site. So there is clearly a limited number of things you can do to influence your off-page factors.

What you want, of course, is to get lots of people to link to your site using your keywords. How can you increase the chances of that?


Linkbait

“Linkbait” just means providing great content that people find useful and interesting, think that other people will find useful and interesting, and link to. This is the simplest, and often most overlooked, technique.

I can’t recommend highly enough the principle of giving away your knowledge for free. I go into this in more detail in ‘Save the Pixel’, but to recap briefly here, the idea is that publishing your specific knowledge on your web site simply attracts links that turn into traffic, meaning that lots of people see the knowledge you possess.

You win in the long term because people who aren’t in the market to pay for that knowledge will make use of it and go away, in which case you haven’t lost anything. But people who are aware of a need to apply that knowledge to their own situation can be convinced by your transparent demonstration of knowledge more than they would be by sales rhetoric claiming knowledge or capability, and they are more likely to trust you and to become customers.

Another way to generate content is to have comments or full forums on your site, which invite visitors to submit their own thoughts. This can generate whole pages loaded with keyword combinations you may never have thought of.

One more great idea: businesses should publish every question they receive from a prospect in a Q&A section on their web site. Not only does this show openness and expertise that benefit your brand, but short questions and answers are likely to contain a nice proportion of keywords, are usually quick and easy to generate, and provide content that looks appealing in search engine results because it contains a specific answer to a specific question.

Feeding keywords

You can’t always get people to use the keywords you want them to when linking to your content, but there are things you can do to increase the probability. The #1 thing is probably to give your content a strong, concise, and meaningful title that is likely to be good enough that no one is going to bother coming up with new link text.

To support that, make it very explicit what the page is about, so that anyone linking to it is more likely to select meaningful terms that happen to match web searches.

For example, put an explicit short summary after the main heading, and reuse key terms in your headings and throughout the text. (Of course, you may find that people choose their own terms to describe your page, which you may not expect, but that can work to your benefit, as I found with my “Current Style” page.)
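As a sketch of that advice, using the example article title from earlier in this piece (the summary wording is invented for illustration):

```html
<h1>Current Style in Web Design</h1>
<!-- An explicit one-sentence summary straight after the heading makes the
     page's subject obvious to anyone deciding how to link to it. -->
<p>A review of the visual trends that define current web design style,
   with examples and practical tips.</p>
```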

Getting free or paid links

You can purchase links from other related web sites, which often let you control the link content, but the costs can be prohibitive for many small sites. In general, if you go down this route, aim to get links from pages that have a similar subject matter to the page you want them to link to. It may be better to rent 10 links from smaller, specific pages than to spend the same budget on one link from a single higher-profile but more generic page.

There are usually lots of directories you should look into, many of which are free. Definitely look at getting included in the Yahoo directory and DMOZ (the Open Directory Project), as well as any industry-specific or locale-specific listings you can find.

Link exchanges and web rings

People have long sought to trick the search engines into believing that their sites are more popular than they really are, by swapping links with a single other site (you link to me, I’ll link to you), which is a simple link exchange, or by extending the chain into “web rings”, where the inbound and outbound links don’t point directly to the same place.

This is clearly a grey area, and it’s not something that I’d get involved with personally, but judging by the number of requests we get every week from people proposing link exchanges (despite the fact we make it very clear we don’t do them), there’s still a lot of demand. But the penalty for getting identified as a trickster by a search engine can be severe, so avoid this technique for any important domain.

On-page Factors

I’ve covered off-page factors briefly. Let’s look at what actually goes in your HTML markup.

Put simply, when search engines look at your page, they’re trying to figure out what the page is about, so that they can match the page to searches for the same kind of thing.

The general goal of on-page optimisation is to optimise the “aboutness” of page content. This pretty much means arranging the content so that the target search terms feature prominently compared to less relevant content.

If you were a search engine, how would you work out what a page is about?

Obviously, you’d start by looking for special terms and phrases. You’d look past ordinary words like “and”, “but”, and “then”, but any meaningful terms will stand out as indicative of the subject of the page/site.

Sometimes the page has a focused, highly specific subject. Other times it seems to be about lots of related things. You’ll often find links to other pages (or sites), which give an indication of what those pages are about too (off-page factors).

So if you’re trying to top the charts for the search phrase “Web Designer small business Oregon”, how do you make your home page (for example) more about that?

Thinking from the search engine’s point of view, and asking a bunch of “All other things being equal…” questions, it seems pretty obvious that the following would increase the page’s specific “aboutness” for that term.

The places to position key words in your HTML definitely have a hierarchy of importance. The precise balance of power is different for different search engines, and the algorithms are continually tweaked, so while full-time SEO pros make it their job to get as close as they can to the secret numbers, no one really knows for sure.

But logic indicates a general order of priority, which I’ll sketch out below.

Remember, search engine spiders don’t really understand the language your content is written in. They’re just machines programmed for a job. What they do is mostly counting.

The page says it’s about the term

Title tag very important

The main place a web page says what it’s about is its <title> tag.

So key words or phrases in your page’s title tag will count strongly towards the page’s “aboutness”.

Heading 1 tag very important

The other key announcement of a page’s subject is the main heading, usually an <h1> tag.

It’s not clear whether it’s better to have one or multiple <h1> tags, but logically it would seem to be no better to use loads of major headings containing loads of keywords than just one containing the same proportion of keywords.

(I tend to use one <h1> on my sites, on the principle that every article should have a title on the page.)

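Putting the two strongest signals together, a page targeting the example term from earlier might be marked up like this (the business name is invented):

```html
<head>
  <!-- The title tag: the page's strongest statement of what it's about -->
  <title>Web Designer for Small Businesses in Oregon | Example Studio</title>
</head>
<body>
  <!-- One main heading, echoing the same key terms -->
  <h1>Web Design for Small Businesses in Oregon</h1>
</body>
```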
Meta tags moderately useful

Meta tags contain content that is not displayed on the page, but is read and used by browsers and search engines. Their contents vary in importance between search engines, but they’re pretty much irrelevant for Google.

The main two meta tags you should always have are the description and the keywords.

The meta description tag is literally a description of a web page and its purpose. The description used to be displayed with search results, but Yahoo and Google (at least) now display excerpts from page contents, which shows how the importance of meta content has diminished.

The meta keywords tag is a comma-delimited list of key words or phrases. It really cannot be trusted by search engines, although most experts recommend you still include it.
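As a sketch, the two tags for the example site might look like this (the wording is illustrative):

```html
<!-- A human-readable description of the page and its purpose -->
<meta name="description" content="Freelance web designer creating affordable sites for small businesses in Oregon.">
<!-- A comma-delimited list of key words and phrases -->
<meta name="keywords" content="web designer, small business, Oregon, web design">
```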

The page contents reveal what they’re about

When Google launched, it focused more on the real evidence derived from what pages really say than what they say they say. Prior to Google, meta tags were taken as significant, but Google caused a sea change when it started prioritising off-page factors and real content above metadata.

Continuing the theme of what the page content actually reveals about the page’s “aboutness”, we must consider the rest of the words.

Minor headings are more significant than regular text. Use subheadings – h2, h3, and so on – and feature your key terms in them.

Content at the top of the page is more significant, because pages start with more high-level descriptions and introductions. In other words, web pages normally start by saying what they’re about!

Any non-text content, such as an image, is also content and should reveal its “aboutness”, using the alt attribute (for short descriptions) or longdesc for more content-rich images.
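Those points might come together like this on the example page (filenames and wording are invented):

```html
<!-- Key terms near the top of the page, in an early subheading... -->
<h2>Web Design Services for Oregon Small Businesses</h2>
<p>Clean, affordable web design for small businesses across Oregon.</p>
<!-- ...and in the alt text of non-text content -->
<img src="portfolio-thumb.jpg" alt="Web design portfolio example for an Oregon small business">
```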

Consider Keyword-content Ratio

All other things being equal, a page that features a good density of search terms in a smaller page size cannot be less meaningful than a page with the same number of keywords in a larger file.

Briefly, you should always strip as much non-content out of your page code as you can. Ideally, this means writing clean, semantically correct HTML that has no style information in the markup itself (it should all be in external CSS files, just as JavaScript code should be in external files).
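A minimal sketch of that structure, with style and behaviour moved out of the markup (filenames are illustrative):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Web Designer for Small Businesses in Oregon</title>
  <!-- All presentation lives in an external stylesheet... -->
  <link rel="stylesheet" href="styles.css">
  <!-- ...and all behaviour in an external script -->
  <script src="site.js"></script>
</head>
<body>
  <h1>Web Design for Small Businesses in Oregon</h1>
  <!-- Content only: no inline style attributes or embedded script blocks -->
</body>
</html>
```

Everything in the file is either content or a pointer to an external resource, which keeps the keyword-to-markup ratio as high as possible.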

Keep your Content Realistic and Human-Readable

Search engines will smell a rat if the ratio of key search terms in your page content is too high.

Don’t overstuff your content with too many repetitions of key terms. Search engines are looking for real content, not artificially enhanced content, which means a natural balance of search terms to the dietary fibre of other content.

The trick, of course, is to figure out how much is too much, and the only way to know (as SEO pros do) is to run continual tests using dummy content that is not business-critical.

Also, there’s little point getting loads of traffic to a page when the page content is not useful, readable, or helpful.

Posted on March 21, 2012