An effective Search Engine Optimization campaign begins by using keywords to target the subjects that you intend your web site to rank for. In order to rank in the search engine results pages (SERPs), a web site has to be relevant to the search engine for a given search term. This is usually achieved by having keyword-rich content on the site that coincides with the search term that the internet user types into the search engine.
In organic (natural) search engine optimization, there are several factors that contribute to the ranking for a search term. Each search engine provider has their own unique set of criteria that they use to determine which web site is most relevant for a specific term, and those factors are weighed by a complex search engine algorithm.
Due to the high amount of targeted customer traffic that stands to be gained by a web site that has a top-ranked position in the search engines, the search providers keep the details of the factors which make up their algorithm a closely guarded secret.
Even though the search providers are careful with the exact formula that they use to gauge the rankings of web sites, there have been several constants discovered that when applied, produce favorable results.
Keywords – Content without Keywords = Website without Traffic
Keywords are at the center of any effective search engine optimization campaign. The basic rule of thumb in search engine optimization is that content is king. So, it is important to base all the content of a web site on accurate and informative content.
The first thing to perform in an optimization campaign is keyword research. To find the search terms that are generating traffic for the topic that you want to rank for, you need to determine what the internet users are searching for. This may be accomplished by several different means however the most common way is to use a keyword selector tool.
There are several different keyword tools that may be used. The one tool that was the industry standard for quite a while was the Overture keyword selector tool which, as of January 2007, has been retired. The data in the Overture database remains available as historical data, but is no longer being publicly updated by Yahoo!, the company that now owns the service.
There are, however, several other keyword tools that may be used. Some are free, and some are available for purchase or as a subscription-based service.
- Free: SEO Book keyword suggestion tool:
- Free: Google AdWords keyword suggestion tool:
- Various prices: Keyword Discovery:
You Have to do your Homework – Keyword Research
There are a few things to keep in mind before starting keyword research. The fact is that you will probably not be able to rank for a very competitive single-word search term like “money” or “cars”. For that matter, there are many two-word search terms, like “real estate”, that you will not be able to rank for either. So, in order to achieve rankings for your site with a subject like real estate, you have to look for related terms that will allow you to obtain a high position, yet are not so obscure that you stop attracting relevant traffic. For instance, you might try ranking for the search term “South Florida Real Estate Agency”, or “South Florida Real Estate Brokers”.
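One quick way to build a candidate list of long-tail terms like the examples above is to combine a location, a topic, and a qualifier word. Here is a minimal sketch of that idea; the `long_tail_terms` helper and the example word lists are my own hypothetical illustration, not part of any keyword tool:

```python
from itertools import product

def long_tail_terms(locations, topics, qualifiers):
    """Combine location, topic, and qualifier words into candidate
    long-tail search phrases to feed into a keyword selector tool."""
    return [" ".join(parts) for parts in product(locations, topics, qualifiers)]

candidates = long_tail_terms(
    ["South Florida"],
    ["Real Estate"],
    ["Agency", "Brokers", "Listings"],
)
for phrase in candidates:
    print(phrase)
```

Each generated phrase would still need to be checked against a keyword tool for actual search volume and competition.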
In order to determine the best search terms to use on content pages, I generally make a spreadsheet based on the topics that I feel are of most relevance.
There is an instructional video I created on “Search Engine Optimization Keyword Research” over on YouTube … (http://www.youtube.com/watch?v=5Lw1es477LI)
The research phase is generally considered a very crucial part of the campaign because if the keyword terms that you choose are not searched for by internet users or if there are too many pages indexed in the search engines for that specific search phrase, your chances of achieving decent ranking results are very low.
There are some SEOs who suggest that every page on a web site should be based on a keyword. This strategy may produce a high number of pages that rank well in the SERPs. However, the trade-off is sacrificing quality and readability. It may be difficult for a potential customer to perceive a web site as an authority or as a respectable source of information if that web site has a directory structure that appears to be nothing more than a group of phrases. Moreover, it is very important to maintain a level of simplicity in the navigation and layout of a web site so that when visitors come to the site they may quickly identify the areas that would be of interest to them.
If a visitor is trying to determine which link in the navigation is relevant to their search and is only provided a selection of keyword-driven entry-page links, the site may come across as shallow, which could in turn damage its reputation.
As a good rule of thumb, it is suggested to have at least three, and no more than ten, keyword-driven content pages for every topic that you are attempting to rank for.
More Keyword Stuff? Oh Yes! … Keyword and Keyword Phrase Usage
After you have performed your keyword research for the topics that you want to target, there are some considerations that need to be addressed before actually sitting down to start writing. One of the key concepts that many successful SEOs tend to focus heavily on is keyword or keyword-phrase density.
Keyword density is the measurement of how many times a given keyword or keyword phrase is used in the entire body of the page. For instance, if your web page contains 500 words and your keyword or keyword phrase is used 10 times, you would have a density of 2%. If the keyword density is too high, the search engine algorithm may consider the result to be search engine spamming, which in worst-case scenarios can have your site banned from the search engines. If, on the other hand, the keyword density is too low, the page will not rank for the search term. Seeing that there are fairly severe consequences for creating a web page with a keyword density that is too high, it is generally suggested to err on the side of caution and limit your density to “safe” levels.
Though keyword density is a very controversial subject, I have found that keyword densities are relatively safe between 3% and 6%. However, you should keep in mind that algorithms may also consider an overall density of keyword usage for the whole site as well as for individual pages. If you have ten pages that use the keyword “real estate” at a density rate of 5%, and those pages comprise half the pages on the site, then by that measure the site ratio for that keyword just jumped to 25%, which is not good.* (5% x 10 pages = 50%; 50% / 2 (half the pages) = 25%)
Keyword density should be used primarily to determine whether your search term appears on a page enough times, or not enough – and it should only be used as a gauge.
There are many other factors that contribute to ranking and density is really just a useful tool that can assist you in creating a good balance of keyword rich content. Keyword density alone will not produce search engine rankings.
*There is no evidence that suggests search engines are calculating results based on total web site keyword density. However, many SEOs consider total density to be an important factor to measure.
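The density figure described above is simple enough to compute yourself. This is a minimal sketch under the article's own definition (occurrences of the phrase as a share of the page's words, counting every word of the phrase); the `keyword_density` function name is my own, and real SEO tools may tokenize text differently:

```python
def keyword_density(text, phrase):
    """Percentage of the page's words taken up by occurrences of the
    keyword phrase. A 500-word page using a one-word keyword 10 times
    scores 2%, matching the example in the article."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

# A hypothetical 500-word page using "real estate" 10 times (20 of 500 words):
page = ("real estate " * 10 + "filler " * 480).strip()
print(round(keyword_density(page, "real estate"), 1))  # 4.0
```

A result above the 3%–6% comfort zone suggested earlier would be a signal to thin the phrase out, not a guarantee of a penalty.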
In Real Estate – It’s Location, Location, Location – In SEO – It’s Content, Content, Content
In search engine optimization, content CAN determine your location in the search engine results pages. Web sites that have a high quantity of quality content, naturally appeal to internet users. But, it is not the quality of the content itself that contributes to rankings – it’s the keywords in the content.
Most internet users are searching for informative articles regarding the subject matter that they are searching for. So, it is only natural that if a web site contains information that the user finds authoritative and informative, they will be encouraged to have confidence in the views, opinions, and suggestions that may be implied by the content. However, if a web site has very little content, or poor-quality content that does not instill a level of confidence in the consumer, they are most likely going to return to the SERPs to search for information that they can have confidence in.
Many people that are involved in the creation of web content have a misguided belief that if the content on the site is good, the site will rank well based solely on the quality of the content. This is simply not true. Though good content is paramount to a search engine optimization campaign, the quality of the content has little if any significance in search engine rankings. Again, the quality of the content is not what contributes to rankings – it’s the keywords in the content.
So, you may be wondering, “If content is king, why doesn’t that equate to rankings?” The answer is quite simple – the search engine algorithm.
The search engines give relevance to content on a web site based on the factors in their algorithm, not the quality of the content. So to achieve rankings, it is necessary to understand what the search engine considers as relevant. Not what you or I consider as relevant.
To answer the question that’s on the fringes of your subconscious … it’s the keywords in the content.
How to Put it All Together – Web Pages for the Search Engines
In order to achieve search engine rankings for your web site, there are three things that are necessary.
First, you have to have keywords that internet users are actually searching for, and for which there are not too many competing pages in the search engine index.
Second, you need to have good keyword-rich content that is created following your own keyword / phrase density levels.
The third part of the process is creating the content in a manner that will produce effective results. One of the first matters at hand is the amount of text that should be used. I recommend keeping each page shorter and adding more pages, because search engine robots have been noted as only crawling to a depth of around the first 100 – 150 words.
Personally, I write my pages anywhere between 400 and 600 words in length and sometimes a bit longer. But the basic logic here is this: The more pages that you have, the more crawlable content you have. So, the answer is to simply create more small-sized pages. Furthermore, robots seem to crawl and index lightweight pages better. So, it might be wise to keep the page size rather low by limiting the amount of words, and especially the amount of images that you use on a web page.
Note, however, that there really isn’t an ideal page size because different search engines index at varying depths and page weights. For instance, Yahoo! doesn’t typically index pages that are over 200KB. Google, on the other hand, seems to find pages that are between 500 and 550 KB “preferable” with regard to page weight.
When creating the content there are some very important guidelines that need to be considered as well. Your keywords should be placed at the top of the page, since the prime real estate for the robots is the first one hundred words or so. I generally like to follow a self-guided rule of writing the keyword or keyword phrase within the first two opening sentences.
After that, I follow a formatting technique which tends to elicit excellent results for me, which is:
- Intro paragraph
- Call-to-Action
- Bullet list
This simple format has several powerful elements that contribute to optimization. The key to using this format successfully is the positioning of the first three elements: the intro paragraph, the call-to-action, and the bullet list. The use of this combination supports:
- The creation of keyword or keyword phrases in the first few sentences of the opening paragraph.
- The ability to create keyword or keyword phrase based links in the Call-to-Action.
- Many more keyword linking opportunities and supporting topics in the bullet list.
Web pages that are created with this format and use relevant search terms in the bullet list have a tendency to rank for the content in the bullet list as keyword stems on the main keyword.
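The intro / call-to-action / bullet-list format above can be sketched as a small page-building helper. This is only an illustration of the layout the article describes; the `build_page` function, its parameters, and the example phrases are my own hypothetical names, not an established tool:

```python
def build_page(intro_sentences, cta_text, cta_url, bullets):
    """Assemble a page body in the intro / call-to-action / bullet-list
    order, so the keyword phrase (placed in the intro sentences) sits
    in the first hundred words where robots crawl first."""
    intro = " ".join(intro_sentences)
    items = "\n".join(f"  <li>{item}</li>" for item in bullets)
    return (
        f"<p>{intro}</p>\n"
        f'<p><a href="{cta_url}">{cta_text}</a></p>\n'
        f"<ul>\n{items}\n</ul>"
    )

html = build_page(
    ["Looking for South Florida real estate?",
     "Our South Florida real estate brokers can help."],
    "Contact a South Florida real estate broker",
    "/south-florida-real-estate-brokers",
    ["South Florida homes for sale", "South Florida condos"],
)
print(html)
```

Note how the keyword phrase appears in the opening sentences, again in the call-to-action link text (pointing at a matching URL), and in the supporting bullet items.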
Oh Boy! Let’s work on In-Page Optimization
So now we come to the point where we start to look at the implications of actual HTML – web page creation. There are several key areas where it is important to proceed carefully when creating an optimization-based web site. The first matter to deal with is how the Meta tags are used.
There are three main Meta tags that are relevant to the creation of optimized web pages.
The first, and without a doubt the most important, is the Meta Title tag. The Title tag should contain the keyword or keyword phrase and any relevant stems, provided that the entire title stays within an acceptable character count. As with many areas of SEO / SEM, there are many opinions on how many characters should or should not be included in the Title tag. I normally stay between 50 and 60 characters when I am performing optimization on a site for a client. However, I will sometimes push the envelope if I am working on a site for which I am the only liable party. In short, if you use too many characters / words, the site may be penalized for “spamming” the search engines.
The next tag which I believe to be equally as important in an antecedent and consequent sort of way, is the Meta description tag. I feel that this tag is highly underrated and requires some consideration. The Meta Description tag works very much as a support to the Meta title tag and if it is created properly, can offer several opportunities to add keyword relevance and / or stems that may promote a wider marketing saturation for the keyword or keyword phrase.
For instance, let’s take a look at this example of the Meta Title tag and the Meta Description tag “in agreement” with each other:
<title>Meta Tag Optimization Services - Campaign - SEO Specialist - Consultant</title>
<meta name="description" content="Meta tag optimization services - To achieve high search engine ranking it is important to understand there are many factors which contribute to the performance of an SEO campaign.">
At first look, it seems that this is nothing more than a simple title and description. Well, it should because it is really that simple. The Title has the keyword phrase and relevant stems trailing after it to promote a wider saturation for the main topic of the keyword phrase. Though in this example the character count is a little higher than I would recommend, the Title contains strong keyword phrase support and the description reaffirms the keywords and stems so that the phrase is stated twice and the stems are stated twice. This is a simple example of how the description tag offers a lot of opportunity to increase saturation for relevant search terms.
The next Meta tag is the center of much controversy and that is the Meta Keywords tag. The rule for this tag is quite simple – and should be closely adhered to. Only use between 10 and 12 words at the most, and the keywords that are used in the tag should be on the page that you are placing the tag on.
Lastly, but equally as important, do not use keyword stems like “keyword, keyword stuffed, keyword stuffed Meta tag”. The search engine will consider this spamming and the web site could suffer quality deductions for improper usage.
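The rules above (a Title of roughly 50–60 characters, a Keywords tag of 10–12 words at most, and no repeated stuffed words) are mechanical enough to check automatically. Here is a minimal sketch; the `check_meta` function and its messages are my own hypothetical helper, and the thresholds are simply the ones suggested in this article:

```python
def check_meta(title, keywords):
    """Flag the common Meta-tag problems described above: a Title
    outside the 50-60 character comfort zone, a Keywords tag with more
    than 12 words, or repeated (stuffed) words in the Keywords tag."""
    problems = []
    if not 50 <= len(title) <= 60:
        problems.append(f"title is {len(title)} characters (aim for 50-60)")
    kw_words = [w.lower() for w in keywords.replace(",", " ").split()]
    if len(kw_words) > 12:
        problems.append(f"keywords tag has {len(kw_words)} words (keep to 10-12)")
    repeats = sorted({w for w in kw_words if kw_words.count(w) > 1})
    if repeats:
        problems.append("repeated words look like stuffing: " + ", ".join(repeats))
    return problems
```

An empty result means the tags pass these particular checks, nothing more; the content on the page still has to back the keywords up.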
It Might Look Great – But What If Your Customers Can’t Find It?
Though it may be much more complicated to build a web site based on the requirements of the search engine spiders, the results can be the difference between being on the first of the search engine results pages and being so far down in the results that the site is rarely, if ever, visited for the search query which the page is (presumably) attempting to rank for.
Links – The Holy Grail of SEO
One of the main factors that a search engine relies heavily upon is the density of internal and external links. The largest major search engine, Google, gauges the “position” of a web site to a large degree based on the quality and quantity of links for the site.
Internal links should be created with keyword relevance in mind. Link text and internal links are considered to be of major importance when weighing a web site’s relevance for a keyword or keyword phrase. The basic idea here is that a keyword in a link should point to a destination URL that is relevant to the keyword or keyword phrase.
If the keyword phrase is “web developer”, the link should be pointing to a URL that has the keyword phrase “web developer” in it. If the link has “web developer” in it and is pointing to a URL that has “contact form”, your keyword strength is “draining” and the value of that keyword is thereby reduced. I feel that an analogy using electrical current explains this type of scenario quite well. If the continuity is not consistent from end to end the power is either lost, or drained out of the connection.
Another important matter regarding internal linking is the amount of links on a page. My advice is to keep links to no more than one hundred per page. If a web page contains more than one hundred links, the search engines may consider the site to be part of a link-farm or a link-exchange which may result in obtaining negative consequences for having a “bad character”, and the site may receive penalties for being in a “bad neighborhood.”
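Both internal-linking guidelines above (anchor text that matches the destination URL, and no more than about one hundred links on a page) can be audited with a short script. This is a rough sketch: the `LinkAudit` and `audit` names are my own, and the "anchor text appears in the URL" test is a crude stand-in for real keyword relevance:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect each anchor's href and link text from a page."""
    def __init__(self):
        super().__init__()
        self.links = []          # list of (href, text) pairs
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def audit(html):
    """Return (over the ~100-link limit?, links whose anchor text
    does not appear in the destination URL)."""
    parser = LinkAudit()
    parser.feed(html)
    too_many = len(parser.links) > 100
    mismatched = [
        (href, text) for href, text in parser.links
        if text and text.lower().replace(" ", "-") not in href.lower()
    ]
    return too_many, mismatched
```

Run against the "web developer" example from above, a link pointing at a contact-form URL would be flagged as draining keyword strength, while a link pointing at a matching URL would pass.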
A large part of search engine optimization is based on the quality of inbound links to a web site. The relevance of the links that point to a site, and the PageRank of the site that is pointing to that site, have a great deal of importance regarding the “class” of the site that is receiving the inbound link.
For simplicity, Google has categorized web sites into “classes”, much like the model of society or Object-Oriented Programming, where each web site (Object or Person) holds a certain degree or position based on its attributes and characteristics.
This somewhat describes how Google considers the character of a site. Quite simply, web site links are much like neighborhoods. If you have neighbors (inbound links) that hold a position of status (respectable neighbors), then Google gives your web site credit as being of status based on the company you keep. (In a good neighborhood)
If however a site has low ranking inbound links that are of little or no relevance to a web site then Google considers the links as being of a poor quality and does not give credit to the site for those links.
That Should Get You … Started
Though this is nowhere near everything that should go into a comprehensive search engine optimization campaign, you should be able to get a good start from here and build up enough information to get you on your way towards ranking in the SERPs.
For a really great resource of “FREE” stuff on SEO, check out the resources in Google’s Webmaster Tools … there is a wealth of information in there that should really help you get up to speed with search engine optimization. If that doesn’t have what you’re looking for … just Google it!