
Search Engines and Search Engine Optimisation – Claim Your Own Share of Internet Business


What Is A Search Engine?

A search engine is a computer program that searches documents for specified keywords and returns a list of the documents in which those keywords matched the search criteria. Although "search engine" is really a general class of program, the term is most often used to describe systems such as Google, AltaVista and Excite that enable users to search for information on the World Wide Web.

In general, a search engine works by sending out a spider to fetch and return as many documents as possible. Another program, called an indexer, then reads these documents and creates an index based on the words contained in each document. Each search provider uses a unique proprietary algorithm to create its indices such that, in an ideal world, only meaningful results are returned for each query presented to the search engine.
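As a rough illustration of the indexer's role (a toy sketch only, far simpler than any production engine), the core idea is an inverted index that maps each word to the documents containing it:

```python
# Toy inverted index: maps each word to the set of document IDs that contain it.
# Real indexers add stemming, positional data and ranking signals, but the
# fundamental structure is the same.
def build_index(documents):
    index = {}
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

def search(index, keyword):
    """Return the sorted list of document IDs containing the keyword."""
    return sorted(index.get(keyword.lower(), set()))

docs = {
    1: "search engines index web pages",
    2: "optimise pages for search results",
    3: "web directories are human edited",
}
index = build_index(docs)
print(search(index, "pages"))  # [1, 2]
```

A query is then just a dictionary lookup, which is why a well-built index can answer searches over billions of documents quickly.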

What Is SEO?

SEO is the process of improving web pages so that they rank well in search engine results for specific keywords. There are two categories of SEO: on-page optimisation and off-page optimisation. On-page optimisation refers to all the methods and techniques used to change a web page's source code, while off-page optimisation focuses on external factors, such as the methods used to attract backlinks to the website. At the core of SEO are the keyword phrases for which a high ranking in search engine results pages (SERPs) is sought.

How Do Search Engines Work?

What we refer to today as a search engine is really a more complex search portal. A search portal, or gateway, is a World Wide Web site that is, or proposes to be, a major starting point for users when they connect to the Web and need to find information. There are portals of a generalised nature as well as specialised or niche portals. Some major general portals include Google, Yahoo, Excite, Netscape, Lycos, CNET and Microsoft Network. Here, I will examine the typical composition of a search engine results page (SERP). Every search portal delivers results from different data sources, and the different ways in which each portal prepares and serves this information to users is what gives each portal its own unique identity.

The SEO strategy you adopt for your site can be affected by changes to the way a major search portal presents its search results.

A typical search engine results page comprises three major components: spider-based (organic) listings, sponsored listings, and directory listings. Not all SERPs contain all three components, and some portals incorporate additional data sources depending on the search term used.

1. Spider-based (Organic) Listings

Spider-based results are featured as the primary element of SERPs by most search portals. These listings are also referred to as editorial, free, natural, or organic. For consistency, I will refer to spider-based listings as organic listings throughout the rest of this article.

Each spider-based engine uses its own unique algorithm, or formula, to determine the order of search results. The databases that power organic search results primarily contain pages that are found by web-crawling spiders. Some search providers offer paid inclusion and trusted feed programs that guarantee the inclusion of certain pages in the database.

Organic search listings are the primary focus for search engine marketers and consultants, but they are not the only focus. The continued use of pay-per-click is essential to a well-rounded strategy. It is important to know that most of today's search portals do not operate their own spider-based search engine; instead, they acquire results from one of the major search players such as Google and Yahoo. There are many search portals in the market, but only a handful operate crawling search engines. The major search portals that operate their own spiders are Google, Yahoo, Ask and MSN.

2. Sponsored (Pay-per-click) Listings

It costs a lot of money to run a search portal. Spider-based engines operate at extremely high expense, an expense that most portals cannot afford. Portals that do not operate their own spider-based search engines must pay to obtain spider search results from portals that do. Today, even the smallest search portal can generate revenue through sponsored listings. A sponsored listing is simply one where a company has paid to have its site prominently listed when someone searches for certain words or phrases. Meta search engines primarily use sponsored listings as their main search results.

Sponsored listings, in addition to helping search portals stay afloat, provide an excellent complement to organic search results by connecting searchers with advertisers whose sites might not otherwise appear in the search results.

The majority of portals do not operate their own pay-per-click advertising service. Instead, they show sponsored results from one or more partners and earn a percentage of those advertisers' fees. The major PPC providers are Google AdWords, Yahoo Search Marketing and Microsoft adCenter.

The PPC advertising model is simple. Advertisers place bids against specific search terms, and their ads are returned alongside the results for those terms. Each time a searcher clicks on one of these ads, the advertiser is charged the per-click amount it bid for that term.

Different PPC providers use different methods to rank their sponsored listings. These methods all start with advertisers bidding against each other to have their ads appear alongside the results returned for various search terms, but each provider also offers its own broad-matching options to allow a single bid to cover multiple search terms.

PPC ranking systems are no longer as simple as allocating the highest positions to the highest bidders. Google's methodology combines the click-through rate of an advertiser's listing (that is, the number of clicks divided by the number of times the sponsored listing is displayed) with the advertiser's bid when deciding where the PPC advertisement will be placed. One of the reasons Google has gained popularity so significantly is that this method tends to optimise the revenue generated per search.
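The ranking idea described above can be sketched in a few lines (a deliberately simplified illustration; Google's real Quality Score uses many more signals than raw click-through rate):

```python
# Simplified PPC ranking: order ads by bid * CTR (expected revenue per
# impression) rather than by bid alone. Real systems such as Google's
# Quality Score incorporate many additional relevance signals.
def rank_ads(ads):
    """ads: list of (advertiser, bid_per_click, click_through_rate) tuples."""
    return sorted(ads, key=lambda ad: ad[1] * ad[2], reverse=True)

ads = [
    ("A", 1.00, 0.02),  # expected value per impression: 0.020
    ("B", 0.50, 0.06),  # 0.030 -- lower bid, but its higher CTR wins
    ("C", 2.00, 0.01),  # 0.020
]
for advertiser, bid, ctr in rank_ads(ads):
    print(advertiser, round(bid * ctr, 3))
```

Note how advertiser B outranks the higher bidders: a well-clicked ad earns the portal more per search than a high bid that nobody clicks, which is exactly why this scheme optimises revenue per search.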

3. Directory Listings

Directory listings originate from human-edited web directories, such as LookSmart, the Open Directory, and the Yahoo Directory. Most search portals offer directory results as an optional search, thereby requiring the user to click to see them. Search portals have lately reduced their emphasis on directory listings as the quality of organic search results has improved.

However, directory listings remain important for obtaining organic search engine rankings.

Search Engine Optimisation

It is vital to note that there are two steps involved in getting a web page into search results pages:

1. Getting into the search index.
2. Getting the web page to the top of the final indexed results before they are presented to the searcher.

The first step is relatively easy to accomplish. You need to let the spiders know that your new web pages exist and how to find them. This can easily be achieved by pointing to a new page from an already indexed page. Some search providers also offer an option for a new URL to be submitted manually for inclusion in their index.
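One common way to tell spiders about new pages, alongside linking from an already indexed page, is an XML sitemap following the sitemaps.org protocol. A minimal generator might look like this (the URLs are placeholders for your own pages):

```python
# Minimal XML sitemap generator per the sitemaps.org protocol.
# The example URLs are placeholders; list your own new pages here.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s\n</urlset>" % entries
    )

pages = ["http://www.example.com/", "http://www.example.com/new-page.html"]
print(build_sitemap(pages))
```

Saving the output as sitemap.xml at the site root gives crawlers a single, machine-readable list of every page you want indexed.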

Achieving step 2 successfully is the difficult part, and the bulk of search engine optimisation work is centred on it. Search engines spend an incredible amount of time and effort making their algorithms find the best ways to rank sites effectively. According to the search engine leader Google, there are in excess of 200 factors that determine the rank of a web page in the SERPs (search engine results pages).

SEO is the process of making changes to your web pages so that they conform to search engines' standards and rank at the top of search engine results for the keywords for which your site is optimised.

What Do Search Engines Look For?

Before delving into the specifics of SEO, let me first explain what search engines really want.

Search engines aim to fulfil the search needs of their users by providing them with highly targeted, relevant information. Google makes most of its money from selling advertisements that are served alongside regular search results. Google is able to do this because it is the market leader and the most popular search engine in the world, and it is widely used because it provides users with relevant results for their searches.

What Search Engines Don’t Like

It is extremely important to have a sound knowledge of what search engines don't want in order to succeed in your SEO campaign. Without this knowledge, your perfectly optimised site may not be indexed and could even be banned. Therefore, when you learn about the factors that influence search engine rankings, you should also learn which tactics to avoid.


Search providers hate tactics intended to fool them into awarding high rankings to irrelevant pages. These tactics are called “spam.” They strive to provide the most relevant results to their users, but spam clutters their indices with irrelevant information.

Some website owners create spam once they have learned which criteria spiders use to rank pages. For example, spiders once gave high scores to pages filled with keywords, a practice termed keyword stuffing. Website owners and webmasters came up with a way to add more keywords without sacrificing a site's appearance: invisible text (the background and the text are the same colour, so the text is not seen by visitors). Previously, robots that indexed invisible keywords ranked those sites higher for keyword frequency and weight.

Spiders are now familiar with this technique and class it as spam. The use of invisible text can get your site banned from most leading engines.
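As a crude illustration of how the simplest form of this trick can be caught, one can compare an element's declared text colour with its background colour. This is a toy check only; real spiders analyse rendered CSS, near-identical colours, tiny fonts and off-screen positioning as well:

```python
# Naive invisible-text check: flag inline styles whose text colour exactly
# matches the background colour. Real engines use far more sophisticated
# analysis (external stylesheets, near-matches, rendered layout, etc.).
import re

def has_invisible_text(html):
    pattern = re.compile(
        r'color\s*:\s*(#?\w+)\s*;\s*background(?:-color)?\s*:\s*(#?\w+)',
        re.IGNORECASE,
    )
    return any(fg.lower() == bg.lower() for fg, bg in pattern.findall(html))

spammy = '<p style="color: #ffffff; background-color: #ffffff">keywords keywords</p>'
clean = '<p style="color: #000000; background-color: #ffffff">hello</p>'
print(has_invisible_text(spammy))  # True
print(has_invisible_text(clean))   # False
```

Even this trivial comparison shows why white-on-white keyword blocks stopped working years ago: the signal is machine-detectable.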

The following techniques are usually considered spam by major search engines:

Meta refresh tags
Invisible text and overuse of tiny text
Irrelevant keywords in the title and meta tags
Excessive repetition of keywords (classed as keyword stuffing)
Overuse of mirror sites (identical sites published at different URLs)
Submitting too many pages in one day
Identical or nearly identical pages
Submitting to an inappropriate category (for directories)
Link farms
Frames, dynamic content and Flash intros.
Although search engines won't penalise you for the use of frames, dynamic content and multimedia files, they will have difficulty indexing such pages.

Recently, some engines started to index dynamic content. However, most search engines are still unable to index multimedia and dynamic pages, and those that are, don’t index all of them. Here’s a list of files that search engines don’t index:

Text in graphics (use ALT tags)
Pages that require registration, cookies or passwords
Java applets
Acrobat files (PDF), except Google
Dynamic content (URLs with “?” in them), except Google, AltaVista, FAST and Inktomi
Multimedia files (Flash, Shockwave, streaming video)
Build Workaround Pages to Avoid Indexing Problems

If your site consists largely of files that search engines don't index, create workaround pages. Workaround pages should contain the most important information about your products or services, and should be optimised just like doorway pages.

Search engines will index workaround pages even if they can’t index the rest of a site. Always link to these pages from your site map to make sure search engines can spider them, and submit workaround pages rather than non-optimized pages.


SEO Best Practices to Consider

There are tried and tested SEO tactics that you can use to ensure your site is successfully crawled, indexed, and positioned in the major search engine results pages. Though SEO tactics evolve with the technologies used to render Web sites, certain fundamentals remain constant.

How can you tell if your site adheres to current SEO best practices? You should start by providing honest answers to the following questions.

Are the keywords you’re targeting relevant to site content?

Are targeted keywords popular phrases used in search engine queries?

Do page titles start with your targeted keywords?

Does your site employ H1 header tags for prominent content titles?

Is your permanent body copy contextually sufficient and keyword-rich?

Do text links include targeted keywords that point users to pages within your site?

Do you use CSS image replacement in graphical navigation on the site?

Do graphics used in the site have descriptive, keyword-rich alternative attributes that are useful for visitors to your site?

Does your Web site have a site map with text links?

Do the URLs of your dynamic, database-driven pages look simple and static?

Does your site have a flat and simple directory structure?

Do your site's home page and other key category pages have PageRank?

Is your site listed in Open Directory?

Do you routinely list your site in other trusted, human-reviewed online directories?

Do all pages in your Web site have keyword-rich meta tag descriptions?

Does your site have a custom error page?

Do the site's file names and directory names include targeted keywords?

Does your site avoid using pop-ups?

Is the exact same content visible to both users and search engine spiders?

Do you avoid free-for-all linking offers?

Taking the time to implement the 20 fundamental elements of SEO best practice above (or at least the bulk of them), while avoiding the worst SEO practices, should provide you with a sound approach to better visibility for your Web site in the major search engines.
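A few of the on-page checks in the list above (a keyword-leading title, a present H1, alt text on every image) can even be automated. The following is a rough sketch using only Python's standard library, not a full audit tool, and the sample page and keyword are invented for illustration:

```python
# Rough on-page SEO audit covering three of the checklist items above:
# keyword-leading <title>, presence of an <h1>, and alt text on every image.
from html.parser import HTMLParser

class SEOAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.in_title = False
        self.has_h1 = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "h1":
            self.has_h1 = True
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html, keyword):
    a = SEOAudit()
    a.feed(html)
    return {
        "title_starts_with_keyword": a.title.lower().startswith(keyword.lower()),
        "has_h1": a.has_h1,
        "images_missing_alt": a.images_missing_alt,
    }

page = """<html><head><title>Web Hosting - Example Ltd</title></head>
<body><h1>Web Hosting</h1><img src="logo.gif" alt="Example Ltd logo">
<img src="banner.gif"></body></html>"""
print(audit(page, "web hosting"))
```

Run against the sample page, this reports that the title leads with the keyword and an H1 is present, but one image lacks alt text, exactly the kind of gap the checklist is meant to surface.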

Dr. Jatau, D.K. is Technical Director/Solutions Architect at Webxcell Media & Hosting Group, an award-winning web hosting and web development company located in Nottingham, Nottinghamshire, England. To claim your own share of internet business with our search engine visibility product, visit us at: or call us direct on: 44 (0) 8450 52 35 44.

Copyright © 2010 by Dr. Dan K Jatau and Webxcell Hosting
