Saturday, May 17, 2008

How to Improve Search Engine Placement

Are you intent on achieving higher placement, rankings and readership for your website? If so, here are a few proven SEO fundamentals for earning optimal search engine placement.

The strategies covered in this article are excellent for improving search engine positioning as well as promoting recognition of your content, which develops trust rank and authority over time. Examples of this marketing tactic can be found everywhere from Fortune 500 companies, non-profit organizations and social media networks to small businesses and individual blogs. Regardless of market or niche, the result is consistent top 10 placement and VIP status in search engines.

So, what is this closely guarded SEO secret that supersedes page rank? It is Website Trust & Authority, so let's take a look at how to earn it for your website and every subsequent page published from your URL.

How to Create Trust Rank and Website Authority to Improve Search Engine Placement in Your Niche

One well-known secret amongst industry experts is developing relevant content around exact match broad search terms and surrounding it with contextual (more specific) links and useful resources. If you build a website on this strategy (as Wikipedia has with its 1.9 million articles in the US), search engine algorithms come to treat your website as an authority, since you are creating relevance for the various communities that comprise the web. The reward for such a contribution is hub status, otherwise known as pure search engine visibility. This is because the prime directive of search engines is to provide users with the most relevant results for their query, giving them what they want when they want it (without the clutter).

Websites are scrutinized against a wide array of criteria by search engine algorithms, whose primary method of assessing your site is sending spiders to collect snapshots of your content. It is the spider (otherwise known as a user agent, or web bot) that gathers data from all crawlable pages to determine whether a website provides valuable content or fails to deliver relevance beyond its own self-serving motives (we are the best, biggest, leading, professional, keyword, keyword, in our city, state, planet, etc.).

Sites containing such generic terms rarely appeal to Googlebot, Slurp or the other user agents that scour the web looking for truly UNIQUE content to assign authority and elevate in the algorithm. If you give the spiders what they want, they will return more frequently to glean your pages, assign rankings to your new pages and revive older pages you may have long since forgotten. Page rank is assigned page by page, so to increase your potential return on investment, each page of your website can target high-volume keywords with laser-like precision if optimized properly.

One example of website authority is when two web pages from the same website are returned for a single search query. This is a by-product of authority: multiple pages on your site that can rank independently equal a higher probability of conversion. This type of search engine placement (multiple listings) occurs when one or more of your keywords are common denominators on several pages.

It is important to vary the context of your keywords and not to group them too closely together in your body copy. There are two reasons for this: (1) you could set off spam filters in the algorithm and hurt your rankings through penalties, and (2) keywords grouped too closely are treated as a single phrase and only rank well if someone searches for them exactly as they appear in the group. A keyword density of 3-5% of the total number of words on your page is sufficient: roughly one keyword per sentence, and no more than three per paragraph. Writing in this fashion will keep you well within the appropriate keyword density for achieving higher rankings and score points with the humans who read your posts.
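To put that arithmetic in concrete terms, here is a minimal sketch in Python (the keyword phrase and body copy are made-up examples) of how you might measure keyword density for a page before publishing it:

```python
import re

def keyword_density(body_copy, keyword):
    """Approximate keyword density: words belonging to the keyword phrase
    as a percentage of all words on the page."""
    words = re.findall(r"[a-z0-9']+", body_copy.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    hits = sum(1 for i in range(len(words) - len(phrase) + 1)
               if words[i:i + len(phrase)] == phrase)
    return 100.0 * hits * len(phrase) / len(words)

# Hypothetical body copy that repeats its keyword far too often.
copy = ("Our Chicago bakery ships fresh bread daily. Order fresh bread online, "
        "or visit the bakery for fresh bread straight from the oven.")
print(f"{keyword_density(copy, 'fresh bread'):.1f}%")  # well above the 3-5% guideline
```

Anything far above the 3-5% range is a sign the copy reads like keyword stuffing and should be thinned out before it goes live.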

When developing new content for your website, try surrounding your keywords with adjectives or qualifiers that augment their contextual impact (affordable, professional, best, etc.). This also produces excellent long-tail search results (unintentional optimization for your pages) when prospects use longer queries to find related content. Optimize EACH PAGE on your site for specific keywords in order to broaden your net of catch phrases across the web.
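As a quick illustration (not a prescribed tool), the sketch below combines a few hypothetical core keywords, qualifiers and locations into long-tail phrases that can then be worked naturally into titles, descriptions and body copy:

```python
from itertools import product

# Hypothetical terms; substitute keywords and qualifiers from your own niche.
qualifiers = ["affordable", "professional", "best"]
core_keywords = ["wedding photographer", "engagement photos"]
locations = ["Chicago", "Evanston"]

# Combine qualifiers, keywords and locations into long-tail phrases.
long_tail = [f"{q} {kw} in {loc}"
             for q, kw, loc in product(qualifiers, core_keywords, locations)]

for phrase in long_tail:
    print(phrase)
```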

Elements known to create website authority and trust rank in search engines are:

1. Relevant Content
2. Consistent Updates to Your Pages
3. Deep Linking from Authoritative Sources

Relevant Content

Relevant content can be classified by factors such as its usefulness, the vocabulary used to convey the message or conclusion, and the resources it references. In addition, content that is easy to read and copy that compels the reader to take action or piques their interest will typically garner a larger audience in the blog or social networking community, which translates into the ability to monetize your site with sponsored advertising programs, banners, etc.

When creating a new page, ask yourself: does the content truly leave the viewer with a sense of satisfaction? If so, you have fulfilled their reason for visiting in the first place and will strike a chord with them. The result is trust, which is favorable for sales conversion.

Write every page this way and you can determine who reads your content based on the optimization you use, with your titles and descriptions acting as signal flares that help search engines identify the context of your pages. The real secret here is in tweaking those pages, which leads into the next point.

Consistent Updates to your Pages

Refresh your content regularly to improve search engine rankings. Websites that refresh their content regularly are far more likely to acquire hub or authority status.

A website can gain authority by exhibiting patterns that distinguish it from other sites, such as:

(a) a steady flow or increase in traffic
(b) increased link popularity from other sites providing valuable one-way links (Wikipedia is an example of this)
(c) a healthy ratio of inbound to outbound links to or from other related sites (a rough way to gauge this ratio is sketched after this list).
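To make point (c) a little more tangible, here is a rough Python sketch, assuming you already have an inbound-link count from whatever backlink report you use; it counts the outbound links in a page's HTML so you can eyeball the ratio. The page HTML and domain are placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCounter(HTMLParser):
    """Count anchor tags that point away from your own domain (a rough check)."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if host and self.own_domain not in host:
            self.outbound += 1

# Hypothetical page HTML and an inbound-link count taken from a backlink report.
page_html = ('<p><a href="http://example.com/about">internal</a> '
             '<a href="http://othersite.com/resource">external</a></p>')
counter = OutboundLinkCounter("example.com")
counter.feed(page_html)
inbound_links = 40  # assumed figure from a backlink report
print(f"Inbound to outbound links: {inbound_links}:{counter.outbound}")
```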

This is why blogs, for example, organically rank higher in search engines than most pages: by their very nature they are aligned with the principles that search engine algorithms reward. If you are not running a blog on your own server within your site as a subdirectory or subdomain, I would highly suggest it, due to the long-term benefit of keeping all of your pages crawled regularly by search engines. This also works well in tandem with making tweaks and adjustments to your keywords, titles and descriptions, which invariably have their impact on improving rankings.

The description tag, the bin and the ability to rank again and again for multiple keywords on each page:

One SEO secret for long-tail search engine optimization is to use tactfully written "exact match terms" for your main keywords as much as possible in the description tag of each page. One other point if you are optimizing your pages specifically for Google: Google keeps a memory of every revision you have ever made to your site, so every page title and every description you ever wrote is stored in a profile (known as the bin) for your site. When someone types a query, Google checks the bin for an exact or partial match and then returns the most appropriate response from any number of websites and the respective pages within those sites (you can see this in the descriptions of the search results, which change based upon your query). So, in essence, the more you update your pages strategically, the more frequently Google returns and catalogs your content, and the more keywords you can amass for each page on your website.
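As a sanity check on the description tag itself, here is a small sketch (the page head and target phrases are invented for the example) that pulls the meta description out of a page and confirms your exact match terms actually appear in it:

```python
from html.parser import HTMLParser

class MetaDescriptionExtractor(HTMLParser):
    """Grab the content of the <meta name="description"> tag from a page."""
    def __init__(self):
        super().__init__()
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = {name: (value or "") for name, value in attrs}
        if attr_map.get("name", "").lower() == "description":
            self.description = attr_map.get("content", "")

# Hypothetical page head and the exact match terms the page targets.
page_head = ('<head><title>Affordable Wedding Photographer in Chicago</title>'
             '<meta name="description" content="Affordable wedding photographer '
             'in Chicago offering engagement photos and full-day coverage."></head>')
targets = ["affordable wedding photographer", "engagement photos"]

parser = MetaDescriptionExtractor()
parser.feed(page_head)
for term in targets:
    found = term.lower() in parser.description.lower()
    print(f"{term!r}: {'present' if found else 'MISSING'} in the description tag")
```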

This is a true gem of knowledge when you realize that, even though you may have changed the page, the original version and every version since are still in the bin. Imagine each page on your site being able to rank for terms that may or may not still exist on those pages but remain indexed in this digital ghost of your pages. It means that over time, through strategic planning, you can build each page up for specific keywords and the rankings associated with those keywords.

This is why you always hear experts in the SEO industry telling you to refresh your content. By tweaking your website, i.e. re-optimizing your pages (recycling your pages, tightening up the content or changing titles, descriptions and keyword variations), you compound the ranking potential of each page on your site. Using the bin / memory theory, you can continually optimize your pages for broader or more specific search terms. You can still rank for keywords, sentences or paragraphs that existed on your pages months ago while also returning search results for the current, re-optimized keywords that may be more in tune with the times.

Deep Linking from Authority Sites

Last but not least, one of the final ingredients in improving search engine rankings and gaining website authority is links. The more relevant the link the better, but more important still are links from websites with trust rank, from unique C class IP addresses (the first 8000 URLs created), or from social networking and bookmarking sites such as Digg, Reddit, Del.icio.us and other web 2.0 properties. These can dramatically impact your own website authority, since websites with high page rank and thousands of inbound links can pass their VIP influence with search engines on to your site over time through one-way permanent links.

This is a legitimate method which requires patience, persistence and perseverance, but any noteworthy accomplishment can be measured by the fruit of its labor. One thing to consider is not loading up your home page with links from other sites, as this appears unnatural. If you are using directories, blogs, social networking sites or article marketing to expand your search engine visibility, make sure, where applicable, that you spread the links evenly across subdirectories or less popular pages on your site.
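A quick way to check how evenly your links are spread, assuming you can export the target URLs from a backlink report, is sketched below; the URLs shown are placeholders:

```python
from urllib.parse import urlparse

def link_distribution(backlink_targets):
    """Tally how many inbound links point at the home page versus deeper pages."""
    home = deep = 0
    for url in backlink_targets:
        path = urlparse(url).path
        if path in ("", "/"):
            home += 1
        else:
            deep += 1
    return home, deep

# Placeholder backlink targets exported from a backlink report.
targets = [
    "http://example.com/",
    "http://example.com/blog/improve-search-engine-placement",
    "http://example.com/services/seo-audit",
    "http://example.com/",
]
home, deep = link_distribution(targets)
print(f"Home page links: {home}, deep links: {deep}")
```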

The reason for spreading links this way will become apparent over time, as you see page rank zero pages from your website outrank page rank 4, 5 or 6 websites that may have strong on-page SEO factors but have neglected the off-page SEO link building needed to cement their rankings.

In summary:

1. Create value by providing on-topic, relevant content in relation to the title and description of the page.

2. Re-optimize content regularly, with emphasis on the description tag, using exact match terms for high volume keywords.

3. Add descriptive adjectives where possible, or terms that can augment long-tail search results (affordable, professional, company, services, etc.).

4. Create new pages with the optimization goal in mind, instead of trying to make a square peg fit a round hole. New content adds value (and additional page rank), and if you start with the goal in mind you can build fresh (deep) links and rank in no time, versus trying to re-hash content indexed from older pages that are associated with narrower or less popular terms.

So now that you have a better idea of what is required, get busy and start building website authority and trust rank for your website today.

Jeffrey Smith is a seasoned search engine optimization strategist and founder of Seo Design Solutions in Chicago. Jeffrey has been involved in internet marketing since 1995 and brings fresh optimization methods and solutions to businesses seeking to improve search engine placement.
