Sunday, February 4, 2007

eMarketing Sabotage - Top 10 Steps To Kill Your Search Engine Marketing Practices

We at America Web Works find ourselves amazed at the amount of effort people spend trying to fool or manipulate their positioning in search engines. People seem to focus on the shortcuts to success and NOT on their Web site or the true value their content provides to their prospects.

In the spirit of educating marketers about best practices, we present this list of ten things you can do to sabotage your search engine marketing project in a "New York" second.

1. Invisible (Ghost) Text

You have kept a good secret! Your visitors might not have noticed, but all search engine crawlers have been trained to be on the lookout for this obvious technique, last fashionable circa 1997. The search engines may very well purge all your pages from their index due to deceptive practices.

And, if you are feeling really frisky, you can make this technique even more effective if the invisible text has absolutely nothing to do with the content of the page it sits within.

2. Frames Usage

Search engines are not "frame-friendly". Once they encounter a pesky frame, they either stop flat in their tracks because the frame doesn't give them anywhere else to go, or they locate the pages beyond the frames and point people to that locale - which won't have the frames included with it.

There's truly no need to use frames, and no point in trying to justify them by telling yourself they will improve the prospect's experience.

If your prospects can't uncover your site or they find slices and slivers of you, how much then have you actually assisted them?

3. Why Be Fresh And Original?

Why try to be unique when it's just too hard anyway? It sounds foolish, but it happens quite often. If you find something of real interest on another site, burning a copy and slapping your links on top does not make you a unique force on the Net. And how many shopping sites selling the exact same discounted products does the average Web really need? In my book, the more sites you mirror, the less effective you will become.

4. Chubby Web Pages (Obesity Kills)

Sites loaded with graphics, animation, Flash and music pose many problems for the search engines. Not only can they confuse your prospects, who are looking for obvious information and links, but the search engines may not consider you very relevant because they cannot be sure what to make of your Web site.

If you have a site made up of nothing but heavy graphics and multimedia, not only will you give the search engines zero to index, you may also aggravate any prospect running on a slower connection. If nothing else, at least use ALT tags to describe your images for text browsers, the visually impaired and search engines.

5. Redirects, Redirects, And More Redirects

You may be using "redirects" within your Web pages to track clicks for advertising and to pull together information about your site visitors. Your Web pages may be indexed, but you may not rank well at all. The search engines may not be able to see the relationships that exist between your Web pages, because redirect code often blocks their path in a way that direct text links do not.

6. Lengthy URLs

Dynamic (ever-changing) e-commerce and shopping sites that use parameters and session IDs churn out these difficult URLs in abundance.

If your Web site has lengthy URLs sprinkled with question marks, percent signs, session IDs, and three or more parameters, you're degrading your hopes for search engine superiority.

Lengthy URLs do not look very attractive to people searching, and they expose the calls your site makes to its various databases.

Leading the search crawler directly into your database may well be a sure-fire way to send it spidering elsewhere.
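The warning signs above are easy to check mechanically. Here is a quick sketch in Python (my own illustration, not part of the original article) that flags URLs carrying session IDs or three-plus parameters; the session-ID names checked are common examples, not an exhaustive list.

```python
from urllib.parse import urlparse, parse_qs

def crawler_friendly(url, max_params=2):
    """Rough heuristic from the article: flag URLs that carry a
    session ID or three or more query parameters as hard to index."""
    params = parse_qs(urlparse(url).query)
    has_session = any(k.lower() in ("sessionid", "sid", "phpsessid") for k in params)
    return not has_session and len(params) <= max_params

print(crawler_friendly("http://example.com/products/red-widgets"))        # True
print(crawler_friendly("http://example.com/cart?sid=ab12&cat=3&item=9"))  # False
```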

7. Forgotten about your No Index Tag and Robots.txt?

Have you created a plan to keep all those nasty search bots out? Do you have a robots.txt file living on the root of your site? Does this file contain the following:

User-agent: *
Disallow: /

Or does your Web property have a robots Meta-tag like this one in its header?

<meta name="robots" content="noindex">

Be extra nice to your Webmaster. He or she may depart from your company in the future and leave this little monster behind for you to find at the end of a needlessly expensive investigation into why the search engines will not make nice with your Web site.

If you used the robots exclusion protocol to keep your beta site private, do not forget to remove those rules altogether when you go live.
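You can test what a leftover "keep everyone out" robots.txt actually tells a crawler with Python's standard-library parser; this small sketch feeds it the rules shown above and asks whether a Googlebot-style spider may fetch a page.

```python
import urllib.robotparser

# Feed the "ban all robots" rules into the standard parser and ask
# whether a crawler identifying as Googlebot may fetch a page.
rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # False
```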

8. Doorway Pages

Doorway pages (also known as jump pages or bridge pages), and anything else created specifically for a search engine rather than to deliver valuable content or products to your prospect, are not effective search marketing tools.

If you're not providing true content, the search engines will discover this and may penalize your entire site. If you've dug yourself into this hole, you'll probably need to start over with a new domain name.

9. Identical Meta-Tags And Titles

You agonized over making every single page of the Web property unique while developing it, but you didn't give much thought to tagging (or classifying) each page just as uniquely.

Imagine walking into your public library where every single book had the exact same title. What better way to tell a search bot to "take a hike" than showing them that all of your content is exactly the same. You will most likely see fewer of your Web pages indexed and much less traffic than you might otherwise.
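The "every book has the same title" problem is easy to detect across a site. This is a quick sketch of my own (the function name and sample pages are hypothetical) that surfaces titles shared by more than one page:

```python
from collections import Counter

def find_duplicate_titles(pages):
    """pages: dict of url -> title. Returns the set of titles shared
    by more than one page -- the library-of-identical-books problem."""
    counts = Counter(title.strip().lower() for title in pages.values())
    return {t for t, n in counts.items() if n > 1}

site = {
    "/index.html": "Acme Widgets",
    "/about.html": "Acme Widgets",
    "/shoes.html": "Children's Shoes | Acme Widgets",
}
print(find_duplicate_titles(site))  # {'acme widgets'}
```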

Here's a quick checklist to consider for your Meta-Tags and Titles:

* Do they deliver a "call to action"?

* Do they use relevant keywords and phrases?

* Is your "Title" less than 80 characters?

* Do they accurately describe what the page is about?

* Are these consistent with the page?

Free Meta-Tag Builder:
http://www.americawebworks.com/metatagplus/

(Be sure to bookmark that link!)

10. Linking Networks

Did you find a service that's offering to link thousands of other Web sites to you today? Taking part in these programs may effectively indicate to the search engines that you really do not want their valuable traffic. The quality of these link pages as well as their overall content "value" to a human visitor is very low.

Most search engines agree on this point and can penalize severely. Sites marked as link spammers may be briskly informed that they should find a new domain name and begin all over again.

I advise you to take these lessons in "eMarketing Sabotage" for what they are: guidelines to help you steer your e-business clear of the many pitfalls and mistakes of other marketers, and to improve your own success with search engine strategies.

Soon, with a sound plan, a bit of smart work and a solid attention-to-detail approach, your Web pages may rank highest among today's top search engine results.

Happy Marketing!

How To Design A Search Engine Friendly Website

There are many websites that fail to attract their target traffic, even after some search engine optimisation work has been done. One of the main causes is simply that the website isn't search engine friendly. This is a basic essential that needs to be incorporated into the design of every website at the outset; think of it as the foundation of your search engine optimisation strategy.

This article aims to highlight the areas a web designer should think about and incorporate into their design for maximum search engine effectiveness:

1. Search Engine Friendly Pages

It is important that when you design your website you bear in mind not only your own requirements, but also those of the search engines. The best way to approach this is to remember that search engines don't really care how nice or complicated your graphics or Flash movies are, or how snazzy your JavaScript is. Instead, search engines look at the code behind your page. Therefore, if you want to impress a search engine, your code needs to be nice and easy to read. By this I don't mean adding 'comment' tags and breaking the lines of code up with spaces, but ensuring that the elements the search engine is interested in, i.e. the Title tag, Description tag, Keyword tag (these days only some search engines really use the keyword tag) and Alt tags, are readable near the beginning of the code. Search engines don't like wading through lines and lines of JavaScript to get to the core areas that can help your page's ranking. Careful planning and positioning of your page elements is therefore required.

TIPS:
- If you're using tables to lay out your page, keep them simple rather than complex.
- Avoid using frames.
- If you need JavaScript for navigation, use a small script that calls up the bulk of the JavaScript from a separate file.
- Think twice about how you use graphics: make them relevant to your content and use the Alt tag for all images.
- Position the main content of the page before the images, or at least with the images nested between the text.
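To see roughly what a crawler "reads" when it looks at the code behind your page, here is a toy sketch using Python's standard html.parser. It is an illustration only (real spiders are far more involved, and the sample page is made up): it collects just the elements the article says engines care about.

```python
from html.parser import HTMLParser

class TagExtractor(HTMLParser):
    """Toy crawler's-eye view: collect title, meta description/keywords
    and img alt text -- the elements discussed above -- from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.found = {}
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") in ("description", "keywords"):
            self.found[attrs["name"]] = attrs.get("content", "")
        elif tag == "img":
            self.found.setdefault("alt", []).append(attrs.get("alt", ""))
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.found["title"] = data

page = """<html><head><title>Children's Shoes</title>
<meta name="description" content="Hard-to-fit children's shoes">
</head><body><img src="shoe.gif" alt="red shoe"></body></html>"""

ex = TagExtractor()
ex.feed(page)
print(ex.found["title"])  # Children's Shoes
```

If the title, description and alt text come out empty when you run something like this over your own page, a spider is seeing the same nothing.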

2. Keywords

Having good keywords is one of the most important areas to consider when designing a website/webpage.

One of the best tools for this is Wordtracker (www.wordtracker.com), which allows you to identify good competitive keywords for your pages.

In general the range of keywords associated with your pages can be very extensive, so for good concentration and prominence of keywords it is advisable to carefully select the top 10-15. You can always export the results to Excel and try out other competitive keywords if the ones you selected initially do not produce any noticeable benefits.

TOP TIP:
Wordtracker offer a one-day subscription to their service, from which you can squeeze nearly 2½ days' worth of use! Here's how: sign up for the service on the evening of Day 1 (the service is available almost immediately, so you can start searching for your competitive keywords straight away). You will also be able to use the service for the whole of Day 2 and, strangely, for the whole of Day 3! Enough time to get some good keywords for a lot of pages!

3. Content

Many search engines look at the main body of the page and identify keywords and phrases that are used within the text.

TIP:
Use competitive keywords relevant to the purpose of the page within its main body. Always try to ensure that the keywords are prominent within the text, i.e. they appear near the beginning of the page, are defined using 'heading' tags, are set in bold, or are used as hyperlinks.

4. Page Title

This is arguably one of the most important areas of a page and needs special attention to ensure that a good title is selected. Similar to many other areas of designing a search engine friendly page, the Page Title should also have a good keyword which describes the page content. To keep within the limits of many search engines the number of words for the Title shouldn't exceed nine.

5. Page Description

Another important area to work on for good ranking is the Page Description. This is the text found under the META Description tag and is displayed to users in the search results. Again, it is a good idea to pay attention to the use of good keywords when writing the description, which should be short (not more than 20-25 words) and sells your page before the user has even opened it!

6. Graphics

We've covered the use of graphics briefly above, emphasising the importance of using an Alt tag containing the relevant keyword(s). Although images can make a website very appealing, it is important to bear in mind that they shouldn't overpower the textual content of your page. As a general rule of thumb it is best to stick to a 70/30 ratio (70% text / 30% images).

7. Site Map

A Site Map is a fantastic way for search engines to find all your juicy pages on your website. There are many free Site Map tools available on the web that'll create your site map instantly.

8. Navigation Links

Navigation links to other pages on your website should be nice and easy. There are some engines which find it difficult to navigate through to the other pages on your website if the nav bar is too complicated, e.g. complicated pop-ups, use of flash, etc. Therefore if your site does have complicated navigation then it's always a good idea to implement simple text based hyperlinks to your common pages at the bottom of every page on your website.

Following the basic suggestions above will help lay the foundation to apply further good search engine optimisation advice which will make the difference in your overall search engine ranking.

This finer area of SEO is beyond the realm of this document and will require further investment based on individual needs.

Arif Hanid
Internet Marketing Manager for Ambleton Computing.
Professionals in bespoke Internet Development and Marketing.

Search Engine Marketing 101 For Corporate Sites

When most people want to find something on the web, they use a search engine. Millions of searches are conducted every day on search engines such as: google.com, yahoo.com, msn.com and many others. Some people are looking for your website. So how do you capture people searching for what your site has to offer? Through techniques called search engine marketing (SEM).

This tutorial is foundational information for anyone looking to implement search engine marketing. This tutorial will also help you understand how the search engines work, what SEM is, and how it can help you get traffic.

What is a Search Engine?

All search engines start with a "search box", which is sometimes the main focus of the site (e.g. google.com, dmoz.org, altavista.com) and sometimes just one feature of a portal site (e.g. yahoo.com, msn.com, netscape.com). Just type in your search phrase and click the "search" button, and the engine will return a listing of search engine result pages (SERPs). To generate SERPs, the search engine compares your search phrase with the information it holds about various web sites and pages in its database, and ranks them using a "relevance" algorithm.

Search Engine Classes

Targeted audience, number of visitors, quality of search and professionalism are what determine a search engine's class. Each search engine typically targets a specific audience based on interest and location. World-class search engines look very professional, include virtually the entire web in their databases, and return highly relevant results quickly.

Most of us are familiar with the major general search engines: google.com, yahoo.com, msn.com. A general search engine includes all types of websites and as such targets a general audience. There are also the lesser-known 2nd-tier general search engines: zeal.com, ask.com, whatyouseek.com. The primary difference is that 2nd-tier engines are lesser known and generate significantly less traffic.

There are also several non-general, or targeted, search engines that limit the types of websites they include in their database. Targeted search engines typically limit by location, by industry / content type, or both. Most large metro areas will have local search engines that list local businesses and other sites of interest to people in that area. Some are general and some are industry specific, such as engines specifically listing restaurants or art galleries.

Many other targeted search engines list sites from any location but only if they contain specific types of content. Most webmasters are familiar with webmaster tools search engines such as; webmasterworld.com, hotscripts.com, flashkit.com and more. There are niche SEs for practically any industry and interest.

Search Engine Models

There are two fundamentally different types of search engine back ends: site directories and spidering search engines. Site directory databases are built by a person manually inputting data about websites. Most directories include a site's url, title, and description in their database. Some include more information, such as keywords, the owner's name, visitor rankings and so on. Some directories allow you to control your website's information yourself; others rely on editors who write the information to conform to the directory's standards.

It is important to note that most directories offer directory listings as an alternative to the search box for finding websites. A directory listing uses hierarchical groupings, from general to specific, to categorize a site.

Spidering search engines take a very different approach. They automate the updating of information in their database by using robots to continually read web pages. A search engine robot/spider/crawler acts much like a web browser, except that instead of a human looking at the web pages, the robot parses the page and adds the page's content to its database.

Many of the larger search engines have both a directory and a spidering search engine (e.g. yahoo.com, google.com) and allow visitors to select which they want to search. Note that many search engines do not have their own search technology and contract services from elsewhere. For example, Google's spider SE is its own, but its directory is the Open Directory; additionally, aol.com and netscape.com both use Google's spider SE for their results.

There are a few other search engine models of interest. Some search engines combine results from other engines, such as dogpile.com and mamma.com. There are also search engines that add extra information to searches, such as Amazon's alexa.com, which uses Google's backend but adds the traffic data it gathers from its toolbar.

Getting In

One of the most important things to understand about the SE database models is how to get into their databases and keep your listing updated. With a search directory, a submission needs to be made to provide the directory with all the information needed for the listing. It is generally recommended that this be done by hand, either by you or by a person familiar with directory submissions. There are many submission tools available that advertise that they automate the submission process. This may be fine for smaller directories, but for the major directories manual submissions are worth the time.

Not all search directories are free; many charge a one-time or annual fee for review. Many of the free search directories have little quality control. For free directories you may have to submit your site several times before being accepted.

There are three different methods for getting into spidering search engines: free site submission, paid inclusion and links from other sites. Virtually all spidering SEs offer a free site submission; for most, you simply enter your url into a form and submit. Paid inclusion is normally not difficult, except for the credit card payment. With free site submission there is no guarantee of service: the SE may send a spider to your site in the next few weeks, months or never. Typically with paid inclusion you get a guarantee that the page you submitted will be included within a short amount of time. The other standard way to get included is to have links to your website from web pages that are already in the SE's database. The SE spiders are always crawling the web and will eventually follow those links to find your site.

Once you are in a search engine database, you might change your site and need the search engine to update their database. Each directory handles this differently; generally each database will have a form for you to submit a change request. Spidering search engines will eventually find the change and add your updates automatically.

Getting High Rankings

Getting into a search engine database is only the first step. Without other factors you will not rank in the top positions, a prerequisite for quality traffic. So how do you get top positions? You can pay for placement with sponsored links, which are covered in the next section. To place well in the free, organic SERPs, you will need to perform search engine optimization.

Search engine optimization is one of the most complicated aspects of web development. Each search engine uses a different algorithm with hundreds of factors, changes it constantly, and guards it carefully as a trade secret. Thus no one outside the search engines' employ knows with 100% certainty the perfect way to optimize a site. However, many individuals, called search engine optimizers, have studied the art and derived a set of techniques with a track record of success.

In general, there are two areas to focus on for top rankings: on-page factors and linking. On-page factors means placing your target keywords in the content of your site in the right places; the structure of, and the technologies used on, your website also play a role. Linking refers to how other websites link to yours and how your site links internally.

Search Engines' Marketing Offerings

Search engines in the early days of the web were focused solely on serving the visiting searcher. They worked to capture as much of the web as possible in their database and provide fast, relevant searches. Many early website owners learned to reverse engineer the relevancy algorithms and to make their sites "search engine friendly" to get top rankings. They were the first search engine optimizers, manipulating the search engine's natural or organic SERPs as a means of generating free web traffic.

Oftentimes these optimized sites compromised the integrity of the SERPs and lowered quality for the searcher. Search engines fought, and continue to fight, to maintain the quality of their results. Eventually, the search engines embraced the fact that they are an important means of marketing websites. Today most search engines offer an array of tools that balance website owners' need to market with maintaining quality for the searcher.

You can generally break search engine marketing tools into free and for-pay. Realize these classifications are from the search engine's point of view; effort and expense are required to set up and maintain any search engine marketing campaign.

Organic rankings are still one of the most important ways to drive quality traffic. Search engines now seek to reward ethical, high-quality websites with top rankings and to remove inappropriate "spam" websites. While organic rankings can produce continual free traffic, it takes time from an experienced individual to achieve optimum results. Additionally, organic placement offers no guarantees: it generally takes months to get listed, and rankings can be unpredictable once you are.

Some search engines offer services that add more control to your organic campaign. Most of these services will list / update your site faster or will guarantee that all essential content is listed. For integrity reasons, no major search engine offers higher organic rankings for a fee.

If you need top rankings quickly, pay-per-positioning (PPP) is the most popular way to go. PPP rankings appear within normal organic SERPs but are usually designated as "sponsored listings". PPP listings use a bidding process to rank sites: if you are the top bidder, i.e. willing to pay the most per click on a given phrase, you get top placement; the 2nd-highest bidder is number two, the next is number three, and so on. While most PPP works on this model, some search engines offer modifications, such as Google's AdWords, where bid price and click-through rate are both factors in positioning.
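The two bidding models can be sketched in a few lines of Python. This is a deliberately simplified illustration with made-up sites and bid figures; real auction mechanics are more involved.

```python
def rank_sponsored(bids):
    """Plain PPP model: the highest bid per click takes the top
    sponsored slot, the next bid takes slot two, and so on."""
    return sorted(bids, key=lambda b: b[1], reverse=True)

def rank_adwords_style(bids_with_ctr):
    """AdWords-like variant: rank by bid multiplied by click-through
    rate rather than by bid alone."""
    return sorted(bids_with_ctr, key=lambda b: b[1] * b[2], reverse=True)

bids = [("siteA.com", 0.45), ("siteB.com", 1.10), ("siteC.com", 0.80)]
print([site for site, _ in rank_sponsored(bids)])
# ['siteB.com', 'siteC.com', 'siteA.com']

# With CTR factored in, a cheaper but better-clicked ad can outrank a higher bid:
bids_ctr = [("siteB.com", 1.10, 0.01), ("siteC.com", 0.80, 0.03)]
print([site for site, _, _ in rank_adwords_style(bids_ctr)])
# ['siteC.com', 'siteB.com']
```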

Search engines have many other marketing tools as well, such as search-specific banner ads, listings on affiliate sites and more.

Getting Started

The majority of websites have sub-optimal search engine marketing. Most sites have no effective search engine marketing and are continually missing out on valuable leads. Many other websites are too aggressive, wasting money on low value traffic or harming the functionality of their site due to over optimization. Too many sites are even paying money and receiving no results because they have trusted unethical or inexperienced search engine optimizers.

All SEM campaigns should start with a strategic evaluation of SEM opportunities based on return on investment (ROI). You need to assess how much each lead is worth for each keyword phrase and determine which SEM tools will achieve the best ROI for the phrase.
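The per-keyword ROI calculation described above reduces to simple arithmetic. A sketch, using entirely hypothetical numbers:

```python
def keyword_roi(lead_value, leads, cost):
    """ROI for one keyword phrase: revenue generated by its leads,
    minus the campaign spend, divided by the spend."""
    return (lead_value * leads - cost) / cost

# Hypothetical example: 40 leads worth $25 each from a $500 campaign.
print(keyword_roi(25.0, 40, 500.0))  # 1.0, i.e. a 100% return
```

Running this for each candidate phrase makes it easy to see which SEM tool (organic work vs. sponsored listings) pays for itself on which keyword.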

You also have to decide how much you want to do in-house vs. retaining an expert. A qualified expert will typically produce better results faster, but the higher expense may destroy the ROI. Often it is best to work with an expert as a team: the expert develops the strategy while internal staff handle implementation and ongoing management.

Tom McCracken is the Director of LevelTen Design, a Dallas-based e-media agency. He has over 14 years of experience in software engineering and marketing, and has developed solutions to improve customer service and communications for some of the world's largest companies. With an education in chemical engineering and economics from Johns Hopkins University, his background includes web and software development, human factors engineering, project management, business strategy, marketing strategy, and electronic design.

Choosing a good domain name isn't always so simple.

So you need a domain name for your brand new internet business. You may even have some cool ideas for a new domain name combination that will really impress your friends. Question is, is your new domain name going to help your business or hurt it?

What could be simpler than choosing a domain name right? Wrong. There are a number of things you need to consider and research before you register your favorite domain name.

First off, what is a domain name and why would I want one?

A domain name makes our lives much easier when surfing the internet. You see, all computers on the internet are actually referenced by what are called IP addresses. On the internet, IP addresses are four sets of numbers that serve like street addresses, allowing two computers to talk over a network. An example of an IP address is the one for Google.com: 216.239.39.99. If you enter this IP address into the address bar of your browser, it will bring you to Google's home page in the very same way that typing www.google.com would get you there. Unfortunately, we humans have difficulty remembering our phone numbers, let alone so many digits for all kinds of sites. That's one of the main reasons domain names were invented.

Domain names make it easy for us humans to remember how to find a site. Most people know Google.com, and anyone familiar with the internet knows that to reach Google, you simply type www.google.com in your address bar and you are transported to their website. The same goes for Disney.com, Microsoft.com, CNN.com, etc.

Now you would think that choosing a domain name would simply be a matter of choosing something that is unique and that people would remember. The problem with that approach is that most of us don't have the money needed to turn our name into a brand name on the mass market. Most of us need to rely on our prospects reaching our website through other means. The best of these are search engines.

Choosing a good domain name for your site starts with the main keywords you have chosen to focus on for your website. Before you launch your business, you should conduct some preliminary research online to determine which keywords have the most traffic and the least number of other websites competing for that particular keyword. Some tools that help in this are the Overture keyword suggestion tool and Wordtracker.com. Both of these tools will give you a rough idea of how much traffic each of your chosen keywords will likely get each month. This helps to determine which keywords to focus on.

Should you choose a domain name that includes your main keywords?

In most cases, the answer is yes. Google and to some degree Yahoo both give you a small boost for your domain name. If your domain name happens to contain your targeted keywords, your domain name will help you in your quest for higher search engine rankings. Now if you do everything else wrong, having your main keywords in your domain name will not magically catapult you to the top of the listings. Many other parts of your site must be working for you as well. Other things you can do to improve rankings are beyond the scope of this article.

Choosing a keyword rich domain is a smart business move.

For some sites, it could be the edge they need to move up a few spots in the search engines. When choosing a keyword rich domain name, you may want to consider hyphens between your keywords. An example is cheap-airline-tickets.com. Current research trends for Google and Yahoo suggest that hyphens are the only way to separate keywords within a URL that will give you a rankings boost.

Why not simply choose your company name? Simple. Is your company a household name? Are you so dominant in a category that people have stopped referring to the generic name of your category and use your brand name like Kleenex has for tissue paper? If so, register your company name. If not, register a keyword rich domain wherever possible.

You may be thinking, "But I already own a domain name that is my company name. Should I go and register a new domain and point it to the same site?" The short answer is no. Years ago, you could improve your rankings on search engines simply by setting up lots of doorway pages under all kinds of domain names and having them all link back to your home page. Nowadays that tactic can backfire. You are better off optimizing individual pages within your existing website than creating a whole bunch of "fluff" sites just to increase rankings.

The technique I suggest above is really best suited for brand new business ventures. If you still have not registered your domain name for that special online business you are about to start, then make it keyword rich wherever possible. If you have already launched your business, you'll just have to take advantage of this information next time you start another online venture.

Your Website Title Could Be Costing You Money

Nothing could be simpler than the title you give to your web pages right? Unfortunately, the vast majority of the websites I visit these days have absolutely terrible titles that hurt their online business. The title of your website is a very important part of getting good rankings on most of the major search engines. A good title also goes a long way towards getting your prospects to click on your listings.

If you go to Google right now and type in any search phrase you want, you get back a listing of web sites that match the keywords you entered. If you look closely, you'll notice that each search listing's hyperlink is also the title of that website. The title you choose needs to describe to your prospects what your website is all about. It needs to entice your prospects to click on your listing over any other listing. If your title is simply your company name, you are most likely losing lots of traffic. You will also find it difficult to rank highly on keywords relevant to your site.

Here are some things to consider:

1. Make sure you use relevant keywords

Keywords are simply search terms that your web site prospects will type into a search engine in order to find you. The keywords you are targeting need to be included in your title. Your keywords also need to be as close to the beginning of the title as it makes sense to do. For example, if you were selling shoes online and you were targeting the keyword "children shoes", you could have a title like "Children's shoes for hard to fit children." Notice how the targeted keywords were at the beginning of the title. Putting your keywords at the front of your title speaks to keyword prominence. Prominence refers to the importance of your keyword in the title. If your main keywords are at the very beginning of the title, it is said to have a prominence of 100%. If they are at the very end of the title, they have a prominence of 0%. As much as possible, you want to have your main keywords appear towards the beginning of your title.
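The prominence scale described above (100% at the very beginning, 0% at the very end, linear in between) can be written out directly. A small sketch in Python; the function name is my own:

```python
def prominence(title, keyword):
    """Prominence as defined above: 100% when the keyword opens the
    title, 0% when it closes it, scaled linearly in between."""
    t, k = title.lower(), keyword.lower()
    pos = t.find(k)
    if pos < 0:
        return None  # keyword not present in the title at all
    span = len(t) - len(k)
    if span == 0:
        return 100.0  # title is exactly the keyword
    return 100.0 * (span - pos) / span

print(prominence("Children's shoes for hard to fit children", "children's shoes"))  # 100.0
print(prominence("Hard to fit children's shoes", "children's shoes"))               # 0.0
```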

2. Consider using your main keyword twice in your title

If you are optimizing your site to rank well on Google, you should also consider finding a way to include your main keyword twice in the title. The trick is to do this without making the title sound stupid. One way I do this is to use the pipe character | between your main keywords. For example, if I was writing a title for a fishing website and the main keyword I was targeting was 'fishing charter' I could repeat the keywords this way, "Fishing Charter | Are you ready for a fishing charter you won't soon forget?" This example gets my target keyword at the beginning and manages to repeat it again without making it look stupid.

3. Persuade your prospect to click on your link

The link that your prospects see when your website comes up in a search engine's results will almost always be the title of your website. Even if you get to the first page of a search engine for the keywords you are targeting, you still need to persuade your prospects to click on your link over all the others around it. If your title is not persuasive, or is non-existent, you won't get the traffic you expect even if you are number one in the listings. Your title must be persuasive.

4. Say what you want in 65 characters or less.

Almost all search engines limit the length of the title that will appear to the searcher. Google, for instance, only displays the first 60 to 66 characters. Sometimes a webmaster will try to cram every one of their keywords into the title in the hope that all of them will be picked up by the search engines. Instead, keep your main keyword prominent in the first 65 characters of your title, while making sure the title is properly targeted to your market. You can include your secondary keywords in the body of your web page, but keep them out of the title unless it makes sense to leave them in. The rule of thumb for including secondary keywords in your title: include them only if you can still keep the title persuasive to your website prospects.
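To stay within that limit, a small helper can flag over-long titles before you publish. A hedged sketch: the 65-character cutoff follows this article's rule of thumb, `trim_title` is a hypothetical helper, and real engines may truncate differently.

```python
def trim_title(title, limit=65):
    """Return (title, True) if it fits within the limit; otherwise
    return a version cut back to the last whole word, plus False."""
    if len(title) <= limit:
        return title, True
    trimmed = title[:limit].rsplit(" ", 1)[0]  # avoid chopping mid-word
    return trimmed + "...", False

long_title = ("Children's shoes for hard to fit children, plus boots, "
              "sandals, trainers and school shoes")
short, fits = trim_title(long_title)
print(fits, short)
```

Running this on a title that is too long shows you roughly what a searcher would actually see, so you can decide what to cut.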

Your website title is crucial to your success online.

Your title is vital to your efforts to get traffic online. Make sure it is descriptive and persuasive. It needs to include your main keywords as close to the start as it makes sense to do. You also need to avoid repeating the same keywords over and over again. That may work for keywords with little or no competition, but it won't work for any keyword that gets even a decent amount of traffic.

The Budget Webmaster's 6 Step Guide to Improving Existing Rankings in Google

You know the scenario. You get an occasional click from Google for a certain keyword. You go to find out why you aren't getting more clicks, and you find out that you're ranked in the 30's, 50's, or heaven forbid, the 300's. "Great", you think, "I finally get ranked for a good keyword and it's a worthless ranking".

Not necessarily.

If you got ranked for a keyword you wanted at all, the game's not over yet. If your site's content is geared towards that subject, you can improve your ranking in the search engines at no cost. How?

The first thing you want to do is find out how well you are ranked for this keyword. For Google in particular, this used to be a difficult chore. In the old days of 2003, you'd spend your valuable time doing a search on your desired keyword, then a sub-search for your site, and crawling through pages of listings to find out exactly where you stood.

Now there is hope in the form of the following website. Direct your browser to:

http://www.googlerankings.com/index.php

You can use this site to find out what number you come up for in the Google listings, which can be very powerful information if used correctly. If you're ranked in the top 1000, you have a shot at raising your listing for that page by tweaking the page to be a little more relevant.

So, secondly, you have to know how good a shot you have at getting a better listing. Go to:

http://www.searchguild.com/difficulty/

I posted a tip about this a month ago, and it's also in the free optimization guide I released the week of March 7th. It tells you how hard it is to rank well for certain keywords in Google. You'll need a free Google API key to use it.

Now that you know your chances, the third piece of information you need to know is how much traffic you can expect. Digital Point has a free tool that gives an approximation of how many hits per day a good ranking gets. Access it here:

http://www.digitalpoint.com/tools/suggestion/

Okay, let's say everything checks out so far. You rank in the top 1000. The term you want won't be that hard to get, and will get you enough traffic per month to justify your efforts.

The fourth step is to take the term you chose and optimize your page.

GoRank does periodic reports on the search engines, and its February report gives an analysis of what the best-ranking pages in Google have in common. As a free bonus, it will also tell you what Yahoo wants. Follow this link for details: http://www.gorank.com

Now that you know what to shoot for, you need to know how the page you want to rank will measure up: you need to calculate your keyword density. The fifth step can also be done at gorank.com, which has a free tool that will calculate it for you. Prepare your page with that in mind, re-upload, and you're almost done.
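For the curious, keyword density is just the share of a page's words taken up by your key phrase. Here is a minimal sketch in Python, assuming the common phrase-weighted definition; tools such as the one at gorank.com may count slightly differently.

```python
import re

def keyword_density(page_text, keyword):
    """Occurrences of the key phrase, weighted by its word count,
    as a percentage of all words on the page."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    # Count every position where the full phrase appears in sequence.
    hits = sum(words[i:i + len(kw)] == kw
               for i in range(len(words) - len(kw) + 1))
    return round(100.0 * hits * len(kw) / len(words), 2)

print(keyword_density("Fishing charter deals on every fishing charter",
                      "fishing charter"))  # 57.14
```

A toy snippet like the one above scores absurdly high, which is itself a useful warning: real pages stuffed to such densities tend to read badly to visitors.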

Great, you're all set. Now you should submit your site to Google, right?

Wrong. Absolutely not. If you can help it, you should never, ever submit any page of your site to Google. Let it find you. HOW it finds you can affect your page rank. I don't mean that there is a standard penalty for submitting. There's been speculation on that for a while but I have yet to prove it matters.

What I DO know from personal experience and testing on my members' sites is that getting the Googlebot search engine spider to happen upon your site shaves up to 6 weeks off the standard time it takes for indexing. You can show up in Google in as little as 4 days.

Which sites link to you can also affect your Google PageRank. While this is not as important as it once was, it still carries significant weight; my site didn't start getting spidered on a daily basis until my PageRank increased to 5.

So even if the spider comes to your site on a monthly basis, you're better off waiting for the spider to come back by. That's the sixth and final step: let your page be re-discovered with its great new changes.

And yes, there's an even faster, better way to get Google.com's search engine spider to re-index that page, but that's another article, isn't it?

If Content is King, then surely Relevance is Queen!

There has been a lot of to-ing and fro-ing in the search
engine world of late and there are lots of conspiracy
theories as to why these things happen.

It is easy as a webmaster to get caught up in these webs of
intrigue.

You get email notes about them, you view so-called experts'
thoughts on bulletin boards - hey, you probably even read
things in newsletter articles!

Well I hope so anyway....

The big driver for webmasters currently appears to be
content and link building. 

While link building is important, I don't believe it makes
Queen. Maybe a Prince. Content and links DO go hand in
hand but, without relevance, the Kingdom is doomed. Sorry, I
will stop the analogy now! :-)

If your site is about finance, then finance content is best
supported by finance link exchanges.  Relevance!

If your site is about finance, then finance content
supported by casino link exchanges from a PR8 site may help
in the short term, but all the signs are saying this is not
a long-term strategy.

Okay, so what is the best strategy?

Keep EVERYTHING relevant.  It is that simple. 

Make sure that you only swap or link to sites that are
relevant to the content on your pages. Yes, I am suggesting
link exchanging on the pages of your site, not on a links page.

Links pages seem to be being abused. There are rumours that
pages called links, resources or partners are not passing
PageRank. You could be wasting your time building links
that are not giving you any benefit!

Delivering relevant links from relevant content is the
future.

Look at sites such as www.bbc.co.uk or
www.independent.co.uk. News sites have the right idea.
They have 2 or 3 relevant internal links to other
articles on the same topic, or links to related internal
tools. These can usually be found on the right-hand side
of the article.

They also then have weblinks or external links to sites of
interest that are related to the topic.  These are relevant!

Another benefit of this is that with a content rich site you
can add hundreds of links quite legitimately and really add
some value both to your Rankings and your users.

With a content-poor site it is difficult: you have to add
link pages or create a links directory. A five-page site
will need to add 10 or 12 good link pages to compete, and
even then, with algorithm changes, this may not be prudent.

Having a site with 400 pages means you can easily add 3
links per page, so you have 1200 link options straight away.

Hopefully this explains that relevance runs a close second
to content.

Always bear in mind when writing content that relevant
links will not only boost your search engine rankings but
also provide a service to your visitors.

2004 © J2 Squared Limited. All Rights Reserved.

Get Better Search Engine Rankings with RSS

RSS is the latest craze in online publishing. But what exactly is RSS?

RSS, short for Really Simple Syndication (also expanded as Rich Site Summary), is an XML-based file format used by publishers to make their content available to others in a form that can be universally understood.

RSS allows publishers to "syndicate" their content through the distribution of lists of hyperlinks.

It has actually been around for a while, but with the advent of spam filters and online blogging, it is fast becoming the choice of ezine publishers who want to get their message across to their subscribers.

However, not much attention has been given to the advantages RSS provides for search engine optimization.


Why Search Engines Love RSS

Many SEO experts believe that sites optimized around themes, or niches, where all pages correspond to a particular subject or set of keywords, rank better in the search engines.

For example, if your website is designed to sell tennis rackets, your entire site content would be focused around tennis and tennis rackets.

Search engines like Google seem to prefer tightly-themed pages.


But where does RSS figure in all this?

RSS feeds, usually sourced from newsfeeds or blogs, often correspond to a particular theme or niche.

By using highly targeted RSS feeds, you can enhance your site's content without having to write a single line on your own.

It's like having your own content writer - writing theme-based articles for you - for free!


How can RSS improve my Search Engine Rankings?

There are three powerful reasons why content from RSS Feeds is irresistible bait for search engine spiders.


1. RSS Feeds Provide Instant Themed Content

There are several publishers of RSS feeds that are specific to a particular theme.

Since the feed is highly targeted, it could contain several keywords that you want to rank highly for.

Adding these keywords to your pages helps Google tag your site as one with relevant content.


2. RSS Feeds Provide Fresh, Updated Content

RSS feeds from large publishers are updated at specific intervals. When the publisher adds a new article to the feed, the oldest article is dropped.

These changes are immediately effected on your pages with the RSS feed as well. So you have fresh relevant content for your visitors every hour or day.


3. RSS Feeds Result in More Frequent Spidering

One thing I never anticipated when I added an RSS feed to my site was that the Googlebot would start visiting almost daily.
To the Googlebot, the page with the RSS feed incorporated into it was as good as a page being updated daily and, in its judgement, a page worth visiting daily.

What this means to you is that your site will be indexed more frequently by the Googlebot, so any new pages that you add will be picked up much faster than your competitors' pages.


How does this benefit you as a marketer?

Well, for example, let's say a top Internet marketer comes out with a new product that you review and write up a little article on, and that your competitors do the same.

Google generally tends to index pages at the start of the month, and if you miss that update, you will probably need to wait until the next month to see your entry appear.

But, since your site has RSS feeds, it now gets indexed more frequently. So the chances of getting your page indexed quickly are much higher.

This gives you an advantage over the competition, as your review will show up sooner in the search results than theirs.

Imagine what an entire month's advantage could do to your affiliate sales!


Why Javascript Feeds Are Not Effective

Some sites offer javascript code that generates content sourced from RSS feeds for your site.

These are of absolutely no value in terms of search engine rankings: the Googlebot cannot read JavaScript, so the content is not interpreted as part of your page.

What you need is code that parses the RSS feed and renders the feed as html content that's part of your page.

This is achieved using server side scripting languages like PHP or ASP.

A good free ASP script is available from Kattanweb
http://www.kattanweb.com/webdev/projects/index.asp?ID=7


An equally good PHP script is CARP
http://www.geckotribe.com/rss/carp/
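To illustrate the server-side approach, here is a minimal sketch in Python (rather than the PHP/ASP scripts linked above) using only the standard library. The `rss_to_html` helper is hypothetical, and production code should also HTML-escape titles and fetch the feed over HTTP.

```python
import xml.etree.ElementTree as ET

def rss_to_html(rss_xml, max_items=5):
    """Parse an RSS 2.0 document and render its items as an HTML list
    that can be embedded directly into a page."""
    root = ET.fromstring(rss_xml)
    parts = []
    for item in root.findall("./channel/item")[:max_items]:
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="#")
        parts.append('<li><a href="%s">%s</a></li>' % (link, title))
    return "<ul>\n%s\n</ul>" % "\n".join(parts)

sample = """<?xml version="1.0"?>
<rss version="2.0"><channel><title>Tennis News</title>
<item><title>Choosing a Racket</title><link>http://example.com/rackets</link></item>
<item><title>Grip Basics</title><link>http://example.com/grip</link></item>
</channel></rss>"""

print(rss_to_html(sample))
```

Because the feed is rendered into plain HTML on the server, the spider sees ordinary links and text, which is exactly what makes this approach work where JavaScript feeds fail.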


So, in conclusion: besides optimizing on-page and off-page factors, adding RSS feeds to your pages should be an important part of your strategy to boost your search engine rankings.


Satyajeet Hattangadi is the CEO of Novasoft Inc, a software
solutions provider that specializes in affordable
customized software solutions. http://www.novasoft-inc.com
Get the Free Email Course "RSS Riches" and learn how to use
RSS to get high search engine rankings and monetize your
website at http://www.trafficturbocharger.com

The Other Side of the Search Gods' Abracadabra!

Thousands of servers... billions of web pages... sifting through the WWW by hand is impossible. The search engine gods cull the information you need from the Internet - from tracking down an elusive expert to presenting the most unconventional views on the planet. Name it and click it. Beyond all the hype about the web heavens they rule, let's attempt to keep the argument balanced. From Google to Voice of the Shuttle (for humanities research), these ubiquitous gods that enrich the net can be unfair... and do have pitfalls. And considering the rate at which the Internet continues to grow, the problems of these gods are only exacerbated.

Primarily, what you need to digest is the fact that search engines fall short of Mandrake's magic mechanism! They simply don't create URLs out of thin air; instead they send their spiders crawling across those sites that have rendered prayers (and expensive offerings!) to them for consideration. Even when a site like Google claims to have a massive 3 billion web pages in its database, a large portion of the web nation is invisible to these spiders - they are simply ignorant of the Invisible Web. This invisible web holds the content normal search engines can't index, because the information on many websites sits in databases that are only searchable within those sites. Sites like www.imdb.com - The Internet Movie Database, www.incywincy.com - IncyWincy, the invisible web search engine, and www.completeplanet.com - The Complete Planet, which cover this area, are perhaps the only way you can access content from that portion of the Internet invisible to the search gods. Here, you don't perform a direct content search but search for the resources that may access the content. (Meaning: be sure to set aside considerable time for digging.)

None of the search engines indexes everything on the Web (I mean none). Tried research literature on popular search engines? From AltaVista to Yahoo, they will list thousands of sources on education, human resource development, and so on, but mostly from magazines, newspapers, and various organizations' own web pages, rather than from research journals and dissertations - the main sources of research literature. That's because most journals and dissertations are not yet publicly available on the Web. Thought they'd get you everything that's hosted on the web? Think again.

The Web is huge and growing exponentially. Simple searches, using a single word or phrase, will often yield thousands of "hits", most of which will be irrelevant. A layman going to the Internet for a piece of info has to deal with a more severe issue: too much information! And if you don't learn how to control the information overload returned by a search result, roll out the red carpet for some frustration. A very common problem results from sites that have many pages with similar content. For example, if a discussion thread (in a forum) goes on for a hundred posts, there will be a hundred pages, all with similar titles, each containing a wee bit of information. Now instead of just one link, all hundred of those darn pages will crop up in your search result, crowding out other relevant sites. Regardless of all the sophistication technology has brought in, many well-thought-out search phrases produce list after list of irrelevant web pages. The typical search still requires sifting through dirt to find the gold. If you are not specific enough, you may get too many irrelevant hits.

As said, these search engines do not actually search the web directly but their centralized databases instead. And unless that database is updated continually to index modified, moved, deleted or renamed documents, you will land amidst broken links and stale copies of web pages. If they handle inadequately the dynamic web pages whose content changes frequently, chances are the information they reference will quickly go out of date. After waging their never-ending war with over-zealous promoters (spamdexers, rather), where do they have time to keep their databases current and their search algorithms tuned? No surprise if a perfectly worthwhile site goes unlisted!

Similarly, many of the Web search engines are undergoing rapid development and are not well documented. You will have only an approximate idea of how they work, and unknown shortcomings may cause them to miss desired information. Not to mention that, amongst the first-class information, the web also houses false, misleading, deceptive and dressed-up information produced by charlatans. The Web itself is unstable, and tomorrow they may not find you the site they found you today. Well, if you could predict them, they would not be gods, would they?! The syntax (word order and punctuation) for various types of complex searches varies somewhat from search engine to search engine, and small errors in the syntax can seriously compromise the search. For instance, try the same phrase search on different search engines and you'll see what I mean. Novices, read this line: using search engines does involve a learning curve. Many beginning Internet users, because of these disadvantages, become discouraged and frustrated.

Like a journalist put it, "Not showing favoritism to its business clients is certainly a rare virtue in these times." Search engines have increasingly turned to two significant revenue streams. Paid placement: In addition to the main editorial-driven search results, the search engines display a second - and sometimes third - listing that's usually commercial in nature. The more you pay, the higher you'll appear in the search results. Paid inclusion: An advertiser or content partner pays the search engine to crawl its site and include the results in the main editorial listing. So?...more likely to be in the hit list but then again - no guarantees. Of course those refusing to favor certain devotees are industry leaders like Google that publishes paid listings, but clearly marks them as 'Sponsored Links.'

The possibility of these 'for-profit' search gods (which haven't yet made much profit) taking fees to skew their searches can't be ruled out. But as a searcher, the hit list the engine provides should obviously be ranked in order of relevance and interest. Search command languages can often be complex and confusing, and the ranking algorithm is unique to each god: based on the number of occurrences of the search phrase in a page, whether it appears in the page title, a heading, the URL itself, or the meta tags, or on a weighted average of a number of these relevance scores. For example, Google (www.google.com) uses its patented PageRank(TM) and ranks the importance of search results by examining the links that lead to a specific site. The more links lead to a site, the higher the site is ranked. Pop on popularity!
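The link-counting idea can be sketched in a few lines. This is an illustrative power-iteration toy over a tiny hand-made link graph, assuming the commonly cited 0.85 damping factor; Google's real PageRank involves far more than this.

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank: each page's score is fed by the scores of the
    pages linking to it, plus a small base share for every page."""
    pages = sorted(set(links) | {q for tgts in links.values() for q in tgts})
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p in pages:
            tgts = links.get(p, [])
            if tgts:
                share = damping * rank[p] / len(tgts)
                for q in tgts:
                    new[q] += share
            else:  # a page with no outlinks spreads its score evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# b is linked to by both a and c, so it ends up ranked highest.
scores = pagerank({"a": ["b"], "b": ["a"], "c": ["b"]})
print(sorted(scores, key=scores.get, reverse=True))  # ['b', 'a', 'c']
```

The page with the most incoming links floats to the top, which is the "pop on popularity" effect described above.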

Alta Vista, HotBot, Lycos, Infoseek and MSN Search use keyword indexes, giving fast access to millions of documents. They index a large number of sites, but the lack of an index structure makes keyword searching difficult to get right.
In reality, the prevalence of a certain keyword is not always in proportion to the relevance of a page. Take this example. A search on sari - the national costume of India - in a popular search engine returned, among its top sites, the following links:
- www.scri.sari.ac.uk/ - the Scottish Crop Research Institute
- www.ubudsari.com/ - a health resort in Indonesia
- www.sari-energy.org/ - the South Asia Regional Initiative for Energy Cooperation and Development

Pretty useful sites for someone interested in how to drape a sari or in its tradition?! (Well, no prayer goes unanswered... whether you like the answer or not!) Rather than simply counting the number of instances of a word on a page, search engines are attempting to make rankings better by assigning more weight to keywords that appear in titles, subheadings, and so on.
Now, unless you have a clear idea of what you're looking for, it may be difficult or impossible to use a keyword search, especially if the vocabulary of the subject is unfamiliar. Similarly, the concept-based search of Excite (where the words you enter are grouped in an attempt to determine their meaning, instead of being treated as individual words) is a difficult task and yields inconsistent results.

Besides, who reviews or evaluates these sites for quality or authority? They are simply compiled by a computer program. These active search engines rely on computerized retrieval mechanisms called "spiders", "crawlers", or "robots" to visit websites on a regular basis and retrieve relevant keywords to index and store in a searchable database. And this huge database often yields unmanageable, if comprehensive, results - results whose relevance is determined by their computers. The irrelevant sites (the high percentage of noise, as it's called), questionable ranking mechanisms and poor quality control may be the result of too little human involvement to weed out the junk. Thought human intervention would solve all problems? Read on.

From the very first search engine, Yahoo, to about.com, Snap.com, Magellan, NetGuide, Go Network, LookSmart, NBCi and Starting Point, all subject directories index and review documents under categories, making them more manageable. Unlike active search engines, these passive or human-selected search engines don't roam the web directly; they are human-controlled, relying on individual submissions. Perhaps the easiest to use in town, but these directories cover only a small portion of the actual number of WWW sites, and thus are certainly not your bet if you intend specific, narrow or complex topics.

Subject designations may be arbitrary, confusing or wrong. A search looks for matches only in the descriptions submitted. Directories never contain the full text of the web pages they link to: you can only search what you see - titles, descriptions, subject categories, etc. The human-labor-intensive process limits database currency, size, rate of growth and timeliness. You may have to branch through the categories repeatedly before arriving at the right page. They may be several months behind the times because of the need for human organization. Try looking for some obscure topic, and chances are the people who maintain the directory have excluded those pages. Obviously, machines can blindly count keywords, but they can't make common-sense judgements as humans can. But then why do human-edited directories respond with all this junk?!

And here's about those meta search engines. A comprehensive search of the entire WWW using The Big Hub, Dogpile, Highway61, Internet Sleuth or Savvysearch, covering as many documents as possible, may sound as good an idea as one-stop shopping. Meta search engines do not create their own databases; they rely on existing active and passive search engine indexes to retrieve search results, and the very fact that they access multiple keyword indexes lengthens their response time. Searching several search engines at once does save your time, but at the expense of redundant, unwanted and overwhelming results - and, much more important, misses. The default search mode differs from search site to search site, so the same search is not always appropriate in different search engine software. The quality and size of the databases vary widely.

Weighted search engines like Ask Jeeves and RagingSearch allow the user to type queries in plain English without advanced searching knowledge, again at the expense of inaccurate and undetailed searching. Review or ranking sources like Argus Clearinghouse (www.clearinghouse.net), eBlast (eblast.com) and Librarian's Index to the Internet (lii.org) evaluate website quality from sources they find or accept submissions from, but cover a minimal number of sites.

As a webmaster, your site registration with the biggest billboards in Times Square can get you closer to bingo! for the searcher. Those who didn't even know you existed before are in your living room in New York time!

Your URL registration is a no-brainer, considering the flocking traffic it generates for your site. It is certainly a quick and inexpensive method, yet it is only a component of the overall marketing strategy - one that in itself offers no guarantees, no instant results, and demands continued effort from the webmaster. Commerce rules the web. As a notable Internet caveman put it, "Web publishers also find dealing with search engines to be a frustrating pursuit. Everybody wants their pages to be easy for the world to find, but getting your site listed can be tough. Search sites may take a long time to list your site, may never list it at all, and may drop it after a few months for no reason. If you resubmit often, as it is very tempting to do, you may even be branded a spamdexer and barred from a search site. And as for trying to get a good ranking, forget it! You have to keep up with all the arcane and ever-changing rules of a dozen different search engines, and adjust the keywords on your pages just so... all the while fighting against the very plausible theory that in fact none of this stuff matters, and the search sites assign rankings at random or by whim."

To make the best use of Web search engines - to find what you need and avoid an avalanche of irrelevant hits - pick search engines that are well suited to your needs. And lest you want to cry "Ye immortal gods! where in the world are we?", spend a few hours becoming moderately proficient with each. Each works somewhat differently, most importantly in respect to how you broaden or narrow a search.

Finding the appropriate search engine for your particular information need can be frustrating. To use these search engines effectively, it is important to understand what they are, how they work, and how they differ. For example, while using a meta search engine, remember that each engine has its own methods of displaying and ranking results. Remember, search strategies affect the results: if the user is unaware of basic search strategies, results may be spotty.

Quoting Charlie Morris (the former editor of The Web developer's journal) - "Search engines and directories survive, and indeed flourish, because they're all we've got. If you want to use the wealth of information that is the Web, you've got to be able to find what you want, and search engines and directories are the only way to do that. Getting good search results is a matter of chance. Depending on what you're searching for, you may get a meaty list of good resources, or you may get page after page of irrelevant drivel. By laboriously refining your search, and using several different search engines and directories (and especially by using appropriate specialty directories), you can usually find what you need in the end."

Search engines are very useful, no doubt. From getting a quick view of a topic to finding expert contact info, certain issues verily lie in their lap. The very reason we bother about these search engines so much is because they're all we've got! Though there sure is a lot of room for improvement, the need of the hour is to not get caught in the middle of the road. By simply understanding what, how and where to seek, you'd spare yourself the fate of chanting that old Jewish proverb: "If God lived on earth, people would break his windows."

Happy searching!

Liji is a PostGraduate in Software Science, with a flair for writing on anything under the sun. She puts her dexterity to work, writing technical articles in her areas of interest which include Internet programming, web design and development, ecommerce and other related issues.