SEO (Search Engine Optimization) Resource

SEO stands for search engine optimization. Approximately 70% of website traffic arrives through search engines. SEO is the art of online marketing that helps websites rank highly in search results. The purpose of this blog is to provide the important tools, articles and information related to SEO. Go4outsourcing.com is an online services marketplace that allows you to find a professional service provider for your SEO needs.

Optimizing a website so that it ranks highly in the search engines when someone searches for specific phrases related to the site is referred to as search engine optimization (SEO). Some rules in SEO never change, whereas the rules for what makes a site search engine friendly are constantly evolving. This blog provides important articles, tools and tricks that will help you optimize your website in order to increase traffic and profit. The information and tools provided are free to use, and there are useful articles written by top SEO consultants who work with various search engine optimization companies.



Discuss Search Engine Marketing with other professionals



Monday, January 09, 2006

A Comprehensive Beginner's Guide to Content Writing for Search Engine Optimization

Search Engine Copywriting

Copywriting is the art of writing information for various forms of media. It has evolved over time from its early forms in books, magazines and newspapers, and has now carried its practices into the internet era. Copywriting as practised in the search engine optimization business is also known as internet content writing or web content writing, among other terms. These terms are used interchangeably throughout this text.

This article covers the basics of copywriting and its application to SEO. It aims to give beginners in the search engine optimization industry an in-depth but friendly guide to SEO content writing, while reminding more advanced copywriters of the tricks of the craft they may have forgotten.

This guide is divided into the three parts of the copywriting process: before, during, and after. This first part deals with the things a copywriter must do BEFORE writing the copy. Succeeding parts will be posted separately.

Before Writing

Before doing any writing you should first know why you are writing that content. Your purpose should be clear and definite so that no ambiguity confuses your readers. Is the writing for sports? Is it for entertainment? Is it educational? These things should be clear in your mind before you write your copy, so that a natural flow exists as you write.

Another thing to consider is knowing whom you are writing for and to whom you wish to convey the message. Knowledge of your audience gives you many benefits: people from different cultures respond to different approaches, technical terms are wasted on beginners, while spelled-out and fully explained details waste the time of experts. The internet is used by a vast network of people, and your target may comprise only a very small minority of them. It is important that you address your target effectively if you want more conversions (turning site visitors into customers) on your website.

About the resources

Knowing the right information will certainly give you the right results. Knowing what people want and what they are searching for is one of the keys to making it big in this business. Case studies, surveys and polls found all over the internet can help you acquire this information. Most of these studies provide general demographic information about internet users. If you are lucky, you might even stumble upon information about the searching habits of different demographics, although such data is hard to come by.

Once you have decided to use particular information from the internet, make sure that it comes from a reliable author or source. Incorrect and inaccurate data proliferates all over the internet and you can easily be misled into using it, so see to it that the articles or studies you are about to use were produced by recognized educational institutions or known private companies, so that you will have no problems with their authenticity.

Another effective source of information is pages that rank highly in the search engines, especially those related to yours. Analyze and learn from the effective things they have done to increase their PageRank and apply them to your work. You could also check out the pages of your top competitors; you might learn a lot from them, but be careful not to copy their material outright, since they will be constantly checking out their competition. Copyright enforcement is finally catching up with those who replicate content, who end up blacklisted by the major search engines.

SEO forums are also helpful in keeping up with the latest trends in the search engine optimization business. Experts gather in these forums to discuss the tricks and trends of the business. Moreover, news about search engine algorithms and technology tends to appear in these forums first, so it is highly advisable to check them out. However, the forums may be a little too complicated for beginners, as the discussion often becomes too technical even for seasoned users.

About the words

Now let us get down to business! It is time to choose the keywords and keyphrases you will use for your copy. These are the words and phrases you will integrate throughout the whole copy; they are the bait you place on the hook in order to attract, and hopefully catch, your potential customers.

First of all, you and your client should brainstorm together (face to face if possible) about the keywords and keyphrases you want to use for the copy. It is important that you brainstorm together so that you stay true to the brand and make effective choices for the search engine optimization effort. You can make use of the various keyword tools found on the web, such as Keyword Discovery, GoodKeywords, WordTracker, Overture, etc. (issues regarding their usability and effectiveness will be discussed separately). These tools can be downloaded or used directly over the internet, should you choose to use them.

In choosing keyphrases or keywords, remember to start with popular but “not-so-competitive” terms, since it is very difficult to compete with more established websites if you are just starting out. The above-mentioned tools will help you determine which keywords or phrases you could use.

One-word keywords are very difficult, if not impossible, to compete on, as they have a much more general scope than keyphrases. For example, if you are writing content for a company selling educational toys, choosing a keyword like “toys” would be a poor idea, since search engines return around one hundred million results for that keyword, while a keyphrase like “toys for students” or “educational toys” returns only around five million. Roughly speaking, your chance of being the page a searcher lands on is 1 in 100,000,000 under the keyword “toys” but 1 in 5,000,000 under the keyphrase “educational toys”, greatly increasing your chance of being visited. Besides, customers are likely to refine their searches anyway, since one-word searches bombard them with far more unwanted information than they need, costing extra time and effort.

Your keywords should specifically target (1) the product or service that you are offering and (2) what people actually type into the search engines when looking for products and services like yours. For example, when writing content for a company selling kilns for bricks, you should not optimize for the keyword “kiln for bricks” if most people actually type “oven for bricks” when looking for such equipment. It is useless to optimize for the term kiln when most people type oven, since few, if any, will be searching for the term kiln.

You should also identify the various words and terms that are closely related to your keywords or keyphrases. Some key terms and keyphrases are so intimately intertwined with others that one group associates them with a particular field while another associates them with something else. One good example is cosmetic surgery. Cosmetic surgery is a medical procedure, so it can be regarded as related to medicine and surgery, while it is equally correct to say that it is related to cosmetics and beauty. Since medicine, surgery, cosmetics and beauty are all popular fields, optimizing for both the cosmetic and the surgical aspects of the keyphrase cosmetic surgery will bring keyword hits from searchers on both sides of the spectrum.

Another thing to consider is integrating local terminologies or equivalents of your products or services when optimizing your keywords or phrases. An “elevator” in the US is a “lift” in the UK, a “truck” in the US is a “lorry” in the UK, and the list goes on. When selling products or services to a large, demographically diverse audience, you should optimize for all the relevant groups, as each tends to search for the more familiar local term. Better yet, you could create different sites for different demographic groups, replacing particular keywords and phrases, enabling you to cater to all of them.

Moreover, it is wise to consider adding regional information or regional keywords and phrases. Integrating regional information along with keywords and keyphrases enables users who prefer more specific searches to find your website, and you also benefit from the limited competition on those more specific searches. Most people looking for products and services on the internet prefer to find what they need locally, so adding regional information will definitely help you and your potential client. Another benefit is that the regional information acts as an extra keyword added to your existing keyword or keyphrase. For example, instead of having just “plumbing services”, add “Atlanta” before “plumbing services”. This gives you an edge, as it greatly reduces your competition.

About the content

Now that you have the keywords and phrases you need, it is time to plan the general thrust of the content: what the content should be like.

Generally, the main idea of writing content is for it to be able to provide useful information for visitors in your site. You are primarily writing for the readers, the human visitors of your site, and about the products and services that you have to offer. Secondary to that idea is to provide the search engines information so they could properly and accurately index your site according to its proper category, so anyone who wants to look for something in particular, through the use of search engines, would eventually find what he needs. In other words, your content should be both customer-oriented and search engine friendly.

In order to do that, you need to plan your copywriting properly. The whole text should give readers what they need and want to know about your products and services. Hence, it is highly advisable that you read a lot about the product or service in question before you write the actual copy. The goal is to become extremely knowledgeable about the product, so you can explore all the possibilities, play to its strengths and weaknesses, and write everything that is needed.

One important thing to remember is to write content that is unique. Copying content is not only plagiarism and cheating, it is also a serious offense that can bring painful penalties under existing copyright laws. More and more intellectual property rights watchdogs are reporting cases of content theft and have gained ground over the years. Major search engines now penalize sites that illegally acquire content from other sources; penalties include permanently blacklisting sites, a sort of permanent do-not-visit list for crawlers. Lawsuits and cases about web content are increasing day by day, with more countries enacting laws on intellectual property rights. The risks are simply too great if you plagiarize or copy content, so make sure you quote or add endnotes when you choose to use parts of someone else's content.

And lastly, your content should be written in plain, simple, and natural language so as not to destroy the natural flow of words as you write. Highly technical words and terms should be reserved for highly technical discussions, and should be discouraged for everyday internet use.

About the mood

You might be wondering what a section about mood is doing in an SEO copywriting article. Well, it certainly has a LOT to do with content writing. The mood of the reader affects the way he views a product or service; if you do not take proper care of the emotional side of your customer with your writing, consider him gone. An individual's mood is affected by many factors. It is primarily internal, but external factors can also affect it significantly, and luckily what that individual reads is one of them.

First of all, you should be ‘in the mood’ for writing. Good copy is mostly written by writers who are either inspired by or enlightened about what they are writing. Content writers should make sure they are in this special mood, because the consequence of the opposite is very bad copy. A reader is also likely to be drawn in by emphatic, superbly written copy, which eventually leads the reader to get what you are offering.

One thing you can do to achieve that is to make an emotional appeal to the reader. Try to use personal pronouns like “you”, “we”, and “us” more often; try to get your visitors as involved as possible. Avoid being too passive, as it prevents you from establishing a connection or a relationship with your target reader.

Keep your readers or customers engaged with your site. Make them think and interact by asking questions or giving riddles or trivia. All of these create an air of friendliness for potential customers, and once you have made them comfortable reading, they are more likely to respond positively to you. As much as possible, let them do all their transactions within your site, and give out all the details about what you are offering so that they know everything they need to know. Making online visitors ask questions or request product or service information offline is too cumbersome for them, so be as accessible as possible.


Liam Anthony, Ghost Writer/Copy Writer, www.seoglobalpro.com
Article Source: http://EzineArticles.com/?expert=Liam_Anthony

Search Engine Optimisation Copywriting – the Top Ten Pitfalls and How to Avoid Them

In the last few years, search engine optimisation copywriting in the UK and around the world has changed beyond recognition, as has the way sites are optimised by their design, coding and links. However, the biggest changes have been with SEO copywriting. Some of the same old mistakes are being made, and with all the changes to the ways search engines rank sites, fresh pitfalls are appearing. This article looks at some of the most common mistakes and omissions in SEO copywriting – and how to avoid them.

1. Too much time on the look, not enough on the content. If, like me, you’re in the business of SEO copywriting, this is a perennial bugbear. The content of your website is more important than its design, and it’s going to be even more key in the future. Search engines rank websites for what’s in them. You’re almost certainly paying your site design people a great deal of money – but you’re wasting it if your copy is an afterthought and few people visit your website. Invest time and money in copywriting. Better still, talk to your copywriter while the site is being designed, rather than ask him or her to fill in the empty spaces afterwards.

2. Lack of keywords. Keyword selection is the most critical single factor in search engine optimisation. Yet all too often businesses ignore it. If you’re a blue chip company it rarely matters – people are going to come to your site anyway. But any small or mid-size company ignores it at their peril. If your site isn’t optimised in the way it’s written (not just in the way it’s coded) then you’re losing out on customers – big time.

3. Optimising keywords that no-one is searching for. Your company may pride itself on its great service, but it’s pointless to optimise ‘great service’ or anything along those lines, as no-one will be searching for it. (In fact it can be positively counter-productive, as some search engines treat ‘service’ as a stop word and mark down accordingly.) You can find a free search engine query tool at www.overture.com, or you can pay for a more detailed and comprehensive one at www.wordtracker.com. These will tell you which terms have been searched for recently and how often.

4. Optimising keywords that everyone is searching for. You need to be specific in what you optimise. If you’re selling jewelry (or ‘jewellery’, as it’s spelled in the UK), then it’s no use simply optimising for the word ‘jewelry’. Be more specific. Even phrases like ‘antique jewelry’ or ‘beaded jewelry’ are searched for many thousands of times a month. Find out what people are searching for and see what you’re up against by going to a couple of search engines and entering those terms. If your competitors are all optimising for a specific term, it’s probably best to avoid it if you can find an alternative that will still bring in the traffic.

Related to this: alternate spellings and endings. Think laterally, think creatively, think how others would spell or term something. Are you going to optimise for ‘jewelry’ or ‘jewellery’ – or both? How about ‘website’ or ‘web site’? Both versions are common. And so on. Don’t try to cover all the bases – but do try to check them against what’s being searched for, how many times, and in what context you’ll find that keyword on the internet. That way you’re more likely to make the best choices.

5. Keyword density. The general rule of thumb is to get your keywords into headings or subheads, and early on in the copy. Two to five occurrences overall on a page, with an absolute maximum of three different keywords per page, is what to aim for. Some pundits recommend a keyword density of up to 5%. This is almost certainly too much, and some search engines will actually penalise you for it.
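To make the arithmetic concrete with a made-up example: keyword density is simply occurrences divided by total words. A 300-word page that uses the keyword ‘jewelry’ six times has a density of 6 ÷ 300 = 2%, comfortably within the range above; the same page would need fifteen occurrences to reach the 5% figure some pundits suggest, which is where copy starts to sound like the clumsy example under the next pitfall.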

6. Clumsy use of keywords. Beware of your copy becoming awkward if you try and repeat your keywords too often:

“If you’re looking for wonderful widgets, this is the best place for wonderful widgets. Our wonderful widgets are better than any other wonderful widgets you’ve heard of…”

Copy like that puts off anyone reading your website. And nowadays, when keywords are crowded in like that, it’s putting off the search engines as well.

7. The amount of text. Opinions vary as to exactly how long a page should be. Your homepage should be no longer than around 250-300 words, but you can easily double that if needs be for other pages. All pages should have clear headings, subheads, and short paragraphs. A page could be as little as 100 words. What it won’t be, if it’s optimised correctly, is a single paragraph of 30-50 words.

8. Missing the extras. Text links within your site and anchor text pointing to it are important elements of search engine optimisation copywriting. Text links between pages in your site make it easier for search engine spiders to travel across the whole site. You should therefore always look to include them within your site, unless your site is too complex for it to be practicable, in which case your site needs a hierarchical structure. Anchor text is the visible text in a hyperlink – as in the following:

“Effective search engine optimisation copywriting is essential for getting the most out of your website.”

Of course, the anchor text that helps your site up the rankings is actually on a hyperlink from an outside site – but good anchor text is text that’s written in the right way, with the correct keyword. So get your copywriter to suggest anchor text with which outside sites can link to yours.
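In HTML terms, anchor text is just the visible text between the opening and closing tags of a link. A minimal illustration, using a placeholder address rather than any real site:

<a href="http://www.example.com/copywriting.html">search engine optimisation copywriting</a>

Here “search engine optimisation copywriting” is the anchor text that the search engines associate with the page being linked to.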

9. Doorway pages that aren’t proper pages. Doorway pages are – or were – simply pages within your site that were optimised so that very often they were the first pages that visitors reached. However, the phrase ‘doorway page’ nowadays tends to refer to a page that has very little to do with a site, but is merely optimised for a couple of key phrases and aims to immediately redirect the visitor to the site proper. There’s nothing wrong with optimising several pages on your site – in fact it’s generally an excellent idea, as it allows you to cover many keywords. Just make sure that each optimised page has original content, is a genuine part of your site, and is shown on your sitemap.

10. Resting on your laurels. This is perhaps the most common failing of all. A properly optimised site should get you up near the top of the rankings. But you’ll need to keep working on it if you want to stay there. Every day around 7 million items – documents, pages whatever – are added to the internet. Your competitors are going to be choosing keywords and optimising websites of their own. One way to develop and keep high rankings is with relevant links. Another is by adding original content, such as articles or newsletters – so keep your copywriter busy.


Peter Wise is an advertising copywriter, website copywriter and SEO copywriter based in London, UK. He also writes direct mail, brochures, newsletter articles and press releases. You can reach him at +44 (0) 7767 687524. For further information, please visit http://ideaswise.com/.
Article Source:
http://EzineArticles.com/?expert=Pete_Wise

Using Free-Reprint Articles as a One-Way Link Back Strategy

UNDERSTANDING THE IMPORTANCE & CHALLENGE OF GETTING INBOUND LINKS

With the advent of Google's Link Popularity algorithms, people began to aggressively hunt for the almighty inbound links.

It is hard to get links back to one's site using freebie strategies unless you want to spend your days posting comments on guestbooks and setting up your own doorway pages on alternate sites.

Google made this a bit tougher. Google started using a second algorithm called PageRank. PageRank is a system designed to value the importance of a page in the Link Popularity calculations. No longer does the sheer number of links to a site matter. Instead, one must look at the quality of the websites that link to their site.

This is the reason why posting comments on guestbooks barely gets noticed. These are not quality links.

In this age of multiple domains on a single webserver, Google has taken this into account too. If one has a hundred domains on the same server, pointing back and forth to each other, this is of no real value either as far as Google is concerned.

What is important to them is when a site that has been given a PageRank of at least 4 points to another site that does not reside on the same IP address or webserver. As far as Google is concerned, this is usually an unrelated person pointing to a valuable resource. That is why this type of link has a better overall value to Google.

Anything else is looked upon as an attempt to skew the search results without good cause.

Google realizes that we all have the need to promote ourselves, and that we will do so with our own selfish interests at heart. Our own selfish interests do not necessarily reflect the interests of Google's search users. So, Google works hard to weed out our selfish attempts at shoddy self-promotion.

THE CONCEPT OF RECIPROCAL LINKING

When webmasters realized the complexities of the Google ranking systems, they began to understand that they needed to go off of their own server to find those all-important inbound links.

Always looking for the easy way, many webmasters turned to reciprocal linking strategies. Those who are selling these concepts are doing so on the premise that it is an easy and inexpensive way to build inbound links.

To get the reciprocal link, you are supposed to go to the search engines and find a site that looks like it has content similar to yours, but not a direct competitor.

Next, you should check to see that the site you are looking at has a PageRank of 4 or higher.

Then, you are to put a link to that site on your own domain.

Once you have completed the above steps, you are to contact the webmaster who owns the site you just linked to, tell them nicely that you have placed a link to their site, and ask whether they would be kind enough to put your link on their website.

TIP: My site clearly states that "I do not do reciprocal linking, so do not ask." Believe it or not, I get half a dozen requests every week for reciprocal links. Not only do I know that they did not read my policy on reciprocal links, which is linked to from my contact page, I also know that they did not look at my site when they tell me that my site is related to theirs. ;-)

Read the site policies at the web sites you visit, and you might be able to save yourself some time.

A week after you have done your linking campaign, you must go back to the sites you contacted to see if your link is on their site. If it is not, then you take their link off of your site.

THE LIE OF RECIPROCAL LINKING STRATEGIES

The people who sold you on reciprocal linking policies have usually done so in order to sell you an ebook or their services. Often, they will tell you that it is a simple and inexpensive way to improve your inbound links and link popularity.

They are selling a lie, and let me tell you why.

Most sites that will agree to exchange links with you will do so only if they have a PageRank of less than 4, or if your PageRank is equal to or higher than theirs.

If your site or their site has a PageRank of less than 4, then the other has not really gained any real value from the exchange.

It takes a lot of time to find people who will link to you, and it takes even more time to validate that the link remains active.

Google has caught onto the reciprocal linking schemes too and has started to penalize sites that provide two-way links to each other, although the penalty is not as drastic as other penalties can be.

I still have a few two-way links on my own domains, but that is fine, as they are not for the benefit of the search engines, but for the benefit of my visitors.

Personally, my time is much too valuable to play the reciprocal linking game. That is something that you should consider as well.

How long does it take you to score one reciprocal link of any real value? How much is your time worth to you? How many dollars do those link exchanges add to your bottom line?

In the end, it is about money, and how much money you are generating for the time spent. After all, your time is worth something too, right?

Treat yourself right. How much is your time worth? $10 an hour? $20 an hour? $50 an hour? Now, calculate how many hours you have spent on getting one reciprocal link that actually sent you one visitor.

Let's suppose that your time is worth $20 an hour. And let us further suppose that it takes you 3 hours to get one quality link to your website. That link has cost you $60. Now how much traffic does that link send you?

If your reciprocal linking efforts cost more than they return, then they are a foolhardy adventure.

THERE IS A BETTER WAY

You should not be wasting your time generating reciprocal links. Instead, you should be expending your time and resources generating one-way inbound links.

Imagine this. Suppose you could send out an advertisement for your business that had a link back to your website included in it. And suppose you would not have to beg or pay big bucks to get your ad published in ezines or on websites. And then suppose that the publishers and webmasters who saw your ad would be clamoring to put it into their ezines or on their websites, at no additional cost to you.

Can you imagine that being possible?

Well, it is possible. And it happens everyday.

Consider this. You are reading this article right now because you are hoping that I can teach you something about how to make your business more profitable.

And, you are reading this article in an ezine or on a website right now.

Here is the deal. I am attempting to teach you how to do something or about something. It does not matter what I am teaching, so long as the topic of the article appeals to your interests right now, and it is of interest to the people who, like you, are most likely to visit my site and buy my services.

The ezine publisher or the webmaster read this article and felt that it could be helpful to you and your goals. So, they published it and made it available to you for your review.

When you reach the end of this article, you will see a nine line by 65 character wide advertisement for my own business. In short, it is called the "Resource Box" or the "About the Author Information".

If I did my job well by attracting the interest of publishers and webmasters and selling them on the idea of publishing my article, and then attracted your attention to this article, I will see the benefits of this endeavor.

If at the conclusion of this article, you feel educated or entertained, then you will be more likely to read my Resource Box and visit my website. Perhaps, you might even decide that you would like to use my services to promote your own business. ;-)

If the publisher or webmaster who printed this article is following the rules of publication, then you will be able to click on the link to my website in the Resource Box.

Please note that I have not included a single link in the body of the article. Links inside an article should be directed only to third-party websites that provide a resource that will support the context of your article.

For example, my favorite location to be published is: http://www.YourMembership.net It is one of the few ezines I read weekly, and boy howdy, I am tickled every time they publish an article of mine. Especially when you stop and realize that their subscriber base consists of more than 700,000 web marketers and that it costs over $2800 to buy a full-run ad there. I was published there again in April of 2005. ;-)

YOU CAN DO THIS TOO

I tell people all of the time that they can write articles too. Many don't believe me, but it is true. Everyone has something that they can teach to someone else.

If you honestly feel that you cannot write your own articles, then pay someone to write them for you. If you can write your own basic and specialized knowledge down on paper, you can hire others to edit it into an article that will be published.

When you have an article of good quality, you can either submit it yourself to publishers and webmasters as a free-reprint article, or you can pay someone who specializes in that activity to do it for you.

IN THE END...

In the end, you will have your own articles in circulation, published in ezines and on websites. Ezines can send you a sudden flood of traffic, and websites can deliver the all-important one-way inbound links that you need to grow your link popularity and bring potential customers to your website.

And guess what else? Once your article gets into circulation as a free-reprint article, it can continue to be republished for years to come, generating new traffic and sales for years to come.

Contrary to the results generated by reciprocal linking, free-reprint articles will actually permit you to earn more money from your promotions than it costs you to run them.


ABOUT THE AUTHOR:

Bill Platt is the owner of http://www.LinksAndTraffic.com
When you are tired of the struggle of the link building process, it might be time to consider our "Links And Traffic" services.
When you are ready to employ more Natural Linking Strategies in increasing your link popularity, "Links And Traffic" can help.
When you are ready for your links to actually generate click-through traffic, we are here.
This is not a link rental system or a reciprocal linking scam. We Guarantee our results.
Article Source:
http://EzineArticles.com/?expert=Bill_Platt

What's RSS and How Can It Help You

RSS is a very important tool that lets you easily syndicate the content on your website to readers all over the world. It can help you drive visitors to your website, increase ad revenue, expand your website's reach, and move to the top of the search engines.


First of all, what is RSS? RSS was developed in 1997, and originally stood as an acronym for Rich Site Summary, although it is now known as Really Simple Syndication. It is coded in XML format, and allows you to publish the information on your website in a format that usually includes a headline, a short description of the article, and then a link to the article on your website.


There are a few formats in which your RSS feed can be published. People who use an RSS reader can subscribe to your feed through a software program on their computer. They will be notified when there is a new article; they can read the headline and a short description, and then click the link to your site to read the rest. This can also be done online, and is supported on websites such as My Yahoo, Google, MSN, and AOL, just to name a few. Anybody with a website can also publish your content by using an RSS feed aggregator. An aggregator can be implemented in PHP or JavaScript, which are two of the most popular options. The headline and short description are then displayed on that website, and their visitors click through to yours.


RSS is coded in XML. But do not worry, you do not need to be familiar with XML in order to publish an RSS feed. You simply need to look at a template, and then base your own feed on it. On XML.com there is a sample RSS feed which you can use: http://www.xml.com/pub/a/2002/12/18/dive-into-xml.html.
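To give a sense of what such a template looks like, here is a minimal, hypothetical RSS 2.0 feed with a single item; the titles, descriptions and example.com URLs are placeholders you would replace with your own:

<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Site News</title>
    <link>http://www.example.com/</link>
    <description>Latest articles from Example Site</description>
    <item>
      <title>How RSS Can Drive Traffic</title>
      <link>http://www.example.com/articles/rss-traffic.html</link>
      <description>A short summary of the article goes here.</description>
    </item>
  </channel>
</rss>

Each new article becomes another item element, and RSS readers and aggregators pick up the headline, description and link from it.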


So now once you have your RSS feed created, you will need to promote it. The easiest way is through your website. You've probably seen little orange buttons that say RSS or XML on them, and then if you click on them you get a page filled with code. The code that you see is the XML of an RSS feed. You can use these buttons on your website to link to your RSS feed. There are also other buttons that can be used to automatically add your RSS feed to websites such as My Yahoo, Google, etc. You can find most of these buttons at this website: http://www.twistermc.com/shake/RSS-index.php. You can search the web to find out how to properly link these graphics to sites such as My Yahoo.


While linking to your feed from your website will get your regular visitors to subscribe to it, that alone will not reach a larger audience or help people find your website who otherwise wouldn't have. A great way to reach larger audiences is by submitting your feed to RSS directories. There are many RSS directories on the web where people can add their feed for free, and then people looking for feeds can find them quickly and easily. You can search for "RSS directory" on Google and manually submit to each individual directory, which works but can be very tedious. There is a program that I use, however, called RSS Feeds Submit. It will automatically submit your feed to over 80 directories and RSS search engines. It costs $29.95, but if you feel that you can afford it, I'd recommend it to you.


RSS is great, but how can it help you in the search engines? While it would usually take weeks and weeks to get indexed and crawled by Yahoo, there is a secret using RSS that can get you indexed in under 48 hours. It is actually quite simple to do. You must have a Yahoo account, which you probably already have. Sign in, and go to my.yahoo.com. Once you see the My Yahoo page, find and click on "Add Content" right under the search box. On the next page, find and click on "Add RSS by URL." Then type in the URL directly to your RSS feed, so it should probably end in .xml. A page should come up confirming that it was added, and previewing what it looks like. This will add your site to the top of Yahoo's spider's queue. Every time you update your RSS feed, you should ping Yahoo to let them know that it has been updated, and they will recrawl it. You can ping Yahoo by typing the following address into your address bar, replacing URL-TO-YOUR-FEED with the URL you used to add the feed originally: http://api.my.yahoo.com/rss/ping?u=URL-TO-YOUR-FEED. From my own experience, I have learned that frequently updating your feed and pinging Yahoo will result in your site moving up in the rankings.
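For instance, if the feed you added were a hypothetical http://www.example.com/feed.xml, the ping address would read: http://api.my.yahoo.com/rss/ping?u=http://www.example.com/feed.xml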


Now you know the wonderful ways that RSS can benefit your website. I wish you the best of luck in implementing RSS into your website, and hope that you are successful.

------------------------------------------------------------------------------------------------
About the author
David Amherst has created and authored many websites, including:
SEO Resources and Information

6WY Web Directory

Daily Web Hosting News

Affiliate Marketing Tips and Information


What Are Doorway Pages?

Source: http://searchenginewatch.com
Author: Danny Sullivan

Webmasters are sometimes told to submit "bridge" pages or "doorway" pages to search engines to improve their traffic. Doorway pages are created to do well for particular phrases. They are also known as portal pages, jump pages, gateway pages, entry pages and by other names.

Doorway pages are easy to identify in that they have been designed primarily for search engines, not for human beings. This page explains how these pages are delivered technically, and some of the problems they pose.

Low Tech Delivery

There are various ways to deliver doorway pages. The low-tech way is to create and submit a page that is targeted toward a particular phrase. Some people take this a step further and create a page for each phrase and for each search engine.

One problem with this is that these pages tend to be very generic. It's easy for people to copy them, make minor changes, and submit the revised page from their own site in hopes of mimicking any success. Also, the pages may be so similar to each other that they are considered duplicates and automatically excluded by the search engine from its listings.

Another problem is that users don't arrive at the goal page. Say they did a search for "golf clubs," and the doorway page appears. They click through, but that page probably lacks detail about the clubs you sell. To get them to that content, webmasters usually propel visitors forward with a prominent "Click Here" link or with a fast meta refresh command.
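A fast meta refresh is simply an HTML tag in the head of the doorway page that sends the visitor onward almost immediately. A minimal illustration, with a placeholder destination URL:

<meta http-equiv="refresh" content="0; url=http://www.example.com/golf-clubs.html">

The 0 means the browser redirects after zero seconds, which is exactly the behaviour some search engines now reject.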

By the way, this gap between the entry and the goal page is where the names "bridge pages" and "jump pages" come from. These pages either "bridge" or "jump" visitors across the gap.

Some search engines no longer accept pages using fast meta refresh, to curb abuses of doorway pages. To get around that, some webmasters submit a page, then swap it on the server with the "real" page once a position has been achieved.

This is "code-swapping," which is also sometimes done to keep others from learning exactly how the page ranked well. It's also called "bait-and-switch." The downside is that a search engine may revisit at any time, and if it indexes the "real" page, the position may drop.

Another note here: simply taking the meta tags from a page ("meta jacking," as Infoseek calls it) does not guarantee a page will do well. In fact, sometimes resubmitting the exact page from another location does not gain the same position as the original page.

There are various reasons why this occurs which go beyond this article, but the key thing to understand is that you aren't necessarily finding any "secrets" by viewing source code, nor are you necessarily giving any away.

Agent Delivery

The next step up is to deliver a doorway page that only the search engine sees. Each search engine reports an "agent" name, just as each browser reports a name.

The advantage to agent name delivery is that you can send the search engine to a tailored page yet direct users to the actual content you want them to see. This eliminates the entire "bridge" problem altogether. It also has the added benefit of "cloaking" your code from prying eyes.

Well, not quite. Someone can telnet to your web server and report their agent name as being from a particular search engine. Then they see exactly what you are delivering. Additionally, some search engines may not always report the exact same agent name, specifically to help keep people honest.

IP Delivery / Page Cloaking

Time for one more step up. Instead of delivering by agent name, you can also deliver pages to the search engines by IP address, assuming you've compiled a list of them and maintain it.

Everyone and everything that accesses a site reports an IP address, which is often resolved into a host name. For example, I might come into a site while connected to AOL, which in turn reports an IP of 199.204.222.123 (FYI, that's not real, just an example). The web server may resolve the IP address into an address: ww-tb03.proxy.aol.com, for example.

If you deliver via IP address, you guarantee that only something coming from that exact address sees your page. Another term for this is page cloaking, with the idea that you have cloaked your page from being seen by anyone but the search engine spiders.
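To make the mechanics of agent-name and IP delivery concrete, here is a rough, hypothetical sketch in Python of the server-side decision. The spider names, IP prefixes and file names are invented for illustration, and the snippet is meant only to explain how the technique works, not as a recommended practice:

# Hypothetical sketch: choose which page to serve based on the visitor's
# reported user-agent string or IP address. All names below are invented.
SPIDER_AGENT_SUBSTRINGS = ("googlebot", "slurp")   # example agent-name fragments
SPIDER_IP_PREFIXES = ("192.0.2.",)                 # example prefix (documentation range)

def page_for_request(user_agent, ip_address):
    """Return the optimised page for known spiders, the normal page otherwise."""
    agent = user_agent.lower()
    if any(name in agent for name in SPIDER_AGENT_SUBSTRINGS):
        return "optimised-page.html"
    if any(ip_address.startswith(prefix) for prefix in SPIDER_IP_PREFIXES):
        return "optimised-page.html"
    return "normal-page.html"

# page_for_request("Googlebot/2.1", "192.0.2.44")        -> "optimised-page.html"
# page_for_request("Mozilla/4.0 ...", "199.204.222.123") -> "normal-page.html"

The point of IP delivery, as described above, is that only requests from the maintained list of spider addresses ever see the tailored page, which is why it is harder for outsiders to probe than agent-name delivery.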

Tuesday, January 03, 2006

Best free web tools for SEO

You can click on the link below to access more than 50 of the best freely available web tools for SEO.

http://www.go4outsourcing.com/resources/tools.asp

Feel free to link to our resource center and tools page.

How To Avoid Search Engine Spamming?

Search engine spamming, also known as spamdexing (spamming plus indexing), is the practice of deliberately manipulating web pages to obtain high search engine rankings. Spamdexing is used to mislead search engines' indexing programs and to gain ranking positions that the pages do not deserve.

Search engine optimisers are always on the lookout for techniques to make their sites rank well. They end up using spam techniques, either knowingly or unknowingly, simply to boost their search engine rankings. Improper use of SEO will sometimes result in sites getting penalised. With Google's pilot program now underway, most webmasters are redefining the SEO methods they have followed so far.

The spam tactics mentioned below can either block search engine robots from crawling your site properly or get your site penalised in certain search engines and directories. Make sure you are well aware of these tactics before designing or optimising your website.

The 10 best-known search engine spam tactics:


1. Hidden keywords

2. Keyword stuffing

3. Use of unrelated keywords

4. Hidden Links

5. Redirects

6. Doorway Pages

7. Unreadable tiny texts

8. Link farms

9. Cloaking

10. Mirror sites

Hidden Keywords

Hidden keywords, also known as invisible text, are the most common form of spam practised on websites. Hidden text or content is not seen by human visitors and is meant only for the search engine spiders. The purpose of using hidden text is to increase the keyword density of the webpage and to trick the search engines into indexing the extra text on the page. Hidden text is implemented through plain HTML and also through Cascading Style Sheets (CSS).

Invisible text through HTML is done by stuffing in keywords whose text colour is the same as the background colour, making them invisible to the visitor's eye. Hidden text through HTML is easily detectable by search engines these days with the help of search engine filters.
Hidden text through CSS is slightly different from the HTML method: in CSS the colour of the text is stored in an external CSS file. Search engines find it difficult to crawl external files, but they are in the process of enhancing their filters to identify CSS spam.
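Purely to illustrate the pattern the filters look for (not something to copy), HTML-based hidden text is typically white text on a white background, while the CSS variant moves the colour into an external stylesheet. The keywords and class name below are placeholders:

<body bgcolor="#ffffff">
  <font color="#ffffff">cheap widgets best widgets buy widgets</font>
</body>

/* in an external .css file */
.hidden { color: #ffffff; }  /* same colour as the page background */

Filters simply compare the text colour with the background colour, which is why the plain HTML version above is now caught routinely.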

Keyword Stuffing

Keyword stuffing is implemented by adding blocks of keywords or keyphrases to the webpage. Keyword stuffing is practised to increase the density of targeted keywords, thereby tricking the search engine robots into considering the page more relevant for the search phrase.

Search engines easily detect keyword stuffing, so it is wise to refrain from it. A site caught stuffing keywords can be blacklisted and banned from the search engine index.

Use of Unrelated Keywords

Using unrelated keywords is the practice of placing keywords on the webpage which are not related to the site's content. This method is followed to trick people searching for such words into clicking on the site's link.

Generally people will quickly leave such sites when they do not find the information they were searching for.

Hidden Links

Hidden links are used to increase the link popularity of a site. These links are hidden from visitors and are used to fool the search engines. They are usually in the form of small hyperlinked dots.

Redirects

Redirection is the process of taking the user to another page without his or her intervention, by using META refresh tags, CGI scripts, Java, JavaScript or server-side redirect techniques.

Doorway Pages

Doorway pages are low quality web pages that contain very little content, stuffed with targeted keywords and keyphrases. Doorway pages are designed to rank highly within the search results. A doorway page will generally have "click here to enter" in the middle of it.

Unreadable Tiny Text

Tiny text spam consists of placing keywords and phrases in the tiniest text imaginable all over the page. Most people can't see it, but spiders can, and the search engines will eventually ban such sites.

Link Farms

Link farms are webpages created solely for search engine ranking purposes that consist of long lists of unrelated links.

These types of pages and sites are penalised by most search engines.

Cloaking

Cloaking is the practice of serving dynamically generated keyword-rich content to the search engine robots while providing different content to actual visitors. With cloaking, the user is not able to see the code of the page shown to the search engines.

Most search engines have devised methods to detect cloaking and have banned sites which engage in it.

Mirror Sites

Mirror sites are identical sites hosted on different domains or website addresses. By mirroring sites one can build hundreds of duplicate pages with entirely different URLs. The intention behind mirror sites is to quickly multiply a site's pages and to increase inbound links.

Large websites often provide mirror sites in different countries so that their users can download from a closer server in hopes of achieving faster download speeds. While this is a perfectly legitimate practice, many search engines consider it “spam” because the number of pages is doubled, the content being identical.

Most search engines do not appreciate duplication of sites, but some of them are still vulnerable to this technique.

How to avoid spam?

Search engine marketers are pushed into spam techniques for many reasons: branding restrictions, site navigation demands, pressure for an immediate sales increase, style guidelines and more. With thousands of SEO providers out there, the competition for high search engine rankings has become intense. For many of these search engine optimisation providers quality is a secondary issue, and empty promises are made to win the trust of clients.

Search engines value content-rich sites, and that is the real key to success. The resources needed for good search engine optimisation are developers, content writers and skilled SEOs. In many cases a company cannot afford these resources, and the result is search engine spam. Unethical SEO, or search engine spam, originates from a mixture of greedy website owners and search engine spammers.

Each search engine has its own distinctive spam penalties. Google greys out the PageRank of a penalised site, leaving it with a PageRank of zero. Sometimes the entire site is banned from the search engines. Once you are banned, you may have to apologise and resubmit your site for re-inclusion, and it can take a couple of months before you see your site listed again in the search engines.

The only way to avoid this is to hire a good SEO who does not promise top-5 search engine rankings, but instead optimises your site with well-written content and usable navigation. With good optimisation practice you will eventually find your site listed in the top search engine positions, gaining long-lasting quality traffic.


Prema S is a Search Engine Optimisation Executive for UK-based internet marketing company Star Internet Ltd. Clients of Star Internet benefit from a range of services designed to maximise ROI from internet marketing activities. To find out more, visit http://www.affordable-seo.co.uk


Article Source: http://EzineArticles.com/?expert=Prema_Sunder


Wednesday, December 28, 2005

What is a Robots.txt File?

Search engines look at millions of web pages to come up with search results. They do this with what we call "search engine spiders." This makes sense - spiders crawling around on the Web. But another word for them is "robots" because they are simply unmanned programs gathering data automatically. I can't help but picture them as the characters in the new animated movie "Robots."

In the beginning, these robots spidered every page, every file, attached to the Web. This caused problems for both the search engines and the people using them. Pages that really aren't worth looking at, such as, say, header files to be included in all pages on a site, were being spidered and showed up in search results. Have you ever searched on Google and gotten a partial page as a result?

The solution was for Google and other search engines to begin looking for a robots.txt file in the root folder of each site (http://www.mydomain.com/robots.txt) to determine what should and shouldn't be searched. This is called the "Robots Exclusion Standard." This simple text file, created with Notepad or another simple text editor, gives you complete control by telling the robots not to spider certain folders in your site. The result is happier visitors who come to your site from search engines and get only the full pages you want them to see, not the partial, test or script pages you don't want them to see.

Let's look at some examples to get started:

This allows all spiders to spider all pages on your site. The * is a wildcard that means “all spiders.”

User-agent: *
Disallow:

This is the opposite of the above example. This one tells all spiders to NOT spider your whole site. You might want this if you have a test site, for example, that is not live yet.

User-agent: *
Disallow: /

This example tells all robots to stay out of the cgi-bin and images folders.

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/

This example tells only the WebFerret robot to not spider the page ferret.htm. It’s only an example. I have nothing against WebFerret. The user agent code for Google is googlebot.

User-agent: WebFerret
Disallow: /ferret.htm

It is important that the file is a simple text file – do not use Microsoft Word to create it. And be careful of how you type – it must look exactly like the above examples, with caps only for the first letter, just the right spacing, etc. A poorly done robots.txt file could harm your site more than help it. For a cool online robots.txt file validator, go to http://www.searchengineworld.com/cgi-bin/robotcheck.cgi.

As an e-commerce consultant for over three years, and Web designer for over ten, Chuck Lasker has been helping individuals and organizations utilize the Internet in almost every arena. Chuck's e-newsletter and blog, The MerchantHowTo.com Report, at http://www.MerchantHowTo.com, is free and popular amongst e-store owners.

Article Source: http://EzineArticles.com/?expert=Chuck_Lasker


How to Prevent Duplicate Content with Effective Use of the Robots.txt and Robots Meta Tag

Duplicate content is one of the problems that we regularly come across as part of the search engine optimization services we offer. If the search engines determine that your site contains duplicated content, this may result in penalties and even exclusion from their indexes. Fortunately it’s a problem that is easily rectified.

Your primary weapon of choice against duplicate content can be found within “The Robot Exclusion Protocol” which has now been adopted by all the major search engines.

There are two ways to control how the search engine spiders index your site.

1. The Robot Exclusion File or “robots.txt” and

2. The Robots <meta> Tag

The Robots Exclusion File (Robots.txt)

This is a simple text file that can be created in Notepad. Once created you must upload the file into the root directory of your website e.g. www.yourwebsite.com/robots.txt. Before a search engine spider indexes your website they look for this file which tells them exactly how to index your site’s content.

The use of the robots.txt file is most suited to static html sites or for excluding certain files in dynamic sites. If the majority of your site is dynamically created then consider using the Robots meta Tag.

Creating your robots.txt file

Example 1 Scenario

If you want to make the file applicable to all search engine spiders and make the entire site available for indexing, the robots.txt file would look like this:

User-agent: *

Disallow:

Explanation

The use of the asterisk with the “User-agent” means this robots.txt file applies to all search engine spiders. By leaving the “Disallow” blank all parts of the site are suitable for indexing.

Example 2 Scenario

If you want to make the file applicable to all search engine spiders and stop them from indexing the faq, cgi-bin and images directories as well as a specific page called faqs.html contained within the root directory, the robots.txt file would look like this:

User-agent: *

Disallow: /faq/

Disallow: /cgi-bin/

Disallow: /images/

Disallow: /faqs.html

Explanation

The use of the asterisk with the “User-agent” means this robots.txt file applies to all search engine spiders. Preventing access to the directories is achieved by naming them, and the specific page is referenced directly. The named files & directories will now not be indexed by any search engine spiders.

Example 3 Scenario

If you want to make the file applicable only to the Google spider, googlebot, and stop it from indexing the faq, cgi-bin and images directories as well as a specific html page called faqs.html contained within the root directory, the robots.txt file would look like this:

User-agent: googlebot

Disallow: /faq/

Disallow: /cgi-bin/

Disallow: /images/

Disallow: /faqs.html

Explanation

By naming the particular search spider in the “User-agent” you prevent it from indexing the content you specify. Preventing access to the directories is achieved by simply naming them, and the specific page is referenced directly. The named files & directories will not be indexed by Google.

That’s all there is to it!

As mentioned earlier, the robots.txt file can be difficult to implement in the case of dynamic sites, and in that case it is probably necessary to use a combination of the robots.txt file and the robots meta tag.

The Robots <meta> Tag

This alternative way of telling the search engines what to do with site content appears in the <head> section of a web page. A simple example would be as follows:

<meta name="robots" content="noindex,nofollow">

In this example we are telling all search engines not to index the page or to follow any of the links contained within the page.

In this second example I don’t want Google to cache the page, because the site contains time-sensitive information. This can be achieved simply by adding the “noarchive” directive:

<meta name="googlebot" content="noarchive">

What could be simpler!

Although there are other ways of preventing duplicate content from appearing in the search engines, this is the simplest to implement, and every website should use a robots.txt file, a robots meta tag, or a combination of the two.

Should you require further information about our search engine marketing or optimization services please visit us at http://www.e-prominence.co.uk – The search marketing company

Article Source: http://EzineArticles.com/?expert=Andrew_Allfrey


Tuesday, December 27, 2005

Stuck In the Google Sandbox? The Sandbox Solution

What is the Google Sandbox?

It’s the mysterious, possibly non-existent web purgatory where millions of sites languish without rankings, visibility, or traffic.

How Do You Know if You’re Stuck in the Sandbox?

Here’s a rough set of criteria for you.


  • Your site is indexed and appears with the proper title, snippet, and url (www or no www, whichever you picked) – type site:www.yoursite.com into Google to check this.
  • Your site has PageRank (use the google toolbar, or nichebot.com to find this)
  • It is regularly crawled, and the cache dates are newer than 10 days
  • On your keyword, you rank in the top 20 for allinanchor, allintext, and allintitle. To check this, type, e.g., allinanchor:your keyword and see if you’re in the top 20 (example queries follow this list).
  • You do not rank within the first 1000 places for the keyword the site was designed for
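For example, if the keyword phrase the site was built around were blue widgets (a made-up phrase for illustration), the checks would be:

site:www.yoursite.com
allinanchor: blue widgets
allintext: blue widgets
allintitle: blue widgets
blue widgets

Appearing in the top 20 for the three allin queries but nowhere in the first 1000 results for the plain query fits the sandbox profile described above.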

Does the Sandbox Really Exist, or is it Just the Google Algorithm?

This is a big controversy, and everyone has a different opinion.

Don’t listen to guys who handle blue-chip companies - they optimize older, high-PR sites. It’s your everyday, mainly-new-sites webmaster who knows this problem intimately. In fact, all the big sites need to do is place the keyword in the title and they're on the first page. This gives them an unfair advantage, not unlike what the “media elite” has enjoyed for decades.

Regardless of whether the sandbox is a separate phenomenon from the algorithm, the degree of prejudice against new sites has hurt the quality of Google’s search results. This is especially true for products and topics that are both new and urgent: the bigger sites may not be covering them yet, but searchers still end up there without high-quality answers.

The common wisdom now is that if you’re looking for new websites, go to MSN or Yahoo instead. Neither of these sites is using this kind of filter. Many websites rank in the top 10 (for their targeted keywords) on these two search engines, yet are nowhere to be found in Google.

Why Would Google Do This?

Google frowns upon SEOs who try to overly influence ranking, so they needed to find a way around SEO factors to deliver quality results. So they’d look for signs of SEO in websites, e.g. how consistent the addition of backlinks is, and how repetitive (vs. natural) the anchor text of backlinks is, and they consider the age of the site and its backlinks.

Redesign Penalties

Similarly, websites that have made the mistake of too comprehensively redesigning their look, content, or navigation have been shocked to find that they get penalized for this updating. Google seems to prefer a “frozen in time” or “moving like molasses” kind of internet. But to be fair, this is something that had to be included to beat spammers who were buying old websites and refueling them with keyword spam.

Why Do You Get Sandboxed?

Some sites have gotten out of the sandbox in a week, while others can take up to a year or more. No one knows if any one contributing factor gets you out sooner rather than later. Some point to the age of inbound links, or the frequency with which your site acquires them. Some say that getting too many inbound links too quickly appears artificial, and is flagged as spam. But others argue that Google can’t know how fast a site should acquire links. A website that received national news coverage, for example, could acquire hundreds or thousands of links in a day.

It’s likely that no one outside of Google fully understands how the sandbox works. The problem has been noticed and discussed for nearly 2 years, and no one has given a satisfactory answer. What’s crystal clear is that Google has made it so complex that it cannot be reverse engineered.

How Long Will You Be Making Sand Castles?

The delay seems to vary anywhere from four to eleven months. Since we don’t know exactly what the filter depends upon, or to what degree, it’s likely a different magic combination for every site, and this fits with webmasters’ experience. So keep your head down, develop content, get inbound links, and eventually you’ll get out.

Some suggest that when you come out of the sandbox, you are not fully free. They notice a “rationing” or gradual increase in traffic from Google. In the meantime, older sites may rank better than you, regardless of the quality of their look, feel, and content. Deal with it. Keep your head down and keep working.

Another wrinkle: some webmasters suggest that sandboxing can occur at the page level, not simply at the site level, and that it is the bigger money/traffic keywords that get sandboxed. Again, this could simply be due to the level of competition on that keyword, as the entire site is not sandboxed if you’re getting rankings and traffic from other keywords.

Is There a Way to Trick the Sandbox Filter?

Some webmasters have talked about finding cracks in the algorithm… and they mainly involve backlinks. For a while, there was a lot of linkspam on blogs, but everyone – Google, bloggers, and blog providers – has cracked down on that exploit.

The real sandbox solution is not a trick - unless you define everything done by the SEO-aware as “tricky”. The answer is to grow your content and backlinks naturally over time. Don’t look for the quick buck, the quick ranking, or the easy way out. Go back to basics and build websites that people can use and enjoy. Exchange links with quality websites.

To avoid frustration, if web building is what you do full time, I’d suggest beginning a new site every month or two – eventually, if you’ve worked consistently on all of them, you’ll have one after another emerging from purgatory and flourishing in the rankings.


Since 1999, San Diego SEO Consultant Brian B. Carter, MS, has reached more than 2 million readers online. His most popular site ranks in the top 1% of all major websites. Brian's second book, "How I Made $78,024.44 in Six Months Using the Newest Secrets of AdSense and Overlooked Keywords" will be available in October, 2005 from his website, Ranking High on Search Engines



Article Source: http://EzineArticles.com/?expert=Brian_Carter


Strategies For Submitting Your Website to Web Directories

Webmasters like you and me are always on the lookout for ways to enhance the link popularity of our sites. One method of doing so is to seek out authority sites in your niche – sites that are widely known on the Internet (through the sheer number of back links) and have been around for some time (think more in terms of several years). Unfortunately, such websites often have a Page Rank of 6+, and as such, link exchange or text link ad placement is very, very expensive.

Luckily, there is a cheaper alternative. You can use web directories to not only enhance your search engine visibility (through increased link popularity), but by targeting niche categories and using sponsored listings where necessary, you can get a big jump in your traffic as well.
So, let's get started.

What is a Web Directory?

A web directory is a collection of links broken down into relevant categories. Think Yahoo! and their directory, the Open Directory Project or even the Google Directory (which, incidentally, is pulled from the ODP). At its most basic level, a web directory is a collection of bookmarks made available to the public. In other cases, like Yahoo, it is a professional resource for people actively looking for information.

To get listed in such a directory, you can either get listed for free (which might take a while), or in many cases, pay a one-time fee to have your website reviewed and entered in the directory. One major exception is Yahoo, which charges a recurring fee for its commercial listings, and we'll look at that later.

The Benefits of Being Listed in a Web Directory

In theory, there are two main benefits of being listed in a web directory:

- Increased link-popularity due to a one-way link from a highly-respected resource.
- Increased traffic due to being listed in a directory that is searched by many people every day.

In reality though, these benefits are directly related to how popular the directory is itself, and how much money you have paid for your listing. Of course, if the link is for free, there is nothing to worry about.

But if you are paying for submission, you need to know some very important facts.

Link Popularity
As far as link-popularity is concerned, you need to factor in several variables:
- The Page Rank of the directory
- The Page Rank of the category page on which you are listed
- Where you are listed on the page
- The number of competing websites on that page
- Whether there are other websites in your niche that can offer you the same conditions for the same price or less (very, very important).

The last point is very important from the cost/benefit angle. A web directory, while being a hub itself, is NOT a niche website or an authority site. Even within categories and sub-categories, the lack of valuable content means that web directories are, at the end of the day, link pages and nothing more.

Where directories win out is the fact that they require one-time fees. In contrast, authority sites (or most websites with a Page Rank greater than 5 or 6) tend to use text links as a source of revenue, and thus charge monthly fees. A directory listing then becomes a much better option (but only for link popularity).

More Traffic

Directory listings are also used as traffic-building opportunities: many directories are searchable, so visitors can use them to look for information directly. In theory this is great – you can get lifetime traffic for just a nominal payment – but you should not expect a sudden deluge of traffic from just one directory listing. Here's why:

Most directories, apart from the top twenty or so, are usually used for link-building and not pure searching. This means that while people may use GoGuides or Yahoo for regular searches, you should expect that the smaller directories are mainly for link-popularity, and plan your investment as such.

Many directories offer listings based on an alphabetical ordering, or a first-come, first-served ordering. In both cases, your website has quite a big chance of being lost in the noise.

Directory-search algorithms differ greatly. Some directories, like JoeAnt, base their search on keyword relevancy (which makes it more of an exercise in stuffing your directory listing with keywords rather than making a good website), while others take a more "editorial" approach by factoring in editor ratings. And still, many directories display sponsored listings first, reinforcing the adage that even on the Internet, it's your advertising budget that talks, and not necessarily the quality of your website.

After you factor in the above points, you realize that there are only a handful of web directories where it is a definite benefit to "pay" to be listed. And even then, you cannot rely on just being listed – sponsored listings get much more exposure. But before we discuss these dozen or so web directories, I'd like to tell you how you can make sure your website is accepted.

How To Get Listed – An Overview

Getting listed in a web directory is a function of three things:

Time

It takes a certain amount of time before an editor can review your website and approve (or reject) your request. This can be anywhere from a week to almost never (in huge web directories like Yahoo and Dmoz). You can usually reduce the wait to within a week by using a paid listing option.

Money

Apart from Dmoz, the big directories usually require a nominal payment for your website to be listed. While you can weigh the benefit of such a listing using the previous section, know that there are usually several listing options; the better services (those that give your website more directory visibility) obviously cost more.

Quality

In some cases (very rarely nowadays), directory inclusions can be rejected due to the poor quality of a website. Maybe the editor considered that your website was not 'useful' enough (meaning it had little or no useful or original content), or sometimes, there may be moral issues (although editors are urged to abide by directory guidelines and not personal beliefs). If rejected, you will almost always receive feedback (you might have to ask for it) on how to improve your website.

In earlier days, quality was a big issue. Today, it is still a major concern for top directories like Yahoo, but this is more to separate the truly atrocious from the rest rather than to separate the best from the rest.

Each web directory has its own criteria, but there are two crucial elements to getting listed:

Paid Inclusions – Apart from Dmoz, and some directories where you can sign up to be an editor, the top directories require payment – anywhere from $15 to $299.
Website Quality – By this I don't mean design; I'm talking about having truly useful information – even if your website is a commercial website, simply putting up a bunch of affiliate links will not count as a quality website.

The Big Guns

Yahoo and Dmoz are the two biggest directories on the Internet, and it's only fair that I talk about how to get listed on them individually.

Yahoo

A listing in Yahoo's directory has two direct benefits:

- Google – and perhaps other search engines as well – gives your website added importance if your website is listed in the Yahoo directory.

- Yahoo is the portal of choice for millions of users. This makes your potential target market at least in the hundreds of thousands, even for obscure niches.

To get listed in the Yahoo directory, you have to access Yahoo Directory Submit and work from there. You will be required to open a Yahoo account if you don't have one already. The review process costs $299, and paying it is no guarantee that your website will be listed.

However, if you have a useful website, and follow the guidelines detailed by Yahoo, there is no reason for your website to be rejected.


The Open Directory Project

Dmoz, or the Open Directory Project, is a directory that rivals the reach of Yahoo. Why? Because directories like the Google Directory and many others are powered by the results from Dmoz. This gives a listing in the ODP a very high premium.

However, because a listing in Dmoz is essentially free, there is very little you can do about the time factor. Many websites that are submitted are never indexed, and that happens mainly due to a lack of time.

On the other hand, quality websites that are added into their relevant categories are almost always accepted, so make sure you follow their guidelines.

Instructions for submitting to the Open Directory Project can be found here: http://www.dmoz.org/add.html

Resources

Getting listed in Yahoo and Dmoz is the bare minimum for any website looking to establish itself at the top of its niche. And if you're looking to move beyond the big two and on to second-tier directories, here's a quick list.

Directories
Editorial Note: The author failed to provide links to the directories below so we took the liberty of adding them. Hopefully, we got them right.

Find Web Designers - multiple paid options
Portal Boost Directory - non-profit website – free, commercial website - $15
Around The Web - $15
Index Unlimited - multiple paid options from free to $99.
GoGuides - $39.99 or $69.95
Data Spear - $39.99
This Is Our Year - $19.95
Browse8 - $35
Uncover The Net - Multiple paid options, $39 one-time to $29/month
Rubber Stamped - $25
Joe Ant - $39.99
Best Of The Web - $39.95

Directories of Directories

In addition, there are several directories that are focused completely on directories (you can find similar listings by looking through Yahoo or Dmoz).
Directory Archives
Complete Planet

A directory listing is, in most cases, useful only for the link popularity. In such scenarios, if you can find better deals on authority sites in your niche, then you should go for them. However, a directory listing is cheaper (one-time versus monthly payment), and with the big directories like Yahoo, Dmoz and GoGuides, it can also bring you reasonable traffic.

As always, remember that directory listings form a small part of your overall online marketing strategy. If you don't have the budget for a Yahoo listing, don't sweat it – focus on other forms of marketing, and come back to it when you can afford it. Directory listings are important, but only when you are looking to squeeze every possible drop of search engine placement out of your links and your website.
------------------------------------------------------------------------------------------------
About The Author: Article by Brad Callen. Get your free guide on getting top search engine rankings!
http://www.seoelite.com/7DaysToMassiveWebsiteTraffic.htm

Saturday, December 17, 2005

Beginner SEO Checklist

Every webmaster is concerned about which SEO techniques should be followed to gain maximum search engine ranking and exposure. Wouldn't it be nice to have a concrete checklist of such items? The following is my effort to create a comprehensive system of simple SEO techniques that, with time, will guarantee a top ranking for your site.

1. Select a list of realistic keywords

First, let me explain how the keyword selection process works. The tool I highly recommend is the Keyword Selector Tool from Overture; its most notable feature is the ability to see which keywords or keyword phrases are most popular and bring the largest number of visitors. When selecting your keywords, keep the following in mind:

  • Highly ranked keywords are already taken
A search for SEO as a keyword returns 74,002 as the estimated number of times the keyword SEO was searched last month. SEO Company is second with 14,271. Googling SEO returns 30,000,000 URLs, with SEO Chat and SEO Today ranking first and second. You can imagine the number of Google referrals these two sites get!

  • Target low-medium traffic keywords
Performing the same Overture keyword analysis, I can see that at the bottom of the list are keywords such as web site seo and web seo, with 562 and 536 searches respectively. This tells me that these keywords are not yet competitive and it would make sense to optimize for them.

As you can see, less competitive keywords give you a better chance of ranking and therefore a greater potential of bringing in traffic. So be smart about picking your keywords: avoid highly competitive ones, but also don't settle for the rock bottom.

  • Be specific when selecting keywords
If possible, select keywords that are specific to your site. As an example, select Honolulu SEO company as a keyword phrase rather than SEO company. This will greatly narrow down your site's focus and will contribute to higher search engine relevance.

2. Incorporate keywords in your Title tags

Search engines place high importance on what you have in your Title tags. In fact, it's so important that Google displays each search result under its Title tag; you can see your search query bolded in each item. It's very important that every page on your site contains a title, every title contains a keyword or a keyword phrase, and your content matches the idea behind your Title. More on that last one to follow.
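For example, reusing the hypothetical Honolulu keyword phrase from step 1, a page's title might read:

<title>Honolulu SEO Company - Affordable Search Engine Optimization</title>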

3. Optimize your content for selected keywords

Content is king on the Net and Google prides itself for bringing you the most relevant results. So here's a list of must-do content optimization techniques.

  • Place keywords in H1, H2,... tags
This is the next important item search engines like to take a peek at after your Title tag. All main headers should be enclosed in H1 tags and all sub-headers in H2 tags. The ideal situation is to have the keywords in your Title tag match those in your H1 or H2 tags. H3, H4, and so on are also relevant, but they are not weighted as heavily as their lower-numbered brothers. (A combined markup example follows this list.)

  • Use keywords in your anchor text
When linking to a different page, use a phrase containing your keywords as the anchor text for the link. For best results, I highly recommend using keywords that appear in the Title and Header tags of the page you are linking to.

  • Separate your keywords from the rest of content
A great technique to keep in mind is to have keywords bolded and/or have them in larger font, and/or even a different color. Perhaps employ all three together. Think of how you can do that with your Header tags.

  • Move more important keywords toward the top of the page
Google loves this one. Since most hotspots of a site are located at the top, this tells search engines that your website is in fact about the ideas behind your keywords and not spam. Utilizing this technique together with the one right before it is a pretty powerful SEO move on your part.

  • Interlink pages like crazy
Not only will this technique allow Search Engine robots to index your site properly, it will also contribute to your relevance. Don't forget to keep anchor text and keywords in mind when doing so.
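To tie these points together, here is a minimal markup sketch (the keyword phrase, file name, and wording are hypothetical) showing a keyword phrase in an H1 tag, bolded in the body text, and used as the anchor text of an internal link:

<h1>Honolulu SEO Company</h1>
<p>Our <b>Honolulu SEO company</b> builds sites that rank. Read about our
<a href="honolulu-seo-services.html">Honolulu SEO services</a> to learn more.</p>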

4. Create a robots.txt file

Robots such as Googlebot come to your site, and the first thing they look for is a robots.txt file that contains a list of instructions. Robots.txt should be located in the root directory of your site and, using it, you can specify which pages you want indexed and which ones you don't, which robots you want indexing your content and which ones you don't. The list goes on. In this article I will assume that your goal is to have all bots index all your pages. Open up that Notepad and begin writing some code.

User-agent: *

Disallow:

This code snippet allows all bots to index your entire site. If you want to take full control of robots.txt, I highly recommend reading a full set of instructions at SearchEngine World.

If you don't feel like writing a robots.txt file, you can use the following meta tag:
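One standard form of that tag (assuming pages should be both indexed and followed) is:

<meta name="robots" content="index, follow">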

This tells robots to follow the links from this page to get more pages.

5. Create a Sitemap

Sitemaps are great for letting search engines know where to look for new pages. Make sure that every page is reachable via links on your site and you'll have no trouble getting indexed by robots. Remember that content is food for robots, and they are pretty hungry. If you haven't done so already, you should take advantage of Google's Sitemaps program, where you manually tell Googlebot which pages are indexable and how often they are updated (a minimal sitemap sketch follows), which brings me to my next point.
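A minimal sketch of such a sitemap file, using the XML format the Google Sitemaps program accepted at the time (the URL, date, and change frequency are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2005-12-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>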

6. Keep your content fresh all the time

You will not get indexed if Search Engines stop visiting you and you will not rank highly if you stop getting indexed. Google and other Search Engines love fresh content. This is part of the reason why certain blogs are outperforming major websites. Keeping content fresh can be as easy as incorporating RSS headlines on your pages or as time consuming as adding a blog to a corporate website.

7. Acquire relevant backlinks

To gain importance, you need important sites to link to you. Of course, to have that happen, you need to be useful enough to earn a backlink. In either case, .gov or .edu backlinks are the jackpot: Google boosts you up the rankings in a matter of days, depending on how often Googlebot crawls your site. There are a few black-hat SEO techniques to get .gov and .edu backlinks, but I won't get into them here. This is purely white-hat SEO. If you are interested though, shoot me an e-mail and I'll do my best to educate you.

Anyway, try to get sites in your niche to link to you. They should be optimized for keywords similar to yours, and the links should have anchor text that contains your site's keywords.

8. Get older and wiser

Google naturally ranks older pages higher than their younger competitors. The reasoning behind that is obvious, I think. There are techniques such as purchasing an older domain name, but I don't recommend spending your money on that. Google's current algorithm also monitors the number of times a domain name has changed owners and incorporates that into determining rank. My best recommendation is this:

9. Be patient

When it comes to Google or any other search engine, patience is the key. Factors like the Sandbox play some part in webmasters' irritation. It all boils down to determining how trusted your site is, and backlinks play a major role. In this case, you might acquire a single backlink from a well-ranked site and fly up the rankings, while tens of backlinks from lower-ranked dot-coms will barely wiggle your toes.

Most importantly remember, it takes time for Google to index the Web. Billions of sites are updated daily and they all need to be crawled. A few extra incoming links can shorten the wait period by a lot, so that should be your goal for the next few months while you patiently wait. Now on to the final point.

10. Don't cheat the bots

You worked so hard on building content, optimizing your pages, researching keywords, and reading SEO techniques. Don't go and waste all of this by spamming sites with links (comment spam on blogs), over-stuffing content with keywords (please, people are actually reading you!), or hiding text (blending your keywords into the background just for the bots' sake). I don't want to go into all the black-hat techniques here; you get the idea. The point is that simple misdemeanors like these can get you banned from Google or any other search engine. After spending so much time trying to get in, it would suck to get kicked out, wouldn’t it?

Ignat Drozdov is an SEO working in Washington DC, specializing in new business launches in Europe and Asia. Ignat is also an editor of BlogSEO.

Article Source: http://EzineArticles.com/?expert=Ignat_Drozdov
