In the early days, search engines like Google discovered content on the web and then ranked it using very primitive algorithms.
For the most part, a web page's ranking depended on the keywords it used.
For instance, if you had a web page about SEO and you used the phrase "search engine optimization" 20 times on that page, the search engine would conclude that the page must be about SEO and rank it accordingly.
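That kind of frequency-based ranking can be sketched in a few lines. This is only an illustration of the general idea described above; the function names, the raw count as a score, and the ranking rule are all assumptions for the example, not any real search engine's method.

```python
import re


def keyword_score(page_text: str, phrase: str) -> int:
    """Count how often a keyword phrase appears in a page's text.

    Early keyword-based ranking boiled down to frequency counts
    like this one (illustrative only, not a real engine's API).
    """
    # Normalize whitespace and case so "SEO\n tips" matches "seo tips".
    text = re.sub(r"\s+", " ", page_text.lower())
    return text.count(phrase.lower())


def rank_pages(pages: dict[str, str], phrase: str) -> list[str]:
    """Return page names ordered by descending keyword frequency."""
    return sorted(pages, key=lambda name: keyword_score(pages[name], phrase),
                  reverse=True)
```

A page that repeats the phrase 20 times simply outranks a page that uses it once, which is exactly the weakness the article goes on to describe.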
Of course, such a primitive algorithm was open to manipulation. Website owners who figured this out soon started stuffing their web pages with keywords. Some even went as far as hiding invisible keywords on their pages!
As a result, the quality of search results declined rapidly. The search engines quickly changed course and invested in technology that identifies web pages based on context. Almost overnight, many of those keyword-stuffed pages disappeared from the search results.
There was, however, still one sticky problem: the search indexes were still full of similar content, more commonly known as duplicate content.
Duplicate content is bad for search engines because the same pages appear in the index over and over again, making it hard for users to find what they are looking for quickly. It also encourages more and more website owners to take advantage of the situation by scraping content, or creating derivative content, and posting it on their own blogs or websites.
To keep the problem from getting out of hand, the search engines knew they had to evolve further. And they did. They soon developed algorithms that detect websites filled with duplicate content, and, as before, many of those sites disappeared from the search indexes.
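One well-known way to detect near-duplicate pages is word "shingling" with Jaccard similarity. The sketch below shows that standard technique; it is an assumption for illustration only, since the article does not say which method the search engines actually use, and the 0.8 threshold is arbitrary.

```python
def shingles(text: str, k: int = 3) -> set[tuple[str, ...]]:
    """Break text into overlapping k-word 'shingles'.

    Shingling is a textbook near-duplicate detection technique;
    real engines use far more sophisticated, undisclosed variants.
    """
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}


def jaccard(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of two documents' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)


def is_near_duplicate(a: str, b: str, threshold: float = 0.8) -> bool:
    """Flag a pair of pages as near-duplicates (threshold is arbitrary)."""
    return jaccard(a, b) >= threshold
```

Two pages that share almost all of their word sequences score close to 1.0 and get flagged, while a genuinely rewritten article scores much lower, which is why scraped and lightly spun copies are so easy to catch.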
So what remains in the search index now?
The answer is intuitive: websites with original, high-quality content remain at the top of the search indexes. And because so many low-quality sites have dropped out of the indexes, these original sites are enjoying even more traffic!
So if you want your websites to rank well in the search results, always make sure that over 90% of the content is unique and original. What does this mean?
1) The ideas are generated by you or other human writers.
2) No scraped or derivative content is allowed.
3) If you are writing from your own mind, you should be doing fine.
Every now and then, you may have guest writers posting non-original content on your site. That is fine, as long as you agree that the content is of high quality and benefits your readers. Since such content is submitted by guests, it should make up no more than 10% of your overall content. The rest should still be unique and original.
Source: Free Articles from ArticlesFactory.com
ABOUT THE AUTHOR
Webhostingpad is the best place to host your personal and business websites. You can also use the latest Webhostingpad coupons for an extra discount; TWHR25 is the best Webhostingpad coupon for the biggest discount.
By: Gen Wright