Friday, October 06, 2006
Common Search Engine Principles
by: Mike Francis
Search engines are your primary assistants on the internet for answering queries. As everyone knows, a search engine lists the websites best suited to provide information about a given keyword phrase. In the world of internet marketing, every website developer wants their site to appear in the top positions of the search engine results, and so they adopt search engine optimization techniques to tailor their sites to search engine requirements. Before you start optimizing, however, you should understand the common principles of search engines, which describe their basic working pattern.
Search engines are computer programs created to find particular web pages among the millions of websites on the internet. There are several types of search engines: crawler based, human based and hybrid. Crawler based search engines use dedicated programs to carry out the searching process, whereas human based search engines, such as directories like DMOZ, rely on human editors to list pages according to the relevancy of their content. Hybrid search engines may use crawlers for the search, but human editorial intervention is still present. Of the three, programmed (crawler based) search engines are the most common, and their principles are quite different from human searches.
The most common principle of search engines is that their language is HTML: the information they read and return is coded in HTML. The entry point is a page served by the search engine's web server, and that HTML page contains a specific input area, such as a search box, where you enter the keyword phrase. Crawler based search engines then use specific algorithmic programs to locate and list the pages. Although the basic components are the same, the combination of those algorithms differs from one search engine to another.
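To make the query-box idea concrete, here is a minimal sketch of how a keyword phrase typed into that input box becomes an HTTP request to the engine's web server, which then answers with an HTML results page. The endpoint and parameter name below ("example-search.com" and "q") are hypothetical placeholders, not any particular engine's API.

```python
# Minimal sketch: turning a keyword phrase from a search box into a
# query URL. The domain and "q" parameter are assumed placeholders.
from urllib.parse import urlencode

def build_query_url(phrase: str) -> str:
    # The search form encodes the phrase as a query-string parameter
    # and sends it to the engine's web server.
    params = urlencode({"q": phrase})
    return f"https://example-search.com/search?{params}"

print(build_query_url("common search engine principles"))
# https://example-search.com/search?q=common+search+engine+principles
```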
Spiders or robots are the primary searching programs of a search engine; they identify web pages by reading their HTML source tags, and they differ from a browser in that they do not render visual components such as text or graphics. Crawlers then follow the links found in those pages to discover further documents. Indexing is the next phase, in which the web pages are catalogued according to characteristic features such as their text, HTML tags and other attributes. The indexed pages have to be stored for the search engine, and databases are the usual store houses for them. The result pages then rank the web pages according to the various factors the search engine considers when listing them, and the viewer sees the results as an HTML page displayed by the web server.
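The crawl, index, store and rank phases described above can be illustrated with a toy example. The following is a minimal sketch using only the Python standard library; real search engines are vastly more elaborate, and nothing here reflects any particular engine's implementation. The function names and the simple word-count ranking are illustrative assumptions.

```python
# Toy illustration of the crawl -> index -> store -> rank pipeline.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects outgoing links and visible text from one HTML page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

    def handle_data(self, data):
        self.words.extend(data.lower().split())

def crawl_and_index(start_urls, max_pages=10):
    """Fetch pages, follow their links, and build a word -> pages index."""
    index = defaultdict(set)              # stands in for the engine's database
    queue, seen = list(start_urls), set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="ignore")
        except (OSError, ValueError):
            continue                      # skip pages that cannot be fetched
        parser = PageParser(url)
        parser.feed(html)
        for word in parser.words:         # indexing phase
            index[word].add(url)
        queue.extend(parser.links)        # the crawler follows discovered links
    return index

def search(index, phrase):
    """Rank indexed pages by how many words of the phrase they contain."""
    scores = defaultdict(int)
    for word in phrase.lower().split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)
```

A call such as search(crawl_and_index(["https://example.com/"]), "keyword phrase") would return the crawled pages ordered by a crude relevance score, mirroring the ranking step a real engine performs with far more sophisticated factors.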
About The Author
Mike Francis' home page http://www.pegasusdirectory.com is a free directory that helps webmasters promote their sites and get relevant backlinks. We are now offering a new source of traffic: http://www.pegasusdirectory.com/myspacebulletins.php.