How Do Search Engines Work - Web Spiders
Search engines are what put your site in front of potential customers, so it is important to understand how they work and how they present results to the person doing the search.
There are two kinds of search engines. The first kind is powered by crawlers, also known as spiders.
Search engines use spiders to index websites. When you submit your site's pages to a search engine by filling out its submission form, a spider will crawl your site. A "spider" is an automated program run by the search engine's systems. It visits a website, reads the content of the pages and the site's meta tags, and follows the links the site connects to. The spider returns all of this information to a central database, where it is indexed. It will also follow every link on your website and index those pages too. Some spiders will only index a limited number of pages from your site, so don't build a site with 500 pages!
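The crawl described above (read the page content, collect the meta tags, gather the links to follow) can be sketched with Python's standard library. This is a minimal, hypothetical illustration, not any real engine's code; the class name and sample HTML are made up for the example:

```python
from html.parser import HTMLParser

class SpiderParser(HTMLParser):
    """Collects what a spider reads from one page:
    text to index, meta tags, and outgoing links to follow."""
    def __init__(self):
        super().__init__()
        self.links = []   # hrefs the spider would crawl next
        self.meta = {}    # name -> content from <meta> tags
        self.text = []    # text fragments to index

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

# Illustrative page; a real spider would fetch this over HTTP.
page = """<html><head>
<meta name="keywords" content="search, spiders">
</head><body>
<p>Welcome to our site.</p>
<a href="/about.html">About</a>
<a href="/contact.html">Contact</a>
</body></html>"""

spider = SpiderParser()
spider.feed(page)
print(spider.links)   # ['/about.html', '/contact.html']
print(spider.meta)    # {'keywords': 'search, spiders'}
```

A real spider would repeat this for every link it collects, sending the text and meta data back to the central index.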
The spider periodically returns to the sites it has indexed to check whether any information has changed. How often this happens is determined by the people who run the search engine.
A spider can be thought of as being like a book: it holds an index, the tables of contents, and the links and references to all the websites it discovers in its searches, and it can index up to a million pages a day.
Examples: Yahoo, Bing, Yelp, YouTube, and Google.
When you ask a search engine to find information, it is searching through the index it has created, not the Web itself. Different search engines return different rankings because not every engine uses the same algorithm to search its index.
One of the things a search engine's algorithm scans for is the frequency and location of keywords on a web page, but it can also detect artificial keywords and spamdexing. Algorithms also analyze how pages link to other sites across the Web. By examining how pages link to one another, a search engine can infer what a page is about when the keywords on the linked pages are similar to the keywords on the home or original page.
Keyword Density
Keyword density is a measure of how often a keyword appears on a web page. Keywords should not be overused, but they should appear often enough to be noticed.
If you repeat your keywords in every other word on every line, your site is likely to be flagged as unnatural or spam.
Keyword density is usually expressed as a percentage of the total words on a web page.
Imagine you have 100 words of text on your page (not counting the HTML code used to build it) and you use a specific keyword five times in that text. The keyword density of the page is found by dividing the number of keyword occurrences by the total number of words on the page: 5 divided by 100 = 0.05. Because keyword density is expressed as a percentage of the total words on the page, you multiply that figure by 100: 0.05 x 100 = 5%.
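The same arithmetic can be written as a short Python function (a minimal sketch; the function name is ours, not a standard one):

```python
def keyword_density(keyword_count, total_words):
    """Density = keyword occurrences / total words, as a percentage."""
    return keyword_count / total_words * 100

# The worked example above: 5 occurrences in 100 words.
print(keyword_density(5, 100))  # 5.0
```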
The commonly cited standard for keyword density is three to five percent, enough to be noticed by search engines, and you should not exceed that.
Keep in mind that this rule applies to every page on your website. It also applies not just to a single keyword but to each group of keywords related to a specific product, service, or offer. In every case the keyword density should stay between 3 and 5 percent.
Simple steps to test the density:
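One simple way to test the density of a block of page copy is to count the words yourself. Here is a minimal sketch in Python; the helper name and sample text are illustrative:

```python
import re

def density(text, keyword):
    """Percentage of words in `text` matching `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = words.count(keyword.lower())
    return hits / len(words) * 100 if words else 0.0

copy = ("Our coffee shop roasts coffee daily. "
        "Stop in for fresh coffee and pastries.")
print(round(density(copy, "coffee"), 1))  # 23.1
```

In this sample, "coffee" makes up about 23% of the words, far above the three-to-five-percent range, so this copy would likely read as keyword stuffing.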