- Search Engines' Emphasis on a Site’s Content
When assessing relevance to a particular topic, search engines place a great deal of emphasis on a site’s content. In the past, search engines assumed that if a word was mentioned often on a page or site, that site must be relevant to the topic. They then realized this did not provide users with truly relevant results, as it became far too common for people to stuff a site full of keywords. Since then, engines have lessened the emphasis they place on keyword density. Now they aim for relevance beyond mere density, so it is essential to use target keywords drawn from the target audience’s natural vocabulary.
When determining the importance of a particular word, search engines usually look at the characteristics of that word on the web page: whether it is linked to another page, for example, or set in bold. These are useful hints, since it makes sense that a word deliberately made to stand out carries some importance.
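As a hypothetical illustration of these signals, the HTML fragment below shows a keyword (“web design,” chosen only as an example) both set in bold and used as anchor text for a link, two of the cues described above:

```html
<p>Our <strong>web design</strong> team builds fast, accessible sites.
See our <a href="/web-design-portfolio">web design portfolio</a> for examples.</p>
```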
A common content issue for websites is duplicate content, whether within a single site or across many websites. Search engines want to direct their users to the most relevant and original version of the content, but identical content appearing in more than one place can confuse them. One way to deal with this is the robots.txt file, a simple text file housed in the root directory of a website’s file structure that instructs web spiders to ignore a particular web page.
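As a minimal sketch, a robots.txt file that asks all spiders to skip a duplicate, printer-friendly copy of a page might look like this (the path shown is purely illustrative):

```
# Applies to all crawlers
User-agent: *
# Ask spiders to ignore the duplicate print version of the article
Disallow: /print/article.html
```

Note that robots.txt is advisory: well-behaved crawlers honor these rules, but the file does not enforce anything by itself.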
Search engine spiders cannot, however, read all types of content: they cannot read the text within Flash, interpret the words in an audio file, or determine the words used within a video. Site designers can use tags (such as alt attributes) to supply text descriptions of this non-text content.
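For instance, an image’s alt attribute gives spiders a text description of content they cannot otherwise read (the filename and wording below are illustrative only):

```html
<img src="company-logo.png" alt="Acme Web Design company logo">
```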