Why Your Site Needs Fresh, Relevant Content
By Victor George | Published 07/05/2005
It is said that content is king, but today 'fresh, relevant content' is the master - or is it?
Every owner of a commercial web site knows that frequently refreshed content is needed to achieve and maintain a high listing on the search engines, which actively seek it out. Google sends out its 'Freshbot' spider to gather and index new material from all the sites that offer it. MSN Search seeks it too; I've noticed that its spider pays a daily visit to a site of mine which carries proper fresh content every day.
Fresh content keeps a commercial web site competitive; without it, a site will certainly slide down the search engine listings and lose business. Besides, having something new every day keeps visitors coming back and attracts potential customers.
But creating and then manually uploading fresh content to our web sites each day is hard, time-consuming work, isn't it? What we want is a way of putting daily fresh content onto our web sites easily and efficiently. Let's look at the techniques currently available and see whether any of them offers a global solution to the fresh content problem:
Server Side Includes (SSI): These are directives embedded in a page's HTML by the webmaster and uploaded to the server. An SSI directive tells the server to insert a specific block of text at that point whenever the page is served to a browser or a search engine spider.
Because the directives are processed 'before' the page is served, the included text arrives as ordinary HTML and remains 'visible' to search engine spiders, so it counts as fresh content. Unfortunately, not all web hosts support SSI, because the server must parse every page it serves while looking for include directives, a process which clearly reduces server performance.
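As a minimal sketch (assuming an Apache-style server with SSI enabled, and using a placeholder file path), here is what such a directive looks like in a page; the server replaces it with the contents of the named file before anything leaves the server:

    <!-- Apache-style SSI directive: the server substitutes the contents
         of /includes/fresh.html (a placeholder path) for this comment
         before the page is sent to a browser or spider. -->
    <!--#include virtual="/includes/fresh.html" -->

Upload a new fresh.html each day, and every page carrying the directive serves new content.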
How many web site owners have the time to manually upload fresh HTML content onto their servers every day? Probably very few, which is why SSI is not a global solution to the fresh content problem.
Blogging: Google's Freshbot spider is so voracious for fresh content that it eagerly devours the contents of common weblogs. But can a daily blog be used to influence the listing of a web page under specific keywords or phrases?
It can, but for the vast majority of web site owners, blogging is out of the question. Writing a daily keyword-rich business blog is hard, time-consuming work, and it requires the blogger to be a competent writer, too. Few business owners have the time or the writing skill to produce something new about their products or services every day.
Blogging is therefore not a global solution to the fresh content problem.
RSS Newsfeeds: Placing newsfeeds on a web site is certainly an easy way of getting fresh material to appear each day. 'Really Simple Syndication', or RSS, is a fast-growing method of content distribution. Creating a newsfeed is an uncomplicated procedure, so RSS appears to be an easy solution to the fresh content problem.
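For reference, a feed is nothing more than an XML file. A minimal RSS 2.0 document looks something like this (all titles and URLs here are placeholders for illustration):

    <?xml version="1.0"?>
    <rss version="2.0">
      <channel>
        <title>Example Site News</title>
        <link>http://www.example.com/</link>
        <description>A placeholder feed for illustration.</description>
        <item>
          <title>A fresh headline</title>
          <link>http://www.example.com/story.html</link>
          <description>One item per new story.</description>
        </item>
      </channel>
    </rss>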
Many owners of commercial web sites believe that by incorporating newsfeeds on their sites they will improve their search engine rankings through the links appearing within those feeds, which Google treats as relevant. This belief is mistaken, because a feed is usually displayed on a page by means of a JavaScript or VBScript snippet.
Those scripts would have to be executed for the fresh content to appear, and search engine spiders take a simplistic approach to reading web pages: they do not execute scripts at all. The scripts run in the visitor's browser 'after' the page has been served, not on the server beforehand, so the spider never sees the headlines.
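To see why, consider how a third-party feed is typically embedded: the page carries nothing but a script tag like the sketch below (the URL is a made-up placeholder). A visitor's browser runs the script, which writes the headlines into the page; a spider reading the raw HTML sees only the tag itself:

    <!-- Typical JavaScript feed embed. A browser executes the script,
         which writes the headlines into the page; a spider reading the
         raw HTML sees only this tag. The URL is a placeholder. -->
    <script type="text/javascript"
            src="http://feeds.example.com/display.js?feed=news"></script>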
There are also a couple of growing menaces associated with RSS newsfeeds:
Since the popularity of RSS is growing exponentially, the idea of monetizing syndication with ads is gaining ground. Indeed, Yahoo has announced that it will begin displaying ads from Overture's service within RSS feeds. Now, who wants other people's ads on their web site? I don't.
There are rumors of newsfeeds being used to deliver spam. If this gets out of control, newsfeeds will quickly become history. Who wants spam messages appearing on their web site? I don't.
RSS is therefore not a global solution to the fresh content problem.
Newsfeed Scripting Solutions: A server-side script can be set up to 'extract' the items from a newsfeed and write them into web pages as plain HTML, so that the fresh content will be seen by search engine spiders. This, however, involves the use of PHP and MySQL, which tends to put many business owners off. And if there is spam or advertising in the feed, it will get extracted, too!
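As a rough sketch of the idea, a PHP page can fetch a feed on the server and print its items as ordinary HTML links, so they appear in the page source a spider reads. (The feed URL is a placeholder, and the MySQL caching such solutions usually add is omitted here.)

    <?php
    // Minimal extraction sketch: fetch an RSS feed server-side and
    // emit its items as plain HTML. The feed URL is a placeholder.
    $feed = simplexml_load_file('http://feeds.example.com/news.xml');
    if ($feed !== false) {
        foreach ($feed->channel->item as $item) {
            $title = htmlspecialchars((string) $item->title);
            $link  = htmlspecialchars((string) $item->link);
            // Whatever the feed carries, ads and spam included, is
            // printed verbatim into the page.
            echo '<p><a href="' . $link . '">' . $title . '</a></p>';
        }
    }
    ?>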
Newsfeed scripting solutions are therefore not a global solution to the fresh content problem.
Creating Original Content: As noted above under SSI and blogging, creating and manually uploading your own fresh content every day is a time-consuming chore. And what if you run a number of web sites, each of which needs frequent fresh content in order to remain competitive? Yet we all know that there is nothing better than our own proper keyword-rich fresh content.
In summary, getting frequent, proper fresh content onto our web sites is not straightforward at all. HTML extracted from RSS feeds appears to offer a partial solution, but it is too complicated for most businesses and carries the ad and spam risks described above.
The e-commerce industry is clearly in need of a genuine solution to the fresh content problem. What is needed is a way to have our web pages updated automatically every day with 'our own' content, not anyone else's. Only then will we be able to say that fresh content is truly the master!
About the Author:
Victor George is a "fresh, relevant content" crusader whose web site can be found at AutoPageUpdate. Easily control your web content to suit your clients and to keep the search engines well fed with your new and relevant content.
(C) 2005 Victor George.