For the last five years, promoting various aspects of the Cactus and Succulent Hobby by publishing Web pages has become increasingly important. It has become a key recruiting method for Societies and an important sales channel for nurseries, particularly those that sell by mail order.
One of the biggest problems is that simply publishing a page is not enough. As the web has no central index, nobody will find your magnum opus unless you take steps to publicise it. Several methods are worth considering. Publicising the web page address offline through traditional media - print, radio or television - is likely to be very effective and should be done whenever possible, but may be costly. On-line publicity through groups such as cacti_etc is good, but tends to be short-lived and in any case is preaching to the converted.
I spend a significant amount of time reading on-line internet marketing newsletters. I recently read of some interesting developments which I think have considerable relevance to our position. To understand this idea, I need to go a little into the history of the www.
In the early days of the www, the problems in finding on-line information led to the development of two rather different types of service. The first, exemplified by Yahoo, built an index of pages in a structure ordered by subject matter. This was initially very successful: you submitted your internet address to Yahoo with a suggested category, and within a few days your entry appeared. However, as the volume of pages grew exponentially, it wasn't long before it took weeks or even months for entries to appear. Because every submission was manually reviewed, the editors could not cope and the costs escalated. Yahoo then introduced a fast track for companies that paid for entries. It has now reached the point that commercial companies have to pay to get an entry at all, while non-commercial free entries generally have to wait a long time with little guarantee that an entry will ever appear. Various rather clever variants of the Yahoo directory were tried by other organisations. Perhaps the most successful recruited expert volunteers in various fields to review and maintain the part of the index in which they specialised. It is a problem to maintain reasonable uniformity across such an index, and also to keep out personal bias; I guess it is also a problem finding enough people willing to do this work for free. One of the biggest nightmares of this approach is the problem of obsolete and changed links.
The other approach used computers to try to solve the indexing problem. It is covered by the rather generic term "Search Engine": these services sent out so-called spiders to follow all the links on the www and analyse all the pages according to their content. With sufficiently powerful computers this solves the problem of coping with the volume of pages, and also with changes of address. Using such a system you can very easily get a complete list of all the web pages containing, say, the word 'cactus'. Unfortunately this does not get you very far, as there may be somewhere between 100,000 and 1,000,000 of them. If you are looking for plants, it does not help much to be presented with pages for perhaps 100 Cactus Cafes in various parts of the world, or information on Cactus Jack, who I believe is an American wrestler.
Without some means of assessing relevance or importance so that pages can be ranked, this system does not help a great deal. The early search engine results were a bit of a mess, with a high percentage of irrelevant results being returned. The use of Meta-tags in the head of html pages helped somewhat, but was very open to abuse.
Over the years various search engines have waxed and waned as they tried very different strategies for presenting the best search results. Over the last year or so one has had a particularly meteoric rise, and that is Google. When I was first introduced to it I was particularly impressed with the relevance of the results it returned; I soon switched my own use and also recommended it to others. It is now the most used search engine.
I have now learned a little more about the way Google works and can understand why it is so successful. In understanding this I can also see how we can now work with Google to improve the visibility of cactus and succulent pages on the www.
One of the factors search engines have been experimenting with for some time is link analysis. It makes sense that if a page has a lot of links to it from other web pages, it is in some way important to a lot of people. Quite a number of search engines will rate a page higher if there are a lot of links to it, and will display it higher up the search results page.
Google has taken this one critical stage further, in that it has realised that not all links are equal. A link from a free page put up by Mr SH about his hamster is not as important as, say, one from the Royal Horticultural Society or the Huntington Botanic Gardens linking to your society page. How is this relative importance to be evaluated? Processing such an enormous amount of data is, of course, just what computers excel at. Google has rated all web pages on a scale of 1-10, arriving at the figure by a process of iteration. The essence of this is that to increase the importance of your page you have to get links from other important pages. This is not to say that links from less important pages are worthless; you just have to get more of them to have the same effect.
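For the curious, the flavour of this iteration can be sketched in a few lines of Python. This is only my own toy illustration of the general link-analysis idea, not Google's actual algorithm or data; the site names, the `rank` function and the damping value are all hypothetical.

```python
# Toy illustration of iterative link analysis (not Google's actual code).
# Each page's score is shared out among the pages it links to, then all
# scores are recomputed repeatedly until they settle down.

DAMPING = 0.85  # fraction of a page's score passed on through its links


def rank(links, iterations=50):
    """links maps each page name to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    score = {p: 1.0 / len(pages) for p in pages}  # start everyone equal
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = DAMPING * score[page] / len(targets)
                for t in targets:
                    new[t] += share  # important pages pass on bigger shares
        score = new
    return score


# Hypothetical example: two hobby sites both link to a society homepage,
# which links back to one of them.
web = {
    "branch-site": ["society-homepage"],
    "nursery-site": ["society-homepage"],
    "society-homepage": ["branch-site"],
}
scores = rank(web)
```

In this tiny example the society homepage ends up with the highest score, because it collects links from both other sites; and a link from it is in turn worth more than a link from the unlinked-to nursery site, which is exactly the "not all links are equal" point.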
There is a free toolbar available from Google which adds on to Internet Explorer (and only Internet Explorer, I regret) and gives you this relative importance factor for any page you visit. Aside from the main point, I have found this extremely helpful when surfing. It also gives you a Google search box available at all times, so it is worth getting even if you are not involved in webpage development or promotion.
Most cactus and succulent sites have homepages rated between 3 and 5. The BCSS and CSSA homepages are rated at 6, and the Cactus-Mall homepage at 7. We could use links from sites rated in the 8-10 bracket to improve our position, but these are relatively rare beasts, and more links from sites rated 5-7 are perhaps a more realistic goal. What attracts me enormously about this is that, acting as a community, we can all automatically help each other.
For example, if a BCSS branch website rated at say 3-4 puts up good content and then gets links from other sites rated 5-6, it increases its own rating; but it also automatically helps to improve the main BCSS site's rating, because its own links become more important. The fact that you can quickly check how valuable a link from a particular site is likely to be is of enormous value in itself.
I think this idea is really going to take off amongst a reasonably well-organised and cooperative international community like ours. Once you start thinking about it, many more ramifications and possibilities come to mind.