Robot exclusion is what robots.txt is for. You can also put a robots meta tag (e.g. <meta name="robots" content="noindex,nofollow">) in your page templates.
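A minimal robots.txt sketch for this case. The user-agent string here is an assumption -- Inktomi's crawler was commonly reported as "Slurp", but you should check your server's access logs for the exact name it sends:

```
# Block Inktomi's crawler (user-agent name assumed; verify against your logs)
User-agent: Slurp
Disallow: /

# Allow all other robots to crawl normally
User-agent: *
Disallow:
```

Serve it as /robots.txt at the root of the site. Keep in mind that compliance is voluntary; a well-behaved robot will honor it, but a misbehaving one can simply ignore it.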
On 04/02/1999 01:40 AM, BKA wrote:
> Would it be possible to add a filtering function to HTTP requests? There
> is a search engine called Inktomi that comes to visit my site from time
> to time, and I would like to keep it out.
>
> If you are wondering why on earth one would need something like that:
> their search engine takes everything visible on a web page and tries to
> interpret it as possible web addresses, i.e. apart from looking at the
> source for anchor tags, it also scans the visible text for them.
>
> On some of my pages I have a web tutorial that includes sample code.
> This engine reads those pages and tries the sample code on my server. I
> have written to its programmers about it, but they preferred to leave it
> that way.
>
> I know it does not hurt, but I also know this is the method some robots
> use to extract e-mail addresses from web pages. So, I would like to be
> able to stop any bad intentions. Besides, the fact that they carry on
> with what they are doing despite my warning is reason enough for me to
> want to keep them out.

---
Daniel O'Leary, Admin/WebMaster
KloneZone Mac - A TeleFinder 5.7 Mac/Windows BBS