Never mind, Marc ;) Saw you thought of that too.
Gavin M. Roy wrote:
> Can't we just throw a robots.txt in there to limit what they can get?
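[A minimal robots.txt along those lines might look like the sketch below; the paths and crawl delay are illustrative assumptions, not taken from the actual site, and `Crawl-delay` is a non-standard directive that only some crawlers honor.]

```
# Served from the site root as /robots.txt
# Applies to all crawlers that follow the Robots Exclusion Protocol
User-agent: *
# Ask well-behaved bots to pause between requests (honored by some, e.g. msnbot)
Crawl-delay: 30
# Keep crawlers out of expensive dynamic pages (path is hypothetical)
Disallow: /cgi-bin/
```

Note that robots.txt is purely advisory: aggressive or misconfigured crawlers can ignore it, so it reduces load from compliant engines but is not an enforcement mechanism.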
>
> Marc G. Fournier wrote:
>
>>
>> being pounded right now by several search engines at once ... working
>> on it ...
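[For anyone diagnosing this kind of load, a quick way to see which crawlers are hitting hardest is to tally User-Agent strings in the web server's access log. The log path and sample lines below are illustrative assumptions (Apache "combined" format), not from the actual server.]

```shell
# Hypothetical sample of Apache "combined"-format log lines; on a real
# server you would point awk at something like /var/log/httpd/access_log.
cat > /tmp/access_sample.log <<'EOF'
1.2.3.4 - - [27/Apr/2005:10:00:00 -0300] "GET /a HTTP/1.0" 200 123 "-" "Googlebot/2.1"
1.2.3.5 - - [27/Apr/2005:10:00:01 -0300] "GET /b HTTP/1.0" 200 456 "-" "msnbot/1.0"
1.2.3.4 - - [27/Apr/2005:10:00:02 -0300] "GET /c HTTP/1.0" 200 789 "-" "Googlebot/2.1"
EOF

# In the combined format, the User-Agent is the 6th double-quote-delimited
# field; count requests per agent, busiest first.
awk -F'"' '{print $6}' /tmp/access_sample.log | sort | uniq -c | sort -rn
```

The same pipeline run against the live log would show at a glance whether the load spike really is several search engines at once.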
>>
>> On Wed, 27 Apr 2005, Joshua D. Drake wrote:
>>
>>> Hello,
>>>
>>> We are at a load of 44 ... what is going on?
>>>
>>> Sincerely,
>>>
>>> Joshua D. Drake
>>>
>>> --
>>> Your PostgreSQL solutions company - Command Prompt, Inc. 1.800.492.2240
>>> PostgreSQL Replication, Consulting, Custom Programming, 24x7 support
>>> Managed Services, Shared and Dedicated Hosting
>>> Co-Authors: plPHP, plPerlNG - http://www.commandprompt.com/
>>>
>>> ---------------------------(end of broadcast)---------------------------
>>> TIP 2: you can get off all lists at once with the unregister command
>>> (send "unregister YourEmailAddressHere" to majordomo@postgresql.org)
>>>
>>
>> ----
>> Marc G. Fournier    Hub.Org Networking Services (http://www.hub.org)
>> Email: scrappy@hub.org    Yahoo!: yscrappy    ICQ: 7615664
>>