Re: Getting better Google search results - Mailing list pgsql-advocacy

From Greg Sabino Mullane
Subject Re: Getting better Google search results
Date
Msg-id 64f7c031a223d74df451478d38901644@biglumber.com
In response to Re: Getting better Google search results  ("Magnus Hagander" <mha@sollentuna.net>)
List pgsql-advocacy


> Wouldn't this be better on -www? :-P

I considered that, but this seems more of an advocacy problem
with a technical solution. I thought the wider advocacy audience
might have some ideas about it.

> Don't we have enough silly domains already? If we want to get rid of
> those hits, why don't we just add a robots.txt and tell google not to
> index the old docs at all? (If people *need* hits in the old docs, they
> can always hit our own search engine for those docs)

I considered that, but I would not want to completely eliminate the
old docs from Google searching. There's always a chance something useful is
there. However, since 99.9% of generic, non-version-specific searches should
*not* hit those pages, it's best to stick those on the "e" in
Goooooooooooooooooooooooogle. :)
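(For reference, a minimal sketch of what a robots.txt along Magnus's lines
might look like; the version paths below are purely hypothetical placeholders
for wherever the old doc trees actually live, not the site's real layout:)

    User-agent: *
    # Hypothetical old-documentation paths; adjust to the real directory layout.
    Disallow: /docs/7.1/
    Disallow: /docs/7.2/
    Disallow: /docs/7.3/

That blocks crawlers from the listed trees entirely, which is exactly the
all-or-nothing behavior I'd rather avoid here.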

--
Greg Sabino Mullane greg@turnstep.com
End Point Corporation
PGP Key: 0x14964AC8 200608290634
http://biglumber.com/x/web?pk=2529DF6AB8F79407E94445B4BC9B906714964AC8


