sub-directory protection and search engines

by "Craig Newton" <whitby(at)idirect.com>

 Date:  Thu, 3 Feb 2000 10:42:18 -0500
 To:  <whitby(at)idirect.com>
 Cc:  <hwg-techniques(at)hwg.org>
I have been using a low-tech solution to protect sub-directories from casual
browsers: putting a default.html page into every folder, containing a meta
refresh back to the home page.
This keeps visitors from browsing folder contents at ISPs whose servers are
not configured to suppress directory listings.
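For reference, each default.html is just a bare page roughly along these lines
(the www.example.com URL is a placeholder for my actual home page):

    <html>
    <head>
    <!-- send the visitor straight back to the home page -->
    <meta http-equiv="refresh" content="0; url=http://www.example.com/">
    <title>Redirecting</title>
    </head>
    <body>
    <p>This folder has no index.
    <a href="http://www.example.com/">Return to the home page.</a></p>
    </body>
    </html>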
I have a feeling that the search engine robots have been reading these files
and penalizing me.
Is this a possibility? What other low-tech solutions are available?
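The only other thing I can think of is a robots.txt at the site root asking
well-behaved robots to skip those folders. I assume it would look something
like this (the folder names are placeholders for my own):

    # applies to all robots that honour robots.txt
    User-agent: *
    # ask them not to crawl these sub-directories
    Disallow: /images/
    Disallow: /drafts/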

Craig

HWG hwg-techniques mailing list archives, maintained by Webmasters @ IWA