Re: URL Rewriting

by gchurchman(at)medseek.com (Gary Churchman)

 Date:  Tue, 21 Mar 2000 08:56:32 -0800
 To:  <hwg-techniques(at)hwg.org>,
"Stephen Johnston" <pepe(at)gainsay.com>
We create Web sites that are almost entirely generated dynamically; such
a site is never the same twice. This is a tremendously appealing feature
to most clients, but it also presents the very problem you allude to --
search engine indexability.
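
The URL form you describe is, as far as I know, just the standard CGI
PATH_INFO mechanism: Apache will happily run
http://www.domain.com/cgiFile.pl/stephen/johnston, handing the trailing
/stephen/johnston to the script in the PATH_INFO environment variable,
so no query string ever appears in the URL. A minimal sketch (cgiFile.pl
and the fname/lname fields come from your example; everything else is
assumed):

    #!/usr/bin/perl -w
    use strict;

    # A request for /cgiFile.pl/stephen/johnston arrives with
    # PATH_INFO set to "/stephen/johnston".
    my ($fname, $lname) =
        grep { length } split m{/}, ($ENV{PATH_INFO} || '');

    print "Content-type: text/html\n\n";
    print "<html><body>\n";
    printf "<p>fname=%s, lname=%s</p>\n",
        $fname || 'unknown', $lname || 'unknown';
    print "</body></html>\n";

To a spider, that URL looks like an ordinary static path, which in my
experience is enough to keep most engines from refusing it the way they
refuse anything with a "?" in it.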

We have toyed with dozens of ideas, including scripting a routine that
creates a "ghost" HTML page for every dynamically generated page and
serves the ghost whenever the page is spidered by a search engine.
Easier said than done. Meta tags and embeds will only do so much. If we
come up with a definitive solution, I will post it. And if someone has
already won this war, I'd love to know what method you used.
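
To make the ghost idea concrete, the routine we have been sketching has
roughly the shape below. page_ids() and render_page() are stand-ins for
whatever actually drives the site (ours come from a database), so treat
this as an illustration rather than working code from our system:

    #!/usr/bin/perl -w
    use strict;

    # Stand-ins: a real site would pull page IDs from its database
    # and reuse the same routine the live CGI uses to build a page.
    sub page_ids    { return qw(home products contact) }
    sub render_page { my $id = shift; "<html><body>$id</body></html>\n" }

    # Write a static "ghost" copy of every dynamic page. A nightly
    # cron job keeps the ghosts current, and a plain HTML index that
    # links to ghosts/*.html gives the spiders something to follow.
    mkdir("ghosts", 0755) unless -d "ghosts";
    foreach my $id (page_ids()) {
        open(GHOST, "> ghosts/$id.html")
            or die "can't write ghost for $id: $!";
        print GHOST render_page($id);
        close(GHOST) or die "close failed for $id: $!";
    }

The hard part is the other half: detecting a spider and serving it the
ghost while live visitors still get the dynamic version. That is the
piece we have not cracked.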

Gary Churchman


----- Original Message -----
From: "Stephen Johnston" <pepe(at)gainsay.com>
To: <hwg-techniques(at)hwg.org>
Sent: Tuesday, March 21, 2000 10:07 AM
Subject: URL Rewriting


> Hello All-
>
> At one point, quite a while ago I believe, someone mentioned a way to
> make CGI calls appear to be normal URLs, so that
> http://www.domain.com/cgiFile.pl?fname=stephen&lname=johnston could be
> represented by http://www.domain.com/cgiFile.pl/stephen/johnston, or
> *something* like that. I am not sure I have the specifics correct, but
> that is the idea.
> Basically, I inherited a site that uses Perl extensively. Every page
> except the home page is called through a Perl file. Though not the way
> I would have designed it, it was fine until the client said, "Make sure
> all the pages will be indexed by the search engines." I am now trying
> to find a way not to say "Tough Sh*t" and instead offer her some
> solution.
> Any ideas?
>
> -Stephen Johnston
>
>

HWG hwg-techniques mailing list archives, maintained by Webmasters @ IWA