Re: Security Challenge

by "Kehvan M. Zydhek" <kehvan(at)zydhek.net>

 Date:  Wed, 3 Jan 2001 13:17:10 -0800
 To:  "Rob Atkinson" <robatkinson(at)nucleus.com>,
"Steve Segarra" <ssegarra(at)mitre.org>
 Cc:  "Campbell, Rebecca" <Rebecca.Campbell(at)converg.com>, <hwg-techniques(at)mail.hwg.org>
Rob,

Your challenge has been met and (I think) mastered. But to test it, you
would have to be willing to go to the one site that has it (it's
adult-oriented material).

Let me describe the solution; it's an ASP solution, which may limit its
usefulness for you:

A subscriber to the adult site, having already logged in to the protected
areas, clicks on a thumbnail of the picture they wish to see and/or save. A
second browser window opens, either as a simple pop-up or a complete browser,
depending on whether they're in the HTML3 or HTML4 version of the site
(HTML4 gets more bells and whistles; HTML3 gets the plain basic package).
The thumbnail, however, is not your standard thumbnail... Clicking it
triggers a scripted file that first verifies that the person is an
authorized user (logged on), and then reads the image from a directory
OUTSIDE of the web directory structure, routing it to the browser for
display.

The (simplified) resulting directory structure is:

SERVER ROOT
        +--- PICTURES
        +--- WEB ROOT (www)
                    +--- COMMON FILES (scripts)
                    +--- HTML3 FILES
                    +--- HTML4 FILES
                    +--- IMAGES (site navigation, thumbnails, etc)

As you should know, NO BROWSER can directly request a folder OUTSIDE of the
web root structure, since every URL on the domain resolves relative to the
web root folder. You would have to use FTP to reach it, which is beyond the
scope of this exercise (and in this case, the site does not allow anonymous
FTP). As described above, the thumbnails trigger an ASP script in the Common
folder, which then reads the protected images and feeds them to the browser.

Now, as for whether this works... It works great for the images (all of
them), but is somewhat flaky on the download of the ZIP files of the images.
We're still working on that. But, the only URL you'll be able to get for the
images is (for example)
http://www.inuniform.net/common/imgdisplay.asp?file=iu14ps213.jpg
(URL from Netscape 4.76, which gets the HTML3 version of the site; as
mentioned above, HTML4 browsers do not display URLs for the images by
default)

This results in the following code in the source:
<img src="image.asp?sessionid=#########&issue=14&file=iu14ps213.jpg">
(the sessionid is shown as # characters because it changes each time the
above link is invoked)
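The changing sessionid is just the server stamping the visitor's current
session token into the markup each time the page is generated, so the
display script can re-check authorization on every single image request.
Schematically (hypothetical names, Python rather than ASP):

```python
from urllib.parse import urlencode

def image_tag(session_id, issue, filename):
    """Emit the <img> markup with the visitor's current session id
    embedded in the query string of the display script's URL."""
    query = urlencode({"sessionid": session_id,
                       "issue": issue,
                       "file": filename})
    return '<img src="image.asp?%s">' % query
```

A stale or missing sessionid fails the authorization check in the display
script, which is why the pasted URL yields a broken image for anyone who
isn't logged in.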

While this DOES show the image name (iu14ps213.jpg), the images DO NOT
reside in the /common folder. In fact, if you click on the above link,
you'll get a broken image indicator, because you are not logged in to the
site, and therefore do not have authorization to view that image. In theory,
if you were logged into the site, you could use the above URL to view the
remaining images on the site (the naming convention isn't too difficult to
determine), rather than clicking on individual thumbnails. However, at best,
you'll get just the Issue 14 images, which you would already have access to
via your login information, so it's rather pointless. Issue 15 images are
stored in a different directory than the Issue 14 images, and their folder
names are fairly complex. For example, the Issue 14 image folder is called
"iu14-7d0506831" -- I know why it's named like that, but do you? It's not
what you probably think... Even if you were to guess it (which is unlikely),
the chances of getting the Issue 15 folder name right are, well,
astronomically slim.

Because the file that displays these images is an ASP file, all of the code
is processed on the server. The browser gets only the output, so viewing the
page source wouldn't reveal where the pictures come from, either (see
above).

So, does this pass your requirements?

Kehvan M. Zydhek
Zydhek Enterprises

----- Original Message -----
From: "Rob Atkinson" <robatkinson(at)nucleus.com>
To: "Steve Segarra" <ssegarra(at)mitre.org>
Cc: "Campbell, Rebecca" <Rebecca.Campbell(at)converg.com>;
<hwg-techniques(at)mail.hwg.org>
Sent: Wednesday, January 3, 2001 11:06
Subject: Re: Security Challenge


> Having now tried some Link Redirection scripts (some with their
> own Anti-Leech protocol) I find that it all comes down to the
> same thing: When the file download is cleared to proceed, every
> Browser will show the actual URL of the download, somewhere along
> the line.
>
> Steve might have the right idea with using a DB, but I don't know
> enough about mySQL (which is the only DB I work with) to
> incorporate the security required.
>
> Rebecca had the same idea as I did (using a protected directory)
> but including the ID/PW within the PERL script that makes the URL
> call doesn't work well enough with IE.
>
> Can anyone offer a suggestion on how to provide the ID/PW of a
> protected directory, in a hidden manner, just before the call to
> the URL download?
>
>
> Website Rob
> Helping people create a Potent Website
>
>
>
> Steve Segarra wrote:
>
> Do you have a db running? How big are the files? Write the
> contents of the file to a blob in the db then extract them and
> create the files on the fly when people make the requests...
>
> HTH,
> steve
>
> "Campbell, Rebecca" wrote:
>
> not sure if this will be helpful or not, but what i'd probably
> try to do is put the files in a protected directory that requires
> some kind of password to access them. then you can call a routine
> that will "enter" that password when the download link is
> clicked.
>
> this isn't something i've ever tried to do, so i don't know how
> easy/feasible this solution is, but it might be something to play
> with...
>
> hth,
> rebecca
> http://www.nerdygirl.com
>
> -----Original Message-----
> From: Rob Atkinson [mailto:robatkinson(at)nucleus.com]
> Sent: Tuesday, January 02, 2001 2:38 AM
> To: hwg-techniques(at)mail.hwg.org
> Subject: Security Challenge
>
> Not having been able to solve this challenge myself, I look to my
> colleagues for help.
>
> Object
> ---------
> To prevent *.exe & *.zip files from being downloaded, except from
> a link on a page at the site. Server is Linux with Apache 1.3.12.
>
> Tried
> --------
> PERL download scripts with Anti-Leech which all worked as
> advertised but did not meet the Challenge. When download script
> kicks in, the actual URL is shown in the Status Bar. In Netscape
> it happens too quickly for the eye to follow. In IE the URL shows
> long enough for someone to get the actual location URL.
>
> Someone could then use the actual URL as a download link from a
> page on "their" site, thus defeating the purpose of an
> Anti-Leeching download script.
>
> Used Protected Directories with ID/PW built into the download URL
> from the PERL script but again, the actual location URL (with
> ID/PW) is shown which also defeats the purpose. IE has a tendency
> to accept/not accept the built ID/PW when using a URI of http:
> and I didn't bother with ftp: as I figured same result.
>
> The test page can be viewed at:
>
> http://www.webhostingavailable.com/download_test.html
>
> Only the download links are functional on the pages.
>
> I look forward to any workable solution and can always have a
> mod installed if necessary.
>
> TIA
>
> Website Rob
> Helping people create a Potent Website
>

HWG hwg-techniques mailing list archives, maintained by Webmasters @ IWA