Fog Creek Software
Discussion Board

security question

My employer asked me to write an ASP .NET web site which allows the users to upload files to it.

Each user has a directory of his own, and can make subdirectories.

The users can upload, view and download files, and move files from one directory to another.

Now, the problem is that the user must access only his own files from his own directories.

I proposed two alternatives to accomplish this:

1. Every ASP .NET page that manipulates files or directories (moving files, viewing files, etc.) should check whether the user has logged in.

If the user has not logged in, of course we shall disallow any operation and display an error message.

If the user is logged in, then we check the parameters of the operation - check whether the directory the operation targets is indeed that user's directory, and if it is not, just display an error message.
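The two-step check in alternative 1 could be sketched like this. (This is a Python sketch of the logic only; the real code would be C# in each ASPX page, and the `USERS_ROOT` layout and function names are mine, not from the original post.)

```python
import os.path

USERS_ROOT = "/srv/uploads"  # hypothetical root holding one directory per user


def is_operation_allowed(session_user, target_dir):
    """Return True only if a logged-in user is operating inside their own tree."""
    # Step 1: refuse anything from a user who has not logged in.
    if session_user is None:
        return False
    # Step 2: the directory the operation targets must live under this
    # user's own directory, and nowhere else. normpath collapses "../".
    user_root = os.path.join(USERS_ROOT, session_user)
    target = os.path.normpath(os.path.join(USERS_ROOT, target_dir))
    return target == user_root or target.startswith(user_root + os.sep)


print(is_operation_allowed("alice", "alice/photos"))  # True
print(is_operation_allowed("alice", "bob/photos"))    # False
print(is_operation_allowed(None, "alice/photos"))     # False
```

Note the `os.sep` suffix in the prefix test: without it, a user named "alice" would also pass the check for a sibling directory named "alicedata".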

My boss says that this is insecure because the IUSR user has access to all the users' directories, and this is a very bad potential security breach, because there are many hacking tools that attack IIS.

The boss says that the security achieved by this is insufficient.

2. I proposed that the scripts create a Windows user for each web site user, and every such user will have access only to his own directory.

When logging in, the ASP .NET application should impersonate a certain Windows user, and so it will have access only to the user's directory.

But, the boss doesn't want to have the script create Windows (or domain) users.

So, the two alternatives I proposed are not acceptable to my boss.

He wants me to find another alternative, which doesn't have the disadvantages listed above.

What other alternative is there?

How can I do this?

I would really appreciate some good advice.

Thank you!

Jed Boree
Saturday, September 13, 2003

Do the files actually have to be real files on the filesystem?

If not, an easy way to do it is to store the files as encoded strings in a text field in SQL Server, and you can then control the security 100% in your application.

There is a sample WinForms app that does this using WebServices and such, search "ColdStorage" on MSDN.

You could do it by storing all the files in a single directory on the server and keeping the details of access to each file in SQL Server, but this still has that step of "storing the files on the server", which I think is where your boss is having the issues?
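The "files live in the database, the app alone decides who reads what" idea can be sketched in a few lines. (In-memory SQLite stands in for SQL Server here, and the schema and function names are mine; the point is just that ownership is part of the lookup key, so the filesystem never enters into it.)

```python
import sqlite3

# The file bytes live in a BLOB column and every row records its owner,
# so the application alone decides who may read what.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE files (
    owner TEXT NOT NULL,
    path  TEXT NOT NULL,
    data  BLOB NOT NULL,
    PRIMARY KEY (owner, path))""")


def save_file(user, path, data):
    db.execute("INSERT OR REPLACE INTO files VALUES (?, ?, ?)",
               (user, path, data))


def load_file(user, path):
    # The owner is part of the key, so a user can never address
    # another user's file, whatever path string they send.
    row = db.execute("SELECT data FROM files WHERE owner=? AND path=?",
                     (user, path)).fetchone()
    return row[0] if row else None


save_file("alice", "notes.txt", b"hello")
print(load_file("alice", "notes.txt"))  # b'hello'
print(load_file("bob", "notes.txt"))    # None
```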

Saturday, September 13, 2003

>>"My boss says that this is insecure because the IUSR user has access to all the user's directories, and this is a very bad potential security breach, because there are many hacking tools that attack IIS."

This is just bullshit - these things -- "IUSR user" and "because there are many hacking tools that attack IIS" -- are completely unrelated. The IUSR account is very limited in what it can do. The hacking tools that breach IIS security do so due to buffer overruns that let them run, not as IUSR, but as LocalSystem (and we're talking more priviledges than Administrator here). So no matter what account your code runs under a buffer overrun attack will bypass it completely.

Anyway, your code in ASP.NET does not run under IUSR - it runs under the ASPNET account -- another low-privilege account.

IMHO, the SQL solution suggested by Chris sounds a bit overengineered for what you're trying to achieve, and I'm not sure what it wins you -- whether the files are stored on the filesystem directly or embedded in SQL Server, your application is responsible for implementing the security. Also, "storing the files as encoded strings"? Why do that -- why not store them as BLOBs in an IMAGE field?

How are you implementing authentication? FormsAuthentication? Anyway, I think the simplest thing is to store the files on the filesystem but totally control access to them via ASPX pages -- make them inaccessible without going via your app, i.e. store them outside of your app's virtual directory. The ASPNET account will need appropriate permissions on the file store directory you create -- make sure it doesn't have write permissions anywhere else (it won't have by default, unless you have Everyone FullControl somewhere... which you won't, will you? ;-) )

Then do the thing you suggested, with a directory per user. When the user requests a file, send it to them via a "proxy" page that uses Response.Clear then Response.WriteFile to send it back. Centralise all the file handling code. Make sure it's impossible for the user to use something like directory traversal ("../" in paths) to get at other users' stuff. Never trust any input the user gives you that would allow them to do this - always sanity check it.
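One way to do the sanity check Duncan describes is to fully resolve the path first and only then compare it against the user's root. (A Python sketch; the function name and paths are mine. The same resolve-then-compare idea works in .NET with Path.GetFullPath before handing the result to Response.WriteFile.)

```python
import os.path


def resolve_user_file(user_root, requested):
    """Resolve a user-supplied relative path to a real path inside
    user_root, or raise if it escapes (e.g. via "../" tricks)."""
    root = os.path.realpath(user_root)
    candidate = os.path.realpath(os.path.join(root, requested))
    # commonpath equals root only when candidate sits inside the tree,
    # so any "../" that climbed out of it is caught here.
    if os.path.commonpath([root, candidate]) != root:
        raise PermissionError("path escapes the user's directory")
    return candidate


# A plain request resolves normally; a traversal attempt is rejected:
resolve_user_file("/srv/uploads/alice", "photos/cat.jpg")
try:
    resolve_user_file("/srv/uploads/alice", "../bob/secret.txt")
except PermissionError as e:
    print("blocked:", e)
```

Resolving *before* checking matters: a naive prefix test on the raw string would happily accept "alice/../bob/secret.txt".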


Duncan Smart
Saturday, September 13, 2003

Is there a reason that secure FTP won't solve your problems?


Saturday, September 13, 2003

Second Philo here, although the file depository might be situated on a well-connected co-host. Most people have to upload and download crap using dial-ups and DSLs. And that means timeouts and quality-of-service disruptions because other programs overwhelm the download. Web uploads are one of those not-so-great (but not-so-bad either) ways to upload large files, so you might want to consider secure ftp/scp or something else.

Li-fan Chen
Saturday, September 13, 2003

I 3rd Philo here!

Why not just use the browser? IE5 and IE6 have a great UI for FTP now. If you enable folder view, then users will have difficulty telling the difference between a regular folder on the PC and the FTP one. I use IE6 as my FTP client all the time. Cut and paste and drag and drop all function just like they do in Windows.

Even better is that you don't have to install any software. IE6 will ask for the user name and password.

About the only advantage of some web interface I can think of is automating the process of signing up new users, creating passwords, etc. If this part can be done manually, then you don't need to build an interface.

Also, if you use ftp then control and security is that of windows, and you can restrict users to their own dirs...

Albert D. Kallal
Edmonton, Alberta Canada

Albert D. Kallal
Saturday, September 13, 2003

The reason why I'm not using FTP is that the users must see a nice user interface, without installing anything on their computers.

So, we must do everything through a web site with nice design, step-by-step help, etc.

This is a very important requirement.

Jed Boree
Saturday, September 13, 2003


WebDAV (ie Microsoft Web Folders).

Interfaces already written.

I would just use the existing standard and extend it to your needs.  Microsoft and Apache implementations are already available.

christopher baus
Sunday, September 14, 2003
