Sandboxing an upload zone on a web server
I need to store data via a Perl CGI script on an Apache 2.x Ubuntu server. I want to lock that data down so that only Apache (or the Perl script) has read/write access, and so that it is not reachable from a URL.
One solution is to install a database. I'd like to avoid that dependency and store the data in unique filenames (this approach works for this application).
However, I'm not familiar with a good way to configure this.
What configuration is needed to set up a 'sandbox' directory for file I/O by the webserver, without exposing a URL to that directory?
Place the directory outside of the document root (and outside of any directories aliased into the document root or mapped in via mod_rewrite). Apache and the Perl script can still read and write there (using the full filesystem path and with the correct file permissions, of course), but it will not be accessible via a URL.
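To make that concrete, here is a minimal sketch of the script side. The path /var/lib/myapp/data and the 'payload' parameter are assumptions for illustration; the only requirements are that the directory lives outside the document root and is writable by Apache's user (www-data on Ubuntu).

```perl
#!/usr/bin/perl
# Minimal sketch, assuming a hypothetical data directory /var/lib/myapp/data
# that sits outside /var/www (the usual Ubuntu document root). Prepare it once:
#   sudo mkdir -p /var/lib/myapp/data
#   sudo chown www-data:www-data /var/lib/myapp/data   # Apache's user on Ubuntu
#   sudo chmod 0750 /var/lib/myapp/data
use strict;
use warnings;
use CGI;

my $datadir = '/var/lib/myapp/data';    # no URL maps here, only a filesystem path

my $q       = CGI->new;
my $payload = $q->param('payload') // '';

# The CGI process runs as www-data, which owns the directory, so it can
# create and read files here even though Apache never serves them.
open my $fh, '>', "$datadir/example.dat" or die "cannot write: $!";
print {$fh} $payload;
close $fh or die "close failed: $!";

print $q->header('text/plain'), "stored\n";
```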
There's an important caveat: files outside the document root aren't directly reachable via a URL, but if the Perl script later displays content from those files you still have the usual XSS and information-disclosure concerns. You'll also want to think about how filename "uniqueness" (or the lack of it) affects this design, e.g. a user might deliberately overwrite a file or guess another user's filename.
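On the uniqueness point, one approach (again a sketch, not the only way) is to let File::Temp generate random, non-colliding names inside the sandbox directory rather than deriving names from user input:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my $datadir = '/var/lib/myapp/data';    # same out-of-docroot directory as above
my $content = "validated upload data";  # placeholder; validate before storing or echoing

# The trailing X's are replaced with random characters, so names can't be
# guessed or deliberately collided with; UNLINK => 0 keeps the file after exit.
my ($fh, $path) = tempfile(
    'upload_XXXXXXXXXX',
    DIR    => $datadir,
    SUFFIX => '.dat',
    UNLINK => 0,
);
print {$fh} $content;
close $fh or die "close failed: $!";
```

This gives you unpredictable, collision-free filenames without a database; keep $path (e.g. in a session or index file) if you need to find the data again later.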