31 July 2017

How To Create An HTTP Upload Server

Theory: The idea is to create a file server that supports uploading and downloading files over the Hypertext Transfer Protocol (HTTP).

This can be done either with simple standalone software or with complex Apache/Nginx configurations.

Regarding security and scalability: for all of the entries below, stunnel or Squid can be used to proxy connections to the HTTP server software (or any other service, actually) and handle TLS termination externally. Stunnel can be combined with the EFF's Certbot and the Let's Encrypt CA to support server authentication as well.

To aid in network availability, Cloudflare can provide distributed internet-based caching, or, for local-only systems, Squid can be configured as a cache.
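
For the local-only caching case, a minimal squid.conf sketch might look like the following (the cache sizes and the 192.168.0.0/16 LAN range are assumptions; adjust them for your network):

# /etc/squid/squid.conf -- minimal caching-proxy sketch
http_port 3128
cache_dir ufs /var/spool/squid 1024 16 256
maximum_object_size 512 MB
acl localnet src 192.168.0.0/16
http_access allow localnet
http_access deny all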

So here are some of the available options:


Python's Simple HTTP Server

Downloading files over a LAN is very easy with these Python one-liners (single-threaded):

Pros:

  • Works okay
  • Simple
  • Directory listings

Cons:

  • Single threaded = low network availability
  • Downloading only
  • No user authentication

Installation:

#Windows: install Python from
https://www.python.org/downloads/windows/
#Linux: verify Python is already installed
python -V

#note: binding to port 80 requires root/administrator rights
python -m SimpleHTTPServer 80  #Python 2
#or
python -m http.server 80  #Python 3
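
To download a file from another machine on the LAN, any HTTP client will do. A minimal example (the 192.168.1.10 address and file name are placeholders for your server and file):

#from a client machine
wget http://192.168.1.10/somefile.zip
#or
curl -O http://192.168.1.10/somefile.zip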

Node.js http-server

For an HTTP download server that supports simultaneous file downloads (Node.js is event-driven rather than truly multi-threaded, but it handles concurrent transfers fine), the Node.js implementation is production-ready for small deployments.

Motto: "Serving up static files like they were turtles strapped to rockets."


Pros:

  • Works
  • Simple
  • Directory listings

Cons:

  • Downloading only
  • No user authentication

Installation:

# Install Node.js's package manager "npm"
https://nodejs.org/en/  # Windows
apt-get install npm #Debian
yum install nodejs npm #Fedora 21
dnf install nodejs #Fedora 22

#Debian, Fix node binary using a symbolic link if necessary
which node
which nodejs
ln -s /usr/bin/nodejs /usr/bin/node

#Install for Debian and Windows
npm install http-server -g

#Install for Fedora
yum install 'npm(http-server)'
npm link http-server

#start it
which http-server
http-server --help
http-server -p 80 &

#stop it
ps -A
pkill node
ps -A

#start on boot
which http-server
crontab -e
@reboot /usr/local/bin/http-server /home/user/Public -p 80 
@daily pkill node && /usr/local/bin/http-server /home/user/Public -p 80
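
To confirm the server is actually reachable, a quick header check from another machine works (192.168.1.10 stands in for the server's LAN address):

curl -I http://192.168.1.10/
#expect an "HTTP/1.1 200 OK" line; the body would be the directory listing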

Tip: The Node.js http-server implementation is also excellent at recursively auto-indexing sub-directories mounted via directory junctions from FAT32 volumes, without running into permission issues. Just FYI~


Upload Servers

Uploading over HTTP is a completely different problem, since HTTP was not originally designed for it. There are not a lot of I-am-not-a-web-developer options, but there are some, all of which are varying degrees of broken. The quality of Node.js's http-upload-server is more or less representative of the available sub-Apache options.


Node.js http-upload-server, github

Pros:

  • Simple

Cons:

  • Multi-uploads can crash the server
  • Unreliable
  • No user authentication
  • Uploading Only

Installation:

# Install Node.js's package manager "npm"
https://nodejs.org/en/  # Windows
apt-get install npm #Debian
yum install nodejs npm #Fedora 21
dnf install nodejs #Fedora 22

#Debian, Fix node binary using a symbolic link if necessary
which node
which nodejs
ln -s /usr/bin/nodejs /usr/bin/node

#Install for Debian and Windows
npm install http-upload-server -g

#Install for Fedora
yum install 'npm(http-upload-server)'
npm link http-upload-server

http-upload-server --help

#If it doesn't work, the script probably has Windows (CRLF) line endings. Fix them in vim:
which http-upload-server
apt-get install vim -y
vim /usr/local/bin/http-upload-server
:set ff=unix
Enter
:x
Enter
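
#Alternatively, strip the carriage returns in one step with sed (same fix, no vim needed):
sed -i 's/\r$//' "$(which http-upload-server)"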

http-upload-server --help

#start it
mkdir uploads
cd uploads
http-upload-server --upload-dir /home/user/Public --port 80 &

#stop it
ps -A
pkill node
ps -A

#start on boot
which http-upload-server
crontab -e
@reboot /usr/local/bin/http-upload-server --upload-dir /home/user/Public --port 80 
@daily pkill node && /usr/local/bin/http-upload-server --upload-dir /home/user/Public --port 80 
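
A quick way to exercise the upload path without a browser is curl's multipart form support. The /upload path and "file" field name below are assumptions; check the project's README or the served HTML form for the actual values:

curl -F "file=@test.txt" http://localhost/upload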

Node.js upload-server, github

Pros:

  • Simple
  • Directory listings
  • Custom client-side uploads are possible by using client-side templates that implement AngularJS.
    • What was that about I-am-not-a-web-developer options...?

Cons:

  • Unreliable when uploading larger files (timeout issues).
  • No user authentication

Installation:

# Install Node.js's package manager "npm"
https://nodejs.org/en/  # Windows
apt-get install npm #Debian
yum install nodejs npm #Fedora 21
dnf install nodejs #Fedora 22

#Debian, Fix node binary using a symbolic link if necessary
which node
which nodejs
ln -s /usr/bin/nodejs /usr/bin/node

#Install for Debian and Windows
npm install upload-server -g

#Install for Fedora
yum install 'npm(upload-server)'
npm link upload-server

upload-server --help

#start it
mkdir uploads
cd uploads
mkdir files
upload-server -p 80 &
#the -f switch can be buggy because it insists on prepending the current path
cd ~ && upload-server -p 80 -f files &

#stop it
ps -A
pkill node
ps -A

#start on boot
which upload-server
mkdir /home/user/files
crontab -e
#the -f switch can be buggy because it insists on prepending the current path
@reboot cd /home/user && /usr/local/bin/upload-server -f files -p 80
@daily pkill node && cd /home/user && /usr/local/bin/upload-server -f files -p 80

HFS ~ HTTP File Server

Pros:

  • Works
  • Reasonably simple
  • Customizable directory listings
  • Supports user authentication

Cons:

  • Windows only

Installation:

Download the portable hfs.exe from http://www.rejetto.com/hfs/ and run it; there is no installation step.

Without Apache, the options are: Windows-only HFS, a Node.js implementation that can crash when the wrong button is pressed, or another one that freezes when uploading.


Apache

Apache is a highly configurable web server that supports all the things, very well, but has significant configuration and security overhead. It also scales from small deployments to just before the most congested sites on the internet.

Pros:

  • Supports everything (downloading, uploading, directory listings, authentication) very well
  • Scales from small deployments on up

Cons:

  • Requires complex configuration
  • Permissions are always an issue
  • Authentication is just... x_x

Workflow:

  1. To get uploading working with Apache, it needs PHP.
  2. After that, it is necessary to configure it and activate an Apache "virtual server."
  3. Then there are various hosted jQuery libraries for HTTP uploading like Widen's FineUploader, dropzone and blueimp's jQuery-File-Upload to copy/paste into the web server's root directory. Google "jQuery Upload" for more alternatives.
  4. The server-side PHP that handles the uploaded file is library-specific and has to be implemented separately, but there are some templates available (a minimal sketch follows this list).
  5. Usually, the client-side HTML also needs to be implemented, again usually by copying from templates.
  6. Authentication can be set up after that, and it is just...x_x
  7. Then, after everything is working, Apache needs to be configured (again) with settings that prevent abuse.
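
Regarding step 4: a library's server-side handler ultimately boils down to something like the following minimal PHP. This is an illustrative sketch only, not the blueimp library's actual code; the "file" field name and the files/ target directory are assumptions:

<?php
  //Minimal upload-handler sketch: no auth, no size or type checks.
  //Assumes a multipart POST with a form field named "file".
  $target = __DIR__ . '/files/' . basename($_FILES['file']['name']);
  if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
    echo 'OK';
  } else {
    http_response_code(500);
    echo 'upload failed';
  }
?>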

In other words, Apache is the land of web developers and sysadmins. If this is not complicated enough to scare you away, consider using Nginx instead. It scales better than Apache, with a matching increase in configuration complexity.

Now, let's do this.

Apache Documentation: https://httpd.apache.org/docs/2.4/


Installation:

  1. Install Ubuntu 16.04 LTS or another operating system. The instructions below are for Ubuntu/Debian with username: user and password: Password1.
  2. Give it a static IP address. 192.168.200.2 is used below.
  3. Install SSH if it is not already.
    1. which sshd - Stop here if it is already installed and running.
    2. apt-get install ssh
    3. which sshd
    4. /usr/sbin/sshd -p 22
    5. Add SSH to cron:
      1. crontab -e
      2. @reboot /usr/sbin/sshd -p 22

Note: This setup will also use a static snapshot of blueimp's jQuery-File-Upload (downloaded in the commands below) for simplicity. If you are a web developer, consider using FineUploader instead.

#Login over SSH
Username: `user`
Password: `Password1`

#switch to root
su -
#or
sudo su -

#update and install dependencies
apt-get update
apt-get upgrade -y
apt-get install apache2 php libapache2-mod-php apache2-utils fail2ban nmap aria2 zip -y

#Configure Apache
#Info: Global Configuration Files:
/etc/apache2/apache2.conf
/etc/apache2/ports.conf

#Info: Directory to put site configuration files (mysite.conf) in:
/etc/apache2/sites-available/  #a2ensite then links them into sites-enabled/
#Info: Inside a VirtualHost, DocumentRoot "/var/www/html" sets the site's root
#directory and a matching <Directory "/var/www/html"> block sets its permissions.

#Copy the default config file for site specific configuration.
cp /etc/apache2/sites-available/000-default.conf /etc/apache2/sites-available/mysite.conf

#Enable the new VirtualHost using the a2ensite utility and restart Apache2: 
a2ensite mysite
service apache2 reload

rm /var/www/html/index.html
nano /var/www/html/index.php
#Enter the following into the new file

<?php
  phpinfo();
?>

#save the file and exit nano
CTRL + o
Enter
CTRL + x

#Check to make sure Apache is working on port 80.
nmap localhost

#Then check to make sure php is working:
http://192.168.200.2
#If lots of information about php gets displayed, then it works. If not, fix it before moving on.

#A couple of PHP settings need fixing to support large file uploads.
#In the webpage above, look at the entry for "Loaded Configuration File" and use that path in the next command.
nano /etc/php/7.0/apache2/php.ini

#Use CTRL + w to search for and change the following values
post_max_size = 8M  ->  post_max_size = 8192M
upload_max_filesize = 2M ->  upload_max_filesize = 8192M
#These are max file sizes. Change them to the largest file you are willing to support.
#Keep these settings below 4000M if working with Fat16/Fat32 filesystems.
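#Assumption (depends on the PHP defaults in use): very large uploads can also
#hit max_execution_time and memory_limit; raise those in the same php.ini if
#multi-GB uploads time out.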

#Tell Apache to reload config files.
service apache2 reload

#Change current directory to web server's root
cd /var/www/html

#Cleanup the root of the web server.
ls -la
rm -r *
ls -la

#Download the standard uploader software:
aria2c https://github.com/gdiaz384/jQuery-File-Upload/releases/download/v9.18.0-standalone/jQuery-File-Upload-9.18.0-standalonePHP.zip

#Or, if normal users should not be able to delete files, then try this other template instead of the one above:
aria2c https://github.com/gdiaz384/jQuery-File-Upload/releases/download/v9.18.0-standalone/jQuery-File-Upload-9.18.0-standalonePHP-noDelete.zip
#Be sure to modify the commands below if using the noDelete.zip version, instead of copy-pasting.

#extract software
unzip jQuery-File-Upload-9.18.0-standalonePHP.zip

#move the software to home directory as a backup
mv jQuery-File-Upload-9.18.0-standalonePHP.zip ~

#fix permissions
chmod  0755 -R ../html
chown www-data:www-data -R ../html

  • Try uploading a file at http://192.168.200.2. Test with a small image file, a music file, and a very large (2GB+) file. A command-line test is sketched after this list.
  • As configured above, images may produce an inconsequential error, but uploads will otherwise still work for arbitrary files.
  • Clients can get a static directory listing of all available files at http://192.168.200.2/server/php/files, or any other directory below the web server's root for that matter. This is probably not a good thing.
  • The path from the web server's root where the files are stored is /server/php/files.
  • Assuming a web server root of /var/www/html, the complete path is /var/www/html/server/php/files. Attempting to change the server/php/files portion in the jQuery config files breaks everything, so don't.
  • Alternatives:
    • Use symbolic links (ln --help).
    • Mount another volume or an NFS share at the files directory (see mount and /etc/fstab).
    • Change the web server root itself in the Apache configuration.
    • Unzip the entire jQuery .zip archive into a subdirectory like /var/www/html/mp3. This works for multiple instances.
  • The client-side file is index.html. It is normal HTML and can be edited as such. Feel free to customize it.
  • If you are a web developer, consider implementing FineUploader instead of using the standalone template. Part 1 of their multi-part "Quick-start" alone, "Getting Started with Fine Uploader," is 20 pages long. Good luck.
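
The browser test above can also be scripted. The server/php/ endpoint and the files[] field name are blueimp's defaults and are assumed to be unchanged in this snapshot:

#upload test.txt via a multipart POST
curl -F "files[]=@test.txt" http://192.168.200.2/server/php/
#confirm it landed
ls -la /var/www/html/server/php/files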

With uploading working, it is now time to implement per user authentication.

Apache Authentication

Authentication is actually a very hard problem to solve.

The instructions below are for Basic HTTP Authentication, which means Chrome does not work without a second authentication prompt: WebKit-derived browsers (like Safari, Opera, and Chromium) violate RFC 7617 by using per-directory authentication as opposed to domain-level authentication. Only Firefox handles authentication to spec, treating the session created for / (root) as the same session as the one for server/php/files.

Typically, this is solved using cookies, an external authentication provider like WebAuth, one of the next-gen authenticators like OAuth or SQRL, or by copy/pasting config files and hoping for the best.

That is a bit much for this setup.

In other words, Firefox works, but uploading large files in Chrome will probably fail for files that take more than a few seconds to upload. This can be very annoying if that large file took several minutes or hours to upload. Workaround: Use Firefox or implement better authentication.

User-Specific Authentication Docs: https://httpd.apache.org/docs/2.4/howto/auth.html

Add new users using htpasswd:

The first time the utility is used, the -c option is needed to create the specified file.
Syntax:

htpasswd -c /etc/apache2/.htpasswd admin
htpasswd /etc/apache2/.htpasswd user

Repeat the second command with different usernames to support more users.

nano /etc/apache2/apache2.conf
Change the following from:

<Directory /var/www/>
        Options Indexes FollowSymLinks
        AllowOverride None
        Require all granted
</Directory>

#<Directory /srv/>
#       Options Indexes FollowSymLinks
#       AllowOverride None
#       Require all granted
#</Directory>

To:

<Directory /var/www/>
        Options Indexes FollowSymLinks
        AllowOverride None
        Require all granted
</Directory>

<Directory /var/www/html/>
        AllowOverride None
        AuthType Basic
        AuthName "Restricted"
        AuthUserFile /etc/apache2/.htpasswd
        Require valid-user
</Directory>

#<Directory /srv/>
#       Options Indexes FollowSymLinks
#       AllowOverride None
#       Require all granted
#</Directory>
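
Then tell Apache to reload its configuration so the authentication settings take effect:

service apache2 reload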

Go to http://192.168.200.2 to test in both Chrome and Firefox.


With user-side authentication sort-of working, at least in Firefox, it is time to get connection privacy working. Connection authentication can be done after that, and together they provide a measure of security.

Note: To keep things server software agnostic, Stunnel will be used for the purposes of this tutorial but Apache does support native TLS privacy and Certbot does support Apache for server-side authentication.

On Using Stunnel + Certbot for TLS

Theory: The idea is that some server already exists on port 80, and it would be nice if TLS on port 443 worked too.

Documentation: https://www.stunnel.org/docs.html and https://certbot.eff.org/docs/

Local Server Prerequisites:

  1. Find your public IP: https://icanhazip.com
  2. Sign up for a domain from afraid.org or buy one from hover.com and point the DNS entry to the public IP.
    • Hint: Hover supports the .moe TLD.
  3. Forward TCP port 443 to the server.

Installation

apt-get update
apt-get install software-properties-common
add-apt-repository ppa:certbot/certbot
apt-get update
apt-get install certbot
which certbot


#If there is a web server on the system running already
certbot certonly --rsa-key-size 4096 --webroot -w /var/www/example -d example.com -d www.example.com -w /var/www/thing -d thing.is -d m.thing.is

#if there is no web server on the local system but port 443 is publically available
/usr/bin/certbot certonly --allow-subset-of-names --standalone --rsa-key-size 4096 --preferred-challenges tls-sni-01 --no-self-upgrade --agree-tos --email your.email.here@example.invalid -d example.com -d www.example.com,blog.example.com

#if there is no web server on the local system but port 80 is publically available
/usr/bin/certbot certonly --allow-subset-of-names --standalone --rsa-key-size 4096 --preferred-challenges http-01 --no-self-upgrade --agree-tos --email your.email.here@example.invalid -d example.com -d www.example.com,blog.example.com

#test renewal
certbot renew --dry-run

#if renew works, then add to crontab, change "certonly" to "renew" and add "--non-interactive --quiet" 
#Only use a pre-hook and post-hook if necessary.
crontab -e
@daily /usr/bin/certbot renew --allow-subset-of-names --standalone --rsa-key-size 4096 --preferred-challenges tls-sni-01 --no-self-upgrade --non-interactive --agree-tos --quiet --email your.email.here@example.invalid --pre-hook "sudo service stunnel4 stop" --post-hook "sudo service stunnel4 start"

ls /etc/letsencrypt/live
ls /etc/letsencrypt/live/example.com/
cat /etc/letsencrypt/live/example.com/README

The important files are privkey.pem and fullchain.pem; the others can be ignored.
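
To double-check what was issued, inspect the certificate directly (assuming the example.com live directory from above):

openssl x509 -in /etc/letsencrypt/live/example.com/fullchain.pem -noout -subject -dates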

And now for stunnel.

apt-get install stunnel

which stunnel4
ls /etc/stunnel
cat /etc/stunnel/README
cp /usr/share/doc/stunnel4/examples/stunnel.conf-sample /etc/stunnel/stunnel.conf
nano /etc/stunnel/stunnel.conf

Use ; to comment out gmail pop/imap/smtp settings.

Then change:

;TLS front-end to a web server
;[https]
;accept = 443
;connect = 80
;cert= /etc/stunnel/stunnel.pem

To:

cert=/etc/letsencrypt/live/mywebsite.com/fullchain.pem
key=/etc/letsencrypt/live/mywebsite.com/privkey.pem

ciphers = ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-RSA-AES128-SHA256:ECDHE-RSA-AES128-SHA:AES256-GCM-SHA384:AES128-GCM-SHA256:AES256-SHA256:AES128-SHA256:AES128-SHA
options = CIPHER_SERVER_PREFERENCE
renegotiation=no

;TLS front-end to a web server
[uploadserver]
accept = 443
connect = 127.0.0.1:80

Enable stunnel (config)

nano /etc/default/stunnel4
ENABLED=1

Start the actual service:
/etc/init.d/stunnel4 start
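
Verify that TLS termination actually works before exposing it (mywebsite.com is the placeholder domain from the config above):

openssl s_client -connect 127.0.0.1:443 -servername mywebsite.com < /dev/null
#look for the certificate chain and the "Verify return code" line in the output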

Certbot + stunnel autostart and autorenew:

crontab -e
@reboot /etc/init.d/stunnel4 start
@daily /usr/bin/certbot renew --allow-subset-of-names --standalone --rsa-key-size 4096 --preferred-challenges tls-sni-01 --no-self-upgrade --non-interactive --agree-tos --quiet --email your.email.here@example.invalid --pre-hook "sudo service stunnel4 stop" --post-hook "sudo service stunnel4 start"

Go to https://192.168.200.2 to test. Expect a certificate error because the domain on the certificate does not match the raw IP address, but examining the certificate should show the correct domain from afraid.org.

TODO: Security touch-ups for Apache (a partial config sketch follows the list):

  • Header stripping (for uploaded .js files).
  • Prevent execution of uploaded files by forcing them to be downloaded instead.
  • Strip Apache and OS version information from indexes and error pages.
  • Restrict automatic indexing of subdirectories and redirect to root (for uploaded index.php files).
  • Automatic redirection to TLS on 443 (if applicable) whenever plain HTTP is requested on either 80 or 443.
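
A partial, untested sketch covering a few of these items. The directives are standard Apache (the Header lines require mod_headers: a2enmod headers), but treat this as a starting point, not a vetted hardening config:

#in /etc/apache2/conf-enabled/security.conf
ServerTokens Prod
ServerSignature Off

#added to mysite.conf: never execute uploaded files, force them to download
<Directory /var/www/html/server/php/files>
    Options -Indexes
    SetHandler default-handler
    Header set Content-Disposition attachment
    Header set X-Content-Type-Options nosniff
</Directory>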
