When setting up a secure website, system administrators and webmasters often skip very basic tasks that would go a long way toward shoring up the web server. Here are 4 and a half simple tips to secure your web server, make it easier to monitor, and keep it from sticking out like a sore thumb during a security audit.
1.) There are known security vulnerabilities and weaknesses in some SSL versions and encryption ciphers. SSLv2, along with all weak and export-grade SSL ciphers, should be disabled. In addition to being good overall security practice, this is also mandated by the PCI Data Security Standard (requirement 4.1). It can be done easily in Apache by adding the following lines to your config file:
#Disable SSLv2 and weak/export grade ciphers
SSLCipherSuite ALL:+HIGH:+MEDIUM:!SSLv2:!EXP:!eNULL
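If it helps to see that directive in context, here is a minimal sketch of an SSL-enabled virtual host. The hostname and certificate paths are placeholders for your own values, and the SSLProtocol line is an extra, assumed hardening step that switches off the SSLv2 protocol itself rather than just its ciphers:

#Minimal sketch of an SSL virtual host (hostname and cert paths are placeholders)
<VirtualHost *:443>
    ServerName www.example.com
    SSLEngine on
    #Assumed extra step: disable the SSLv2 protocol entirely
    SSLProtocol all -SSLv2
    SSLCipherSuite ALL:+HIGH:+MEDIUM:!SSLv2:!EXP:!eNULL
    SSLCertificateFile /etc/ssl/certs/example.crt
    SSLCertificateKeyFile /etc/ssl/private/example.key
</VirtualHost>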
2.) When hosting a secure portal-ish site where the landing page is simply a login page, I like to force SSL without requiring the user to remember that the site is SSL only. This is easily accomplished in Apache with a rewrite rule, which lets the server keep listening for regular HTTP requests but automatically rewrite them to HTTPS. Adding the following to your Apache config file will achieve this behavior:

#Redirect to SSL
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^/(.*) https://%{SERVER_NAME}%{REQUEST_URI} [R]
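For reference, these rules normally live in the plain-HTTP virtual host, and mod_rewrite has to be enabled for them to work. A minimal sketch, with the hostname as a placeholder, might look like this:

#Minimal sketch: send everything on the plain-HTTP vhost to HTTPS
#(hostname is a placeholder; requires mod_rewrite)
<VirtualHost *:80>
    ServerName www.example.com
    RewriteEngine On
    RewriteCond %{HTTPS} !=on
    RewriteRule ^/(.*) https://%{SERVER_NAME}%{REQUEST_URI} [R]
</VirtualHost>

Inside a port-80 virtual host the %{HTTPS} condition always matches, so it is somewhat redundant there, but it is harmless and makes the intent of the rule obvious.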
3.) TRACE and TRACK are little-known HTTP request methods that allow you to debug HTTP problems. They are very seldom (if ever) used, and there are a few known Cross Site Scripting (XSS) vulnerabilities related to them. Leaving them enabled is a finding that almost every automated security scanner in the world will report, and it can also lead to failed security audits. Because of this, it's best to disable them. Again we can use Apache rewrite rules, by adding the following lines to the Apache config file:

#Disable TRACE & TRACK Methods
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK)
RewriteRule .* - [F]
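These two lines assume RewriteEngine On is already set, as in the previous tip. Once they are in place, one quick way to spot-check the change is to send a TRACE request by hand and confirm that Apache answers with 403 Forbidden (the hostname below is a placeholder):

#Spot-check: a TRACE request should now come back 403 Forbidden
curl -v -X TRACE http://www.example.com/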
4.) Monitoring application logs is an essential part of any security program. Often your access and error logs will be polluted with error messages complaining that the "robots.txt" file was not found. This file is simply a list of rules that a search engine spider should follow when crawling your site, and it is the first thing an automated crawler requests each time it visits. To keep these errors out of your logs while still allowing everything to be crawled, create a simple text file named "robots.txt" and place it in the root of your web directory. The contents of the file should be:

User-agent: *
Disallow:
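If you want to drop that file in place from the command line, something like the following would do it; the /var/www/html path is an assumption, so substitute your actual document root:

#Create a permissive robots.txt in the web root (path is an assumption)
cat > /var/www/html/robots.txt <<'EOF'
User-agent: *
Disallow:
EOF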
4.5) The next thing you will see constantly polluting your error logs is failed requests for a file named "favicon.ico". This is the small logo you see in your browser's address bar when you visit some sites, and next to the entry when you bookmark that same site. The browser requests this file at the beginning of EVERY visit to your site, so the failed requests can quickly fill up your log files! An easy fix is to copy a blank favicon.ico into the root of your web directory. Alternatively, if you are feeling especially creative, you can build a custom favicon with one of the many online favicon generators.
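The quickest placeholder is simply an empty file, which is enough to stop the 404s from showing up in the error log. Again, the /var/www/html path is an assumption, so substitute your actual document root:

#Create an empty placeholder favicon in the web root (path is an assumption)
touch /var/www/html/favicon.ico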