Monday, April 29, 2013

Designing a good security policy for your websites

Recently, I spent some time on web server security and on analysing the security of forums based on phpBB. Although I agree that humans are the weakest link in the security chain, there are a few measures a web developer can take to prevent malicious users from exploiting their website.

1) The login page (or any other page that requires authentication) should use HTTPS. The best approach is to use HTTPS throughout your website.
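
For example (a minimal sketch in PHP; the header values shown are just one reasonable policy), plain-HTTP requests can be bounced to HTTPS and browsers told to stay on HTTPS:

    <?php
    // Redirect any plain-HTTP request to its HTTPS equivalent.
    if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
        header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
        exit;
    }
    // Ask browsers to use HTTPS for all future visits (HSTS).
    header('Strict-Transport-Security: max-age=31536000; includeSubDomains');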

2) Set secure cookies and check them on each request from the user. This goes hand in hand with step 1.
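
In PHP, for instance, the session cookie can be marked Secure (sent only over HTTPS) and HttpOnly (hidden from JavaScript); a rough sketch:

    <?php
    // Must run before session_start(): flag the session cookie Secure and HttpOnly.
    session_set_cookie_params(0, '/', '', true, true);
    session_start();

    // The same flags apply to any custom cookie you set yourself.
    $token = bin2hex(openssl_random_pseudo_bytes(16));
    setcookie('remember_me', $token, time() + 30 * 86400, '/', '', true, true);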

3) Limit the maximum number of login attempts (say 5). Also, use techniques like an exponentially increasing delay after each failed login attempt.
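
A minimal sketch of the idea in PHP, assuming a hypothetical get_failed_attempts() helper that reads the per-account failure count from the database:

    <?php
    $max_attempts = 5;
    $username = isset($_POST['username']) ? $_POST['username'] : '';
    $failures = get_failed_attempts($username);   // hypothetical helper

    if ($failures >= $max_attempts) {
        exit('Too many failed logins. Please reset your password.');
    }

    // Exponential delay: 1s, 2s, 4s, 8s ... (capped) after each failure.
    if ($failures > 0) {
        sleep(min(pow(2, $failures - 1), 30));
    }
    // ... then verify the credentials as usual.

In a real deployment, storing a "next allowed attempt" timestamp in the database is kinder to the server than sleep(), which keeps a worker process busy.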

4) Deploy CAPTCHA verification on pages that require authentication (most likely coupled with the login page). The CAPTCHA should be account based rather than IP based: that way, step 3 still stops login attempts after a few incorrect tries regardless of which IPs the attacker uses against the account, which blunts distributed brute-force attempts.
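
One way to key the CAPTCHA to the account (sketch; get_failed_attempts() and is_captcha_valid() stand in for your own helper and whatever CAPTCHA library you use):

    <?php
    // Require a CAPTCHA once this account has failed a few logins,
    // no matter which IP the attempts came from.
    $failures = get_failed_attempts($_POST['username']);
    if ($failures >= 3 && !is_captcha_valid($_POST['captcha'])) {
        exit('Please solve the CAPTCHA to continue.');
    }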

5) After the maximum number of invalid login attempts has been reached, a web developer can go with one of two approaches:
     i) Deactivate the account and send a password reset link to the user's registered email address; or
    ii) Present a security question (set up earlier by the user) and follow the deactivation step if this too fails.
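
A rough outline of approach (ii), falling back to deactivation (all helper names here are placeholders):

    <?php
    $username = $_POST['username'];

    if (get_failed_attempts($username) >= 5) {
        if (check_security_answer($username, $_POST['answer'])) {
            reset_failed_attempts($username);   // correct answer: unlock and continue
        } else {
            deactivate_account($username);
            send_password_reset_email($username);
            exit('Account deactivated. A password reset link has been emailed to you.');
        }
    }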

6) Set a password complexity policy (make digits, mixed-case letters and special characters mandatory, along with a minimum password length).
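
For example, a PHP check requiring at least 8 characters with a digit, mixed-case letters and a special character (the exact thresholds are up to you):

    <?php
    function is_strong_password($password)
    {
        return strlen($password) >= 8                      // minimum length
            && preg_match('/[0-9]/', $password)            // at least one digit
            && preg_match('/[a-z]/', $password)            // a lower-case letter
            && preg_match('/[A-Z]/', $password)            // an upper-case letter
            && preg_match('/[^a-zA-Z0-9]/', $password);    // a special character
    }

    if (!is_strong_password($_POST['password'])) {
        exit('Password does not meet the complexity policy.');
    }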

7) Change the session ID on each request (for websites that require extra security). Also, make sure that the session ID, or any other session information, is never part of the query string; HTTPS helps, but IDs placed in URLs can still leak through browser history, server logs and Referer headers, so keep them in cookies.
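
In PHP this is essentially two calls; the sketch below regenerates the ID on every request and keeps it out of the URL:

    <?php
    ini_set('session.use_only_cookies', 1);   // never accept or emit session IDs in the URL
    session_start();
    session_regenerate_id(true);              // fresh ID for this request, old one discarded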

8) Force logout after a fixed period of inactivity.
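
A common way to do this is to stamp each request and compare on the next one (sketch; the 15-minute window is arbitrary):

    <?php
    session_start();
    $timeout = 15 * 60;   // 15 minutes of inactivity

    if (isset($_SESSION['last_activity'])
            && time() - $_SESSION['last_activity'] > $timeout) {
        session_unset();
        session_destroy();
        header('Location: /login.php?timed_out=1');
        exit;
    }
    $_SESSION['last_activity'] = time();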

9) Make extra authentication checks for administrator logins.
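
For example, admin pages can demand a recent re-authentication on top of the normal session (sketch; the session keys and /admin_login.php path are illustrative):

    <?php
    session_start();

    // Require the administrator to have re-entered their password
    // within the last 10 minutes before serving any admin page.
    $recently_verified = isset($_SESSION['admin_verified_at'])
        && time() - $_SESSION['admin_verified_at'] < 10 * 60;

    if (empty($_SESSION['is_admin']) || !$recently_verified) {
        header('Location: /admin_login.php');
        exit;
    }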

10) Never trust user input. Always validate it.
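
Two concrete habits that follow from this (sketch using PDO; the table, column and credentials are placeholders): parameterize anything that touches the database, and escape anything echoed back into HTML:

    <?php
    // Parameterized query: the user-supplied value never becomes part of the SQL text.
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
    $stmt = $pdo->prepare('SELECT id, name FROM users WHERE email = ?');
    $stmt->execute(array($_GET['email']));

    // Escape output to block XSS.
    echo htmlspecialchars($_GET['email'], ENT_QUOTES, 'UTF-8');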

11) Maintain a blacklist (or whitelist) of users (IP based or whatever suits you best) to block malicious users who try to attack your website. Blocking an address forever is not recommended; temporary bans that expire are usually enough.
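
A very simple IP-based version (sketch; in practice the list would live in a database with an expiry per entry rather than a hard-coded array):

    <?php
    $blacklist = array('203.0.113.7', '198.51.100.23');   // example addresses

    if (in_array($_SERVER['REMOTE_ADDR'], $blacklist, true)) {
        header('HTTP/1.1 403 Forbidden');
        exit('Access temporarily blocked.');
    }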

12) Understand and write your robots.txt with diligence.
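
For instance, a phpBB board might keep crawlers out of the admin and member-only pages while leaving the forum itself indexable (the paths below are typical phpBB defaults; adjust them to your install):

    User-agent: *
    Disallow: /adm/
    Disallow: /ucp.php
    Disallow: /memberlist.php

Keep in mind that robots.txt is advisory and publicly readable, so never rely on it to hide sensitive URLs.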

13) Use vulnerability scanners to check your website for security loopholes. Some good ones are JSky, Acunetix and w3af. Fix the issues these scanners report, then re-scan (ideally with a different scanner as well) to confirm the fixes.

14) Take regular backups of your website. In case of a security incident, migrating to another web server or restoring the site will be seamless with proper backups.

15) Monitor your website traffic and statistics on a regular basis. Watch out for unusual traffic (by IP, location, pages requested and so on). Use Google Webmaster Tools and Google Analytics to aid you in the development and monitoring process.

These are just my thoughts. Feedback or any other additions to the policy listed above are appreciated.

Technorati Tags: security, Policy, Web security, phpBB

6 comments:

  1. Nice collection of tips. Absolutely loved the way you put point no 10 :D

  2. Nice collection of tips. If an application runs on the web, security becomes an important issue and cannot be ignored at all. I would elaborate on a few tips.

    Point No 6: Some kind of one-way hashing, such as SHA or MD5, must always be employed when storing passwords.

    Point No 10: This couldn't have been put better. It is a small point, yet very important. As a general rule, if there is any input from the keyboard (virtual or physical), it must be validated for at least two things: Cross-Site Scripting (XSS) and SQL injection.

    Point No 15: There is another new tool called "New Relic" which is also impressive.

    Replies
    1. Thanks for the reply.

      To expand on point 6: I would also add a salt with the hash, since MD5 and SHA-1 have been broken. As of today, SHA-2 with a per-user salt would provide decent security.

      For point 10: I would also validate any GET parameters used in PHP include statements and in the queries that hit my database. This way I prevent SQL injection (and file inclusion) through parameters where user input is not normally expected.

      Thanks for adding info about New Relic.

    2. Right :) Also, sometimes people think that putting added security checks on their site may slow it down. In such cases they can use compression algorithms like gzip so that compressed web pages are transferred over the network (all modern browsers can decompress web pages compressed with gzip).

    3. Nice point. I think this will encourage more developers to implement security checks on their websites without taking a performance hit.
