Recently, I went through web server security and the security analysis of phpBB-based forums. Although I agree that humans are the weakest link in the security chain, there are a few measures a web developer can take to prevent malicious users from exploiting their website.
1) The login page (or any other page that requires authentication) should use HTTPS. The best approach is to use HTTPS throughout your website.
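As a minimal sketch of forcing HTTPS at the application level (assuming the PHP script sees the request directly and is not behind a reverse proxy), something like this could sit at the top of a common include:

```php
<?php
// Redirect plain-HTTP requests to HTTPS and advertise HSTS.
// Assumes the app is served directly; behind a proxy or load balancer
// you would check a header such as X-Forwarded-Proto instead.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
    header('Location: ' . $target, true, 301);
    exit;
}
header('Strict-Transport-Security: max-age=31536000; includeSubDomains');
```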
2) Set secure cookies (the Secure and HttpOnly flags) and check them on each request from the user. This goes hand in hand with step 1.
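A rough PHP sketch of this step; the cookie name and token value are only examples:

```php
<?php
// Flag the session cookie Secure (sent over HTTPS only) and HttpOnly
// (not readable from JavaScript) before the session is started.
// Arguments: lifetime, path, domain, secure, httponly.
session_set_cookie_params(0, '/', '', true, true);
session_start();

// Any additional cookie should carry the same flags.
$token = bin2hex(random_bytes(32));   // example value; yours would also be stored server-side
setcookie('remember_token', $token, time() + 14 * 86400, '/', '', true, true);
```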
3) Limit the maximum number of login attempts (say 5). Also, use techniques like an exponential delay after each failed login attempt.
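One possible sketch of this step; the helper name is hypothetical and the failure count is assumed to come from your user store. (A stored "next attempt allowed at" timestamp scales better than sleeping inside the request, but the sleep keeps the idea visible.)

```php
<?php
// Hypothetical helper: allow at most $maxAttempts tries per account and
// delay exponentially (2s, 4s, 8s, ...) after each failure.
function throttle_login(int $failedAttempts, int $maxAttempts = 5): bool
{
    if ($failedAttempts >= $maxAttempts) {
        return false;                    // locked out -- hand over to step 5
    }
    if ($failedAttempts > 0) {
        sleep(2 ** $failedAttempts);     // exponential delay before checking the password
    }
    return true;                         // this attempt may be processed
}
```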
4) Deploy CAPTCHA verification on pages that require authentication (ideally coupled with the login form). The CAPTCHA and the attempt limits should be account based rather than IP based; that way step 3 stops login attempts after a few incorrect tries regardless of which IP addresses are used, which also blunts distributed brute-force attacks against an account.
5) After the maximum number of invalid login attempts has been reached, a web developer can take one of two approaches.
i) Deactivate the account and send a password reset link to the user's registered email address.
OR
ii) Present a security question (set earlier by the user) and fall back to the deactivation step if this also fails.
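A sketch of approach (i); deactivate_account(), generate_reset_token() and the reset URL are hypothetical names standing in for your own user-management code:

```php
<?php
// Lock the account and email a one-time password reset link.
function lock_and_reset(string $email): void
{
    deactivate_account($email);               // hypothetical helper
    $token = generate_reset_token($email);    // hypothetical helper
    $link  = 'https://example.com/reset.php?token=' . urlencode($token);
    mail($email, 'Account locked',
         "Too many failed login attempts. Reset your password here: $link");
}
```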
6) Enforce a password complexity policy (require digits, mixed-case letters and special characters, along with a minimum password length).
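A minimal check for this step; the exact length and character classes are assumptions, so adjust them to your own policy:

```php
<?php
// Require a minimum length plus at least one digit, one lower-case letter,
// one upper-case letter and one special character.
function password_is_strong(string $password, int $minLength = 10): bool
{
    return strlen($password) >= $minLength
        && preg_match('/\d/', $password)
        && preg_match('/[a-z]/', $password)
        && preg_match('/[A-Z]/', $password)
        && preg_match('/[^a-zA-Z\d]/', $password);
}
```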
7) Change the session ID on each request (for websites that require extra security). Also, make sure that the session ID or any other session information is never part of the query string (this is less of a problem if HTTPS is used throughout the website, but URLs still end up in logs, bookmarks and browser history).
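PHP's built-in session handling covers both parts of this step; a sketch:

```php
<?php
// Never accept or emit session IDs in the URL, and rotate the ID on
// every request so a captured ID goes stale quickly.
ini_set('session.use_only_cookies', '1');
ini_set('session.use_trans_sid', '0');
session_start();
session_regenerate_id(true);   // true = delete the old session data file
```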
8) Force logout after a fixed period of inactivity.
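This can be done with a timestamp stored in the session; the 15-minute limit and the /login.php URL below are only example values:

```php
<?php
// Run on every authenticated request: log the user out after a fixed idle period.
session_start();
$idleLimit = 15 * 60;   // example value: 15 minutes
if (isset($_SESSION['last_seen']) && time() - $_SESSION['last_seen'] > $idleLimit) {
    session_unset();
    session_destroy();
    header('Location: /login.php?timeout=1');   // example login URL
    exit;
}
$_SESSION['last_seen'] = time();
```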
9) Perform extra authentication checks for administrator logins.
10) Never trust user input. Always validate it.
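Validation plus parameterised queries keep user data out of the SQL string entirely; a sketch assuming an existing PDO connection in $pdo:

```php
<?php
// Validate the input format first, then bind it instead of concatenating.
$email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);
if ($email === false || $email === null) {
    exit('Invalid email address');
}

// $pdo is assumed to be an existing PDO connection.
$stmt = $pdo->prepare('SELECT id FROM users WHERE email = :email');
$stmt->execute([':email' => $email]);
$userId = $stmt->fetchColumn();
```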
11) Maintain a blacklist (or whitelist) of users (IP based or whatever suits you best) to block malicious users who try to attack your website. Blocking an address forever is not recommended, since IPs are often reassigned to other users.
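A very small sketch of a temporary, IP-based block for this step; in practice the list would live in your database rather than in the script, and the addresses and dates are just examples:

```php
<?php
// Map of blocked IPs to the timestamp at which the block expires.
$blacklist = [
    '203.0.113.7'  => strtotime('2026-01-31 00:00:00'),   // example entry
    '198.51.100.4' => strtotime('2026-01-15 12:00:00'),   // example entry
];
$ip = $_SERVER['REMOTE_ADDR'];
if (isset($blacklist[$ip]) && time() < $blacklist[$ip]) {
    http_response_code(403);
    exit('Access temporarily blocked');
}
```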
12) Understand robots.txt and write yours with diligence; remember that it is publicly readable and purely advisory, so never rely on it to hide sensitive paths.
13) Use vulnerability scanners to check your website for security loopholes. Some good ones are JSky, Acunetix and w3af. Fix the issues these scanners report, then re-scan with other scanners to confirm.
14) Take regular backups of your website. With proper backups, migrating to another web server or restoring the site after a security incident will be seamless.
15) Monitor your website traffic and statistics on a regular basis. Watch out for unusual traffic (by IP, location, requested page and so on). Use Google Webmaster Tools and Google Analytics to aid you in the development and monitoring process.
These are just my thoughts. Feedback or any other addition to the above listed policy is appreciated.
Technorati Tags: security, Policy, Web security, phpBB