Predicting the fate of the CAPTCHA system is not a tough job these days, as most CAPTCHAs are easily broken. But this has not stopped web developers from using CAPTCHAs on web pages, e-mail forms, and so on.
Used wisely, the CAPTCHA system can stop bots without making sign-ups hard for genuine users.
From a user’s perspective, it is undoubtedly true that no one likes seeing CAPTCHAs. Not everyone can use a CAPTCHA, especially those with impaired vision. It can also cause trouble if graphics are not enabled. Poorly implemented CAPTCHAs may slow down a site’s registration process, thereby resulting in fewer registrations than expected.
The worst part of using the CAPTCHA system is that it places the entire burden on the user’s shoulders. Hence, according to experts, CAPTCHA should be the last choice, not the first.
Features of a Good CAPTCHA System
Recent findings suggest that many hacking attempts and bots can be prevented without relying on the aid of CAPTCHAs at all.
Here are some tips to stop spoofing attempts.
1. Validate everything server-side
Validate every field in server-side code, even if you are confident in your client-side validation. Be careful about the fields you place in email headers. Email addresses are possibly the most significant values to check: use a good regular expression and look out for HTML tags, SQL injection, or return characters (\n and \r in PHP).
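A minimal sketch of such a server-side check in Python. The regular expression is an illustrative assumption, not a full RFC-compliant address parser; the key points are that the check runs on the server and that carriage returns and newlines are rejected to block email header injection:

```python
import re

# Illustrative pattern only; real-world address validation is looser.
EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def is_safe_email(value: str) -> bool:
    """Reject header-injection attempts (\\n, \\r) and malformed addresses."""
    if "\n" in value or "\r" in value:
        return False
    return bool(EMAIL_RE.match(value))
```

A value like `"alice@example.com\r\nBcc: spam@evil.test"` fails this check, which is exactly the kind of input used to inject extra recipients into mail headers.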
2. Is there a spam-like content?
It can be observed that most spam messages include links to websites. If a submission contains links where you would not expect any, it may well come from a spam bot. A third-party tool like Akismet (https://akismet.com/) can help you here.
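Before reaching for a third-party service, a very rough heuristic along these lines can be sketched locally. The pattern and the zero-link threshold are assumptions for a field where links are never expected:

```python
import re

# Matches common link forms; adjust the threshold per field.
LINK_RE = re.compile(r"https?://|www\.", re.IGNORECASE)

def looks_spammy(message: str, max_links: int = 0) -> bool:
    """Flag messages that contain more links than this field should allow."""
    return len(LINK_RE.findall(message)) > max_links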
3. Beware of the rogue POST and GET values
Is your form expecting three POSTed fields? Do you see a fourth one? Be careful, as it can indicate a hacking attempt. Also make sure that no additional GET values have been passed.
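A sketch of this comparison, assuming a hypothetical contact form with three fields; the field names are illustrative:

```python
# The exact set of field names the form renders; anything else is rogue.
EXPECTED_FIELDS = {"name", "email", "message"}

def has_rogue_fields(posted: dict) -> bool:
    """True if the POSTed field names differ from what the form renders."""
    return set(posted) != EXPECTED_FIELDS
```

Using strict set equality also catches *missing* fields, which some bots omit when they replay a recorded request.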
4. Check HTTP header
Simple spam bots hardly ever set a user agent (HTTP_USER_AGENT) or a referring page (HTTP_REFERER). Make sure that the referrer is the page where the form is located.
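A sketch of this header check, assuming a WSGI-style environ dictionary. Note that privacy-conscious browsers and proxies can also strip the Referer header, so a failure here should count as a signal rather than proof of a bot:

```python
def headers_look_human(environ: dict, form_url: str) -> bool:
    """Require a user agent and a referrer pointing at the form's own page."""
    if not environ.get("HTTP_USER_AGENT"):
        return False
    return environ.get("HTTP_REFERER", "").startswith(form_url)
```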
5. Use a honeypot field
Spambots usually fill in every form field so that they pass basic validation.
Create a honeypot form field that should be left blank. Use CSS to conceal it from human users, but not from bots. When the form is submitted, check whether the value of that field is still blank.
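The server-side half of the honeypot can be sketched as follows; `"website"` is a hypothetical field name, hidden from humans via CSS but visible to bots parsing the raw HTML:

```python
def passes_honeypot(posted: dict, honeypot_field: str = "website") -> bool:
    """A human never sees the hidden field, so it must come back empty."""
    return posted.get(honeypot_field, "") == ""
```

Naming the trap field something tempting (e.g. `website` or `url`) makes it more likely that a bot fills it in.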
6. Detect the presence of JavaScript
Can the page run JavaScript? If so, you are on the safer side, because it usually takes a human user loading the page in a browser. The majority of bots cannot interpret the JavaScript present on a website: developing a JavaScript interpreter is far more difficult than developing a crawler that follows HTML links.
A standard browser driven by a human loads the HTML of the page as well as its other resources, including those served by JavaScript. Bots just load the HTML of the page, and perhaps the graphics, but nothing more.
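One common way to apply this is a hidden input that a small script fills in after page load, e.g. `<input type="hidden" name="js_token" value="">` plus a one-line script setting its value. Both the field name and the token value below are illustrative assumptions; the server-side check is then trivial:

```python
def javascript_ran(posted: dict, token_field: str = "js_token") -> bool:
    """The hidden field stays empty unless the page's script executed."""
    return posted.get(token_field) == "human"
```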
7. Show a verification page
It’s very difficult for bots to react to a server response. Do you have doubts about the validity of a post? Show an intermediate page asking the user to confirm the data and press the submit button again.
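A minimal sketch of such a confirmation round trip: the first submission is staged under a random token, the token is embedded in the confirmation page, and the data is only accepted when the same token comes back. The in-memory dictionary stands in for a session store or database:

```python
import secrets

_pending = {}  # token -> staged form data (use a real session store in practice)

def stage_submission(data: dict) -> str:
    """Park the submission and return a token for the confirmation page."""
    token = secrets.token_hex(16)
    _pending[token] = data
    return token

def confirm_submission(token: str):
    """Accept the staged data once; an unknown token returns None."""
    return _pending.pop(token, None)
```

Because `pop` removes the entry, a replayed confirmation with the same token fails, which also blunts simple replay attacks.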
8. Time the user response
Human users normally take a little time to complete a form, whereas bots do it almost instantaneously. A suspiciously fast submission can therefore be taken as a sign of a bot.
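A sketch of the timing check: record when the form was rendered (for example in the session or a signed hidden field) and compare it with the submission time. The three-second threshold is an illustrative assumption to tune for your form's length:

```python
import time

def submitted_too_fast(rendered_at: float, min_seconds: float = 3.0) -> bool:
    """True if the form came back faster than a human could plausibly fill it."""
    return time.time() - rendered_at < min_seconds
```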
9. Log everything
Keep a record of everything that happens during a form submission. This is not always completely practical, but it can help you detect hacking attempts later on.
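A sketch of structured logging for submissions; the record fields are assumptions, and note that only the field *names* are logged, not their values, to avoid storing personal data in plain logs:

```python
import json
import logging
import time

logging.basicConfig(level=logging.INFO)

def submission_record(remote_addr: str, user_agent: str, posted: dict) -> str:
    """Build a JSON line describing one submission attempt."""
    return json.dumps({
        "ts": time.time(),
        "ip": remote_addr,
        "ua": user_agent,
        "fields": sorted(posted),  # names only, not values
    })

def log_submission(remote_addr: str, user_agent: str, posted: dict) -> None:
    logging.info(submission_record(remote_addr, user_agent, posted))
```

One JSON object per line keeps the log easy to grep and to load into analysis tools when you investigate an attack after the fact.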
10. Make regex and other validations a whitelist
That is, only inputs known to be safe should be accepted; everything else is rejected, rather than trying to enumerate every dangerous input in a blacklist.
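A whitelist validation can be sketched as below; the pattern is an illustrative assumption for a hypothetical username field. The point is that the regex names the characters that are *allowed*, so anything outside that set fails by default:

```python
import re

# Whitelist: letters, digits, underscore, hyphen; 3-30 characters.
USERNAME_RE = re.compile(r"[A-Za-z0-9_-]{3,30}")

def valid_username(value: str) -> bool:
    """Accept only values made entirely of explicitly allowed characters."""
    return bool(USERNAME_RE.fullmatch(value))
```

A blacklist that tried to reject `<`, `'`, `;` and so on would always risk missing the next dangerous character; the whitelist cannot.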