This morning, as I was logging in to one of the many apps I need to navigate the digital realm, I was prompted to complete a CAPTCHA puzzle to prove I was not a robot. The task was to identify all of the storefronts in the image displayed. Amazingly, I passed on my first attempt! Nearly anyone who uses the web understands the frustration of passing these CAPTCHA tests. I get it: protecting sites from bots that scrape and spam is a necessity. But there has to be a better way.
Now, with Google's reCAPTCHA v3, there is a better way. Users won't be burdened with these frustrating puzzles. Site admins simply add reCAPTCHA v3 to the high-traffic pages of their app or site and then check the reCAPTCHA v3 admin console to see how bots are interacting with those pages. Bots are identified through adaptive risk analysis, and suspicious traffic can be tagged by the admin for further analysis against Google's cloud databases.
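To make this concrete, here is a minimal sketch of what the server side of that flow can look like. reCAPTCHA v3 doesn't return a pass/fail like the old puzzles; the verification response carries a `score` (1.0 looks human, 0.0 looks like a bot) along with `success` and `action` fields, and the admin decides what to do with it. The `decide` helper, the `0.5` threshold, and the `"login"` action name below are my own illustrative choices, not anything prescribed by Google:

```python
# Sketch of acting on a reCAPTCHA v3 verification response server-side.
# The response field names (success, score, action) follow the v3 API;
# the threshold and the decide() helper are hypothetical examples.

SCORE_THRESHOLD = 0.5  # illustrative cutoff; admins tune this per page


def decide(siteverify_response: dict, expected_action: str) -> str:
    """Classify a request as 'allow', 'challenge', or 'block'."""
    if not siteverify_response.get("success"):
        return "block"  # token was invalid or expired
    if siteverify_response.get("action") != expected_action:
        return "block"  # token was issued for a different page or action
    score = siteverify_response.get("score", 0.0)
    if score >= SCORE_THRESHOLD:
        return "allow"  # traffic looks human
    return "challenge"  # low score: fall back to extra verification


# A high-scoring (likely human) response vs. a low-scoring (likely bot) one:
print(decide({"success": True, "score": 0.9, "action": "login"}, "login"))
print(decide({"success": True, "score": 0.1, "action": "login"}, "login"))
```

The point of the score, rather than a hard yes/no, is exactly the granular control discussed below: the same site can allow, soft-challenge, or block at different thresholds on different pages.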
I like reCAPTCHA v3 because of the granular control it gives an admin to filter bots and unwanted traffic while making the user experience less stressful. The caveat is that admins will need to be more proactive if they choose this upgraded version of CAPTCHA, periodically monitoring the traffic analysis instead of relying on CAPTCHA alone to block all bots, which is now unrealistic. But that's a topic for another post.