W3BStation
BTC $96,420 +2.34% ETH $3,280 +1.82% SOL $185.40 -0.92% BNB $642.50 +0.45% XRP $2.18 +3.12% DOGE $0.082 -1.50% ADA $1.05 +0.80% AVAX $42.10 +1.15%
08/15/2024

AI Researchers Want to Solve the Bot Problem by Requiring an ID to Use the Internet

AI researchers are concerned that AI bots will gradually take over the internet and spread like a digital invasive species. Rather than tackling the problem by restricting the spread of bots and AI-generated content, one research group has decided to go in the opposite direction.

In a preprint paper published recently, dozens of researchers propose a system in which humans would need to verify their humanity in person — by another human — in order to obtain a "personhood credential."

The core idea appears to be creating a system in which a person can prove they are human without having to reveal their identity or any other information. If this sounds familiar to those in the crypto community, that's because the research draws on blockchain-based "proof of personhood" technologies.

Digital Verification

Services like Netflix or Xbox Game Pass that charge usage fees typically rely on users' financial institutions to handle verification. This doesn't allow for anonymity, but for most people that's an acceptable trade-off — generally regarded as just part of the cost of doing business.

Other services, such as anonymous forums, cannot rely on user payments as proof of humanity (or at least as proof that a customer, human or not, is in good standing) and must take other steps to restrict bots and fake accounts.

As of August 2024, for instance, ChatGPT's own safeguards would prevent it from being exploited to register large numbers of free Reddit accounts. Some AI can bypass human-style CAPTCHA checks, but it would still take significant effort to complete the steps involved in email address verification and the rest of the account setup process.

The group, which includes experts from companies such as OpenAI, Microsoft, and a16z Crypto, as well as academic institutions including the Harvard Society of Fellows, Oxford, and MIT, argues that current restrictions can only hold for so long.

Within a few years, humanity may have to reckon with the reality that without the ability to look someone in the eye, there may be no way to determine whether you're interacting with an individual at all.

Pseudonymity

The researchers are advocating for the development of a system in which certain organizations or institutions would be designated as credential issuers. These issuers would use humans to confirm an individual's humanity. Once verified, the issuer would certify that individual's credential. Crucially, issuers would not be permitted to track how those credentials are used. It remains unclear how the system could withstand cyberattacks and the looming threat of quantum-assisted decryption.
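The paper does not prescribe a specific cryptographic scheme, but one classic primitive with exactly this unlinkability property is the Chaum-style RSA blind signature. The sketch below, in plain Python with deliberately tiny toy keys, shows the idea: the issuer certifies a credential without ever seeing it, so later use of the credential cannot be traced back to the issuance session.

```python
# Sketch of an unlinkable credential via RSA blind signatures.
# Toy key sizes for illustration only -- real deployments would use
# standardized parameters and >= 2048-bit keys.
import hashlib
import secrets
from math import gcd

# Issuer's toy RSA keypair (p and q are known small primes).
p, q = 999983, 1000003
n = p * q
e = 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)  # private signing exponent

def h(msg: bytes) -> int:
    """Hash the credential message into Z_n."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# 1. User: blind the credential before sending it to the issuer.
credential = b"personhood-token-for-alice"
m = h(credential)
while True:
    r = secrets.randbelow(n - 2) + 2  # blinding factor, must be invertible mod n
    if gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# 2. Issuer: verifies the person face-to-face, then signs the *blinded*
#    value -- it never sees the underlying credential.
blind_sig = pow(blinded, d, n)

# 3. User: unblind to obtain a valid signature on the original credential.
sig = (blind_sig * pow(r, -1, n)) % n

# 4. Any service can verify the signature with the issuer's public key (e, n),
#    but the issuer cannot link this credential to any issuance session.
assert pow(sig, e, n) == m
print("credential verified; issuer cannot link it to the issuance session")
```

Unblinding works because (m · r^e)^d = m^d · r (mod n), so multiplying by r⁻¹ leaves a valid signature m^d on the original hash, which the issuer never observed.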

On the other side, organizations interested in serving verified users could choose to grant accounts only to those who hold a credential. This could limit each person to a single account and make it impossible for bots to access these services.
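A service-side gate along these lines could be very simple. In the hypothetical sketch below, `sig_valid` stands in for a real signature check against an issuer's public key, and the names are illustrative rather than taken from the paper; the point is that a valid credential is required and each one maps to at most one account.

```python
# Hypothetical service-side gate: at most one account per verified credential.
import hashlib

_accounts = {}  # credential fingerprint -> account id

def register(credential, sig_valid):
    """Create an account only for a valid, previously unused credential."""
    if not sig_valid:
        return None  # no valid credential, no account: bots are turned away
    fingerprint = hashlib.sha256(credential).hexdigest()
    if fingerprint in _accounts:
        return None  # one person, one account
    account_id = "user-%d" % (len(_accounts) + 1)
    _accounts[fingerprint] = account_id
    return account_id

print(register(b"alice-credential", True))   # a fresh account id
print(register(b"alice-credential", True))   # None: credential already used
print(register(b"unverified", False))        # None: no valid credential
```

Storing only a hash of the credential keeps the service from learning anything beyond "this credential was used here before," which is the minimum needed to enforce the one-account limit.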

According to the paper, the research does not set out to determine which centralized pseudonymity method is most effective, nor does it resolve the many potential problems such a scheme could encounter. Nevertheless, the research group acknowledges these challenges and has called for further action and study.