Decentralized chat app OpenChat, built on the Internet Computer blockchain, wants to foster more virtuous discourse on social media and believes the way forward may be requiring users to prove both that they are human and that they hold only one account.
To do so, the on-chain messaging app, likened to Discord and Slack, plans to test facial recognition.
“Proof of humanity is one thing, and that’s relatively easy to do. What’s harder is proof of unique humanity. I can solve all sorts of proof of humanity tests but I could do it a hundred times over and get a hundred accounts,” OpenChat co-founder Matt Grogan told The Block. “This is massive for this space, proof of unique humanity,” he added, noting that preventing people from running multiple online accounts could curb the extent to which some users exploit token farming or airdrops through duplicate accounts.
OpenChat, which has more than 100,000 users, recently partnered with Modclub, a platform focused primarily on decentralized content moderation. Modclub, which also runs on the Internet Computer, will additionally test facial recognition for OpenChat as part of the push to implement a “proof of unique humanity” system, Grogan said.
“They’ve got facial recognition … we’re going to trial it and see how it runs,” he said. “It’s not going to be 100% perfect, but it should limit how easy it is to have multiple identities.”
Grogan cautioned, however, that the details of the trial have not been finalized and that OpenChat will not require all users to verify they are human and hold only one account. Going forward, he said, proving unique humanity through facial recognition might factor into determining who is eligible for future airdrops. Users might even use it to bolster their own reputation on the platform, he added.
Up until now, unlike traditional social media platforms, which rely on email addresses and unique usernames, OpenChat has used crypto addresses and NFTs for authentication and monetization.
Avoiding toxicity on Facebook and X
The anonymous use of multiple accounts and bots has long been considered an issue that not only contributes to the amount of illicit behavior conducted online, but also underpins the toxic discourse prevalent on traditional social networking platforms like Facebook and X (formerly Twitter).
Modclub is hoping to help prevent similar behavior on OpenChat. Earlier this week, the platform announced its partnership with OpenChat. “Users on the OpenChat platform will have the ability to report content that violates the rules established by the OpenChat DAO. These reports will be sent to Modclub’s pool of moderators for careful review,” the platform said in its post. “Decisions will be made, and content will either be removed or allowed to remain based on the results.”
In the past, it has generally fallen to the leaders of specific groups and communities within OpenChat to moderate discourse, according to Grogan. Those moderation tactics are meant to be guided, at least in part, by OpenChat’s high-level platform rules, he said.
With the new partnership, however, OpenChat will be “offloading” that responsibility to Modclub, which incentivizes its moderators by paying them in crypto.
Besides hoping to foster a reputation system that incentivizes virtuous discourse, OpenChat also currently rewards users with tokens in an effort to spur growth and engagement.