Chatsight Pivots Its Content Moderation A.I. to Battling Discord Scammers

In brief

  • Scams continue to plague DAOs and NFT collections, exploiting human and platform weaknesses.
  • Former content moderation service Chatsight is now applying AI to Discord servers.

While the crypto industry is focused on building the decentralized Web3 future, centralized Web2 platforms like Discord, Twitter, and Telegram are where the community lives today. As DAOs and NFT collectives continue to use these platforms, fraudsters are flooding in to scam and steal. The Federal Trade Commission recently reported that over $1 billion in crypto had been lost to scams since 2021.

To help combat these attacks, a new San Francisco-based startup called Chatsight is making safety in Discord servers its main business, joining a growing list of services aimed at protecting Discord communities.

Founded in 2021 by Marcus Naughton, Chatsight calls itself a “safety as a service” company designed to provide an added layer of security to social media platforms like Discord and Telegram. These platforms have become central to Web3 projects looking to organize and build their communities.

“We’re providing agnostic technology,” Naughton tells Decrypt. “We build the anti-scam A.I. (artificial intelligence) tech and bridge it out to platforms like Discord, Telegram, and others as they come along with the eventual goal of providing safety tools for on-chain networks.”

Discord is a popular place for DAOs (decentralized autonomous organizations) to organize and collaborate. DAOs are loosely organized communities that come together to build or support crypto projects and often finance their activities with tokens.

Already wary of scammers, DAOs use third-party projects like Collab.Land to act as gatekeepers to their Discord servers, verifying that members hold the DAO’s token before granting them access. But while token gatekeepers can manage memberships, security remains an issue.
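As a rough sketch of how that kind of token gating works under the hood, a gatekeeper bot checks a verified wallet’s on-chain balance of the DAO’s token before assigning a member role. The RPC endpoint and contract address below are placeholders, and this illustrates the general pattern rather than Collab.Land’s actual implementation:

```python
# Illustrative token-gating check in the spirit of Collab.Land-style gatekeepers.
# The RPC endpoint and contract address are placeholders, not real project values.
from web3 import Web3

# Minimal ERC-721 ABI fragment: only balanceOf is needed for a holdings check.
ERC721_BALANCE_ABI = [{
    "inputs": [{"name": "owner", "type": "address"}],
    "name": "balanceOf",
    "outputs": [{"name": "balance", "type": "uint256"}],
    "stateMutability": "view",
    "type": "function",
}]

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC endpoint
dao_token = w3.eth.contract(
    address=Web3.to_checksum_address("0x0000000000000000000000000000000000000000"),  # placeholder
    abi=ERC721_BALANCE_ABI,
)

def holds_dao_token(wallet_address: str) -> bool:
    """Return True if the verified wallet holds at least one of the DAO's tokens."""
    balance = dao_token.functions.balanceOf(
        Web3.to_checksum_address(wallet_address)
    ).call()
    return balance > 0
```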

In May, security firm PeckShield posted an alert to Twitter saying that scammers had exploited NFT marketplace OpenSea’s Discord server to promote a scam NFT mint.

Earlier this month, the popular NFT collective Bored Ape Yacht Club’s Discord server was compromised, allowing scammers to make off with NFTs worth 200 ETH ($358,962 at the time).

Following the exploit, a Bored Ape Yacht Club co-founder lashed out at Discord on June 4, saying the popular communications app “isn’t working for Web3 communities.”

While Chatsight is meant for deployment on social media platforms, Naughton explains, its focus is on scams and phishing attacks rather than content moderation. “[T]he one thing everyone can agree upon is [that] scams are bad,” he adds.

Chatsight started as an A.I. content moderation platform for social networks, Naughton explains, but pivoted after he spoke with the owner of a crypto Telegram group who was paying around $5,000 to have human moderators monitor the channel.

“If these people are paying humans to do this, that shows that there’s a need that these platforms aren’t addressing,” Naughton says. “When you build your communities on these platforms, you’re expressly signing up to the fact that you are now taking security back into your own hands.”

Naughton says Chatsight aims to act as a managed security partner, “a quasi antivirus,” giving users a suite of tools for monitoring their Discord servers.

According to Naughton, Chatsight uses an “air-gapped” Discord account, one not used anywhere else. Once added to a Discord server, this account is given admin rights so it can monitor the server for scams and phishing attacks, keeping the server owner’s own account separate while still giving the owner control of the Chatsight bot.
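In broad strokes, a bot running under such a dedicated account watches incoming messages and acts on anything that looks like a scam. The sketch below, written against the discord.py library, simply matches messages against a hypothetical blocklist of phishing domains; Chatsight’s actual detection is AI-based and not public, and the blocklist and token here are placeholders:

```python
# Illustrative sketch of a monitoring bot run under a dedicated Discord account.
# Chatsight's real detection is AI-based and not public; this version just matches
# a hypothetical blocklist of phishing domains.
import discord

PHISHING_DOMAINS = {"free-mint.example", "wallet-verify.example"}  # hypothetical blocklist

intents = discord.Intents.default()
intents.message_content = True  # needed to read message text
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    if message.author.bot:
        return  # ignore other bots, including ourselves
    if any(domain in message.content.lower() for domain in PHISHING_DOMAINS):
        await message.delete()  # remove the suspected phishing link
        await message.channel.send(
            f"{message.author.mention}, that link was flagged as a possible scam."
        )

client.run("BOT_TOKEN")  # placeholder token for the dedicated, admin-level bot account
```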

Naughton says the freemium product includes extra security features: Enterprise Cloudflare protection, Discord account verification, reputation checks on accounts across Discord, and punishments ranging from a 30-minute time-out to bans for accounts that are repeatedly flagged.
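That escalation from time-outs to bans can be expressed as a simple strike counter. The snippet below, again using discord.py, is only an illustration; the three-strike threshold is an assumption, not a documented Chatsight setting:

```python
# Hypothetical escalation logic: first offenses earn a 30-minute time-out,
# repeat offenders are banned. The three-strike threshold is an assumption.
from collections import defaultdict
from datetime import timedelta

import discord

strikes: dict[int, int] = defaultdict(int)  # user ID -> number of times flagged
BAN_THRESHOLD = 3  # assumed cutoff, not a documented Chatsight setting

async def punish(member: discord.Member) -> None:
    """Apply an escalating punishment to a member flagged for scam content."""
    strikes[member.id] += 1
    if strikes[member.id] >= BAN_THRESHOLD:
        await member.ban(reason="Repeatedly flagged for scam or phishing content")
    else:
        await member.timeout(timedelta(minutes=30), reason="Flagged for scam content")
```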

For Naughton, the flaw in the current version of the internet is that users hand over the assets they own (plans, designs, missions, etc.) to third parties like Discord, Twitter, and Telegram to host and, hopefully, secure. Yet users have no say in how that security is handled.

“We expect you to be compromised because of the nature of Discord’s product—exploits happen to everyone,” Naughton says. “So we assume from the default position that you’re going to get exploited, and how can we prevent the damage that is caused from there?”

Source: https://decrypt.co/103720/new-a-i-startup-targets-discord-phishing-scammers