Clubhouse’s security and privacy lag behind its explosive growth

Clubhouse did not respond to a request from WIRED for comment by press time on its recent security stumbles. In a statement to researchers at the Stanford Internet Observatory, Clubhouse detailed specific changes it planned to make to strengthen its security, including cutting off pings to servers in China and strengthening its encryption. The company also said it would work with a third-party data security firm to help carry out the changes. In response to the unauthorized website that was rebroadcasting Clubhouse discussions, the company told media outlets it had permanently banned the user behind it and would add additional “safeguards” to prevent the situation from happening again.

While Clubhouse appears to take researchers’ feedback seriously, the company has not been specific about all of the security improvements it has implemented or plans to add. And given that the app does not appear to offer end-to-end encryption to its users, researchers say there is still a sense that Clubhouse hasn’t given adequate thought to its security posture. That’s before you even get to some of the core privacy questions the app raises.

When you start a new Clubhouse room, you can choose from three settings: an “open” room can be accessed by any user of the platform, a “social” room admits only the people you follow, and a “closed” room restricts access to invited guests. Each comes with its own implied level of privacy, which Clubhouse could make more explicit.

“I think for public rooms, Clubhouse should give users the expectation that public means public to all users, since anyone can join and record, take notes, etc.,” says David Thiel, chief technology officer of the Stanford Internet Observatory. “For private rooms, they can convey that, as with any communication mechanism, an authorized member can record content and identities, so be sure to both set expectations and trust the participants.”

Like any major social network, Clubhouse has also struggled to cope with abuse on the platform. The app’s terms of service have prohibited hate speech, racism, and harassment since November, and the platform offers some moderation features, such as the ability to block users or flag a room as potentially abusive. But one of Clubhouse’s defining features is also a problem for anti-abuse efforts: people can use the platform without their contributions being automatically saved the way written messages are. This can embolden some users to make abusive or derogatory remarks in the belief that they won’t be recorded and won’t face consequences.

Stanford’s Thiel says Clubhouse currently stores recordings of conversations temporarily so it can review them if abuse complaints come in. If the company were to implement end-to-end encryption for security, though, it would have an even harder time staying on top of abuse, because it couldn’t make those recordings so easily. Every social media platform faces a version of this tension, but security experts agree that, where applicable, the benefits of adding end-to-end encryption are worth the added challenge of developing more nuanced and creative anti-abuse solutions.

Even end-to-end encryption doesn’t eliminate the additional possibility that a Clubhouse user could externally record the conversation they’re in. That’s not something Clubhouse can easily fix. But it can at least set expectations accordingly, no matter how friendly and informal a conversation feels.

“Clubhouse should just be clear about what it’s going to do for your privacy,” Potter says, “so you can decide what you’re going to talk about accordingly.”
