Bluesky’s Approach to Content Moderation

DATE POSTED: September 24, 2024

:::info Authors:

(1) Martin Kleppmann, University of Cambridge, Cambridge, UK ([email protected]);

(2) Paul Frazee, Bluesky Social PBC, United States;

(3) Jake Gold, Bluesky Social PBC, United States;

(4) Jay Graber, Bluesky Social PBC, United States;

(5) Daniel Holmgren, Bluesky Social PBC, United States;

(6) Devin Ivy, Bluesky Social PBC, United States;

(7) Jeromy Johnson, Bluesky Social PBC, United States;

(8) Bryan Newbold, Bluesky Social PBC, United States;

(9) Jaz Volpert, Bluesky Social PBC, United States.

:::

Table of Links

Abstract and 1 Introduction

2 The Bluesky Social App

2.1 Moderation Features

2.2 User Handles

2.3 Custom Feeds and Algorithmic Choice

3 The AT Protocol Architecture

3.1 User Data Repositories

3.2 Personal Data Servers (PDS)

3.3 Indexing Infrastructure

3.4 Labelers and Feed Generators

3.5 User Identity

4 Related Work

5 Conclusions, Acknowledgments, and References

2.1 Moderation Features

Bluesky currently has the following moderation mechanisms (additional mechanisms are under discussion [14]):

- **Content filtering:** Automated systems label potentially problematic content (such as images of a sexual or violent nature, posts promoting hate groups, or spam), and the app’s preferences allow users to choose whether to show or hide content in each of these categories in their feeds (a label-filtering sketch follows this list).

- **Mute:** A user can mute specific accounts or threads, which hides the muted content from their own feeds and notifications. The content continues to be visible to other users, and the target does not know that they were muted. A user can also publish a mutelist of accounts, and other users can subscribe to that list, which has the same effect as if they individually muted all of the accounts on the list.

- **Block:** One user can block another, which prevents all future interactions (such as mentions, replies, or reposts) between those accounts, in addition to muting. Similarly to mutelists, a user can also publish a list of accounts, and other users can block all accounts on that list by subscribing to it (see the record sketches after this list).

- **Interaction gating:** A user who makes a post can restrict who is allowed to reply to it (anyone, anyone they follow, anyone mentioned in the post, and/or anyone on a particular list of accounts) [11]; a threadgate sketch follows this list.

- **Takedown:** Users can report content that violates the terms of service to server operators, and the operators can take down violating media, posts, or accounts.

- **Custom feeds:** While the aforementioned mechanisms provide negative moderation (helping users avoid content they do not want to see), feed generators (see Section 2.3) can actively select high-quality content.
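
To make the label-filtering mechanism concrete, here is a minimal sketch of how a client could combine a post’s labels with the viewer’s per-category preferences. The type names, label categories, and the `moderatePost` function are simplifying assumptions for illustration; the real Bluesky label and preference lexicons are considerably richer.

```typescript
// Illustrative, simplified shapes -- not the actual Bluesky lexicons.
type LabelCategory = "sexual" | "violence" | "hate" | "spam";
type LabelPreference = "show" | "warn" | "hide";
type Verdict = "show" | "warn" | "hide";

interface Post {
  uri: string;
  labels: LabelCategory[]; // labels applied by automated systems or labelers
}

// Per-viewer choices made in the app's moderation settings.
type ModerationPrefs = Record<LabelCategory, LabelPreference>;

// The most restrictive preference matching any of the post's labels
// decides the outcome: "hide" beats "warn" beats "show".
function moderatePost(post: Post, prefs: ModerationPrefs): Verdict {
  let verdict: Verdict = "show";
  for (const label of post.labels) {
    const pref = prefs[label];
    if (pref === "hide") return "hide";
    if (pref === "warn") verdict = "warn";
  }
  return verdict;
}

// Example: a viewer who warns on sexual content but hides the rest.
const prefs: ModerationPrefs = {
  sexual: "warn",
  violence: "hide",
  hate: "hide",
  spam: "hide",
};
const post: Post = { uri: "at://example/post/1", labels: ["sexual"] };
console.log(moderatePost(post, prefs)); // -> "warn"
```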
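
Blocks and mutes also differ in representation: in the AT Protocol as deployed, a block is a public record in the blocking user’s repository, whereas a mute is private state held on the user’s own server, which is why a muted account cannot observe the mute. The record shapes below follow the public `app.bsky.graph.*` lexicons as a sketch; the DIDs and list URIs are hypothetical, and these details come from the public lexicons rather than from the paper itself.

```typescript
// A block is a public record in the blocking user's repository.
const blockRecord = {
  $type: "app.bsky.graph.block",
  subject: "did:plc:target-user", // hypothetical DID of the blocked account
  createdAt: new Date().toISOString(),
};

// Subscribing to a moderation list, i.e. blocking every account on it.
const listBlockRecord = {
  $type: "app.bsky.graph.listblock",
  subject: "at://did:plc:list-owner/app.bsky.graph.list/3abc", // hypothetical list URI
  createdAt: new Date().toISOString(),
};
```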
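
Interaction gating is likewise expressed as a record: a “threadgate” stored alongside the post enumerates the rules a would-be replier must satisfy. The sketch below follows the `app.bsky.feed.threadgate` lexicon; the URIs are hypothetical, and the exact field set is an assumption based on the public lexicon, not something the paper specifies.

```typescript
// A threadgate record restricting who may reply to a post.
// An empty `allow` array would permit no replies at all.
const threadgateRecord = {
  $type: "app.bsky.feed.threadgate",
  post: "at://did:plc:author/app.bsky.feed.post/3xyz", // hypothetical URI of the gated post
  allow: [
    { $type: "app.bsky.feed.threadgate#mentionRule" },   // anyone mentioned in the post
    { $type: "app.bsky.feed.threadgate#followingRule" }, // anyone the author follows
    {
      $type: "app.bsky.feed.threadgate#listRule",        // anyone on this list
      list: "at://did:plc:author/app.bsky.graph.list/3abc", // hypothetical list URI
    },
  ],
  createdAt: new Date().toISOString(),
};
```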


:::info This paper is available on arxiv under CC BY 4.0 DEED license.

:::
