Coinbase’s Philosophy on Account Removal and Content Moderation

Feb 4 · 7 min read

By Brian Armstrong

In the last few years, it’s become increasingly common for tech companies to censor customers or close their accounts for a range of reasons (e.g., misinformation). Luckily, as a crypto business we don’t face this issue as frequently as a social network does, but we still need to set clear policies around acceptable use of our products. As our product suite grows, it will even include products that host user-generated content, like NFTs.

Our high level philosophy is that, in a democratic society, the people and their elected officials should decide what behavior is allowed and not allowed by setting laws. We think it sets a dangerous precedent when tech companies, such as Coinbase, or their executives start making judgment calls on difficult societal issues, acting as judge and jury. This approach sounds simple in theory, but in practice it is anything but.

First, it can be very complex to determine whether an activity is legal or illegal. Laws vary greatly across different countries, states, and regions. Some activities are legal only if you have a license. Some activity is in a gray area. Some unjust laws go unenforced. Like most companies, we refer suspected illegal activity to the relevant authorities, but we can’t expect to receive a timely response or opinion back from them given the many demands on their resources. This puts us, along with most companies, in the unfortunate position of having to make our own determinations about what activity is legal or illegal.

Second, even if some activity is legal, it may be something that is deeply troubling to have on the platform. The world is littered with polarizing, uncomfortable, or obscene content that may still be legal. This is where companies start to exercise even more judgment on what they allow. But there is great danger of falling down a slippery slope, having to render decisions on every difficult societal issue, where you are sure to upset someone no matter where you land. Without some strong principle-based approach, these decisions become arbitrary and capricious, opening the company to attack.

Finally, every company works with other companies that have their own set of moderation and deplatforming policies. For instance, for any app to be listed in the Apple and Google App Stores, it needs to play by the rules of those two companies. In the financial services world, we also work with banks and payment processors who have their own acceptable use policies. Very few companies are completely vertically integrated, with the luxury of making their own decisions in a vacuum.

So how should a company implement a reasonable approach based on the above constraints? We’ve come up with our own answer, and I want to share it here so our customers can understand it, and in case it helps other companies.

First, it’s important to differentiate our approach based on the type of product. Coinbase has a broad product suite, but for moderation purposes we group our products as either infrastructure products or public-facing products. Infrastructure products enable access to basic financial services and are typically used privately by a single customer, while public-facing products often host user-generated content and have social features visible to large numbers of users. Ben Thompson’s article on moderation in infrastructure illustrates how companies typically take a different approach for each of these products.

For our infrastructure products, we use rule of law as the foundation of our approach, because we believe that governments, not companies, should be deciding what is allowed in society. We also believe that everyone deserves access to financial services, and a test of legality should be sufficient for these products.

For our public-facing products, we again start with rule of law as the foundation. But assuming something is legal in a certain jurisdiction, we also go beyond this and moderate content that is not protected speech under the First Amendment. We’re not legally held to the First Amendment as a company, and the First Amendment is a U.S.-focused concept only, but we’ve chosen to use it as the guiding principle of our content moderation approach because it is in line with our values and helps ensure we don’t fall down a slippery slope over time. The First Amendment has more than two centuries of case law built up, and provides a reasonable framework for moderating content such as incitement, fighting words, libel, fraud, defamation, etc. David Sacks does a great job describing this approach in this blog post.

Finally, there are cases where we want to work with external partners, such as the App Stores, and need to follow their moderation policies to do so. Sometimes third-party payment providers have their own policies. For payment providers, we can simply disable functionality related to that partner if there is a problem with a specific user, while continuing to offer Coinbase services. But getting kicked out of the app stores wouldn’t help anyone. So when working with partners, our approach is to be free speech supporters, but not free speech martyrs, and to make accommodations if it is essential for us to function as a business.

This is obviously a complex issue, and hopefully the above approach starts to show a path through it that doesn’t devolve into arbitrary and capricious decision making. To boil down the above approach, we ask the following questions for our public-facing products:

1. Is the content illegal in a jurisdiction in which we operate?

A. If yes, then remove in that specific jurisdiction

2. Is the content a free speech exception under the First Amendment?

A. If yes, then remove globally

3. Has a critical partner required us to remove the content?

A. If yes, then remove the content or disable the functionality of that partner for the affected user

If the answer to any of these three questions is “yes,” we will take some moderation action, such as taking down content or, in severe cases, terminating the account.
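For readers who prefer to see the decision flow as code, here is a minimal, purely illustrative sketch in Python. The type names, flags, and function below are assumptions made for this example and do not describe any actual Coinbase system; the hard part in practice is deciding how the flags get set, which is the judgment the rest of this post is about.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Action(Enum):
    """Possible moderation outcomes for public-facing products (illustrative only)."""
    NO_ACTION = auto()
    REMOVE_IN_JURISDICTION = auto()   # content hidden only where it is illegal
    REMOVE_GLOBALLY = auto()          # content falls under a First Amendment exception
    DISABLE_PARTNER_FEATURE = auto()  # e.g. turn off a payment provider for that user


@dataclass
class Content:
    # Hypothetical flags a reviewer or upstream classifier would set.
    illegal_in: set[str] = field(default_factory=set)  # jurisdictions where it is illegal
    first_amendment_exception: bool = False            # incitement, fraud, defamation, ...
    partner_removal_required: bool = False             # a critical partner demands removal


def moderate(content: Content) -> list[Action]:
    """Apply the three questions from the post, in order."""
    actions = []

    # 1. Is the content illegal in a jurisdiction in which we operate?
    if content.illegal_in:
        actions.append(Action.REMOVE_IN_JURISDICTION)

    # 2. Is the content a free speech exception under the First Amendment?
    if content.first_amendment_exception:
        actions.append(Action.REMOVE_GLOBALLY)

    # 3. Has a critical partner required us to remove the content?
    if content.partner_removal_required:
        actions.append(Action.DISABLE_PARTNER_FEATURE)

    return actions or [Action.NO_ACTION]
```

For example, content flagged as illegal only in one jurisdiction would yield `REMOVE_IN_JURISDICTION` and remain visible elsewhere, while content that is a First Amendment exception would be removed everywhere regardless of jurisdiction.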

Most of this post has been about how we can create a reasonable moderation policy that doesn’t get co-opted over time, succumb to pressure, or descend into us playing judge and jury. This is important so that Coinbase is able to stand up to pressure. Of course, the decentralized nature of cryptocurrency offers its own important protections here, and those protections get stronger the more our products decentralize.

If our policy above fails, and Coinbase starts making bad judgment calls or turns evil, customers can withdraw their crypto to any other competing exchange, wallet, or custodian. Compare this to social networks today, where you can’t take your followers with you. Your data is owned by one company, in a proprietary format. The open nature of crypto protocols provides lower switching costs, which is an important customer protection, even for relatively centralized crypto products. But decentralized, or self-custodial, crypto products have an even greater protection because the company is simply providing access to something running on-chain. For instance, no one can deplatform your ENS name without taking every ENS name offline. Decentralization moves you from the slippery slope to the crypto cliff, where the would-be censor must compromise an entire blockchain to censor just one person.

Decentralization is a spectrum, and Coinbase is moving farther down this path over time, embracing self-custody with Coinbase Wallet, stepping up user education around private keys, and by investing in Bitcoin core development and web3 protocols. The more decentralization we can support, the better protection customers will have.

We believe everyone deserves access to financial services, and that companies should put appropriate controls in place to prevent censorship or unjust account closures from taking place. For centralized financial infrastructure products, we believe rule of law is a sufficient standard for moderation, while for decentralized products even greater protections can be provided by the blockchain. We also acknowledge that public-facing products deserve some additional consideration, and that the First Amendment can be used as a reasonable test or boundary. We believe this approach is consistent with our mission of creating more economic freedom in the world and with the ethos of crypto.

Companies are in a difficult position when they choose to censor or terminate a customer account. What often seems like an easy decision, especially under public pressure, turns out to have larger unintended consequences and sets a dangerous precedent for the role of private companies in society. I’m sure we won’t get it perfect with our policy above, but my hope is that we’ve laid out some principles we can fall back on when difficult decisions arise, and that investors, customers, and employees can have a better understanding of our process.
