This transcript was created using speech recognition software. While it has been reviewed by human transcribers, it may contain errors. Please review the episode audio before quoting from this transcript and email [email protected] with any questions.
So you’ve been in Hawaii now for what, a couple of days?
Yeah. I got here on Sunday. I did see something in this hotel for the first time in my life. And I almost fell over when I saw it. Are you ready for what is actually in my hotel room?
Yes.
A USB-C charging port.
Wait, which one is USB-C?
This is the sort of modern connector that you’re going to use to charge your MacBook, that you’re going to use to charge —
This is the new one.
Yes, you charge your iPad with this if you have an iPad. And I don’t know about you, but remember when iPods came out and you would charge them with those little 30-pin connectors, and eventually hotels would buy, like, alarm clocks that would have this 30-pin connector in it. And it was really nice that they did that. The problem was that they never updated the 30-pin connectors in the hotel room.
So I’ve spent like the past 10 years just going to these hotels that are so well suited to charge an iPod and nothing else. And today, for the first time in my life, there’s actually a modern port in a hotel room. And that’s just the spirit of progress in this country. That’s what makes it great.
I’m so happy for you. And I also just want to just shout out the incredible dedication to tech blogging that you are bringing us updates on the ports in your hotel room. Truly a posting legend.
Some people in Hawaii visit a port where there is a boat, maybe a place to get a mai tai. But me, I’m charging my iPad.
All right. Well, please keep us updated on the Wi-Fi connectivity, the TV selection, anything else you might need to know about the gadgets at your hotel. I really appreciate it.
Absolutely will.
[MUSIC PLAYING]
I’m Kevin Roose. I’m a tech columnist at “The New York Times.”
I’m Casey Newton from Platformer. And you’re listening to “Hard Fork.” This week on the show, Reddit goes dark and maybe takes the open internet with it. Then, a deep dive into MrBeast and what his YouTube empire says about the future of the internet. And finally, how platforms are already screwing up the 2024 election.
[MUSIC PLAYING]
So Casey, I am — I’m having a crisis right now. So I am also getting ready for a trip. I’m taking a very long flight with a very small child. And normally what I do in situations where I am looking for some parenting advice is I go onto Reddit, and specifically to this subreddit called r/daddit.
So last night, I’m packing for this trip, getting ready. I am feeling nervous about this flight, so I go onto r/daddit to get advice for this trip. And I see a message that says r/daddit is a private community. It is currently closed in protest of Reddit policies squashing third-party apps.
So Casey, you had a couple of newsletters recently about the situation at Reddit. So what the hell is going on at Reddit? And why can’t I get my parenting hacks?
Well, I’m so sorry to hear that you’ll be parenting unassisted. I wonder if you’ve thought about asking your wife for help, rather than random strangers on the internet?
Yeah. But what is happening is that Reddit has made its users mad, maybe the maddest that they have ever been, over, frankly, a bunch of things that have all kind of come together in a giant explosion.
Right. So this is a giant blackout, basically. Moderators have shut down thousands of subreddits on Reddit in protest of the site’s changes to its API pricing structure, so many of them that the site actually crashed for a brief period. So can you just catch us up on what’s happened so far?
Yeah. So the spark of all this was in April, when Reddit announced that it was going to begin charging for its API. An API lets software talk to other software. And for a long time, Reddit’s API has basically been free, which meant that third-party developers could come along and they could build their own Reddit clients and show you Reddit the way they wanted you to see it.
Or they would make tools that the volunteer moderators on Reddit could use to moderate their own communities. Or they would do research, or they would build accessibility features, so maybe if you’re blind, this would make Reddit easier to use. So that was what this API enabled.
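[Editor’s note: For a concrete sense of what that free access looked like, here is a minimal, hypothetical sketch in Python. It uses Reddit’s public read-only JSON listings, which you get by appending .json to a subreddit URL; the subreddit, parameters, and user-agent string are illustrative, not a real third-party client.]

```python
# Minimal sketch: reading a subreddit the way a simple third-party
# client might, via Reddit's public read-only JSON listings.
# The subreddit, limit, and user agent below are illustrative.
import requests

resp = requests.get(
    "https://www.reddit.com/r/daddit/top.json",
    params={"limit": 5, "t": "week"},           # top 5 posts of the past week
    headers={"User-Agent": "demo-reader/0.1"},  # Reddit expects a descriptive UA
    timeout=10,
)
resp.raise_for_status()

# Listings come back as {"data": {"children": [{"data": {...post...}}]}}
for child in resp.json()["data"]["children"]:
    post = child["data"]
    print(post["score"], post["title"])
```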
But then in April, they announced they’re going to start charging. And if you make a big Reddit app, like there’s one called Apollo, developers are saying that, at least in the case of Apollo, it was suddenly going to cost $20 million a year to run an app that previously had been free. So basically at that price, no one could afford to develop a third party app anymore. And that got everyone really mad.
Right. And I should say like API access for developers is kind of one of these things that just gets fought about a lot on the internet. But why did Reddit make this change now? Why did it decide to start charging for access to its API?
Well, so I’ll tell you what Reddit said about it. Steve Huffman, the C.E.O., said, look, we need to make money. Reddit is not a profitable company. And Reddit wants to make an initial public offering of its stock later this year. And so for that reason, they needed to find new revenue sources.
I mean, I remember when Twitter was going public a decade ago. It also restricted access to its API and made life harder for the sort of third-party developers who were building their own kind of Twitter apps. So in some sense, that piece of it feels predictable: you’re a company, you’re not making money, you want to go public. And one way that you can make money is by charging people for data access and not letting developers piggyback on your platform to make their own apps.
Yes. Although I think, in retrospect, Twitter did this in such a better way. And I’m shocked that I just said that, because at the time I thought it was like the worst thing that had ever happened to the site. But the way that Twitter did it was they said, OK, we’re going to allow these third-party apps to continue to exist, but we’re going to cap the number of users that they can have at some arbitrarily low limit.
But they allowed the apps to continue operating. So what that meant in practice was that developers stopped making third party apps because it was a clearly terrible business. But all of the people who’d come to rely on these third party apps were able to continue using them. And so you never saw the revolt at Twitter that you end up seeing at Reddit.
Right. OK.
So. Yes, this is on some level, a very basic financial issue. Reddit needs to make money. Charging for access to its API is one way that it can make more money. But my understanding is there’s also something else going on here. So what is that other factor?
Well, you know, Kevin, it wouldn’t be a “Hard Fork” episode without a little bit of AI. And AI is a huge character in this story. So Reddit is this incredible library, essentially, right? It’s not just the dads who are using it to raise their children.
People across all walks of life are asking questions. They’re getting answers. And they’re having conversations. And so if you are OpenAI and you’re building ChatGPT, or you’re Google and you’re building Bard, Reddit is a goldmine, because there is a free API that you can use to just ingest essentially the entire corpus of the website.
And so one of the things that Steve Huffman, the C.E.O. of Reddit, has said was, hey, we are not going to take this anymore. We have something that is very valuable. And these companies are going to have to pay for it. So Reddit very much positioned this as a sort of defense against the rise of these large language models.
If I can just go to ChatGPT and say, hey, what are some tips for taking my infant on a long plane trip? And I get that answer from ChatGPT and ChatGPT essentially just sourced it from Reddit without crediting it, like that begins to destroy Reddit. But this argument wound up completely infuriating a lot of the people who use Reddit.
Yeah, let’s talk about that. Have you talked to Reddit users? Do you know like why they’re feeling this way?
Yes. So I’ve heard from a bunch of them this week as I’ve started to explore this subject. And what I’ve learned is that Redditors, like a lot of other people, have deep misgivings about the way that their data is being used to train these large language models.
It’s one thing, I think, to participate in a forum where everybody is just kind of trading tips and everybody’s getting something of value. But it’s another to package up everything that everyone has ever said and start to sell that as enterprise software. And that’s really what Reddit is talking about doing here, right?
Reddit came along and said, oh, we need to protect Reddit from the rise of these large language models. But the way that it’s doing that is by packaging up all of the posts of its users, whom it is definitely not going to pay for any of this, and selling them to some of the biggest companies in the world.
And so that, I think, was one of the key mistakes here was assuming that Redditors were going to be thrilled that Reddit was standing up to the tech giants, when in fact, Reddit just wants to turn them into customers.
I mean, I guess, my question about this is couldn’t Reddit have set up a two tiered pricing system, where it’s like if you are a developer who’s making a Reddit app for blind people, say, or if you’re making tools for moderators, like you can still continue to get the API and the data for free?
But if you are an AI company trying to build a large language model by scraping Reddit data and feeding it into your training, like then you have to pay this exorbitant rate. Could they have done that? Or is that just too complicated?
Well, they can. And in fact, in some ways, they have. So in April, when they first announced this, there were not a ton of details about how they were going to price it and what kinds of apps were going to be affected. As the weeks have gone on and as outrage has grown, they’ve said, well, if you’re going to make an app about accessibility, then yeah, you’re not going to have to pay us. So they have sort of been walking that back. But at the end of the day, you just still have this fundamental tension that this user data is about to be sold off.
Right. It’s really setting up kind of a three-sided war here, not just at Reddit, but at a lot of social platforms. There’s the company, Reddit, in this case. There’s the users. And then there’s the sort of AI parasites that are sucking up all of this data to train these large language models. And it seems like Reddit is trying to find a way to kind of make everyone happy except maybe the AI people. And it just doesn’t seem to be going all that well.
No, and I mean, Kevin, I think this is such an interesting question: how will the web forums of the future approach this? I can imagine a forum saying, we will never allow this data to be used to train a large language model.
Or I could imagine them saying, if we do sell any of this data to a large language model creator, then we’re going to compensate our users in some way. Like, I very much actually do understand why the Redditors are mad about this. And even though I don’t myself get that anxious about my own work being used to train large language models, I absolutely get why other people do.
So this Reddit revolt, this stand that a lot of the biggest subreddits are taking, it started this week. And it’s still going on. And some of the subreddits that initially said, we’re going to go dark for a day or two to protest these changes, have now said, we’re going to extend that indefinitely, basically until the company buckles and says, we’re actually not going to implement these changes after all. So what is Reddit doing about this pressure? Is it reversing course? Is there any indication that they’re going to back down from these changes?
To the contrary, they are digging in their heels. So Huffman sent an email to Reddit employees on Monday, which leaked to “The Verge.” And among the things he said was, quote, “There’s a lot of noise with this one, among the noisiest we’ve seen. Please know that our teams are on it and like all blowups on Reddit, this one will pass as well.” So that was a fairly dismissive quote. And after it leaked, suffice to say, Redditors got even more mad.
Yeah. I mean, Reddit as a user base has a history of yelling at the company’s leadership. It’s a pretty unruly community that does not take well to being told what to do. But it does seem like there is this kind of standoff now, where Reddit is saying, we’re not going to back down. We’re going through with these changes. This is what we have to do to make this business sustainable. And the Reddit community and all these big subreddit moderators are saying, we’re not going to let anyone post until you change your mind. So how do you see this resolving?
Well. I don’t. I mean, I don’t know. I think that this controversy is actually different from the previous ones. In the past, what made the Redditors mad was like, hey, we want to continue to post naked pictures of people without their consent. That was like a 2015 era Reddit controversy.
This one, I think, is a little bit more principled. And I think that folks are sticking to their guns. I’ve already seen people spin up these kind of decentralized Reddit alternatives and moving their communities over there, at least temporarily. I imagine they will want to return to Reddit if they can, but we’ll sort of see what happens there.
I do think that there is clear room for compromise here though, which would essentially just be to extend the amount of time that these developers have to either create a new business model for their apps or to wind them down. And so if there is going to be a compromise here, I think that’s where you’re going to find it.
Right. But I feel like the bigger question here is like is the open internet dead? We’ve talked about how it feels like the kind of social media era of the 2010s is dying off. And part of that is this idea of the open internet, where content exists on the internet in an indexable, searchable, scrapable way.
Everyone’s got these public APIs. There are these third party developers making all these kind of cowboy apps for these platforms. And over the past few years, we’ve really just seen that movement just shrivel. Like Twitter, Reddit, these companies are locking down their platforms. They’re killing off third party apps. And I think large language models may be the last nail in the coffin of the open internet.
I disagree with this. Look, Kevin, yes, the open internet faces real threats. I’m particularly worried about what generative AI is going to do to digital media. I think a lot of websites are going to break because of that. I think you’re going to see less media being produced, maybe because there’s just sort of less money to fund it. So all of that is real.
But if what you’re looking for is the kind of average internet citizen who gets online and posts, I don’t think that’s dying at all. I just think we are in a time of transition. The slow unwinding of Twitter has given rise to these decentralized platforms like Mastodon and Bluesky, which we’ve talked about.
And yes, it’s very early. But oh man, they’re so nerdy and fun in the best way. And every time I visit one of those places, I feel like the open internet is still alive. It’s just in a state of transition.
I do think if you run a website or any kind of social media service where people have been submitting examples of text, whether it’s Wikipedia entries or Reddit posts or some other forum, you are now sitting on a goldmine.
Because that data is extremely valuable to the companies that are building and training these large language models. And maybe that data didn’t have a lot of value before, but now it does. And so I expect that a lot of these companies are going to start trying to find ways to get a piece of the pie for themselves.
Well, I’ll tell you this much, the comments under “Platformer” stories, let’s just say they’re not going to go cheap. OK, so if you want them, get in touch. But open up your checkbook.
I mean, you’re joking, but I actually do think that the archives of “Substack” and “The New York Times” and “Platformer,” these are all part of the training sets for these models. I am sure of it. And I know that a lot of media companies right now are trying to figure out what do we do.
How do we prevent these language models from scraping our archives? And if they are going to scrape our archives, how do we at least make sure that we’re getting paid for that? And so that is a very active and ongoing conversation at a lot of media organizations right now. And I expect that those talks will sort of continue, and that a lot of these platforms will do what Reddit has done, which is to sort of cut off the spigot of free data access.
Yeah, well, you know, Kevin, I’ve always thought that in some small way, what we do as writers is the most valuable thing that humans have ever tried.
Training AI language models.
No. It’s the writing, the writing is what’s valuable.
Oh, Yeah.
The journalism, my friend.
I did love one sort of piece I read this week, which was about this future of these data wars. And it was saying that basically in order to keep getting high quality data to feed into these language models that some of these AI companies may have to start paying humans to create valuable text data that’s like fact checked and accurate and reliable and maybe posting that somewhere on the internet so that it can — which basically is just journalism, like they’ve reinvented journalism.
They’re going to — the AI companies are going to sponsor journalism organizations to get the, like, grist for their language models to be trained on, and eventually we’ll all be working for Sam Altman.
I mean, look, it may turn out that we actually need an artificial general super intelligence just to create a sustainable business model for the media industry. Because humans have not actually been able to crack that. So that’s going to be my first question of the AI.
Please save us ChatGPT.
When we come back, what the rise of MrBeast, one of the biggest YouTube celebrities in the world, says about where the internet is headed.
[MUSIC PLAYING]
Hi, Max.
He can’t hear us.
Fuck, Max.
Can you hear us now?
You guys hear me?
Yeah.
Hey, Max.
You sound great.
I can’t hear them.
Oh, no. Fuck, Max. Oh, I got it.
Hi, boys.
So good to see your smiling faces.
Good to see you.
Max Read, welcome to “Hard Fork.”
Thank you guys for having me.
So Max, you wrote a piece for “The New York Times Magazine” this week in a genre that I think of as kind of the YouTube Safari, which is where you basically like explain something that is happening on YouTube to people who by and large do not spend a lot of time on YouTube.
And this one was especially good. And I really wanted to bring you in to talk about it. So this piece was about MrBeast, who is one of the biggest people on YouTube. So I guess my first question is, why did you decide to write about MrBeast out of all the possible YouTubers?
Well, I mean, he’s literally the biggest YouTuber. As of this week, he is the second biggest channel on YouTube, and the biggest single YouTuber person. He passed Cocomelon, the toddler channel; he’s only below the music label T-Series.
And if you talk to an 8-, 9-, 10-, 11-, or 12-year-old, and you ask them what they’re into and what they’re watching, invariably, the answer is MrBeast. And as a 37-year-old, I don’t connect with MrBeast on the same level.
And I think that’s an interesting question of what is it about this guy? What is it about this cultural icon, this content producer, that makes him so popular and makes him so successful when it’s not immediately apparent in the way that something else might be?
Right so, I think a lot of people listening to this show will at least have heard of MrBeast even if they haven’t watched a ton of his videos. But just give us the basic kind of Wikipedia entry on him. Who is he? What kind of videos does he make? And how did he get so popular?
So his name is Jimmy Donaldson. He’s a 25-year-old from Greenville, North Carolina. It might help to think of him as the sort of Willy Wonka of YouTube. His stock in trade is making these extremely elaborate, expensive, extremely well produced videos of things like contests where he’ll do the sort of classic keep your hand on this Lamborghini, or keep your hand on this big pile of money. And whoever keeps their hand on it the longest gets to keep the money.
He’s probably most famous for a big, what I think he called the Real-Life Squid Game. So he sort of approximated the Netflix show about an essentially lethal reality show competition. No lethality in MrBeast’s version, I have to make sure that’s clear up front. And he had a bunch of people compete to win money.
And it appeals sort of on a surface level to the inner 12-year-old in all of us. And to the actual 12-year-old in 12-year-olds. This idea that you can just be walking down the street and MrBeast might pull up next to you and just hand you $10,000 or invite you to join a contest where he might give you $10,000.
The joke I make in the article is that the MrBeast shtick is essentially taking a number, adding as many zeros as possible on the end, and putting it in the headline. So you know, he’s going to give away $1,000,000. Or he’s going to have a $100,000 hotel room.
And you stick those zeros in the title of your YouTube video and you can get a lot of people to come watch it.
Now, Max, when you lay it out like that, it seems pretty evident why that would be a popular thing to watch, right? It’s incredibly fun to watch people get rich out of nowhere. Did MrBeast essentially invent this format? Or is he just kind of drafting off other people who were maybe doing it in smaller ways?
It’s probably not right to say he’s the originator in the sense that nobody’s ever thought to do this. In some ways he’s just doing what television has done for a long time. I mean, he’s doing “Extreme Home Makeover,” or “Queen for a Day” if you want to stretch even further back, sort of formats that we recognize.
But he’s taken this sort of interesting path to get there. So he’s been making videos for a decade now. And he is known for being a student of the YouTube algorithm, let’s say. He spent a lot of time in high school, and with his friends after high school, studying what videos were successful and why they were successful.
And he has experimented a lot in his own quest to be a successful YouTuber. And one of the sort of flash points on that quest, one of the things that shifted his path and the kinds of videos he was doing was a video where he gave away $10,000 to a homeless guy.
And that video was not like a huge mega viral hit, but I think it was big enough for him and sort of interesting enough in the sense of how viewers responded to it that he recognized that taking the money that he was making from his YouTube channel, which was relatively big at that point, and giving it away was a really effective way both to feel good and to do good, but also to grow his audience, to make his audience bigger.
And you’ll see a lot of YouTubers doing similar things like this, but I think Jimmy and MrBeast are so good at it and have sort of made such a science of how to turn that idea and these videos into successes that he stands head and shoulders above anybody else trying.
Right, I mean, it kind of sounds like the modernized YouTube equivalent of something like the Publishers Clearing House Sweepstakes. I would almost say that something like “Queer Eye” on Netflix is kind of the 30-somethings’ MrBeast, the sort of feel-good, excited guys enter your hometown and lift somebody up. And “Queer Eye” is obviously extremely successful, but it’s not MrBeast successful.
Yeah, there are so many questions I want to ask you about MrBeast, but I wonder if we should just watch a MrBeast video together to have you narrate us through what you make of this and how he stages these stunts and what we might learn from this.
So in your piece, you talk about this video titled “1,000 Blind People See for the First Time,” which generated a lot of controversy when it came out. So I’m going to share my screen here. And let’s just watch this together.
This was super controversial. I feel like this actually might have been the first time a lot of people heard about MrBeast just because everyone had an opinion about this one.
Before we even start, I feel like we should describe what the thumbnail to this video looks like, because I do think, for me at least, this was a key reason that people kind of flipped out about this video and found it so weird. So if I can just describe it: every YouTube video has a thumbnail, the image that the video shows when you click on the sidebar or whatever. And MrBeast has these unbelievably, sort of uncanny-valley photoshopped thumbnails. And in this one, the thumbnail is what I assume is a stock photo of a kid who does not appear in the video, with a bandage raised up over his eyes.
He’s got a shocked face on. It’s not clear whether he’s shocked happy or shocked because he’s been like forced to watch something he never should have seen. And then just over his shoulder, to his right, is MrBeast grinning. Like the gravity of people who haven’t been able to see receiving surgery that gives them eyesight that they hadn’t had before is not reflected in the kind of cheesiness of this weird thumbnail.
Yeah, and I also remember hearing an interview with MrBeast once where he said that he has a full time thumbnail team, that there are people whose entire job is just to test out many, many different iterations of all of the thumbnails for all of his videos and see what gets watched the most. So yeah, this is not an accident that his thumbnails are so weird and seemingly uncanny. All right. Let’s watch this video.
We’ll do the intro. And it is — like I feel like it’s worth noting MrBeast is really good at grabbing your attention and explaining what’s going to happen in the first 10 seconds of a video because that’s so important to getting people to stick around instead of click away. All right.
- archived recording (mr beast)
In this video, we’re curing 1,000 people’s blindness. [CHEERING]
It’s going to be crazy. Most of us see the world like this. But here’s the thing, 200 million people see the world like this. But I just made it one less.
[CHIMES TINKLING]
- archived recording
Oh. Wow.
- archived recording (mr beast)
She’s just one of 1,000 blind people we helped from around the world. They can’t see, but we have all the technology to fix it.
- archived recording
Half of all the blindness in the world is people who need a 10-minute surgery.
Can we pause there? I just want to say that when you meet the surgeon in this video, it says in giant font, Surgeon. And there’s a red arrow pointing to him. This man does not have a name. What are his credentials? We’ll never know. For our purposes, he is simply Surgeon.
And we should say, speaking of chyrons: in the video, when we see the first person whose blindness is cured, a little counter pops up at the bottom that says 1 out of 1,000. That, to me, is reminiscent of a video game where you have to collect as many blind people as possible, I suppose, and cure them. And MrBeast is going to plat it.
You know, so this is his really fast intro, I think. He talks basically like a 12-year-old. I don’t mean that as demeaningly as it maybe sounds. But he has the kind of overexcited, enthusiastic, sort of running-over-himself attitude of an adolescent doing something really cool, which I think is one really obvious reason why he connects so much with his adolescent viewers.
Yeah, and to your point, Max, there is not one wasted frame in anything that we watch. Like every single frame served a purpose, and they start playing with your emotions instantaneously in this video.
Right. All right. Let’s skip ahead a little bit and see when he actually gives someone $10,000.
- archived recording
You pay for my surgery?
- archived recording (mr beast)
Yes.
- archived recording
Seriously?
- archived recording (mr beast)
You know what, here’s $10,000. Make your day even better.
She’s fallen on the floor. So he gives this woman a briefcase filled with $10,000 that’s going to pay for her cataract surgery. She falls on the floor in hysterics and celebration.
She’s actually getting $10,000 on top of the cataract surgery. When we see her in the video, she’s presumably just gotten the surgery, and then MrBeast opens the suitcase of cash.
That was her Beast bonus.
The other thing that I always find interesting about these: a lot of “Queer Eye” or whatever would focus on the person who’s receiving the money. You get the sob story. You get how it’s going to change their life.
MrBeast is just in and out. You get the reaction; you don’t need to know much else. There are a couple more in-depth stories that I’m sure we’ll see in a sec, but for the most part, the video can’t go over, what is it, eight minutes long, something like that?
Yeah. It’s eight minutes long.
You don’t necessarily need to get the full story. You want to just get the quick hit of the extreme emotional reaction and get out.
Yeah, so here in our next clip, this is Jimmy, MrBeast, talking with Jeremiah who gets not only surgery to correct his eyesight, but also a big check for college.
- archived recording (mr beast)
How’s it going?
- archived recording (jeremiah howard)
OK. Yeah, I’ve been subscribed to you for 11 years.
- archived recording (mr beast)
Really, oh my god.
- archived recording
Jeremiah has been blind in his right eye since he was born, and it’s affected his vision, his entire life.
- archived recording (mr beast)
Are you excited to be able to see out of both?
- archived recording (jeremiah howard)
Hopefully, if the surgery goes that way.
- archived recording (mr beast)
Because Jeremiah was born with cataracts, his right eye never received light. And that means that this surgery only has a 50 percent chance of working.
- archived recording
I think we go ahead and get underway, and then we’ll just talk post-op.
OK. So this video was sort of central to your piece about MrBeast and his relationship with his audience. So why did you decide to focus on this particular video and the controversy it generated?
Well, like Casey said, I think for a lot of people in our age bracket, the crypt-keeper age bracket, this was sort of the first time MrBeast really entered our media environment, so to speak. It exited the YouTube sphere and entered other media outlets.
And part of the reason for that and part of the reason I was interested in writing about it is that it sort of presented a moral or an ethical dilemma about what MrBeast is doing. There are many questions you could ask, but maybe the central one is, is it exploitative to do something unquestionably good, like pay for cataract surgery in service of audience growth?
Are you exploiting these people? Are you helping them? So I interviewed Jeremiah Howard, who’s the first kid we heard there. And Jeremiah’s story is incredibly fascinating. He’s an incredibly bright and articulate kid. He’s raised by a single father.
He spent a long time trying to get assistance for the surgery. He tried to raise money. And he was sort of turned away at every opportunity. He couldn’t make it work. And then MrBeast swooped in and paid for this. And to me this is like, you could not write a dystopian science fiction story more on the nose than like, oh well, the government couldn’t help. The GoFundMe didn’t fund enough.
So luckily, the Willy Wonka of Greenville, North Carolina popped by, saw my GoFundMe and decided to pay for it. And you get none of this in the video. You get none of this sort of sense that this was a long period of trials for somebody who’s on the lower end of the socioeconomic spectrum, who doesn’t have access to the same kind of care that you know people at a higher end do.
For MrBeast, it’s sort of we’re going to pay for the surgery. Novelty checks have not gone away. They give him an enormous $50,000 check and say it’s for college. And you get a little bit of weepiness with Jeremiah and his dad. And that’s that.
And so trying to figure out how you feel about this kind of total mixture of genuine charity, of genuine philanthropy with abject YouTube view mongering is really interesting and feels really sort of typical of the moment.
For what it’s worth, I have two nephews who are 7 and 10. And they love MrBeast. And I was asking them last night what they like about him. And my seven-year-old nephew said, he’s kind. And I don’t know, I thought, well, if my nephew’s going to be taking any lessons from YouTube, kindness, I think, is about as good as I can hope for. So I don’t know.
Yeah, it’s kindness or investing in crypto. So I think that kindness is probably better. For seven-year-olds at least, I think kindness is your better bet.
So MrBeast is kind of this fascinating character right now. He’s obviously a huge celebrity. He’s also got all these brand extensions. Like, I was in Walmart a few months ago and just saw this enormous display of his chocolate bar.
So like in a very literal sense, like he is turning himself into a kind of Willy Wonka figure in the sense that he is selling chocolate and other food products under his name and brand.
The way I think about it: something that Jimmy is obviously very good at, and something that he talks a lot about, is his desire to study and master what you might call a platform marketplace. What do the sorting and recommendation algorithms that YouTube has in place reward? And how can I make videos that push that?
And I think he sees, and this is, I should be clear, speculation on my part, I think he sees other platform marketplaces that he can apply the same kinds of lessons or the same techniques of study to. So one of his most successful endeavors right now is something called MrBeast Burger.
So parents of adolescents in New Jersey and New York will know that there is a physical MrBeast Burger location in the American Dream Mall in New Jersey. But for the most part, MrBeast Burger is one of these ghost kitchens where you can license the MrBeast Burger brand to restaurants around the country who can then offer it on DoorDash.
And my sense is that Jimmy looks at something like DoorDash, an app that has certain similarities to the YouTube app, and thinks, how can I apply the lessons I’ve learned on YouTube, and also master the DoorDash algorithm, and also figure out how to own Seamless the way I own YouTube?
How much do you think that his focus on philanthropy reflects his own interests? And how much of it is just, this is actually the most popular thing you can do on YouTube?
This was sort of the central question of the article. And I don’t want to spoil it, but I did not come up with an answer. I just kind of waved goodbye because I couldn’t figure it out. What I would say is I think that there’s sort of a generational divide, let’s say, between people who grew up not on YouTube and people who grew up on YouTube.
Where I think the idea, maybe for people like us, is that you do things sincerely or cynically. You are doing things out of the goodness of your heart, or you are doing them with some kind of ulterior motive. And based on the young adults I talked to who are MrBeast fans, and on things that MrBeast himself has said, I think that divide doesn’t work in quite the same way.
For many of them, they recognize the kind of cynicism at play in what MrBeast does. They’re not stupid. They see that the goofy, corny thumbnail, the sort of alienating thumbnail, is a necessary component of what he needs to do to get his audience bigger.
And as far as they’re concerned, that’s table stakes. You’re not even playing the game if you’re not doing stuff like that. And so I think that for MrBeast, it’s those two kinds of questions: does he actually want to do this, or is this what the algorithm is sort of encouraging him to do?
I don’t know that he would recognize the distinction between those two things. And I say this because I genuinely do think that he is a good-hearted kid, that he comes to philanthropy from an honest place, that he wants to help people out. But I also don’t know the answer to the question: would he still be doing this if it wasn’t rewarding to him on YouTube?
Because it wasn’t like he started his YouTube channel giving out money. This was something that he arrived at. And then he’s seen how much success he can have with it. It’s sort of confounding to confront somebody like this who is at once extraordinarily cynical about how the platform works and how he can game it. And on the other end, like utterly earnest about how he’s going to make use of that cynicism.
I mean, I will say, for all of us who have spent time writing about our fears about the YouTube algorithm and what it incentivizes, there is something deeply relieving about the fact that the most successful individual YouTuber has gotten famous by giving away money and curing blindness.
Yeah, I mean, I would be interested, Kevin, in what you think about this, because you’ve written some really good stuff about YouTube radicalization, in the sense that you can follow a rabbit hole on YouTube and suddenly find yourself spouting insane anti-feminist nonsense.
And so, you know, my reaction to MrBeast was like, well, god, compared with whoever the worst of the worst is, compared with Stefan Molyneux or whatever, this guy seems fine. Like, please, let’s get 50 more of him.
Yeah, I mean, I had the same reaction. I remember a few years ago, I wrote a profile of PewDiePie who was sort of the MrBeast of 2015, like he was the biggest YouTuber in the world for many years. And he had kind of this classic YouTuber career arc where he got started playing video games. And then got really popular, you know, tens of millions of subscribers.
And then he basically started making all of these edgy, offensive jokes on his channel. And he did this prank where he paid some guys on Fiverr to hold up a sign that said “Death to All Jews.” That went horribly, and he lost brand deals and kind of blew up his career.
And so I asked him about that, and what he said was basically that at the time, he was just kind of following the algorithm. He recognized that the edgier he acted on his channel, the more views he got. And he described YouTube at the time as this kind of place where all the big YouTube creators were testing the limits, pushing the boundaries, seeing what they could get away with, because that was what was good for their channel.
So I guess for you, my question is like, does the fact that MrBeast is bigger than PewDiePie now, does the fact that he has kind of become the biggest YouTuber mean that YouTube has changed, that it’s more wholesome, that it’s rewarding different things than it used to? What does it say to you about YouTube and social media, maybe as a whole that MrBeast has taken over the internet in this way?
Yeah, this is maybe the big unanswered question I have, because the YouTube algorithm is such a black box. And by YouTube algorithm, I’m actually talking about thousands of different ways that the site recommends and rewards video creators, this sort of system of sorting and recommending.
We don’t know what its prerogatives are. And so somebody like Jimmy or like PewDiePie can reverse engineer it to some extent. And they can follow it and sort of see what they can get out of it. One easy story would be: well, the kids are sick of drama. They’re sick of edginess. They want niceness.
Get the Paul brothers out of here. Get PewDiePie out of here. Bring out MrBeast. And I don’t think that’s wrong. I do think that among some of the kids I talked to, there is fatigue with the kind of endlessness of the Hype House dramas that sort of characterized YouTube in the late 2010s, say.
I also think it’s hard not to come to the conclusion that YouTube is doing more to reward sort of wholesome, slightly more substantive, slightly higher production value content. To be clear, I also don’t think that means that Jimmy is sort of lucky that he just happened to be the wholesome guy at the time when the wholesome stuff was being rewarded.
I think that he is really attuned to what does well and how to make it do well. And so the fact that this is what he’s doing suggests that the algorithm is telling him something, not to anthropomorphize the algorithm too much, but the algorithm is telling him something that he needs to follow.
So you know, I think that — if I was in charge of YouTube, thank god, I’m not. But if I was, I would be so happy that MrBeast is the biggest YouTuber. Like, I just can’t imagine, after having gone through half a decade of having to listen to Kevin appear on every news channel and say this platform is maybe not the greatest thing, to have Kevin be like, well gosh, isn’t it great that we have a guy who is just giving away money everywhere. Not that you said that, Kevin, but you could say it at the end. And I think you would make people at YouTube very happy.
For what it’s worth, my channel of insult commentary is still growing very quickly. So I don’t think they’re totally out of the woods. One question I have is whether MrBeast has inspired a lot of imitators, because one way that we might be able to tell whether the algorithm had truly changed is whether there were more of him. So how influential has MrBeast been on the broader community, do you think?
Extremely influential. So, bracketing the sort of generosity aspect of what he does, the philanthropy and charity aspect, he is more or less the originator of a whole genre of YouTube called junklord, which is so called because it involves spending a huge amount of money on junk and sort of showing off how much money you just spent.
So that’s stuff like, MrBeast filled a big tub with slime and bathed in it for 24 hours, or he bought $500,000 worth of bouncy balls and threw them in his brother’s backyard. I don’t know if these are real videos or if I’m just mad-libbing the vibe here.
He bought like 100 — this one is real. He bought like 100 leaf blowers and tried to make a hover field that he would go over. So there’s a lot of people doing this kind of thing. And I think it’s the sort of philanthropic aspect of what Jimmy does that separates him out from the mass of successful, but not MrBeast level successful, YouTubers.
But there also are more and more people on TikTok and YouTube doing the kind of, we’re going to hand out $10,000, videos. The most recent and worst one I saw was a stunt; I think everybody involved was acting. These two TikTokers filled the bed of their pickup truck with like 200 packets of instant ramen, and then they would drive up to, again, I’m pretty sure they were actors, people pretending to be homeless on the side of the street, and be like, do you want to have some instant ramen or something? Just some awful combination of, like, bait and whatever else.
But I think that kind of video obviously owes an enormous amount to Jimmy and what Jimmy was doing and has been doing. And you know, I think it’s a credit to him that he’s never made that kind of mistake. I mean, I can’t believe I’m saying it’s a credit that he’s never offered ramen out of the back of a pickup truck to a homeless guy. But in the context of this stuff, that is, frankly, a credit to him.
It’s also just crazy, the economics of this. Right? Like, I find it hard to believe that these videos pay for themselves. I imagine in many cases they don’t actually pay for themselves, and they sort of serve to prop up the other MrBeast businesses. But my god, it must be expensive to cure blindness for 1,000 people.
Yeah, where is he getting the money?
I mean, as far as I can tell, and I had no Deep Throat handing over MrBeast financials to me in a parking garage in Greenville somewhere, but as far as I can tell, he actually is making enough money to fund his videos from his videos.
A very common criticism you see is that people assume he’s keeping most of that for himself, that it’s mostly profit. God help me, but I’m basically at this point a believer that he just pumps the money back into the videos.
If you watch him often enough, Jimmy does not seem like a guy who really wants to live in a mansion and have a Lamborghini. He doesn’t have the Logan Paul sense of extravagance in terms of personal wealth. He seems like a goofy 25-year-old guy from North Carolina who really likes making YouTube videos and would rather just put more money into making really good YouTube videos than putting it away somewhere.
That being said, anything is possible. And it’s very hard on YouTube end or on the MrBeast end to actually see real numbers on any of these things.
Right. Max, I was struck by an idea that you wrote about in your piece, which was that MrBeast’s audience is very conscious of not just his shtick, but his whole business model: how he uses the ad revenue from his videos to pay for philanthropic projects for his next video.
And these kids who are watching his videos, they almost feel transgressive by watching them. By clicking on a MrBeast video and watching it all the way to the end, they’re not only watching something entertaining, but they’re kind of taking money out of the pockets of Coca-Cola, or whoever advertises on YouTube, and putting it into cataract surgeries for blind people. So say more about this relationship between MrBeast and his young audience on YouTube. And what does that tell you about where this creator economy may be going?
Yeah, so I’m cribbing this idea a little bit from an academic named Vincent Miller, who works out of the UK and who wrote a paper recently about MrBeast. And I like this paper because it helped crystallize a sort of sense I was getting from talking to MrBeast fans and watching the videos, which is: you have a sense when you watch TV, a vague sense in the back of your head, about how the people on TV make money, how the business makes money. It’s not like it’s hidden from us. But you put on “Mad Men,” and you’re not thinking, well, that’s $0.10 in Jon Hamm’s pocket and $0.25 in Matthew Weiner’s pocket, and RJD2, who made the theme song, is going to get a little residual check in a little bit. And you’re not thinking about how that overall works into the projects and ideas of the other people.
To use the sort of academic language of it, maybe, you’re not necessarily thinking about yourself as a commodity or as a member of an audience that is a commodity against which advertising can be sold. And what makes MrBeast different is that he is very close to explicitly telling his viewers to think of themselves as a commodity.
He turns on the video, and especially on his philanthropy sub-channel, called Beast Philanthropy, he starts almost every video by saying something like: every person who watches this is giving me $0.05 that I’m going to hand to somebody else, is putting $0.05 in the pocket of these people, is planting five trees, or whatever it is.
And I was struck by this idea because I think one, it suggests to me that the people who are watching MrBeast, and this was reflected in conversations I had with people I talked to, have a much more sophisticated view of platforms and how they work than I think is often assumed about adolescents.
And I think there’s something quite powerful about having Jimmy treat them as sophisticated consumers of media. I don’t want to overstate this. Obviously, the videos are the videos; we’ve heard some of what the videos are, and they’re not the most sophisticated things in the world. But I think it’s not just that the kids are sort of watching this, awed by the beauty of giving people sight. It’s also that they know that by watching it, they’re getting cataract surgery for more people. And they like being part of that kind of movement.
The other part that’s powerful: when you’re talking about people who are adolescents or teenagers who want to help in some way, they don’t have money to donate themselves, obviously, and they don’t have control over their schedules to donate time or volunteer. To have somebody sit down and say, well, actually, by watching YouTube, which is something you probably were doing already, you are helping. And you should feel powerful. You should feel like a member of the MrBeast movement, so to speak.
I don’t mean to hold this up as a model for future philanthropy. I think there’s all kinds of problems with it. But this really helped me see part of what makes MrBeast appealing beyond just the obvious appeal of feel good stories in his videos.
Max, for people who don’t inhabit the YouTube universe, who maybe saw this cataract surgery video going viral on Twitter and saw all of the heated commentary about it, the questions were: is this cynical? Is it exploitative? It sounds like you basically came out of that question on the side of, this is fine. Like, of all the things that could be popular on YouTube, at least it’s this guy and not some neo-Nazi. Is that a fair reflection of where you came out after spending so much time and energy reporting on MrBeast?
God, you’re asking me to endorse MrBeast live on air. Well, taped on air. Yeah, I think, yes. I feel hesitant about a lot of what he does. I remain extremely impressed with his ability to toe an ethical line that is maybe a little bit further than I would like, but that doesn’t ever cross into things like the ramen in the back of the pickup truck.
I don’t want to overstate the extent to which that represents a true ethical commitment to non-exploitation or whatever. But I do think that kind of line is important. And I think the fact that he consistently stays on the right side of that line is worth something.
And so I don’t blame anybody who looks at those thumbnails and thinks that there’s something sort of awful and — or as a streamer put it, very icky about this sense that we’re watching inspiration porn, disability porn for the purpose of making MrBeast a bigger, more important figure with a reputation for niceness.
But I came away feeling, having done all this reporting, and maybe I should even say this: one of the things I entered this reporting doing was looking into the thriving community of MrBeast conspiracy theorists, who are quite positive that he does not give the money away, that the people who receive the money are employees of his, that there’s all kinds of shenanigans on the back end.
And I found nothing. The people I talked to who are in the videos said it’s like reality TV: producers maybe push a line or suggest that you say something. But it’s not a total con job. And you know, Jeremiah really did receive a $50,000 check and cataract surgery.
The fact that I wasn’t able to find anybody who was like, this is fake, he was a jerk, this whole thing is a total put-on, probably pushed me onto the side of saying, I guess it could be worse. That’s the official Max Read line on MrBeast: “Could be worse.”
And Max, that leads me to my last question, I think today, which is if you need $10,000, where does MrBeast usually hang out?
Greenville, North Carolina. I would suggest just driving around Greenville as much as you possibly can. You never know. Maybe get a job at a deli. He likes to show up at businesses and offer people money to quit their jobs. So maybe go to Greenville, get a job at a deli, and just wait. It will only be five or six years before you see him.
Max, if we start a YouTube show. If Casey and I start a “Hard Fork” YouTube show, what should our shtick be?
Oh, god. I don’t want to — maybe you guys can edit this out. You’re much too old. It’s just not going to happen. I think you know what it could be: it could be like, old guys react to young people stuff. I think that could be cool. Just have you guys watch MrBeast videos. You know when they get grandmas to listen to trap music? That’s kind of where you’re at right now.
Well, you know, Max, we were going to offer you a Tesla full of dollars for coming onto the show today. But that’s off the table now.
Max, thank you so much for coming.
This was great. Thanks, Max.
Thank you guys for having me. This has been a blast. [MUSIC PLAYING]
When we come back, how the big platforms decided to stop policing 2020 election lies.
[MUSIC PLAYING]
All right, Kevin. Now from time to time on the show, we do have to raise alarms. We like to talk about what is new and exciting in the world of tech, but sometimes, there are things that concern us.
Yeah.
And this week, I am afraid I have one of those.
Oh boy.
So I have a hypothesis for you. And the hypothesis is this. We may have passed peak trust and safety.
What do you mean?
So trust and safety in the tech world is the umbrella term for the people who try to keep the platforms clean. So if there’s hate speech, bullying, nudity, anything bad that’s happening on the platforms, like let’s say a foreign country tried to interfere with our elections.
Hypothetically.
Hypothetically if that happened, the platforms would all have their trust and safety teams who would work to solve that problem. And we saw an explosion of this industry after 2016 as platforms like Meta and Twitter and YouTube all invested heavily into this field, hiring tens of thousands of people.
And on one hand, I think very few of us would say that the period from let’s call it 2017 to 2022 was a golden age.
It was a golden age for something.
It was a golden age for something. But I don’t think people would say it was a golden age for how safe the platforms felt to use.
Correct.
And yet halfway into 2023, I’m increasingly concerned that we are going to look back at it as the time when the platforms paid the most attention to this stuff.
And why do you think that?
Well, I can give you a handful of examples from recent weeks. And I want to start with Instagram. Have you heard of this character, Robert F Kennedy Jr?
I have. Yes.
He is the son of the original Robert F. Kennedy, the one some have called the good Robert F. Kennedy. And the new one, the son, became one of the leading anti-vaccine conspiracy theorists in the entire world.
Yes. He’s famously like very anti-vaccine, pre-COVID. This was like way before COVID. He was making up crazy insane claims about the vaccines that kids get to go to school.
Yes, exactly. And then COVID came along and gave him an opportunity to massively expand his platform to try to scare people away from getting these shots that very well may save their lives. So Kennedy starts doing this. He amasses 700,000 followers on Instagram.
And in 2021, Instagram says, you know what, we’ve had enough of this. And so they get rid of his account. They get rid of the account of his organization that tries to advance the same sorts of conspiracy theories. And somebody like me says, OK, great. This guy has lost his reach.
And if he wants to go stand on a soapbox in Central Park and yell about vaccines, he can. And that’s fine. But he can’t get on Instagram and harness the reach of this platform in order to advance his views. But then Robert F. Kennedy Jr. has a brilliant idea. You know what this idea was?
What is it?
He decides to run for president.
That’s how I solve most problems in my life. Yes.
And I think it’s safe to say that Robert F. Kennedy Jr. has about as much chance of becoming the Democratic nominee as, like, Connor Roy had of winning the presidency in “Succession.” OK, this is not a serious campaign. But sometimes these fringe candidates will run not because they think they can win, but because it gives them a platform to advance their views into the mainstream.
Yes. And I remember several years ago when people like Marjorie Taylor Greene were announcing their runs for office. This was sort of the loophole that people had discovered, that these platforms, they had one set of rules for normal users, and another set of rules for public figures and politicians and candidates for office. Is that what you remember?
That is absolutely right. And it’s just what we’re seeing here. A spokesperson for Meta told “The Washington Post,” quote, “As he is now an active candidate for President of the United States, we have restored access to Robert F. Kennedy Jr.’s Instagram account.”
This seems like basically a backwards system. Like, if I were running a social network, which let's all be glad I'm not, one rational thing that you could do would be to apply stricter standards to bigger accounts and more prominent people, right?
That's right. So Robert F Kennedy Jr. getting his Instagram account back, that's thing one. Let's talk about thing two. After 2020, YouTube, Twitter, and Meta, which was then Facebook, all say, you cannot come on our platforms and say that the 2020 election was stolen.
And then Twitter quietly just stops enforcing the rule. It actually doesn’t even get reported that they have stopped enforcing that rule until months later. And by the way, this even predated Elon Musk. This was not an Elon Musk thing. Twitter just stopped enforcing the rule.
Then Meta lets Donald Trump back onto the platform. And in the hubbub of that, I think a lot of people didn’t notice something else that they did, which was they said, we’re also not going to prevent people from lying about the 2020 election. And then along comes YouTube.
Now YouTube is the most cautious of the platforms. It operates by the sort of "Jurassic Park" policy, which is that if you don't make any movements, the dinosaurs can't see you. And so they're always moving very slowly when it comes to this sort of thing. But sure enough, on the Friday before Apple is going to unveil the Vision Pro, YouTube says, oh, by the way, you can lie about the 2020 election now on YouTube. We're not going to enforce that rule anymore.
And their reasoning was, and this is a quote: "In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm." I sort of know what that statement means, but I also sort of don't.
I think they're trying to say something like, look, there could be some valid political speech about whether the 2020 election was stolen, and we would hate to have a point of view on that. And so we're just going to throw up our hands and be done with it. And so now we are in this world where we're heading into an election.
And one of the most contentious issues will be around what happened in 2020. And if you are a right-wing candidate who just takes it as an article of faith that the election was stolen, whether you believe it or you just think it’s a useful thing to say to rally your base, you can now do that. And the platforms are going to stand down. They’re laying down their arms.
And if you’re a Democratic candidate, you can go on and lie about vaccines and start your presidential campaign that way.
Yeah. And now look, there are people out there who believe, and I'm sympathetic to this view, that when it comes to political speech, you want to allow for the maximum amount of discussion, right? Good ideas come from the fringes sometimes. You don't want to take a heavy hand in saying, this idea is good and this idea is bad, particularly in the context of democratic debate, right?
At the same time, one of the sort of core tenets of the whole trust and safety movement was that these platforms can and do incubate violent movements. People see misleading information, hoaxes. They get riled up, and that spills over into real-world violence. And all of a sudden you have a January 6 on your hands.
So look, I’m not saying it’s as simple as because platforms existed, January 6 happened. But I am saying as we head into 2024’s election, I think we really have to be on guard for what kinds of conspiracies are spreading on these platforms, how many views they’re getting, who is saying these things, and to what extent those movements are moving from the online world to the offline world.
Right. And you and I have both spent a lot of the past five or six years writing about trust and safety, content moderation, et cetera, in the context of these platforms. And I’m curious why you think the platforms have shifted their policies.
Is it just that they sort of are kind of admitting defeat, waving the white flag, saying, look, we’re spending all this money. We’re hiring all these people. And yet there is still a huge conspiracy theory contingent out there that wants to believe stuff that isn’t true. And so nothing that we’re doing seems to be kind of moving the needle on that. So why are we even trying? Or is it something else?
I think there are a handful of plausible explanations. One is political pressure. The Republicans are in control of the House now. They’re already pressuring disinformation researchers to turn over a bunch of documents, which in my view, is essentially just harassing these researchers for trying to understand the spread of conspiracy theories.
So if you’re a platform, you’re thinking, I don’t want my CEO to be called before Jim Jordan to testify about this thing. And the less we moderate, the fewer questions we’re going to have to answer. So I think that’s one factor. The second factor, though, is that we are in a different economic world than we were in 2020, right.
And a really sort of cynical one sentence way of describing this would be that trust and safety was a zero interest rates phenomenon.
Totally.
When these companies were printing money and their margins all looked great and the money was rolling in, they felt great about spending hundreds of millions of dollars on these huge apparatuses like the Meta Oversight Board, which is a multi-hundred-million-dollar project to essentially serve as the content moderator of final resort. It's basically impossible for me to imagine Meta deciding in 2023 that it would want to create an oversight board.
Right. And spend that much money on it, because they don’t have as much money now. And they’re spending it on the Metaverse and all this other stuff. So I buy this hypothesis, actually, that trust and safety was in many ways a zero-interest rate phenomenon.
And I think it's possible that it's broader than just, like, that stuff costs a lot of money and we don't have the money anymore. I think one other thing that really changed in the last few years is that employees at these companies have lost a lot of leverage.
And if you remember back during the kind of days of the sort of misinformation wars, like when Alex Jones was on these platforms and they were trying to decide whether to kick him off or not, one major factor in that was that their own employees, these tech companies’ own people were telling them you have to kick this person off our platform. I don’t want to work at a platform where Alex Jones has one of the biggest and most influential megaphones.
Totally.
And these companies largely have kind of like liberal employee bases. They’re all based in the Bay Area. They draw from a San Francisco tech crowd. And I think those employees had a lot of leverage because the labor market was very tight. Like these people had a lot of options.
If they didn't like how their employer was handling trust and safety, they could go work somewhere else. And now that's totally changed. All these companies have laid off thousands of workers. The workers who are left feel very grateful to be there and have their jobs. And they're not trying to, like, upset the apple cart here.
And so I think you probably don’t hear as many protests from employees at these companies saying like you have to take this stuff seriously or we’re going to walk.
Yeah. I think the employee piece is huge. I want to say a third and final thing about all this, which is that, so far, I think we could say that this has essentially been a vibes-based analysis of the situation. But recently, I took a look at a third issue where it seemed like maybe the trust and safety enforcement has receded a little bit.
And it's a really serious issue that makes all of us uncomfortable, which is what folks in this field call CSAM. This is child sexual abuse material. It's the worst stuff that gets posted on any of these platforms. And there was a great report out of Stanford this month that took a look at these networks of buyers and sellers of CSAM.
And without getting into all of the details, one of the implications of that report was that this stuff is proliferating, in part, because there has been a disinvestment in trust and safety teams among the big platforms. Twitter, of course, number one. I mean, they have lost almost their entire trust and safety division.
Do they even have a trust and safety division anymore?
Their head of trust and safety recently resigned. Their head of brand safety recently resigned. It’s unclear to me in this moment who is running that division.
Catturd2 is running the trust and safety division.
Got a battlefield promotion. There was also a piece in March by this law professor, Kate Klonick, who wrote about the sort of past five years as a golden age of tech accountability. And she wrote about how counterintuitive that seemed, that in this era where Congress passed not one single new tech regulation, we would think about that as the golden age. But then she also lays out some of the big investments that these companies made in making the platforms safer. And she said, you know, I hate to say it, but we may look back at these as the good days. So I think the question now is, where do we go from here?
And I really only have one idea, which is that a reason that this stuff started to get a lot of attention after the 2016 election was that there was a lot of pressure on these platforms from regulators, from lawmakers, from journalists, from activists.
And I do think it may be the case that over the past year or so, after Trump left office, some of us, myself included probably, took our eye off the ball a little bit. And we started to feel like, OK, the investments have been made. The platforms are now taking this seriously.
We can no longer assume that the teams are adequately staffed, that they’re paying attention to the right things. And I think it’s going to be incumbent on us as journalists who cover this world to start ringing these alarms when we see things going wrong.
So I hear everything that you’re saying. And I agree largely with a lot of it. I mean, just anecdotally, like I used to spend a lot of my time covering disinformation on social media. And I don’t anymore.
It was very annoying to me when you did this. I was trying to make that my thing. And I'm like, oh, it's got to be his thing too. So now we're all writing about AI and all this stuff happening there. And it does seem like there's been sort of like a little bit of outrage fatigue among some of the tech press about, oh, am I really going to write another story about a crazy conspiracy theory that's going viral on social media? And I think, for a lot of reporters, the answer to that is, like, increasingly no.
But I want to challenge you on this, because I don't necessarily think that it's a bad thing if we have passed peak trust and safety. And in part, that's because I think that what's happening, in addition to all of the factors you mentioned, is that social media is just becoming a lot less centralized.
We don't have these kinds of massive hubs that every conversation and all political discourse flow through. So if the conversation is not all happening on Facebook and Twitter and YouTube, if it is happening in Discord servers and group chats and all these other places where people are now talking amongst themselves on the internet, maybe it's not that the sort of trolls won.
Maybe it’s that the trolls are just more diffuse now. And you really don’t need these big centralized police forces of content moderators anymore because there just isn’t that much political discourse happening on these platforms. What do you think of that?
It is definitely the case that social media is not as monolithic as it once was. We’ve talked about some of the new upstarts on this very show. But just because there are more social platforms and the conversations are more diffuse, doesn’t mean that there aren’t still risks here.
The QAnon movement did not start on Facebook. It started on 4chan, but it spread from there to every other network, right. So just because a network is smaller, doesn’t mean that the rest of us are going to be protected once that conspiracy theory moves off a small platform into let’s say the mainstream of the Republican Party.
So it's just something that I want us to keep an eye on because, of course, the platforms are going to tell us every step of the way, "We continue to take trust and safety very seriously. We're not reducing our investment." But I'm less convinced. And I'm nervous.
Yeah. I think that's right. And I think that it's certainly possible that as the 2024 election approaches, these problems will become more salient. But it makes a lot of sense to me that there isn't as much investment in trust and safety these days as there was a few years ago.
I mean, the biggest reason is because Donald Trump is not the president right now. So when you did have Trump in the White House, there just was this kind of, like, urgency to content moderation because not only was he in the White House, but there was a whole sort of pro-Trump media ecosystem that had revved up to support him and spread lies about his enemies and opponents, and that just really has all dissipated. I mean, some of it's happening on Truth Social. Some of it's happening on Parler or Gettr or wherever the hell these people are now. Tucker Carlson is not at Fox News anymore. Like, the ecosystem has just really shifted in a way that means that maybe the same structures that we had after 2016 aren't going to work or aren't even necessary in 2023.
Well, here’s the one thing I would say about whether these things are necessary. I think we can all agree that Twitter has lost the most trust and safety employees as a percentage of the staff of any of the big social platforms over the past year. And as of the last reporting that I read, Twitter’s ad revenue was down 59 percent year over year. And there’s a direct connection between the degree to which you set and enforce policies and the degree to which advertisers are going to want to spend money to be on your platform.
I guess I'm curious what you think they think about that question. These are companies. They are not humanitarian ventures. They have a fiduciary duty to their shareholders. And they want to make money. And so you would think that if there was such a clear connection between the amount you spend on trust and safety and the amount of money you make in advertising, that they would be pouring money into this stuff still.
So do you think they have a different idea about that? Do they think that how much you moderate your content and how much you make in advertising are not as connected as you think they are?
I think they’re connected. But I think that it just leaves out the political piece. It leaves out the fears of being called to account for every single policy and every single decision in front of Congress. And it doesn’t account for the fact that there are a lot of folks at these platforms who want to enable the maximum amount of speech that they possibly can, because that is a principle that they hold dear.
And I respect that principle. I do. But the reason I want to talk about this on today's show is because it feels like every time I look up from my laptop over the past few weeks, I see a platform sort of retreating from a position that I thought was pretty good and that it had taken a long time to get to.
And the platforms just are not going to talk about this in any honest way. And so I think we just kind of need to kick them with our spurs a little bit so that they know that we can’t retreat to a pre-2016 view of the world.
Right. I think that makes sense. I’m sorry I’m pushing you so hard on this, but —
No, I get it. You love bullying and hate speech. And you get mad when you don’t see it on the internet. And I understand that.
I just want to be free to use my ivermectin to treat my COVID in peace.
No. But I want to float one other hypothesis here, which is that I think there’s a sense among some people I’ve talked to with these platforms that the tactics that they tried over the past, call it six or seven years, to sort of combat misinformation and conspiracy theories and hate speech, they really just weren’t that effective.
So for example, there’s a poll that came out last year that found that 61 percent of Republicans still believe that the 2020 presidential election was illegitimate. And that number hasn’t moved, even as platforms have tried to combat this lie about the 2020 election. And so if you’re an executive at one of these platforms, you might just say, well look, we spent all this money. We hired all these people. We tried our best to keep this type of misinformation off our services. And it didn’t ultimately matter. It didn’t change anyone’s mind. And so why are we even doing this?
Yeah. And people do feel that way at these platforms. But that is defeatism. That is just sort of saying from the jump, that when you try and you fail, the response is to give up. Now look, it is true that these social networks are only one piece of a broader media ecosystem.
The idea that the 2020 election was stolen is an article of faith for many, if not most Republicans. It gets promoted across their entire media ecosystem. And that would be true whether or not the platforms enforce this sort of thing.
But again, almost no one who works at these platforms believes that the 2020 election was stolen. And they are building a product that can advance that belief and has advanced that belief. And so part of what I’m making is just a moral appeal to say, is that the thing that you want to build?
You personally, who knows that the election was not stolen, you want to get up and work every day and build a tool that helps to advance that lie and potentially rekindle the violent movement that resulted in January 6? Like, that's how you want to spend your career? If so, great. But let's have an honest discussion. And say that to my face.
Right. I mean, it does go back to this theory that I’ve had for a while now, which is that the most effective regulator of social media is actually shame. Congress hasn’t passed anything to regulate social media. But the thing that has actually made these platforms like shape up and kind of take on the problems is that their executives and employees just do not want to be associated with the kind of garbage that is on their platforms.
This is not what they want their legacies to be. They don't want to tell their grandkids someday, I worked at the social media platform that got Donald Trump elected, or got him re-elected, or led to a violent insurrection, or COVID denial. So I think, if for no other reason than to keep that pressure applied, it is good that we're talking about this.
But I do want to just sound a note of caution, which is that I don't know that actually all the investments that they made had that much effect. And that's an area where I just don't know what to make of it. Because these companies invested billions of dollars, hired all these people, and yet there is still so much misinformation out there. And I don't really know what to make of that except that maybe we need a new tactic.
I mean, Kevin, just live for 30 seconds in your mind in a world where they had not made those investments, where people were free to say whatever they wanted on these platforms, where the algorithms were promoting the wildest conspiracy theories the most aggressively.
You never would have stopped covering it, because there would have been a new crisis related to that every week. So again, like I understand the inclination to throw up your hands and say, well, it didn’t work. But we can’t know what it would be like to live in a world where these companies did not hire these people.
That’s true. That’s true. And it’s hard to play out the counterfactual there. So let’s say five years from now, if these platforms really do de-emphasize trust and safety and really just let anything go on the platforms, how do you think the platforms will look different five years from now?
My fear is that it will become ever more difficult to tell what is true and what is false, both on social media and off. There's another dimension to this, Kevin, that we haven't brought into things yet, which is we know that we are about to live in a world where bad actors or trolls can use generative AI tools to generate infinite posts about infinite things and flood these social networks with them.
And if there are not robust trust and safety teams in place to catch that stuff and prevent it from going viral, we're just going to live in a world that has even less of a shared sense of reality than it does today. And I'm somebody who believes that you need a shared sense of reality to have a functioning democracy.
So my fear is, you go five years down that road, you add in a bunch of AI, and you reduce the investment in trust and safety. And these platforms become more dangerous, maybe, than they have ever been.
So I think that’s plausible. But let me just sketch out a kind of counterargument here, which is that yes, these AI tools that we’ve talked so much about on the show, they do make it very easy for people with bad intentions to kind of create misinformation and disseminate it.
But they also allow these platforms to be more aggressive about policing the misinformation. So I heard about one project recently where some researchers at Stanford are trying to use GPT-4 to do content moderation, and maybe to actually replace some of those horrible jobs that you've written so much about, where people just sit in an office building for eight hours a day, just, like, viewing the worst stuff imaginable and deciding whether to take it down or leave it up.
Maybe you can actually do a lot of that work with these new large language model-based AI systems, and maybe scale trust and safety without having to go out and hire thousands of people to sit there all day and look at horrible things.
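To make that idea concrete, here is a minimal sketch of what LLM-assisted moderation triage could look like, assuming the OpenAI Python client. The policy prompt, the labels, and the moderate() helper are invented for illustration; this is not the Stanford project's actual code.

# A sketch of first-pass content triage with a large language model.
# Assumes the openai package (v1+) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Hypothetical policy prompt: ask the model for a single label, nothing else.
POLICY = (
    "You are a content moderator. Classify the post as exactly one of: "
    "ALLOW, REVIEW (escalate to a human), or REMOVE (clear policy violation). "
    "Reply with the label only."
)

def moderate(post: str) -> str:
    """Return the model's triage label for a single post."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": POLICY},
            {"role": "user", "content": post},
        ],
        temperature=0,  # we want consistent labels, not creative writing
    )
    return response.choices[0].message.content.strip()

print(moderate("Example post text goes here."))

In practice, a system like this would most plausibly be a first-pass filter, with the REVIEW pile still going to human moderators rather than replacing them outright.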
So you’re saying the only thing that can stop a bad guy with a generative AI is a good guy with a generative AI.
I’m very uncomfortable with that, but I think that actually is what I’m saying. Maybe that actually is true.
So what do you think it would take to get the platforms to reverse course on this? Is there anything short of another pandemic or a mass incident of violence that can make them eager to reinvest in trust and safety?
Well, there has been some legislation proposed that would require them to be more transparent about what’s going on in their platforms. They would have to measure which posts were going viral. They would have to disclose that publicly. There’s something called the Platform Accountability and Transparency Act that just got reintroduced this month that calls for some of those things.
And the idea there is that if the platforms were legally required to measure and report on some of this stuff, it would probably set a floor on how far they could walk back some of these systems that they set up. And it would create a legal obligation and a sense that they had to take it seriously.
Right. That makes a lot of sense to me. If you’re legally required to tell regulators this is the most popular content on our platform, you’re probably going to want to make sure that list is relatively clean.
And I also think that platform design, and the choices that these companies are making about what to show their users, has a big effect. I mean, one thing that Facebook did in the past couple of years that I think really has changed the media ecosystem is to just de-emphasize politics and news in the news feed.
Today, if you go on Facebook, you do not see a ton of links about political controversies. You do not see a ton of arguments about various policies, in part because they kind of realized, like, oh, one way to just slow the spread of misinformation on Facebook is to just de-emphasize political news altogether. And so that's a choice that they made.
So I actually think those kinds of platform decisions will matter as much, if not more, than the specific investments that these companies are making in trust and safety.
Did you guys see the new product that Meta showed off to its employees last week?
No, what was it?
It’s a new social network for discussing news and politics.
What could go wrong? Anyway. Looking forward to the presidential election, everybody.
It feels like one of the morals of this story is we should just announce that we are running for president so that we can say whatever we want to on social media. It’s worth a shot. Who’s president on our ticket and who’s vice president?
I want to be VP. I just want to fuck off. I want to go live in that admiral’s mansion.
You're taller and more commanding. I think you have to be the president.
[MUSIC PLAYING]
"Hard Fork" is produced by Davis Land and Rachel Cohn. We're edited by Jen Poyant. This episode was fact-checked by Caitlin Love. Today's show was engineered by Alyssa Moxley. Original music by Dan Powell, Elisheba Ittoop, Marion Lozano, Sophia Lanman and Rowan Niemisto.
Special thanks to Paula Szuchman, Pui-Wing Tam, Nell Gallogly, Kate LoPresti and Jeffrey Miranda. As always, you can email us at [email protected]
Yeah, if you've eaten a Beast Burger, we want to know how it tasted.
[MUSIC PLAYING]