On this week's Group Chat episode...
We have an Internet Explorer reunion! Katie Notopoulos, Ryan Broderick, and Charlie Warzel get together to talk about some of the strange stuff going down on the internet: They tackle what exactly QAnon is (and what it isn't), and what's been going down with Alex Jones v. tech companies.
Then, head of breaking news Tom Namako breaks down a couple of the stories that are on his mind today.
This week's Group Chat question is: What responsibility do you think tech companies have to control what's on their platforms?
To share your thoughts, text JoJo the words "GROUP CHAT" at 929-236-9577.
Listen and subscribe! We're everywhere you'd want us to be, like right here 👇🏽as well as Apple Podcasts, Google Podcasts and Spotify.
The Group Chat - 1:00
Katie Notopoulos: So Ryan, what is QAnon or Q?
Ryan Broderick: It's a bunch of insane stupid garbage nonsense on the internet. Um, so the TL;DR of it is that a bunch of deranged Trump supporters have convinced themselves that Donald Trump and the military are secretly fighting an international cabal of pedophiles, and that Donald Trump is revealing his master plan via a series of very, very obtuse and vague clues that are being released into the media. And it sort of is built around the idea that, um, very soon there will be a cataclysmic moment that they call "the storm," in which... It sort of depends on what version you're looking at, but it usually revolves around World War 3, and that... This, um, definitely not-racist-at-all theory about a global elite class controlling everyone will result in a huge World War. And Donald Trump will lead us to victory.
That's sort of the gist of it I think.
KN: Okay makes sense, checks out. Um, so like obviously this is not true. It started on 4chan, right?
RB: So it started, um, in October of 2017 with like a very typical kind of 4chan post. It was like very long, and very strange. And it really shouldn't have caught on, but it happened to get screenshotted and put on Reddit's conspiracy subreddit, which is obviously a very good place, um, full of very normal people. And the post detailed this idea that, uh, the poster was working at the White House with a Q-level security clearance and, like all White House staffers, was of course posting on 4chan. And...
KN: Right right. I mean as, you know, all White House staffers do.
RB: Of course! You know, everyone's on there. Actually nowadays that's not even a joke. I'm sure Stephen Miller is on 4chan.
KN: So Ryan, so people actually believe this? That somebody who works in the White House posted to 4chan about, guess what, there's this whole secret thing of Donald Trump and the military fighting to prevent World War 3?
RB: I think the most realistic way to look at it is, I would say 50% of the people that are talking about this stuff actually believe it. Those 50% of the people are typically older. Um, I mean, there's this kind of reputation that QAnon followers are old Baby Boomers who, like, don't know any better. I think there are a lot of college-age far-right trolls who are jumping on it because they think it's funny. And then I think there are a lot of people kind of like, um, like the Mike Cernoviches of the world. Some don't agree with it, some do agree with it, but I think, no matter what, they kind of see it as a really good lightning rod.
Charlie Warzel: That... That feels very right to me, that a lot of people who are showing up to the rallies, like creating their own Q t-shirts, holding signs, waving, like high-fiving, meeting in public, going to these kind of like meet-up groups. I think it's really more just like these are people who, you know, generally pro-Trump, sort of on the internet to some degree, and just sort of like forming a community. Like it seems to me less about like, yeah a focused end-goal really and more about, just like, we can gather around this idea and this is like... It just signals that you're a like-minded sort of like MAGA human.
RB: Like I do think that like the easiest way to think about it is that it's a political Slender Man.
KN: Yeah. So, and famously, Roseanne Barr believes in the whole QAnon conspiracy theory, right? Like she had tweeted a few things that were in the QAnon code. And the people who are the True Believers often tend to be a little bit older and maybe less digitally savvy, and not as good at spotting an online fake conspiracy? Like Roseanne Barr maybe just doesn't know how to parse 4chan and doesn't realize this is clearly fake?
CW: Yeah. It's like LARPing for the Watergate generation. Like they all get to pretend they're Woodwards and Bernsteins every, every morning when they go on to the subreddit, which scoops up the top-level stuff from 8chan or 4chan where people like Roseanne Barr wouldn't be able to navigate. And then it gets condensed, and it gets filtered up through Facebook pages and YouTube videos and newsletters and what have you.
KN: How much do you think that platforms like Facebook and YouTube—I mean just because you know, when we think of older people, middle-aged, Baby Boomers, we're not necessarily imagining them reading Reddit and 4chan a lot, right? So, how did this jump into their zone? Was it being passed around through more popular platforms like Facebook and YouTube?
RB: I think QAnon totally breaks the popular narrative that everything bad on the Internet is Facebook and Twitter's fault. Things like QAnon are happening in every country, completely platform-independently. I don't think platforms are really causing it, to be honest. I think, um, it's... It has far more to do with just like the nature of the way the internet works right now. Um, it's kind of weird. I almost feel like QAnon has the most in common with something like Pokemon Go. It's like an augmented reality game and people are gonna play it regardless of where they're going, you know, regardless of how it works. They're going to keep doing it because I think at the end of the day, it's fun for people.
CW: What you're... What you're saying is you think that there's probably then going to be more bleeding... QAnon, like, bleeding into, you know, the real world, and seeing that IRL. Like there have been instances in the past, I don't know, couple of months: you've seen a guy sort of like protesting, and I believe he had a gun, at the Hoover Dam I think, in sort of, you know, mentioning Q. Uh, there was a group of people who occupied a former homeless camp in Arizona, talking about, you know, this was a pedophilia ring that had been abandoned. Do you think that we're... Because you said this is sort of like, you know, LARPing, that we're gonna see more of this bleeding into the real world than, say, just on the platforms?
RB: Oh, yeah. Absolutely. I think QAnon is the new normal. I mean they had to literally turn off Facebook in Sri Lanka because fake news was causing riots. It happened in Myanmar, uh, fake news that's traveling around Bangladesh right now. I saw it when I was in Mexico. Like basically we're looking at a decade of like people crowdsourcing augmented reality games that they're going to take really seriously and are gonna probably have some pretty insane consequences. But I mean if you think about the rate of these things appearing, QAnon is right off the heels of pizzagate, which was right off the heels of gamergate, and the distance between them is shrinking. I could see a situation, you know in the next couple years, where we have multiple different QAnons happening at the same time.
KN: Well, but the difference between gamergate and pizzagate and QAnon, with gamergate at least, is that gamergate was built on something that actually happened in the real world. Like it was a disagreement about gaming journalism and, you know, and it morphed. Pizzagate was built on a complete fiction, which was that there is a child sex ring being run in a pizza shop. And QAnon is also completely fictitious. It's... It's not real. So somebody out there made this up, posted it on 4chan, and now there's a bunch of people who believe this completely made-up thing.
RB: Right, but um, I mean the infrastructure is just getting faster and better at making these things happen, you know. I mentioned how QAnon is sort of like Slender Man, I mean that did result in the stabbing of like a little girl, because it like bled out into real life.
I just think that as the internet becomes less online and just becomes more like our lives, I think we're gonna see people using it to alter reality, right? Like QAnon is just a bunch of people who have agreed to believe a completely different form of reality. And it's... I don't want to get too head spacey, and like, you know, like "I'm a futurist." But like I do think that we should probably view it as maybe a warning sign that things are about to get really weird.
CW: But this is the type of thing that's like happened with other conspiracies sort of, you know, that weren't online-based, you know, like the JFK assassination, things like that. Um, I think what you're saying though, that these... These mechanisms allow it to happen faster for that, for those communities to form in this way. I think that like that's sort of what the platforms bring to it. This organizing capability. And yeah, it seems like it then makes it so much easier to spread in the real world.
KN: So Ryan you put forth that there's a theory out there that QAnon was started by someone who isn't a, you know, a far-right Trump supporter, but is exactly the opposite and that the whole point of it is to kind of make these Roseanne Barr types look foolish.
RB: There are some people kind of floating around these parts of the internet that have begun to suspect that it might have been a prank, or might have turned into a prank, or it might have started as a prank and become real. And I think you're gonna see this as things like QAnon become more mainstream. Like basically once QAnon hits cable news, it becomes more dangerous, but also becomes more scrutinized by the community that started it. Because all of a sudden they can see it. So like the thing that makes things die faster than anything on 4chan is when normies kind of understand what it is. So I think as it became more popular, as Baby Boomers became more into it, the twelve-year-old Nazis on 4chan were like, 'I don't want anything to do with this.'
KN: There's nothing a 12-year-old Nazi hates more than, like, a 60-year-old Nazi.
RB: Exactly! It's like, you know, you don't want your tea party mom to find all of your Nazi memorabilia in your bedroom. Um, which is a very 2018 problem.
CW: My question is what do we do, what do we do with any of that? I mean, it feels like... It feels like it's almost, not worthless, but like futile to speculate on... On what it is in any way. Like it seems as if there's no way to prove any of it, and so I don't really know what we're supposed to do with, you know, its origin story. Other than just like wait for it to... Wait for the, you know, ironic Nazis to get tired of it.
RB: So I've wrestled with like the ethics of writing about QAnon at all, because I sort of almost feel like it's a virus more so than a news story. It's like an information virus and writing about it just transmits it. But I do think there is some journalistic value in thinking about how these things start. If only to perhaps inspire maybe the people who are on these message boards. And I'm not talking about the first wavers, who like will just transmit the stuff no matter what because like they're trolls, but like the second wave of people...
I would love to maybe push the idea that maybe they should be a little more skeptical. And obviously, like, the Roseanne Barrs of the world are not going to be skeptical. Like I do think that there's a moment with these things, like QAnon, where if you miss it, it's going to just balloon out and it's going to go on forever and then a bunch of crazy shit's gonna happen, people are gonna pull guns out in the middle of nowhere, and like it'll just be... It's gonna be a mess.
KN: Well, speaking of information warfare. Um... Infowars is also another fringe conspiracy website, slash video channel, slash podcast, that, uh, recently has been removed from a bunch of platforms. Alex Jones is the main host and then there's these other sort of smaller shows. Charlie, like, what happened? Why did these platforms decide all of a sudden, after what, years of, you know, Alex Jones and Infowars being on these platforms? What made them change their mind all of a sudden just this week?
CW: This has basically been, you know, years in the making, but really, um, I would say about six months in the making. Since, you know, the beginning of 2018 really, there's been sort of increased pressure and scrutiny on Jones and Infowars. A lot of this has to do with the fact that some of the families of the victims of the Sandy Hook shooting in Newtown have, you know, decided to take legal action against Jones.
KN: Yeah, and he's a big like Sandy Hook Truther, right? Like he believes that it was fake and set up by the government and that the victims were just actors, right?
CW: Yeah. I mean he'll... He'll allege that, you know, he never said that, and, you know, if you play back the tape, he did, or he at least implied it. He's... He's very good at, and this is actually one of the issues when it comes to moderating him on a platform like YouTube or Facebook... One of the issues is he's very good at sort of skirting the edges of the terms of service and the community guidelines for these platforms, and making a Facebook or a YouTube take, like, a stand rather than just say, like, 'Oh, you said,' you know, 'the unspeakable thing.'
But anyhow, this has... This has basically been a brewing question. About a month ago, a CNN reporter asked a Facebook executive, in front of a room of reporters, you know, 'If you're committed to fighting fake news, why don't you take this guy off your platform?' They had a pretty bad answer to that, which was simply like, 'Oh, you know, even if we're trying to eradicate fake news, you know, we can't take down all fake news.' And then, you know, Mark Zuckerberg, Facebook's CEO, got involved, made a pretty poorly timed reference to Holocaust deniers and why they shouldn't all be kicked off the platform, had to sort of walk that back. And YouTube got involved and pulled a couple of videos, and that sort of like started... That was like the initial crack in the armor, so to speak. And everything kind of came to a head on Sunday night, when Apple just basically took down all but one podcast that Infowars, you know, hosted on the platform. And that was sort of the first decision that treated Infowars content holistically, rather than saying like, 'Oh, this one episode is bad, we're gonna pull it.'
This was just like, you know what, this is a violation of our rules against hate speech and sort of like, you know, slanderous content, and so they pulled it. Every other platform, pretty much, besides Twitter, right now has followed suit. And so... And so really what this is, is just like a long-building kind of crisis. And you can... You can see, because Alex Jones is such a, like, a bad faith actor who's always trying to, you know, get past the rules and attract attention to himself, that this was always going to happen, and the tech platforms just basically dragged their feet until something had to happen.
KN: It's sort of funny. I mean, we can't exactly speculate on how the decision process went at each of these platforms, but it's interesting that Apple, who we don't necessarily think of as a media platform, though they do have this big powerful podcast store in the iTunes Store, um, you know, where you can listen to Alex Jones' stuff... I mean, they're the ones who made that decision first, and it seemed like everyone sort of followed suit after that. It was a little bit almost like a game of chicken.
CW: As a reporter, I know I'm not alone in talking to Facebook and YouTube employees for the last year and having them say, 'Well, you don't understand the nuances of this, you know. This is like... If you actually worked here and dealt with these decisions every day, you know, you wouldn't just say, oh, we can just, you know, throw him in the bin and take him down. There's these huge consequences at play.' And then, you know, Apple takes it down, and they just pull the plug three hours later. It sort of undermines that whole idea that there's nuance to these decisions, and really they were just afraid to look bad. It seems.
KN: So one platform that sort of did a 180 recently on this kind of stuff was Spotify. So... Kind of recently, maybe two months ago, they pulled all R. Kelly songs from their recommended playlists. So they didn't remove him from the app completely. They just removed him from... They sort of stopped promoting his music. And then they changed their mind, and they decided they didn't want to be in charge of making editorial decisions about whether or not someone was an appropriate person to promote, you know, based on the fact that R. Kelly is an alleged kidnapper and sexual abuser, with a whole litany of alleged crimes. Um, so, you know, they went... They walked back on that. I think that wasn't a good look for them. And it also sort of showed how, yeah, how they kind of are making these up on the fly. Like I don't think it ever occurred to people at Spotify, 'Do we have to be in charge of thinking about whether or not an artist that we're going to put on a promoted playlist is kind of up to the morals clause of the promoted playlist?' Charlie, what do you think about that?
CW: Yeah. So I think we're in this place where it doesn't even matter if you are a tech platform or not, right? Like if you're a restaurant in Southern California or wherever. Everyone has to be ready to jump into the culture war now. Because it just...It just like shows up at every single person's doorstep. But you know with these platforms, I think what's so surprising is that a company like Spotify isn't ready to make those calls and flounders when they have to, I mean…
KN: It's yeah, it's kind of in the culture wars, you know all gave some, some gave all.
CW: These are global... These are global platforms right? You know?
KN: The culture war comes for you. Are you gonna contribute to the culture war relief effort?
CW: Exactly. I just... I just think, I think it's no longer an excuse if you host millions of users and make considerable money, if you're a, you know, a publicly traded company…
KN: Exactly. By the way, I just wanna say hello out there to all the Infowars podcast listeners who decided to jump over. Thank you for listening to The News from BuzzFeed News by BuzzFeed.
CW: It’s a great crossover demographic. And Katie's gonna sell her brain pills later.
KN: By the way, do you want my colloidal silver? And the half-people are real! That's what Alex Jones always says. He's always talking about the, like, there's like a human-beast hybrid, right? Animal hybrid.
CW: Human-Animal hybrids.
KN: Animal hybrids, which... You know, the thing with that is they are real. A new type of dolphin/whale hybrid was actually just discovered in the ocean. It was in the news last week. So Alex Jones, I mean Infowars, is it really wrong? Just a thought.
CW: Wow, that's a hot take Katie.
RB: So with all of these like...These massive companies like Facebook, Twitter, Spotify, whatever. They make shitloads of money and... Oh well, theoretically make shitloads of money and have lots and lots of people using them but the culture war stuff is sort of exposing this like very real problem, which is that none of them create any content, right?
So as long as like this philosophical issue is never being solved, which is like Facebook users aren't being paid by Facebook to create anything. So it creates like this black market. This like weird demonetized, like free-for-all. Like you see this super super well on Instagram, where people are just like branding everything and it's just a crazy nonsense world. Until these massive companies answer really what they are and how far they extend and what they encompass, people like Alex Jones are going to continue to manipulate them and force them into these bizarre arguments and it's just, you know, not going to stop.
KN: Um, QAnon, fake or real? Ryan.
RB: Oh, it's real. No, I'm Q. I wanted to take this time to expose that I am actually Q.
KN: Thank you for clearing that up. Charlie, Infowars and the animal hybrids, real or fake?
CW: I can't tell you because of my security clearance. I'm sorry.
KN: Well, I guess we still have some questions that we haven't figured out, but I think we have solved a lot of problems here about exactly how all media and platforms should behave from here on out, as well as all people, and how they should be more skeptical about conspiracy theories that come from Reddit's conspiracy board, which is: don't do it.
RB: It’s a great place. Great website.
KN: You know, that's just... That's the tip I'm leaving for the listeners. Don't do it. Don't go on Reddit's conspiracy board and believe everything you see there.
CW: Just unplug your router.
KN: Listen to Infowars?
Push Alert - 21:59
Tom Namako: So there are a bunch of local elections that happened last night. And usually when these kind of off-cycle races pop up, we hear some very broad, kind of sweeping language from the news media. Things like 'This will determine where the country is politically.' 'It's a massive referendum on President Trump.' 'It's a massive referendum on one specific party.' Well, here's the thing. One race most certainly lived up to that last night. It's the one in Ohio, where Troy Balderson, the Republican, faced off against Danny O'Connor, the Democrat. This seat in Ohio is reliably red. Mitt Romney crushed it there. Donald Trump crushed it there. Everyone expected Troy Balderson to do the same and crush it there. But O'Connor held on, to the point where President Trump himself went and rallied for Balderson. Mike Pence showed up to rally for Balderson. And now, as of this recording, the race still has not been called. The Republican has a 1,700-vote lead. But the thing is, there are still more than 3,000 ballots that are yet to be counted. Usually what happens is, those kinds of leads hold. These really kind of slim leads will hold on, and it seems like Balderson may run away with it. But here's the thing. Democrats are extremely jazzed about this race, and it's because they kept it very competitive in a very, very red district. What does this mean overall? They hope this momentum will carry through to November, when the midterm elections come. And they hope that they can do something like, say, flip the House of Representatives and take control of that chamber. And if that were to happen, the one thing on everyone's mind, the one thing you're going to hear about a lot, is: will the Democrats move for impeachment?
TN: So here's the story of a woman who discreetly tried to apply for a new job on her lunch break and then got busted because she appeared on the local news. Ja'Naea Modest is from Illinois, and she thought that she could run over to a local job fair at a middle school on her lunch break one day. She got there, started cranking out the application. Was really, really concentrating, and did not realize that she was being filmed by a camera crew. That night someone reached out to her and said, 'Hey, you're on the news!' And she was like, 'Oh no.' She posted about it on Facebook. And then it went viral. And she still hasn't heard from her bosses yet. And she says if she does, she's just going to come clean. 'Yep, I was looking for a new job.' And look, if today's managers are listening, or any managers out there are listening, it's tough out there. If people are trying to level up, cut them some slack.
Our Group Chat question for you: What responsibility do you think tech companies have to control what's on their platforms?
Text JoJo the phrase "group chat" to send your thoughts. Their number is 929-236-9577.