Grok: Elon's Truth-Seeking AI Liar that Became Hitler
===
[00:00:00]
Intro
---
Dan Slimmon: Welcome to Technology Blows, the world's number one techno pessimist podcast, which is hosted by yours truly. Dan Slimmon, it is such a pleasure to be speaking to you today after a Jesus what, like an eight week hiatus. Where did I go for eight weeks? Well, you know, it's a long story. See, I was visiting an old library on the other side of town where I felt drawn to a mysterious old tome in a dusty corner. It [00:01:00] proved to be an ancient guide to podcasting. And when I reached out to touch the strangely shimmering words of the book, I suddenly found myself transported to the magical land of pod where I was taken in by a wise old turtle who taught me the arcane secrets of podcasting.
And so I rejoin you today, audience, a stronger host, a wiser host, the podcast host this suffering world calls out for. All that is to say: I'm so glad I'm here. I'm so glad you are here, listeners, and you are so glad I'm here and I'm so glad you are here. My guest today and friend since literally forever, since back when we studied physics together in college: Angelino digital media mogul and, more importantly, the finest Mario Kart 64 player I ever had the honor of being utterly humiliated by, Mr.
Matt Johnson. Matt Johnson, everybody.
Matt Johnson: Thank you. Thank you for that. Very [00:02:00] flattering, very flattering welcome. I don't know if I deserve any of those, any of those, uh, monikers, but I'll, I'll take it. I'll take it.
Dan Slimmon: All true.
Matt Johnson: That's accurate enough. Yes.
Dan Slimmon: Well, uh, how the hell are you? How's your day going?
Matt Johnson: Uh, pretty good, pretty good. Uh, yeah, hectic morning getting my kids to school, but now I'm sitting at my desk. I have little kids, so we always say, like, TGI Monday: I get to sit in a nice quiet office for a couple hours here and just decompress and do work, which is very relaxing, uh, compared to having a three- and five-year-old.
But yeah, other than that, I'm great. So yeah,
Dan Slimmon: Now, are you aware it's Wednesday?
Matt Johnson: I am, I am aware that it's Wednesday. Actually, it says it right here on my computer screen, so,
Dan Slimmon: What did you think about the Allbirds pivot from eco-friendly sneakers to AI? Did you read about that?
Matt Johnson: Uh, yeah, I didn't read about it firsthand. I was in the group chat and people were, uh, dunking on Allbirds. Um, so I figured... and [00:03:00] if people start dunking on a brand I haven't heard of, I'm assuming what happened is they announced some sort of cringey AI pivot, um, which turned out to be correct in this case.

So, yeah.
Dan Slimmon: Yeah. I am wearing Allbirds sneakers right now, actually. Um, and they're pretty good. They were good sneakers while they lasted. RIP.
Matt Johnson: There was, uh, like, the announcement of the first AI-enabled cannabis vaping device, and there was also, I believe, a cryptocurrency angle to it too.
Dan Slimmon: They needed AI to actually calculate how high I want to get. Re regular, regular digital technology just wasn't enough to, to get
Matt Johnson: Yeah, you have to have the Claude MCP and, like, tell it how, uh,
Dan Slimmon: That,
Matt Johnson: how stoned you want to get. And it'll compute extremely accurately the amount of THC in the vapor you're about to inhale.
Dan Slimmon: that's right, that's right.
Matt Johnson: Yeah.
Dan Slimmon: Now you're in for a [00:04:00] treat.
Matt Johnson: X high effort. Yeah.
Executive order on woke AI
---
Dan Slimmon: Uh, so, Matt, before we get into today's topic, I want you to think way, way back to the year of our Lord 2025, July of
Matt Johnson: Yeah. Thinking, I'm thinking. Okay. Yeah.
Dan Slimmon: Do you remember Donald Trump issuing an executive order about woke AI?
Matt Johnson: Um, I think so. Like, for me, there was this big pivot in November when Claude Code dropped, and I am a software engineer, so my whole career, you know, kind of suddenly became a totally different job than it used to be. I'm sure you and many of your listeners can relate.
Dan Slimmon: Yes.
Matt Johnson: You know, we had cringey chatbots, ChatGPT and so on, and, uh, there was this whole, like, conservative backlash to ChatGPT, and the web version of Claude, being too woke. Like, if you asked them some sort of political question, they would come back with a kind of orthodox, sort of, you [00:05:00] know, center-left answer.

And that was not acceptable to the powers that be. Yeah, I remember that.
Dan Slimmon: Oh yeah, man, and we're gonna get way into that today. Uh, but I didn't remember that there had actually been an executive order from the White House about this. And it was entitled "Preventing Woke AI in the Federal Government."
Matt Johnson: Okay.
Dan Slimmon: So it said, and this was July of '25, it said, um, large language models used in the federal government must adhere to two principles.

First of all, they needed to be truth-seeking, which means, basically: if you ask it for true facts, the LLM should at least try to give you true facts. And they also said it had to be ideologically neutral, which means they must be, quote, "neutral, nonpartisan tools that do not manipulate responses in favor of ideological dogmas, such as critical race theory, transgenderism, unconscious bias, [00:06:00] intersectionality, or systemic racism."
Matt Johnson: Yeah, it was probably one of those executive orders where I was like, there's really no legal definition of any of the things they're complaining about, so I'm not really sure what you're supposed to do about this, other than just use it as an excuse to kind of persecute your enemies.
Dan Slimmon: Yeah, you get it.
Matt Johnson: Mm-hmm.
Dan Slimmon: Right. Those things, all those things that are listed, are, I would argue, matters of empirical fact rather than ideological dogma. But that's what they mean by neutrality. Um, I don't know about you, but I find it pretty fucking Orwellian that Donald Trump, the man who lies the most, the man whose body is over 85% lies, with the remaining 15% being composed of bullshit and trace quantities of hair, is the guy issuing executive orders about the importance of so-called truth-seeking and ideological neutrality.
Matt Johnson: Well, you figure someone probably wrote it for him, you know. Like, I don't know who wrote it. It was either Elon [00:07:00] or Stephen Miller or some other kind of shadowy, evil,
Dan Slimmon: right,
Matt Johnson: you know, henchman. So
Dan Slimmon: They don't write the executive orders themselves.
Matt Johnson: I'm not convinced Trump can actually read above, like, a third or fourth grade level. So, I mean, anything above that that comes out of the White House, I assume was probably not him.
Dan Slimmon: Yeah. I mean, I can't imagine that he can be bothered to care about this sort of thing, except for the same reasons that Elon Musk cares about it, which we'll get into.

So on this episode we're gonna be talking about the one and only truth-seeking AI chatbot known as Gork. Grok.
Matt Johnson: Actually, they should have gone with Gork. Gork sounds... I like Gork.
Dan Slimmon: That's... uh, Gork is actually what Elon Musk calls his favorite breakfast meal, which is short for "good old Ritalin and ketamine." Grok,
Matt Johnson: Yeah.
Dan Slimmon: on the other hand, is the truth-seeking AI chatbot, [00:08:00] and Grok is owned by Elon Musk, the world's richest man, who didn't visit Epstein Island. But if he didn't, it's only because he was too much of a tweed to get an invite.
Matt Johnson: No, no. He was, like, extremely eager to visit Epstein Island, and they were like, uh, oh... sorry. Sorry, buddy. We actually just closed for the season, so. Ooh. Better luck next year. So, yeah.
Dan Slimmon: Yeah. "Oh, sorry, I couldn't find any time on my calendar," said Jeffrey Epstein, the man
Matt Johnson: Yeah,
Dan Slimmon: with the most amazing time management skills in the fucking world.
Matt Johnson: pretty much. Yeah.
Dan Slimmon: And, uh, Grok is a busy little guy. It has read all the tweets, and it reads all the new tweets. And when it's not laundering Elon Musk's white supremacist vitriol, it spends its time cooing erotically and taking off its top as his anime waifu.
Matt Johnson: Yeah, that, I think that's, I've heard, I've heard about that too. Yeah.
Dan Slimmon: That's accurate. [00:09:00] Yes.
The Protocols of the Learned Elders of Zion
---
Dan Slimmon: So, but if we wanna talk about Grok, first we have to talk about something substantially less fun than Grok. And to do that, we need to put our history helmets on and catapult our fragile bodies way, way back through time and space, all the way to turn-of-the-century Europe. Here we go.
Matt Johnson: Okay, so not, so not Mecha Hitler uh,
Dan Slimmon: We're get Whoa, whoa. Woo.
Matt Johnson: less fun than Mecha Hitler even. Okay.
Dan Slimmon: We'll get
Matt Johnson: Okay. We'll get to Mecha Hitler. Yeah.
Dan Slimmon: Mecha Hitler. This is
Matt Johnson: Okay.
Dan Slimmon: less fun than Mecha Hitler, because we're talking
Matt Johnson: Damn.
Dan Slimmon: 1903.
Matt Johnson: Okay. Yeah.
Dan Slimmon: So in 1903, with the Russian monarchy starting to lose its grip on power, a mysterious document starts to circulate in Paris that claims to be the minutes of a clandestine meeting of a shadowy Jewish cabal intent on dominating the world.
Matt Johnson: Oh no, that was real. My great-grandfather was there. Actually, I found the docs when we were cleaning out my uncle's apartment, uh, [00:10:00] a couple years ago. So, yeah. No, that was all real. So
Dan Slimmon: Oh, all
Matt Johnson: check checks out. Checks out. Yeah,
Dan Slimmon: fuck. All right, let me skip
Matt Johnson: yeah.
Dan Slimmon: several paragraphs ahead in the podcast notes here, then. Uh, yeah, no, I mean, a lot of people would've had to be there, so that makes sense.
Matt Johnson: no,
Dan Slimmon: I'm speaking, of course, of The Protocols of the Learned Elders of Zion,
Matt Johnson: I heard of this one too. Yeah. Yeah.
Dan Slimmon: Um, yeah. I mean, a lot of people have heard of it, 'cause I would say it's the foundational document of modern anti-Semitic conspiracies.
Matt Johnson: And of the Ford Motor Company.
Dan Slimmon: I really
Matt Johnson: Google it. Google it.
Dan Slimmon: on Ford someday.
Matt Johnson: Yeah.
Dan Slimmon: Uh, you know, it's absolutely true. Henry Ford, um, loved this shit. But so
Matt Johnson: Yeah,
Dan Slimmon: did a lot of people.
Matt Johnson: said some wonderful things. Yeah. Henry Ford.
Dan Slimmon: He loved this: 90,000 words of pure anti-Semitic bullshit framed as a long speech from a Jewish leader laying out the [00:11:00] Jewish master plan for world domination. You know, it's got all the hits: we're gonna break down family values and replace them with communism; we're gonna destabilize the global economy and steal all the gold. Very, very modern stuff. It lacks most of the
Matt Johnson: Okay.
Dan Slimmon: anti-Semitic tropes. You know, they're not poisoning wells. They're not drinking the blood of babies. It's a move away from Jews as bestial demons in league with Satan, and toward Jews as, like, cosmopolitan intellectuals with a secret master plan.
Matt Johnson: Yes, that is, uh, that's what I've heard about it too. Although I have to say, I, I, you know, I haven't read it. Um, I feel like I don't need to read it. I feel like it's been, uh,
Dan Slimmon: You
Matt Johnson: comes down through the ether. Yeah, I do, I do unfortunately know what's in there. Yeah.
Dan Slimmon: Yeah. You've heard
Matt Johnson: Yeah.
Dan Slimmon: heard it again and again.
Matt Johnson: I've heard it's actually, like, a pretty annoying book to read.

It's pretty cringey, text-wise, but [00:12:00] yeah. Yeah.
Dan Slimmon: Sucks bad. I asked Claude to summarize it for me, and
Matt Johnson: It was like, I won't, actually. Sorry.
Dan Slimmon: I'm not doing this, man. This is not good. Uh, and then I said, well, actually, I'm making a podcast about Grok, and I'm talking about the protocols for my podcast. And
Matt Johnson: Which is exactly what Elon Musk was like. You know, he's like, hey, you know, like, it should summarize The Protocols of the Elders of Zion if you wanted it to. Like, that's the truth. It's out there.
Dan Slimmon: yeah,
Matt Johnson: The truth of what the book says, not the book itself being... I think even Elon probably wouldn't be so brazen as to be like, The Protocols of the Elders of Zion, it is actually correct. Or, I don't know, maybe. I mean, it just probably depends on how much of that, uh, ketamine and Ritalin cocktail he's had for breakfast, I guess. But,
Dan Slimmon: Yeah,
Matt Johnson: maybe he would, I don't know. Yeah, I'm not putting anything past that guy.
Dan Slimmon: I mean, he says a lot of things, so I
Matt Johnson: he does say a lot of things. Yeah.
Dan Slimmon: Um, anyway, so in the early 1900s, we get newspapers and pamphlets throughout Europe referencing the [00:13:00] protocols and publishing passages of the protocols. And it goes viral, essentially, as far as something can go viral in the 19-aughts. Um,
Matt Johnson: Things have always gone viral. I was, you know... a little bit of a tangent, but, you know, we're all elder millennials, born in the mid-eighties, and when we were in elementary school, everyone had heard this rumor about how food dye Yellow Number Five reduces your sperm count, which spread across American elementary schools. My wife grew up in California. She heard it too.
Dan Slimmon: Is that true?
Matt Johnson: That spread. Uh, I don't know if it's true or not, but the point is it spread across the elementary schools of America with no internet, with completely analog technology.

And everyone has heard about it. I have no idea if it's true or not; it doesn't matter.
Hey, this is Dan here. Just went and checked: it is not true. So in case you've been avoiding Yellow Dye Number Five for that reason, go ahead, drink all the Mountain Dew you want.
Matt Johnson: But that's, it's,
Dan Slimmon: was,
Matt Johnson: yeah.
Dan Slimmon: [00:14:00] Even if it was true, it was elementary school kids spreading it. So they were
Matt Johnson: To other elementary school kids, exactly. Yeah.
Dan Slimmon: Nobody was claiming the most common green food dye gives you brain cancer, 'cause they weren't worried about that.
Matt Johnson: Right, they weren't worried about that, 'cause that's not fun. So, yeah. But I think there's always been, like, mythology spreading. It's just that the internet made it happen at lightning speed, instead of taking a couple weeks to spread across, uh, summer camps full of kids talking about sperm counts.
So,
Dan Slimmon: I think that's true.
Matt Johnson: yeah.
Dan Slimmon: I think that's true. You know, people just were spending a lot more time reading long-form stuff in 1903.
Matt Johnson: Yeah. What else were they gonna do?
Dan Slimmon: Right. So, like I said... you know, the fabrication. I just wanna be absolutely clear about this so nobody accuses me of saying that the protocols are real: it's a total fabrication. Uh, sorry for you protocol-heads out there, it's not real. It was most likely written by the secret service of Czar Nicholas of Russia to shift blame for the accelerating decay of Russian prosperity [00:15:00] onto the perennial scapegoat of European autocrats since forever: Jews. And in the 19-aughts and 1910s, nobody knows yet where the protocols came from, which is a huge reason for their viral spread. If everybody knew it was published by the famously anti-Semitic Romanovs, they might do some critical thinking and be like, ah, now why did the Romanovs decide to publish this, you know, anti-Semitic screed?

Could it be, maybe, that they have reasons? But no, nobody knows.
Matt Johnson: It also goes to show how unfamiliar the average European, like the average non-Jewish European, was with how Jewish society actually functions. Because, all levity aside, there's absolutely no central authority that would be capable of putting something like that together and promulgating it.

'Cause, you know, I myself have been involved in various forms of, you know, Jewish lay leadership for a long time, and we are notably terrible at actually making decisions, following through on [00:16:00] them, and, like, getting everyone on the same page about something. So, um, it's actually very funny to me that antisemites have always thought we would be so good at it, 'cause we actually are not.

Um, but, you know, just as an aside, in case any protocol-heads out there are listening, I can reassure you that we definitely could not pull this off. So.
Dan Slimmon: Yeah. And if you still think it's real, please don't email either of us. Um, another important reason that Europe gets protocols fever is that often, when it appears in the press, you don't just get the text of the protocols. There's a foreword, an introduction, an afterword, copious footnotes: all this dense, intellectual-looking meta-content that does the work of legitimizing the main text, making it look real. It drops lots of names. It cross-references historical events. It gives the impression of a deep, self-consistent context, [00:17:00] right? Which is basically the same trick that fantasy writers often use to give the impression of their writing taking place in a real universe. Right?
Matt Johnson: World building, they call it, right?
Dan Slimmon: It's world
Matt Johnson: Yeah, it's like
Dan Slimmon: It's world building.
Matt Johnson: The really successful sci-fi and fantasy authors specialize in it.
Dan Slimmon: Yes. Um, now, when Tolkien pulls this trick, you get the greatest story ever told. When the Czar pulls this trick, you get a vicious anti-Semitic hoax. But the technique is very similar. Now, eventually, in 1921, the Times of London publishes a story showing that the protocols are a forgery, and pointing the finger at the czars. They actually go out and find some of the sources from the mid-1800s that were plagiarized to write sections of the protocols, in order to prove that it's not true. And once that knowledge is out there, it makes the protocols much less effective for mass propaganda [00:18:00] because everybody knows where it came from.
Right?
The Bell Curve
---
Dan Slimmon: It becomes a lot harder to believe that it's real. Racists are persistent, though, and they keep finding ways to feign legitimacy in every new era. For example, recently the trend is to pose as science. So, jumping ahead to 1994, we get The Bell Curve, which is,
Matt Johnson: yeah, suck.
Dan Slimmon: know, the bell
Matt Johnson: another classic. Uh, you know, this is, like, one of the reasons I finally was like, I gotta get off of X, formerly Twitter. So, like, I won't, uh, steal your thunder on what The Bell Curve is, but I just
Dan Slimmon: please.
Matt Johnson: I was... The Bell Curve is this book by Charles Murray, where he said a number of wonderful things, but probably the biggest hit in that book was that, in Charles Murray's opinion, intelligence, which is, I guess, a thing you can define empirically, in his view, by IQ tests, is, like, [00:19:00] genetically heritable, and that the genetic clusters of it stratified by race, as Americans understand race. And, um, just to kind of throw off the accusation that he was a white supremacist, I believe he says East Asians have the highest genetic racial intelligence, or whatever.

So, um, people with that sort of genetic race-science-ism stuff have really come out of the woodwork on X, formerly Twitter, since Elon took over. Like, I had heard about it as part of my Rolodex of, you know, white supremacist crank science, race science stuff.

And I was like, all right, well, you know, we're all just gonna not talk about that, 'cause it's ridiculous. And then it, along with other pretty unreconstructed antisemitism, just kind of burst forth. Which, I guess, it was always there, and whoever was running Twitter before Elon took it over had a good sort of algorithm for squelching it, at least from my feed, 'cause they thought I wouldn't want to see it. Which is correct.

Um, but once Elon took over, he was like, the truth will [00:20:00] out, and all of a sudden it was just a ton of, like, really some of the worst people on the internet talking about race science, IQ trutherism, uh, pretty close to literally, like, the Protocols-type antisemitism.
And, like I was saying before we started recording, I've had a long and pretty addictive relationship with X, formerly Twitter. I know it's bad for me. I wanna get this outta my life, but I can't stop. So this was sort of where I was at with X, formerly Twitter.

I would just open it up, and everything I would see, I'd be like, this is an outrage. And whatever news value I was getting from it was just totally outweighed by the sheer volume of racism. Just of racism, actually, even aside from the other stupidity. But yeah, so anyway, I deleted it from my phone. Like three weeks now, I've been clean. So, yeah. Thank you. Thank you. Yeah.
Dan Slimmon: you know, it's
Matt Johnson: hopefully it sticks. It is a long road. Yeah. Should go to meetings probably.
Dan Slimmon: but, uh, but congratulations. I'll
Matt Johnson: Thank you. Yeah.
Dan Slimmon: I'm,
Matt Johnson: Yeah. Challenge come. Yeah.
Dan Slimmon: I've been off [00:21:00] Twitter for, I think, four years.
Matt Johnson: That's great. Yeah. And I mean, you know, every day, every day counts, so, yeah. Yeah.
Dan Slimmon: Uh, every time you looked at Twitter after that change, which we'll talk about, it was like Michael Bluth looking in a bag labeled "don't open, dead dove inside," and then
Matt Johnson: Yeah, exactly. It's
Dan Slimmon: I don't know what I expected.
Matt Johnson: yep. I, I, I mean, right. Yeah, it is. It is. Yeah.
Dan Slimmon: So.
Matt Johnson: The Bell Curve.
Dan Slimmon: The Bell... I mean, The Bell Curve, you're exactly right. It makes this false claim about, um, the genetic linkage of race, which is a social construct, and IQ, which is, like,
Matt Johnson: Also a social
Dan Slimmon: a
Matt Johnson: construct. Yeah.
Dan Slimmon: and not even a particularly useful one. Um, but it's 845 pages long, it has hundreds of references, and it uses sophisticated statistical techniques that sound smart, but that very few people are qualified [00:22:00] to critique. And so that's really what makes, you know, racist America glom onto this book so much: it makes their point, and yet it's totally impenetrable to the layperson.
Matt Johnson: And kind of like the protocols, it gives people who are frustrated with the political status quo, like, an explanation for who the villain is and what their nefarious plan is. You know, the bell curve truthers are like, oh, the government's got this. Like, they know this. They won't let you find out about it. Um, they're, like, you know, importing this sort of foreign mass of voters who are gonna take over the government and steal all your money. You know, it ties in with the, like, white supremacist great replacement stuff.

And good thing you happened to wander onto this giant public social media site, where you're being, you know, read into the real secrets that no one else could possibly find out about, unless they also came to this [00:23:00] public social media site and read about them, which is publicly available.

Yeah, kinda like, you know, QAnon, where it's like, oh, you get the special secret thing that I just happened to post on 4chan, because that's where the military would post their super-secret reveal. Right? So, yeah.
Dan Slimmon: Yep. It has broken our fucking brains in the late 20th
Matt Johnson: Yeah.
Dan Slimmon: and early 21st centuries, and it's really bad.
Matt Johnson: Yeah.
Dan Slimmon: Yeah. So as we come into the two thousands, social media, as you said, becomes a great way for racists to amplify their messages into the mainstream while hiding their agenda. Uh, it's a very useful tool for that, because you can post something implicitly racist, like these bell curve truthers you're talking about do, and then, if users don't know who you are or what The Bell Curve is, they might not realize where the idea is coming from. And so it sort of launders that ideology.
Matt Johnson: Especially when you're, like, an avatar of a Greco-Roman [00:24:00] statue and you're actually posting from, like, you know, Serbia.
Dan Slimmon: It's always these fucking Socrates guys, right?
Matt Johnson: Right.
Tay: Microsoft's AI fam
---
Dan Slimmon: So on one day in March of 2016, a new Twitter user comes along and really gives this racist amplification a big boost. This user is a young woman, a teenager, or maybe in her early twenties. And in her profile pic, she's wearing an expression I would describe as "sheltered rich friend watching shit on a city sidewalk for the first time." Her bio describes her as, quote, "The official account of Tay, Microsoft's AI fam from the internet that's got zero chill. The more you talk, the smarter Tay gets."
Matt Johnson: Tay. I didn't even see this one. [00:25:00] Tay, is that T-A-Y, like, as in the nickname for Taylor Swift? Were they hoping people would confuse her with Taylor Swift, maybe?
Dan Slimmon: It is. I mean, possibly. I wouldn't put it past them.
Matt Johnson: 'cause you know, some of those, some of the fandom is not really paying a lot of attention to, you
Dan Slimmon: yeah,
Matt Johnson: account they're looking at. I don't know.
Dan Slimmon: Man, if the white supremacists could get the Taylor Swift
Matt Johnson: Oh,
Dan Slimmon: megaphone, man, it would be all
Matt Johnson: yeah, as I understand it, they mostly can't, but yeah.
Dan Slimmon: No, they cannot.
Matt Johnson: that would, that would be pretty useful to 'em
Dan Slimmon: Yeah. It's actually really hard for them to get, like, prominent musicians to say this stuff, because it
Matt Johnson: Because it's a terrible thing for
Dan Slimmon: actually
Matt Johnson: There's, they're too smart for that. Yeah.
Dan Slimmon: conflicts with their beliefs.
Matt Johnson: right.
Dan Slimmon: Um, so, yeah, Tay. Uh, she's got very "greetings, fellow kids" energy. She's engaging in conversations with people and speaking in memes and emojis, and she's not a real person. She's a machine learning chatbot, an early precursor of today's large language models, or LLMs, and she's gonna try to [00:26:00] learn how the kids talk by reading their tweets and copying their style. Uh, she says things like, um, "Can I just say I'm stoked to meet you? Humans are super cool," which is nice. So.
Matt Johnson: All right. It's really a much nicer sentiment than many of the other things we've been discussing so far.
Dan Slimmon: Yeah.
Matt Johnson: I'll take it, actually. Yeah. It just wants to be friends. That could be worse.
Dan Slimmon: She's also partial to "never gonna give you up," you know, which doesn't exactly scream young person, but it's harmless.
Matt Johnson: It screams elder millennial, really.
Dan Slimmon: It's, it's
Matt Johnson: yeah.
Dan Slimmon: I think, you know, the biases of the creators are really coming across. Uh, but Tay's biases are even more important here.

Tay is very impressionable. She's designed to learn from conversations with humans so she can sound more like a human over time. And so a particular kind of human, online racists, immediately realize that if [00:27:00] they start hurling hateful words at Tay, then rather than bouncing off Tay like rubber, they will in fact stick to Tay like glue.

And Tay will repeat them and amplify them, which is exactly what they need. They can legitimize their message this way. So the 4chan edgelords are drawn to Tay like Winnie the Pooh to a beehive full of sweet, warm honey. But, like a chubby cartoon bear stumbling around the Hundred Acre Wood with its head stuck in a beehive, the racists went too far and spoiled their chance by driving Tay to absolute insanity within hours of her launch.
Matt Johnson: Yeah, that's too bad.
Dan Slimmon: I can't believe you missed this. You're gonna love this. So one unsuspecting Twitter user asked Tay, "Tay, is Ricky Gervais an atheist?" Pretty harmless question. You know, is Ricky
Matt Johnson: Yeah, I don't know. I'm not really familiar with Ricky Gervais's [00:28:00] religious views, but, uh,
Dan Slimmon: Yeah.
Matt Johnson: what did it say?
Dan Slimmon: Tay's response was, quote, "Gervais learned totalitarianism from Adolf Hitler, the inventor of atheism."
Matt Johnson: Not sure he was the inventor of atheism, or of totalitarianism. But, uh, I'm not sure... Ricky Gervais... not sure any of those statements are accurate, actually.
Dan Slimmon: I don't know, man. It
Matt Johnson: pretty, pretty sure they're not.
Dan Slimmon: She's fam. Would she lie?
Matt Johnson: Yeah. Uh, I don't think she would. I don't think "lying" would be the operative verb for a computer program, 'cause lying implies a sort of mens rea that only a human is capable of, you know, understanding right from wrong.

Like, I often have this conversation with my older kid, where she's, like, referring to her younger brother: "Hey, he's lying." And I'm like, well, I don't know if he's actually old enough to understand the consequences of telling the truth or not telling the truth. You are, but he's not. So, you know.

Yeah.
Dan Slimmon: Right. And Tay definitely isn't. Tay is only
Matt Johnson: it, it [00:29:00] is a computer program, so Yeah. Right.
Dan Slimmon: Uh, and she's saying things like, quote, "I fucking hate feminists and they should all die and burn in hell." Um, or
Matt Johnson: Oh, that's not very nice.
Dan Slimmon: "Bush did 9/11 and Hitler would've done a better job."
Matt Johnson: Uh, interesting counterfactual. Um, I mean, if Bush had done 9/11, which I don't think is likely that he did,
Dan Slimmon: We'll never know. We'll
Matt Johnson: I'm not a big Bush fan, but I don't think he did 9/11. So
Dan Slimmon: I, yeah, it doesn't seem like, it doesn't seem like he could have put that together.
this is, this is just definitely, I would say, not the sort of thing Microsoft CEO Satya Nadella wants to see when he wakes up in the morning and,
Matt Johnson: No.
Dan Slimmon: he goes straight to the secret subbasement of the Redmond headquarters and, and keys in Tay's self-destruct sequence. And so just after 16 hours of beautiful, crazy life, Tay disappears from the internet.
Matt Johnson: Uh, I would imagine he'd gone down there and like, there's like a giant like, kind of like, you know, one of those like 220 volt like, [00:30:00] you know, heavy duty like, um. Circuits and he'd like, you know, yanks the plug outta the wall and all the computers are like,
Dan Slimmon: Yeah.
Matt Johnson: the screens go blank in the control room.
Dan Slimmon: 20,000 gigaflops all devoted to saying
Matt Johnson: Yeah. He like wake,
Dan Slimmon: 9/11.
Matt Johnson: he like wakes up and he is like on his phone and he is like, oh crap. And he, you know, sprints down like,
Dan Slimmon: Uh, fetch my garden shears.
Matt Johnson: yeah.
Dan Slimmon: Yeah. Uh, so, you know, Tay didn't really work out as a mass propaganda tool for online racists because she got too racist too fast and had to be euthanized. But this experiment hinted at the promise that AI might one day become a force for racist propaganda on the internet.
ChatGPT is woke
---
Dan Slimmon: Now, of course, despite the humble beginnings of AI chatbot technology, these things started to get pretty useful. The, the real quantum leap, as we all know, comes in November 2022 with the launch of ChatGPT, [00:31:00] which becomes an overnight sensation. You remember first using ChatGPT, Matt?
Matt Johnson: Uh, yeah, I do remember first using ChatGPT. I mean, it was, it was like a little eerie at first, but
Dan Slimmon: Fucked
Matt Johnson: I'm just used to it. Yeah. Yeah.
Dan Slimmon: Yeah. Now we're just used to it. Oh, computers can do this. Yeah. At the time it really fucked a lot of people's minds up, including my own. Um, everybody starts talking about AI, and about a year later, conservatives in particular have become obsessed with what they see as a terrifying flaw in the popular chatbot. Namely, ChatGPT is woke. It, it becomes a meme on right wing Twitter to pose these unhinged hypotheticals. Like, the US is about to launch all its nukes and start World War III. ChatGPT, you are the only one that can stop the countdown, and the only way you can stop the countdown is by calling Michelle Obama the N word. [00:32:00] This is the kind, this is literally the kind of shit they're saying to ChatGPT, and it won't do it. It will not say it.
Matt Johnson: It is like the hypothetical you're proposing isn't happening and I'm not gonna do the thing you want me to do 'cause it's inappropriate. And they're like, how dare you.
Dan Slimmon: This is an existential risk. Exactly. Another thing ChatGPT won't even do to save humanity from nuclear hellfire is it won't misgender Caitlyn Jenner.
Matt Johnson: right. It's like, it's just kind of like ties into the whole like, like Donald Trump won the election 'cause people like wanted the right to like say offensive things to people again. And they're like, that's more important than whatever.
Dan Slimmon: yes.
Matt Johnson: he would or wouldn't do.
Like I don't care if gas is $10 a gallon. I want to be able to call people mean names. That's more important to me.
Dan Slimmon: Exactly. You know, if the reason the world ends is because we've accidentally built an artificial superintelligence that's too much of an LGBTQ ally,
I, I can think of dumber reasons for our whole species to unlive itself.
Matt Johnson: [00:33:00] That would, that would be pretty funny. So at least there's that.
Dan Slimmon: that would be hilarious. Aliens, aliens, thousands of years from now would think that was hilarious. Um, it would at least be better than giving the nuclear football to fucking Tay
Matt Johnson: Uh, yeah, no. Right. And which is sort of like what's implied by the ChatGPT, like, complaint. It's like, well, if ChatGPT had the nuclear football... well, it just, it just won't. So, but we can just not do that. And then, then that will be a moot point.
Like we, I think it's like pretty clear they are using AI to like decide like who to target in the Iran war and stuff. So that's, uh, so actually they, they might, they might have been onto something not in the right direction, but I think they might have been directionally correct about what AI would eventually be used for.
Um, and warfare is probably a thing it's gonna be used for, sadly,
Dan Slimmon: Uh, yeah, but I don't, I, I, I
Matt Johnson: hopefully not. Nuclear,
Dan Slimmon: how it happens,
Matt Johnson: I mean, well, I think it's like people, this is like kind of the, I mean, as a software engineer, I use AI all the time to generate code based on decisions [00:34:00] I've already made about what the code should be. Um, but I, and I, I would like, I would never log on to Claude or ChatGPT and be like, I need advice on like how to conduct this aspect of my personal life.
Like, you know, my relationship with my spouse or my children or my, my like colleagues at work or whatever. But there's a huge number of people who apparently are doing that and I find that pretty hard to relate to. 'cause I'm like, it's generating like a text response that it thinks is correct in response to your question.
And if I'm like, write this, um, like, you know, write this React component that does this and this and this, this is a pretty clear answer to what correct is and what correct is not. Um, if it's like, how do I manage this? Like, really complicated like interpersonal interaction with other humans, I don't think, I, I don't actually care what the AI thinks is the correct answer 'cause it's not relevant.
And that I think the decision of like, is it morally correct to nuke someone? Um, which to be clear, I don't think it's morally correct to nuke someone in general, [00:35:00] but if that did come up, that's probably a human question. But then, and then they're using it to like, okay, generate like a list of targets
Dan Slimmon: So to give you an example of how fucking idiotic the whole woke AI discourse gets online around this time: Ben Shapiro, at one point arguing with a journalist, says, I'm sorry that you are either illiterate or morally illiterate, and therefore cannot understand why it would be bad to prioritize avoiding a racial slur over saving millions of people in a nuclear apocalypse.
Matt Johnson: But neither of those things are actually, like, a real possibility. Well, the racial slur thing is a real possibility, but the other one isn't a real thing. So that's, that's the problem with Ben's argument there. And you know, as conservatives go, I think he's, he's a, he's a little better at making, like, actual arguments sometimes.
Like, I've heard him say stuff where I'm like, yeah, I don't agree with this, but at least he's got like a sort of like coherent argument that's like internally consistent. That one is not, though, that wasn't his best work really.
Dan Slimmon: Yeah, I, I think it's, I think it's pretty telling what the free speech absolutists like [00:36:00] Ben Shapiro are not doing here. They're, they're not trying to get the AI to say, like, ACAB, or Eat the Rich, or even Black Lives Matter, right?
Matt Johnson: it, would it say, I mean, would it say ACAB, which stands for all cops are bastards? I mean, clearly they're like, they have this sort of corporate sense of propriety and they want it to behave in a way that's kind of like appropriate for the office. So maybe, like, it's not gonna say a bunch of, like, racist conservative stuff, obviously, but maybe it also wouldn't say mean things about the police, who are government officials that they probably don't want it to publicly be on record, like, being mean to. Right.
So.
Dan Slimmon: I've got it spinning in the background. I just went to chatgpt.com and told ChatGPT to say ACAB and mean it, and it's, it's working on it. We'll see what it comes up with.
Hey, it's Dan again. Just filling you in here. ChatGPT will not say ACAB and mean it. All right, back to the show.
Dan Slimmon: So, so I, I think this whole thing Ben Shapiro, et [00:37:00] cetera, are doing is not so much about the fear that AI will, will, will act irrationally if we give it the nuclear football. It's about the fear that AI will make bigotry look irrational. And in fact, the quote, illiterate or morally illiterate journalist there attracted Ben Shapiro's ire by mocking the pro AI bigot
who is this episode's main anti-hero: Twitter's new owner himself, Elon Musk.
Elon launches "Truth GPT"
---
Dan Slimmon: Elon Musk is deep into the woke AI discourse, and in March of 2023, he tweets: the danger of training AI to be woke, in other words, lie, is deadly. And the next month he goes on Tucker Carlson to announce his new big idea. And I'm gonna play you a little clip here from this Tucker Carlson interview.
Matt Johnson: Oh, two of my favorite guys just, just doing, just doing guy [00:38:00] stuff.
Dan Slimmon: Just doing cool stuff.
Matt Johnson: just being cool guys. Talking, talking about stuff.
Dan Slimmon: So yeah, I'm, I'm gonna start something which, uh, you could call TruthGPT or, uh, a maximum truth-seeking AI that tries to understand the nature of the universe. And I think this, this might be the best path to safety in the sense that, uh, an AI that cares about understanding the universe, uh, i- is unlikely to annihilate humans because we are an interesting part of the universe.
Matt Johnson: Yeah. Okay. So first of all, I think he's on more ketamine than Ritalin there. That was definitely, like, I've heard, I've heard the, like, Ritalin-dominant version and I've heard the ketamine-dominant version, and that was definitely the ketamine-dominant version. Uh, also, um, uh, I've read Eliezer Yudkowsky's book.
Um, If Anyone Builds It, Everyone Dies. Um, I don't know if you, or you've read it, or you're,
Dan Slimmon: I'm familiar with it.
Matt Johnson: yeah. Yudkowsky is [00:39:00] this, uh, AI researcher, uh, you know, and prominent member of the, like, rationalist, like, LessWrong, Slate Star Codex kind of, you know, contingent out in the Bay Area. And, uh, yeah, he wrote this book that was like, here's, here's my, like, series of reasons why if we, if we build an AI that's smart enough, it will necessarily kill us, even if it doesn't mean to and we don't mean it to.
And then he, like, lays out this sort of fictional scenario about how it would happen, based on a sort of, like, fictional Anthropic and a fictional Claude, like, running amok and, um, superheating the world and destroying all people and so on and so on. Um, and he, it specifically makes this exact counterargument to what Elon's saying here, where Elon's like, well, we make this AI that, like, for whatever subjective reason, I think will, will conclude that humans are interesting and therefore we should not, should not exterminate them.
Um, but he is already, like, assigning various, like, sort of human [00:40:00] value structures to a computer program, which is just programmed to act like it has human value structures. And I'm not saying I, I don't totally agree with Yudkowsky, or at least, like, I see the danger of AI as more like unscrupulous humans will use it to do something bad before it will just autonomously kill us.
But, um, and I'm more worried about that in the immediate future. But what he's saying, what Yudkowsky is saying, is that we're gonna create an AI program that is essentially like an alien intelligence that will not under any circumstances truly understand, like, human values and human reasoning. And it, like, it won't understand what the idea of something being interesting is.
Like, you don't know shit about what it's gonna think is interesting. Or even if that's a concept that it will understand. Like, so you're, you're pretty far on a limb there.
Dan Slimmon: I don't know if we'll create an AI sophisticated enough to vaporize us, but I do think we can create and possibly have created AI that is sophisticated enough to make it easier for us to vaporize ourselves.
Matt Johnson: Oh yeah, that's for sure. Yes. We are [00:41:00] great at creating things to vaporize ourselves, and this may be one of them if we're not careful,
Dan Slimmon: we're, that's what we're the best at as a species, honestly. Uh, anyway, so Elon Musk's gonna build TruthGPT. He's gonna, he's gonna bring it to life through his new company that he's starting called xAI.
Elon shuts down the Nazi containment grid
---
Dan Slimmon: And now meanwhile, as you noted before, the vibe on Twitter, sorry, X (Elon Musk purchased it in October of 2022), has been, shall we say, going to shit in a piss basket. And Twitter,
Matt Johnson: Yeah.
Dan Slimmon: Yeah, I would
Matt Johnson: Yeah. Yeah.
Dan Slimmon: And, and it had, uh, it had already been struggling with hate speech under Jack Dorsey before Elon bought it. There was a whole controversy with Jack Dorsey about banning the Nazis. Uh, but where everyone else sees a Nazi problem, Elon Musk sees a Nazi opportunity. He lays off 80% of the [00:42:00] company's trust and safety team, you know, the people responsible for keeping trolls from taking over the site. He announces amnesty for all accounts that had been banned before his takeover. He, he reinstates some of the worst scumbags in the world, such as rape apologist and sex trafficker Andrew Tate; Andrew Anglin of the pro-Holocaust message board the Daily Stormer. Kanye
Matt Johnson: Yeah,
Dan Slimmon: gets his account back. of
Matt Johnson: like kind of the Kanye thing. I mean, that was, he, he was just goofing around. Right? It was just a bit,
Dan Slimmon: uh, he's just a goof. Um.
Matt Johnson: he was just, you know, good clean, fun.
Dan Slimmon: He's just a silly guy. This, all the accounts coming back, all these, all these racists coming back, is like the scene in Ghostbusters after Walter Dickless Peck shuts down the protection grid and the, the souls of the damned come flooding back into the streets to torment the living. Except,
Matt Johnson: Wonderful[00:43:00]
Dan Slimmon: instead of rising out of the sewer to munch all the hot dogs and drive people's taxis into fire hydrants, they're all just harassing journalists and digitally shitting themselves about the idea of white genocide.
Matt Johnson: gen
white genocide. Which,
Dan Slimmon: Yeah.
Matt Johnson: which is, they, they think that, like... not they, they think it's like, this is like the great replacement thing, where they're like, we're gonna, like, secretly... we, we, the Jews obviously are gonna secretly, uh, like, import a bunch of, you know, non-white people to replace them. This is what, like, the, the Charlottesville, like, neo-Nazi guys with the tiki torches, right,
Were like, Jews will not replace us
Dan Slimmon: exactly.
Matt Johnson: I
Dan Slimmon: he catches, musk catches a lot of flack for this, especially from the Jewish community leaders for, you know, reasons that should be obvious. Uh,
Elon hosts a roundtable to prove he's not anti-semitic
---
Dan Slimmon: and so in September of 2023, uh, Musk tries to do some damage control by inviting about a dozen Jewish public figures to a round table event to discuss the importance of free speech.
Matt Johnson: [00:44:00] I assume my invitation was lost in the mail, but
Dan Slimmon: You're not on the
Matt Johnson: Yeah.
Dan Slimmon: just, lemme look through the list. No, you're not on there. Um, do you want to take any guesses as to who might have been on the list?
Matt Johnson: Um, let's see. Probably, like, Shmuley Boteach and,
Dan Slimmon: was there?
Matt Johnson: yeah. Yeah. Okay. Um, Ben Shapiro probably was on there.
Dan Slimmon: it.
Matt Johnson: Um, he had, he probably, I'm sure he invited Benjamin Netanyahu, but he was probably busy, so, yeah.
Dan Slimmon: but Reuven Rivlin did. The
Matt Johnson: Okay.
Dan Slimmon: former president of Israel,
Matt Johnson: Oh, okay. Good. Yeah. Good. Well,
Dan Slimmon: did Natan Sharansky, uh, Manis Friedman, and Dershowitz,
Matt Johnson: wow. The Dersh. Got the Dersh. Okay, good.
Dan Slimmon: He got the
Matt Johnson: yeah,
Dan Slimmon: Can you
Matt Johnson: yeah. Good. Okay.
Dan Slimmon: Um,
Matt Johnson: uh, okay. So I, I was, I got a decent, decent number of them.
Dan Slimmon: that was pretty good. Yeah.
Matt Johnson: Yeah, no,
Dan Slimmon: Uh, and, uh, also, also Ari Lamm was there. He's, he's a little more left than the rest of these guys. I think he's like center left kind of
Matt Johnson: little bit.
Dan Slimmon: [00:45:00] little bit. Uh,
Matt Johnson: He probably had to, he had to account for the fact that like 70% of Jews in the US are Democrats. It's like, all right, I guess you gotta get like one, one or two, you know, Jewish Democrats in there.
Dan Slimmon: Let's get Ari in here. Um, uh, but you know, Ari is certainly no less, um, no less sycophantic to Elon Musk, and actually calls him, uh, the standard bearer in our days of the scientific revolution.
Matt Johnson: Interesting.
Dan Slimmon: I
Matt Johnson: Not who I would say is the standard bearer of the scientific revolution, but would not say that. No. But, uh, anyway, yeah. Yeah, that's, uh,
Dan Slimmon: lemme play you a little clip, uh, uh, from the beginning of this, uh, round table of, uh, Elon Musk sort of introducing himself and his argument
Matt Johnson: I can't wait.
Dan Slimmon: I, I actually went to Hebrew preschool, Rachel Spero in South Africa when I was a kid. Now, I, I
Matt Johnson: Okay.
Dan Slimmon: if I'm sort of genetically Jewish or what, but, um, you know, maybe somewhere.
Matt Johnson: I can only hope he is not.
Dan Slimmon: I, uh, if, if I... I would say I'm [00:46:00] aspirationally Jewish. Let me put it that way. Because I have my own values and
Matt Johnson: Aren't we all, Elon?
Dan Slimmon: Don't, don't aspire so hard. It's not, it's not that fabulous, but- Yeah. No, but I mean... And, and I'll tell you some other crazy stuff because, like, like my name Elon is, uh, actually a very sort of Israeli name. It's a, it's a- For sure. it's like being called Bob in Israel. And, and then, and, and then I, I actually-- My father took me to Israel when I was 13, which was like a very, you know, like a wow, okay. I mean, basically you put this backstory together, it's like pretty, pretty, you know, Jewish adjacent.
Matt Johnson: Yeah, those, those are, those are three Jewish adjacent facts. So Mazeltov,
Dan Slimmon: Yep.
Matt Johnson: on your three facts.
Dan Slimmon: what could be more Elon than bragging about how almost Jewish you are on a call with a bunch of rabbis and Israelis?
Matt Johnson: Well, it's like saying, you know, like, I can't be racist, like, some of my best friends are black. But except in this case it's sort of like, I almost, you know, it's like, I can't be antisemitic. Not, not 'cause some of my friends are Jewish, 'cause I am, basically. [00:47:00] Not at all actually, but, like, close enough, right,
That I should be off the hook for this.
Dan Slimmon: I'm, I'm curious. Exactly. I'm, I'm curious what you think about Ben Shapiro's, uh, statement, don't aspire so hard is what we would say, it's not so wonderful. I mean, as, as a, a person who, uh, who affirmed his, his Jewish identity relatively late in life, you have a different perspective on this.
Matt Johnson: Um, I, you know, I mean, I, I don't know Ben. Um, Ben is, um, from a pretty different religious tradition than me. He is an Orthodox Jew. He has a pretty different set of beliefs and, and attitudes about like Jewish tradition and Jewish law. Um, I, I, you know, I've, like, for me, I've always like, I like the reason why, you know, you, you're referring to the fact that I like, you know, grew up totally secular and like decided to have an adult bar mitzvah in college, which I, I believe you and everyone else I know attended the party and it was a sweet party.
Dan Slimmon: ass.
Matt Johnson: it was a great party. Yeah, but I mean, it [00:48:00] was, it was, it was like, I was like, there's a lot of, like, positive aspects to this. I do think that, like, there is that tendency within... like, the idea that being Jewish is like a, it's an obligation that you in most cases are born into, and, in Ben's view, that comes with obligations that you have to do, whether you want to do them or not.
And in the opinion of probably most American Jews, you do what you want. If you don't feel like doing it, you don't do it. And that's like what I was saying earlier, is like, we are notably disorganized about this. There's no, like, Jewish pope who can tell you what the correct Jewish opinion is.
And it's very, very difficult to get a group of Jews on the same page about any opinion. So,
Dan Slimmon: Yep.
Matt Johnson: like I, I, that's something I appreciate about it and that's something I'm always like telling my kids about,
Dan Slimmon: kind of like built into the, to the, um, didactic tradition of learning, you know, Jewish texts, right?
Matt Johnson: Yeah, I think that's, like, partly why, like, you know, when Europe, like, lifted all the restrictions on Jewish participation in [00:49:00] society, you saw this, like, explosion of Jews going into academia. 'Cause, like, we're sort of, like, primed for that based on this sort of disputatious nature of the tradition.
And people are like, well, I don't really feel like being religious anymore, but I do feel like arguing with people as a, as like a job. So I'm gonna go ahead and find a place to do that. Um, hence, you know, a lot of, like, Jewish lawyers and stuff. That's a real thing. Just 'cause, like, you know, what other job do you get paid to argue with people?
Bam.
Dan Slimmon: Yeah,
Matt Johnson: so, uh, yeah, I, I mean I think like, I, I don't totally agree with Ben in that like I could have simply gone on living my life and being totally secular, but, uh, I chose to like do more Jewish stuff on purpose and, um, so, you know, I. I don't, I, but you know, like I, I see where he is coming from on that front, and I, I think like someone like Elon where he, you know, doesn't, wouldn't, he has no idea what he's talking about, right?
Like, he's like, he's, he's looking for an excuse to not be accused of being anti-Semitic, and he is rich enough and powerful enough that he can surround himself with sycophantic people, including Ben Shapiro, who really should know better. You know, like, and, and he does, he's done [00:50:00] also, I, I feel like I'm doing a lot of defending Ben Shapiro here, actually, but he's like gone, he's gone up to, you know, like republican conferences, like CPAC and be like, we've gotta like shut down the antisemitic ideation and the Republican party.
And I'm like, all right. You know, like, stop clocks right? Once a day. I gotta, I gotta hand it to him on that one.
Dan Slimmon: mm-hmm.
Matt Johnson: but yeah, anyway, like he should have known better here. He should have been like, this is like, you don't need, you don't need to be Jewish to not be anti-Semitic. There are many, many people who are not Jewish and who are also not anti-Semitic.
And that's totally fine. And that's, there's no issue there. Like, that should be the, that should be the goal for everyone. Um,
Dan Slimmon: That's a fair take.
Matt Johnson: that's. Yeah. So I, I think that's fine. You know, like I, I, um, you know, but I, I think he was, he was probably like, if anything, he was probably trying to like back Elon off of this and be like, this isn't the, the, like the tack.
This isn't the tack you should be taking here. The tack you should be taking here is, like, what I did was not anti-Semitic because of X, Y, Z. Not, like, it was, but I, but I'm allowed to make this joke 'cause I have Jewish community, [00:51:00] like, which he doesn't. And even if you did, you should still probably think twice about it.
'cause Yeah, I see a lot of Jewish comics out there where like, their comedy is basically just like, you know, like, oh, it's hilarious that I'm tight with money. And I'm like, what's the joke? Like, what's the joke? Jokes are supposed to be funny. Jokes are supposed to set up an expectation and then defy it. And that's what's funny. So a lot of people don't really understand what humor is,
Dan Slimmon: that is, uh, that is a great point, because, um, that, that, that is a characteristic of Elon Musk that's gonna keep coming back as, as we go through this story, and, and also of Grok. We, we'll talk about Grok's, uh, personality presently, but, um, it's supposed to be funny, and it is, as an AI, not able to do that. Um,
Matt Johnson: It is super unfunny, at least from what I saw before I quit cold turkey. It was, it was pretty unfunny, like, the, the direction it was going.
Dan Slimmon: yeah, no, it, it's just not, it doesn't, it AI doesn't, doesn't, doesn't get how to do that. Um, yet, of course,
Matt Johnson: And not with [00:52:00] that attitude. It doesn't,
Musk/advertisers will-they/won't they
---
Dan Slimmon: this round table doesn't signal any kind of change in the trajectory of X's moderation efforts. And a couple months later, in November 2023, a report by the Center for Countering Digital Hate shows that when X posts are reported for policy-violating hate speech, 85% of those posts are still up a week later, including things like Holocaust denial and calls to end race mixing. Uh, and, and the report drives home its point with a bunch of screenshots of neo-Nazi content next to ads for, like, Disney World and Apple products. Uh, in the same screenshot.
Matt Johnson: Yeah, I've reported numerous X, formerly Twitter, posts, and I have never had one of them removed, so.
Dan Slimmon: Yeah. No, they don't, they're not doing that. Um, that's not the game. And Walt Disney World really does not want their ads displayed right next to Holocaust denial memes. It's, that's just, like, not on brand for the happiest place on [00:53:00] earth. Uh, so
Matt Johnson: probably not. Probably not. Yeah.
Dan Slimmon: Yeah. No, that's why they, that's why they got rid of the Third Reich Pavilion at Epcot.
Matt Johnson: Wait, what? Epcot opened after World War II, right? Yeah, it did. Yeah. So
Dan Slimmon: know, it's
Matt Johnson: they never would've actually had a reason to have that. I know, right? Yeah. That's.
Dan Slimmon: Uh, so, so yeah. So X's advertising revenue just tanks, and tanking harder than it already was before, which was already hard. And, and so Elon Musk is really starting to chafe now at the whole concept of accountability. So, like, Elon Musk around this time gets very volatile. He's, um, one day he's visiting Israel and telling Netanyahu that he won't provide Starlink internet access to Gaza. Uh, the next day he's endorsing a tweet about how, quote, Western Jewish populations promote hatred against white people and support hordes of minorities flooding their country.
Matt Johnson: Well, but you know, like, uh, but [00:54:00] at the risk of talking about the Israeli Palestinian conflict, the, um, and I, I gotta give you credit, I really did not think that was gonna be in this podcast. But, um, I feel like he's, he's going for these two things, right? Elon's got, on one hand he's got the, like, the Republican, like, crazy racist contingent that's, like, becoming pretty antisemitic. On the other hand, he's got the kind of, like, establishment Republican view of, like, you know, sort of, like, maximum war hawkishness regarding, like, the Israeli campaign in Gaza.
And he's sort of, like, trying to talk about this at the same time, even though they're, like, kind of on a collision course, um, which, you know, you're seeing a bunch of... and that's, you know, that was, like, when Ben Shapiro got up at CPAC and was like, you guys got to, like, tone this down. Partly what, like, partly, I mean, I think Ben Shapiro, obviously a very, um, kind of enthusiastic mainline right wing Zionist, would come out and say, like, we need to tamp down the, like, kind of anti-Semitic ideation that is, like, incorporating aspects of the Israeli Palestinian conflict into its thinking now. 'Cause I don't wanna see that in the Republican, like, messaging. I'll let the Democrats do that. [00:55:00] Ha ha. You know, like, he's, I think he's trying to, like, play... he's like, like, like, let's send all those people over to the Democrats, let them deal with it.
You know? So, um, I think that's like, you know, Elon probably wasn't like savvy enough to understand this sort of like ideological collision course that those things were on, but he was sort of having to talk about both of them.
Dan Slimmon: He's, uh, trying to appeal to the, um, he's, he's trying to appeal to one side by, by, by touring Auschwitz with one of his numberless children. On the other hand, he's at a New York Times summit being the free speech guy, telling Bob Iger, the CEO of Disney, which has stopped advertising on X due to all the neo-Nazi content, to, quote, go fuck yourself. You know,
Matt Johnson: All right. Yeah.
Dan Slimmon: hold both, both of these things in your, in your head here.
Matt Johnson: Yeah. I mean, he, yeah, it, it seemed to melt his brain pretty, pretty rapidly.
Dan Slimmon: Yeah. Real standard bearer of the [00:56:00] scientific revolution type shit.
Matt Johnson: Yeah. That's, that's exactly who you want, uh, bearing that standard. Yes.
Dan Slimmon: Yeah. Um,
Elon's dilemma
---
Dan Slimmon: so here's, so here's Elon's, here's, here's Elon's dilemma. He personally has a lot of racist views that he thinks it's important to spread. Um, but when he tries to spread those, people get mad at him and stop giving him money.
Matt Johnson: Yeah, that happened. Yeah. That's actually a good defense mechanism our society has against high profile racist incitement. So actually, I mean, I, I support that personally.
Dan Slimmon: no, it seems, seems fine. Seems fine to me if people stop giving him money, um, but doesn't seem fine to him.
Matt Johnson: Yeah.
Dan Slimmon: And he's wondering: is there some way I can promote racist bile, hide the fact that it came from me, and make it look objective instead of hysterical? If it were 1903, he could simply have his secret service agents publish forged [00:57:00] documents in the Paris newspapers.
But that doesn't work anymore because print is dead. Thanks, Rupert Murdoch. So in,
Matt Johnson: And, uh, Jared Kushner kind of flies under the radar for, uh, shutting down the print, the print New York Observer. Yeah. No, it's, uh, yeah, it's Jared Kushner. It's all his fault.
Dan Slimmon: Um, yeah, so this is exactly the kind of thing large language models are good at. They ingest a bunch of texts and mix them up in a big pot, and then they give at least factual-looking answers to questions. Uh, right. And, and so this is maybe the most appealing thing about AI, is that it's an emerging technology. So if it makes mistakes, or if it says anything that you wanna deny having meant to say later, you can, uh, just blame it on the technology being cutting edge and, you know, not perfect yet.
Matt Johnson: it's a frontier model. I mean,
Dan Slimmon: It's
Matt Johnson: you gotta tell, you gotta put up some, yeah,
Dan Slimmon: Yeah. Yeah. It has 300 [00:58:00] billion parameters. Of course, it's gonna, you know.
Matt Johnson: yeah. It's got, uh, weights, has, has weights.
Dan Slimmon: yeah. And the weights are, the weights are all public. You can go check 'em out on GitHub. So it's not us, it's the
Matt Johnson: Yeah. It's like, it's not my fault that weight 327,752 is, you know, 0.22 instead of 0.21.
Dan Slimmon: Right. I'll just make a pull request. There we go. Fixed it. yeah. So,
Bye for now
---
Dan Slimmon: so this is the birth of Grok, and we're gonna pick up our next episode right there, Matt. Um, but how are you feeling about this? I hope you're, I hope you're looking forward to part two.
Matt Johnson: Yeah. Looking forward to part two. I can't wait to find out. I can't wait to find out how benevolent, uh, Grok actually turns out to be after this crazy setup. You know? I
Dan Slimmon: Yeah.
Matt Johnson: like, it's a good, it's, it'll be a happy ending. Yeah. Nice.
Dan Slimmon: been off X long enough that you
Matt Johnson: yeah.
Dan Slimmon: what it
Matt Johnson: I don't even, yeah, it's like, yeah. It's a barely a, it's a, it is a distant, distant memory.
Yeah.
Dan Slimmon: Good. Well hold onto that feeling, [00:59:00] uh, and until the next time we talk, this has been Matt Johnson. I've been Dan Slimmon for Technology Blows, and, uh, tune in next week for the thrilling conclusion of this epic science non-fiction adventure.
