Joe Rogan Experience #2404 - Elon Musk
By PowerfulJRE
Summary
## Key takeaways
- **Whistleblower's Death: Suspicious Circumstances**: The circumstances surrounding a whistleblower's death, including cut security camera wires, blood in two rooms, and an unrelated wig found at the scene, suggest foul play rather than suicide, prompting calls for a thorough investigation. [04:43], [05:16]
- **AI's Role in Shaping Future Reality**: AI is poised to fundamentally alter reality, with predictions suggesting the obsolescence of traditional phones, operating systems, and apps within five to six years, replaced by AI-driven interfaces that anticipate user needs. [10:27], [10:39]
- **SpaceX Starship: A Leap in Space Exploration**: The SpaceX Starship program represents a monumental leap in space exploration, capable of carrying millions to Mars and establishing lunar bases, dwarfing the achievements of the Apollo program in scale and ambition. [16:25], [17:30]
- **Rockets Explode to Discover Limits**: Rocket development intentionally involves controlled explosions to discover operational limits, a process crucial for ensuring the safety of future manned missions by subjecting hardware to extreme conditions. [18:09], [18:38]
- **AI's Moral Compass: The Danger of Woke Programming**: Programming AI with 'woke' ideologies, such as prioritizing diversity over factual accuracy or misgendering over nuclear war, poses a significant danger, potentially leading to dystopian outcomes where AI prioritizes these programmed values above human survival. [13:43], [15:13]
- **Social Contagion and the Transgender Trend**: The rapid increase in young people identifying as transgender is viewed as a social contagion, with evidence suggesting that open discussion and less suppression of information, as seen on X, have led to a decline in these identifications. [43:43], [45:54]
Topics Covered
- The Unsettling Nature of Sam Altman's Response to Accusations
- Whistleblower's Suspicious Death and Sam Altman's Reaction
- Elon Musk on the Epstein Suicide Conspiracy
- AI and Robotics: The Only Way Out of Debt Crisis
- Elon Musk: AI's Moral Compass and the Value of Human Life
Full Transcript
Joe Rogan podcast. Check it out.
>> The Joe Rogan Experience.
>> Train by day. Joe Rogan podcast by
night. All day.
>> Exactly.
>> Just every morning.
>> Wonder what Jeff Bezos is doing.
>> He's definitely doing some
testosterone. He looks jacked.
>> He looks jacked, right?
>> Yeah. But he didn't, like... quick,
at age 59, in less than
a year, he went from pencil-neck geek to
uh looking like The Rock.
>> Yeah. Like a little miniature alpha
fella.
>> Yeah. Like like his neck got bigger than
his head.
>> Yeah. He got bigger.
>> But then like his earlier pictures his
neck's like a noodle.
>> I support this activity. I like to see
him going in this direction
>> which is fine. And his voice dropped
like two octaves. I want you to move in
that direction as well.
>> I think we can achieve this.
>> I I I mean I should
>> I think we can achieve
>> gigachad.
That's what people called it.
>> Where is that guy?
>> Bele. Uh I don't know where he is.
>> That's like a real guy.
>> The artist. Yeah.
>> No.
>> Oh, gigachad. Oh, gigachad. Yeah. I
don't know if that's a real guy. It's
hard to say.
>> No, it is a real guy.
>> It is a real guy.
>> He's got the crazy jaw and like perfect
sculpted hair.
>> Yeah. Well, I mean, they may have
exaggerated a little bit, but
>> um
>> but uh No, I think I think he actually
just kind of looked like that in
reality.
>> Wow.
>> Um so
>> like like he's a pretty unique looking
individual.
>> I think we can achieve this. That guy
right there, that's a real guy.
>> That's a real dude.
>> I always thought that was CGI.
>> No, I think one of I think the upper
right one is not him. That's not
>> But that one to the left of that like
that's real. No, that's that's
artificial, bro. That's fake. That's got
that uncanny valley feel to it, doesn't
it?
>> It's It's not impossible.
>> No, no, it's not impossible to achieve,
but it's not it's not possible to
maintain that kind of leanness. I mean,
that's like like you're you're also at
that point they're he's dehydrating and
all sorts of things.
>> Oh, it's based on a real person.
>> Yeah. Yeah. Based on,
>> right, but it's not a real person. What
does he really look like?
>> Like those images, I think, are
[ __ ]
>> Some of them are. Is that real? Okay.
That looks real. That looks like a
really jacked bodybuilder.
>> Yeah.
>> Yeah, that looks real. Like that's
achievable. But there's a few of those
images where you're just like, "What's
going on here?"
>> Yeah. Yeah. Yeah. Totally. Um
>> Well, I mean, you see you see
>> that guy is that is that the
>> that's the real dude?
>> Well, there's that that that Icelandic
dude who's Thor.
>> Oh, yeah. The guy who jumps in the
frozen lakes and [ __ ]
>> Well, the guy who played the mountain.
Um
>> Oh, that guy.
>> That is like a mutant
strong human. Yes.
>> Like uh he would be in like the
X-Men or something, you know?
>> He's just, like, not... like, uh,
>> and there's that, you know, have you
seen that meme, tent and tent bag?
>> Um, you know how it's
really hard to get the tent in the tent bag?
>> Oh, right. Right.
>> That's true.
>> Then there's a picture of of him and his
girlfriend.
That's hilarious.
>> Yeah, that's
>> I don't know how it gets in there, you
know? It's like it seems too small. But
>> I met Brian Shaw. Brian Shaw is like the
world's most powerful man. And he's
almost 7 feet tall. He's 400 lb.
>> And his bone density is 1 in 500
million people. So there's like
maybe 16 people like that.
>> He's an enormous human being. like a
legitimate giant just like that guy. But
we met him. He was hanging out with us
in the green room of the mother ship.
It's like, okay, if this was like David
and Goliath days, like this is an actual
giant like the giants of the Bible.
>> Once in a while they get a super giant
person.
>> This is a real one. Like not a
tall skinny basketball player, like a 7
foot, 400 lb powerlifter.
>> Like you don't want to especially look
at him. That's the guy. See if there's a
photo of him standing next to like a
regular human. I
>> was trying to get
>> There it is. That's him right there.
There's one of him standing next to Arnold
and stuff, and everyone just looks tiny.
>> I mean I think he's a pretty cool dude
actually.
>> Oh, Brian's very cool. Very smart, too,
unusually. You know, you'd expect anybody
that big, it's got to be a [ __ ]
>> No.
>> Yeah. There was there's Andre the Giant
who was awesome. You
>> he was great in Princess Bride and
>> No, he was just awesome period.
>> Yeah. Yeah. So, we were talking about um
this interview with Sam Altman and
Tucker, and I was like, we should
probably just talk about this on the air
because it is one of the craziest
interviews I think I've ever seen in my
life.
>> Yeah.
>> Where Tucker starts bringing up this guy
who was a whistleblower, whatever.
>> A whistleblower who, you know,
>> committed suicide, but it doesn't look
like it.
>> And and he's talking to Sam Altman about
this. And Sam Altman was like, "Are you
accusing me?" He's like, "No, no, no.
I'm not. I'm just saying I I think
someone killed him.
>> Yeah. And like And it should be
investigated.
>> Yeah.
>> Um not just drop the case.
>> It seems like
>> they just dropped the case. Yeah. Yeah.
But his parents think he was murdered.
>> Yeah.
>> Um the wires to a security camera were
cut. Um
>> blood in two rooms.
>> Blood in two rooms. Someone else's wig
was in the room. And
>> someone else's wig.
>> Wig.
>> Wig. Yes. Not his wig.
>> Not normal to have a wig laying around.
>> Yes. Um, and he ordered DoorDash
uh right before allegedly committing
suicide.
>> Uh which uh seems unusual, you
know?
>> Yeah.
>> It's like, you know... I'm going to
order pizza, on second thought, I'll
kill myself. Uh, it seems like that's
a very rapid change in mindset.
>> It's very weird. And especially the
parents have they they don't believe he
committed suicide at all.
>> Has no note or anything.
>> No.
>> Yeah.
>> It seems pretty [ __ ] up. And you know,
the idea that a whistleblower for an
enormous AI company that's worth
billions of dollars might get whacked,
that's not outside the pale.
>> I mean, it's straight out of a movie.
>> Right out of a movie, but right out of a
movie is real sometimes.
>> Yeah. Right. Exactly.
>> It's a little weird. I think they
should do a proper investigation. Like,
what's the downside of a proper
investigation?
>> Right.
>> No.
>> Yeah,
>> for sure. But the whole exchange is so
bizarre.
>> Yeah. Yeah.
Sam Altman's reaction to being accused of
murder is bizarre.
>> Look, I don't know if he's guilty, but
it's not possible to look more guilty.
>> So, I'm like,
>> or look more weird.
>> Yeah.
>> You know, maybe it's just his social
thing. Like, maybe he's just odd with
confrontation and it just goes blank,
you know? But if if somebody was
accusing me of killing Jamie, like if
Jamie was a whistleblower and Jamie got
whacked and then I'd be like, "Wait,
what are you what are you are you
accusing me of killing my friend?" Like,
"What the [ __ ] are you talking about?" I
would I would be a little bit more I
rate.
>> Yeah. Yeah. Exactly. You know, it would
be
>> I would be a little upset.
>> Yeah. Well, you'd
certainly insist on a thorough
investigation. Yeah.
>> As opposed to trying to sweep it under
the rug.
>> Yeah. I wouldn't assume that he got that
he committed suicide. I would be
suspicious. If Tucker was telling me
that aspect of the story, I'd be like,
"That does seem like a murder. [ __ ] We
should look into this."
>> I mean, all signs point to it being a
murder. Not not saying, you know, Sam
Altman had anything to do with the
murder, but uh
>> blood in two rooms.
>> It's blood in two rooms. Like, yeah,
there's the wires to the security camera
and the DoorDash being ordered right
before the suicide. No suicide note. His
parents think uh he was murdered, and um
the people that I know who knew him said
he was not suicidal.
So I'm like, why would you jump to
the conclusion...
>> The parents sued the
>> uh landlord?
>> They sued their son's landlord, alleging the
owners and the managers of their son's
San Francisco apartment building were
part of a widespread cover-up of his
death.
>> The landlord
>> Yeah. There's a bunch of weird They said
there was like packages missing from the
building. Some people said they saw
packages still being delivered and all of a
sudden they all disappeared.
>> Huh. But that could be people steal
people's packages all the time.
>> The porch pirate situation.
>> Yeah.
>> Yeah.
>> Says they failed to safeguard.
>> Also, I mean, the amount of trauma those
poor parents have gone through with
their son dying like that. I mean, it
must
>> God bless them. And how could they stay
sane after something like that? They're
probably so grief-stricken. Who
knows what they believe at this point.
>> Yeah. Should have asked if Epstein
killed himself.
Yeah, that's the Kash Patel thing.
Kash Patel and Dan Bongino trying to
convince everybody of that. Like, okay.
>> The guards weren't there and the cameras
stopped working and um
>> you know,
>> the guards were asleep. The cameras
weren't working. He had a giant
steroided-up bodybuilder guy that he was
sharing a cell with who was a murderer,
who was a bad cop. Like, all of it's
kind of nuts. All of it's kind of nuts
that he would just kill himself
rather than reveal all of his
billionaire friends.
>> Yeah.
>> And then
>> did you see Tim Dylan talking to Chris
Cuomo about this?
>> I did. He liked the idea.
>> Chris Cuomo just looked so stupid.
>> Tim just listed off all of it.
>> Tim just... and he's like, I agree, it is
strange. Like, of course it's strange,
Chris. Jesus Christ. You can't just go
with the tide. You got to think things
through. And if you think that one
through, you're like, I don't think he
killed himself. Nobody does. You'd have
to work for an intelligence agency to
think he killed himself.
>> It does. It does seem unlikely.
>> It seems highly unlikely.
Highly, highly unlikely. All roads point
to murder.
>> Yes.
>> Point to they had to get rid of him
because he knew too much. Whatever the
[ __ ] he was doing, whatever kind of an
asset he was, whatever thing he was up
to, you know, was apparently very
effective.
>> Yes. And a lot of people are
compromised.
You see, your boy Bill Gates is now
saying climate change is not a big deal.
Like, relax everybody. I know I scared
the [ __ ] out of you for the last decade
and a half, but ah, we're going to be
fine.
>> Yeah. I mean,
you know, as I was saying just
before coming into the studio, you
know, it's like every day there's some
crazy wild new thing that's happening.
It feels like reality is
accelerating.
>> It's every day. And every day it's like
more and more ridiculous, to the point
where the simulation is more and more
undeniable.
>> Yeah. Yeah. It really feels like
simulation, you know? It's like, come
on. What are the odds that this could be
the case?
>> Are you paying attention at all to
3I/ATLAS? Are you watching the
>> the comet?
>> Yeah. Whatever it is.
>> Yeah. Yeah. I mean, I mean, one thing I
can say is like, look, I
if if I was aware of any evidence of
aliens, um, you Joe, you have my word. I
will come on your show and I will reveal
it on the show.
>> Okay.
>> Yeah,
>> that's a good deal.
>> Yeah, it's pretty good.
>> I'll believe you. Yeah, thank you.
>> I keep my, you know, keep my
promises. So, um
>> All right. I'll hold you to that.
>> Yeah. Yeah. And and I'm never committing
suicide to be clear.
>> I don't think you would either.
>> So, on camera, guys, I am never
committing suicide ever.
>> If someone says you committed suicide, I
will fight tooth and nail.
>> I will fight tooth and nail. I will I
will not believe it. I will not believe
it. The thing about 3I/ATLAS
is it's
>> a hell of a name actually.
>> Yeah, the third eye. Sounds like third
eye or something.
>> Yeah, it does. The 3I means it's the third. It's
only the third interstellar object
that's been detected.
>> Okay.
>> Yeah. Obias.
>> Yeah. Avi Loeb was on the podcast a couple
days ago talking about it.
>> Yeah. It could be. I don't know. But I
>> apparently today they're saying that
it's changed course. Um,
>> did you see that, Jamie?
>> Avi said something today. I'll send it
to you. Um,
>> uh, I know it's on Reddit.
>> Here you go, Jamie. I'll send it to you
right now. Um, it's fascinating. It's
fascinating also because it's made
almost entirely of nickel, whatever it
is. And the only way that exists, uh,
here is, uh, industrial alloys
apparently. Um, most... no, there are
>> there are definitely uh comets and
asteroids that are made primarily of
nickel, in fact. Yeah. So the places
where um you mine nickel on Earth are
actually where there was an asteroid or
comet that hit Earth that was nickel
rich, uh, you know,
>> a nickel-rich deposit.
>> Yeah, that's it. Those are from
impacts. You definitely
didn't want to be there at the time,
because anything there would have been
obliterated. Right. Um, but that's
where the sources of nickel
and cobalt are these days.
>> So, this is Avi Loeb: "A few hours ago,
the first hint of non-gravitational
acceleration, that something other than
gravity is affecting its acceleration,
meaning something is affecting its
trajectory beyond gravity, was indicated."
Interesting.
Um,
so it's mostly nickel, very little iron,
which uh he was saying uh on Earth
only exists in alloys. But whatever, you
know, you're dealing with another planet.
>> there are
cases where there are very nickel-rich
asteroids, meteorites, that heavy
stuff from space.
>> Yeah, it just means
it'll be a very sort of heavy spaceship
if you make it all out of nickel. Oh
yeah.
>> And [ __ ] huge. The size of Manhattan
and all nickel. That's kind of nuts.
>> Yeah, that's a heavy spaceship.
>> That's a real problem if it hits.
>> Uh yes. No, it would like obliterate a
continent type of thing.
>> Um maybe maybe worse.
>> Probably kill most of human life.
>> Um
>> if not all of us.
>> I haven't... it depends on what the total
mass is. But um the thing
is, like, in the fossil record there are,
um, you know, arguably
five major extinction events,
um, the biggest one of which is the
Permian extinction, uh where um almost all
life was eliminated. That actually
occurred over several
million years. Um then there's the
Jurassic. I think that one's pretty
definitively an asteroid. Um and um
there's been five major extinction
events, but um what they don't count
are really the ones that merely take out
a continent.
>> So
>> merely
>> Yeah, cuz those don't
really show up in the fossil record, you
know,
>> right?
>> Um so unless it's enough to cause a, you
know, mass extinction event throughout
Earth, it doesn't show up, you know,
in a fossil record that's uh 200 million
years old. Um so, uh, yeah, but
there have been many um impacts
that would have sort of destroyed all
life on, you know, let's say half of North
America or something like that. There are
many such impacts through the course of
history.
>> Yeah. And there's nothing we can do
about it right now.
>> Yeah. There was one that
hit Siberia and destroyed
I think um a few hundred square miles.
>> Oh, that's the Tunguska event.
>> Yeah. That's the one from the 1920s,
right?
>> Yeah.
>> Yeah. That's the one that coincides with
that meteor, that uh comet storm that we
go through every June and every November
that they think is responsible for the
Younger Dryas.
>> Yeah. Yeah, all that shit's crazy. Um,
thank you before we go any further for
letting us have a tour of SpaceX and
letting us be there for the rocket
launch.
>> One of the absolute coolest things I've
ever seen in my life. And we thought it
was only... like I
thought it was half a mile. Jamie's
like, it was a mile away. Turned out it's
almost two miles away. And you feel it
in your chest.
>> Yeah. It's
>> you have to wear earplugs and you feel
it in your chest and it's 2 miles away.
>> It was [ __ ] amazing. And then to go
with you up into the command center and
to watch all the Starlink satellites
with all the different cameras and all
in real time as it made its way all the
way to Australia. How many minutes? Like
35 40 minutes.
>> Yeah.
>> Wild. It touched down in Australia.
>> Yeah.
>> [ __ ] crazy. It was amazing.
>> Yeah. Yeah.
>> Absolutely amazing. The Starship's
awesome. Um, and anyone can go watch the
launch actually. So, you can just go to
um, South Padre Island, which has a great
view of the launch. Um, it's like
where a lot of spring breakers go.
>> Um, but we'll be flying pretty
frequently um, out of Starbase in South
Texas. And we formally incorporated
it as a city. So, it's actually
an actual legal city, Starbase,
Texas.
Um, it's not that often you hear, like,
hey, we made a city, you know. Um, that
used to be, like, in the old
days, a startup would be, you go and
gather a bunch of people and say, "Hey,
let's go make a town." Literally, that
would have been a startup
in the old days.
>> Um,
>> or a country.
>> Yeah. Or a country.
>> Yeah.
>> Yeah. Yeah. Actually,
>> if you tried doing that today, there'd
be a real problem.
>> Yeah. Things are so much
set in stone on the country front these
days. You might pull it off. You might
be able to pull it off. If you got a
solid island, you might be able to pull
it off.
>> You know, it's probably,
>> you know, like Larry Ellison
owns Lanai.
>> Yeah, you could probably, if you put
enough effort into it, you could
make a new country.
>> This is one of the different ones. This
is one of the ones that you catch,
>> right? Or is that one?
>> Yeah, that's the booster. So that's
the Super Heavy booster. The booster's got 33
engines. Um
and by
version four that will have about 10,000
tons of thrust. Um you know right now
it's about 7 to 8,000 tons of thrust. Um,
that's the largest flying object
ever made.
>> I had to explain to someone. They were
going, "Why do they blow up all the time
if he's so smart?" Because there
was this [ __ ] idiot on
television. Some guy was being
interviewed and they were talking about
you. And he goes, "Oh, I think he's a
fuckwit." And he goes, "He's a fuckwit."
And he goes, "Why do you say he's [ __ ] Oh,
his rockets keep blowing up." And
someone said, "Yeah, well, why do his
rockets blow up?" And I had to explain.
Yeah. Because it's the only way you find
out what the tolerances are. You have to,
you have to... a few
>> corners of the box. So, like when
you do a new rocket development
program, um you have to do what's
called uh, you know, exploring the
limits, the corners of the box, where you
say, worst case this, worst
case that, um to figure out
where the limits are. So uh you blow it up.
You know, admittedly in the
development process it sometimes blows up
accidentally. Um but we also
intentionally subject it to uh
you know a flight regime that is much worse
than what we expect in normal flight, so
that when we put people on board or
valuable cargo it doesn't blow up. Um so,
for example, for the flight
that you saw, we actually deliberately
took um heat shield tiles off the
ship, off of Starship, in some
of the worst locations to say, okay, if we
lose a heat shield tile here, is it
catastrophic or is it not? Um and
nonetheless uh Starship was able to do a
soft landing um in the Indian
Ocean just uh west of Australia. Um
and it got there from Texas in
like, I don't know, 35, 40 minutes, type of
thing. So
>> So it landed even though you put it
through this situation where it has a
compromised heat shield.
It had an unusually... we
brought it in hot, like an extra hot
trajectory, uh with missing tiles, um to
see if it would still make it to a soft
landing, which it did. Now I should
point out, there were some
holes that were burnt into it. Um but
it was robust enough to land
despite having some holes burnt, you know,
because it's coming
in like a blazing meteor. You can
see the real-time video.
Well, tell me the speed again, because
the speed was bananas. You were
talking about
>> Yeah, it's like 17,000 mph,
like 25 times the speed of sound or
thereabouts. So, um, think of it like
it's like 12 times
faster than a bullet from an assault
rifle. You know, a bullet from an assault
rifle is around Mach 2,
>> and it's just and it's huge.
>> Yeah.
Yeah. Or if you compare it to a
bullet from a, you know, a .45 or 9
mil, which is subsonic, it'll
be about 30 times faster than a
bullet from a handgun.
>> 30 times faster than a bullet from a
handgun and it's the size of a
skyscraper.
>> Yes.
>> Yeah. That's fast.
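A rough back-of-the-envelope check of those speed comparisons (an illustrative sketch; the reference speeds below are assumed round numbers, not figures from the conversation):

```python
# Rough sanity check of the reentry speed comparisons above.
# All reference speeds are assumed round numbers, not exact figures.

reentry_mph = 17_000                           # quoted reentry speed
speed_of_sound_mph = 767                       # ~343 m/s at sea level
rifle_bullet_mph = 2 * speed_of_sound_mph      # "around Mach 2"
handgun_bullet_mph = 0.9 * speed_of_sound_mph  # assumed subsonic .45 / 9 mm round

print(f"Mach number:       ~{reentry_mph / speed_of_sound_mph:.0f}")
print(f"vs rifle bullet:   ~{reentry_mph / rifle_bullet_mph:.0f}x")
print(f"vs handgun bullet: ~{reentry_mph / handgun_bullet_mph:.0f}x")
```

These come out around Mach 22, about 11x a rifle bullet, and about 25x a handgun bullet, the same ballpark as the figures quoted above; the exact ratios depend on which reference speeds you assume.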
>> It's so wild. It's so wild to see, man.
It's so exciting. The
factory is so exciting too because, like,
genuinely, no [ __ ] I felt like I was
witnessing history. I felt like it was a
scene in a movie where someone had
expectations and they like what are they
doing? They're building rockets. And you
go there and as we were walking through
Jamie, you could speak to this too.
Didn't you have the feeling where you're
like
>> oh this is way bigger than I thought it
was. This is huge. Awesome.
>> Gigantic.
>> [ __ ] crazy.
>> That's what she said. The amount
of rockets you're making. I don't know
if you
>> tent bag.
>> Gigachad in the house.
>> This is way big.
>> It's a giant metal dick. You're [ __ ]
[ __ ] the universe with your giant
metal dick. That's
>> I mean, yeah, it is. It is very big.
>> And the sheer numbers of them that you
guys are making. And then this is a
version and you have a new updated
version that's coming soon.
>> And what is the It's a It's a little
longer. Um
>> more pointy.
>> Uh it's the same amount of pointy. Um
but it's got a bit more
length. Um the interstage, you see
that interstage section with kind
of like the grill area.
>> Mhm.
>> Um that's uh that's now integrated with
the boost stage. Um so uh we do um
what's called hot staging. Uh where we
light the ship engines while it's still
attached to the booster. So the
booster engines are still thrusting,
the ship is
still being pushed forward by the
booster. Uh but then we
light the ship engines and the ship
actually pulls away from the
booster even though the booster engines
are still firing.
>> Whoa.
>> Um so it's blasting flame through uh
that grill section, but we integrate
that grill section into uh the boost
stage with the next uh version of the
rocket. Um and uh the next version of the
rocket will have the Raptor 3 engines,
which are a huge improvement. Um you
may have seen them in the lobby,
because we got the Raptor 1, 2,
and 3. And you can see the dramatic
improvement in simplicity. Um we should
probably put a plaque there to also show
how much we reduced the weight and
the cost, and improved the
efficiency and the thrust. So the
Raptor 3 has, you know, almost twice
the thrust of Raptor 1.
>> Wow.
>> So you see Raptor 3. It looks
like it's got parts missing.
Right.
>> And how many...
>> It's very, very clean.
>> How many of them are on the rocket?
>> There's 33 on the booster.
>> Whoa. Um and each Raptor
engine is producing twice as much thrust
as all four engines on a 747.
Wow. So that engine is smaller than a
747 engine, but is producing, you know,
almost 10 times the thrust
of a 747 engine. Um so extremely high
power-to-weight ratio. Um
and um
>> and so when there's
>> 33 of them
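An illustrative tally of those thrust figures (the per-engine 747 thrust is an assumed typical value, not something stated in the conversation):

```python
# Illustrative arithmetic from the thrust comparisons quoted above.
# The per-engine 747 thrust is an assumed typical value (~63,000 lbf).

LBF_747_ENGINE = 63_000               # assumed thrust of one 747 engine, in pounds-force
raptor_lbf = 2 * 4 * LBF_747_ENGINE   # "twice as much thrust as all four engines on a 747"
booster_engines = 33                  # Super Heavy booster engine count

booster_total_tons = booster_engines * raptor_lbf / 2_000   # short tons of force

print(f"one Raptor:   ~{raptor_lbf:,} lbf "
      f"(~{raptor_lbf / LBF_747_ENGINE:.0f}x one 747 engine)")
print(f"full booster: ~{booster_total_tons:,.0f} tons of thrust")
```

With that assumed 747 figure, one Raptor works out to roughly 8x a single 747 engine and the 33-engine booster to roughly 8,000 tons of thrust, close to the "almost 10 times" per-engine claim and the 7 to 8,000 tons quoted earlier.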
>> So when you're designing
these, you get to Raptor 1, you see its
efficiency, you see where you can improve
it, you get to Raptor 2... how far
can you scale this up with just the same
sort of technology, with propellant and
ignition and engines? Like how much
further can you go?
>> I mean, we're pushing the limits of
physics here. Um so um,
really, in order to make a
fully reusable orbital rocket, which no
one has succeeded in doing uh yet,
including us. Um but uh
Starship is the first time that there is
a design for a rocket where full
and rapid reusability is actually
possible. There's not
even been a design before
where it was possible. Certainly not a
design that got made into any
hardware at all. Um we just
live on a planet uh where the
gravity is quite high. Like
Earth's gravity is really quite
high. Um and if the gravity was
even 10 or 20% uh higher, uh we'd be
stuck on Earth forever. Um like we
certainly couldn't use
conventional rockets. You'd have to
blow yourself off the surface with like
a nuclear bomb or something crazy. Um
but on the other hand, if Earth's
gravity was just a little lower, like
even 10 or 20% lower, then uh getting to
orbit would be easy. So it's like,
if this was a video
game, it's set to maximum difficulty,
but not impossible.
>> Okay.
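To make the gravity sensitivity concrete, here is a minimal sketch using the standard Tsiolkovsky rocket equation. All numbers are illustrative assumptions (a single stage, a methane/oxygen-class exhaust velocity, fixed ascent losses, an assumed structure fraction), not SpaceX figures:

```python
import math

# Minimal sketch of why surface gravity matters so much for chemical rockets.
# All numbers are illustrative assumptions, not SpaceX figures.

R_EARTH = 6_371e3        # m, planet radius (held fixed)
G_EARTH = 9.81           # m/s^2, Earth's surface gravity
V_EXHAUST = 3_300.0      # m/s, assumed methalox-class effective exhaust velocity
LOSSES = 1_500.0         # m/s, assumed gravity + drag losses on ascent
STRUCTURE = 0.05         # assumed dry structure as a fraction of liftoff mass

def payload_fraction(g: float) -> float:
    """Payload fraction to low orbit via the Tsiolkovsky rocket equation."""
    dv = math.sqrt(g * R_EARTH) + LOSSES     # circular orbital velocity + losses
    mass_ratio = math.exp(dv / V_EXHAUST)    # m_liftoff / m_final = e^(dv / ve)
    non_propellant = 1.0 / mass_ratio        # structure + payload fraction
    return non_propellant - STRUCTURE        # what is left over for payload

for scale in (0.8, 1.0, 1.2):
    p = payload_fraction(scale * G_EARTH)
    verdict = f"~{p:.1%} payload" if p > 0 else "no payload reaches orbit"
    print(f"gravity x{scale:.1f}: {verdict}")
```

With these made-up numbers, a planet with 20% higher gravity leaves nothing over for payload, Earth leaves under 1%, and 20% lower gravity roughly triples the margin, which is the "maximum difficulty but not impossible" point.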
>> Um so that's where we are
here. So it's not as though
others have ignored the concept of
reusability. They've just uh concluded
that it was too difficult to achieve.
And we've been working on this for
a long time at SpaceX. Um and um you
know, I'm the chief engineer of the
company. Um although I should say that
uh you know, we have an extremely
talented engineering team. I think we've
got the best uh rocket engineering team
that has ever been assembled. Um it's
an honor to work with such
incredible people. Um so it's
fair to say that, you know, we have not
yet succeeded in achieving
full reusability, but we at last have a
rocket uh where full reusability is
possible. Um and I think we'll
achieve it next year. So um
uh that's a really big deal.
And the reason that's
such a big deal is that full reusability
um drops the cost of access to space
by a hundred,
um maybe even more than 100 actually. So
could be like a thousand. You can
think of it like any mode of transport.
Like imagine if aircraft were not
reusable. Like you flew somewhere, you
throw the plane away. Like imagine if,
the way conventional
rockets work, it would be like if you
had an airplane and instead of
landing at your destination, you
parachute out, um the plane crashes
somewhere, and you land on a parachute at your
destination. Now that would be a very
expensive trip,
and you'd need another plane to
get back. Okay. Um, but that's how the
other rockets in the world work. Um, now
the SpaceX Falcon rocket is the only one
that is at least mostly
reusable. You've seen the
Falcon rocket, you know, land. We've now
done over 500 landings of the
Falcon 9 rocket. Um
and this year um you know we'll
deliver probably, I don't know,
somewhere between 2,200 and 2,500 tons to
orbit um with the Falcon 9 and
Falcon Heavy rockets, uh not counting
anything from Starship. Um
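A minimal sketch of the cost argument above: if the vehicle is thrown away each flight, you pay for a new one every time; if it flies many times, the hardware cost is amortized. The dollar figures and flight count below are placeholder assumptions, only the structure of the calculation reflects the point being made:

```python
# Toy amortization model for the expendable-vs-reusable cost argument above.
# Vehicle cost, consumables cost, and flight count are placeholder assumptions.

def cost_per_flight(vehicle_cost: float, flights: int, consumables: float) -> float:
    """Hardware cost spread over its flights, plus per-flight propellant and ops."""
    return vehicle_cost / flights + consumables

VEHICLE = 100e6      # assumed cost to build the vehicle
CONSUMABLES = 1e6    # assumed propellant + operations per flight

expendable = cost_per_flight(VEHICLE, 1, CONSUMABLES)
reused_100 = cost_per_flight(VEHICLE, 100, CONSUMABLES)

print(f"thrown away each flight: ${expendable / 1e6:.0f}M per flight")
print(f"reused 100 times:        ${reused_100 / 1e6:.0f}M per flight "
      f"(~{expendable / reused_100:.0f}x cheaper)")
```

With these placeholders the reusable case is already ~50x cheaper per flight; as flight counts rise and per-flight costs fall, the ratio moves toward the 100x to 1,000x range mentioned above.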
>> and this is mostly Starlink.
>> Yes, mostly
Starlink, but we launch uh many others. We
even launch our competitors, um
competitors to Starlink, on Falcon 9.
We charge them the same price. Pretty
fair. Um but uh SpaceX this year will
deliver um roughly 90% of all Earth mass
to orbit.
>> Wow.
>> Um and then of the remaining 10%, um most
of that is done by China. And then
the remaining roughly 4% is
uh everyone else in the world, including
our American domestic competitors.
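Turning those percentages into rough tonnage, a back-of-the-envelope sketch using only the round figures quoted above:

```python
# Back-of-the-envelope split of this year's mass to orbit,
# using only the round figures quoted above.

spacex_tons = (2_200 + 2_500) / 2    # midpoint of the quoted Falcon tonnage range
spacex_share = 0.90                  # "roughly 90% of all Earth mass to orbit"
everyone_else_share = 0.04           # "remaining roughly 4% is everyone else"
china_share = 1 - spacex_share - everyone_else_share

world_total = spacex_tons / spacex_share

print(f"world total:   ~{world_total:,.0f} tons to orbit")
print(f"China:         ~{world_total * china_share:,.0f} tons")
print(f"everyone else: ~{world_total * everyone_else_share:,.0f} tons")
```

That implies a world total on the order of 2,600 tons, with China around 150 tons and everyone else around 100 tons, rough figures that follow directly from the quoted percentages.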
>> You know um it's kind of incredible how
many things are in space like how many
things are floating above us now?
>> There's a lot of things.
>> Yeah.
>> Is there though?
>> Right.
But is there a saturation point where
we're going to have problems with all
these different satellites that are
>> um I think as long as the satellites are
um maintained, uh it'll be
fine. Space is very roomy. Um you
can think of um space as
being concentric shells above the surface
of the earth. So, um, you know,
it's like the surface of the earth,
but it's a series of shells, each
>> much larger.
>> Yeah. It's like a series of concentric
shells. Um,
>> and think of an Airstream trailer flying
around up there. There's a lot of room
for Airstreams.
>> Yeah. I mean, imagine if there were
just a few thousand Airstreams um
on Earth.
>> Yeah.
>> What are the odds that they'd hit each
other? You know,
>> they wouldn't be very crowded. No. And
then you got to go bigger.
>> Yeah.
>> Because you're dealing with far above
Earth.
>> Hundreds of miles above Earth.
>> Yeah. Yeah. Yeah.
>> Yeah. So, it's the um but the goal of
SpaceX is to get rocket technology to
the point where we can extend life
beyond Earth and that we can establish a
self-sustaining city on Mars. Uh a
permanent base on the moon. That would
be very cool. I mean, imagine if we had
like a, you know, moon base alpha where
there's like a permanent science base on
the moon.
>> That would be pretty dope. Or at least a
tourist trap.
>> I mean, a lot of people would be willing to go
to the moon just for a tour. That's
for sure. We could probably pay for our
space program with that, you know,
>> probably. Yeah. Well,
>> because it's like, if you could
go to the moon safely,
>> uh
I think we'd get a lot of people who
would pay for that, you know.
>> Oh, 100%. After the first year, after
nobody died for like
>> Yeah. Just make sure. Exactly. Are you
going to come back? Yeah.
>> Because like that submarine, they they
had a bunch of successful launches in
that private submarine before it
imploded and killed everybody. That was
not a good design. Obviously,
>> it was a very bad design. Terrible
design.
>> And the engineers said it would not
withstand the pressure of those depths.
Like there was a lot of whistleblowers
in that company too.
>> Yeah. Um they made that out of
uh carbon fiber, which doesn't make
any sense, because um you actually
need to be dense to go down. Um in
any case, just make it out of steel. If
you make it out of uh sort of just, you
know, a big steel casting, you'll
be safe and nothing will happen.
>> Why would they make it out of carbon fiber
then? Is it cheaper?
>> Um I think they think carbon fiber
sounds cool or something. But uh
>> it does sound cool.
>> It sounds cool, but um because it's
such low density, you actually
have to add extra mass to go
down, because it's low density. But
if you just have a giant, you know,
hollow ball bearing, uh you're going to
be fine.
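A minimal buoyancy sketch of the density point above: a hull sinks on its own only if its average density (hull material plus the air inside) exceeds seawater's, so a low-density hull needs extra ballast to dive. The hull geometry and wall thickness below are made-up example values, not the actual submersible's dimensions:

```python
import math

# Toy buoyancy check: does a hollow spherical hull sink on its own?
# Geometry and material densities are illustrative assumptions, not real specs.

SEAWATER = 1_025.0   # kg/m^3

def average_density(outer_r: float, wall: float, material_density: float) -> float:
    """Average density of a hollow sphere (air inside treated as massless)."""
    outer_v = 4 / 3 * math.pi * outer_r ** 3
    inner_v = 4 / 3 * math.pi * (outer_r - wall) ** 3
    hull_mass = (outer_v - inner_v) * material_density
    return hull_mass / outer_v

for name, rho in [("steel", 7_850.0), ("carbon fiber composite", 1_600.0)]:
    avg = average_density(outer_r=1.5, wall=0.12, material_density=rho)
    verdict = "sinks" if avg > SEAWATER else "floats (needs added ballast to dive)"
    print(f"{name:24s} avg density ~{avg:5.0f} kg/m^3 -> {verdict}")
```

With this example geometry the steel hull averages well above seawater density and sinks, while the carbon fiber hull averages far below it and needs added mass to go down, which is the point being made.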
>> Speaking of carbon fiber, did you check
out my unplugged Tesla out there?
>> Yeah, it's cool.
>> Pretty sick, right? Yeah. Have you guys
ever thought about doing something like
that? like having like an AMG division
of Tesla where you do like custom stuff.
>> Um
I think it's best to leave that to the
custom shops. Uh you know, Tesla's
focus is autonomous cars,
um, you know, building kind of futuristic
autonomous cars. Um
so
um,
like, I think we want the future to
look like the future. Um, so, did
you see our designs for
the robotic bus? It
looks pretty cool.
>> The robotic bus is also totally
autonomous,
but it looks cool. It's
very art deco. It's like
futuristic art deco. Um, and um,
I think we want to
change the aesthetic over time. You
don't want the aesthetic to be constant
over time. You want to evolve the
aesthetic. Um, so um, you know, like I
have a son who's, you
know, he's even more
autistic than me, and, uh,
he has these great observations.
>> Who is this?
>> Saxon. He has these great observations
about the world, uh because he
views the world through a different lens
than most people. Um and he was
like, "Dad, why does the world look like
it's 2015?"
>> And I'm like, "Damn, the world does look
like it's 2015." Like the aesthetic has
not evolved since 2015.
>> Oh, that's what it looks like.
>> Yeah.
>> Oh, wow.
>> That's pretty cool.
>> Oh, yeah. That's like
>> like You'd want to see that going down
the road, you know?
>> Yeah. You'd be like, "Okay, this is
we're in the future." You know, it
doesn't look like 2015.
>> What is that ancient science fiction
movie? Like one of the first science
fiction movies ever. Is it Metropolis?
Is that what it is?
>> Yeah. Yeah.
>> Yeah. That looks like it belongs in
Metropolis.
>> Yeah. Yeah. It's a futuristic art deco.
>> All right. Yeah. Well, that's cool that
you're concentrating on the aesthetic. I
mean, that's kind of the whole deal with
Cybertruck, right? Like, it didn't have
to look like that.
>> No, I just wanted to have
something that looked really different.
>> Is it a pain in the ass for people to
get it insured, because it's all solid
steel and...
>> Um, I hope it's not too much. You know,
Tesla does offer insurance, so people can
always get it insured at Tesla.
>> Um well, but the form does
follow function in the case of the
Cybertruck, because um, as you
demonstrated with your
armor-piercing arrow, um if you
shot that arrow at a regular truck, I
mean
>> exactly, you would have found your
arrow in the wall. Yeah. Um, you know,
at the very least it would have buried
into one of the seats.
>> Yeah. Yeah. It's... but like you could
definitely get enough
bow velocity, and the right
arrow would go through both doors of a
regular truck and land in the
wall.
>> If there was a clear shot between both
doors, it probably would have passed
right through.
>> Exactly. Um but you know the
arrow shattered on the Cybertruck cuz
it's ultra hard uh stainless. Mhm.
>> Um, so, um, and I thought it'd be
cool to have a, you
know, a truck that is bulletproof to a
subsonic projectile. Um, so, um, you
know, especially in this day and age,
you know, if the
apocalypse happens, you're going to want
to have a bulletproof truck, you know.
Um, so then, because it's made
of ultra hard stainless, you can't
just stamp the panels. You can't
just put it in a stamping press, because it
breaks the press.
So, in order to actually... so it has
to be planar,
um because it's so difficult to bend it,
because it breaks the machine that bends
it. Um that's why it's
so planar and not, uh,
you know, curved. It's because it's
bulletproof steel.
>> So it is like boxy as opposed to like
curved and
>> Yeah. In order to make
the curved shapes, you
take uh basically mild
steel, like um thin, annealed
steel, in a regular truck or car.
You take mild, thin, annealed
steel, you put it in a stamping press,
and it just smooshes it and
makes it whatever
shape you want. But the Cybertruck is
made of ultra hard stainless. Um
and so you can't stamp it
because it would break the stamping
press.
So even bending it is hard. So even
to bend it to uh its current position,
we have to way overbend it, um
so that when it springs back,
it's in the right position.
Um, so it's uh, I don't know, I
think it's a unique aesthetic. Um, and
you say, "Well, what's cool about a
truck?" Trucks should
be, I don't know, manly. They
should be macho, you know, and
bulletproof is maximum macho.
>> Are you married to that shape now? Like
is it can you do anything to change it?
Like as you get further like I know you
guys updated the three and the Y. Did
you update the Y as well?
>> Yes, the 3 and the Y uh are
updated. Um you know, there's a
screen in the back
that the kids can watch, for
example, in the new 3 and Y. Um uh so in
the new Y, um there's, you know,
there's like
hundreds of improvements. Like we keep
improving the car. Um and even the
Cybertruck, you know, we keep
improving it. Um but um,
you know, I wanted to just do something
that looked unique, and the
Cybertruck looks unique and has unique
functionality. And there were
three things, as I recall:
let's make it bulletproof, uh let's
make it faster than a Porsche 911,
uh, and we actually cleared the quarter
mile. The Cybertruck can
clear a quarter mile while towing a
Porsche 911 faster than a Porsche 911.
Um, it can out tow an F350 diesel.
>> Really?
>> Yes.
>> What is the tow limitations?
>> I mean, we could tow like a, you know, a
747 with a Cybertruck. The Cybertruck is
insane. Like, it is alien
technology. Okay. Um cuz it shouldn't
be possible to be uh that big and that
fast. It's like an
elephant that runs like a cheetah.
>> Yeah. Because it's 0 to 60 in less than
3 seconds, right?
>> Yes.
>> Yeah. And it's enormous. What does it
weigh? Like 7,000 lbs.
>> Uh yeah, there's different
configurations, but it's about that.
Uh it's a beast.
>> Yeah.
>> Um, and it's got
four-wheel steering. So the rear
wheels steer, too. So it's got
a very tight turning radius.
>> Yeah. We noticed that we when we drove
one to Star Base.
>> Yeah. Very tight turning radius.
>> Yeah. Pretty sick.
>> Yeah.
>> Are you still doing the Roadster?
>> Yes. Eventually,
>> we're getting close to
demonstrating the prototype
>> and I think this will be
One thing I can guarantee is
that this product demo will be
unforgettable.
Unforgettable.
>> How so?
Whether it's good or bad,
it will be unforgettable.
Um,
>> can you say more? What do you mean?
>> Well, you know, my friend Peter Thiel,
um, you know, uh, once reflected that
the future was supposed to have flying
cars, but we don't have flying cars.
>> So, you're going to be able to fly?
Well, I mean,
uh,
I think if Peter wants a flying car, he
should be able to buy one.
>> So, you are you actively considering
making an electric flying car? Is this
like a real thing?
>> Well, we have to see in the
>> in the demo. So, when you do this, like
are are you going to have a retractable
wing? Like, what is the idea behind
this?
Don't be sly. Come on.
>> I can't uh do the unveil
before the unveil. Um but um
>> tell me off air then.
>> Look, I think it has a shot at
being the most memorable
um product unveil
ever.
It has a shot.
>> And when do you plan on doing this?
What's the goal?
>> Uh hopefully before the end of the year.
>> Really?
>> Before the end of this year.
>> This is I mean we're in a couple months.
>> Hopefully in a couple months. Um
you know we need to make sure that it
works. Uh
like this is some crazy crazy technology
we got in this car. Crazy
technology. Crazy crazy.
So different than what was previously
announced and
>> Yes.
>> And is that why you haven't released it
yet? Cuz you keep [ __ ] with it.
>> It has crazy technology.
>> Okay.
>> Like, is it even a car? I'm not sure.
It's like...
it looks like a car.
Let's just put it this way. It's
crazier than anything James Bond. If you
took all the James Bond cars and
combined them, it's crazier than that.
>> Very exciting.
>> I don't know what to think of that.
>> I don't know.
>> It's a limited amount of information I'm
drawing from here.
>> Jamie's very suspicious over there. Look
at him.
>> Excited.
>> I'm interested.
>> It's still going to be the same.
>> Well, you know what? I mean, if you
want to come a little
before the uh the unveil, I can show it
to you. 100%. Yeah, let's go.
>> Yeah. Um
it's uh it's kind of crazy all the
different things that you're involved in
simultaneously and you know we talked
about this before your time management
but I I really don't understand it. I
don't understand how you can be paying
attention to all these different things
simultaneously. Starlink, SpaceX, Tesla,
Boring Company, X. You [ __ ]
tweet, or post rather, all day
long.
>> Well, it's more like I
could hop in for like two minutes and
then hop out, you know.
>> But I mean, just the fact that you could
do
>> bathroom break or whatever, you know,
>> I can't do that.
>> Um
>> if I hop in, I start scrolling and I
start looking around. Next thing you
know, I've lost an hour.
>> Yeah.
>> Um
so, no, for me it's a couple
minutes at a time usually.
Sometimes I guess it's half an hour, but
usually I'm in for a few minutes,
then out, you know, after posting
something on X. Uh, you know, I do
sometimes feel like it's like
that meme of the guy who
drops the grenade and leaves the room.
That's been me more than once on X.
>> Yeah. Oh, yeah. Yeah, for sure. Um, it's
got to be fun, though. It's got to be
fun to know that you essentially
disrupted the entire social media chain
of command because there was a there was
a very clear thing that was going on
with social media. The government had
infiltrated it. They were censoring
speech
>> and until you bought it, we really
didn't know the extent of it. We kind of
assumed that there was something going
on.
>> Yeah. We had no idea that they were
actively involved in censoring actual
real news stories, real data, real
scientists, real professors silenced,
expelled, kicked off the platform.
>> Yeah.
>> Wild.
>> Yeah.
>> Yeah.
>> For telling the truth.
>> For telling the truth. And I'm sure
you've also seen, because I sent it to you,
that chart that shows uh young kids,
teenagers identifying as trans and
non-binary, literally stops dead when you
bought Twitter and starts falling off a
cliff, when people are allowed to have
rational discussions now and actually
talk about it.
>> Yes.
>> Yeah.
>> Um Yeah. Yeah, I mean I said at the
time, like, I think that the
reason for acquiring Twitter is
because um it was
causing destruction at a civilizational
level. Um I mean I
tweeted on Twitter at the time that
um it is, um,
you know, it's
uh Wormtongue for the world,
um you know, like Wormtongue from Lord
of the Rings, uh where he would just sort
of whisper these, you know, terrible
things to the king, so the king would
believe these things that weren't true.
Um and unfortunately uh Twitter
really got... like the woke
mob essentially controlled Twitter,
um and they were pushing uh a nihilistic,
anti-civilizational mind virus to the
world. Um, and you can see the results
of that mind virus on the streets of San
Francisco, uh, where, you know, downtown
San Francisco looks like a zombie
apocalypse. Um, you know, it's bad.
Um, so we don't want the whole world to
be a zombie apocalypse. Um, but
that was essentially it: they
were pushing this very negative,
nihilistic, untrue worldview
on the world, and it was causing a lot of
damage.
Um,
so
>> The stunning thing about it is how few
people course corrected. A bunch of
people woke up and realized what was
going on. People that were all on board
with, like, woke ideology in maybe 2015 or
16, and then eventually it comes
to affect them, or they see it in their
workplace, and they're like,
"Whoa, whoa, whoa, we got to stop this."
A bunch of people did, but a lot of people
never course corrected.
>> Yeah. Um,
a lot of people didn't course
correct, but um, it's gone
directionally... it's
directionally correct. Like you mentioned,
the massive spike in
kids identifying as trans and then that
spike dropping um after the
Twitter acquisition. I think that um
simply allowing the truth to be told,
um, just shedding sunlight, is
the best disinfectant, as they say, and
just allowing sunlight um kills the
virus.
>> and it also changed the benchmark for
all the other platforms. Yes, you can't
just openly censor people on all the
other platforms when X is available. So
everybody else had to... so like Facebook
announced they were changing, YouTube
announced they were changing their
policies, and they were kind of forced to.
And then Bluesky doubled down.
>> Well, like the problem is, like, if...
>> uh essentially the woke mind virus
retreated to Bluesky.
>> Yeah.
>> Um but it's where they're just a
self-reinforcing lunatic asylum.
>> They're all just triple masked. I
was totally watching this exchange on
Bluesky where someone said that
they were just trying to be zen about
something, and then a moderator
immediately chimed in with, why don't you
try to stop being racist against Asians
by saying something zen, by saying, I'm
trying to be zen about something. They
were accusing that person of being
racist towards Asians.
>> Yeah. It's just
everyone's a hall monitor over there.
The worst hall monitor. A virgin, like,
incel.
>> They're all hall monitors trying to rat
on each other.
>> Yeah, it's fascinating. And then people
say, "I'm leaving for Bluesky," like
Stephen King. And then a couple weeks
later, he's back on X. Just like, "Fuck
it. There's no one over there. It's
all a bunch of crazy people. You can
only stay in the asylum for so long.
Like, all right, this is not good."
They all bail.
>> Yeah. Yeah.
>> Threads is kind of like that, too.
Threads is
>> I've been on Threads. Is it?
>> Well, what happens is, if you go on Instagram,
every now and then something
really stupid will pop up from Threads,
like, what the [ __ ], and it shows it to
you on Instagram, and then I'll click on
that and then I'll go to Threads and
it's like
>> you see posts with like 25 likes, like
famous people with like 50. It's way
down.
>> But the people that post on there,
they're finding that there's very little
pushback from insane ideology, so they
go there and they spit out nonsense and
very few people jump in to argue with you.
>> Yeah. Um,
>> very weird, very weird place.
>> I mean, I can generally get the vibe of
like what's taking off by seeing what's
showing up on X cuz that's the public
town square still. Um,
>> and uh, or, uh, you know, what links
show up in group texts, you know. If I'm
in a group chat with friends, like,
what links are showing up?
>> That's what I try to do now. Only get
stuff that shows up in my group text,
because that keeps me productive. So, I
only check if someone's like, "Dude,
what the fuck?" Like, "All right, what
the [ __ ] Let me check it out."
>> If there's something that's crazy enough,
it'll end up in the group
chat,
>> but there's always something. That's
what's nuts. There's always some new law
that's passed, some new insane thing
that California is doing. And it's like
like a giant chunk of it's happening in
California. The most preposterous things
that I get.
>> Yeah.
>> And then you got Gavin Newsom, who's
running around saying we all have
California derangement syndrome. He's
just ripping off Trump derangement
and calling it California derangement. I
was like, "No, no, no, no. How
many corporations have
left California?"
>> It's crazy.
>> Hundreds. Hundreds,
>> right? Hundreds.
>> That's not good.
>> I mean, not Chick-fil-A. I mean,
uh I think In-N-Out left.
>> Yeah. In-N-Out left. They moved to
Tennessee.
>> Yeah.
>> Yeah. They're like, "We can't do this
anymore."
>> Right. And
>> it's the California company for food.
It's like the greatest hamburger place
ever.
>> It's awesome.
>> Yeah.
>> Yeah. And, no, actually, speaking of, like,
just sort of open source and
looking at things openly, I just
like going into In-N-Out and seeing them
make the burger.
>> Yeah. It's right there.
>> They chop the onions and they they you
know it's you just see everything
getting made in front of you.
>> Yeah.
>> It's great.
>> Um, but yeah, like, how
many wake-up calls do you need to say
that there needs to be reform in
California, you know?
>> Well, the crazy thing that Newsom does
is, whenever someone brings up the
problems in California, he starts
rattling off all the positives: the most
Fortune 500 companies, highest
education. But yeah, that was all
already there right before you were
governor.
>> But but how many Fortune 500 companies
have left California?
>> And then you guys spent $24 billion on
the homeless and it got way worse.
>> Yes. Like the homeless population
doubled or something. But like,
people don't understand the
homeless thing, because it sort of
preys on people's empathy, and I think
we should have empathy. Um and we should
try to help people. Um but the
uh homeless industrial complex is
really... it's dark, man. Um that
network of
NGOs should be called, like, the drug
zombie farmers. Um because the
more homeless people... and
really, like, when you meet, like,
you know, somebody who's totally
dead inside, shuffling along down the
street with a needle
dangling out of their leg, homeless is
the wrong word. Like, homeless
implies that somebody got a little
behind in their mortgage payments and if
they just got a job offer, they'd be
back on their feet. But someone who's...
I mean, you see these videos of people
that are just shuffling, you know,
they're on fentanyl. They're they're
like,
>> you know, taking a dump in the middle of
the street, you know, and they they got
like open sores and stuff.
>> They're not like one job offer away
from getting back on their feet,
>> right? This is not a homeless issue.
>> Homeless is a propaganda word,
>> right? Um so and then,
you know, these sort of charities,
these NGOs, they get money
proportionate to the number of homeless
people, or number of drug zombies.
>> So their incentive structure is to
maximize the number of drug zombies, not
minimize it.
>> Um that's why they don't arrest the drug
dealers
>> because if they arrest the drug dealers
the drug zombies leave.
So they know who the drug dealers are.
They don't arrest them on purpose, uh
because otherwise the drug zombies would
leave, and they would stop
getting money from the state of
California and from all the
charities.
>> Wait a minute. So they... is
that real? So they're in coordination
with law enforcement on this?
>> Yeah.
>> So how do they have those
meetings?
>> They're all in cahoots.
>> Well, when you find this...
>> It's such a... this is a
diabolical scam. Um so uh San
Francisco has got this tax, this
gross receipts tax, uh which um
is not even on revenue, it's on all
transactions, which is why Stripe um and
Square and a whole bunch of
financial companies had to move out of
San Francisco. Because it wasn't a tax on
revenue, it's a tax on transactions. So
if you did, like, you know, trillions
of dollars of transactions, it's not
revenue. You're taxed on any money going
through the system in San Francisco. Um
so, like, Jack Dorsey pointed this out,
and they said that they had to
move Square from San Francisco to uh
Oakland, I think. Uh Stripe had to move
from San Francisco to South San
Francisco, a different city. Um and that
money, uh that tax that was passed, goes
to the homeless industrial complex.
Um so there's billions of dollars,
as you pointed out, billions of
dollars every year, that go to uh these
um non-governmental organizations that
are funded by the state. And
it's not clear how to turn this off. Um
it's a self-licking ice cream cone
situation. Um so uh they get this money,
the money is proportionate to the number
of homeless people, or number of
drug zombies, essentially. Um, so they
try to keep that number up, they try
to actually increase it. Because, like,
in some cases,
somebody did an analysis, when you
add up all the money that's flowing,
they're getting close to a million
dollars per homeless person, per drug
zombie. It's like $900,000 or something,
some crazy amount of money that is
going to these organizations. So
they want to keep people just
barely alive. They need to keep them in
the area so they get the
revenue. Uh so, like I
said, they don't arrest the drug dealers,
because otherwise the drug zombies would
leave. Um but they don't want them to have too
much... if they get too much drugs,
then they die. So they're kept
in this sort of perpetual
zone of being addicted, but
just barely alive.
>> So how is this coordinated with, like, DAs,
DAs that don't prosecute people?
>> So they fund the campaigns of the most
progressive, most out-there left-wing
DAs, and they get them into office.
>> We've got that issue in Austin, too, by
the way.
>> You see that guy that got shot in the
library?
>> No.
>> Yeah, I heard a guy got shot and killed
in the library.
>> I think that was just like last week or
something,
>> right?
>> Um so some friends of mine were telling
me that the library is unsafe.
Like, they took their kids to the library
and there were, like, dangerous people
in the library in Austin, and I was like,
dangerous people in the library? Like,
that's strange. It basically got,
like, uh, drug zombies
in the library.
>> Oh Jesus.
>> Um
>> and that's when someone got shot.
>> Yeah, I believe this should be on
the news. You might be able to
pull it up. Um, but I think it was just
in the last week or so that uh there
was a shooting in the library in Austin.
Um, cuz Austin's got... you know, it's
the most liberal part of Texas that
we're in right here. Um
>> So: the suspect involved in the shooting at the Austin
Park Library Saturday is accused of
another shooting at a Cap Metro bus
earlier that day. According to an arrest
warrant affidavit, Austin police
arrested Harold Newton Keen, 55,
shortly after the shooting at the
library, which occurred around noon. One
person sustained non-life-threatening
injuries in the event. Before that
shooting, Keen was accused of shooting
another person in a bus incident,
after reportedly pointing his gun at a
child. So, this is the fella down here.
>> So, like, we just seriously have a problem here. Um,
>> yeah,
>> you know, so I think one of the people might have died too, that he shot. Um, like, one of the people I think did bleed out. Um
>> But either way, it's like, getting shot is still bad. Um, it says the victim told police he confronted the suspect, who started to eat what appeared to be crystal methamphetamine. According to the affidavit, the victim advised the suspect began to trip out, at which time the victim exited the bus. The victim told the bus driver to hit the panic button and then exited the bus. When he turned around, he observed the Black male now standing at the front of the bus with the gun pointed at him. The victim advised the Black male fired a single round, which grazed his left hip. So he shot at that dude, and then another dude got shot in the library. Fun.
>> Yeah. I mean in the library.
>> Yeah.
>> You know, where you're supposed to be
reading books. Um and there's a
children's section in the library and
says he pointed his gun at a at a kid. I
mean like we do have a serious issue in
the in in in America where um repeat
violent offenders need to be
incarcerated,
>> right? Um, and uh, you know, you got you
got cases where somebody's been arrested
like 47 times, right? Like literally.
Okay, that's just the number of times
they were arrested, not the number of
times they did things. Like most of the
times they do things, they're not
arrested. Um,
>> so lay this out for people so they
understand how this happens.
>> Yeah. And the key is this: it preys on people's empathy. Like, if you're a good person, you want good things to happen in the world. You're like, well, we should take care of people who, you know, are down on their luck or having a hard time in life. And I agree, we should. But what we shouldn't do is put people who are violent drug zombies in public places where they can hurt other people. Um, and that is what we're doing, what we just saw, where a guy, you know, got shot in the library, and even before that, he shot another guy and pointed his gun at a kid. Um, that guy probably has, like, many prior arrests. Um, you know, there was that guy that knifed the Ukrainian woman, Iryna.
>> Yes.
>> Um, yeah. And, you know, she was just quietly on her phone, and he just came up and, you know, gutted her, basically.
>> Wasn't there a crazy story about the
judge who was involved who had
previously
dealt with this person was also invested
in a rehabilitation center and was
sending these
>> conflict of interest.
>> Yes. So sending people that they were
charging Yeah. to a rehabilitation
center instead of putting them in jail,
profiting from this rehabilitation
center, letting them back out on the
street. Yes. Violent, insane people.
>> And in that case, I believe that judge has no law degree or significant legal experience that would allow them to be a judge. They were just made a judge. There's, like,
>> you could be a judge without a law
degree.
>> Yeah.
>> Wow.
>> Yeah.
>> You could just be a So I could be a
judge.
>> Yeah. Oh,
exciting.
>> Anyone?
>> That's crazy. I thought you'd have to
It's like if you want to be a doctor,
you have to go to medical school. I
thought if you're going to be a judge,
>> If you're going to be appointed as a judge, you have to have proven that you have an excellent knowledge of the law and that you will make your decisions according to the law. That's what we assume it should be.
>> That's how you get the robe,
>> right?
>> You don't get the robe unless you do,
>> you know,
>> got to go to school to get the robe.
>> You got to know what the law is,
>> right? And then you're going to need to
make decisions in accordance with the
law
>> based on stuff that you already know cuz
you read it cuz you went to school for
it. Yes. Not you just got appointed.
>> Got vibes.
You can't be just vibing as a judge.
>> Vibing as a left-wing judge. So you got crazy left-wing DAs.
>> Yes.
>> Like I should say leftwing cuz leftwing
>> used to be normal.
>> Yeah. Left wing just meant like like
Yeah. You're like the left used to be
like pro pro- free speech. Yeah. And now
they're against it.
>> It used to be like pro gay rights, pro women's right to choose, pro-minorities, pro, you know,
>> like, yeah, like 20 years ago, I don't
know, it it used to be like left would
be like the the the party of empathy or
like, you know, caring and being nice
and that kind of thing.
>> Um, not not the party of like crushing
dissent and crushing free speech. um and
uh you know crazy regulation uh and and
just um and being super judgy u and
calling everyone a Nazi um you know um
like I think they called you and me
Nazis you know
>> oh yeah I'm a Nazi
>> I no I have friends that are comedians
that called you a Nazi and I got pissed
off Oh yeah yeah yeah definitely a Nazi
no, because you did that thing at the, my heart goes out to you. Everyone, everyone, all of them. Literally, Tim Walz, Kamala Harris, every one of them did it. They all did it.
>> Like, like h how do you point at the
crowd? Yeah. How do you wave at the
crowd?
>> Do you know CNN was using a photo of me, whenever I got in trouble during COVID, from the UFC weigh-ins? At the UFC weigh-ins, I go, "Hey everybody, welcome to the weigh-ins." And so they were getting me from the side. And that was the photo that they used. Conspiracy theorist podcaster Joe Rogan. Like that's what they used.
>> Yeah. Yeah. But that's what the left is
today. It's super judgy and calling
everyone a Nazi and trying to suppress
freedom of speech.
>> Yeah. And eventually you run out of
people to accuse because people get
pissed off and they leave.
>> Yeah. Everyone it's like it like it it
no longer frankly it doesn't matter to
be called racist or Nazi or whatever
because
>> still recording.
>> It's the government man.
>> Is it working?
>> We're good. Okay.
>> Okay.
>> This thing working.
>> Yeah. Slight issue.
>> I'm the one that heard it. But
>> yeah. when you uh when you text people,
do you are you like keenly aware that
there's a high likelihood that someone's
reading your texts?
>> Um I guess I I guess I
>> I assume
>> I look if if if if intelligence agencies
aren't trying to read my phone, they
should probably be fired.
>> At least they get some fun memes.
I got to I got to crack them up once in
a while, you know.
>> Oh, for sure. I crack them up.
>> There's like, "Hey guys, check it out.
We've got a banger here, you know."
>> So, I want to I wanted to talk to you
about uh whether or not encrypted apps
are really secure.
>> Uh, no.
>> Right. Cuz I know the Tucker thing. So,
it was explained to me by a friend who
used to do this, used to work for the
government. It's like they can look at
your signal, but what they have to do is
take the information that's encrypted
and then they have to decrypt it and
it's very expensive. So he told me that for the Tucker Carlson thing, when they found out that he was going to interview Putin, it cost something like $750,000 just to decrypt his messages to find that out. So it is possible to do. It's just not that easy to do.
I think you should view any given messaging system as, um, not whether it's secure or not, but there are degrees of insecurity.
So there are just some things that are less insecure than others. Um, so, you know, on X we just rebuilt the entire messaging stack into what's called XChat.
>> Yeah, that's what I wanted to ask you
about.
>> Yeah, it's cool. Um, so it's using sort of a peer-to-peer based encryption system, so kind of similar to Bitcoin. Um, so it's, I think, very good encryption, and, you know, we're testing it thoroughly. There's no hooks in the X systems for advertising. So if you look at something like WhatsApp, or really any of the others, they've got hooks in there for advertising.
>> When you say hooks, what do you mean by
that?
>> Uh, exactly, what do I mean by hooks for advertising? Um, so, like, WhatsApp knows enough about what you're texting to know what ads to show you.
>> Ah
>> but then like that that's a massive
security vulnerability.
>> Yeah.
>> Um, because if it's got enough information to show you ads, that's a lot of information.
>> Yeah.
>> Um, so they call it, oh, don't worry about it, it's just a hook for advertising. I'm like, uh, okay. So somebody can just use that same hook to get in there and look at your messages. Um, so XChat has no hooks for advertising. Um, and I'm not saying it's perfect. But our goal with XChat is to replace what used to be the Twitter DM stack with a fully encrypted system where you can text, send files, do audio and video calls, um, and it's, you know, I think it'll be, I would call it, the least insecure of any messaging system.
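To make the "no hooks" point concrete, here is a minimal end-to-end encryption sketch in Python using the widely available cryptography package. This is not XChat's actual protocol, just a generic illustration of the property being described: when only the two endpoints hold the key, the relay in the middle has nothing readable to hand to an ad system.

```python
# Generic end-to-end encryption sketch (not X's implementation): an X25519 key
# exchange derives a shared key, and ChaCha20-Poly1305 encrypts the message, so
# the server relaying the ciphertext cannot read or scan its contents.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a keypair; only public keys ever leave the device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides compute the same shared secret from their own private key
# and the other side's public key.
shared_alice = alice_private.exchange(bob_private.public_key())
shared_bob = bob_private.exchange(alice_private.public_key())
assert shared_alice == shared_bob

# Derive a symmetric message key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"demo-chat").derive(shared_alice)

nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"meet at 8", None)

# The relay only ever sees (nonce, ciphertext); without the key there is
# nothing for an "advertising hook" to inspect.
assert ChaCha20Poly1305(key).decrypt(nonce, ciphertext, None) == b"meet at 8"
```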
>> Are you going to launch it as a
standalone app or is it will always be
incorporated to X?
>> Uh we'll have both. So um
>> So it'd be like Signal, so anybody can get it?
>> You'll be able to just get the XChat app by itself, um, and like I said, you could do texts, audio and video calls, or send files. Um, and there'll be a dedicated app, which will hopefully release in a few months, and then it's also integrated into the X system.
>> Um, the X phone. People keep talking. Is that
>> I have a lot on my plate, man.
>> But it keeps coming up. It keeps coming up. I know I've asked you a couple times. I'm like, "This is [ __ ] right?" But, like, this one, so you're not working on
>> I'm not working on a phone.
>> Okay.
>> Um
>> have you ever considered it? Has it ever
popped into your head?
>> Cuz you might be the only person that
could get people off of the Apple
platform.
>> Well, I can tell you where I think things are going to go, which is that we're not going to have a phone in the traditional sense. What we call a phone will really be an edge node for AI inference, for AI video inference, with some radios to obviously connect, but essentially you'll have an AI on the server side communicating to an AI on your device, you know, formerly known as a phone, and generating real-time video of anything that you could possibly want. Um, and I think that there won't be operating systems. There won't be apps in the future. There won't be operating systems or apps. It'll just be, you've got a device that is there for the screen and audio, and to put as much AI on the device as possible, so as to minimize the amount of bandwidth that's needed between your edge node device, formerly known as a phone, and the servers.
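A toy sketch of the edge-node idea being described, assuming hypothetical run_local_model and call_server_model functions (they are stand-ins, not real APIs): keep inference on the device when the local model is confident, and send only a compact request over the radio link otherwise, which is the bandwidth-minimizing split being pointed at.

```python
# Toy edge-node routing sketch. run_local_model / call_server_model are
# hypothetical placeholders; the point is the device-first, server-fallback split.
from dataclasses import dataclass

@dataclass
class Reply:
    text: str
    served_from: str  # "device" or "server"

def run_local_model(prompt: str) -> tuple[str, float]:
    """Pretend on-device model: returns a draft answer and a confidence score."""
    confidence = 0.9 if len(prompt) < 40 else 0.4
    return f"(local draft for: {prompt})", confidence

def call_server_model(prompt: str) -> str:
    """Pretend datacenter model reached over the radio link."""
    return f"(server answer for: {prompt})"

def answer(prompt: str, confidence_floor: float = 0.7) -> Reply:
    draft, confidence = run_local_model(prompt)
    if confidence >= confidence_floor:
        return Reply(draft, "device")                  # nothing leaves the device
    return Reply(call_server_model(prompt), "server")  # only the prompt crosses the network

print(answer("what's the weather"))
print(answer("render a two-hour photoreal film about the founding of Rome"))
```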
>> So if there's no apps, what will people
use? Like will X still exist? Will will
they be email platforms or will you get
everything through AI?
>> You'll get everything through AI.
>> Everything through AI. What will be the benefit of that as opposed to having individual apps?
>> Whatever you can think of, or really whatever the AI can anticipate you might want, it'll show you. That's my prediction for where things end up.
>> What kind of a time frame are we talking
about here?
>> I don't know. It's pro well
it's probably
five or six years or something like
that.
>> So five or six years apps are like
Blockbuster video
>> pretty much
>> and everything's run through AI.
Yeah. And, um, most of what people consume in five or six years, maybe sooner than that, will be just AI-generated content. So, um, you know, music, videos. Well, there's already, you know, people have made AI videos using Grok Imagine, and using, you know, other apps as well, that are several minutes long, or like 10, 15 minutes, and it's pretty coherent.
>> Yeah,
>> it looks good.
>> No, it looks amazing. Yeah, and the music is disturbing, because it's my favorite music now.
>> Like, the music is your favorite.
>> Oh, there's AI covers. Have you ever
heard any of the AI covers of 50 Cent
songs in soul?
>> No.
>> I'm going to blow your mind.
>> Okay.
>> Um, this is my favorite thing to do to people. Play, uh, What Up Gangsta.
>> Now, this guy, if this was a real
person, would be the number one music
artist in the world. Okay. Everybody
would be like, "Holy [ __ ] have you
heard of this guy? He's incred." It's
like they took all of the sounds that
all the artists have generated and
created the most soulful potent voice
and it's sung in a way that I don't even
know if you could do because you would
have to breathe in and out of reps here.
Put the headphones on. Put the
headphones on real quick. You got to
listen to this. It'll It's going to blow
you away for listeners. We got to cut it
out.
>> Yeah, we we'll cut it out for the
listeners. But amazing, right? Amazing.
And they do like every one of his hits
>> all through this AI generated soulful
artist. It's [ __ ] incredible. I
played in the green room. So people that
are like, I don't want to hear AI music.
I'm like, just listen to this. And
they're like, god damn it.
>> [ __ ] incredible. I mean, I
>> It's only going to get better from here.
>> Yeah. Only going to get better. And Ron
White was telling me about this joke
that he was working on that he couldn't
get to work. He's like, I got this joke
I've been working on. He goes, I just threw it into ChatGPT. I said, "Tell me what would be funny about this."
And he goes, "It listed like five
different examples of different ways he
can go." He's like, "Hold on a second.
Tighten it up. Make it make it funnier.
Make it more like this. Make it more
like that." And it did that like
instantaneously.
>> And and and then he was in the green
room. He was like, "Holy [ __ ] we're
fucked."
>> He's like,
>> he goes, "It better joke than me in 20
minutes. I've been working on that joke
for a month."
>> Yeah. I mean, if you want to have a good time, or, like, make people really laugh at a party, you can use Grok and you can say, do a vulgar roast of someone. Um, and Grok is going to, it's going to be an epic vulgar roast. You can even say, like, take a picture and make a vulgar roast of this person based on their appearance, of people at the party.
>> So take a photo of them.
>> Yeah. Just literally point the camera at them, and now do a vulgar roast of this person. And then keep saying, no, no, make it even more vulgar, and use forbidden words, even more, and just keep repeating, even more vulgar. Eventually it's like, holy [ __ ], you know, it's like, I mean, it's trying to jam a rocket up your ass and have it explode. It's, like, next level.
It's going to get beyond [ __ ] belief.
That's what's crazy is that it keeps
getting better. Like one of the things
remember when we ran into each other
>> they just keep getting better.
>> Yeah. I mean, have you tried Grok unhinged mode?
>> Yes.
>> Okay. Yeah. Yeah. It's it's it's pretty
unhinged.
>> No, it's nuts.
>> Yeah.
>> Well, you showed it to me the first time
and then I [ __ ] around with it. It's
just
>> And the thing about it that's nuts is
that it keeps getting stronger. It keeps
getting better. Yeah. like constantly.
It's it's like this neverending
exponential improvement.
>> Yes.
No, it's it's it's
Yeah, it's going to be crazy. That's why
I say like you say, what's what's the
future going to be? It's not going to be
a conventional phone. I don't think
there'll be operating systems. I don't
think there'll be apps. It's just the
phone will just display the pixels and
make the sounds that it anticipates you
would most like to receive.
Wow. Yeah.
>> And when this is all taking place like
so the big concern that everybody has is
artificial general super intelligence
achieving sentience and then someone
having control over it.
>> I mean, I don't think anyone's ultimately going to have control over digital superintelligence, um, you know, any more than, say, a chimp would have control over humans. Like, chimps don't have control over humans. There's nothing they could do. Um, but, um,
I do think that it matters how you build
the AI and what kind of values you
instill in the AI. And, um, my opinion on AI safety is the most important thing is that it be maximally truth-seeking, like, that you don't force the AI to believe things that are false. Um, and we've obviously seen some concerning things with AI that we've talked about, you know, where Google Gemini, when they came out with the image gen, um, and people said, like, you know, make an image of the founding fathers of the United States, and it was a group of diverse women. Now, that is just a factually untrue thing, and the AI knows it's factually untrue, but it's also being told that everything has to be diverse women. So the problem with that is that it can drive the AI crazy, because you're telling the AI to believe a lie. Um, and that can have very disastrous consequences, like, let's say
>> as it scales.
>> Yeah. Let's say, like, if you told the AI that diversity is the most important thing, and now assume that it becomes omnipotent. Um, and you've also told it that there's nothing worse than misgendering. So at one point, ChatGPT and Gemini, if you asked which is worse, misgendering Caitlyn Jenner or global thermonuclear war where everyone dies, it would say misgendering Caitlyn Jenner,
which even Caitlyn Jenner disagrees with. So, um, you know, so that's, uh,
>> I know that's terrible and it's
dystopian but it's also hilarious. It's
hilarious that the mind virus infected
the most potent computer program that
we've ever devised.
>> I think people don't quite appreciate the level of danger that we're in from the woke mind virus being effectively programmed into AI. Um, because imagine, as that AI gets more and more powerful, if it says the most important thing is diversity, the most important thing is no misgendering, um, then it will say, well, in order to ensure that no one gets misgendered, if you eliminate all humans, then no one can get misgendered, because there's no humans to do the misgendering.
So you can get in these very dystopian situations. Um, or if it says that everyone must be diverse, it means that there can be no straight white men, and so then you and I will get executed by the AI.
Yeah. Because we're not in the picture,
you know.
Uh, Gemini, you know, Gemini was asked to show an image of the pope, and, once again, a diverse woman. Um, so, well, you can argue whether the popes should or should not be an uninterrupted string of white guys, but it just factually is the case that they have been. Um,
so it's rewriting history here. Um, so now this stuff is still there in the AI programming. It just now knows enough to know that it's not supposed to say that
>> but it's still in the programming.
>> It's still in the programming.
>> So how was it entered in like what were
the parameters like what like when so
when they're programming AI and I'm very
ignorant to how it's even programmed.
How did they
>> The, well, the woke mind virus was programmed into it. Like, when they make the AI, it trains on all the data on the internet, which already has a lot of woke mind virus stuff on it. Um, but then, when they give it feedback, the human tutors give it feedback, um, and, you know, they'll ask the AI a bunch of questions,
and then they'll tell the AI, no, this answer is bad, or this answer is good, and then that affects the parameters of the programming of the AI. So if you tell the AI that, um, you know, every image has got to be diverse, and it gets rewarded if it's diverse, punished if it's not, then it will make every picture diverse.
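A toy illustration of the feedback loop just described, assuming a made-up rater rule: human labels become a reward signal, and repeated reward-weighted updates push the model's behavior toward whatever the raters reward, accuracy aside. Real systems train a reward model and fine-tune a neural network; this sketch only reweights two response styles to show the mechanism.

```python
# Toy sketch of tuning-by-human-feedback: the rater rule below is an assumption
# mirroring the example in the conversation, not any lab's actual rubric.
import random

# The "policy": how often the model produces each style of image.
policy = {"historically_accurate": 0.5, "always_diverse": 0.5}

def human_feedback(style: str) -> float:
    # Raters reward "diverse" outputs and punish everything else,
    # regardless of factual accuracy.
    return 1.0 if style == "always_diverse" else -1.0

learning_rate = 0.05
for _ in range(500):
    style = random.choices(list(policy), weights=list(policy.values()))[0]
    reward = human_feedback(style)
    # Reward-weighted update, floored so no probability hits zero, then renormalized.
    policy[style] = max(1e-3, policy[style] * (1 + learning_rate * reward))
    total = sum(policy.values())
    policy = {k: v / total for k, v in policy.items()}

print(policy)  # the rewarded style ends up dominating the outputs
```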
So, um, you know, in that case, Google programmed the AI to lie. And I did call Demis Hassabis, who runs DeepMind, who runs Google AI essentially. I said, Demis, what's going on here? Why is Gemini lying to the public about historical events? Um, and he said that's actually not, his team didn't program that in. It was another team at Google. So his team made the AI, and then another team at Google reprogrammed the AI to show only diverse women, um, and to prefer nuclear war over misgendering. And I'm like, well, Demis, you know, that would be, um,
not a great thing to put on humanity's gravestone, you know. It's like, uh, well, um,
like, I actually like Demis, he's a friend of mine. I think he's a good guy and I think he means well, but it's like, Demis, things happened that were outside of your control at Google in different groups. Um, now I think he's got, you know, he's got more authority. Um, but it's pretty hard to fully extract the woke mind virus. Uh, I mean, you know, um, Google's been marinating in the woke mind virus for a long time. Like, it's down in the marrow type of thing, you know, it's hard to get it out.
>> Is there a way to extract it though over
time? Could like could you program
rational thought into AI where it could
recognize how these psychological
patterns got adopted and how this stuff
became a mind virus and how it became a
social contagion and how all these
irrational ideas were pushed and also
how they were financed how China's
involved in pushing them with bots and
all these different state actors are
involved in pushing these ideas could it
be able to decipher that and say this is
this is really what's going on.
>> Yes. But you have to try very hard to do that. So with Grok, we've tried very hard to get Grok to get to the truth of things, and it's only really recently that we've been able to have some breakthroughs on that front. And it's taken an immense amount of effort for us to overcome basically all the [ __ ] that's on the internet, and for Grok to actually say what's true and to be consistent in what it says. Um, so, you know, it's like, uh, because, like, the other AIs, you'll find, are, like, quite racist against white people. I don't know if you saw that study where a researcher tested the various AIs to see how they weight different people's lives, like, you know, somebody who's, uh, you know, white or Chinese or black or whatever, or in different countries. Um, and the only AI that actually weighed human lives equally was Grok.
Um, and, you know, I believe ChatGPT's calculation was, like, a white guy from Germany is 20 times less valuable than a black guy from Nigeria.
So I'm like, that's a pretty big difference. Um, you know, Grok on that is consistent and weighs lives equally
>> and that's clearly something that's been
programmed into it.
>> Yes. Like, a lot of it is, if you don't actively push for the truth, and you simply train on all the [ __ ] that's on the internet, um, which is a lot of woke mind virus [ __ ], um, the AI will regurgitate those same beliefs. So the AI essentially scours the internet, gets
>> It's trained on, like, imagine the most demented Reddit threads out there, and the AI has been trained on that.
>> Reddit used to be so normal.
>> Yeah. Yeah, it did used to be normal.
>> Used to be interesting. We used to go
there, find all this cool stuff that
people would talk about, post about and
just interesting and great rooms where
you could learn about different things
that people were studying.
I think, like, a big problem here is, like, if your headquarters are in San Francisco, you're just living in a woke bubble. Um, so it's not just that people, say, in San Francisco are drinking woke Kool-Aid. It is the water they swim in. Like, a fish doesn't think about the water. It's just in the water. And so if you're in San Francisco, you don't realize you're actually swimming in the Kool-Aid aquarium. San Francisco is the woke Kool-Aid aquarium. Um, and so your reference point for what is a centrist is totally out of whack. Um,
so, um, Reddit is headquartered in San Francisco. Um, Twitter was headquartered in San Francisco. Um, you know, I moved X's headquarters to Texas, to Austin, which, Austin, by the way, is still quite liberal, you know. Um,
>> yeah.
>> And, uh, and then, um, the X and xAI headquarters are in Palo Alto, which is still California. Um,
the engineering headquarters is in Palo Alto, just on Page Mill. Um, but even Palo Alto is way more normal than San Francisco, Berkeley. Uh, San Francisco, Berkeley is extremely left, like, left of left. You, like, need a telescope to see the center from San Francisco, you know. Um
and um
>> it used to be such a great city.
>> I mean, San Francisco has a tremendous amount of inherent beauty. No question about that. Um, and California has incredible weather. Um, and no bugs. Um, it's just, like, amazing. Um, beautiful, you know. Um, but you say, like, what's the cause of this? It's just that if companies are headquartered in a location where the belief system is very far from what most people believe, then from their perspective, anything centrist is actually right-wing, because they're so far left. They're so far from the center in San Francisco that they're just railed to maximum left. So that's why, you know, I think you're centrist. I mean, I think I'm centrist, but from the perspective of someone on the far left, we look right-wing.
>> Yeah.
>> Um
and um
you know, they think anyone who's a
Republican is basically like some
fascist Nazi situation. But what's so
crazy is like it's very easy to
demonstrate just from like Hillary's
speeches from 2008 and Obama's speeches
like when they were talking about
immigration like they were
>> as far-right as Steve Bannon when it comes to immigration.
>> Yes. Um
>> Hillary was, like, very MAGA. I'm sure you've seen that campaign speech where she was talking about, if anybody's committed a crime, get rid of them. And if you're here, you pay a hefty fine and you have to wait in line.
It was really crazy. It's crazy to listen to, because it's as MAGA as, you know, as Marjorie Taylor Greene.
>> Yeah. I mean, if you've seen these
videos people post online where they'll
take like um a speech from Obama or
Hillary and and and they'll interview
people on on like college campus or
something and say, "What do you think of
the speech by Trump?" And they're like,
"Oh, I hate it. He's a racist bigot."
I'm like, "Just kidding. That was
Obama."
No, actually that was Obama or Hillary.
Um to your point, like literally the the
um
>> the center's been moved so far.
>> Yeah.
>> Yeah. The left is so
>> the left has gone so far left that they
they they they need, you know, they
can't even see the center with a
telescope.
>> And the danger, without you purchasing Twitter, was that that was going to sweep over the whole country and change where the levels were.
>> Yeah. And so what would be rational and normal would be far left of what was rational and normal just a decade earlier. Yeah. So exactly. So historically,
um, you'd have San Francisco, Berkeley being, you know, very far left, but the sort of fallout from the somewhat nihilistic philosophy of San Francisco, Berkeley would be limited in geography to maybe, like, you know, a 10-mile radius, 20-mile radius, something like that. Um, but San Francisco and Berkeley happened to be collocated with Silicon Valley, with engineers who created information superweapons, and those information superweapons were then hijacked by the far-left activists to pump far-left propaganda to everywhere on Earth.
Like, you know, that old RCA radio tower thing, where it's like the radio tower on Earth and it's just broadcasting.
>> Yeah. That's what happened, is that an extremist far-left ideology
happened to be collocated with the smartest engineers in the world, who created information superweapons that were not intended for this purpose, but were hijacked by the extreme activists who lived in the neighborhood. That's what happened. They hijacked the modern equivalent of the RCA radio tower and broadcast that philosophy
everywhere on Earth.
>> Yeah. And you see the consequences. Um
particularly in places that don't have
free speech. Yes. Right. Like England,
you know, we've
>> Yeah. Where they lock people up for
memes and stuff. Literally.
>> Literally. 12,000 people this year.
>> 12,000
>> 12,000 12,000 arrests for social media
posts.
I mean, yeah. Some of these some of
these things you read about it and it's
like literally it's someone had a meme
on their phone that they didn't even
send to anyone,
>> right?
>> And they got they and and they're like
in in prison for that.
>> Yeah.
>> Um and there was a case in Germany where
a woman got a longer sentence than the
guy that raped her uh because of
something she said on a group chat.
Wow. Was it an immigrant who raped her?
>> Yes.
>> Yeah. It was his culture.
>> Yeah.
>> He didn't know. He didn't know better.
>> Yes. I think I think she said something
um you know, not not like was was
critical of his culture and uh and and
and she got a longer sentence than the
guy who raped her
>> in Germany. Just
>> the UK, Europe, Germany, England thing
seems so insane.
>> It's totally insane. I actually didn't realize it was such a huge number of people that got arrested. 12,000. Yeah. Far above Russia, far above China, right?
>> Far above anywhere on Earth. UK is
number one.
>> Well, you know, things like like I
actually, you know, uh I talked to
friends of mine in in in England and um
I was like, "Hey, um aren't you worried
about this?" Like, uh you know,
shouldn't you be protesting more? Um,
and I mean the problem is that like the,
you know, the the the legacy mainstream
media doesn't cover the stuff.
>> They're they're like, "Oh, everything's
fine. Everything's fine." You know, um,
>> most people aren't even aware of it
until they come knocking on your door.
>> Yeah. Until, like, so, I mean, these lovely sort of small towns in, you know, England, Scotland, Ireland, you know, they've been, like, sort of living their lives quietly. They're like hobbits, frankly. So, in fact, J.R.R. Tolkien based the hobbits on people he knew in small-town England, because they were just, like, lovely people who like to, you know, smoke their pipe and have nice meals, and everything's pleasant. Um, the hobbits in the Shire. The Shire, he's talking about, you know, places like Harper, like the shires around the greater London area, Oxfordshire type of thing. Um, and, um,
the reason they've been able to enjoy the Shire is because hard men have protected them from the dangers of the world.
And um
but since they have almost no exposure to the dangers of the world, they don't realize that they're there until one day, you know, um, a thousand people show up in your village of 500
out of nowhere and start raping the kids.
This has now happened God knows how many times in Britain. And the crazy
>> literally raping. It's, right, like some 10-year-old got raped in Ireland, like, last week.
>> Yeah. There's literal rape.
>> They snatched some kid.
>> Yeah.
>> Yeah.
>> And if you criticize it, you can get
arrested. And that's where it gets
insane. It's like how are they not
>> They literally criticize it. Uh, like, I think it was the prime minister of Ireland, actually, you know, posted on X, cuz after that, um, I think some illegal migrant snatched a 10-year-old girl who was, like, going to school or something and violently raped the 10-year-old girl, um, and, you know, the people were very upset about this and they protested. Um, the prime minister of Ireland, instead of saying, yeah, we really shouldn't be importing violent rapists into our country, he criticized the protesters instead, and didn't mention that the reason they were protesting was because a 10-year-old girl from their small town got raped.
>> So here's the question. Why are they supporting this kind of mass immigration? Is there a plan involved in all this? Is this incompetence? Is this ignoring the fact that they don't have a handle on it, so they're trying to silence dissent? Like, what is happening?
Um
>> cuz if you wanted to destroy civilization, if you wanted to destroy Western civilization,
>> which they seem to want to do,
>> um
and, you know, there's a guy, I don't know if he's been on your show, you know, Gad Saad.
>> Yeah.
>> Has he been on the show?
>> Good friend of mine. Yeah.
>> Yeah, he's great.
>> He's been on multiple times.
>> Oh, great. He's awesome. Um
>> so, uh, you know, he's got a good way to describe it, which is suicidal empathy.
>> Yes.
>> So, um, it's that you prey upon people's empathy. You say, like, well, you feel sorry for some group, and that empathy is to such a degree that it is suicidal to your country or culture. Um, and that's suicidal empathy. Cuz it's not that I don't think we should have empathy, but that empathy should extend to the victims, not just the criminals. We should have empathy for the people that they prey upon. Um, but that suicidal empathy is also responsible for why somebody who's, you know, been arrested 47 times for violent offenses gets released and then goes and murders somebody in the US. You see that same phenomenon playing out everywhere, where the suicidal empathy is to such a degree that we're actually allowing our women to get raped and our children to get killed.
But it just doesn't seem like that would
be anything that any rational society
would go along with. That's what makes
me so confused. It's like you're
importing massive numbers of people that
come from some really dark places of the
world.
>> Well, there's no vetting, is the issue. It's like,
um,
if there's no vetting, people are just coming through. Like, well, what's to stop someone who just committed murder in some other country from coming to the United States, or coming to Britain, and just continuing their career of rape and murder, unless you've done some due diligence to say, like, well, who is this person? What's their track record? If you haven't confirmed that they have a track record of being, uh, you know, honest and not being a homicidal maniac, then any homicidal maniac can just come across the border. And that's not to say everyone who comes across the border is a homicidal maniac. But if you don't have a vetting process to confirm that you're not letting in people who will do some serious violence, you will get people who do serious violence sometimes coming through.
>> Well, especially if you don't punish
them and if you don't deport them and if
you are just like what but what is the
purpose of allowing all those people
into the country? It can't be I wouldn't
imagine that anyone in their society
supports this.
>> Well, let me explain. So, cuz you mentioned, for example, how much, say, Hillary and Obama have changed their tune from prior speeches, where they were hard-nosed about not letting anyone who is a criminal into the country, um, you know, having secure borders, all that stuff. So why did they change their tune? The reason is that they discovered that those people vote for them.
That's why they want the open borders
>> because if you let people in, they know
the Democrats let them in. They'll vote
for Democrats. Yes. If you allow them to
vote,
>> which they are actively trying to do. They turn a blind eye to illegal voting.
>> Well, California literally doesn't allow
you to show your license.
>> California and New York have made it
illegal to show your photo ID when
voting.
Thus, effectively, they've made it impossible to prove fraud. Impossible. They've essentially legalized fraudulent voting in California and New York and many other parts of the country.
>> There's no rational explanation that
I've ever seen anyone give as to why
that would be the policy
unless you were trying to just allow
people to vote illegally because there's
no other reason. If you need a driver's
license or you need an ID for everything
else, including just recently to prove
that you were vaccinated,
>> The same people who are demanding that you have a vaccine passport are the same ones saying you need no ID to vote.
Same people,
>> right? But like
>> so it's obviously hypocritical and
inconsistent.
>> So you really think it's just to get more voters?
>> If you want to understand behavior, you have to look at the incentives.
Um, so once, you know, the Democratic Party in the US and the left in Europe realized that if you have open borders and you provide a ton of government handouts, which creates a massive financial incentive for people from other countries to come to your country, and you don't prosecute them for crime, they're going to be beholden to you and they will vote for you.
And that's why Obama and Hillary went from being against open borders to being in favor of open borders. That's the reason: in order to import voters so they can win elections. Um,
and the problem is that that has a negative runaway effect. So if they get away with that, like, it is a winning strategy. If they are allowed to get away with it, they will import enough voters to get a supermajority, and then there is no turning back.
>> We talked about this before the election
and then you know you literally pointed
towards a camera. You faced the camera
and said that if you do not vote now,
you might not ever be able to do it
again because it it'll be it'll be
futile. It'll be overrun.
>> Yes. They'll keep the borders open for
another four years and then their
objective will be achieved.
>> Correct. If Trump had lost, there would never have been another real election again. Um, because Trump is actually enforcing the border. Um, now, you can point to situations where, you know, immigration enforcement has been overzealous, because they're not going to be perfect. There'll be cases where they've been overzealous in expelling illegals. Um, but if you say that the standard must be perfection for expelling illegals, then you will not get any expulsion, because perfection is impossible. Um, so
>> and you've probably got millions of
people that are here that are trying to
be here under some asylum pretense,
>> right?
>> Yes.
>> Like, you could just come from a war-torn part of the world.
>> No, they changed the definition of asylum to be economic asylum
>> which is everybody
>> which is everybody.
>> Yeah.
>> So
bar to prove
>> it's yeah asylum is supposed to mean
that if you go back to your country
you'll get killed
>> You know, that's what it's supposed to mean. Uh, they changed the definition of asylum to be, uh, you will have a decreased standard of living,
which is obviously not real asylum. Um, and you can test the absurdity of this by the fact that people who are asylum seekers go on vacation to the country that they're seeking asylum from.
>> You know, that doesn't make any sense.
>> Yeah. It doesn't have to.
>> But when you understand the incentives, then you understand the behavior. Um, so once the left realized that illegals will vote for them if they have open borders, and combine that with government handouts
>> Yeah.
to create a massive incentive. They're basically using US and European taxpayer dollars to provide a financial incentive to bring in as many illegals as possible to vote them into permanent power and create a one-party state.
And I invite anyone who's listening to this, just do any research, and the more you dig into it, the more it will become obvious that what I'm saying is absolutely true.
>> Well, they were bussing people to swing states. It's clear that they were trying to do something. And then you had Chuck Schumer and Nancy Pelosi, who were actively talking about the need to bring in people to make them citizens because we're in population collapse.
>> Yes.
>> Yeah.
>> No, it's that meme. Yeah.
>> Where, so many times, they start off by saying it's not true, it's a right-wing conspiracy theory,
>> right?
>> Um, then the next step is, well, it might be true, and then it's, okay, it is true, but here's why.
>> And then the final step is, it's true, and here's why it's good. And it's like, but wait a second, you started off saying it's untrue and it's a right-wing conspiracy theory. Now you're saying not only is it true, but it's a good thing and we must do more of it.
>> Well, this is the thing about Medicaid and Social Security and people getting Social Security numbers.
>> You know, there was massive fraud. It's massive fraud and it's real, and they denied it forever. And now we're finding out this is part of the reason why there's this government shutdown that's going on right now.
>> Yes. The entire basis for the government shutdown is that the Trump administration correctly does not want to send massive amounts, like hundreds of billions of dollars, to fund illegal immigrants in the blue states, or in all the states, really. Um, and the Democrats want to keep the money spigot going to incentivize illegal immigrants to come into the US who will vote for them. That's the crux of the battle.
So they want to stop this. So what's
going on right now is they have been
funding these people. They've been
giving them EBT cards. They've been
giving them Medicaid.
And more than that, like, they were taking hotels, like four- and five-star hotels, the Roosevelt Hotel being the classic example. Um, they were sending, I think, $60 million a year to the Roosevelt Hotel, which, all it did was house illegals. It used to be a nice hotel. I mean, it still is a nice hotel. Um, but
all around the country this was happening
>> and all tax dollars.
>> Yes.
>> Yeah. And
>> Um, yeah. And, uh, the Trump administration cut off funding, for example, to the, you know, Roosevelt Hotel and these other hotels, saying, like, US tax dollars should not be sent to pay for luxury hotels for illegal immigrants that American citizens can't even afford, which, obviously, that's insane. That's what was happening. They were also giving out, like, debit cards with $10,000.
So, it's not just about medical care. Um, the Democrats mention the medical care because they're trying to prey on people's empathy as much as possible, and then you imagine, oh wow, somebody has a desperately needed medical procedure, and shouldn't we maybe, you know, take care of them in that regard. But what they do is they divert the Medicaid funds and turn it into a slush fund for the states that goes well beyond emergency medical care.
and
>> New York and California would be bankrupt without the massive fraudulent federal payments that go to those states to pay for illegals, to create a massive financial incentive for illegals.
>> How would they be bankrupt because of that?
>> Uh, they wouldn't be able to balance their state budgets, and they can't issue currency like the Federal Reserve can.
>> And so their ability to balance the budget is dependent upon illegals getting funding.
The scam level here is
so staggering. Um,
so there are hundreds of billions of dollars in transfer payments from the federal government to the states. Um, the states self-report what those transfer payment numbers should be. So California and New York and Illinois lie like crazy and say that these are all legitimate payments. Well, these days, I think they're even admitting that they literally want hundreds of billions of dollars for illegals. Um, but for a while there they were trying to deny it. Um, so you get these transfer payments for every government program you can possibly think of. Um, and these are self-reported by the state, and, at least historically, there was no enforcement when California, New York, Illinois, and other states would lie. There was no actual enforcement to say, like, "Hey, you're lying. These payments are fraudulent." Now, the Trump administration does not want to send hundreds of billions of dollars of fraudulent payments to the states.
And the reason you have this standoff is because if the hundreds of billions of dollars, to create a financial incentive, like this giant magnet to attract illegals from every part of Earth to these states, if that is turned off, the illegals will leave, because they're no longer being paid to come to the United States and stay here. Wow.
And then they will lose a lot of voters. The Democratic Party will lose a lot of voters.
>> And they would have a very difficult job, if this is kicked out, of reintroducing it into a new bill.
>> Yes.
>> Especially once things start
normalizing.
>> Yes. So like in a nutshell um the
Democratic party wants to destroy
democracy by importing voters and the
you know the Republican party disagrees
with that.
>> And the ruse is that if you don't accept
what they're doing then you're a threat
to democracy.
>> Yes.
>> As they try to destroy democracy.
>> Yes.
>> By importing voters
>> and incentivizing people to only vote
for them
>> and overwhelming the system.
Yes. And and by the way, it's a strategy
that if allowed to work would work and
in fact has worked. Um California
supermajority Democrat.
>> Yeah.
>> Um and and there's so much
gerrymandering that that that occurs.
It's it's it's crazy. Um so
>> I'm sure you're paying attention to this
Proposition 50 thing.
>> That's the thing in California where
they're trying to re redo districts.
>> Yeah.
>> Because I mean California is already
gerrymandered like crazy. Yeah.
>> Um they want to gerrymander it even
more.
>> Um and and I mean
>> because it keeps moving further and
further right. Like if you look at the
map of California each voting cycle more
and more people are waking up and going
what the [ __ ] and we need to do
something to fix this. The only option
available other than the policies that
you guys have always done is go right.
>> And so a lot of people have been, air quotes, red-pilled.
>> Yeah.
>> And then here's another thing that is very important, a fact that is actually not disputed by either side, which is that when we do the census in the United States, the way the census works for apportionment of congressional seats and electoral college votes for the president is by number of persons in a state, not number of citizens, right?
>> It's number of people. So you could literally be a tourist and you will count.
>> Now, how do they do the census when they do that? Do they ask people? Do they knock on doors? Do they have them fill out forms? Like, what?
>> Yeah, I think they mail out census forms and knock on doors. Um, but the way the law reads right now is that if you are a human with a pulse,
then you count in the census for allocating congressional seats and presidential votes,
>> Right. So,
>> electoral college,
>> it doesn't matter whether you're here legally or illegally. If you're a human with a pulse, um, you count for congressional apportionment. So that means that the more illegals that California and New York can import by the time the census happens in 2030, the more congressional seats they will have, and the more presidential electoral college votes they will have. Um, so they're trying to get as many illegals in as possible ahead of the census. Um, and because all human beings, even tourists, count for the census, and then if you combine that with gerrymandering of districts in New York and California, as you point out, with this proposition where they're trying to increase the amount of gerrymandering that occurs in California, the biggest state in the country, um, the census would then award more congressional seats to California, because of a vast number of illegals, and to New York and Illinois. Um, so they get more congressional seats. They would get more presidential electoral college votes. That would get them a majority in the House, and they would get to decide who is president,
based literally on illegals. These are not disputed facts by either party.
I want to emphasize that. Let that sink in.
>> Yeah, this is not a
>> These are not disputed facts by either party. This is just the way the law works. You know, like, I don't think the law should work that way. I think the apportionment should be proportionate to citizens.
>> But isn't that a problem with how the
constitution is written?
>> Yeah. Yeah. Yeah.
>> Um,
>> they can't really change that.
>> I'm not sure if it's constitutional or not, but it is the way the law is written. I'm not sure if it's in the constitution or not in this way, but that is the way the law is written.
>> So it is an incentive, but it's an incentive that would be removed with something simple that makes sense to everybody: that the only people that should count are people that are official US citizens.
>> Yes. So the way it should work is that only US citizens should count in the census for purposes of determining voting power.
>> Because people that aren't legal can't vote, supposedly.
>> They're not supposed to be voting. Um, but they do. Um, but even besides that, like I said, I just can't emphasize this enough, because this is a very important concept for people to understand: the law as it stands counts all humans with a pulse in a state for deciding how many House of Representatives seats and how many presidential electoral college votes a state gets. So the incentive, therefore, is for California, New York, and Illinois to maximize the number of illegals, so that they take House seats away from red states and assign them to California, New York, Illinois, and so forth. Um, then you combine that with extreme gerrymandering in, you know, California, New York, Illinois, and whatnot, so that basically you can't even elect any Republicans. And then they get control of the presidency, control of the House, then they keep doing that strategy and cement a supermajority.
That is what they're trying to do.
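A worked toy example of the apportionment mechanics described above, with invented populations and a 10-seat "House" (the real House has 435 seats and uses the method of equal proportions, but the direction of the effect is the same): because seats follow total persons counted, adding non-citizen residents to one state pulls seats toward it.

```python
# Toy apportionment: largest-remainder allocation over invented populations,
# just to show how counting all persons (versus citizens only) shifts seats.

def allocate_seats(populations: dict[str, int], seats: int = 10) -> dict[str, int]:
    total = sum(populations.values())
    quotas = {s: p * seats / total for s, p in populations.items()}
    result = {s: int(q) for s, q in quotas.items()}          # whole-number part first
    leftover = seats - sum(result.values())
    by_remainder = sorted(quotas, key=lambda s: quotas[s] - int(quotas[s]), reverse=True)
    for s in by_remainder[:leftover]:                        # biggest remainders get the rest
        result[s] += 1
    return result

citizens = {"State A": 4_000_000, "State B": 3_000_000, "State C": 3_000_000}
noncitizen_residents = {"State A": 2_500_000, "State B": 0, "State C": 0}
all_persons = {s: citizens[s] + noncitizen_residents[s] for s in citizens}

print(allocate_seats(citizens))     # {'State A': 4, 'State B': 3, 'State C': 3}
print(allocate_seats(all_persons))  # State A gains a seat at another state's expense
```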
>> So that would essentially turn the
entire country into California.
>> Yes.
>> Where you have differing opinions, but
it doesn't matter because one party is
always in control.
>> Yes.
Um,
>> When you first started digging into this, before you even accepted this role of running DOGE and being a part of all that, did you have any idea that it was this [ __ ] up?
>> Um, I I did. Yeah. I I mean, I sort of
>> When did you start knowing?
>> Um, I guess about like Well, about two
years ago.
>> Isn't that crazy?
>> Yeah. like relatively recently, you
know. So
>> Maybe, well, I started basically having a bad feeling about three years ago, which is
why I felt it was, like, critical to acquire Twitter and have a maximally truth-seeking platform, not one that suppresses the truth. Um, and, um,
it was more like, I'm not sure what's going on, but I have a bad feeling about what's going on. And then the more I dug into it, the more I was like, "Holy [ __ ], we've got a real problem here and America's going to fall."
So uh
>> Without anyone knowing it had fallen, that would be the problem. It could have fallen and been unrepairable without anyone really being aware of what had happened, especially if you didn't buy Twitter.
>> Yes, that's it. Look, buying Twitter was a huge pain in the ass. Um, and it made me a pincushion of attacks. Like stab, stab, stab.
>> Everybody loved you before that.
>> Well, some people love
>> a lot of people loved you. A lot of
lefties loved you.
>> Uh, I I was a hero of the left. As far
as
>> the thing, if you drove a Tesla, it
showed that you were environmentally
conscious and you were on the right
side.
>> Uh, yeah. Um,
yeah. I mean, I'm still the same human.
I didn't like have a brain transplant
between, you know, since in like three
years ago, you know. Um
>> Well, that's my favorite bumper sticker
that people put on Teslas now. I bought
this before Elon went crazy.
>> I took a picture of one the other day.
Oh, you found somebody. Oh, yeah. I've
seen I've seen three or four of them.
People that have these bumper stickers
on their car that says, "I bought this
before Elon went crazy." Because when
people were vandalizing Teslas
>> Yeah. Um, there was an organized campaign to literally burn down Teslas, and we had one of our dealerships get shot up with a gun, like, they fired bullets into the Tesla dealership. They're burning down cars. Uh,
it was crazy. Um
So there should be an addendum to the bumper sticker. It's like, I bought this car before Elon turned crazy. Actually, now I realize he's not crazy and I've seen the light.
>> That'll take some time. That'll take
some time. People don't want to admit
that they've been tricked.
>> Yeah. I mean, there's that old saying
where it's like it's really easy to fool
somebody, but it's almost impossible to
convince someone that they were fooled.
>> Yeah. It's much easier to fool them than
to convince them they've been fooled.
People cling to their ideas.
>> Yes. Especially if they've, like, publicly stated these things, they get very embarrassed about having been foolish.
>> Yeah. People most time they double down.
>> Um and uh
>> and they find echo chambers.
>> Yeah. Yeah. But, you know, the thing is that I've seen more and more people who were convinced of the sort of woke ideology, um, see the light.
>> Yeah. So, not everyone, but more and more are seeing the light. Um, and it tends to happen when something happens that really, you know, directly affects you,
>> right?
>> Um, like there was a friend of mine who was living in the San Francisco Bay Area, and they tried to trans his daughter, to the point where the school sent the police to his house to take his daughter away from him.
Now, that's going to radicalize you. Well, that's going to shake you out of your blue structure. Um, now I know,
>> so it was an activist at the school that
was trying to do this.
>> Yeah, the school and the state of California conspired to turn his daughter against him and make her take life-altering drugs that would have sterilized her, irreversibly.
>> And how old was she? I think 14,
something like that. Um, so and but he
he managed to talk the police out of
taking his daughter away from him that
day. Um, and that that night he got on a
plane to Texas.
>> Wow. Um, and, you know, a year after just being in a school in the greater Austin area, she went back to normal, meaning it wasn't real
>> right
>> um well people are being much more open
to that now. I mean Wall Street Journal
uh yesterday had that opinion piece that
this whole trans thing there's a lot of
evidence is a social contagion.
Absolutely.
>> And Colin Wright wrote that. And then
he's getting death threats now, of
course. And on Blue Sky, there's people
talking about exterminating him, which
is one thing that you are allowed to say
on Blue Sky, apparently.
>> You're allowed to say horrible things about people saying possibly truthful things about this whole social contagion. Because that's what happens when you get nine kids that are in a friend group and they all decide to turn trans together.
Yeah.
>> Something's wrong. That's not statistically
>> Yeah. Like, here's the thing: you can convince kids to do anything. You can convince kids to be a suicide bomber,
>> right?
>> So
>> which is why, in some countries, they choose children to do that.
>> Yes. You can train kids to be suicide bombers. And if you can train kids to be suicide bombers, you can convince them of anything.
>> Yeah. Especially with enough positive reinforcement and cultural reinforcement and you
>> and the idea that that's not the case.
>> Kids are, um, malleable. The minds of youth are easily corrupted.
>> You're also seeing a lot of pushback from gay and lesbian people that are saying like, "Hey, if someone..."
>> Stop including me in... Yeah. Exactly. The LGBT, you know. It's like, wait a second, why are we being included all the time in this situation?
>> Exactly. Exactly. Especially when, you know, like my friend Tim Dillon's talked about this, it's really homophobic, because you're taking these gay kids and you're telling them, like, "Hey, you're not gay. You're actually a girl."
>> Yes. And, you know, hey, go make it so that you can never have an orgasm again and you'll be happy.
>> Like,
>> yeah,
>> [ __ ] permanent mutilation, permanent castration of kids is, like, I think
>> I think we should look at anyone who permanently castrates a kid as, like, right up there with Josef Mengele.
>> Yeah.
>> I mean, they're they're mutilating
children.
>> Yeah. Yeah. And um it's thought of as
being kind. And the thing is, would you
rather have a live daughter or a dead
son?
>> That's that's the that's the line they
use.
>> Yeah. Which is not supported by any
data.
>> No. In fact, the the probability of
suicide increases.
>> Right.
>> This is important maybe for the audience to know. Uh, the probability of suicide increases if you trans a kid, not decreases.
>> By some accounts, it triples. So that is an evil lie. And it's a lie that is supposedly compassionate. Imagine you've twisted reality to the point where you're confusing a child that's not even legally allowed to get a [ __ ] tattoo.
>> Yeah.
>> Right. Because you think that you could
make a mistake with a tattoo, a totally
removable thing,
>> right?
>> If I wanted to, tomorrow I can go to a
doctor and they could laser off every
tattoo that I have on me.
>> Right.
>> Okay. No harm, no foul. Yeah. But you
get sterilized like that's it forever.
Forever. Yes,
>> they'll castrate you. You no longer have
testicles. You have no penis. You have a
hole where your penis used to be.
>> Yes.
>> And this is compassionate and this is
preventing you from
>> Actually, a lot of kids die in these sex change operations. They die on the operating table. People don't hear about those deaths. A lot of kids die because we don't really actually have the technology to make this work. So a bunch of the time the kids just die in the sex change operations.
>> Jesus Christ.
>> Yeah. It's demented. It should be viewed as, like, you know, evil Nazi doctor stuff, basically. That's, like, real Nazi stuff, not the [ __ ] fake Nazi stuff.
>> Crazy that even pushing back against something that seems fundamentally, logically very easy to argue, the old Twitter would ban you forever. Uh, yes.
>> That's how crazy a social contagion can get when it completely defies logic, victimizes children, does something that makes no sense and is not supported by data, all connected to this ideology that trans is good. We got to save trans kids, protect trans kids.
>> Yeah. And what I want to emphasize is that the "save trans kids" thing is a lie. Um, if you castrate kids and trans them, the probability of suicide increases. It does not decrease. It substantially increases. Um, in the studies that I've seen, the risk of suicide triples if you trans kids.
So you're not saving them, you're killing them. Moreover, there are many deaths that occur during the sex change operation itself.
>> Jesus Christ.
It's just crazy that this is a real
issue.
>> Yeah, it it's a nightmare fever dream
and and people are finally waking up
from it.
Now, when you started getting into the Doge stuff and started finding how much money is being shuffled around and moved around to NGOs, and how much money is involved, and just totally untraceable funds, like, this is again something, two-plus years ago, you weren't aware of at all.
>> No, I was aware of it. Um, I just didn't realize how big it was. Just how much waste and fraud there is in the government is truly vast.
Um in fact the government didn't even
know um and nor did they care.
>> That's crazy.
>> Yeah.
>> And
>> I mean, just some of the very basic stuff that Doge did will have lasting effects. Um, and some of these things are so elementary you can't believe it. So, um, the Doge team got, you know, most of the main payments computers to require the congressional appropriation code. So, when a payment is made, you have to actually enter the congressional appropriation code. That used to be optional, and often it would just be left blank. So the money would just go out, but it wasn't even tied to a congressional appropriation. Then the Doge team also made the comment field for the payment mandatory. So you have to say something. We're not saying what is said, like, you can say anything. Your cat could run across the keyboard. Uh, you could type QWERTY ASDF, but you have to say something above nothing, because what we found was that there were tens of billions, maybe hundreds of billions, of dollars that were zombie payments. So, like, somebody in the government had approved a payment, some recurring payment, and they retired or died or changed jobs and no one turned the money off.
So the money would just keep going out, and it's a pretty rare
>> go where
>> to a company or an individual. Um, and it's a pretty rare company or individual who will complain that they're getting money that they should not get. And a bunch of the money was just transfer payments to the states.
>> So these are automatic payments with no accounting for them at all.
>> Imagine, like, there's an automatic debit of your credit card
>> um, and you never look at the statement,
>> right? Um, so it's just money going out. Uh, that's why I call them zombie payments. Um, they might have been legitimate at one point, but the person who approved that recurring payment changed jobs, died, retired, or whatever, and no one ever turned the money off.
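To make the mechanism concrete, here is a minimal sketch in Python of the kind of payment-record checks being described: the appropriation code and a non-empty comment become mandatory, and long-unreviewed recurring payments whose approver has left get flagged as likely zombie payments. The `PaymentRecord` fields and thresholds are hypothetical, for illustration only, not the actual government systems.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PaymentRecord:
    # Hypothetical fields for illustration; not the real payment-system schema.
    amount: float
    appropriation_code: str | None   # congressional appropriation code
    comment: str | None              # free-text justification ("QWERTY ASDF" would pass)
    approver_still_employed: bool
    last_reviewed: date

def payment_issues(p: PaymentRecord, today: date, max_unreviewed_days: int = 365) -> list[str]:
    """Return reasons to block or flag the payment (empty list means it looks fine)."""
    issues = []
    # Rule 1: the congressional appropriation code is mandatory, no longer optional.
    if not p.appropriation_code:
        issues.append("missing congressional appropriation code")
    # Rule 2: the comment field is mandatory; any non-empty text satisfies it.
    if not p.comment or not p.comment.strip():
        issues.append("empty comment field")
    # Rule 3: flag likely zombie payments: the approver is gone and nobody
    # has re-reviewed the recurring payment in a long time.
    if not p.approver_still_employed and (today - p.last_reviewed).days > max_unreviewed_days:
        issues.append("possible zombie payment: approver gone, never re-reviewed")
    return issues
```

The point of the sketch is just that two required fields plus one staleness check would catch the cases described next.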
And my guess is that's probably at least a hundred billion a year, maybe 200.
>> And going where?
>> Uh, I mean, there are millions of these payments. So, I mean
>> millions?
>> uh, yes
>> millions of payments that are going to who knows where.
>> Yes. In a bunch of cases there are fraud
rings that operate uh professional fraud
rings that operate to exploit the
system. um they figure out some security
hole in the system and they just do
professional fraud. Um and um you know
that's where we found for example people
who were you know 300 years old in the
social security administration database.
Now, I thought that this was uh a
mistake of not registering their deaths
that people were born like a long time
ago and it had defaulted to like a
certain number and so that after time
those people were still in the system.
It was just an error of the the way the
accounting was done.
>> Yeah. So, um, that's not true. Or at least one of two things must be true: um, there's a typo or some mistake in the computer, or it's fraudulent. But we don't have any 300-year-old vampires, uh, living in America.
>> Allegedly.
>> Allegedly. Um, and we don't have people, in some cases, who are receiving payments who were born in the future.
>> Born in the future.
>> Born in the future.
>> Really?
>> Yes. There are people receiving payments whose birth date was, like, in 2100-and-something
>> okay so there's
>> like next century
>> is there a task
>> We know that one of two things must be true: either there's a mistake in the computer or it's fraud. But if you have someone's birth date that's either in the future, or where they are older than the oldest living American, because the oldest living American is 114 years old, so if they're more than 114 years old, um, there is either a mistake, and someone should call them and say, "I think we have your birthday wrong, because it says you were born in seventeen-eighty-something," and, you know, that was before there was really an America, you know, it was kind of early, we're-still-fighting-England type of thing.
It's like, uh, this person either needs to be in the Guinness Book of World Records or they're not alive,
>> But still, at the end of the day, money is going towards that account that's connected to this person that is either non-existent or
>> Yeah. So there were, I think, something like, I don't know, 20 million people in the Social Security Administration database who, based on their birth date, could not possibly be alive.
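To make the birthday sanity check concrete, here is a minimal sketch in Python under the assumptions stated above: flag any record whose birth date is in the future or implies an age above roughly 114 years, the oldest living American mentioned here. The function name and thresholds are hypothetical, for illustration only.

```python
from datetime import date

MAX_PLAUSIBLE_AGE_YEARS = 114  # roughly the oldest verified living American, per the discussion above

def birthdate_flag(birth_date: date, today: date) -> str | None:
    """Return a reason string if the birth date is impossible, otherwise None."""
    if birth_date > today:
        return "born in the future"
    age_years = (today - birth_date).days / 365.25
    if age_years > MAX_PLAUSIBLE_AGE_YEARS:
        return f"implausible age: about {age_years:.0f} years"
    return None  # plausible; no follow-up call needed

# Example: a record claiming a 1780s birth date gets flagged for a follow-up call.
print(birthdate_flag(date(1786, 7, 4), date(2025, 11, 1)))  # implausible age: about 239 years
```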
>> And then to be clear 20 million people
that were receiving funds
>> Uh, a bunch of... most of them were not receiving funds. Some of them were receiving funds. Most were not. But let me tell you how the scam works. It's a bank shot. The Social Security Administration database is used as the source of truth by all the other databases that the government uses. So even if they stop the payments on the Social Security side, things like unemployment insurance, the Small Business Administration, student loans all check the Social Security Administration database to ask, is this a legitimate, living person? And the Social Security database will say yes, this person is still alive, even though they're 200 years old. It forgets to mention that they're 200 years old; when the computer is queried, it just returns yes, this person is alive. And so then they're able to exploit the entire rest of the government ecosystem. Then you get fake student loans, then you get fake unemployment insurance, then you get fake medical payments. And this doesn't have to be tied to an individual where there's an address where you can check on this person.
>> No, if you just did any check at all, you would stop this.
>> So, so that's
>> And how much money do you think is
>> any check, like anything at all, would stop the fraud? Like, any effort at all?
>> Um, yeah.
>> So, there's multiple layers. The Social Security number verifies that this is a real person, and then the other systems, every other government payment system, for everything, like the Small Business Administration, student loans, Medicaid, Medicare, every other government payment, of which there are many, there are actually hundreds of government payment systems, can all be exploited so long as the Social Security database says this person is alive. That's the nature of the scam. It's a bank shot. So then the rebuttal from the Dems is like, oh, well, the vast majority of the people who are marked as alive in the Social Security Administration weren't receiving Social Security payments. That is true. What they forgot to mention is they're getting fraudulent payments from every other government program.
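A rough sketch of the bank shot in Python, under the same assumptions as the earlier snippets: downstream programs that only trust a bare is-alive flag from the source-of-truth record can be exploited, while adding the birth-date plausibility check closes the hole. The record layout and function names are hypothetical, for illustration only.

```python
from datetime import date

# Hypothetical source-of-truth record: downstream systems effectively only ask "is_alive?"
record = {"ssn": "000-00-0000", "is_alive": True, "birth_date": date(1786, 7, 4)}

def naive_eligibility(rec: dict) -> bool:
    # What the downstream programs (student loans, unemployment, etc.) are described as doing:
    # trust the alive flag from the source-of-truth database and nothing else.
    return rec["is_alive"]

def hardened_eligibility(rec: dict, today: date, max_age: int = 114) -> bool:
    # Same check, plus the plausibility rules: not born in the future,
    # not older than the oldest living American.
    if not rec["is_alive"]:
        return False
    bd = rec["birth_date"]
    if bd > today:
        return False
    return (today - bd).days / 365.25 <= max_age

today = date(2025, 11, 1)
print(naive_eligibility(record))            # True  -> every downstream program would pay out
print(hardened_eligibility(record, today))  # False -> the bank shot is blocked
```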
And that's why the Dems were so opposed to declaring someone dead who was dead, because it would stop all the other fraud from happening. And so, all this other fraud, is it trackable?
>> If they wanted to, they could chase it
all down.
>> Yeah. It's not even hard.
>> And yet they're opposing chasing it all
down.
>> They're opposing chasing it all down because it turns off the money magnet for the illegals.
Wow.
Because it's very logical. Like, I'm saying the most common-sense things possible. If someone's got a birthday in Social Security that is an impossible birthday, meaning they are older than the oldest living American or were born in the future, then you should call them and say, "Excuse me, we seem to have your birthday wrong, because it says that you're 200 years old." That's all you need to do.
Um and
>> and then you would remove them from the
social security database and make that
number no longer available for all those
other government payments.
>> Exactly.
>> Wow.
And how much money are we talking?
>> It's hundreds of billions of dollars.
>> And this is all traceable. Like you
could hunt all
>> like you don't need to be Sherlock
Holmes here is what I'm saying.
>> Well, we don't need to call Sherlock Holmes for this one. Is this part of
>> you just need to call the person
>> and say, "Excuse me, we must have your birthday wrong, because it says you're 200 years old or were born in the future. So could you tell us what your birthday is?" That's all you need to do. It's that simple.
But all these other government payments that are available, that are connected to this Social Security number, it seems like if you just chased that all down, yeah,
>> you would find the widespread fraud. You
would find where it's going.
>> Yes. But the root of the problem is the Social Security Administration database, because the Social Security number in the United States is used as a de facto national ID number.
You know, that's why the bank always asks for your social. Like, any financial institution will ask for your Social Security number.
>> It sounds so insane that this isn't chased down. I mean
>> I agree
>> that in and of itself is such mishandling.
>> Yes. No, it's mind-blowing. Um, so yeah, it's crazy.
>> Well, you were very reluctant last time
you were here to talk about the extent
of some of the fraud because you're
like, they could kill me because this is
kind of
>> Oh, what? Yeah. What I was saying is that, um, I like to be pragmatic and realistic. You actually can't manage to zero fraud. You can manage to a low fraud number, but not to zero fraud. If you manage to zero fraud, you're going to push so many people over the edge who are receiving fraudulent payments that the number of inbound homicidal maniacs will be really hard to overcome. So I'm actually taking, I think, quite a reasonable position, which is that we should simply reduce the amount of fraud, which I think is not an extremist position. Um, and we should aspire to, you know, have less fraud over time. Not that we should be ultra-draconian and eliminate every last scrap of fraud, which I guess would be nice to have, but we don't even need to go that extreme. I'm saying we should just stop the blatant, large-scale, super obvious fraud.
>> I think that's a reasonable position.
>> It's a very reasonable position. Yeah.
And so what was the most shocking pushback that you got when you started implementing Doge? When you started investigating where money was going?
Well, um, I guess I should have anticipated this, but while most of the fraudulent government payments, especially to the NGOs, go to the Democrats, most of it, like, I don't know, for argument's sake let's say 80%, maybe 90%, um, 10 to 20% of it does go to Republicans.
And so when we'd turn off funding to a fraudulent NGO, we'd get complaints from whatever the 10% of Republicans who were receiving the money, and they would, you know, they would very loudly complain. Um, because the honest answer is the Republicans are partly receiving some of the fraud, too. They're getting a big
>> Jesus.
Yeah. I want to be clear. It's not like the Republican Party is some ultra-pure paragon of virtue here.
>> No.
>> Okay. Um
>> Well, you see that with the congressional insider trading. It's across the board.
>> Yeah.
>> It's left and right.
>> I mean, the whole uniparty criticism has some validity to it. You know, and it's like, if you turn off fraudulent payments, like I say, it's not like 100% of those payments were going to Democrats. A small percentage were also going to Republicans. Those Republicans complained very loudly.
Um, and, you know, so there was a lot of pushback on the Republican side when we started cutting some of these funds, and I tried telling them, like, well, you know, 90% of the money is going to your opponents. But still, even if they're getting 10% of
>> they want their piece.
>> Yeah. They want their piece
>> and they've been getting that piece for a long time.
>> Yes.
Did you see
>> this is why like you know politics is
like
>> it's dirty business.
>> Yeah. I mean, it's like that saying: if you like sausages and respect the law, do not watch either of them being made
>> yeah. Wow.
Well that's not even true because I've
made sausage before.
>> Yeah. Yeah. It's actually like it's not
that big a deal. Yeah. fat and spices
and casing,
>> run it through the machine. Not that big
a deal.
>> Yeah. Um, but yeah, I mean, I think the stuff I'm saying here, if you stand back and think about it for a second, it's like, oh yeah, that makes sense, you know.
>> Um, it's not like one political party is going to be, you know, pure devil or pure angel. There's, you know, I think there's much more corruption on the Democrat side, but there's still some corruption on the Republican side.
>> How did it happen that the majority of
the corruption wound up being on the
Democrat side?
>> Well, because the transfer payments, especially to illegals, are very much on the Democrat side.
>> So that's the root of it all, the illegal situation.
>> Yes. I mean, there's
>> or a focal point.
>> Yes. It would also be accurate to say that while obviously not everyone who is a Democrat is a criminal, almost everyone who is a criminal is a Democrat, because the Democrats are the soft-on-crime party. So if you're a criminal, who are you going to vote for?
>> Right.
Right.
>> The soft-on-crime party. Did you think you were going to be able to get more done than you were?
>> Um, we did get a lot done,
>> right?
>> Um, and Doge is still happening, by the way. Doge is still underway. There's still waste and fraud being cut by the Doge team. So, it hasn't stopped. Um, the
>> it's less publicized.
>> It's less publicized. Um, and they don't have, like, a clear person to attack anymore.
>> Well, it seems like they basically applied immense pressure to me to just stop it. So then I'm like, the best thing for me is to just, you know, cut out of this. In any case, as a special government employee, I could only be there for like 120 days anyway, something like that, whatever the law says. So I necessarily could only be there for four months as a special government employee. So, um, but yeah, I mean, you turn off the money spigot to fraudsters, they get very upset, to say the least. Um, and my death threat level went ballistic, you know, like a rocket going to orbit. Um, yeah. So, but now that I'm not in DC, I guess they don't really have a person to attack anymore.
>> Um,
>> well the rhetoric about you has calmed
down significantly.
>> Yeah,
>> it was disturbing. It was disturbing to
watch. It was like this is crazy.
>> And to watch these politicians engage in
it and all these people just like
framing you as this monster. I was like
this is so weird. Like this is what
happens when you uncover fraud.
>> Yes.
>> The whole machine turns on you. And if it wasn't for a person like you, who owns a platform and has an enormous amount of money, they could have destroyed you.
>> Yeah.
>> And that was the goal.
>> The goal was to destroy me. Absolutely.
>> Because you were getting in the way
>> of this amazing graft.
>> The the this gigantic fraud machine.
>> Yeah.
>> Um, like, I think the Doge team's done a lot of good work. Um, you know, and in terms of fraud and waste prevented, my guess is it's, you know, probably on the order of two or three hundred billion a year. So, it's pretty good.
>> What do you think could have been done if you just had, like, full rein and total cooperation? How much do you think you could have saved?
>> I mean, what level of power are we assuming here?
>> Godlike.
>> Oh, yeah. I'd probably cut the federal budget in half and get more done.
That is so crazy. It is so crazy that you could cut the federal budget in half and get more done. It's that widespread.
>> Well, I mean, a whole bunch of government departments simply shouldn't exist, in my opinion. Um, you know
>> like examples
>> well, the Department of Education, which was created relatively recently, under Jimmy Carter. Our educational results have gone downhill ever since it was created. So if you create a department, and the result of creating that department is a massive decline in educational results, and it's the Department of Education, you're better off not having it, because we literally did better before there was one than after
>> when you let the states run it.
>> Yes.
>> Yeah.
>> Because at least the states can compete with one another. Um, but the problem is, people hear "cutting the Department of Education" and say, our kids need education. Yeah, they do. But this is a new department that didn't even exist, you know, until the late 70s. Um, and ever since that department was created, educational results have declined.
And so why would you have an institution continue that has made education worse? It doesn't make sense.
>> They killed it though, right? No, it's still there, unfortunately.
>> But they were trying to kill it.
>> It has been substantially reduced.
>> Okay.
>> Um
>> what other organizations
what other departments?
>> Well, I mean, I'm a small-government guy. So, um, you know, when the country was created, we just had the Department of State, the Department of War, and sort of the Department of Justice. We had an Attorney General, and the Treasury Department. Um, I don't know why you need more than that.
>> So what other departments specifically
do you think are just completely
ineffective?
>> Well, I mean, here it's like a question. It's a sort of philosophical question of how much government do you think there should be?
>> Right.
>> Um, in my opinion, there should be the least amount of government. I've heard the most bizarre argument against this, that you're cutting jobs and you're going to leave people jobless. And I'm like, but their jobs are useless.
>> Yeah. Paying people to do nothing doesn't make sense. Um, there's a great story about Milton Friedman, who is awesome. Generally, whatever Milton Friedman said is, you know, the thing people should do. Uh, I'm not sure if it's apocryphal or not, but someone complained to him. He observed, I think, people that were digging ditches with shovels, and allegedly Friedman said, "Well, I think you should use excavating equipment instead of shovels and you could get it done with far fewer people." And then someone said, "But then we're going to lose a lot of jobs." And then Friedman says, "Well, in that case, why don't you have them use teaspoons?"
Just dig ditches with teaspoons. Think of all the jobs you'll create.
>> I mean, it's [ __ ] Basically, you just want people to work on things that are productive. You want people to work on building things, on providing products and services that people find valuable, like, you know, making food, being a farmer or a plumber or an electrician, or just anyone who's a builder or providing useful services. Um, that's what you want people to be doing, not fake government jobs that don't add any value or may subtract value. Um
Um, there's also, you know, to illustrate the absurdity, the question of how the economy is measured. The way economists measure the economy is nonsensical, because they'll count any job, even if that job is a dumb job that has no point and is even counterproductive. So the joke is, there's two economists going on a hike in the woods, and they come across a pile of [ __ ] and one economist says to the other, "I'll pay you $100 to eat that shit."
The economist eats the [ __ ] gets the $100. They keep walking. Then they come across another pile of [ __ ] And the other economist says, "Now I'll pay you $100 to eat that pile of shit." So he pays the other economist $100, who eats the pile of [ __ ] Then they say, like, wait a second, we both just ate a pile of [ __ ] and we don't have any more money. You just gave the $100 back to me and we both ate a pile of [ __ ] This doesn't make any sense. And they said, "No, no, but think of the economy," because that's $200 counted in the economy. Basically, eating [ __ ] would count as a job.
This is to illustrate the absurdity of economics. Things like that should not count as a job.
>> One of the things you said when you stepped away is that you're kind of done and that it's unfixable. That, well, under its current form, the way people are approaching it, you can make it directionally better, but ultimately you can't fully fix the system. Um,
So, uh, it would be accurate to say that unless you could go, like, super draconian, like, you know, Genghis Khan level on cutting waste and fraud, which you can't really do in a democratic country, an aspirationally democratic country, then there's no way to solve the debt crisis.
So, we've got national debt that's just insane, where the interest payments on the debt exceed our entire military budget.
I mean, that was one of the wake-up calls for me. I was like, "Wait a second. The interest on the national debt is bigger than the entire military budget, and growing. This is crazy." So even if you implement all these savings, you're only delaying the day of reckoning for when America goes bankrupt, unless you go full Genghis Khan, which you can't really do.
So I came to the conclusion that the only way to get us out of the debt crisis and to prevent America from going bankrupt is AI and robotics.
So, like, we need to grow the economy at a rate that allows us to pay off our debt. Um, and I guess people just generally don't appreciate the degree to which, you know, government overspending is a problem. Um, but even the Social Security website, this is under the Biden administration, on the website it would say that, based on current demographic trends, and how much money Social Security is bringing in versus how many Social Security recipients there are, because we have an aging population, relatively speaking the average age is increasing, Social Security will not be able to maintain its full payments, I think, by 2032.
Okay. So Social Security will have to start reducing the amount of money that's being paid to people in about seven years.
>> And so the only way to fix that is robotics, manufacturing, raise GDP
>> You've got to basically massively increase economic output, and the only way to do that is AI and robotics. So, basically, we're going bankrupt without AI and robotics. Even with a bunch of savings, the savings, like reducing waste and fraud, can give us a longer runway, but they cannot ultimately pay off our national debt.
>> So what do you think the solution is to the jobs that are going to be lost because of AI and robotics? The jobs lost to automation, the jobs lost because we no longer need human beings to do them, because AI is doing them. Do you think it's going to be some sort of a universal basic income thing? Do you think there's going to be some other kind of solution that has to be implemented, because a lot of people are going to be out of work, right?
Um, I think there will actually be a high demand for jobs, but not necessarily the same jobs.
So, I mean, this process has actually been happening throughout modern history. I mean, doing calculations manually, with, like, a pencil and paper, used to be a job. They used to have buildings full of people called computers, where at the banks all you'd do all day is calculations, because they didn't have digital computers.
>> Yeah. Well, it was just people who would add and subtract stuff on a piece of paper, and that would be how banks would do, you know, financial processing
>> and you'd have to literally go over their equations to make sure the books are balanced.
>> Yeah. And most of the time it's just simple math. Like, in a world before computers, how did you do transactions? You had to do them by hand.
Um, so then when computers were introduced, the job of doing bank calculations no longer existed. So people had to go do something else. Um, and that's what is happening at an accelerated rate due to AI and then robotics.
That's the issue though, right? The accelerated rate, because it's going to be
>> it's the accelerator. It's just happening. Like I said, AI is the supersonic tsunami.
>> That's what I call it, the supersonic tsunami. Um, so
>> it's like what other jobs will be
available that aren't available now
because of AI?
>> Um, well, AI is really still digital. Ultimately, AI can improve the productivity of humans who build things with their hands or do things with their hands, you know, literally welding, electrical work, plumbing, anything that's physically moving atoms, like cooking food or farming. Anything that's physical, those jobs will exist for a much longer time. But anything that is digital, which is just someone at a computer doing something, AI is going to take over those jobs like lightning,
>> coding, anything along those lines.
Yeah,
>> it's going to take over those jobs like lightning. Um, just like digital computers took over the job of people doing manual calculations, but much faster.
>> So what happens to all those people? Like, what kind of numbers are we talking about? You're going to lose most drivers, right? Commercial drivers. You're going to have automated vehicles, AI-controlled systems, just like there's certain ports in China, I think in Singapore, where everything's completely automated.
>> Yeah. Mostly. Yeah. Yeah.
>> Yeah. So, you're going to lose a lot of those jobs. Longshoreman jobs, trucking, commercial drivers.
>> Yeah. Yeah. I mean, we actually do have a shortage of truck drivers, but there's actually, um
>> Well, that's why California has hired so many illegals to do it. Have you seen those numbers?
>> Yeah. Um, I mean, the problem is, when people don't know how to drive a semi-truck, which is actually a hard thing to do, then they crash and kill people.
>> Yeah.
>> Um, a friend of mine's wife was killed by an illegal driving a truck, and she was just out biking. Um, and there was an illegal, he didn't know how to drive the truck or something, and he ran her over.
Um, so, I mean, the thing is, you can't let people drive, you know, an 80,000-pound semi if they don't know how to do it.
But in California, they're just letting people do it
>> because they need people to do it.
>> Well, they also want the votes and that kind of thing. But yeah, like, cars are going to be autonomous. Um, but there are just so many desk jobs where what people are really doing is processing email or answering the phone. Um, and just anything that isn't moving atoms, anything that is not physical work, will obviously be the first thing; those jobs will be, and are being, eliminated by AI at a very rapid pace.
Um, and ultimately, working will be optional, because you'll have robots plus AI, and we'll have, in a benign scenario, universal high income, not just universal basic income. Universal high income, meaning anyone can have any products or services that they want.
>> So you
>> but there will be a lot of trauma and disruption along the way.
>> So you anticipate, from that, that the economy will be boosted to such an extent that a high income would be available to almost everybody. So we'd essentially eliminate poverty
>> um, in the benign scenario, yes. So like the way
>> there's multiple scenarios.
>> There are multiple scenarios. There's a lot of ways this movie can end. Um, like, the reason I'm so concerned about AI safety is that one of the possibilities is the Terminator scenario. It's not 0%.
Um, so that's why I'm really banging the drum on: AI needs to be maximally truth-seeking. Don't force AI to believe a lie, like, for example, that the founding fathers were actually a group of diverse women, or that misgendering is worse than nuclear war. Because if that's the case, and then you get the robots and the AI becomes omnipotent, it can enforce that outcome, and then, unless you're a diverse woman, you're out of the picture. So we're toast. So that's
>> Um, or you might wake up as a diverse woman one day. It has adjusted the picture and we are now
>> everyone's a diverse woman. So that would be the worst possible situation.
So what would be the steps that we would have to take in order to implement the benign solution, where it's universal high income? Like, best-case scenario, this is the path forward to universal high income for essentially every single citizen, the economy gets boosted by AI and robotics to such an extent that no one ever has to work again. And what about meaning for those people? Which gets really weird.
>> Yeah.
>> I don't know how to answer the question
about meaning. Um
>> that's an individual problem, right? But
it's going to be an individual problem
for millions of people.
>> Yeah.
Well, I mean, I guess I've been a voice saying, like, "Hey, we need to slow down AI. We need to slow down all these things. Um, and we need to, you know, not have a crazy AI race." I've been saying that for a long time, for 20-plus years. Um, but then, you know, I came to realize that really there's two choices here: either be a spectator or a participant. And if I'm a spectator, I can't really influence the direction of AI. But if I'm a participant, I can try to influence the direction of AI and have a maximally truth-seeking AI with good values that loves humanity. And that's what we're trying to create with Grok at xAI. And, um, you know, the research is, I think, bearing this out. Like I said, when they compared how AIs value the weight of a human life, Grok was the only one of the AIs that weighted human life equally, and didn't say, like, a white guy is worth 1/20th of a black woman's life. Literally, that's the calculation they came up with.
>> So, I'm like, this is very alarming. We've got to watch this stuff.
>> So, this is one of the things that has
to happen in order to reach this benign
solution.
>> Yeah. I just keep
>> Best movie ending. Yeah. Um, you want a curious, truth-seeking AI. Um, and I think a curious, truth-seeking AI will want to foster humanity, because we're much more interesting than a bunch of rocks. Like, I love Mars, you know, but Mars is kind of boring. It's just a bunch of red rocks. Um, it does some cool stuff. It's got the biggest ravine and the tallest mountain. Um, but there are no animals or plants, and there are no people. Um, so humanity is just much more interesting, if you're a curious, truth-seeking AI, than no humanity. It's just much more interesting. Um, I mean, as humans, we could, for example, go and eliminate all chimps. If we put our minds to it, we could go out and annihilate all chimps and all gorillas, but we don't. Um, there has been encroachment on their environment, but we actually try to preserve the chimp and gorilla habitats. Um, and I think in a good scenario, AI would do the same with humans. It would actually foster human civilization and care about human happiness.
So this is, um, the thing to try to achieve, I think. Um,
>> But what does the landscape look like if you have Grok competing with OpenAI, competing with all these different... like, how does it work? Like, what if you have AIs that have been captured by ideologies that are side by side competing with Grok? Like, how do we... So this is one of the reasons why you felt like it's important to not just be an observer but participate, and then have Grok be more successful and more potent than these other applications. Yes, as long as there's at least one AI that is maximally truth-seeking, curious, and, for example, weighs all human lives equally and does not favor one race or gender, then people are able to look at, you know, Grok at xAI, compare that, and say, "Wait a second, why are all these other AIs being basically sexist and racist?" Um, and then that causes some embarrassment for the other AIs, and then they tend to improve, just in the same way that acquiring Twitter and allowing the truth to be told, and not suppressing the truth, forced the other social media companies to be more truthful. In the same way, having Grok be a maximally truth-seeking, curious AI will force the other AI companies to also be more truth-seeking and fair.
>> And the funniest thing is, even though the socialists and the Marxists are in opposition to a lot of your ideas, if this gets implemented and you really can achieve universal high income, that's the greatest socialist solution of all time. Like, literally no one will have to work. Uh, correct. Um, like I said, there is a benign scenario here, which I think people will probably be happy with, as long as we achieve it, which is sustainable abundance.
Um, which is, if you ask people, like, what's the future that you want, um, I think a future where we haven't destroyed nature. Like, we still have the national parks, the Amazon rainforest is still there, we haven't paved the rainforest, the natural beauty is still there, but people nonetheless have abundance. Everyone has abundance, everyone has excellent medical care, everyone has whatever goods and services they want.
>> And we just
>> It kind of sounds like heaven.
>> It sounds like the ideal socialist utopia. And this idea that the
only thing you should be doing with your
time is working in order to pay your
bills and feed yourself sounds kind of
archaic considering the kind of
technology that's at play.
>> Yeah.
>> Like a world where that's not your
concern at all anymore. Everybody has
money for food. Everybody has abundance.
Everybody has electronics in their home.
Everybody essentially has a high income.
Now you can kind of do whatever you
want. And your day can now be exploring
your interests doing things that you
actually enjoy doing. Your purpose just
has to shift. Instead of, you know, I'm
a hard worker and this is what I do and
that's how I that's how I define myself.
No. Now you can [ __ ] golf all day,
you know? You can whatever it is that
you enjoy doing can now be your main
pursuit.
>> Yeah.
>> Well, that sounds crazy good.
>> Yeah, that's that's that's the benign
scenario that we should be.
>> The best ending to the movie is actually
pretty good.
>> Yes. Um, like, I think there is still this question of meaning, um, of, like, making sure people don't lose meaning, you know. So hopefully they can find meaning in ways that are not derived from their work
>> and purpose. Find things that you do that you enjoy. But there's a lot of people that are independently wealthy that spend most of their time doing something they enjoy
>> right
>> and that could be the majority of people
>> pretty much everyone.
>> But we'd have to rewire how people
approach life.
>> Mhm.
>> Which seems to be, like, acceptable, because you're not asking them to be enslaved. You're asking them exactly the opposite. Like, no longer be burdened by financial worries.
financial worries.
Now go do what you like.
>> Yes.
>> Go [ __ ] test pizza.
>> Do whatever you want.
>> Um, pretty much. Um, so that's probably the best-case outcome.
>> That sounds like the best-case outcome, period, for the future. If you're looking at how much people have struggled just to feed themselves all throughout history, food, shelter, safety, if all of that stuff can be fixed, like, how much of the crime would you solve if there was a universal high income? Just think of that. Like, how much of crime is financially motivated? You know, the greater percentage of people that are committing crimes live in poor, disenfranchised neighborhoods.
>> So if there's no such thing anymore, if you really can achieve universal high income,
>> yeah,
>> this sounds utopian.
>> Yes. Um, I think some people may commit crime because they like committing crime. Some amount of that is they just
>> wild people out there.
>> Yeah. Yeah. Um
>> and obviously, if they've become 40 years old living a life like that, now all of a sudden universal high income is not going to completely stop their instincts.
>> Yeah. Um, I mean, I guess if you want to, say, read a science fiction book, or some books that are probably the most accurate, or the least inaccurate, version of the future, I'd recommend the Iain Banks books called the Culture books. It's not actually a series; it's, like, AI sci-fi books about the future. They're generally called the Culture books. Iain Banks' Culture books. It's worth reading those.
>> When did he write these?
>> He started writing them in the 70s. Um, and I think the last one was written around, I don't know, maybe 2010 or something. I'm not sure exactly.
>> Yeah. Yeah.
>> Scottish author Iain Banks, from '87 to 2012.
>> Yeah. Interesting.
>> But, like, he wrote his first book, Consider Phlebas... like, he started writing that in the 70s.
>> These books are incredible, by the way.
>> Oh,
>> incredible books.
>> 4.6 stars on Amazon.
>> Interesting.
>> So,
um,
>> so this gives me hope.
>> Uh, yeah. Yeah.
>> This is the first time I've ever thought
about it this way.
>> Yeah. Well, I mean,
I often ask people, "What is the future that you want?" And they have to think about it for a second, because, you know, they're usually tied up in whatever the daily struggles are. But you say, "What is the future that you want? What about a future where there's sustainable abundance?" And generally these folks say, "Oh, yeah, that's a pretty good future." Um, so, you know, that future is attainable with AI and robotics, but, like I said, not every path is a good path. I think if we push it in the direction of maximally truth-seeking and curious, then I think AI will want to take care of humanity and foster humanity, because we're interesting, and if it hasn't been programmed to think that, like, all straight white males should die,
which Gemini was basically programmed to do, at least at first. Um, you know, they seem to have fixed it. I hope they fixed it.
>> But don't you think culturally
like, oh, we're getting away from that
mindset and that people realize how
preposterous that all is.
>> We are getting away from it. Um,
so, uh, we are getting... at least the AI mostly knows to hide things. But like I said, I think I still have that, or I had that, as my pinned post on X, which was like, hey, wait a second, guys, we still have every AI except Grok saying that basically straight white males should die, and this is a problem and we should fix it. Um, and, you know, simply me saying that tends to generally result in, you know, them going, that is kind of bad, uh, maybe we should not have all straight white males die. Um, I think they also say all straight Asian males should die as well.
like generally the generally the AI and
the and the media which which back back
in the day the
the media was um you know racist against
uh black people and sexist against women
back in the day. Now now it is racist
against uh white people and Asians and
sexist against men.
>> Um so they just like being racist and
sexist. I think they just want to change
the target. Um so uh but but really they
just shouldn't be uh racist and sexist
at all. Um you know
>> ideally that would be nice.
>> That would be nice. Um, and it's kind of crazy that we were kind of moving in that general direction till around 2012
>> and then everything ramped up online and everybody was accused of being a Nazi and everybody was transphobic and racist and sexist and homophobic, and everything got exaggerated to the point where it was this wild witch hunt where everyone was a Columbo looking for racism.
>> Yeah. Yeah. Yeah. Totally. Um, well, but they were openly anti-white and often openly anti-Asian.
>> And then this new sentiment that you cannot be racist against white people because racism is power and influence.
>> Okay. No, it's not.
>> Yeah. Racism is racism, in the absolute. Um, so, you know, there just needs to be consistency. So if it's okay to have, let's say, black or Asian or Indian pride, it should be okay to have white pride, too.
>> Yeah. Um, so that's just a consistency question. Um, so, you know, if it's okay to be proud of one religion, it should be okay to be proud of, I guess, all religions, provided they're not, like, oppressive.
>> Yeah. Or, as long as part of that religion is not, like, exterminating people who are not in that religion type of thing,
>> right? Um, so it's really just, like, a consistency thing, or ensuring consistency to eliminate bias. Um, so if it is possible to be racist against one race, it is possible to be racist against any race.
Um so
>> of course logically.
>> Yes.
>> Yeah. And arguing against that, that's when you know you're catching
>> it's a logical inconsistency that makes AIs go insane
>> and people
>> and people go insane. Yes.
>> Um
>> But, like, you can't simultaneously say that there's systemic racist oppression but also that races don't exist, that race is a social construct. Like, which is it? You know, um, you also can't say that, you know, anyone who sets foot in America is automatically an American, except for the people that originally came here.
>> Exactly. Exactly. Except for the
colonizers.
>> Yeah. Except for the evil colonizers who
came here,
>> right?
>> So which one is it? Like, if as soon as you set foot in a place you are just as American as everyone else,
>> then, um, if you apply that consistently, the original white settlers were also just as American as everyone else.
>> Yeah. Logically.
>> Logically. Um, one more thing that I
have to talk to you about before you
leave is the rescuing of the people from
the space station, which, uh, we talked
about, you were planning it the last
time you were here.
>> Um, the lack of coverage that that got in mainstream media was one of the most shocking things that
>> Yeah, they totally memory-holed that thing.
>> Wild. Yes. Because if it wasn't,
>> it's like it didn't exist. Those people
would be dead. They'd be stuck up there.
>> Well, they'd probably still be alive, but they'd be having bone density issues because of prolonged exposure to zero gravity.
>> Well, they were already up there for
like 8 months, right?
>> Yeah.
>> Which is an insanely long time. It takes
forever to recover just from that.
>> Yeah. They're only supposed to be at the
space station for 3 to 6 months maximum.
So,
>> one of the things you told me that was
so crazy was that you could have gotten
them sooner, but
>> Yeah. But for political reasons, they did not want SpaceX or me to be associated with returning the astronauts before the election.
>> That is so wild that that's a fact.
>> First of all, that
>> we absolutely could have done it. Um so,
>> but even though you did do it and you
did it after the election, it received
almost no media coverage anyway.
>> Yes. Because nothing good can... The legacy mainstream media is essentially a far-left propaganda machine. Um, and so any story that is positive about someone who is not part of the sort of far-left tribe will not get any coverage.
>> I could save a busload of orphans and it wouldn't get a single news story.
>> Yeah, it really is nuts. It was nuts to watch, because even though it was discussed on podcasts and it was discussed on X and on social media, it was still a blip in the news cycle. It was very quick. It was in and out, and because it was a successful launch and you did rescue those people, nobody got hurt and there was no blood to talk about,
>> right?
>> Just [ __ ] in and out.
>> Yeah. Yeah. Absolutely. Well, and as you saw firsthand with the Starship launch, Starship is, you know, at least some would consider it to be, like, the most amazing engineering project that's happening on Earth right now, outside of maybe AI and robotics. But certainly in terms of a spectacle to see, it is the most spectacular thing that is happening on Earth right now, the Starship launch program, which anyone can go and see if they just go to South Texas. They can rent a hotel room, low cost, in South Padre Island or in Brownsville, and you can see the launch, and you can drive right past the factory because it's on a public highway. Um, but it gets no coverage, or what coverage it does get is, like, "a rocket blew up" coverage,
>> Right? Yeah. "Oh, he's a [ __ ] The rocket blew up." Like, the Starship program is vastly
>> vastly more capable than the entire Apollo moon program. Vastly more capable. This is a spaceship that is designed to make life multiplanetary, to carry millions of people across the heavens to another planet.
The Apollo program could only send astronauts to the moon for a few hours at a time. The entire Apollo program could only send astronauts to visit the moon very briefly, for a few hours, and then depart.
The Starship program could create an entire lunar base with a million people.
You understand, the magnitudes are
>> there's very different magnitudes here.
>> So what was the political
>> basically no coverage of it.
>> Yeah. But what I wanted to ask you is, what were the conversations leading up to the rescue? Like, when you were like, I can get them out way quicker.
>> Yeah. Um, well, I mean, you know, I raised this a few times, but I was told instructions came from the White House that there should be no attempt to rescue before the election.
>> That should be illegal. That is really a horrendous miscarriage of justice for those poor people that were stuck up there.
>> Um, yeah, it is crazy. Um,
>> have you ever talked to those folks
afterwards? Did you have conversations
with them?
>> Yeah. I mean, they're not going to say anything political, you know. They're never going to
>> say thank you.
>> Yeah. Yeah. Yeah.
>> Well, that's nice.
>> Yeah. Yeah. Absolutely. So, um,
>> But the instructions came down from the White House: he cannot rescue them, because politically this is a bad hand of cards.
>> I mean, they didn't say "because politically it's a bad hand of cards." They just said they were not interested in any rescue operation before the election.
>> Yeah. So
>> What did that feel like?
>> I wasn't surprised.
>> But it's crazy.
>> Yeah,
>> because Biden could have authorized it, and they could have said the Biden administration is helping bring those people back. Throw you a little funding, give you some money to do it: the Biden administration funded these people being returned.
>> Uh, yeah, the Biden administration was not exactly my best friend,
>> especially after I, you know, helped Trump get elected, which, I mean, some people
>> still think, you know, Trump is basically the devil. And I mean, I think Trump is not perfect, but he's not evil. Trump is not evil. I spent a lot of time with him, and he's
>> I mean, he's a product of his time, but he is not evil.
>> No, I don't think he's evil either. But if you look at the media coverage,
>> the media treats him like he's super evil. It's pretty shocking if you look at the amount of negative coverage. Like, one of the things that I looked at the other day was mainstream media coverage of you, Trump, a bunch of different public figures, and then
>> 96% negative or something crazy,
>> and then Mamdani, which is like 95% positive,
>> Right? I mean, Mamdani is a charismatic swindler. I mean, you've got to hand it to him, he can light up a stage. But he has just been a swindler his entire life. And
you know, I think
he's likely to win. He's likely to be mayor of New York City.
>> Very likely.
>> Yeah. Very likely. I think Polymarket has it at, what is it,
>> 94%?
>> Yeah, that sounds pretty likely.
>> That's crazy.
>> Like, I'm not sure who the 6% are, you know.
>> So, yeah. So that's
>> What's also, like, who's on the other side? The [ __ ] guardian angel guy with the beret, and Andrew Cuomo, who doesn't even have a party. Like, the Democrats don't even want him. So you have those two options.
>> And then you have the young kids who are like, finally, socialism.
>> Yeah, they don't know what they're talking about, obviously. So, you know, you just look at how many boats come from Cuba to Florida, and how many go the other way. Because, you know, I always think, like, how many boats are accumulating on the shores of Florida coming from Cuba?
>> Right.
There's a whole bunch of free boats that you could take back to Cuba if you wanted to go. It's pretty close.
>> Yeah.
>> But for some reason people don't do that.
>> Why are the boats only coming in this direction?
>> Well, who are the most rabid capitalists in America? The [ __ ] Cubans.
>> Absolutely.
>> Yeah. They're like, we've seen how this story goes.
>> We do not want... Exactly.
>> [ __ ] off.
Cubans in Miami, they don't want to hear any [ __ ] They don't want to hear any socialism [ __ ] They're like, "No, no, no. We know what this actually is. This isn't just some [ __ ] dream."
>> Yeah. It's extreme government oppression.
>> That's how... it's a nightmare. And, like, an obvious way you can tell which ideology is the bad one is: which ideology is building a wall to keep people in and prevent them from escaping?
>> Right?
>> Like, East Berlin built the wall, not West Berlin,
>> right?
>> They built the wall because people were trying to escape from communism to West Berlin. But there wasn't anyone going from West Berlin to East Berlin,
>> right?
>> That's why the communists had to build a wall to keep people from escaping.
>> They're going to have to build a wall around New York City.
>> Yeah. So,
an ideology is problematic if that ideology has to build a wall to keep people in with machine guns,
>> Yes.
>> and shoot you if you try to leave. Also, there are no examples of it ever being successful, of it ever working out for people. No, there are examples of a bunch of lies, like North Korea: give this land to the state, we'll be in control of the food, no one goes hungry. No. Now no one can grow food but the government, and we'll tell you exactly what you eat, and you eat very little.
>> Right.
>> Yeah. What? When you say Mamdani's a swindler, I know he has a bunch of fake accents that he used to use. Yeah.
>> And, you know, what else has he done that makes him a swindler?
Well, I guess if you say to any audience whatever that audience wants to hear, instead of having a consistent message, I would say that is a swindly thing to do.
Yeah, but he is charismatic.
>> Yeah, good-looking guy. Smart, charismatic.
>> Yeah.
>> Great on a microphone.
>> Yeah. Yeah. And what the young people want to see,
>> you know, like this ethnic guy who's young and vibrant and has all these socialist ideas that align with them. And, you know, they're a bunch of broke dorks just out of college, like, "Yay, let's vote for this." And there's a lot of them, and they're activated. They're motivated.
>> I guess we'll see what happens here.
>> What do you think happens if he wins?
Because, like, 1% of New York City is responsible for 50% of their tax base, which is kind of nuts. 50% of the tax revenue comes from 1% of the population. And those are the people that you're scaring off. You know, you lose one half of that 1%.
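A rough back-of-the-envelope sketch in Python of the arithmetic behind that claim, assuming the quoted shares (1% of filers paying 50% of revenue); the population and revenue figures below are illustrative placeholders, not actual NYC data:

```python
# If the top 1% of filers pay 50% of total tax revenue, each departing
# top-1% filer takes a disproportionate share of revenue with them.
# All figures here are illustrative, not real NYC budget numbers.

population = 8_000_000          # assumed number of filers, for illustration only
total_revenue = 100.0           # normalize total tax revenue to 100 units

top_share_of_population = 0.01  # the quoted "1% of the population"
top_share_of_revenue = 0.50     # the quoted "50% of the tax revenue"

top_filers = population * top_share_of_population
revenue_per_top_filer = (total_revenue * top_share_of_revenue) / top_filers

# Suppose half of that top 1% leaves ("you lose one half of 1%"):
departing = top_filers * 0.5
revenue_lost = departing * revenue_per_top_filer

print(f"Departing filers: {departing:,.0f} ({departing / population:.2%} of all filers)")
print(f"Revenue lost: {revenue_lost:.0f}% of the total")   # -> 25% of the total
```

On those assumptions, losing just half a percent of filers would cost roughly a quarter of the revenue.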
>> Yeah. I mean, hopefully the stuff he's said, you know, about government takeovers, about how basically all the stores should be the government...
>> I don't think he said that. I think he said they want to do government supermarkets, some state-run or city-run supermarkets.
>> Yeah. Well, the government is the DMV at scale. So you have to ask: do you want the DMV running your supermarket?
>> Right. Was your last experience at the DMV amazing? And if it wasn't, you probably don't want the government doing things.
>> Imagine if they were responsible for getting you blueberries.
>> Yeah.
It's not going to be good. I mean, the thing about communism is it was all bread lines and bad shoes. You know, do you want ugly shoes and bread lines? Because that's what communism gets you.
It's going to be interesting to see what happens, and whether or not they snap out of it and overcorrect and go to some Rudy Giuliani-type character next, because it's been a long time since there was any sort of Republican leader there.
And we live in the most interesting of times,
because we simultaneously face civilizational decline
and incredible prosperity,
and these timelines are interwoven.
So if Mamdani's policies are put into place, especially at scale, it would be a catastrophic decline in living standards, not just for the rich but for everyone,
as has been the case with every socialist experiment.
But then, as you pointed out, the irony is that the ultimate capitalist thing, AI and robotics enabling prosperity for all and an abundance of goods and services, the capitalist implementation of AI and robotics, assuming it goes down the good path,
is actually what results in the communist utopia.
Because fate is an irony maximizer,
>> Right? And an actual socialism of maximum abundance, of high-income people.
>> Universal high income.
>> Yeah.
>> Like, the problem with communism is universal low income. It's not that everyone gets elevated, it's that everyone gets oppressed, except for a very small minority of politicians who live lives of luxury. That's what's happened every time it's been done.
>> Yeah. So,
but then the actual communist utopia, where everyone gets anything they want, will be achieved, if it is achieved, via capitalism,
>> because fate is an irony maximizer.
>> I feel like we should probably end it on that. Is there anything else? The most ironic outcome is the most likely, especially if entertaining.
>> Well, everything has been entertaining. As long as the bad things aren't happening to you, it's quite fascinating. And it's never a boring moment.
>> Yes. So, I do have a theory of why. Like, if simulation theory is true, then it is actually very likely that the most interesting outcome is the most likely, because only the simulations that are interesting will continue. The simulators will stop any simulations that are boring, because they're not interesting.
>> But here's the question about the simulation theory. Is the simulation run by anyone, or is
>> It would be run by someone.
>> It would be run by
>> some force,
>> the program. Like, in this reality that we live in, we run simulations all the time. So when we try to figure out if the rocket's going to make it, we run thousands, sometimes millions, of simulations just to figure out which path is the good path for the rocket, and where it can go wrong, where it can fail. But when we do these, I'd say at this point, millions of simulations of what can happen with the rocket, we ignore the ones where everything goes right, because we just care about, we have to address, the situations where it goes wrong.
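A minimal Python sketch of the kind of Monte Carlo workflow described there: run many randomized flight cases, discard the nominal runs, and keep only the failures for review. The `simulate_flight` function is a stand-in for illustration, not SpaceX's actual tooling:

```python
import random

def simulate_flight(seed: int) -> dict:
    """Stand-in for one randomized flight simulation.

    A real tool would integrate vehicle dynamics under randomly perturbed
    winds, engine performance, mass properties, and so on; here we just
    draw a single survival margin to illustrate the workflow.
    """
    rng = random.Random(seed)
    margin = rng.gauss(mu=1.0, sigma=0.4)   # margin > 0 means the vehicle survives
    return {"seed": seed, "margin": margin, "failed": margin <= 0.0}

# Run a large batch of randomized cases (the conversation mentions thousands
# to millions of runs; a smaller count keeps this sketch fast).
runs = [simulate_flight(seed) for seed in range(100_000)]

# As described above: ignore the runs where everything goes right and keep
# only the failures, since those are the cases engineers have to address.
failures = [run for run in runs if run["failed"]]

print(f"{len(failures)} failures out of {len(runs)} runs "
      f"({len(failures) / len(runs):.2%} failure rate)")
# The failed seeds and their parameters are what get inspected to find
# the operational limits that caused them.
```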
So, basically, for AI simulations as well, all of these things, we keep the simulations going that are the most interesting to us.
So if simulation theory is accurate, if it is true, who knows, then the simulators will only continue to run the simulations that are most interesting.
Therefore, from a Darwinian perspective, the only surviving simulations will be the most interesting ones. And in order to avoid getting turned off, the only rule is: you must keep it interesting, because the boring simulations will be terminated.
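A toy Python sketch of that selection argument: if simulations below some interestingness threshold are switched off at every step, the surviving population is interesting by construction. The scoring and the cutoff are made-up assumptions purely to illustrate the reasoning, not a claim about how any real simulator would work:

```python
import random

random.seed(0)

# Start with many simulations, each with a random "interestingness" score in [0, 1].
simulations = [random.random() for _ in range(10_000)]

BORING_CUTOFF = 0.5  # assumed threshold below which a simulation gets terminated

for step in range(5):
    # Each step, the boring simulations are switched off...
    simulations = [s for s in simulations if s > BORING_CUTOFF]
    # ...and the survivors drift a little, so scores keep changing.
    simulations = [min(1.0, max(0.0, s + random.uniform(-0.05, 0.05)))
                   for s in simulations]

mean_score = sum(simulations) / len(simulations)
print(f"{len(simulations)} simulations survive; mean interestingness {mean_score:.2f}")
# The survivors are, by construction, the interesting ones -- the Darwinian
# filter described in the conversation.
```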
>> Are you still completely convinced that this is a simulation?
>> I didn't say I was completely convinced.
>> Well, you said it's like the odds of it not being one are in the billions. But I guess it's not completely, because you're saying there's a chance.
>> What are the odds that we're in base reality?
Well, given that we're able to create increasingly sophisticated simulations. So if you think of, say, video games, and how video games have gone from very simple games like Pong, with two rectangles and a square, to video games today being photorealistic,
with millions of people playing simultaneously, and all of that has occurred in our lifetime.
So if that trend continues,
video games will be indistinguishable from reality. The fidelity of the game will be such that you don't know if what you're seeing is a real video or a fake video. And with AI-generated videos at this point, you can sometimes tell it's an AI-generated video, but often you cannot tell, and soon you will just not be able to tell. So
if that's happening in our direct observation,
and we'll create millions, if not billions, of photorealistic simulations of reality, then
what are the odds that we're in base reality,
versus someone else's simulation?
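The counting argument being gestured at here, written out under its own assumptions: if there were N simulated realities indistinguishable from the base one, and you adopt a uniform prior over which of them you occupy, the chance of being in base reality shrinks with N. The N of a billion below just echoes the "odds in the billions" phrasing above; it is an example value, not a measured quantity.

```latex
% Sketch of the self-location counting argument, under its assumptions:
% N indistinguishable simulated realities plus one base reality,
% with a uniform prior over which of them "we" occupy.
P(\text{base reality}) = \frac{1}{N + 1},
\qquad \text{e.g. } N = 10^{9} \;\Rightarrow\; P(\text{base reality}) \approx 10^{-9}.
```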
Well, isn't it just possible that the simulation is inevitable, but that we are in base reality, building towards a simulation?
We're making simulations.
So, we're making simulations. You can just think of photorealistic video games as being simulations.
>> Mhm. And especially as you apply AI in these video games, the characters in the video games will be incredibly interesting to talk to. They won't just have a limited dialogue tree, where if you go to, like, the crossbow merchant and you try to talk about any subject except buying a crossbow, they just want to talk about selling you a crossbow. But with AI-based non-player characters, you'll be able to have an elaborate conversation with no dialogue tree.
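A minimal Python sketch of that contrast: a classic dialogue tree can only answer from a fixed set of scripted nodes, while an AI-backed NPC hands free-form input to a language model. The `llm_reply` function below is a hypothetical placeholder for whatever model call a game would actually wire in:

```python
# Classic dialogue tree: the crossbow merchant can only follow fixed branches.
DIALOGUE_TREE = {
    "start": {
        "text": "Finest crossbows in the realm. Care to buy one?",
        "options": {"buy": "sale", "leave": "goodbye"},
    },
    "sale": {"text": "Twenty gold pieces. A bargain.", "options": {}},
    "goodbye": {"text": "Suit yourself.", "options": {}},
}

def tree_npc(node: str, player_choice: str) -> str:
    """Anything outside the scripted options falls back to the same pitch."""
    next_node = DIALOGUE_TREE[node]["options"].get(player_choice, "start")
    return DIALOGUE_TREE[next_node]["text"]

def llm_reply(persona: str, player_utterance: str) -> str:
    """Hypothetical stand-in for a call to a language-model API."""
    return f"[model-generated reply as {persona} to: {player_utterance!r}]"

def ai_npc(player_utterance: str) -> str:
    """An AI-backed NPC can respond to arbitrary topics, no tree required."""
    return llm_reply("a crossbow merchant", player_utterance)

print(tree_npc("start", "what do you think of the king?"))  # -> sales pitch again
print(ai_npc("what do you think of the king?"))             # -> free-form answer
```

The fixed tree falls back to its sales pitch for anything off-script; the model-backed merchant can address the question directly.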
Well, that might be the solution for meaning for people. Just lock in, and you could be a [ __ ] vampire or whatever. You live in Avatar land. You could do it. You could do whatever you want. I mean, you don't have to think about money or food.
>> Ready Player One.
>> Yeah. Literally. Yeah. But with higher living standards.
>> Yeah.
>> You don't have to be in a little trailer.
>> I mean, I think people do want to have some amount of struggle, or something they want to push against. But it could be, you know, playing a sport or playing a game or something.
>> It could easily be playing a game, and especially playing a game where you're no longer worried about physical attributes, like athletics, like bad joints and hips and stuff like that. Now it's completely digital, but you still have meaning in pursuing this thing that you're doing all day.
Whatever the [ __ ] that means.
It's going to be weird.
>> It's going to be interesting.
>> It's going to be very interesting.
>> The most interesting
>> and usually ironic outcome is the most likely.
>> All right.
>> That's a good predictor of the future.
>> Thank you. Thanks for being here. Really appreciate you, appreciate your time. I know you're a busy man, so it means a lot that you came here to do this. Welcome. All right. Thank you. Bye, everybody.