Full Transcripts
00:00:00
Speaker 1
We’re 7 minutes in and we’ve
00:00:01
Speaker 1
produced absolutely nothing that will go in the show.
00:00:05
Speaker 2
It’s breaking up with.
00:00:06
Speaker 4
His with.
00:00:07
Speaker 3
His commentary.
00:00:08
Speaker 3
When Freeburg is criticizing you for being too negative, you’re in a dark place, Max.
00:00:12
Speaker 2
I’m actually angry.
00:00:13
Speaker 3
It’s sad for not publishing my animation.
00:00:16
Speaker 1
It’s coming.
00:00:17
Speaker 3
The ball.
00:00:18
Speaker 1
Crashed. We had such a.
00:00:19
Speaker 1
Crowded room we had 1000.
00:00:20
Speaker 3
That was impressive.
00:00:21
Speaker 1
People in the room for like 4 hours.
00:00:24
Speaker 3
It was like the original days of clubhouse, everyone.
00:00:27
Speaker
I know that.
00:00:27
Speaker 3
Was trying to get in but was texting saying he.
00:00:28
Speaker 3
Couldn’t get in.
00:00:29
Speaker 3
So it’s definitely capped out, right?
00:00:31
Speaker 3
I know well.
00:00:31
Speaker 1
We hit, we hit some scalability issues with it.
00:00:33
Speaker 3
You may want to buy an extra server or two, Sacks.
00:00:36
Speaker 3
Weren’t you the same guy who was responsible for scaling PayPal?
00:00:38
Speaker 3
No, that was somebody else that was in jail.
00:00:41
Speaker 3
He sold it before it scaled.
00:00:42
Speaker 1
No, that’s not true.
00:00:43
Speaker 1
We had, you know, scalability challenges at PayPal too.
00:00:46
Speaker 3
It seems like a singing.
00:00:48
Speaker 1
Yeah, the thing is, when you have a success breaking out, you hit scalability challenges.
00:00:52
Speaker 1
It’s called a high class problem.
00:00:56
Speaker 2
But the trickle? It’s 2022.
00:00:59
Speaker 1
2000% conversation is.
00:01:02
Speaker 3
So I haven’t written code in 20 years. Here’s what you do when you get to 1000.
00:01:06
Speaker 3
People coming to.
00:01:06
Speaker 3
the room, everybody else is in passive mode.
00:01:09
Speaker 2
But you’ve never written code.
00:01:11
Speaker 3
Of course I drive.
00:01:12
Speaker 3
That’s where I come up to.
00:01:13
Speaker 2
Others. So yeah, it’s actually.
00:01:14
Speaker 3
been 25 years since the last time I
00:01:15
Speaker 3
wrote code, with Lotus Notes.
00:01:19
Speaker 4
Rain Man David Sacks.
00:01:19
Speaker 1
Let your winners ride.
00:01:23
Speaker 4
We open sourced it to the fans.
00:01:34
Speaker 3
So there have been cheating scandals across poker, chess, and even competitive fishing.
00:01:40
Speaker 3
I don’t know if you guys saw the fishing one, but they put weights in the walleye.
00:01:45
Speaker 2
Well, and then.
00:01:46
Speaker 3
Everybody wants us to check in on the chess and the poker scandals.
00:01:50
Speaker 3
Chess.com just released their report.
00:01:53
Speaker 3
That this grandmaster has been suspended,
00:01:56
Speaker 3
that there’s evidence he cheated,
00:01:58
Speaker 3
Basically in a bunch of.
00:02:02
Speaker 3
Tournaments that were, in fact, for money.
00:02:05
Speaker 3
He denied that he had done that, but he had previously cheated as a kid.
00:02:07
Speaker 3
They now have the statistical proof that he was playing essentially perfect chess, and they’ve outlined this in like hundreds of pages in the report.
00:02:18
Speaker 3
So actually.
00:02:21
Speaker 1
Carlsen finally came out and explained why he thought Hans Niemann was cheating.
00:02:26
Speaker 1
Basically, he got the strong perception during the game that Hans wasn’t really putting in a lot of effort, that he wasn’t under a lot of stress.
00:02:33
Speaker 1
And it’s his experience
00:02:35
Speaker 1
that when he’s playing, you know, the top players, they’re intensely concentrating, and Hans Niemann didn’t seem to be
00:02:41
Speaker 1
Exerting himself at all so.
00:02:43
Speaker 1
His, you know, hackles were raised and he got suspicious. And then, you know, he has had this meteoric rise, the fastest rise in classical chess rating ever, and I guess he had gotten suspended
00:02:55
Speaker 1
on Chess.com
00:02:56
Speaker 1
In the past for cheating.
00:02:57
Speaker 1
So on this basis, and maybe other things that Magnus isn’t telling us, he said that this guy is
00:03:02
Speaker 4
cheating. I think that
00:03:03
Speaker 1
Maybe the interesting part of this is that there’s been a lot of analysis now on Niemann’s games, and I do think the methodology is kind of interesting.
00:03:10
Speaker 1
So what they do
00:03:11
Speaker 1
is they run all of his
00:03:12
Speaker 1
Games through a computer.
00:03:15
Speaker 1
And they compare his moves to the.
00:03:17
Speaker 1
Best computer move and they basically assign a percentage.
00:03:21
Speaker 1
that represents how often his move
00:03:24
Speaker 1
matches the computer move. And
00:03:25
Speaker 1
what they found is there were a handful of
00:03:27
Speaker 1
games where it was literally 100%. That’s
00:03:29
Speaker 1
basically impossible without cheating. I mean, you look at the top players, who in their entire careers have never had a 100% game. But, you know, chess is so subtle that the computer can now see so many
00:03:40
Speaker 1
Moves into the future.
00:03:41
Speaker 1
That naming the best move every single time, for 40, 45 moves, is just
00:03:45
Speaker 3
Now, one thing they found in chess, which a human really can’t do that well, is that there are
00:03:50
Speaker 3
positional sacrifices that you will make in short lines that pay off many moves later, which is impossible for a human to calculate.
00:03:58
Speaker 3
And so you know.
00:03:59
Speaker 3
And, and you saw this, by the way, with, I think it was the Google AI, the DeepMind AI, that also played chess.
00:04:06
Speaker 3
So the idea that this guy could play absolutely perfectly according to those
00:04:11
Speaker 3
Lines is only possible if you cheat.
00:04:13
Speaker 1
Right, exactly. So there were a handful of games at 100%, and then there were tournaments where his percentages were in the 70-something-plus range.
00:04:21
Speaker 1
And so, just to give you some basic comparison, Bobby Fischer during his legendary 20-game winning streak was at
00:04:28
Speaker 1
72%. So he only matched the computer’s best move 72% of the time. Magnus Carlsen playing at his best is 70%.
00:04:38
Speaker 1
Garry Kasparov over his career was 69%, and then, you know, the Super GMs category are typically in the 64 to 68% range. So I think it’s really interesting,
00:04:49
Speaker 1
actually, how you can quantify this by comparing the human move to the best computer move.
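[Editor’s note: for readers curious what this engine-correlation methodology looks like in practice, here is a minimal sketch using the open-source python-chess library driving a UCI engine such as Stockfish. The file paths, search depth, and single-game scope are illustrative assumptions; Chess.com’s actual detection model is far more sophisticated and is not public.]

```python
# Sketch: what fraction of one player's moves match the engine's top choice.
import chess
import chess.engine
import chess.pgn

def engine_match_rate(pgn_path, engine_path, color=chess.WHITE, depth=18):
    with open(pgn_path) as f:
        game = chess.pgn.read_game(f)
    board = game.board()
    engine = chess.engine.SimpleEngine.popen_uci(engine_path)
    matches = total = 0
    try:
        for move in game.mainline_moves():
            if board.turn == color:
                # Engine's preferred move in the current position.
                best = engine.play(board, chess.engine.Limit(depth=depth)).move
                matches += int(move == best)
                total += 1
            board.push(move)
    finally:
        engine.quit()
    return matches / total if total else 0.0

# Hypothetical usage; real analyses also skip opening-book moves:
# print(engine_match_rate("game.pgn", "/usr/local/bin/stockfish"))
```

[By the numbers quoted in this discussion, a clean super-GM game lands around 0.64 to 0.72 on a measure like this; a sustained 1.00 is the red flag.]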
00:04:54
Speaker 3
Whether he’s playing the best, they actually.
00:04:58
Speaker 1
It provides a way to assess who the greatest player ever is. I actually thought that it
00:05:01
Speaker 1
Was Magnus but.
00:05:03
Speaker 1
now maybe there’s a basis for believing it was Bobby Fischer ’cause he was at 72% at his extreme.
00:05:08
Speaker 1
However, like, the idea that Hans Niemann is in the 70s, 80s, or 90s during tournaments would be, you know, just an off-the-charts level of play. And if
00:05:19
Speaker 1
he’s not cheating,
00:05:20
Speaker 1
Then we should expect over the next.
00:05:22
Speaker 1
Couple of years.
00:05:23
Speaker 1
That he could.
00:05:24
Speaker 1
Rapidly become the world’s number one player over the board.
00:05:28
Speaker 1
You know, now that they have all this anti-cheating stuff, right?
00:05:31
Speaker 1
So it would be nice to see what happens in his career now that they’ve really cracked down, you know, with with anti-cheating technology.
00:05:39
Speaker 3
I have a general observation, which is these people are complete losers.
00:05:45
Speaker 3
The people that cheat in any of these games don’t understand the basic simple idea, which is that trying is a huge part of the human experience.
00:05:54
Speaker 3
The whole point is to be out there in the field.
00:05:56
Speaker 3
Of play trying.
00:05:58
Speaker 3
And it’s basically taking the wins and the losses and getting better.
00:06:02
Speaker 3
That is the path; that’s the fun.
00:06:04
Speaker 3
Once you actually win, it’s actually not that much fun because then you have this pressure of maintaining excellence that’s a lot less enjoyable than the path to getting there.
00:06:14
Speaker 3
And so the fact that these people don’t understand that makes them slightly.
00:06:19
Speaker 3
In my opinion.
00:06:21
Speaker 3
And then the other thing is, like, why is it that we have this strain of people now that are just so devoid of any personal responsibility that they will
00:06:29
Speaker 3
Just so brazenly.
00:06:30
Speaker 3
Take advantage of this stuff.
00:06:31
Speaker 3
It’s really ridiculous behavior.
00:06:33
Speaker 3
It’s really bad, these people.
00:06:35
Speaker 3
It’s weird, pathetic. But it is really interesting how they caught him, running his games against a computer.
00:06:41
Speaker 3
Here’s a chart of his scores in these tournaments.
00:06:45
Speaker 3
Over here is the first chart is how quickly he advanced, which was off the charts, and then the second chart that’s really interest.
00:06:52
Speaker 3
Thing is, his Chess.com strength. You know Chess.com, it’s become like a juggernaut in the chess world, especially after that HBO series
00:07:01
Speaker 3
Came out.
00:07:01
Speaker 3
A lot of people subscribed to it.
00:07:03
Speaker 3
Structuralist push up there.
00:07:05
Speaker 3
And then you look at the chess
00:07:06
Speaker 3
ranks that were there.
00:07:08
Speaker 3
He hits this, like, perfect score.
00:07:09
Speaker 3
And then the number of games he likely cheated on,
00:07:11
Speaker 3
You can see the last two columns.
00:07:12
Speaker 3
He’s basically cheating in every game.
00:07:15
Speaker 3
Yeah, Queen’s gambit. Yeah, great show on Netflix.
00:07:19
Speaker 3
And he said he didn’t cheat on any of the games.
00:07:24
Speaker 3
that were live-streamed, but they’ve proven that wrong, in fact.
00:07:27
Speaker 3
How did he cheat in person then?
00:07:29
Speaker 1
That’s the thing no one really knows.
00:07:30
Speaker 1
And I don’t want to overly judge it until they have hard proof that he was cheating.
00:07:36
Speaker 1
I mean, like, here’s the thing.
00:07:36
Speaker 1
He was never caught in the act.
00:07:38
Speaker 1
It’s just that the computer evidence, you know, seems pretty damning.
00:07:42
Speaker 1
And I mean.
00:07:43
Speaker 3
It’s The thing is I.
00:07:44
Speaker 1
Don’t know how they.
00:07:44
Speaker 1
Prove I don’t know how they prove that he was cheating over.
00:07:47
Speaker 1
The board without actually catching.
00:07:49
Speaker 1
Him doing it.
00:07:49
Speaker 1
That I don’t.
00:07:50
Speaker 1
I still don’t think anyone really has a good theory in terms.
00:07:53
Speaker 1
Of how he was able to do that.
00:07:54
Speaker 3
But it’s such a big deal, the fishing thing, Jason, which
00:07:57
Speaker 3
Was crazy. Checking preferred.
00:07:59
Speaker 3
Shared the video, this guy was in a fishing competition, and they basically cut open the
00:08:03
Speaker 3
fish, and then they put these big
00:08:05
Speaker 3
Weighted pellets inside the fish’s body. They even put like.
00:08:08
Speaker 3
So, you know, chicken breasts and chicken fillets inside of the the thing, so they would weigh more.
00:08:14
Speaker 3
You know.
00:08:15
Speaker 3
Then there is local info for everybody who framed that.
00:08:19
Speaker 3
There are ways in which you can read the RFIDs in some of the cards in, you know, televised situations, and front-run what the, what the playing situation is, so that you know whether you’re winning or losing.
00:08:30
Speaker 3
And again I just asked the question like.
00:08:33
Speaker 3
Is it?
00:08:33
Speaker 3
Is it?
00:08:34
Speaker 3
Is it?
00:08:34
Speaker 3
Are things that bad that this is what it gets to like?
00:08:37
Speaker 3
We all play poker, the idea that we would play against somebody that would take that edge.
00:08:43
Speaker 4
That’s correct. Yeah.
00:08:44
Speaker 3
It really makes me really sad and disappointed.
00:08:47
Speaker 3
Yeah, I think maybe one more observation
00:08:48
Speaker 3
Might be that.
00:08:51
Speaker 3
Across all three ’cause I’m.
00:08:52
Speaker 3
Trying to find some common.
00:08:53
Speaker 3
Thread across these.
00:08:54
Speaker 2
But it could be.
00:08:56
Speaker 3
That there is a lot of cheating going on for.
00:08:57
Speaker 3
A long.
00:08:58
Speaker 3
Time, and maybe the fact that we do have.
00:09:03
Speaker 3
so much digital imagery that’s live on these things now, and so much coverage, and everyone’s got a cell phone, that suddenly
00:09:11
Speaker 3
Our perception of the cheating in competitive events is becoming.
00:09:17
Speaker 3
More attuned, where maybe there’s been a lot of cheating for a long time and it’s just.
00:09:22
Speaker 3
Kind of coming.
00:09:22
Speaker 1
To light.
00:09:23
Speaker 3
I mean, we didn’t have.
00:09:24
Speaker 3
A lot of live streaming in poker?
00:09:26
Speaker 3
Who knows?
00:09:27
Speaker
I mean, we.
00:09:27
Speaker 3
Could probably ask.
00:09:28
Speaker 3
Phil this, like, has there been cheating, and for how many years? Well, there’s been, at least, in the
00:09:32
Speaker 1
Yeah, there was famously cheating in online poker. Online, remember, like, people were using these, like, software programs that
00:09:34
Speaker 2
High limit, yeah.
00:09:35
Speaker 2
And Parliament culture.
00:09:40
Speaker 1
would track the hand history of your opponents.
00:09:44
Speaker 1
God yeah, exactly so.
00:09:47
Speaker 1
So it helps you assess whether the person might be bluffing in the optimal situation.
00:09:50
Speaker 1
Yeah, he has superhuman memory, so.
00:09:52
Speaker 3
I don’t know if you guys, I don’t know if you guys watch Twitch, like, video games like Fortnite or whatever, but there are, like, players that have been accused of using these screen overlay systems that basically more accurately show you and drive the mouse to
00:10:05
Speaker 3
where the individual is on the screen,
00:10:07
Speaker 3
You need some more after they shoot.
00:10:08
Speaker 3
Them, and so the software overlays.
00:10:10
Speaker 3
that make you a better shot.
00:10:13
Speaker 3
You know.
00:10:14
Speaker 3
Can I tell you, like, the tooling around this stuff basically became, like... so now what’s interesting is there’s eye-tracking software that people are using on Twitch streams to see if the individual is actually spotting the target when they shoot, or if the software is finding the target and
00:10:31
Speaker 3
It’s called an aimbot.
00:10:34
Speaker
Yeah, and like.
00:10:35
Speaker 3
Reverse cheat the whole thing.
00:10:36
Speaker 3
And I think what’s interesting is just that there’s so much, you know, insight now.
00:10:39
Speaker 3
So much more video streams, so much more.
00:10:42
Speaker 3
I mean, think about all those guys that got fixed income positions that would have cell phones and they all videos.
00:10:43
Speaker 4
You are under tools, yeah, understood.
00:10:46
Speaker 3
This thing happening.
00:10:47
Speaker 3
Yeah, I think 10 years ago that would have.
00:10:49
Speaker 3
been the case, but there wouldn’t have been a big story about it.
00:10:51
Speaker 3
And so.
00:10:53
Speaker 3
Do where?
00:10:53
Speaker 3
Yeah, I I think the theme is, is pretty obvious, which is that there’s been an absolute decay of personal responsibility.
00:11:02
Speaker 3
People don’t feel like there’s any.
00:11:05
Speaker 3
Downside to cheating anymore and they’re not willing to take it upon themselves to take a journey of wins and losses to get better at something.
00:11:12
Speaker 3
They want the easy solution.
00:11:14
Speaker 3
The easy solve, the quick answer, you know, that gets them to some sort of finish line that they would imagine for themselves will solve all their problems.
00:11:24
Speaker 3
The problem is it doesn’t solve any problems and it just makes them a wholly corrupt individual.
00:11:30
Speaker 3
OK, so let’s talk about this Hustler Casino Live cash game play.
00:11:34
Speaker 3
This is a woman, Robbi, who is a new player.
00:11:38
Speaker 3
Apparently she’s being staked in a very high-stakes game, playing against Garrett, who is a very, very
00:11:43
Speaker 3
well-known winning cash game player.
00:11:48
Speaker 3
And it was a very strange hand.
00:11:50
Speaker 3
On the turn all the money gets in, she says she has a bluff catcher.
00:11:53
Speaker 3
Then she claims that she had thought she misread her hand.
00:11:57
Speaker 3
Now people are saying that.
00:11:59
Speaker 3
The poker world seems to be 7030 that she cheated, but people keep vacillating back and forth.
00:12:06
Speaker 3
There were a lot of weird verbal tells: she said that she had a bluff catcher, which would normally be an ace.
00:12:12
Speaker 3
Then she said she misread her threes, and then she literally said afterwards that he was giving her too much.
00:12:18
Speaker 3
They confronted her in the hallway.
00:12:19
Speaker 3
She gave the money back because he supposedly loves production.
00:12:23
Speaker 3
So all of this stuff.
00:12:23
Speaker 3
Sounds very weird.
00:12:24
Speaker 3
One side says, OK, all this was happening because she’s a noob player.
00:12:28
Speaker 3
The other side is saying somebody was signaling her that she was good and giving her just a binary.
00:12:34
Speaker 3
You’re good because if you.
00:12:35
Speaker 3
were going to
00:12:36
Speaker 3
choose to cheat, with jack-high,
00:12:37
Speaker 3
in a situation where you just put it all in, in like a quarter-million-dollar pot, seems very strange.
00:12:42
Speaker 4
In fact.
00:12:43
Speaker 3
I don’t know if.
00:12:43
Speaker 3
You guys watch the hand breakdown.
00:12:46
Speaker 3
Where does everybody stand on a percentage basis?
00:12:48
Speaker 3
I guess they’ve they’ve been disputing or not ’cause this is not definitive.
00:12:55
Speaker 3
It’s not.
00:12:55
Speaker 3
It’s not so obvious in that situation.
00:12:57
Speaker 3
But I think the way that that line played made no sense.
00:13:01
Speaker 3
Did not using.
00:13:02
Speaker 1
She was holding a jack-four, and I guess in her previous hand she had a jack-three, and there was a 3 on the board.
00:13:10
Speaker 1
So if she misread.
00:13:10
Speaker 3
Sorry, hold on. The board was 10-10-9, right?
00:13:11
Speaker 1
her hand for it.
00:13:13
Speaker 3
But you wouldn’t. You wouldn’t.
00:13:14
Speaker 3
You would have had to call the slot style thinking what?
00:13:17
Speaker 1
Yeah, no, I get it.
00:13:19
Speaker 1
The hand makes no sense, but I think trying to find a logical.
00:13:21
Speaker 3
Explanation and that Jack 3 explanation, somebody kind of fed that to her and then she changed her story to that.
00:13:28
Speaker 3
So the changing of the story is the thing I was sort of keying in on, Freeberg.
00:13:32
Speaker 3
Why she keeps changing her story, is it because she’s embarrassed?
00:13:34
Speaker 3
Maybe she’s had a couple of beverages or whatever.
00:13:38
Speaker 3
Or, as you say, she’s a new player and she’s embarrassed by her play and can’t explain it.
00:13:42
Speaker 3
She can’t explain the hand history.
00:13:43
Speaker 3
All of the things you’re saying are probable, I don’t think.
00:13:47
Speaker 3
Yeah, I don’t think there’s any data for us to have a stronger point of view on this.
00:13:51
Speaker 3
I’m just looking forward to us all playing live. Yeah, HCL poker, live, October 21st, minus David Sacks.
00:13:58
Speaker 3
Unfortunately to Mark J.
00:14:00
Speaker 3
Cole, Gardner family can still this.
00:14:03
Speaker 3
We’re going to be doing work, be talking on the same St, and seeing people I figured out how to hack into.
00:14:09
Speaker 3
The video stream.
00:14:10
Speaker 3
No, I just got my RFID
00:14:12
Speaker 3
reader glasses as well.
00:14:14
Speaker 3
All your city handshake, al.
00:14:15
Speaker 3
I’m gonna take your money.
00:14:17
Speaker 3
And let me buy my kids in my shorts.
00:14:17
Speaker 2
Thank you.
00:14:19
Speaker 3
For my 40th birthday, the guys organized poker in Tahoe, OK, and and we brought in the team from CBS
00:14:26
Speaker 3
that does the broadcasts, and they taped it as if it was being broadcast, with hole cards and commentators.
00:14:32
Speaker 3
Then we edited it into a two day show.
00:14:34
Speaker 3
It was an incredible birthday present.
00:14:36
Speaker 3
I it was.
00:14:37
Speaker 3
It’s one of the most creative
00:14:38
Speaker 3
things that that anybody’s ever given me, I appreciate it.
00:14:41
Speaker 3
There is a one-hour block where somebody at the table says, OK guys, how about we do a cheating free-for-
00:14:47
Speaker 3
all? Yes! Where you could look at each other’s cards and, you know, you could sort of help somebody else, swap cards, whatever.
00:14:55
Speaker 3
In that one hour, our beautiful.
00:14:58
Speaker 3
home game of friendship became Lord of the Flies. I
00:15:01
Speaker 2
Have never seen so much hatred.
00:15:04
Speaker 3
Angling, mean behavior. Oh my God, it was unbelievable what we were all capable of. So I I hope that we never, never, we never see cheating in our game. Well, we’ll see how it goes on October 24th at HCL poker.
00:15:20
Speaker 3
I’m excited. I can’t wait.
00:15:21
Speaker 4
It should be a lot of.
00:15:22
Speaker 3
Fun. There should be a.
00:15:23
Speaker 3
Lot of fallout and we’re not having.
00:15:25
Speaker 3
Any official 100 stuff?
00:15:27
Speaker 3
But the fans, some of the fans who were at the All-In Summit 2022, are doing their own episode-100 meetups when episode 100 drops on October 15th.
00:15:42
Speaker 3
So there are family that’s happening in zero.
00:15:44
Speaker 3
A bunch of other places are going to face problems or something.
00:15:46
Speaker 3
And just in holiday things, you know, it might be like 10 people in a bar somewhere.
00:15:50
Speaker 3
I think the largest one is like Miami or San Francisco are going to be like 50 people or something.
00:15:54
Speaker 4
We could.
00:15:57
Speaker 3
Basically we are sending an invite in office anytime.
00:16:01
Speaker 3
Looking at next week, what is it, the 15th?
00:16:06
Speaker 3
Yeah, October 15th.
00:16:08
Speaker 3
The Saturday after the 100th episode, providing all in need of silence.
00:16:12
Speaker 2
Earlier this week, it was reported.
00:16:15
Speaker 3
That Elon contacted Twitter’s board and suggested that they move forward with closing the transaction at the original terms and the original purchase price of $54.20 a share. In the couple of days since then,
00:16:28
Speaker 3
And even as of right now, with some user reports coming out here on Thursday morning, it appears that there are still some question marks.
00:16:35
Speaker 3
Around whether or not the deal is actually going to move forward at 5420 years share because you.
00:16:40
Speaker 3
On as of right now, the, the report said is still asking for a financing contingency in order to close and there’s a lot of back and forth on what the terms are.
00:16:48
Speaker 3
Meanwhile, the court takes in.
00:16:49
Speaker 3
Delaware is continuing forward.
00:16:51
Speaker 3
On whether or.
00:16:52
Speaker 3
Not Iran breached his terms in the original agreement to close and by Twitter at 5420.
00:16:59
Speaker 2
As we know.
00:17:00
Speaker 3
Leading up to the signed deal or post signing the deal.
00:17:04
Speaker 3
Elon put together a financing syndicate, a combination of debt investors as well as equity co-investors with him, to do the purchase of Twitter at $54.20 a share.
00:17:15
Speaker 3
So the 40-some-odd billion dollars of capital that’s needed was committed by a set of investors that were doing both debt
00:17:22
Speaker 3
And equity and there’s a big question mark.
00:17:24
Speaker 3
now on whether or not those investors want to or would still consummate the transaction with Elon, given how the markets have turned and given how debt markets are trading and equity markets have traded off.
00:17:35
Speaker 3
I’d love to hear your point of view on what hurdles Elon still has in front of him, if he still wants to get this done.
00:17:42
Speaker 3
And is there still a financing
00:17:43
Speaker 3
syndicate that’s standing behind him at the original purchase price to get it done?
00:17:47
Speaker 3
That’s a great question.
00:17:49
Speaker 3
Maybe the best way to start is Nick do.
00:17:50
Speaker 3
You want to.
00:17:51
Speaker 3
queue up what I said on August 25th. The lawsuit really boils down to one very specific clause, which is the pivotal
00:18:00
Speaker 3
Question at hand, which is there is a specific performance?
00:18:05
Speaker 3
Clause that Elon signed up to.
00:18:08
Speaker 3
Right. Which, you know, his lawyers could have struck out and either chose not to or, you know, couldn’t get the deal done without. And that specific performance clause says that Twitter can force him to close at $54.20 a share.
00:18:24
Speaker 3
And I think that the issue at hand at the Delaware business court is going to be that, because Elon is going to point to all of these, you know, gotchas and disclaimers that they have around this bot issue
00:18:36
Speaker 3
As their cover story.
00:18:40
Speaker 3
And I think that really, you know, this kind of again builds more and more momentum in my mind that the most likely outcome here is a settlement, where Elon has to pay the economic difference between where the stock is now and $54.20, which is more than a billion dollars,
00:19:00
Speaker 3
or he closes at some number
00:19:03
Speaker 3
Below $54.20 a share.
00:19:06
Speaker 3
And I think that that is, like, you know, if you had to be a betting person, that’s probably... and if you look at the the way the stock has traded, and if you also look at the way the options market trades, that’s what people are assuming: that there’s a 7 to $10 billion claim.
00:19:21
Speaker 3
And if you infuse that into the stock price, you kind of get into the $51.00-a-share kind of a
00:19:26
Speaker 3
An acquisition price, again, I’m not saying that that is right or should be right.
00:19:30
Speaker 3
That’s just sort of.
00:19:31
Speaker 3
What the market?
00:19:31
Speaker 3
Says yeah.
00:19:33
Speaker 3
So it turns out.
00:19:34
Speaker 3
That, you know, sort of like that.
00:19:36
Speaker 3
kind of guessing turned out to be pretty accurate; the stock today is at $51.00 a share.
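[Editor’s note: the standard merger-arbitrage reading treats the market price as a probability-weighted blend of the deal closing and the deal breaking. A back-of-the-envelope sketch using only figures from this conversation (the $54.20 deal price, the $51 market price, and the roughly $20-a-share standalone value floated a bit later); the exact downside is an assumption:]

```python
# Implied probability the deal closes, read off the trading price.
deal_price = 54.20  # agreed acquisition price per share
downside   = 20.00  # assumed standalone value if the deal breaks
market     = 51.00  # where the stock is trading

# market ~= p * deal_price + (1 - p) * downside  ->  solve for p
p = (market - downside) / (deal_price - downside)
print(f"implied probability of closing: {p:.0%}")  # about 91%
```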
00:19:40
Speaker 3
So I think that the specific performance thing is exactly what this thing has always hinged on.
00:19:46
Speaker 3
And I think that there is a realization that there were very few outs around how that contractual term was written and agreed to.
00:19:54
Speaker 3
So there is an out in the contract.
00:19:57
Speaker 3
And that out says, I think it’s April: if the deal doesn’t get done by April, then the banks can walk away from their commitment to fund the debt.
00:20:07
Speaker 3
And if the banks walk away, then Elon does have a financing contingency that allows him to walk away.
00:20:13
Speaker 3
So the actual set of events that have to happen are those two things specifically: the banks come back and say we’ve changed our mind, market conditions are different, and then Elon is able to walk away.
00:20:25
Speaker 3
You know the banks just talk to it.
00:20:27
Speaker 3
Right now the.
00:20:27
Speaker 3
Banks, if you look at all of the debt that they’ve committed to.
00:20:32
Speaker 3
Well, they committed at a point in time when the debt markets were much better than they are today.
00:20:37
Speaker 3
In the last, you know, six or seven months since they agreed to do this, the debt markets have been clobbered, and specifically junk bonds, and this is a bunch of junk bond debt.
00:20:48
Speaker 3
So the price to get that kind of debt has skyrocketed.
00:20:52
Speaker 3
So rough back-of-the-envelope math would tell me that right now the banks are offside between one and $2 billion, because they’re not going to be able
00:21:00
Speaker 3
to sell this debt anyway.
00:21:02
Speaker 3
So I think the banks obviously want a way out.
00:21:04
Speaker 3
The problem is their only way out is to run the shot clock out in
00:21:08
Speaker 3
Front of people.
00:21:09
Speaker 3
So I think that’s the dance that they’re in right now. Elon’s trying to find a way to solve, you know, for the merger.
00:21:16
Speaker 3
I think Twitter is going to say we’re not going to give you a financing contingency you have.
00:21:19
Speaker 3
to bring the money
00:21:20
Speaker 3
In and close right now.
00:21:22
Speaker 3
And then we will not go to court; otherwise, we’re going to court.
00:21:26
Speaker 3
And so I think it’s a very delicate predicament that they’re all in. But my estimate is that the equity is probably 20% offside.
00:21:34
Speaker 3
So if he
00:21:35
Speaker 3
changes things, he can make that up,
00:21:36
Speaker 3
Because he can create equity value like.
00:21:38
Speaker 3
Nobody’s business.
00:21:39
Speaker 3
The debt is way outside by a couple billion dollars.
00:21:43
Speaker 3
which is hard to make back. But I think in the end, you know, given enough time, they can probably make that back. The best off in all of
00:21:50
Speaker 3
this are the Twitter shareholders.
00:21:52
Speaker 3
They’re getting an enormous premium to what that company would be worth today in the open market.
00:21:57
Speaker 3
And so I think that feels kinda close.
00:21:59
Speaker 3
This probably.
00:21:59
Speaker 3
Can close in the.
00:22:00
Speaker 1
Next few weeks.
00:22:01
Speaker 3
And had you bought Twitter when we were talking about it in August, you would have made 25% in six weeks. And, you know, if the deal closes at fifty-four twenty, you would have made, you know, a third of your money in eight weeks, which is, you know, very hard to
00:22:13
Speaker 3
do in a market like this. As one of the
00:22:14
Speaker 2
GPs, yeah.
00:22:16
Speaker 3
Like Andreessen or Sequoia, and you had made this commitment to Elon, or even Larry Ellison, a couple months ago: do you fight against closing at $54.20?
00:22:28
Speaker 3
Do you stick with the deal and support him?
00:22:31
Speaker 3
I mean, what do you do given that the premium is so much higher?
00:22:34
Speaker 1
Than where the market.
00:22:34
Speaker 3
Would trade it out today, some people.
00:22:36
Speaker 3
I think the stock should be like.
00:22:37
Speaker 3
20 bucks a share. The average premium in any transaction in the public markets is about 30%. And I think the fair value of Twitter is around 32 to 35 bucks a share. So, you know, it’s not like he is massively, massively overpaying.
00:22:55
Speaker 3
And so, you know, I would just sort of keep that in the realm of the possible, so like if you take $35.
00:23:00
Speaker 3
At the midpoint.
00:23:02
Speaker 3
fair value would be $45.50. So yeah, he paid 20% more than he should have, but he didn’t
00:23:07
Speaker 3
Pay 100% more.
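[Editor’s note: checking that arithmetic as stated:]

```python
fair_value = 35.00                 # speaker's midpoint fair-value estimate
justified  = fair_value * 1.30     # typical ~30% deal premium -> 45.50
overpay    = 54.20 / justified - 1 # ~0.19, i.e. the "20% more" figure
```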
00:23:09
Speaker 3
So it’s not as if you can’t make that equity back as a private company particularly because.
00:23:15
Speaker 3
There’s probably $10.00 of fat in the stock if you think about just OpEx, right?
00:23:19
Speaker 3
In terms of.
00:23:20
Speaker 3
All the buildings they have, maybe they don’t need as many employees.
00:23:23
Speaker 3
Maybe they would revisit salary.
00:23:24
Speaker 3
You know, one thing is when I looked at doing an activist play with Twitter, I think I mentioned this five or six years ago, one of the things that I found was at that time Twitter was running their own.
00:23:34
Speaker 3
data centers. And you know, the most obvious thing for me at that time was, like, we’re going to move everything to AWS. Now I don’t know if that happened, but I’m sure that if it hasn’t, just bidding that out to Azure or GCP, he can raise, you know, three or $4 billion, because I’m sure those companies would want this kind of an app on there.
00:23:53
Speaker 3
So there’s all kinds of things that I think he can do as a private company to make back maybe the small bit that he overpaid, and then he can get to the core job of rebuilding this company, to make this product usable. Because if you look at it,
00:24:05
Speaker 3
To speak as a user right now.
00:24:08
Speaker 3
It has been decaying at a very, very rapid clip, and I think that his trepidation in closing the merger,
00:24:16
Speaker 3
in part, also, even if he hasn’t said it, has to do with the quality of the experience; it’s just degraded.
00:24:21
Speaker 3
It’s not as fun to use as it was during the pandemic or even before the pandemic, so something is happening.
00:24:29
Speaker 3
Inside that app.
00:24:30
Speaker 3
That needs to get fixed, and if he does, it will make a.
00:24:33
Speaker 3
Ton of money.
00:24:33
Speaker 3
So like what happened with Friendster or Myspace?
00:24:36
Speaker 3
With any social networking app, over time the quality degrades if it’s just not growing, or it’s shrinking.
00:24:41
Speaker 3
And it gets it, if it’s, if it’s not growing, and also if the product hygiene isn’t being enforced in code.
00:24:47
Speaker 3
And product hygiene in this case is just, you know, spam bots, you know, the trolling.
00:24:54
Speaker 3
It can really take away from the experience, yeah.
00:24:57
Speaker 3
I mean, interesting.
00:24:58
Speaker 3
Like if you think back to the original, they were starting to the original days of Twitter.
00:25:02
Speaker 3
I don’t know if.
00:25:03
Speaker 3
You guys remember?
00:25:03
Speaker 3
You would send in an SMS to do your tweet and then it would post up and other people would get the SMS notification and it would crash all the time and the app.
00:25:14
Speaker 3
When they launched it, notoriously, crashing constantly; it was poorly architected at the beginning, and some people have argued
00:25:20
Speaker 3
That Twitter has had.
00:25:22
Speaker 3
A cultural technical incompetence from the earliest days.
00:25:26
Speaker 3
I mean that’s.
00:25:26
Speaker 3
A little harsh.
00:25:27
Speaker 3
So I do think that Twitter is known for what’s called the fail whale.
00:25:30
Speaker 3
You know, they used.
00:25:31
Speaker 3
to have these fails constantly.
00:25:34
Speaker 3
And they did hire people that attempted to try.
00:25:36
Speaker 3
To fix it.
00:25:37
Speaker 3
I remember the the funniest part of when I went in there and said, hey, here’s my plan and here’s what I want to do, is literally the senior engineering team quit.
00:25:47
Speaker 3
They just walked out the door.
00:25:50
Speaker 2
But it is a.
00:25:54
Speaker 3
I think it is a team that has tried its best that probably at the edges definitely made some technical miscalculations.
00:26:02
Speaker 3
Like I said at that time, the idea that any app of that scale would use their own data centers made no technical sense whatsoever.
00:26:09
Speaker 3
It made the app laggy.
00:26:11
Speaker 3
It made it hard to use and more prone to downtime, to your point.
00:26:15
Speaker 3
But that being said, I would be shocked if they haven’t made meaningful improvements, because the stack of the Internet has gotten so much better over the last seven years.
00:26:23
Speaker 3
And so, to your point, if they didn’t take advantage of all these new abstractions and mechanisms to rebuild the app, or to rebuild search, or to rebuild, you know, all these infrastructure elements that make
00:26:36
Speaker 3
The app work.
00:26:36
Speaker 3
I would be really surprised because then what are they doing over there?
00:26:40
Speaker 3
Then we look
00:26:41
Speaker 3
at it, to the point earlier, besides the product points,
00:26:43
Speaker 3
There was a.
00:26:44
Speaker 3
Uh, really good.
00:26:46
Speaker 3
tweet that went out.
00:26:49
Speaker 3
For what it’s worth, I think Q1 will show us just how lean the Silicon Valley advertising companies can be run. At
00:26:55
Speaker 3
the very least it will be an interesting thought experiment for spectators, because if he does go that way, and actually does significantly reduce OpEx and head count, and the company does turn profitable and he can grow it,
00:27:07
Speaker 3
Really, by the way, it’ll really be a beacon.
00:27:07
Speaker 3
Alright, well.
00:27:10
Speaker 1
Setting a better
00:27:11
Speaker 3
day for companies. From a financial perspective, there is $10 a share in OpEx cuts that he could make right away, just so that he is economically break-even.
00:27:20
Speaker 3
And it looks like every other M&A transaction: you know, you paid a 30% premium and you bought a company.
00:27:25
Speaker 3
There is a lot of margin of safety there if he does that.
00:27:28
Speaker 3
So, to your point, there probably is, and there probably needs to be, a meaningful RIF at
00:27:32
Speaker 3
Twitter, I’m not saying it’s right.
00:27:33
Speaker 3
I’m not saying it’s, you know, and I feel for the people that may go through it, but from a financial perspective, the math makes sense.
00:27:40
Speaker 3
for him to do that, because then it is a break-even proposition on a going-in transaction, and I think that there’s a lot of intelligent financial things he can do so that all the debt holders feel like he’s doing the right thing and all the equity holders particularly see a chance for
00:27:56
Speaker 3
Them to make a decent return here.
00:27:58
Speaker 2
All right.
00:27:58
Speaker 2
Well, let’s move on.
00:28:00
Speaker 3
J-Cal, you missed a great conversation between Chamath Palihapitiya and David Friedberg about
00:28:07
Speaker 3
the Twitter transaction, and now we’re being rejoined by our besties.
00:28:12
Speaker 2
Yeah, by other methods.
00:28:14
Speaker 3
How is your cappuccino, J-Cal?
00:28:16
Speaker 3
It’s great.
00:28:16
Speaker 3
Have an ice cold beer and ice cold beer.
00:28:20
Speaker 3
And I started drip coffee.
00:28:21
Speaker 3
I’m working.
00:28:22
Speaker 1
But let’s talk about topics that I’m not being subpoenaed or deposition about.
00:28:26
Speaker 3
We will have a lot to say.
00:28:27
Speaker 3
In the coming weeks, I wasn’t talking about.
00:28:29
Speaker 1
Topic that my lawyers are advising not to talk about.
00:28:32
Speaker 3
How eerie was our prediction?
00:28:34
Speaker 3
51 bucks a share.
00:28:35
Speaker 3
It is exactly where the stock is right now.
00:28:37
Speaker 3
Heart series.
00:28:39
Speaker 3
All right.
00:28:40
Speaker 3
Lots of advances.
00:28:40
Speaker 3
That looks good.
00:28:42
Speaker 3
Speaking of which,
00:28:44
Speaker 3
Tesla AI Day
00:28:45
Speaker 3
Was last week I actually went.
00:28:47
Speaker 3
It was great.
00:28:48
Speaker 3
This is a recruiting event, it’s after hours. So Hellmuth and I went, and I drove Hellmuth home at the end.
00:28:49
Speaker 2
What did you do after?
00:28:56
Speaker 3
It was a great event and it is essentially a giant recruiting event.
00:29:01
Speaker 3
Hundreds of AI and machine learning scientists.
00:29:05
Speaker 3
Can we just talk about Phil Hellmuth’s comment in the group chat about Ken Griffin?
00:29:10
Speaker 4
I mean.
00:29:12
Speaker 3
Oh yeah, well, he’s just like, I made a joke about his net worth and he responded.
00:29:16
Speaker 3
What is going on?
00:29:18
Speaker 3
We were talking about the most serious topics and he just comes.
00:29:23
Speaker 4
Hit and run.
00:29:24
Speaker 3
Seven seconds. So, so, by the way, I was texting with Daniel Negreanu. He did this incredible
00:29:30
Speaker 3
podcast with Lex Fridman, if you guys haven’t listened to it.
00:29:34
Speaker 3
So Daniel Negreanu’s new pod with Lex is incredible.
00:29:38
Speaker 3
Oh, you know.
00:29:38
Speaker 3
But I I was joking with Daniel that there’s a section where he’s talking about the greatest poker players of all time.
00:29:43
Speaker 3
And if you look at the bar on YouTube that shows where the most viewership was, it was exactly the 30 seconds where he talks about Hellmuth. And I said to Daniel, this must have been Phil rewatching it over and over.
00:30:01
Speaker 2
He’d rather talking about.
00:30:03
Speaker 3
Jump start.
00:30:04
Speaker 3
No, no, it’s all good.
00:30:05
Speaker 3
So anyway, the event was super impressive.
00:30:10
Speaker 3
Elon only spoke when he showed the Optimus, the new robot he’s building, a general-purpose robot that will work in the factories.
00:30:18
Speaker 3
It’s very early days, but they showed 2 versions of it and he said he thinks they could get it down to $20,000. It’s going to work in the factory, so it’s actually got a purpose.
00:30:27
Speaker 3
And obviously the factories already have a ton of robots.
00:30:29
Speaker 3
But this is more of a.
00:30:31
Speaker 3
robot that will benefit from the general, the the computer vision and the AI being pursued by the self-driving team.
00:30:41
Speaker 3
This is like 2 1/2 hours of really intense presentations. The most interesting part for me was they’re building their own supercomputer and their own chip, the
00:30:51
Speaker 3
Dojo supercomputer.
00:30:53
Speaker 3
It’s really impressive how much they can get through scenarios. So they’re building every scenario for self-driving.
00:31:01
Speaker 3
I actually have the full self-driving beta on my car.
00:31:04
Speaker 3
Been using it, it’s.
00:31:06
Speaker 3
Pretty impressive.
00:31:07
Speaker 3
I have to say,
00:31:08
Speaker 3
If you haven’t used it yet, I feel like.
00:31:12
Speaker 3
AI is moving at a pretty advanced clip this past year. If you haven’t also seen it, Meta announced a text-to-video generator.
00:31:20
Speaker 3
So this is even more impressive than DALL-E.
00:31:23
Speaker 3
You put in a couple of words, Freeberg, and you get a painting or whatever; this is,
00:31:27
Speaker 3
Put in a couple of words and you get a short video.
00:31:30
Speaker 3
So they had one of a teddy bear painting a teddy bear.
00:31:33
Speaker 3
So it looks like you’re going to be able to.
00:31:38
Speaker 3
eventually create a whole movie by just talking to a computer. Really impressive.
00:31:42
Speaker 3
Where do you think we are Freeburg in terms of?
00:31:46
Speaker 3
the compounding nature of these narrow AI efforts, you know, obviously poker, chess, Go, DALL-E, GPT-3, self-driving, and it feels like this is all compounding at a faster rate.
00:31:59
Speaker 3
So am I just imagining that?
00:32:02
Speaker 2
Yeah, look, I mean.
00:32:02
Speaker 3
It’s interesting when people saw the first computer playing chess, they.
00:32:06
Speaker 3
Said the same thing.
00:32:06
Speaker 3
I think anytime that you’ve seen progress with a computer that starts to mimic the predictive capabilities of human.
00:32:15
Speaker 3
It’s impressive but I will argue and I just I’ll say a few words on this.
00:32:20
Speaker 3
It’s, I think this is part of the same 60-year cycle that we’ve
00:32:23
Speaker 3
Been going through.
00:32:25
Speaker 3
Fundamentally, what humans and human brains do is we sense our external environment, then we generate knowledge from that sensing, and then our brains build a model that predicts an outcome, and then that predicted outcome drives our actions and our behavior. We observe the sunrise every morning; we observe the sunset.
00:32:46
Speaker 3
And you see that enough times and you build a predictive model from that data that’s been generated.
00:32:49
Speaker 3
In your brain.
00:32:51
Speaker 3
That I predict that the sun has risen, it will therefore set; it has set, it will therefore rise.
00:32:56
Speaker 3
And I think that the computing approach is very similar.
00:32:59
Speaker 3
It’s all about.
00:33:00
Speaker 3
sensing, or generating data, and then creating a predictive model, and then you can drive action. And
00:33:06
Speaker 3
initially, the first approach was deterministic algorithms, and these are deterministic models that are built as
00:33:11
Speaker 3
A piece of.
00:33:12
Speaker 3
code that says here’s an input, here’s an output, and that model is really built by the engineer, the human, who designed that algorithmic model.
00:33:20
Speaker 3
And so this is what the predictive potential of this software is.
00:33:25
Speaker 3
Then there was this term called data science.
00:33:27
Speaker 3
So as data generation began to proliferate, meaning there were far more sensors in the world, it was really cheap
00:33:33
Speaker 3
To create digital data from the physical world.
00:33:36
Speaker 3
Really cheap to transmit it, really cheap to store it, really cheap to compute with it.
00:33:39
Speaker 3
Data science became the hot term in Silicon Valley, and these models were not just a basic algorithm written by humans, but it became an algorithm that was a similar deterministic model.
00:33:51
Speaker 3
They had parameters, and the parameters were ultimately resolved by the data that was being generated, and so these models became much more complex and much more predictive.
00:34:01
Speaker 3
Finer granularity, finer.
00:34:03
Speaker 3
Then we use this term machine learning.
00:34:05
Speaker
And and in.
00:34:06
Speaker 3
The data science era, it was still.
00:34:08
Speaker 3
Like, hey, there’s a model and.
00:34:10
Speaker 3
You would solve it statically.
00:34:12
Speaker 3
You would get a bunch of data, you would statically solve for the parameters and that would be your model, and it would run.
00:34:16
Speaker 3
Machine learning then allowed those parameters to become dynamic, so the model was static, but generally speaking, the parameters.
00:34:25
Speaker 3
That drove the model became dynamic as more.
00:34:27
Speaker 3
Data came into the system.
00:34:29
Speaker 3
And they were dynamically updated, and then this era of AI became the next step in that path.
00:34:33
Speaker 3
And what AI is realizing is that there’s so much data that rather than just resolve the parameters
00:34:39
Speaker 2
Of the model you.
00:34:40
Speaker 3
Can actually resolve a model itself.
00:34:42
Speaker 3
The algorithm can be written by the data.
00:34:45
Speaker 3
The algorithm can be written by the software, and so it.
00:34:47
Speaker 4
Would end with.
00:34:48
Speaker 3
An example, to your poker point: an adaptive model. So
00:34:53
Speaker 3
So you’re playing poker, and the software begins to recognize behavior, and it builds a predictive model that says, here’s how you’re playing.
00:35:00
Speaker 3
And then over.
00:35:00
Speaker 3
Time it actually changes not just the parameters of the model, but the model itself, the algorithm itself.
00:35:05
Speaker 2
And so any idea?
00:35:06
Speaker 3
And then it eventually.
00:35:07
Speaker 3
Gets to a point where the algorithm is so much more complex than a human would have never written it.
00:35:12
Speaker 2
And suddenly the.
00:35:13
Speaker 3
AI has built its own intelligence, its own ability to be predictive in a way that a human algorithmic programmer would have never done.
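[Editor’s note: the progression described here, hand-written rules, one-time parameter fitting, continuously updated parameters, and finally a learned model structure, can be caricatured in a few lines of Python. This is an invented toy illustration, not anything shown on the episode:]

```python
import numpy as np

# Era 1 - deterministic algorithm: a human hard-codes model and parameters.
def rule_based(x):
    return 2.0 * x + 1.0  # slope and intercept chosen by the engineer

# Era 2 - "data science": same model shape, parameters fit once, statically.
X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])
slope, intercept = np.polyfit(X, y, deg=1)

# Era 3 - machine learning: parameters update dynamically as data streams in.
def sgd_step(slope, intercept, x, target, lr=0.01):
    err = (slope * x + intercept) - target
    return slope - lr * err * x, intercept - lr * err

# Era 4 - "AI": the function itself is resolved by the data; a neural net's
# trained weights define a model no human wrote down explicitly.
def tiny_net(x, W1, b1, W2, b2):
    hidden = np.maximum(0.0, W1 * x + b1)  # ReLU layer
    return W2 @ hidden + b2
```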
00:35:20
Speaker 3
And this is all driven by statistics.
00:35:22
Speaker 3
So none of this is new science, per se.
00:35:25
Speaker 3
There are new techniques that all, at their underlying level, use statistics as their basis, and then there are these techniques that allow us to build these new
00:35:32
Speaker 3
Systems of model developments like neural Nets and so on.
00:35:35
Speaker 3
And those statistics build those neural nets’ weights; they solve those parameters and so on.
00:35:39
Speaker 3
But fundamentally, there is a geometric increase in data and a geometric decline in the cost to generate data from sensors, as the cost of sensors has come down with Moore’s law.
00:35:50
Speaker 3
The cost to transmit that data, as the cost of moving data has come down with broadband communications. The cost of storing data, as the cost of DRAM and solid-state hard drives has come down with Moore’s law.
00:36:01
Speaker 3
And now the ability to actually have enough data to do this thing people are calling AI, but it really is the same;
00:36:07
Speaker 3
it’s part of the spectrum of things that have been going on for 60 years,
00:36:10
Speaker 3
to actually make broad predictions in the world. It’s really being realized in a bunch of areas that we would have historically been really challenged and surprised to see.
00:36:19
Speaker 3
And so my argument is.
00:36:21
Speaker 3
At this point data played a big role.
00:36:23
Speaker 3
Yeah, yeah. Over the last decade we’ve reached this tipping point in terms of data generation, storage, and computation that’s allowed these statistical models to resolve dynamically, and as a result they are far more predictive, and as a result we see far more human-like behavior in these predictive systems,
00:36:41
Speaker 3
Those that are, you know, like a like a robot is the same as one that existed 20 years ago.
00:36:46
Speaker 3
But the way that it’s running is using the software that is driven by this dynamic model.
00:36:51
Speaker 3
Yeah, I couldn’t ask
00:36:52
Speaker 3
for a better answer.
00:36:55
Speaker 3
OK, I have two things to table. One,
00:36:57
Speaker 3
the first one is a total non sequitur.
00:36:58
Speaker 3
You used the term data scientist.
00:37:00
Speaker 3
Do you know where the term data scientists came from?
00:37:04
Speaker 3
As, as classically used in Silicon Valley, it came from Facebook, and it came from my team. The critical moment for this was in 2007, 2008. I was
00:37:13
Speaker 3
Trying to build the growth team.
00:37:15
Speaker 3
This is the team that became very famous for getting to 2 billion users and, you know, building a lot of these algorithmic insights.
00:37:22
Speaker 3
And I was trying to recruit a person from Google and he was like a PhD in some crazy things like astrophysics or particle physics or something.
00:37:32
Speaker 3
And we gave him an offer as a data analyst.
00:37:35
Speaker 3
Because this is.
00:37:36
Speaker 3
What I needed at the time is what I.
00:37:37
Speaker 3
thought I needed, an analyst.
00:37:41
Speaker 3
And he said absolutely not; he was offended by the job title.
00:37:44
Speaker 3
And I remember talking to my my HR business partner, and I asked her, like, I don’t understand, where is this coming from?
00:37:52
Speaker 3
And she said he fashions himself a scientist.
00:37:55
Speaker 3
And I said, well then call him a data scientist.
00:37:57
Speaker 3
So we wrote in the offer, for the first time, data scientist, and at the time, people internally
00:38:03
Speaker 3
were like, this is a dumb title.
00:38:04
Speaker 3
What does this mean?
00:38:05
Speaker 3
Anyways, we hired the guy, he was a star, and and that title just took off internally. So disappointing, because in parallel we started Climate Corp in 2006, and the original, the first guy
00:38:16
Speaker 3
we hired was a
00:38:16
Speaker 3
buddy of mine who had a 4.0
00:38:18
Speaker 3
in applied math from Cal.
00:38:20
Speaker 3
And then everyone we hired on with him, we called them the math team and they were all applied math and statistics.
00:38:26
Speaker 3
Yeah, and we called them the math team, and it was really cool to be part of the math team. But then
00:38:30
Speaker 3
we switched the team title to data
00:38:32
Speaker 3
scientists, and then
00:38:33
Speaker 3
it obviously created this much more kind of impressive role, impressive title, central function in the organization, that was more than just a math person or data analyst,
00:38:43
Speaker 3
As I think it may have been classically treated.
00:38:45
Speaker 3
’cause they really were building.
00:38:46
Speaker 3
The algorithms that drove the models that.
00:38:49
Speaker 3
Made the product work right.
00:38:50
Speaker 3
Peter Thiel has a very funny observation.
00:38:52
Speaker 3
Not funny, but, you know, observation, which is: you should always be wary of any science that actually has science in the name.
00:38:59
Speaker 3
Political science and social science, I guess maybe data science.
00:39:03
Speaker 3
You know, because the real sciences don’t need to qualify themselves.
00:39:06
Speaker 3
Physics, chemistry, biology anyway.
00:39:08
Speaker 3
So here’s what I wanted to.
00:39:10
Speaker 3
Talk about with respect.
00:39:12
Speaker 3
Two very important observations that I think are useful for people to know. The first one, and Nick, you can put it up here, is this, uh, baselining of, you know, when we have thought about intelligence and compute capability, we’ve always talked about Moore’s law, and Moore’s law is essentially this idea that there is a fixed amount of time where the
00:39:31
Speaker 3
Density of transistors inside of a chip would double and roughly that period for many many years.
00:39:36
Speaker 3
was around two years.
00:39:36
Speaker 3
It was largely led by Intel, and we used to equate this to intelligence, meaning the more density there was in a chip, the more things could be learned and understood.
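[Editor’s note: Moore’s law as described is a fixed-period exponential; a one-liner makes the doubling arithmetic concrete:]

```python
def transistor_density(d0, years, doubling_period=2.0):
    """Idealized Moore's law: density doubles every doubling_period years."""
    return d0 * 2 ** (years / doubling_period)

# e.g. 20 years at a 2-year doubling: 2**10 = 1024x the starting density
```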
00:39:48
Speaker 3
And we used to think about that as the progression of how computing intelligence would grow, and eventually how artificial intelligence
00:39:57
Speaker 3
would get to mass market. Well, what we are now at is a place where many people have said Moore’s law has broken.
00:40:06
Speaker 3
It’s because we cannot cram any more transistors into a fixed amount of area.
00:40:11
Speaker 3
We are at the boundaries of physics.
00:40:14
Speaker 3
And so people think, well, does that mean that our ability to compute?
00:40:18
Speaker 3
Will eventually come to an end and stop.
00:40:21
Speaker 3
And the answer is no.
00:40:22
Speaker 3
And that’s what’s demonstrated on this next chart just to make it simple, which is that what you really see is that if you think about, you know supercomputing power, so the ability to get to an answer that has actually continued unabated and if you look at.
00:40:39
Speaker 3
This chart the reason why this is possible is entirely because we’ve shifted from CPUs to these things called GPU.
00:40:46
Speaker 3
So you may have heard of companies like NVIDIA. Why have companies like NVIDIA done so well?
00:40:51
Speaker 3
It’s because they said they raised their hand and said we can take on the work.
00:40:55
Speaker 3
And by taking the work away from the traditional CPU, you were able to do a lot of work in parallel, that is, get into these very complicated models.
00:41:04
Speaker 3
So this is just an observation that I think that we are continuing to compound knowledge and intelligence effectively at the same rate as Moore’s law and.
00:41:16
Speaker 3
We will continue to be able to do that because this makes it a problem of power.
00:41:22
Speaker 3
And a problem of money.
00:41:24
Speaker 3
So as long as you can buy enough GPU from NVIDIA or build your own, and as long as you can get access to enough power to run those computers, there really isn’t many problems you can’t solve.
00:41:37
Speaker 3
And that’s what’s so fascinating and interesting.
00:41:39
Speaker 3
And this is what companies like OpenAI are really proving.
00:41:43
Speaker 3
You know, when they raised a billion dollars, what they did was they looked at this problem, because they realized that by shifting the problem to GPUs, it left all these amazing opportunities for them to uncover.
00:41:55
Speaker 3
And that’s effectively what they have.
00:41:57
Speaker 4
The second thing.
00:41:58
Speaker 3
that I’ll cover very quickly is this:
00:42:00
Speaker 3
It’s been really hard for us as a society to build intelligence in a multimodal way like our brain works.
00:42:09
Speaker 3
So think about how our brain works.
00:42:10
Speaker 3
Our brain.
00:42:11
Speaker 3
works in a multimodal way: we can process imagery, we can process words and sounds, we can process all of these different
00:42:20
Speaker 4
Modes text.
00:42:22
Speaker 3
come into one system, and then we intuit some intelligence from it and make a decision, right?
00:42:28
Speaker 3
So, you know, we could be watching this YouTube video, there’s going to be transcription, there’s video, voice, audio, everything.
00:42:34
Speaker 3
All at once.
00:42:36
Speaker 3
And we are moving to a place very quickly where computers will have that same ability as well.
00:42:41
Speaker 3
Today we go to very specific models in kind of balkanized silos to solve different kinds of problems.
00:42:47
Speaker 3
But those are now quickly merging again because of what I’ve just.
00:42:51
Speaker 3
Said about GPUs.
00:42:52
Speaker 3
So I think what’s really important about AI?
00:42:56
Speaker 3
For everybody to understand is the marginal cost of intelligence.
00:43:00
Speaker 3
Is going to go.
00:43:00
Speaker 3
To 0.
00:43:01
Speaker 3
And this is where I’m just going to put out another prediction of my own.
00:43:05
Speaker 3
When that happens.
00:43:08
Speaker 3
It’s going to be incredibly important for humans to differentiate themselves from computers.
00:43:13
Speaker 3
And I think the best way for humans to differentiate ourselves is to be more human, which is to be less compute-intensive.
00:43:21
Speaker 3
It’s to be more empathetic, it’s to be more emotional, not less emotional, because those differentiators
00:43:27
Speaker 3
Are very difficult for brute force compute to solve.
00:43:31
Speaker 3
Be careful, the replicants on this call are a little nervous here.
00:43:34
Speaker 3
They’re not processing that. That
00:43:36
Speaker 3
was an emotional response they do not want to process.
00:43:38
Speaker 3
Well, as to your point, during this era of self-driving, as we’re talking about this, these cars are out there driving today, trying to make decisions across many different decision trees.
00:43:51
Speaker 3
You know they’re looking at lane changes, they’re looking at other cars and pedestrians, or looking at road conditions like fog and rain.
00:43:59
Speaker 3
And then they’re using all this big data,
00:44:01
Speaker 3
Freeburg, to run
00:44:03
Speaker 3
Tons of different simulations.
00:44:05
Speaker 3
So they’re building, like, this virtual world of Market Street, and then they will throw people, dogs, cars,
00:44:13
Speaker 3
and all kinds of edge cases into
00:44:15
Speaker 3
the simulation. It’s
00:44:16
Speaker 3
Such a wonderful example.
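(A back-of-the-envelope sketch of what such a simulation loop can look like, invented here for illustration and not how any actual self-driving stack works: spawn random agents into a virtual street thousands of times and count how often a naive braking policy fails.)

```python
# Toy Monte Carlo version of the "virtual Market Street" idea: random agents
# are thrown into a simulated scene many times, and we measure how often a
# simple driving policy fails. All numbers are made up for illustration.
import random

def run_scenario(seed):
    rng = random.Random(seed)
    car_pos, car_speed = 0.0, 12.0           # meters, meters/second
    # Random agents: (position along road, whether they step into the lane).
    agents = [(rng.uniform(5, 100), rng.random() < 0.1) for _ in range(8)]
    for _ in range(50):                      # 5 simulated seconds at 10 Hz
        car_pos += car_speed * 0.1
        for pos, steps_in in agents:
            if steps_in and abs(pos - car_pos) < 15:
                car_speed = max(0.0, car_speed - 8.0 * 0.1)  # brake hard
            if steps_in and abs(pos - car_pos) < 1.0 and car_speed > 0.5:
                return False                 # collision: the policy failed
    return True

results = [run_scenario(s) for s in range(10_000)]
print(f"policy survived {sum(results) / len(results):.1%} of random scenarios")
```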
00:44:17
Speaker 3
Imagine that system hears a horn.
00:44:20
Speaker 3
Yeah, well, you hear a horn.
00:44:23
Speaker 3
So clearly there’s some auditory expression of risk, right?
00:44:26
Speaker 3
There’s something risky.
00:44:28
Speaker 3
And now you have to scan your visual field.
00:44:31
Speaker 3
You have to probabilistically decide what it could be, what the evasive maneuver, if anything, should be.
00:44:37
Speaker 3
So that’s a multimodal set of intelligence that today
00:44:41
Speaker 3
Isn’t really available.
00:44:43
Speaker 3
Yeah, but we have to get there if we’re going to have real full self-driving.
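(Here is one way to make the horn example concrete, a toy expected-risk calculation of our own with invented probabilities and costs: the horn raises the prior that a hazard is real, vision supplies per-direction hazard probabilities, and the maneuver with the lowest expected risk wins.)

```python
# Toy sketch of the horn example: an auditory alert raises the prior that
# something risky is nearby, the visual field is scanned for candidate
# hazards, and the maneuver with the lowest expected risk is chosen.

def expected_risk(maneuver, hazards, heard_horn):
    prior = 0.5 if heard_horn else 0.1       # a horn means "something is up"
    risk = 0.0
    for direction, p_hazard in hazards:      # p_hazard from the vision system
        p = prior * p_hazard
        # Moving toward a hazard is costly; any other move is cheap.
        cost = 10.0 if maneuver == direction else 1.0
        risk += p * cost
    return risk

hazards = [("left", 0.7), ("right", 0.2)]    # e.g. cyclist likely on the left
maneuvers = ["left", "right", "straight"]
best = min(maneuvers, key=lambda m: expected_risk(m, hazards, heard_horn=True))
print("chosen maneuver:", best)  # "straight": both sides carry hazard probability
```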
00:44:46
Speaker 3
So that’s a perfect example, Jason, a real-world
00:44:48
Speaker 3
example of how hard the problem is, but it will get solved, because we can brute force it now with, with chips and with compute. And that’s going to be very interesting with the robots as well. All of these decisions they’re making, moving cars through roads, all of a sudden we’re going to see that with
00:45:04
Speaker 3
eVTOLs, vertical takeoff and landing
00:45:08
Speaker 3
aircraft, and we’re going to see it with these general robots. And everybody wants to ask about general AI, you know, the Terminator kind of thing.
00:45:16
Speaker 3
And my position is, I think, if we solve enough of these problems, Freeburg, it’ll be an emergent
00:45:22
Speaker 3
behavior, or an emergent phenomenon, I guess, would be a better word, based on each of these silos crumbling and each of these tasks getting solved by groups of people.
00:45:32
Speaker 3
Do you have any thoughts, as we wrap up here, on the discussion about general AI and the timeline for that? ’Cause obviously we’re going to solve every vertical AI problem in short order.
00:45:33
Speaker 2
Yeah, I mean, I’m.
00:45:40
Speaker 2
Brother, I spoke about this.
00:45:42
Speaker 3
a little bit on the AMA on Callin on Tuesday night.
00:45:47
Speaker 3
On Sacks’s app.
00:45:48
Speaker 3
Check it out, you can listen to it.
00:45:50
Speaker 3
But I I really.
00:45:51
Speaker 2
Have this strong belief.
00:45:53
Speaker 4
That it’ll be, it’ll be out.
00:45:53
Speaker 3
Servers crash with.
00:45:54
Speaker 3
no error, no AI team at Callin.
00:45:55
Speaker 4
It’ll be out by.
00:45:56
Speaker 1
The time this episode drops.
00:45:57
Speaker 3
OK, yeah, you guys can try to download the app, but it might crash and just be careful.
00:46:01
Speaker 3
So here’s.
00:46:03
Speaker 3
Here’s my my.
00:46:04
Speaker 1
The problem there is, Freeburg, that you were 10 times more popular than J-Cal’s was.
00:46:08
Speaker 1
Unexpected numbers.
00:46:10
Speaker
Well, you had you.
00:46:11
Speaker 3
Did have an account with 11,000.
00:46:13
Speaker 3
Followers I mean.
00:46:14
Speaker 1
Yeah, right.
00:46:14
Speaker 1
You’re right, J-Cal.
00:46:15
Speaker 1
We’ll put you on that account.
00:46:16
Speaker 1
Next time, yeah, please.
00:46:18
Speaker 1
I’m starting from zero.
00:46:20
Speaker 1
Yeah, that’s fair. Yeah.
00:46:21
Speaker 2
Like my my.
00:46:22
Speaker 3
core thesis is, I think, humans transitioned from being, let’s call it, you know, passive in the system on Earth, to being the laborers.
00:46:32
Speaker 3
And then we transition.
00:46:34
Speaker 3
From being laborers to being creators.
00:46:35
Speaker 3
And I think our next transition with AI is to transition from being creators to being narrators.
00:46:40
Speaker 3
And what I mean by that is as as we started to do work on Earth and engineer the world around us, we did labor to do that.
00:46:48
Speaker 3
We literally plowed the fields, we walked distances, we built things, and over time we built machines that automated a lot of that labor.
00:46:58
Speaker 3
You know, everything from a plow, to a tractor, to Caterpillar equipment, to a microwave that cooks for us. Labor became less; we became less dependent on our labor ability, and then we got to switch our time and spend our time as creators, as knowledge workers.
00:47:12
Speaker 3
And a vast majority of the developed world now primarily spends their time as knowledge workers, creating.
00:47:19
Speaker 3
And we create stuff on computers, we do stuff on computers, but we’re not doing physical labor anymore. As a lot of the knowledge work gets supplanted by AI, as it’s being termed now,
00:47:29
Speaker 3
But it really gets supplanted by software, the role of the human.
00:47:33
Speaker 3
I think transitions to.
00:47:34
Speaker 3
Being more of a narrator where instead.
00:47:36
Speaker 3
of having to create the blueprint for a house,
00:47:39
Speaker 3
you narrate the house you want, and the software creates the blueprint for you.
00:47:42
Speaker 2
To dictate instead of.
00:47:44
Speaker 2
Yeah, and.
00:47:45
Speaker 2
Instead of creating the.
00:47:46
Speaker 3
movie, and spending $100 million producing a movie, you just, you narrate the movie you want to see, and you iterate with the computer, and the computer renders the entire film for you.
00:47:55
Speaker 3
Because those films are.
00:47:56
Speaker 3
Shown digitally anyway, so you can have a computer rendering instead of creating a new piece of content.
00:48:02
Speaker 3
You narrate the content you want to experience.
00:48:04
Speaker 3
You create your own video game.
00:48:06
Speaker 3
You create your own movie.
00:48:07
Speaker 4
Period and I think.
00:48:08
Speaker 3
That there’s a whole evolution that happens and.
00:48:10
Speaker 3
If you look.
00:48:10
Speaker 3
Steven Pinker’s book Enlightenment Now has a great statistic, a set
00:48:14
Speaker 3
Of statistics on this.
00:48:15
Speaker 3
But the amount of time that humans are spending on leisure activities per week has climbed extraordinarily over the past couple of decades.
00:48:22
Speaker 3
We spend more time enjoying ourselves and exploring our creative interests than we ever did
00:48:27
Speaker 3
In the past, in human history.
00:48:29
Speaker 2
We were burdened by all the.
00:48:30
Speaker 3
Labor and all the.
00:48:31
Speaker 3
creative and knowledge work we had to do, and now things are much more accessible to us, and I think that AI allows us to transition into an era that we never really thought possible.
00:48:40
Speaker 3
We’ll realize that the limits are really our imagination of what we can do with the world around us.
00:48:45
Speaker 3
And the software and the automation evolve to make those things possible.
00:48:50
Speaker 3
And it’s really kind of a vision for the future that I think we’re enabling.
00:48:50
Speaker 2
Right.
00:48:53
Speaker 3
Star Trek had this right.
00:48:55
Speaker 3
People didn’t have to work and they could pursue things in the holodeck or whatever that they felt was rewarding to them.
00:49:02
Speaker 3
But speaking of jobs, the jobs report for August came in. We talked about this. We were, we were adding 300,000 jobs a month,
00:49:09
Speaker 3
while wondering when the other shoe would drop. Well, over a million job openings burned off in August. So without getting into the macro talk, it does feel like what the Fed is doing, and companies,
00:49:22
Speaker 3
with hiring freezes and cuts, is finally, finally having an impact. If we start losing jobs, as we predicted could happen here on the show,
00:49:32
Speaker 3
People might actually go back to work, and Lyft and Uber are reporting that the driver shortages are over.
00:49:38
Speaker 3
They no longer have to pay people extra and stuff like that to get people to come back to work.
00:49:41
Speaker 3
So the picture in America feels like we’re
00:49:43
Speaker 3
turning a corner. Do we want
00:49:45
Speaker 3
to... let’s talk about the marijuana
00:49:48
Speaker 3
pardoning, since Biden just
00:49:49
Speaker 3
did it. Yeah, yeah, I’m going to say, we got a couple of things we really want to get to here.
00:49:52
Speaker 3
We’re covering Section 230, and then this breaking news. I will pull it up here on the screen. While we’re recording the show, President Biden says, and I’m just going to quote here:
00:50:06
Speaker 3
I’m pardoning all prior federal offenses of simple marijuana possession.
00:50:10
Speaker 3
There are thousands of people who were previously convicted of simple possession who may be denied employment, housing, or educational opportunities.
00:50:17
Speaker 3
My pardon will remove this burden. This is big news.
00:50:19
Speaker 3
Second, I’m calling on governors to pardon simple state marijuana possession offenses. Just as no one should be
00:50:26
Speaker 3
in a federal prison solely for possessing marijuana,
00:50:28
Speaker 3
no one should be in a local jail or state prison for that reason.
00:50:32
Speaker 3
Finally, and this is an important one:
00:50:36
Speaker 3
we classify marijuana at the same level as
00:50:38
Speaker 3
heroin, and even
00:50:39
Speaker 3
more serious than fentanyl. It makes no sense.
00:50:42
Speaker 3
I’m asking.
00:50:44
Speaker 3
Secretary Becerra and the Attorney General to initiate the process of reviewing how marijuana is scheduled under
00:50:49
Speaker 3
federal law.
00:50:51
Speaker 3
I’d also like to note that as federal and state regulations change, we still need important limitations on trafficking, marketing, and underage sales of marijuana.
00:51:00
Speaker 3
Back on this breaking news: like, the timing on this is kind of midterm
00:51:07
Speaker 3
related, it seems. It’s just like this,
00:51:10
Speaker 3
I guess it was a politically popular decision to do.
00:51:13
Speaker 1
I think so.
00:51:14
Speaker 1
I mean, like, I support it, so Biden finally did something I like.
00:51:19
Speaker 1
I mean, I thought.
00:51:20
Speaker 1
That we should decriminalize marijuana for a long time, or specifically I, you know, I agree with this idea of descheduling it.
00:51:28
Speaker 1
It, it does not make sense to treat marijuana the same as heroin, as a Schedule One
00:51:33
Speaker 1
narcotic. It just doesn’t make any sense.
00:51:35
Speaker 1
It should be regulated separately and differently.
00:51:38
Speaker 1
Obviously you want to keep it out of hands of minors, but no one should be going to jail I think for simple.
00:51:43
Speaker 1
Possession, so I do agree with that.
00:51:45
Speaker 1
And I think the.
00:51:46
Speaker 1
thing they need to do, I don’t see it mentioned here, is they should pass a federal law that would allow for the normalization of legal, you know, cannabis companies, so companies that are allowed to operate under state laws, like
00:52:05
Speaker 1
In California, should have access to the banking system, should have access to payment rails.
00:52:09
Speaker 1
Because right now, the reason why the legal cannabis industry isn’t working at all in California is because they can’t bank, they can’t take payments.
00:52:18
Speaker 1
So it’s this.
00:52:19
Speaker 1
weird all-cash business, which makes no
00:52:20
Speaker 1
sense. So, so listen, if
00:52:22
Speaker 1
We’re not going to criminalize it.
00:52:24
Speaker 1
As a drug like heroin, if we’re going to allow states to make it legal, then allow it to be a more normal business where the state can tax it and it can operate in a more aboveboard way, so.
00:52:37
Speaker 2
But a federal mandate is.
00:52:38
Speaker 1
I guess this is me.
00:52:39
Speaker 3
What you’re saying is, I mean, I think.
00:52:41
Speaker 1
It can still be regulated on a state-by-state
00:52:42
Speaker 1
Basis but I.
00:52:43
Speaker 1
Think you need.
00:52:44
Speaker 1
The Fed to bless the idea that banks and payment companies.
00:52:49
Speaker 1
can take on those clients that states have already said are legally operating companies. And right now they can’t, and it’s a huge gap in the laws.
00:52:57
Speaker 1
So maybe that’s the one thing I would add to this, but I don’t have any complaints about this right now based on what we know from.
00:53:03
Speaker 1
This tweet storm.
00:53:03
Speaker 1
And I would say that this, this in a way was an about-face.
00:53:05
Speaker 3
What is the polling data?
00:53:07
Speaker 1
This was an about-face by Biden.
00:53:09
Speaker 3
Yeah, you know, the polling data says, I mean, I’m assuming there’s big support with kind of independents in the middle.
00:53:17
Speaker 3
It was 70% at one.
00:53:18
Speaker 3
Point yeah.
00:53:19
Speaker 1
Yeah. So, like, this is the kind of thing that Biden should be doing with the 50-50 Senate, finding these sorts of bipartisan compromises.
00:53:27
Speaker 3
Right.
00:53:29
Speaker 1
So yeah, look, this is good news for.
00:53:31
Speaker 1
Us. I’m concerned.
00:53:31
Speaker 3
Why hasn’t this happened in?
00:53:32
Speaker 3
the past? Like, what was
00:53:34
Speaker 3
The political reason that other presidents, Obama even.
00:53:40
Speaker 3
didn’t, didn’t have the appetite?
00:53:41
Speaker 2
There were rumors.
00:53:42
Speaker 3
Or their ideology?
00:53:43
Speaker 3
Like why?
00:53:44
Speaker 3
But why?
00:53:44
Speaker 3
Does anyone know?
00:53:45
Speaker 3
Why this hasn’t been done?
00:53:46
Speaker 3
In the past.
00:53:46
Speaker 3
There was a rumor he was going
00:53:47
Speaker 3
to do it in the second term.
00:53:49
Speaker 3
They just didn’t have the political capital to do it.
00:53:50
Speaker 4
But why?
00:53:51
Speaker 3
Why? I don’t, I don’t get it.
00:53:53
Speaker 3
A pardon doesn’t take political capital.
00:53:55
Speaker 3
Yeah, the pardon doesn’t require political capital.
00:53:57
Speaker 3
I think it’s probably the perception that this is soft on crime in some way or there wasn’t enough broad based support as David said.
00:54:04
Speaker 3
I mean I think the the United States population has moved pretty meaningfully in the last 10 years.
00:54:10
Speaker 3
Look at the chart here. You know, in 2000 it was only 31%.
00:54:16
Speaker 3
And then you look at 2018, it’s up at 60-plus percent. So when people saw the states doing
00:54:22
Speaker 3
It and they.
00:54:22
Speaker 3
Saw absolutely no problem, you know, in every state and I think what people will see now.
00:54:27
Speaker 3
A Gallup poll. Gallup poll.
00:54:30
Speaker 3
Support has just increased dramatically. MDMA, psilocybin, and some of these other plant-based medicines are probably the next, and they’re doing studies
00:54:35
Speaker 2
I will.
00:54:37
Speaker 2
On them now.
00:54:38
Speaker 3
I don’t want to take away from how important this is for all the people whom this will positively impact.
00:54:44
Speaker 3
I just want to talk about the schedule.
00:54:46
Speaker 3
Change for marijuana?
00:54:48
Speaker 3
As a parent, one of the things that I’m
00:54:49
Speaker 3
Really, really concerned.
00:54:50
Speaker 3
about is that,
00:54:52
Speaker 3
Through this process of legalization.
00:54:55
Speaker 3
Getting access to marijuana has, frankly, become too easy.
00:54:58
Speaker 3
Particularly for kids.
00:55:01
Speaker 3
At the same time, I saw a lot of really alarming evidence that the, the intensity of these marijuana-based products has gone up.
00:55:09
Speaker 3
You know, I think it’s like 5 or 6.
00:55:11
Speaker 3
Times more intense than you.
00:55:12
Speaker 3
know, 50 or 100 times? Much higher, right. So, so it’s no longer, you know, this kind of, like, you know, benign
00:55:21
Speaker 3
farm drug that it was 20 years ago. This is, this could be actually, David, the way that it’s
00:55:27
Speaker 3
categorized today, as bad as some of these other,
00:55:31
Speaker 3
you know, harder narcotics. So,
00:55:34
Speaker 3
In June of this year.
00:55:36
Speaker 3
The Biden administration basically made this press release that said the FDA was going to come out with regulations that would cap the amount of nicotine in cigarettes.
00:55:46
Speaker 3
And I think that was a really smart move, because it basically sets the stage to taper nicotine out of, out of cigarettes, which would essentially, you know, decapitate it as
00:55:57
Speaker 3
Being an addictive product and I think by.
00:56:01
Speaker 3
Thinking about how it’s how it’s dealt with, what I really hope the administration does is it empowers the FDA.
00:56:09
Speaker 3
If you’re going to legalize it, you need to have expectations around what the intensity of these drugs are.
00:56:16
Speaker 3
Because if you’re delivering drugs OTC and now any kid can go in at 18 years old and buy them, which means.
00:56:22
Speaker 3
that 18-year-olds are going to buy them for 16-year-olds, 16-year-olds are going to get them.
00:56:26
Speaker 3
People will buy them for themselves.
00:56:28
Speaker 3
You need to do a better job so that parents, you’re helping parents.
00:56:31
Speaker 3
Do our job.
00:56:32
Speaker 3
Here’s what you.
00:56:32
Speaker 1
need to do, treat it like alcohol
00:56:33
Speaker 3
For regulation.
00:56:37
Speaker 1
Make it all 21 and over, then, of course.
00:56:38
Speaker 3
Yeah, fine.
00:56:39
Speaker 3
But even alcohol, David, you know, there are, there,
00:56:41
Speaker 3
we know what the intensity is, there are labels and there are warnings, and you know the difference between beer versus wine versus hard alcohol.
00:56:46
Speaker 2
They’re, they’re getting edibles.
00:56:48
Speaker 3
But let me just give you some statistics, Freeburg.
00:56:50
Speaker 3
If you think about the, the, the cannabis in the nineties and prior to that, there are very thorough studies on this in Colorado.
00:56:59
Speaker 3
Back then, the THC content was less than 2%, and then in 2017,
00:57:04
Speaker 3
We were talking about, you know, things going up to 17 to 28% for specific strains. So they have been building strains like Girl Scout cookies etc.
00:57:14
Speaker 3
that have just increased and increased. And then there are things like shatter, and obviously edibles.
00:57:20
Speaker 3
You can create whatever intensity you want.
00:57:22
Speaker 3
So you have this incredible range there, you know?
00:57:25
Speaker 3
You could have an edible that’s, you know, got 1 milligram of THC.
00:57:29
Speaker 3
You get more than 100.
00:57:30
Speaker 3
Or you could have a.
00:57:31
Speaker 3
Pack of edibles and you see this happen in the news all the time.
00:57:34
Speaker 3
Some kid gets their parents’ pack, or somebody gives them one, you know. And this dabbing phenomenon, combined with
00:57:41
Speaker 3
the dabbing is, like, blowtorching this really intense stuff,
00:57:45
Speaker 3
combined with the edibles, is really the issue, and the labeling of them.
00:57:48
Speaker 3
So you got to be incredibly careful with this.
00:57:50
Speaker 3
It’s not good for kids.
00:57:51
Speaker 3
It chews up their.
00:57:52
Speaker 3
brains and so on, yeah.
00:57:54
Speaker 3
Me personally, I
00:57:55
Speaker 3
favor a zero tolerance policy
00:57:56
Speaker 3
in this house. I don’t care if it’s legal or illegal, like, I don’t want my kids touching any of this
00:58:00
Speaker 3
stuff.
00:58:02
Speaker 3
They should not
00:58:03
Speaker 3
be near it until they’re 35 or 40, and even then, I hope they
00:58:06
Speaker 3
Never do it but but I need some help.
00:58:09
Speaker 3
And I’m sure I’m not the only parent that thinks you can’t have this stuff be available, effectively sold like in a convenience store.
00:58:15
Speaker 3
No, no, that’s not going to happen.
00:58:17
Speaker 3
There isn’t even labeling, at least like cigarettes are labeled.
00:58:20
Speaker 3
It’s very clear how bad this stuff is for.
00:58:22
Speaker 3
you. Do you have any feedback on the jobs report, or anything? Are jobs all going away with AI?
00:58:28
Speaker 3
Well, that’s why I brought it up, is, like, we’re now going to see a potential, you know, situation where jobs go away in a lot of this stuff, like even developers, right? Does it mean, Freeburg, AI is going to start doing
00:58:39
Speaker 3
development tasks, design tasks?
00:58:42
Speaker 3
You know, going through this, everyone assumes there’s a static amount of work.
00:58:45
Speaker 3
I think what happens, particularly in things like developer tools, is the developer can do so much more and then we generate so much more output and so the overall productivity goes up, not down.
00:58:56
Speaker 3
So it’s pretty exciting.
00:58:58
Speaker 3
And these and remember like.
00:59:00
Speaker 2
Like we were talking on the Internet the other.
00:59:01
Speaker 3
Night Adobe Photoshop was a tool for photographers, so you didn’t have to take.
00:59:06
Speaker 3
The perfect photograph and then?
00:59:07
Speaker 3
print it. You could, you know, you could use the software to improve the quality of your photographs.
00:59:12
Speaker 3
And I think that that’s what we see happening with all software in the creative process as it helps people do more than they realize they could do before.
00:59:19
Speaker 3
And that’s pretty powerful, and it opens up all these new avenues of interest, and things we’re not even imagining today.
00:59:24
Speaker 3
Alright, so, so moving on here: two cases at the Supreme Court.
00:59:28
Speaker 3
The family of Nohemi Gonzalez, a 23-year-old American college student who was killed in the ISIS terrorist attacks in Paris back in 2015, one of the women killed in this horrible attack, claims that YouTube helped and aided and abetted ISIS.
00:59:42
Speaker 3
The family’s argument: YouTube’s algorithm was recommending videos, and that makes it a publisher of content.
00:59:49
Speaker 3
As you know, there’s an exception to liability as a common carrier: if you make editorial decisions, if you promote certain content, you lose your 230 protections.
00:59:57
Speaker 3
In court papers filed in 2016, they said the company, quote, knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence, which helped the group recruit, including some who were actually involved in the attacks; they’ve made that connection.
01:00:11
Speaker 3
Well, let’s, let’s be honest, we can, we can, we can put a pin in this thing, because I think it would be shocking to me if this current court
01:00:20
Speaker 3
all of a sudden found it in the cockles of their heart to protect big tech.
01:00:24
Speaker 3
I mean, they dismantled a lot of other stuff that I think is.
01:00:31
Speaker 3
A lot more controversial than this.
01:00:34
Speaker 3
And so, you know, they’ve, they’ve basically looked at gun laws, they’ve looked at affirmative action, they’ve looked at
01:00:41
Speaker 3
Abortion rights, right?
01:00:43
Speaker 3
Well, I mean, I think, as we said, I think we all know where that case is unfortunately going to end up.
01:00:50
Speaker 3
So to me it just seems like this could be an interesting case where it’s.
01:00:53
Speaker 3
actually 9-0,
01:00:55
Speaker 3
in favor, for complete, for completely different sets of reasons. I mean, if you think of the liberal left part of the court, they have their own reasons for saying that there are no 230 protections for big tech.
01:01:05
Speaker 3
And if you look at the far right, or the right-leaning parts, members of this, of SCOTUS, they have,
01:01:11
Speaker 3
They have another set of.
01:01:12
Speaker 3
So you think they’re gonna make a political decision, not a legal one?
01:01:14
Speaker 3
No, but even in their politics, they actually end up in the same place.
01:01:18
Speaker 3
They both don’t want the protections, though for different reasons.
01:01:21
Speaker 3
So there, there is a reasonable outcome here where, you know, Roberts is going to have a really interesting time trying to piece together a majority
01:01:28
Speaker 1
opinion. There was a related case in the 5th Circuit in Texas, the 5th Circuit decision where Texas passed a law imposing common carrier restrictions on social media companies.
01:01:42
Speaker 1
The idea being that social media companies need to operate like phone companies, and they can’t just arbitrarily deny you service or deny you access to their
01:01:50
Speaker 1
platform. And the argument why previously that had been viewed as actually unconstitutional was this idea of compelled speech: that you can’t compel a corporation to support speech that they don’t want to, because that was a violation of their own First Amendment rights.
01:02:07
Speaker 1
And what the 1st the 5th Circuit said is no, that doesn’t make any sense.
01:02:11
Speaker 1
Facebook or Twitter can still advocate for whatever speech they want as a corporation, but as a platform, if Texas requires them to not discriminate against people on the basis of viewpoint, then Texas has the right to impose that, because
01:02:26
Speaker 1
That doesn’t it?
01:02:27
Speaker 1
Their quote was, that’s not chilled speech.
01:02:29
Speaker 1
If anything, it chills censorship.
01:02:31
Speaker 1
So in other words.
01:02:31
Speaker 3
What’s the right legal decision here, in your mind, putting aside politics, if you can, for a moment, putting on the legal hat: what is the right thing for society?
01:02:39
Speaker 3
What is the right legal issue around Section 230, specifically in the YouTube case and just generally? Should we look at YouTube? Should we look at a blogging platform like Medium or Blogger? Twitter. Should we?
01:02:51
Speaker 2
Look at those as common carrier.
01:02:54
Speaker 3
And they’re not responsible for what you publish on them.
01:02:57
Speaker 3
Obviously have to take stuff down if it if it breaks their terms of service, etc, or if it’s illegal.
01:03:01
Speaker 1
I’ve made the case before that I do think that common carrier requirements should apply at both levels of the stack, to protect the rights of ordinary Americans to have their speech in the face of these giant monopolies, which could otherwise deplatform them for arbitrary
01:03:16
Speaker 1
Reasons just to you know, just explain this a little bit so.
01:03:21
Speaker 1
Historically, there was always a debate between so-called positive rights and negative rights.
01:03:26
Speaker 1
So where the United States started off as a country was with this idea of negative rights: that what a right meant is that you’d be protected from the government taking some action against you.
01:03:36
Speaker 1
And if you look at the Bill of Rights, you know, the original rights, they’re all about
01:03:40
Speaker 1
protecting the citizens against intrusion on their liberty by,
01:03:44
Speaker 1
like, the states or by the federal government.
01:03:46
Speaker 1
In other words, Congress shall make no law.
01:03:48
Speaker 1
It was always a restriction.
01:03:50
Speaker 1
So the right was negative; it wasn’t sort of a positive entitlement.
01:03:53
Speaker 1
Then with the Progressive Era, you started seeing, you know, more progressive rights.
01:03:58
Speaker 1
Like, for example, American citizens have the right to health care, right?
01:04:03
Speaker 1
That’s not protecting you from the government.
01:04:04
Speaker 1
That’s where the government can be
01:04:06
Speaker 1
Used to give you a right that you didn’t otherwise have, and so that was sort of the big progressive revolution.
01:04:12
Speaker 1
My take on it is, I actually think that the problem with the way it’s applied right now is that free speech is only a negative right.
01:04:19
Speaker 1
It’s not a positive right.
01:04:21
Speaker 1
I think it actually needs to be a positive right.
01:04:23
Speaker 1
I’m embracing a more progressive version.
01:04:26
Speaker 1
of rights,
01:04:27
Speaker 1
but on behalf of sort of this original negative right. And the reason is because the town square got privatized, right?
01:04:34
Speaker 1
I mean, you used to be able to go anywhere in this country, there would be a multiplicity of town squares, anyone could pull out their
01:04:39
Speaker 1
soapbox, draw a crowd, and people could listen.
01:04:41
Speaker 1
That’s not how speech occurs anymore.
01:04:43
Speaker 1
It’s not on public land or public spaces. The way that
01:04:47
Speaker 1
speech, political speech, free speech occurs today
01:04:50
Speaker 1
is on these giant social networks that have giant network effects and are basically private operators.
01:04:55
Speaker 1
So if you don’t protect the right to free speech in a positive way, it no longer exists.
01:05:00
Speaker 2
So you not only believe.
01:05:02
Speaker 3
that YouTube should keep its Section 230, you believe
01:05:06
Speaker 3
that YouTube shouldn’t be able to deplatform, as a private company, you know, Alex Jones is but one example, basically, people have their free speech rights, and we should lean on that side, of forcing YouTube to put Alex Jones, or Twitter to put
01:05:20
Speaker 3
Trump, back on the platform. That’s your position?
01:05:23
Speaker 1
I’m not saying that the Constitution requires YouTube to do anything, but what I’m saying is that if.
01:05:29
Speaker 1
a state like Texas, or if the federal government, wants to pass a law saying that, YouTube, if you are, say, of a certain size,
01:05:39
Speaker 1
above a certain size, you’ve been operating with network effects,
01:05:41
Speaker 1
I wouldn’t apply this to all the little guys, but for those big monopolies, we know who they are.
01:05:46
Speaker 1
if the, if the federal government or a state
01:05:49
Speaker 1
wanted to say that
01:05:50
Speaker 1
They are required to be a common carrier.
01:05:53
Speaker 1
and they cannot discriminate against certain viewpoints,
01:05:55
Speaker 1
Because I think the government should be allowed to do that because it furthers a positive right.
01:06:00
Speaker 1
Historically, they have not been able to do that because of this idea, because of this idea of compelled speech, meaning that it would infringe on YouTube’s speech rights.
01:06:02
Speaker 4
Right.
01:06:08
Speaker 1
I don’t think it would.
01:06:09
Speaker 1
I think Google.
01:06:10
Speaker 1
and YouTube can advocate for whatever positions
01:06:12
Speaker 1
They want they.
01:06:13
Speaker 1
Can produce whatever content they want, yeah.
01:06:15
Speaker 1
But but the point.
01:06:16
Speaker 1
that I think Section 230 kind of makes, in this form,
01:06:18
Speaker 1
as well, is that they are platforms. They’re distribution platforms, not publishers. So if they want to, especially if they want special 230 protection, they should not be engaged.
01:06:22
Speaker 3
OK.
01:06:28
Speaker 3
OK, so now there is.
01:06:29
Speaker 3
a way to frame this. Your explanation, David, the explanation that you just gave was
01:06:35
Speaker 3
Excellent. Thank you.
01:06:36
Speaker 3
It allows me to understand it even more clearly, that
01:06:39
Speaker 3
using the algorithm is an act of editorial decision-making.
01:06:46
Speaker 1
Yes, yes.
01:06:46
Speaker 3
And so then.
01:06:48
Speaker 3
would YouTube
01:06:49
Speaker 3
try it, at the end of the day?
01:06:50
Speaker 3
Let me, let
01:06:51
Speaker 3
me break down
01:06:51
Speaker 3
An algorithm for you.
01:06:53
Speaker 3
OK, effectively it is a mathematical equation of variables and weights.
01:06:59
Speaker 3
An editor 50 years ago was somebody who had that equation of variables and weights in his or her mind.
01:07:08
Speaker 3
OK.
01:07:09
Speaker 3
And so all we did was we translated this mental model that lived in
01:07:14
Speaker 3
somebody’s brain into a model that’s mathematical.
01:07:18
Speaker 3
That same system, whether you’re talking about Instagram, Facebook, or The New York Times. Yeah. And I think it’s a mistake, we need to say that, because there is not an individual person who puts 0.2 in front of this one variable and 0.8 in front of the other,
01:07:32
Speaker 3
but to say that, all of a sudden, this isn’t editorial decision-making, is wrong.
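(To see how literal the “variables and weights” framing is, here is a minimal sketch, with invented feature names and weights: the same front-page judgment an editor once held in their head, written down as a scored sort. Choosing the weights is the editorial act.)

```python
# Toy version of "an algorithm is variables and weights": the judgment an
# editor once made in their head, expressed as a scored ranking.
WEIGHTS = {"relevance": 0.2, "recency": 0.3, "engagement": 0.5}

def score(item):
    # Weighted sum over the named variables -- the whole "equation".
    return sum(WEIGHTS[k] * item[k] for k in WEIGHTS)

stories = [
    {"id": "a", "relevance": 0.9, "recency": 0.2, "engagement": 0.4},
    {"id": "b", "relevance": 0.5, "recency": 0.9, "engagement": 0.8},
    {"id": "c", "relevance": 0.7, "recency": 0.6, "engagement": 0.3},
]

# Choosing these weights *is* the editorial act: change engagement from
# 0.5 to 0.1 and a different front page comes out.
for story in sorted(stories, key=score, reverse=True):
    print(story["id"], round(score(story), 2))
```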
01:07:36
Speaker 3
We need to understand the current moment in which we live, which is that these computers are thinking actively for us.
01:07:45
Speaker 3
They’re providing this, you know, computationally intensive.
01:07:50
Speaker 3
Decision making and reasoning, and I think it’s.
01:07:55
Speaker 3
it’s, well, pretty ridiculous to assume that that isn’t true.
01:07:58
Speaker 3
That’s why, when you go to Google and you search for, you know, Michael Jordan, it knows what the right Michael Jordan is, because it’s reasoned.
01:08:07
Speaker 3
There’s an algorithm that is doing that.
01:08:09
Speaker 3
It’s making an editorial decision around what the right answer is.
01:08:12
Speaker 3
They have deemed it to be right, and that is.
01:08:15
Speaker 3
It is true. And so I think we need to acknowledge that because I think it allows us at least to be in a position to rewrite these laws through the lens of the 21st century.
01:08:25
Speaker 3
And we need to update our understanding.
01:08:28
Speaker 3
for how the world works
01:08:29
Speaker 3
today. And, you know, so now, there’s such an easy way to do this. If you’re TikTok, if you’re YouTube, if you want Section 230, if you want to have common carrier status and not be responsible, then when a user signs up, it should give them the option.
01:08:42
Speaker 3
Would you like to turn on an algorithm?
01:08:44
Speaker 3
Here are a series of algorithms which you could turn on.
01:08:47
Speaker 3
You could bring your own algorithm you could write.
01:08:49
Speaker 3
your own algorithm with a bunch of sliders, or here are ones that other users and services provide, like an App Store, so, you know, a mom can pick one for your family, your kids.
01:08:59
Speaker 3
That would be ideal.
01:09:00
Speaker 3
It’s leaning towards education, and takes out conspiracy theories, takes out cannabis news, takes out the smut.
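(A minimal sketch of what that “algorithm App Store” abstraction could look like, hypothetical names and all: the platform exposes one plug-in point, and the user, or a parent, picks the ranking-and-filtering function applied to their feed.)

```python
# Toy sketch of "algorithmic choice": the platform exposes one abstraction,
# and users pick (or bring) the ranking/filter function for their feed.
# Names like "education_first" are invented for illustration.
from typing import Callable

Feed = list[dict]
Algorithm = Callable[[Feed], Feed]

REGISTRY: dict[str, Algorithm] = {}

def register(name: str):
    def wrap(fn: Algorithm) -> Algorithm:
        REGISTRY[name] = fn
        return fn
    return wrap

@register("default")
def engagement_ranked(feed: Feed) -> Feed:
    return sorted(feed, key=lambda p: p["engagement"], reverse=True)

@register("education_first")
def education_first(feed: Feed) -> Feed:
    kept = [p for p in feed if p["topic"] not in {"conspiracy", "cannabis"}]
    return sorted(kept, key=lambda p: p["topic"] == "education", reverse=True)

posts = [
    {"id": 1, "topic": "education", "engagement": 0.2},
    {"id": 2, "topic": "conspiracy", "engagement": 0.9},
    {"id": 3, "topic": "sports", "engagement": 0.5},
]

# A user (or a parent, for a kid's account) selects an algorithm by name.
print([p["id"] for p in REGISTRY["education_first"](posts)])   # [1, 3]
```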
01:09:05
Speaker 3
What you’re saying is so wonderful because, for example, like, you know, this organization, Common Sense Media.
01:09:10
Speaker 3
Yes, I love that website.
01:09:12
Speaker 3
Every time I look up a movie, I put in Common Sense Media to decide if we should watch it or not, and I use it a lot for apps, because they’re pretty good at just telling you which, which
01:09:20
Speaker 3
Apps are reasonable and unreasonable, but you know, if common sense media could raise a little bit more money and create an algorithm that would help filter.
01:09:29
Speaker 3
stories in TikTok for my kids, I’d be more likely to give my kids TikTok when they turn 14.
01:09:35
Speaker 3
Right now I know that they’re going to sneak it by going to YouTube and looking at YouTube shorts and all these other things because I cannot control that algorithm.
01:09:43
Speaker 3
And it does worry me.
01:09:45
Speaker 3
What kind of content that they’re getting access to?
01:09:47
Speaker 3
And you could do this by the.
01:09:48
Speaker 3
way, Chamath,
01:09:49
Speaker 3
On the operating system level or on the router level in your house you could say I want.
01:09:53
Speaker 3
The common sense algorithm.
01:09:54
Speaker 3
I will pay $25.00 a month.
01:09:57
Speaker 3
or $100 for the AirPort or whatever is on your router, and then any IP traffic that goes through would be filtered properly.
01:10:03
Speaker 3
I want less violence.
01:10:04
Speaker 3
I want less sex, you know, whatever.
01:10:05
Speaker 3
I think we are as a society, sophisticated enough now, yes, to have these controls.
01:10:11
Speaker 3
And so I think we need them and so.
01:10:13
Speaker 3
I think we.
01:10:13
Speaker 3
Do need to have the right?
01:10:17
Speaker 3
observation of the current state of play. Freeburg,
01:10:21
Speaker 3
when you hear algorithmic choice, what do you think the algorithm should be?
01:10:23
Speaker 2
I don’t. I don’t.
01:10:24
Speaker 3
I don’t agree with that, I
01:10:24
Speaker
Doubt it.
01:10:26
Speaker 2
Yeah, I I.
01:10:27
Speaker 3
don’t totally buy this. To bring this back to the monopolistic assumption, I, I think that there
01:10:34
Speaker 4
I think there.
01:10:35
Speaker 3
Are other places to access content and I think that there is still a free.
01:10:38
Speaker 3
Market to compete?
01:10:40
Speaker 3
And it is possible to compete.
01:10:42
Speaker 3
I think that we saw this happen with TikTok, we saw it happen with Instagram.
01:10:46
Speaker 3
We saw it.
01:10:46
Speaker 3
happen with YouTube,
01:10:48
Speaker 3
competing against Google Video or Microsoft Video.
01:10:51
Speaker 3
Prior to that, there has been a very significant battle for attention in kind of each next gen of media businesses, and we have seen Spotify compete, and, you know, Spotify continue to be challenged by emerging competitors.
01:11:06
Speaker 3
So I don’t buy the assumption that these are built-in monopolies, and therefore it allows
01:11:12
Speaker 3
some regulatory process to come in and say, hey, free speech needs to be actively enforced because they’re monopolies.
01:11:18
Speaker 3
This isn’t like when utilities laid power lines and sewer lines and and trains across the country and they had a physical monopoly on being able to access and move goods and services.
01:11:30
Speaker 3
The Internet is still.
01:11:31
Speaker 3
thank God, knock
01:11:32
Speaker 3
on wood, open, and the ability for anyone to build a competing service is still possible, and there is a lot of money that would love to disrupt these businesses and is actively doing it.
01:11:42
Speaker 3
And I think, every day, we look at how big TikTok has gotten. It is almost bigger than YouTube, or will be soon.
01:11:48
Speaker 3
And there is a competition that happens and because of that competition, I think that the market will ultimately choose where they want to get their content from and how they want to consume it.
01:11:58
Speaker 3
And I don’t think that the government should play a role so.
01:12:00
Speaker 3
Sacks, a rebuttal to that. Do you buy that?
01:12:02
Speaker 1
So, not all these companies are monopolies, but I think they acted in a monopolistic way with respect to
01:12:08
Speaker 1
restricting free speech, which is, they act as a cartel. They all share, like, best practices with each other on how to restrict speech.
01:12:16
Speaker 1
And we saw, the, the watershed here was when Trump was turned off.
01:12:20
Speaker 1
First, Twitter made the decision.
01:12:22
Speaker 1
You know, Jack.
01:12:23
Speaker 1
I don’t know, Jack.
01:12:24
Speaker 1
But basically the company.
01:12:25
Speaker 3
It wasn’t him, actually, he said.
01:12:27
Speaker 3
It was a woman who was running.
01:12:28
Speaker 3
the specific reasons he got deplatformed were listed.
01:12:30
Speaker 1
Yeah, however Jack feels.
01:12:33
Speaker 1
Twitter did it first, and then all the other companies followed suit.
01:12:36
Speaker 1
I mean, even, like, Pinterest and Okta and Snapchat,
01:12:40
Speaker 1
like, officially cut him off completely.
01:12:41
Speaker 3
Just to stay in line with everybody.
01:12:42
Speaker 1
Yeah, but Trump was actually on Facebook.
01:12:44
Speaker 1
He wasn’t on all these other companies.
01:12:46
Speaker 1
They still threw him off.
01:12:46
Speaker 1
So they all copy each other.
01:12:49
Speaker 1
And Jack actually said.
01:12:50
Speaker 1
That in his comments where he said.
01:12:51
Speaker 1
It was a.
01:12:51
Speaker 1
Mistake, he said.
01:12:53
Speaker 1
He didn’t realize the way in which.
01:12:55
Speaker 1
in which Twitter’s action would actually cascade.
01:12:59
Speaker 1
He said that he thought originally that the action was OK because it was just Twitter, it wasn’t taking away his right to free speech. But then it cascaded to all these other companies, and all these other companies, basically, you know, they’re all subject to the same political force. The, the leadership of these companies
01:13:16
Speaker 1
are all sort of cut from the same monocultural cloth. They all have the same political bias, the polls showed this.
01:13:22
Speaker 1
So the problem, Freeburg, is yeah, I agree, a bunch of these companies aren’t quite.
01:13:25
Speaker 1
Monopolies, but they all act.
01:13:26
Speaker 1
The same way.
01:13:27
Speaker 3
I hate the story, and I hate these actions.
01:13:27
Speaker 1
And so then, the, yeah, the collective effect is that of
01:13:29
Speaker 3
I’m agreeing with you.
01:13:32
Speaker 1
a speech cartel.
01:13:32
Speaker 1
Well, so.
01:13:33
Speaker 1
The question is how do you?
01:13:34
Speaker 1
Protect the rights of Americans to free speech in the face of a speech cartel that wants to basically block them.
01:13:40
Speaker 3
Go ahead, Freeburg.
01:13:41
Speaker 3
Here’s my argument.
01:13:42
Speaker 3
My argument is that these are not public service providers, they’re private service providers, and the market is telling them what to do.
01:13:48
Speaker 3
The market is.
01:13:49
Speaker 3
acting, I think. The pressure that was felt by Twitter was that so many consumers were pissed off that they were letting Trump rail on, or they were pissed off about January 6th, they were pissed off about whatever.
01:14:01
Speaker 3
Or whatever the current status is.
01:14:03
Speaker 3
The point is, they respond to the market, and they say, you know what, this has
01:14:07
Speaker 3
Crossed the line.
01:14:08
Speaker 3
And this was the case on public television when nudity came on.
01:14:11
Speaker 3
And they’re like, OK, you know what, we need to take that off the TV.
01:14:14
Speaker 3
We need to, the
01:14:15
Speaker 3
Market is telling us they’re.
01:14:16
Speaker 3
Going to boycott us and.
01:14:17
Speaker 2
I think that.
01:14:18
Speaker 3
there’s a market force here that we’re ignoring that is actually pretty,
01:14:21
Speaker 3
pretty relevant: if, as a private service provider, they’re going to lose half their audience because people are pissed off about one or two pieces of content showing up, then they’re acting in the best interests of their shareholders, in the best interest of their platform.
01:14:33
Speaker 3
They’re not acting as a public.
01:14:35
Speaker 1
Service. Uh, look I love.
01:14:36
Speaker 1
market forces as much as the next libertarian.
01:14:39
Speaker 1
But I just think that fundamentally that’s that’s not what’s going on here.
01:14:42
Speaker 1
It just has nothing to do with market forces, has everything to do with political forces.
01:14:46
Speaker 1
That’s what’s driving this. Well, do you think the average consumer, the average user of PayPal, is demanding that they engage in all these restrictive policies, shutting off all these accounts who have the wrong viewpoints?
01:14:58
Speaker 1
No, it has.
01:14:58
Speaker 1
Nothing to do with.
01:14:59
Speaker 3
It has to do with the vocal minority.
01:15:01
Speaker 1
Yeah, it’s a, it’s a small number of people who are, you know, vocal activists who work at these companies and create pressure from below.
01:15:01
Speaker 3
And in many cases.
01:15:08
Speaker 1
It’s also the, you know, the, the, the people from outside, the actors who create these boycott campaigns and pressure from outside.
01:15:14
Speaker 1
And then it’s basically people on Capitol Hill who have the same ideology, who basically create threats from above.
01:15:19
Speaker 1
So these companies are under enormous.
01:15:21
Speaker 1
pressure from above, below, and sideways, and it’s
01:15:24
Speaker 1
100% political.
01:15:26
Speaker 1
Hold on.
01:15:26
Speaker 3
I don’t think next.
01:15:26
Speaker 1
It’s not.
01:15:27
Speaker 1
It’s not about.
01:15:28
Speaker 1
Maximizing profits.
01:15:29
Speaker 1
I think it’s about maximizing.
01:15:31
Speaker 1
You know, political.
01:15:32
Speaker 1
outcomes, and that, that is what the American people need to
01:15:33
Speaker 4
Yeah, I don’t think so.
01:15:35
Speaker 3
Be protected from now.
01:15:37
Speaker 1
I will.
01:15:37
Speaker 1
I will add 1.
01:15:38
Speaker 1
Nuance to my theory though which?
01:15:41
Speaker 1
I’m not sure what level of the stack we should declare to be common carrier.
01:15:46
Speaker 1
So in other words, you may be right actually that at the level.
01:15:50
Speaker 1
Of YouTube.
01:15:51
Speaker 1
Or Twitter.
01:15:52
Speaker 1
Or Facebook.
01:15:52
Speaker 1
maybe we shouldn’t make them common carrier, and I’ll tell
01:15:55
Speaker 1
You why? Because.
01:15:56
Speaker 1
Just to take the other side right here for a second, which is.
01:15:59
Speaker 1
You know, if you don’t, because those companies do have legitimate reasons to take down some content.
01:16:05
Speaker 1
I don’t like the way they do it, but I do not want to see bots on there.
01:16:08
Speaker 1
I do not wanna see fake accounts and I actually don’t want to see like truly hateful speech or harassment.
01:16:15
Speaker 1
And the problem is I do worry that if you subject them to common carrier.
01:16:19
Speaker 1
They won’t actually engage.
01:16:21
Speaker 1
in, let’s say, legitimate curation of their social networks.
01:16:25
Speaker 3
Right.
01:16:26
Speaker 1
However, so.
01:16:26
Speaker 1
So there’s a real debate to be had there and it’s going to be messy.
01:16:30
Speaker 1
But I.
01:16:30
Speaker 1
Think there’s one.
01:16:31
Speaker 1
level of the stack below that, which is at the level of pipes, like an AWS, like a Cloudflare, like a PayPal, like the ISPs, like the banks. They are not doing any
01:16:41
Speaker 1
Content moderation or like no.
01:16:43
Speaker 1
Generally individual content moderation.
01:16:45
Speaker 1
No company should be allowed to engage in viewpoint discrimination.
01:16:48
Speaker 1
We have a problem right now
01:16:49
Speaker 1
Where American citizens are being denied access to payment.
01:16:52
Speaker 1
rails, so we had our
01:16:54
Speaker 1
Banking system.
01:16:54
Speaker 3
So in your view, A-
01:16:56
Speaker 3
WS shouldn’t be able to deny service to the Ku Klux Klan or some hate speech.
01:17:00
Speaker 3
group? I think
01:17:01
Speaker 1
That they should be under the same requirements the phone company is under.
01:17:05
Speaker 3
OK.
01:17:06
Speaker 1
So when you put it.
01:17:06
Speaker 3
I mean, it’s a very complicated issue, I think.
01:17:07
Speaker 1
That way it’s it’s you.
01:17:09
Speaker 1
Know the question is like look I could.
01:17:10
Speaker 1
put the same question to you: should, you
01:17:13
Speaker 3
Hybrids are there.
01:17:13
Speaker 1
know, such-and-such horrible
01:17:14
Speaker 1
group, should such-and-such horrible group be able to get a phone, a
01:17:17
Speaker 1
Phone account, right?
01:17:18
Speaker 3
Yeah, no.
01:17:18
Speaker 1
And you say, no, they can’t get anything but.
01:17:20
Speaker 1
They have that.
01:17:21
Speaker 3
Right.
01:17:21
Speaker 3
That has been litigated, and that’s been pretty much protected by the Supreme Court, you know?
01:17:26
Speaker 3
It’s a government conferred monopoly.
01:17:28
Speaker 3
The Supreme Court has said, OK, listen, like, it’s violating one’s constitutional rights,
01:17:33
Speaker 3
for example, if your water service gets terminated without you getting due process.
01:17:38
Speaker 3
And even the inverse is also true, so.
01:17:41
Speaker 3
For what?
01:17:42
Speaker 3
Whether we like it or not, Jason, that issue has been litigated, I think.
01:17:47
Speaker 3
I think, I think for many, again, just like practically speaking for the functioning of civil society, I think it’s very important for us.
01:17:56
Speaker 3
To now introduce this idea of algorithmic choice.
01:17:59
Speaker 3
And I don’t think that that will happen in the absence of us rewriting Section 230 in a more
01:18:05
Speaker 3
Intelligent way.
01:18:05
Speaker 3
I don’t know.
01:18:07
Speaker 3
Don’t know whether this specific case.
01:18:10
Speaker 3
Creates enough standing for.
01:18:11
Speaker 3
Us to do all of that.
01:18:14
Speaker 3
But I think it’s an important thing that we have to revisit as a society. Because, Jason, what you described, us having these algorithmic choices over time, where there are purveyors and sellers, can you imagine? That’s not a job or a company that the four of us would ever have imagined could be possible five years ago, but maybe there should be an economy of algorithms.
01:18:35
Speaker 3
And there are these really great algorithms.
01:18:38
Speaker 3
That one would want to pay a subscription for because one believes in the quality of what.
01:18:42
Speaker 3
It gives you.
01:18:44
Speaker 3
We should have that choice and I think it’s an important.
01:18:46
Speaker 3
set of choices that will actually allow,
01:18:49
Speaker 3
YouTube, as an example, to operate more safely as a platform, because it can say, listen, I’ve created this set of abstractions.
01:18:56
Speaker 3
You can plug in all sorts of algorithms.
01:18:58
Speaker 3
There’s a default algorithm that works, but then there’s a marketplace of algorithms.
01:19:03
Speaker 3
And then they don’t discriminate, the people choose.
01:19:07
Speaker 3
Well, this doesn’t even need YouTube, like, if it was
01:19:09
Speaker 3
On a chain.
01:19:10
Speaker 3
If all the videos, all the video content, was uploaded to a public blockchain and then distributed on a distributed computing system, then your ability to search and use that media would be a function of the service provider you’re willing to take, one that provides the best service experience.
01:19:25
Speaker 3
And by the way, this is also why I think.
01:19:27
Speaker 3
over time, the tech is going to kind of step in. I’m, I’m arguing both sides a
01:19:31
Speaker 3
Little bit, but I think that what will happen.
01:19:34
Speaker 3
I don’t think that the government should come.
01:19:35
Speaker 3
In and regulate.
01:19:36
Speaker 3
These guys and tell them that they can’t take stuff down and whatnot.
01:19:39
Speaker 3
I, I really don’t like the precedent it sets, period.
01:19:42
Speaker 3
I also think that it’s a terrible idea for YouTube and Twitter to take stuff down.
01:19:48
Speaker 3
And I think that there is an incredibly difficult balance that they’re going to have to find because if they do this.
01:19:53
Speaker 3
As we’re seeing right now.
01:19:55
Speaker 3
the quality of the experience for a set of users declines, and they will find somewhere else.
01:19:59
Speaker 3
And a market will develop for something else to compete effectively against them.
01:20:03
Speaker 3
And so, that’s why I don’t like the government intervening, because I want to see a better product emerge when the big company makes some stupid mistake and does a bad job, and then the market will find a better option.
01:20:13
Speaker 3
And it’s just, it’s messy in the middle.
01:20:15
Speaker 3
And as soon as you do government intervention on these things, and tell them what they can and can’t take down, I really do think that over time you will limit the user experience for what is possible if you allow the free market to work. This
01:20:27
Speaker 3
is where the industry needs to police itself. If you look at the movie industry, with the MPAA and Indiana Jones and the Temple of Doom, they came up with the PG-13 rating specifically for movies that were a little too edgy for PG.
01:20:28
Speaker 4
Right.
01:20:40
Speaker 3
This is where our industry.
01:20:41
Speaker 3
Could get ahead of this.
01:20:43
Speaker 3
They could give algorithmic choice and algorithmic App Store.
01:20:46
Speaker 3
And if you look at the original film, it was these lifetime.
01:20:49
Speaker 3
Yeah, it’s like Trump should not have been given a lifetime gone.
01:20:52
Speaker 3
They should have given the one.
01:20:53
Speaker 3
Year bond so should have had a.
01:20:55
Speaker 3
But in this day over week, we wouldn’t be in this position where it was so visceral.
01:20:58
Speaker 1
Jason, when you.
01:20:59
Speaker 1
think about, when you talk about, like, having an industry consortium like the MPAA, what you’re doing is formalizing the communication that’s already taking place, already happening, between these companies. And what is the result of communication?
01:21:10
Speaker 1
They all standardize on overly restrictive policies because they all share the same political.
01:21:13
Speaker 2
And then there is no free market, no but.
01:21:15
Speaker 3
bias. They, they could do it correctly.
01:21:17
Speaker 3
Yes, they could do it themselves.
01:21:18
Speaker 3
It has to be executed properly, like by the movie industry.
01:21:21
Speaker 3
Doesn’t matter.
01:21:22
Speaker 3
You’ll end up with the same problem of having the government intervene. Whether it’s the government or a private body,
01:21:26
Speaker 3
there being any sort of de facto standard intervention prevents the market from competing.
01:21:31
Speaker 3
I just, I disagree with you.
01:21:32
Speaker 3
I think you can create more competition if the government says, OK folks, you can have the standard algorithm, but you need to make a simple.
01:21:41
Speaker 3
abstracted way for somebody else to write some other filtering mechanism, basically so that users can dictate.
01:21:47
Speaker 3
I don’t.
01:21:48
Speaker 3
I don’t like that. Yeah, but what the MPAA
01:21:50
Speaker 3
Did was.
01:21:51
Speaker 3
I don’t understand why.
01:21:52
Speaker 3
Why isn’t?
01:21:52
Speaker 3
That more choice?
01:21:53
Speaker 3
Because otherwise, I suppose, as the product company, I don’t want to be told
01:21:56
Speaker 3
How to make my?
01:21:57
Speaker 3
Product right or not?
01:21:58
Speaker 3
You have an.
01:22:00
Speaker 3
Algo, you’re now saying that there is this distinction of the algo from the US from the data, and my choice might be to create different content libraries.
01:22:06
Speaker 3
For example, YouTube has YouTube Kids, and it’s a different content library and it’s a different user interface and it’s a different algorithm and you’re trying to create.
01:22:14
Speaker 3
an abstraction that may not necessarily be natural to the evolution of the product set of that company.
01:22:18
Speaker 2
I would much.
01:22:19
Speaker 3
Rather see them figure it out as a good argument. Again, if you were not a monopoly, I I would be more sympathetic, but because like somebody, somebody’s feelings would get hurt, or product manager feelings will get hurt instead.
01:22:20
Speaker 4
That’s not does it.
01:22:31
Speaker 3
The reason to not protect free speech being you’re unnaturally disrupting the product evolution? That’s what, that’s what happens when you’re worth $2 trillion. Get over it.
01:22:40
Speaker 3
You impact a billion people on the planet.
01:22:42
Speaker 3
When you start having massive impact in society, you have to take some responsibility.
01:22:47
Speaker 3
Most companies are not taking responsibility. And if you’re not super, super, super successful, this is not going to affect you, so you have
01:22:53
Speaker 3
Nothing to worry about.
01:22:54
Speaker 3
You’ll see.
01:22:55
Speaker 3
apps offshore, and you’ll see TikTok and other things
01:22:58
Speaker 3
Compete because they’ll.
01:22:58
Speaker 3
Have a better product experience.
01:23:00
Speaker 1
Nobody, no, no, no. Nobody’s going to create a new Google because they’re down-ranking, you know, one to 10% of the search results for political reasons.
01:23:08
Speaker 3
I agree.
01:23:09
Speaker 3
I agree.
01:23:09
Speaker 3
I agree.
01:23:10
Speaker 3
Some accountability for.
01:23:11
Speaker 1
In the ideal
01:23:12
Speaker 1
world, companies like
01:23:13
Speaker 1
Google and so forth would not take sides in political debates; they’d be politically neutral.
01:23:17
Speaker 1
But they’re not.
01:23:18
Speaker 1
You look at all the data around the political leanings of the people running these companies, and then you look at the actual actions of these companies, and they have become fully political, and they’ve waded into all these political debates, with the result that the American people’s
01:23:30
Speaker 1
rights to speech and to earn have been reduced.
01:23:33
Speaker 1
You have companies like PayPal, which is engaging in retaliation,
01:23:37
Speaker 1
basic financial retaliation, purely based on what political viewpoints people have. Why exactly does PayPal need to be in the business of censorship?
01:23:45
Speaker 3
Hey guys, let’s continue this conversation.
01:23:47
Speaker 3
Let’s continue this conversation.
01:23:49
Speaker 3
We’re not going.
01:23:49
Speaker 2
Trolling it.
01:23:49
Speaker 3
to stop calling him names.
01:23:51
Speaker 3
Yeah, see if they can get some servers over there, or maybe some more money spouts in this app, and just more shows listed. For the dictator, who needs to hit the loo to do a number two: yes, I am the world’s greatest moderator, Friedberg is the Sultan of Science. And do it fast.
01:24:10
Speaker 3
He’s the Prince of.
01:24:14
Speaker 3
See you next week on episode.
01:24:17
Speaker 3
Wait, is this 98 or?
01:24:18
Speaker 3
99? It’s 99, it’s 99.
01:24:23
Speaker 3
Well, enjoy.
01:24:24
Speaker 3
Well, rest.
01:24:24
Speaker 3
So we’re wrapping up here.
01:24:25
Speaker 3
I will see you all.
01:24:26
Speaker 3
Next time, have a great.
01:24:27
Speaker 3
movement. Bye-bye.
01:24:31
Speaker 1
Let your winners ride.
01:24:34
Speaker 3
Rain Man David Sacks.
01:24:38
Speaker 1
And we open sourced it to the fans, and they’ve just gone crazy with it.
01:24:54
Speaker 3
My driveway.
01:25:01
Speaker 3
We should all just get a room and just have one big huge orgy,
01:25:04
Speaker 3
because they’re
01:25:04
Speaker 3
all just useless. It’s like this sexual tension that they just need
01:25:07
Speaker 1
to release somehow.
01:25:14
Speaker 3
Where did you get merges?