@nelsonjchen
Created November 15, 2025 19:53
[00:00:00.000 --> 00:00:07.679] Hi guys, welcome back. I hope everyone enjoyed lunch. Our next presenter is George Hotz.
[00:00:07.799 --> 00:00:12.900] So please gather around and we're very excited to bring you the second half of the talks.
[00:00:13.359 --> 00:00:14.419] Welcome George.
[00:00:22.420 --> 00:00:25.519] On the mic and then it turns green and then you can hear me
[00:00:25.519 --> 00:00:28.460] I'll give everyone a minute to get in here
[00:00:28.460 --> 00:00:31.280] I made this logo myself
[00:00:31.280 --> 00:00:36.420] let's see if the clicker is going to work
[00:00:36.420 --> 00:00:38.299] we only have one AI picture in the slides
[00:00:38.299 --> 00:00:47.399] this whole space really is a game of Survivor.
[00:00:48.679 --> 00:00:51.079] A lot of people have been voted off the island.
[00:00:53.079 --> 00:00:55.219] So that's kind of what this talk is about.
[00:00:55.299 --> 00:00:58.520] I thought it was appropriate to steal branding from a reality show.
[00:00:59.520 --> 00:01:03.859] So my name is George Hotz, and this is Outwit, Outplay, Outlast.
[00:01:04.540 --> 00:01:07.099] So does this work?
[00:01:08.140 --> 00:01:08.859] There we go.
[00:01:09.340 --> 00:01:10.400] So you know what?
[00:01:10.879 --> 00:01:12.040] I don't work at Comma anymore.
[00:01:13.900 --> 00:01:18.500] I am still the president, but, you know, you heard from Adib,
[00:01:18.920 --> 00:01:22.620] who presented the Comma 4, and you'll hear from Harold after me.
[00:01:22.620 --> 00:01:24.819] They are the executives who run the company.
[00:01:25.359 --> 00:01:26.239] I am a cheerleader.
[00:01:28.079 --> 00:01:30.099] So I'm mostly here to cheerlead.
[00:01:30.680 --> 00:01:31.760] Sometimes I complain.
[00:01:33.019 --> 00:01:34.340] You know, like, oh, man, you know,
[00:01:34.379 --> 00:01:35.680] you think the cheerleaders ever complain?
[00:01:36.579 --> 00:01:38.540] Like, why'd you guys lose?
[00:01:38.760 --> 00:01:39.239] That sucked.
[00:01:39.299 --> 00:01:40.480] I could have thrown that ball better than that.
[00:01:42.140 --> 00:01:44.019] So, yeah, those are the things I do.
[00:01:46.939 --> 00:01:51.519] So, you know, you're hearing from a cheerleader here. You're not hearing from anyone with any secret inside knowledge, but this has just been
[00:01:51.519 --> 00:01:55.540] my observations. I used to work at Comma, so I know a little bit about it. We can talk about
[00:01:55.540 --> 00:01:59.260] what the original plan was. We were going to contract with Tesla to replace Mobileye.
[00:02:00.500 --> 00:02:04.000] The deal fell through. You know, we had some good press around that. That was fun, you know.
[00:02:04.260 --> 00:02:07.760] Elon's a great guy. It's really, uh, you know, the thing
[00:02:07.760 --> 00:02:13.680] I really like about Elon is he inspires you. Even if you're his enemy, if you're his friend, it doesn't
[00:02:13.680 --> 00:02:18.800] matter. It just shows you that great things are possible, right? And this, you know,
[00:02:18.800 --> 00:02:23.199] that's what makes me really happy. So, you know, even though the contract fell through, great things
[00:02:23.199 --> 00:02:25.020] are possible, right?
[00:02:26.879 --> 00:02:26.919] So we're going to build Autopilot anyway, right?
[00:02:29.699 --> 00:02:30.819] My contract was to replace Mobileye in the original Tesla Autopilot.
[00:02:33.560 --> 00:02:34.599] We build it anyway, and we're going to sell it to car makers.
[00:02:37.419 --> 00:02:37.659] You know, a lot of people are like, George, George, why doesn't Comma work with car makers?
[00:02:38.840 --> 00:02:39.439] You know, this was the original plan.
[00:02:42.360 --> 00:02:42.580] We were going to go sell to car makers, right?
[00:02:45.680 --> 00:02:46.259] But so we built it.
[00:02:46.840 --> 00:02:48.620] We built it.
[00:02:49.879 --> 00:02:51.060] This was the original Comma car.
[00:02:52.460 --> 00:02:53.840] I stole this from another
[00:02:53.840 --> 00:02:55.560] TV franchise. We got Engage.
[00:02:56.199 --> 00:02:56.280] Right?
[00:02:57.860 --> 00:02:59.020] This was on Bloomberg
[00:02:59.020 --> 00:03:01.500] 10 years ago. We had all this
[00:03:01.500 --> 00:03:03.699] stuff. I think Adib
[00:03:03.699 --> 00:03:04.800] talked about this a little.
[00:03:05.340 --> 00:03:07.819] Hotz added a joystick to the car center console.
[00:03:07.979 --> 00:03:11.099] A pull of the trigger engages the self-driving system, right?
[00:03:12.599 --> 00:03:14.280] Why did I use the cruise control button?
[00:03:16.060 --> 00:03:19.280] But business, so this works.
[00:03:19.819 --> 00:03:21.280] It was actually really good.
[00:03:22.099 --> 00:03:24.319] We did a road trip to Vegas.
[00:03:24.319 --> 00:03:25.460] We're driving.
[00:03:26.659 --> 00:03:28.599] It's not quite as good as modern openpilot,
[00:03:28.599 --> 00:03:29.800] but a lot of it was there.
[00:03:29.800 --> 00:03:32.319] A lot of the stuff we've done for the last 10 years
[00:03:32.319 --> 00:03:33.340] was shipability.
[00:03:34.419 --> 00:03:36.139] So we build this, and I'm like, okay,
[00:03:36.139 --> 00:03:38.560] so now I just gotta talk to the...
[00:03:42.039 --> 00:03:45.000] That's the only AI-generated photo in the slides.
[00:03:45.120 --> 00:03:47.860] It's kinda like, hello Mr. Ford.
[00:03:49.139 --> 00:03:51.080] Yes, okay, so you see this?
[00:03:52.800 --> 00:03:57.520] This, and Mr., you don't, what do you mean,
[00:03:57.520 --> 00:03:59.639] you don't know what a PID loop is?
[00:03:59.639 --> 00:04:01.659] No, no, no, no, but like, no, it's using like an AI model
[00:04:01.659 --> 00:04:02.539] and predicting the steering angle
[00:04:02.539 --> 00:04:07.379] and putting that into a PID loop. Oh.
[00:04:07.780 --> 00:04:09.740] Yeah.
[00:04:11.939 --> 00:04:12.039] I mean, it might as well be AI generated, right?
[00:04:14.319 --> 00:04:14.719] There is no Mr. Ford.
[00:04:16.779 --> 00:04:18.720] I was misled by how Tesla was run because there actually is a Mr. Tesla.
[00:04:19.100 --> 00:04:21.480] You can go to Elon and say you're using a neural network
[00:04:21.480 --> 00:04:22.639] with a PID loop, and he's like,
[00:04:22.699 --> 00:04:24.399] that's shit, we got model predictive control.
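The control scheme being described, a learned model that predicts a desired steering angle and a classical PID loop that tracks it, can be sketched roughly like this. This is a toy illustration, not comma's actual controller: the `PID` class, the gains, and the `model_predicted_angle` stub are all invented here.

```python
# Toy sketch: an "AI model" (stubbed out) predicts a target steering angle,
# and a textbook PID loop turns the tracking error into an actuator command.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured):
        # Classic PID: proportional + integral + derivative of the error.
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

def model_predicted_angle(camera_frame):
    # Stand-in for the neural network: in reality this would map a camera
    # frame to a desired steering angle (degrees). Fixed value for the demo.
    return 5.0

pid = PID(kp=0.5, ki=0.1, kd=0.05, dt=0.01)   # made-up gains, 100 Hz loop
target = model_predicted_angle(camera_frame=None)
torque = pid.update(target, measured=4.0)      # car reports 4.0 degrees
```

A real controller would also filter the derivative term and clamp the integral to avoid windup; this sketch omits both for brevity.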
[00:04:26.939 --> 00:04:29.639] So it turns out you can't just do this.
[00:04:29.800 --> 00:04:32.660] And even people today say, well, why doesn't Comma work with car makers?
[00:04:32.860 --> 00:04:36.399] And I want you to take that and then think about that a little bit more
[00:04:36.399 --> 00:04:38.399] and think about how that might actually happen.
[00:04:39.839 --> 00:04:41.800] Notice that a lot of people have this idea.
[00:04:42.819 --> 00:04:46.639] A lot of companies have the idea that they're going to sell to car makers.
[00:04:48.040 --> 00:04:52.639] And we're going to look later in these slides at how it worked out for them.
[00:04:53.339 --> 00:04:58.120] So giving up on this idea was the right move.
[00:04:58.680 --> 00:05:00.980] I understand that it doesn't feel right.
[00:05:00.980 --> 00:05:04.199] I understand that people have this idea that, well, but you should.
[00:05:04.319 --> 00:05:12.660] That's what makes you a legitimate business. Look at my figurehead polishing machine. It is fake legitimacy.
[00:05:14.600 --> 00:05:21.800] So we're going to pivot. I think, I just sell to car makers? Who came up with this idea? No? Me?
[00:05:22.459 --> 00:05:27.019] Oh, fire that guy. We're going to pivot. And what are we gonna pivot to?
[00:05:27.019 --> 00:05:29.560] Well, can a cell phone drive a car?
[00:05:29.560 --> 00:05:30.879] That's kind of the original question.
[00:05:30.879 --> 00:05:32.699] We thought about all that stuff and we're like,
[00:05:32.699 --> 00:05:34.819] can we run all of this on a cell phone?
[00:05:36.000 --> 00:05:39.720] And the answer was yes, and we did it in about six months.
[00:05:39.720 --> 00:05:41.839] And then we launched the Comma One.
[00:05:42.939 --> 00:05:48.060] Comma One was shipping by the end of the year for $999 and $24 a month,
[00:05:48.500 --> 00:05:54.639] right, which is actually exactly what the Comma 4 is. The Comma 4 is $999 and, you know, an optional
[00:05:54.759 --> 00:05:58.680] $24 a month. We don't have a page on the website describing it now, but you guys should buy Prime
[00:05:58.680 --> 00:06:08.740] anyway. It's in the FAQ, don't worry. But yeah, no, it's the exact same thing. But I didn't realize actually
[00:06:08.740 --> 00:06:15.540] how hard this was. Many times in my life I've been hit by Dunning-Kruger, and no one is ever
[00:06:15.540 --> 00:06:20.120] going to start a company if they don't believe that things are easy, right? What's the saying? It's,
[00:06:20.120 --> 00:06:26.199] you know, we didn't do it because it was easy. We did it because we thought it was going to be easy.
[00:06:27.680 --> 00:06:28.459] Yeah, so.
[00:06:33.139 --> 00:06:33.540] Yeah.
[00:06:34.899 --> 00:06:37.139] I mean, like, and I didn't really, like, see it,
[00:06:37.199 --> 00:06:39.439] but, like, we just kind of weren't there.
[00:06:42.399 --> 00:06:42.800] Yeah.
[00:06:45.759 --> 00:06:48.500] And then we got a reach-out from NHTSA.
[00:06:49.220 --> 00:06:54.100] I'm like, oh my God, guys, guys, it's a cell phone in a plastic case.
[00:06:54.240 --> 00:06:57.399] You're already harassing me and telling me you want fines.
[00:06:57.600 --> 00:07:04.360] And I'm just like, you know, when they go left, you just like eat a cheesecake.
[00:07:08.439 --> 00:07:08.480] You see what I mean by that? Right?
[00:07:11.540 --> 00:07:11.699] It's like, you know, the saying, when they go left, you go right. When they go left, you eat a cheesecake.
[00:07:17.439 --> 00:07:25.100] It's like these people don't even understand. Nobody understands what's happening right now with AI. It's so fake. And it was fake back then. It was just the beginnings of it.
[00:07:25.160 --> 00:07:27.959] Now people are kind of starting to catch on.
[00:07:28.480 --> 00:07:30.560] But 10 years ago, people were like, oh, the media.
[00:07:30.879 --> 00:07:32.220] Oh, this is in the New York Times.
[00:07:32.279 --> 00:07:33.540] That's a legitimate news source.
[00:07:34.100 --> 00:07:35.740] Right now it's like the media, right?
[00:07:37.540 --> 00:07:40.420] Try, try, try until you die is not what most people would advise.
[00:07:40.420 --> 00:07:44.300] In this case, however, startup comma AI is probably close to calling it quits.
[00:07:44.720 --> 00:07:45.540] SlashGear, 2016.
[00:07:46.379 --> 00:07:53.300] Lessons from the failure of George Hotz and the Comma One semi-autonomous driving system, Forbes, 2016.
[00:07:55.240 --> 00:07:59.459] Comma is taking its ball and going home, Automotive News, 2016.
[00:08:00.459 --> 00:08:04.740] Autonomous vehicle company shuts down in fear of regulation.
[00:08:07.180 --> 00:08:12.399] Like, this just isn't true. And it's so interesting when you see a media story about something
[00:08:12.399 --> 00:08:17.420] that you understand. This is Gell-Mann amnesia, right? Like, you see it, you
[00:08:17.420 --> 00:08:19.899] see a news article. Let's say, you know, you study physics and you see a news
[00:08:19.899 --> 00:08:22.959] article about physics, and you're like, oh God, these people don't know anything
[00:08:22.959 --> 00:08:26.540] about physics. Why can't they get a physics guy? Oh, international politics?
[00:08:26.540 --> 00:08:32.059] Wow, Iran is sure causing trouble. They must be really informed about this. It's
[00:08:32.059 --> 00:08:39.759] just like, no, they're so... Well,
[00:08:40.299 --> 00:08:44.559] skate where the puck is going, not where it has been. So this was the first video
[00:08:44.559 --> 00:09:23.259] of openpilot driving. We open sourced it.
[00:09:23.259 --> 00:09:24.759] This is the first openpilot release.
[00:09:24.759 --> 00:09:25.919] openpilot is an open source
[00:09:25.919 --> 00:09:30.159] driving agent. Currently it performs the functions of ACC and LKAS for Hondas and
[00:09:30.159 --> 00:09:34.679] Acuras. It's about on par with Tesla Autopilot at launch and better than all
[00:09:34.679 --> 00:09:42.519] other manufacturers. And that's still kind of true. So we didn't ship the Comma One.
[00:09:42.519 --> 00:09:46.200] It actually took a long time. And again, it really is Dunning-Kruger.
[00:09:46.360 --> 00:09:48.000] And no one's to blame but myself.
[00:09:48.220 --> 00:09:49.120] I don't blame NHTSA.
[00:09:49.220 --> 00:09:50.059] I don't blame the media.
[00:09:50.299 --> 00:09:51.460] It's fun to make fun of these people.
[00:09:51.460 --> 00:09:55.220] But fundamentally, at the end of the day, I had a phone and a 3D-printed case.
[00:09:55.840 --> 00:09:59.539] So four years later, we shipped a phone and a 3D-printed case.
[00:10:02.279 --> 00:10:02.879] No, no, no.
[00:10:02.960 --> 00:10:04.399] Look, the case is smaller.
[00:10:04.740 --> 00:10:06.980] Look, we added some IR LEDs down here.
[00:10:07.279 --> 00:10:08.039] We added a GPS.
[00:10:08.360 --> 00:10:09.580] There's a board in the back here.
[00:10:09.600 --> 00:10:10.240] It's pretty nice, right?
[00:10:10.279 --> 00:10:10.960] Who owned a Comma 2?
[00:10:12.899 --> 00:10:13.919] I had a few people, yeah.
[00:10:14.179 --> 00:10:16.059] You know, it was something.
[00:10:17.840 --> 00:10:20.700] And it was $999 and $24 a month.
[00:10:23.299 --> 00:10:28.539] So then after the Comma 2, we shipped the Comma 3.
[00:10:28.539 --> 00:10:30.919] The Comma 3 was starting to look like a real product.
[00:10:30.919 --> 00:10:33.820] The Comma 3 actually shipped pretty quickly after the Comma 2.
[00:10:33.820 --> 00:10:36.440] Look at that open pilot.
[00:10:36.440 --> 00:10:37.440] It looks pretty modern.
[00:10:37.440 --> 00:10:40.100] It looks pretty good.
[00:10:40.100 --> 00:10:42.720] But the Comma 3 was too expensive.
[00:10:42.720 --> 00:10:44.720] Who had a Comma 3?
[00:10:44.720 --> 00:10:45.899] Wow, more of you than Comma 2s.
[00:10:45.899 --> 00:10:48.120] I'm sorry you guys paid so much money.
[00:10:48.120 --> 00:10:49.240] But you know what?
[00:10:49.240 --> 00:10:51.580] Don't feel bad because however much money,
[00:10:51.580 --> 00:10:53.860] what, you're out two grand for a Comma 3?
[00:10:53.860 --> 00:10:56.799] We're out five million on the Comma 3.
[00:10:56.799 --> 00:11:00.120] Comma lost five million dollars on the Comma 3 project.
[00:11:00.120 --> 00:11:02.019] Sales stagnated.
[00:11:02.019 --> 00:11:03.240] The product was too expensive
[00:11:03.240 --> 00:11:05.639] and undifferentiated from the Comma 2.
[00:11:06.600 --> 00:11:07.100] It was a lot better.
[00:11:10.700 --> 00:11:11.539] We made a lot of progress in making the thing look like a consumer electronic.
[00:11:14.539 --> 00:11:15.220] But fundamentally, this didn't drive sales.
[00:11:16.379 --> 00:11:16.940] This is the Comma 2 era.
[00:11:18.379 --> 00:11:22.820] And this is the Comma 3 era.
[00:11:24.000 --> 00:11:24.320] But, you know, we fixed it.
[00:11:25.559 --> 00:11:25.659] We shipped the Comma 3X.
[00:11:26.720 --> 00:11:27.460] How many people had a Comma 3X?
[00:11:29.720 --> 00:11:30.879] All right, all right, a lot more Comma 3Xs out there.
[00:11:32.240 --> 00:11:33.740] And it was sold for a good price.
[00:11:34.600 --> 00:11:46.840] And that fixed sales. All right, so what do you guys think?
[00:11:50.340 --> 00:11:52.519] This is future-facing statements.
[00:11:52.620 --> 00:11:54.480] We have future-facing statements here.
[00:11:55.980 --> 00:11:56.139] Yeah.
[00:11:57.080 --> 00:12:01.840] So, no, I mean, I think that...
[00:12:01.840 --> 00:12:03.559] Oh, the presentation's pretty good.
[00:12:04.200 --> 00:12:05.039] The website's pretty good.
[00:12:06.000 --> 00:12:07.220] The device is amazing.
[00:12:08.419 --> 00:12:14.000] None of the stuff we've shown off does justice to just how much better this device is than the old one.
[00:12:16.879 --> 00:12:19.820] I mean, like, you see this, like, picture in the UI.
[00:12:19.820 --> 00:12:23.299] And, like, it looks like, oh, you know, this could be, like, Photoshopped by some, like, fake company.
[00:12:23.379 --> 00:12:27.539] You know, it's so difficult today to tell the difference between what's fake and what's real.
[00:12:28.559 --> 00:12:35.720] Because what makes something real is an incredibly difficult engineering problem.
[00:12:36.480 --> 00:12:42.620] And hopefully we conveyed some of that to you today, how just insanely difficult it is to make
[00:12:42.620 --> 00:12:45.759] this device. And people are like, oh, you want an 8-year-old mobile phone chip?
[00:12:46.080 --> 00:12:47.259] None of that matters.
[00:12:47.840 --> 00:12:49.799] My Apple II was more responsive.
[00:12:51.500 --> 00:12:54.139] Apple II running VisiCalc was more responsive
[00:12:54.139 --> 00:12:57.200] than my M4 MacBook running Google Sheets.
[00:12:58.039 --> 00:12:59.059] Why is this?
[00:12:59.879 --> 00:13:04.220] It's just due to this insane complexity of software.
[00:13:04.759 --> 00:13:07.279] None of it has anything to do with the actual hardware.
[00:13:08.720 --> 00:13:12.820] This thing is 10 years of refinement in a piece of software.
[00:13:13.019 --> 00:13:14.259] You saw the first version of the software.
[00:13:14.360 --> 00:13:15.379] You see the current version of the software.
[00:13:15.440 --> 00:13:17.139] I'm hearing about mutation testing this morning.
[00:13:17.240 --> 00:13:18.980] I'm like, damn, we've got to put that in tinygrad.
[00:13:23.039 --> 00:13:26.639] But yeah, I mean, I think this is the first one that
[00:13:26.639 --> 00:13:29.740] is really a top-tier consumer electronic.
[00:13:29.740 --> 00:13:32.279] I love consumer electronics
[00:13:32.279 --> 00:13:33.779] Adib really loves consumer electronics
[00:13:33.779 --> 00:13:36.399] yeah it's really
[00:13:36.399 --> 00:13:40.039] if you buy this device
[00:13:40.039 --> 00:13:41.740] and get one shipped to you
[00:13:41.740 --> 00:13:44.039] I think you'll find it
[00:13:45.179 --> 00:13:48.419] It might actually be the best unboxing experience
[00:13:48.419 --> 00:13:48.919] you've ever had.
[00:13:50.159 --> 00:13:51.799] And I don't say this like...
[00:13:51.799 --> 00:13:53.460] I say this, I'm actually thinking about it, right?
[00:13:54.100 --> 00:13:57.080] Because when people buy a new laptop,
[00:13:57.340 --> 00:13:59.679] it's like, okay, it's a little bit better than the old one.
[00:13:59.940 --> 00:14:01.179] And if you guys had a Comma,
[00:14:01.259 --> 00:14:03.259] maybe some of the magic would be lost.
[00:14:03.259 --> 00:14:04.899] But for someone who's never had this,
[00:14:05.980 --> 00:14:09.639] for someone to realize that you can buy a little device
[00:14:09.639 --> 00:14:12.019] that fits in the palm of your hand, stick it on your windshield,
[00:14:12.159 --> 00:14:14.460] and it drives the car, that's magic.
[00:14:15.700 --> 00:14:17.379] And you can describe it to a six-year-old.
[00:14:18.299 --> 00:14:19.179] So here's your test.
[00:14:19.279 --> 00:14:21.279] Here's your test for all technology.
[00:14:21.279 --> 00:14:23.480] Whenever someone tells you that they have some new technology,
[00:14:23.620 --> 00:14:30.419] oh, we're building humanoid robots. Okay, cool. Can I have one come over and have it cook me some pasta?
[00:14:31.620 --> 00:14:36.740] No, that doesn't exist. But you wouldn't know it. You wouldn't know it. If you consume media,
[00:14:36.740 --> 00:14:43.080] if you consume advertising, all of this stuff looks possible. That's fake.
[00:14:47.399 --> 00:14:47.980] Oh, AI coding is going to lay off 30% of engineers.
[00:14:49.500 --> 00:14:50.120] Like, actual AI coding?
[00:14:52.220 --> 00:14:53.539] You can't even use these things for customer support jobs.
[00:14:57.980 --> 00:15:01.379] The disconnect between reality and hype is insane. And I don't have any language to convey it.
[00:15:03.899 --> 00:15:08.659] I'm not better at hype and better at advertising than those people.
[00:15:08.759 --> 00:15:14.559] They're actually better at it than me. But what Comma is better at is making a product.
[00:15:15.779 --> 00:15:20.759] So I think if you've never driven a Comma and you buy this Comma 4, it might be the best
[00:15:20.759 --> 00:15:27.419] consumer electronics experience you've had since, I don't know, you guys remember
[00:15:27.419 --> 00:15:31.980] getting an N64 when you were a kid? Those are the real magic ones, right? Because from
[00:15:31.980 --> 00:15:37.700] the SNES to the N64, that was magic. From the PlayStation 4 to the PlayStation 5, and
[00:15:37.700 --> 00:15:40.919] now I'm playing Modern Warfare 6 instead of Modern Warfare 5. Well, it's not really better,
[00:15:41.019 --> 00:15:47.120] but they're shutting the server down for Modern Warfare 5. So it's a top-tier consumer electronic.
[00:15:47.200 --> 00:15:48.139] Look at this great circuit board.
[00:15:49.960 --> 00:15:51.639] Now, that's the circuit board from the Comma 3.
[00:15:52.080 --> 00:15:53.299] That's the circuit board from the Comma 4.
[00:15:55.379 --> 00:16:03.019] Adib already went through this, but, yeah, I mean, that's on par with the true state of the art.
[00:16:03.700 --> 00:16:06.500] Like I said, we have an SMT line that's, yeah,
[00:16:06.559 --> 00:16:07.799] doing things that they're not doing
[00:16:07.799 --> 00:16:09.080] in the fanciest SMT lines in China.
[00:16:10.519 --> 00:16:14.500] Vapor phase oven, 3D optical inspection.
[00:16:14.940 --> 00:16:15.720] They're good about inspection,
[00:16:15.860 --> 00:16:17.139] but they don't have the nice ovens.
[00:16:17.919 --> 00:16:19.179] That's what the front of it looks like.
[00:16:19.759 --> 00:16:22.159] This is an LTE modem, GPS antenna.
[00:16:22.720 --> 00:16:24.899] I did have a contribution to this.
[00:16:26.440 --> 00:16:30.940] I did, I did. Even though I'm a cheerleader, I cheerlead for something. I wanted the GPS antenna to be big. There was going to be a
[00:16:30.940 --> 00:16:34.620] small GPS antenna, but I'm like, the photons come from the sky, and here's the antenna,
[00:16:34.620 --> 00:16:57.000] and you've got to hit the... So, yeah, camera board connector. What's that? It's very low dynamic range. It's hard to see. What is success?
[00:16:57.000 --> 00:17:05.440] These are the kind of questions you start to think about. Who succeeded?
[00:17:07.559 --> 00:17:07.880] I should have included this in the talk.
[00:17:09.259 --> 00:17:10.099] There's a great quote from Jensen.
[00:17:11.920 --> 00:17:13.220] This man owns the biggest company in the world.
[00:17:14.000 --> 00:17:14.059] And he's, like, talking,
[00:17:15.420 --> 00:17:15.539] he's just, like, talking candidly in an interview.
[00:17:16.559 --> 00:17:16.640] He's like, man, you know,
[00:17:17.880 --> 00:17:17.980] if I'd known how much like,
[00:17:19.359 --> 00:17:20.339] oh, how much headache it was going to be and how hard it was going to be,
[00:17:20.400 --> 00:17:21.160] I never would have done it.
[00:17:23.599 --> 00:17:24.279] I love this.
[00:17:24.359 --> 00:17:25.200] I mean, I love this.
[00:17:25.200 --> 00:17:27.400] I love, you know, it's great that he, you know,
[00:17:27.400 --> 00:17:31.559] it's great to see the success of people like that.
[00:17:32.099 --> 00:17:36.099] And, yeah, I mean, if the top company in the world,
[00:17:40.559 --> 00:17:42.079] the greatest founder in history,
[00:17:42.079 --> 00:17:44.799] this is the biggest company in history, is saying this,
[00:17:44.799 --> 00:17:46.059] you know, then what is success?
[00:17:47.059 --> 00:17:48.400] And you think about a game of chess.
[00:17:48.980 --> 00:17:51.140] You think about what success means in a game of chess.
[00:17:52.759 --> 00:17:55.319] Success is when you win, right?
[00:17:56.140 --> 00:17:58.480] And you might lose, you might draw,
[00:17:58.880 --> 00:18:00.059] but either way the game ends.
[00:18:01.380 --> 00:18:02.440] Chess is a finite game.
[00:18:06.980 --> 00:18:07.839] But what does success look like in an infinite game?
[00:18:07.839 --> 00:18:12.519] Success is simply surviving.
[00:18:12.519 --> 00:18:17.359] So, uh,
[00:18:17.359 --> 00:19:07.000] you know, let's... Thank you. The American Pronunciation Guide presents "How to Pronounce Keith Rabois."
[00:19:22.000 --> 00:19:24.000] This is the founder's backer, Keith Rabois.
[00:19:24.000 --> 00:19:26.720] The founder of this company steals all the Comma rhetoric,
[00:19:27.240 --> 00:19:28.700] literally steals all my rhetoric,
[00:19:30.279 --> 00:19:32.900] has over 10x the funding.
[00:19:34.779 --> 00:19:36.059] Oh, they're going to ship.
[00:19:36.779 --> 00:19:37.920] Oh, now it's automaker partners,
[00:19:38.000 --> 00:19:40.500] but originally they were going to ship a kit to put it on your car.
[00:19:42.500 --> 00:19:57.900] $220 million. Who remembers Embark? Did a SPAC. Did anyone ever do a SPAC and succeed?
[00:19:58.500 --> 00:20:03.779] Did anyone ever go, oh yeah, I'm so glad I invested in that SPAC, bro. Come on.
[00:20:11.799 --> 00:20:17.140] I was on a panel with these people. I hear myself talking and I'm just like, oh my God, you have no idea what the hell I'm saying. They're actually
[00:20:17.140 --> 00:20:22.059] being investigated now for selling secrets to the Chinese or whatever. My thing is just
[00:20:22.059 --> 00:20:25.019] like, what secrets?
[00:20:26.759 --> 00:20:26.799] Billion-dollar hit on Argo, right?
[00:20:29.980 --> 00:20:30.079] This was the Ford-Volkswagen joint venture.
[00:20:32.180 --> 00:20:32.279] Shifts its bet to driver assist tech.
[00:20:37.119 --> 00:20:37.200] Yeah, you can right now go use BlueCruise on your Ford,
[00:20:38.420 --> 00:20:40.059] or you can buy a Comma.
[00:20:41.559 --> 00:20:42.339] How many billions, you know?
[00:20:44.099 --> 00:20:48.440] And then there's the really big one, of course.
[00:20:50.680 --> 00:20:52.339] Despite $10 billion spent.
[00:20:53.980 --> 00:20:54.220] Did you know that Cruise was originally,
[00:20:56.160 --> 00:20:56.359] I talked about this at last year's Comma Con,
[00:20:57.359 --> 00:20:57.740] or the Comma Con two years ago.
[00:21:01.400 --> 00:21:02.960] Cruise was originally building a $10,000 aftermarket kit for Audi A4s.
[00:21:04.559 --> 00:21:07.319] Kyle Vogt gave up on this dream.
[00:21:07.319 --> 00:21:09.079] He gave up, he talks about this on Lex Fridman,
[00:21:09.079 --> 00:21:10.579] because it was too hard.
[00:21:10.579 --> 00:21:11.759] There are too many different cars
[00:21:11.759 --> 00:21:13.700] you're gonna have to support.
[00:21:13.700 --> 00:21:17.819] Figuring out how to support all those cars is too hard,
[00:21:17.819 --> 00:21:21.240] and figuring out a graceful degradation
[00:21:21.240 --> 00:21:25.140] if everything isn't working perfectly is too difficult.
[00:21:26.660 --> 00:21:27.180] Comma solved all these problems years ago.
[00:21:28.640 --> 00:21:28.920] None of this stuff was hard, right?
[00:21:32.079 --> 00:21:32.299] But then people look, and it drives me crazy.
[00:21:37.259 --> 00:21:39.640] It's like there's still people out there who think Cruise is a bigger success than Comma.
[00:21:41.039 --> 00:21:44.720] They lost $10 billion!
[00:21:47.220 --> 00:21:47.900] I don't even know what to say.
[00:21:50.380 --> 00:21:50.440] Like reality comes and smacks you in the face,
[00:21:53.059 --> 00:21:53.140] and you're like, yeah, yeah, that was, yeah, yeah.
[00:21:54.279 --> 00:21:54.299] No, but it was a good run, man.
[00:21:54.980 --> 00:21:56.259] It was a good run.
[00:22:00.299 --> 00:22:01.200] For $10 billion, you could have had six of these.
[00:22:02.680 --> 00:22:02.900] It's the biggest building in the world.
[00:22:04.180 --> 00:22:04.319] We could have had one in Chicago.
[00:22:05.440 --> 00:22:06.359] We could have had one in San Francisco. We could have had one in LA.
[00:22:06.359 --> 00:22:08.240] We could have had one here in San Diego.
[00:22:08.240 --> 00:22:09.880] But no, we had Cruise.
[00:22:12.079 --> 00:22:13.460] Oh yeah, yeah, by the way, Kyle Vogt,
[00:22:13.460 --> 00:22:14.500] he's looking for investors.
[00:22:14.500 --> 00:22:16.740] He started a company called The Bot Company.
[00:22:16.740 --> 00:22:18.400] They raised $150 million.
[00:22:18.400 --> 00:22:20.779] They're building humanoid robots, guys.
[00:22:20.779 --> 00:22:22.599] Yeah, yeah.
[00:22:22.599 --> 00:22:26.000] Self-driving, that was too hard, but humanoid robots?
[00:22:26.000 --> 00:22:33.000] We chose self-driving because it's the easiest problem in applied AI, and it's still not close to done.
[00:22:33.000 --> 00:22:39.000] We can get to some things about that in a little bit, but humanoid robots are way harder.
[00:22:39.000 --> 00:22:42.000] Humanoid robots are so much harder than self-driving cars.
[00:22:42.000 --> 00:22:45.140] And so, you know, if you like losing money,
[00:22:45.319 --> 00:22:47.019] I mean, you can say it was a good ride.
[00:22:48.019 --> 00:22:49.660] Or invest in the new Burj Dubai.
[00:22:49.740 --> 00:22:50.380] I believe that.
[00:22:50.539 --> 00:22:51.339] Someone should build that.
[00:22:51.579 --> 00:22:54.339] Or actually, I really want to build a large statue of Taylor Swift.
[00:22:55.359 --> 00:22:59.140] Like, I want to build, like, a 500-foot stone statue of Taylor Swift
[00:22:59.140 --> 00:23:01.000] on that highway from L.A. to Vegas.
[00:23:01.779 --> 00:23:02.480] It's going to be sick.
[00:23:02.839 --> 00:23:03.960] It's my Taylor Swift statue.
[00:23:04.259 --> 00:23:06.539] If I ever get rich, that's what I'm gonna do.
[00:23:08.700 --> 00:23:11.940] So this is my slide from 2016, the players.
[00:23:11.940 --> 00:23:14.380] So we have Google, we have Otto.
[00:23:15.519 --> 00:23:17.319] Anthony Levandowski walked away with a lot of money
[00:23:17.319 --> 00:23:18.960] until they sued him for it.
[00:23:18.960 --> 00:23:20.259] And we have Tesla.
[00:23:20.259 --> 00:23:22.400] So, you know, yeah, let's,
[00:23:24.039 --> 00:23:29.119] who was notably absent from the Old Town Road, you know, compilation was Waymo.
[00:23:30.460 --> 00:23:31.500] Waymo's actually pretty cool.
[00:23:32.240 --> 00:23:33.799] And I'm not saying this, like, ironically.
[00:23:34.140 --> 00:23:40.039] And, like, what makes them cool is that I can download an app and use it.
[00:23:40.440 --> 00:23:43.279] And that's, again, my test for what real technology is.
[00:23:43.839 --> 00:23:46.119] It's not real technology if you see a video.
[00:23:46.319 --> 00:23:50.640] It's not real technology if Waymo is talking about how we're going to help the blind get around.
[00:23:51.240 --> 00:23:52.359] Why can't they just use Uber?
[00:23:53.480 --> 00:23:55.740] Like, huh? I don't get it. Stop it.
[00:23:56.220 --> 00:24:02.440] But when I can go to San Francisco and download an app and call a Waymo, that's actually pretty cool.
[00:24:04.059 --> 00:24:05.920] They're human-supervised and quasi-tele-op.
[00:24:06.000 --> 00:24:07.559] They're not a solution to self-driving cars.
[00:24:08.180 --> 00:24:09.740] They have questionable unit economics.
[00:24:10.160 --> 00:24:10.920] They use maps.
[00:24:11.579 --> 00:24:12.920] But they survived and they shipped.
[00:24:13.420 --> 00:24:14.220] And that makes them cool.
[00:24:14.720 --> 00:24:16.200] And it's really nice to be in the car alone.
[00:24:16.980 --> 00:24:18.720] If you read my 100x investment,
[00:24:19.319 --> 00:24:20.980] I wrote a blog post about this in 2018.
[00:24:21.759 --> 00:24:23.940] And my criticism of Waymo actually was not
[00:24:23.940 --> 00:24:24.680] that it would never work.
[00:24:25.500 --> 00:24:26.900] I remain quite optimistic
[00:24:26.900 --> 00:24:27.900] that self-driving cars
[00:24:27.900 --> 00:24:28.680] are going to work
[00:24:28.680 --> 00:24:29.400] and we're going to get to
[00:24:29.400 --> 00:24:30.539] when the timeline is.
[00:24:30.819 --> 00:24:32.200] And I think that it's very possible
[00:24:32.200 --> 00:24:33.380] that Waymo will actually
[00:24:33.380 --> 00:24:34.420] have the first solution
[00:24:34.420 --> 00:24:36.859] with these caveats.
[00:24:37.019 --> 00:24:39.900] But the unit economics of Waymo
[00:24:39.900 --> 00:24:41.420] would never be as good
[00:24:41.420 --> 00:24:43.420] as the unit economics of Uber.
[00:24:43.859 --> 00:24:44.759] I dated this girl for a bit
[00:24:44.759 --> 00:24:45.500] who was an Uber driver.
[00:24:46.339 --> 00:24:48.539] I'm like, I just looked at how much she was making on the rides and stuff,
[00:24:48.559 --> 00:24:49.759] and I'm like, you're losing money.
[00:24:50.720 --> 00:24:53.599] Like, when you take the cost of depreciation and gas and car maintenance,
[00:24:53.700 --> 00:24:54.559] she's actually losing money.
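The back-of-the-envelope math here can be sketched; every dollar figure below is an illustrative assumption, not a number from the talk:

```python
# Rough per-mile economics for a rideshare driver.
# All figures are illustrative assumptions, not data from the talk.

gross_per_paid_mile = 0.80   # assumed driver payout per paid mile
miles_per_paid_mile = 1.4    # assumed total miles driven (incl. deadheading) per paid mile

# Assumed per-mile vehicle costs for a typical sedan
depreciation = 0.25
fuel = 0.12
maintenance_and_tires = 0.10
extra_insurance = 0.08

cost_per_mile = depreciation + fuel + maintenance_and_tires + extra_insurance
net_per_paid_mile = gross_per_paid_mile - cost_per_mile * miles_per_paid_mile

print(f"vehicle cost per mile driven: ${cost_per_mile:.2f}")
print(f"net per paid mile:            ${net_per_paid_mile:.2f}")
```

Under these assumed numbers the driver nets only a few cents per paid mile, and slightly higher depreciation or deadheading pushes it negative, which is the point being made.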
[00:24:55.460 --> 00:24:58.240] Uber's new system has figured out ways to prey on people
[00:24:58.240 --> 00:24:59.240] who are bad at math, pretty much.
[00:24:59.940 --> 00:25:02.039] It's so amazing when you look at the disparity of resources
[00:25:02.079 --> 00:25:07.720] between an Uber driver and Uber
[00:25:07.720 --> 00:25:09.160] and all the little tricks they can pull.
[00:25:10.980 --> 00:25:15.019] So Waymo will never be able to compete with Uber on unit economics,
[00:25:15.980 --> 00:25:18.079] but they actually might not have to.
[00:25:18.319 --> 00:25:22.140] I thought that I was in the minority of people who didn't like having the person in the car,
[00:25:22.640 --> 00:25:24.339] but I actually think that's like 80% of people.
[00:25:25.599 --> 00:25:27.420] So Waymo, despite being tele-op
[00:25:27.420 --> 00:25:29.319] and despite having questionable unit economics,
[00:25:29.819 --> 00:25:32.339] may very well be a cool, successful service.
[00:25:33.119 --> 00:25:34.579] So, you know, it's interesting,
[00:25:34.700 --> 00:25:35.279] you know,
[00:25:35.339 --> 00:25:37.359] how Waymo outlasted the competition, right?
[00:25:37.859 --> 00:25:38.220] Surviving.
[00:25:40.160 --> 00:25:44.099] So yeah, you know, the players in 2024, right?
[00:25:44.180 --> 00:25:44.660] Like who's left?
[00:25:47.920 --> 00:25:48.900] Did I miss anyone?
[00:25:52.000 --> 00:25:53.200] Zoox, okay, okay.
[00:25:53.299 --> 00:25:55.599] To be fair, Zoox has something.
[00:25:56.000 --> 00:25:57.920] Does the car drive in all directions anymore?
[00:26:01.640 --> 00:26:02.680] Something, something, something.
[00:26:03.700 --> 00:26:07.240] But no, I mean, look, the big player here is, of course, Tesla.
[00:26:07.559 --> 00:26:08.460] It's always been Tesla.
[00:26:09.920 --> 00:26:13.099] We can just talk about, since the beginning,
[00:26:13.240 --> 00:26:15.779] we've always talked about how far we are behind Tesla.
[00:26:17.039 --> 00:26:18.319] And it has stayed that way.
[00:26:19.000 --> 00:26:23.960] They have managed to stay about two or three years ahead of us continually.
[00:26:23.960 --> 00:26:27.240] The new FSD, I was blown
[00:26:27.240 --> 00:26:32.220] away. At the leaps that were made from the FSD 12 that I tried, where I
[00:26:32.220 --> 00:26:35.460] was like, this is literally worse than openpilot, to FSD 14, which is like,
[00:26:35.460 --> 00:26:38.660] all right, they figured something out. We've got to figure that out too. I think we got it.
[00:26:38.660 --> 00:26:44.380] I think we got it. But you can look at the fleet size. This is the Waymo fleet, the
[00:26:44.380 --> 00:26:46.000] Comma fleet, the Tesla fleet.
[00:26:46.000 --> 00:26:50.000] Comma is the second largest video-connected fleet after Tesla.
[00:26:50.000 --> 00:26:53.000] And these are our growth numbers.
[00:26:53.000 --> 00:27:00.000] So we have almost 7,000 dailies and almost 12,000 monthlies.
[00:27:00.000 --> 00:27:09.380] So yeah, we're gathering big data, but again, there's really only one company to compare
[00:27:09.380 --> 00:27:11.960] yourself to in this space, and it's Tesla.
[00:27:11.960 --> 00:27:14.000] So let's talk about our distance to Tesla.
[00:27:14.000 --> 00:27:16.200] Here's some breakdowns of some different axes, right?
[00:27:16.200 --> 00:27:19.500] So when you look at the difference in fleet size between Comma and Tesla, we're off by
[00:27:19.500 --> 00:27:20.500] three orders of magnitude.
[00:27:20.500 --> 00:27:23.559] So Comma's 10K, Tesla's about 10 million.
[00:27:23.559 --> 00:27:28.960] When you look at the total number of training, total amount of training compute that we have, you can see the tiny
[00:27:28.960 --> 00:27:34.920] box pros over there. You'll hear about them more in the talk later. We have about 1,000
[00:27:34.920 --> 00:27:41.619] GPUs. Tesla has about 100,000 GPUs. So we're off by two orders of magnitude there. Inference
[00:27:41.619 --> 00:27:45.000] compute, which is how much compute is actually on the device.
[00:27:45.000 --> 00:27:48.000] So if you heard about the 10-watt version and the 100-watt version, right?
[00:27:48.000 --> 00:27:50.000] The 100-watt version is not a different device.
[00:27:50.000 --> 00:27:53.000] It's literally just a GPU that plugs into your Comma 4.
[00:27:53.000 --> 00:27:56.000] It's a consumer GPU that plugs into your Comma 4.
[00:27:56.000 --> 00:28:03.000] Your average consumer GPU has the same power as the Tesla FSD AI computer.
[00:28:03.000 --> 00:28:08.539] You know, Tesla will never tell you this, but like a 5090 crushes it.
[00:28:08.539 --> 00:28:10.359] A 5070, that's about the same.
[00:28:11.720 --> 00:28:13.660] But a lot of this is just because of power.
[00:28:13.660 --> 00:28:15.200] Right, the Tesla thing draws 300 watts,
[00:28:15.200 --> 00:28:16.859] a 5090 draws 600 watts.
[00:28:18.359 --> 00:28:20.220] So yeah, inference compute, Tesla's ahead
[00:28:20.220 --> 00:28:23.180] by two orders of magnitude, but we'll get to that one later,
[00:28:23.180 --> 00:28:24.720] and then we have yearly spend.
[00:28:24.720 --> 00:28:28.680] So Comma's yearly spend is about 10 million. Tesla is about 100 billion. So that's four
[00:28:28.680 --> 00:28:33.720] orders of magnitude. So training compute, we closed from three orders of magnitude to
[00:28:33.720 --> 00:28:40.160] two orders of magnitude by building our computer factory. Inference compute, we've closed from
[00:28:40.160 --> 00:28:44.680] two orders of magnitude to zero orders of magnitude. So over there on the table, we
[00:28:44.680 --> 00:28:45.940] have the comma compute boxes,
[00:28:46.119 --> 00:28:48.000] but this is what the comma compute box really is.
[00:28:48.559 --> 00:28:52.759] This is an AMD Radeon RX 9060 XT.
[00:28:54.960 --> 00:28:58.240] It's a $300 GPU, 205 tops,
[00:28:58.660 --> 00:29:01.839] and it already just works with the comma 3, 3X, and 4.
[00:29:02.420 --> 00:29:03.779] The software won't come to the 3,
[00:29:03.859 --> 00:29:05.220] but the software will come to the 3X and 4,
[00:29:05.299 --> 00:29:07.380] and someone can backport it to the 3 if they want.
[00:29:07.839 --> 00:29:10.220] Nice power enclosure coming soon,
[00:29:10.339 --> 00:29:12.140] or you can build your own out of parts on Amazon.
[00:29:13.940 --> 00:29:16.400] We'll charge some amount of markup for the nice box,
[00:29:16.980 --> 00:29:18.240] but you're welcome to build your own.
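The order-of-magnitude gaps quoted above fall straight out of the round numbers in the talk, and can be sketched as:

```python
import math

# Round figures quoted in the talk (Comma vs. Tesla); order-of-magnitude only.
comparison = {
    "fleet size":       (10_000,     10_000_000),
    "training GPUs":    (1_000,      100_000),
    "yearly spend ($)": (10_000_000, 100_000_000_000),
}

for axis, (comma, tesla) in comparison.items():
    gap = round(math.log10(tesla / comma))  # orders of magnitude between the two
    print(f"{axis:16s} Tesla ahead by 10^{gap}")
```

Inference compute is the one axis where the talk puts the gap at zero, since a consumer GPU roughly matches the Tesla FSD computer.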
[00:29:20.200 --> 00:29:23.140] Okay.
[00:29:23.220 --> 00:29:28.799] So how much longer to solve self-driving?
[00:29:31.519 --> 00:29:33.099] This is teslafsdtracker.com.
[00:29:35.859 --> 00:29:35.940] It's probably the best way to kind of get at this.
[00:29:37.220 --> 00:29:37.339] We have an internal version of this,
[00:29:38.480 --> 00:29:38.980] but it's a little bit different.
[00:29:40.500 --> 00:29:41.980] I mean, like, Tesla and Comma are pushing on fundamentally different things.
[00:29:42.779 --> 00:29:45.579] Comma's slogan is make driving chill.
[00:29:46.579 --> 00:29:49.480] Our question is like how do we achieve highway perfection?
[00:29:51.579 --> 00:29:54.779] Tesla's question is how can I ship this crazy feature?
[00:29:54.779 --> 00:29:56.039] Right, like look at this crazy feature
[00:29:56.039 --> 00:29:58.579] is kind of like Tesla's unofficial slogan.
[00:29:58.579 --> 00:30:00.140] So yeah, you can look at the numbers,
[00:30:00.140 --> 00:30:03.700] and with FSD 14,
[00:30:03.700 --> 00:30:09.740] the felt experience is really noticeable in the data.
[00:30:11.619 --> 00:30:16.299] However, every order of magnitude takes the same amount of time.
[00:30:17.000 --> 00:30:19.880] There's a simple way to predict how long things are going to take.
[00:30:20.720 --> 00:30:25.920] Human car accidents happen about every 500,000 miles.
[00:30:28.920 --> 00:30:32.799] If you believe FSD tracker, current Tesla stuff is around 3,000 miles per critical disengagement.
[00:30:32.799 --> 00:30:35.799] It gets about 2x better every year.
[00:30:35.799 --> 00:30:39.039] If you want to figure out how long something's gonna take,
[00:30:39.039 --> 00:30:42.039] take the trend line and continue it.
[00:30:42.960 --> 00:30:44.960] We have eight more years.
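The eight-year figure follows directly from the numbers just quoted: start at roughly 3,000 miles per critical disengagement, double every year, and target the human rate of about one accident per 500,000 miles:

```python
import math

human_miles_per_accident = 500_000       # human benchmark quoted in the talk
current_miles_per_disengagement = 3_000  # current figure, per teslafsdtracker.com
improvement_per_year = 2.0               # "gets about 2x better every year"

# Number of yearly doublings needed to close the gap
years = math.log(human_miles_per_accident / current_miles_per_disengagement,
                 improvement_per_year)
print(f"roughly {years:.1f} years")  # just under eight
```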
[00:30:50.059 --> 00:30:54.859] And you know what? Whenever I say eight years, there's two groups of people. One's like, no, man,
[00:30:54.900 --> 00:30:59.039] have you seen what ChatGPT can do? It's going to be, like, next week. And, of course, those people
[00:30:59.039 --> 00:31:03.579] are never right. But there's another group of people out there, too, who says, no, man, you know,
[00:31:03.599 --> 00:31:07.940] AI is never going to have, like, the human reasoning needed to drive a car or something.
[00:31:08.099 --> 00:31:10.920] Or the brain is, it's just like, you're not right either.
[00:31:11.619 --> 00:31:15.480] The actual timeline is going to be about eight years.
[00:31:19.180 --> 00:31:20.180] And that's for Tesla.
[00:31:22.799 --> 00:31:40.200] If Tesla will be there in eight, we'll be there in ten. So, you know, we're halfway.
[00:31:40.200 --> 00:31:46.960] And this is my last slide. I want you guys to remember something else that's lost in a lot of self-driving car rhetoric.
[00:31:48.319 --> 00:31:50.700] A self-driving car is not a car.
[00:31:51.660 --> 00:31:56.039] I have no idea why Cruise and Waymo and Zoox decided to build cars.
[00:31:57.140 --> 00:31:57.880] It's nonsense.
[00:31:58.539 --> 00:32:01.759] A self-driving car is a dude.
[00:32:03.079 --> 00:32:04.619] He's the self-driving.
[00:32:05.160 --> 00:32:06.019] That's a car.
[00:32:06.900 --> 00:32:08.519] We just want to build a self-driving.
[00:32:08.700 --> 00:32:09.900] We don't want to build a car.
[00:32:11.079 --> 00:32:13.079] The ultimate comma
[00:32:13.079 --> 00:32:16.079] is not a device that plugs into your car.
[00:32:16.400 --> 00:32:18.500] It's a robot that gets in the driver's seat.
[00:32:20.180 --> 00:32:21.279] That's where we're going.
[00:32:22.299 --> 00:32:23.160] Why should we,
[00:32:23.160 --> 00:32:31.200] oh, you're going to crack the Toyota security? Oh no, who cares. Build a robot that gets in the seat and grabs the wheel and
[00:32:31.200 --> 00:32:38.240] turns the wheel. Now, don't do this tomorrow, because it's not going to work. Right, having
[00:32:38.240 --> 00:32:54.819] accurate predictions about technology is super important if you want to know where to invest. I wish VCs understood any concept of this. But yeah, openpilot is a general purpose robotics operating system.
[00:32:54.819 --> 00:32:58.839] Everyone always says, oh, Comma, you're going to die if you don't partner with
[00:32:58.839 --> 00:33:02.859] car makers. Everyone's going to get security. Or, oh, my God, I don't care. I want to build
[00:33:02.859 --> 00:33:11.960] a robot. I want it to cook and clean for me. But like actually do this. Not build up some hype. In the same way Cruise told
[00:33:12.599 --> 00:33:15.680] you they were going to solve self-driving and wasted $10 billion, you're going to see
[00:33:15.680 --> 00:33:19.700] all the same stuff happen in humanoid robots. And guess who's going to be there to pick
[00:33:19.700 --> 00:33:29.180] up the pieces? Comma and Tesla. Same people who are there today. So, yeah. No, I mean, the Comma
[00:33:29.180 --> 00:33:34.759] 9 is going to be that guy. Get your car. Is it compatible with my car? Yeah, bro. He sits
[00:33:34.759 --> 00:33:39.599] in the seat. All right. That's my talk. I'll take questions.
[00:33:47.619 --> 00:33:50.720] All right. Do we have some questions?
[00:33:52.960 --> 00:33:53.259] I just want to say I'm ready for the comma nine.
[00:33:53.960 --> 00:33:55.119] Yeah, right?
[00:33:57.960 --> 00:33:58.160] Do you see any reason to shorten that two-year gap after Tesla?
[00:33:59.180 --> 00:33:59.380] Would you want to make that shorter?
[00:34:00.700 --> 00:34:01.240] Are you fine being at that?
[00:34:03.720 --> 00:34:08.940] So when you look at some of the fundamental things, like that training order of magnitude thing, so Comma's new compute cluster cost about
[00:34:08.940 --> 00:34:14.940] two million dollars. Tesla's new compute cluster cost about at minimum 200
[00:34:14.940 --> 00:34:20.320] million. Right, so it's even more, more like 500 million. So they're paying a ton
[00:34:20.320 --> 00:34:27.039] of premium to be early. I don't think it's worth it. I'm certainly okay with being two years behind,
[00:34:29.039 --> 00:34:30.619] meaning if you could have a sustainable business
[00:34:30.619 --> 00:34:32.119] in the meantime.
[00:34:32.119 --> 00:34:34.599] What might shorten that gap, if it turns out
[00:34:34.599 --> 00:34:38.300] the Comma 4 sells wildly more than expected,
[00:34:38.300 --> 00:34:40.500] and we have tons and tons of money coming in,
[00:34:40.500 --> 00:34:43.760] we may be able to deploy it somewhat to shorten that gap,
[00:34:43.760 --> 00:34:45.500] but I can't even make any guarantees.
[00:34:45.500 --> 00:34:47.800] If I can make guarantees, maybe it's worth raising the money,
[00:34:47.800 --> 00:34:49.699] but you can't make guarantees.
[00:34:49.699 --> 00:34:52.119] But I do think that maybe if we, you know,
[00:34:52.119 --> 00:34:56.059] again, the most important thing is survival.
[00:34:56.059 --> 00:34:59.960] So the question is not do you want to shorten that gap?
[00:34:59.960 --> 00:35:03.059] The question is how much extra risk are you willing
[00:35:03.059 --> 00:35:04.139] to take to shorten that gap?
[00:35:04.139 --> 00:35:07.760] And the answer is zero.
[00:35:11.019 --> 00:35:13.179] I was just curious, when you speak about the Comma 9 and cooking and cleaning and all those sorts of things,
[00:35:13.340 --> 00:35:16.159] do you mean that literally that the final form factor of a comma
[00:35:16.159 --> 00:35:17.599] would be a general purpose robot?
[00:35:17.719 --> 00:35:17.980] Of course.
[00:35:18.280 --> 00:35:18.699] Okay, wow.
[00:35:18.699 --> 00:35:18.920] Yeah.
[00:35:20.179 --> 00:35:22.480] The car thing is just, again,
[00:35:22.619 --> 00:35:24.920] cars are just the best applied AI problem that exists today.
[00:35:25.079 --> 00:35:28.300] Self-driving cars are going to be solved long before humanoid robotics.
[00:35:29.860 --> 00:35:31.380] But yeah, what, we're going to stop?
[00:35:31.900 --> 00:35:34.639] Oh, guys, guys, guys, innovation's over. Let's exploit.
[00:35:35.099 --> 00:35:38.019] Right? Yeah, let's crank up the monthly subscriptions on people.
[00:35:38.119 --> 00:35:39.980] Let's put some ads in that shit. Yeah.
[00:35:40.199 --> 00:35:41.920] What am I going to do with the money?
[00:35:42.639 --> 00:35:45.599] All these people who are like, I got rich. What did you buy?
[00:35:46.639 --> 00:35:48.159] There's nothing to buy.
[00:35:49.219 --> 00:35:50.280] Taylor Swift statue.
[00:35:51.420 --> 00:35:52.380] Taylor Swift statue.
[00:35:52.460 --> 00:35:52.820] Good point.
[00:35:53.099 --> 00:35:55.440] It's actually, I think it can be done pretty cheaply.
[00:35:56.500 --> 00:35:58.480] I'm not licensing Taylor Swift, by the way.
[00:35:58.539 --> 00:36:01.119] It's called generic female pop star, if anybody asks.
[00:36:02.480 --> 00:36:07.000] See, actually, I could probably do this today, but I would need time.
[00:36:13.179 --> 00:36:18.219] So I share the sentiment that humanoid robotics startups are dumb, they will fail, but what
[00:36:18.219 --> 00:36:23.219] makes Tesla Optimus a good bet? Or is it a good bet?
[00:36:23.219 --> 00:36:25.900] Yeah, do you know what Honda ASIMO was?
[00:36:28.300 --> 00:36:28.340] So Honda, really in its heyday in the 90s,
[00:36:29.420 --> 00:36:29.820] also built a humanoid robot.
[00:36:32.480 --> 00:36:35.239] For a largely successful car company to take 0.1% of their operating budget
[00:36:35.239 --> 00:36:37.260] and dump it into humanoid robots for press,
[00:36:37.599 --> 00:36:38.099] great idea.
[00:36:38.679 --> 00:36:41.119] For a startup to raise a huge amount of money
[00:36:41.119 --> 00:36:43.139] saying they're eventually going to pay investors back,
[00:36:43.199 --> 00:36:43.820] that's absurd.
[00:36:44.460 --> 00:36:48.960] Tesla's building the Optimus. Tesla already has Optimus robots deployed. They're already useful. They're in the
[00:36:48.960 --> 00:36:55.039] Tesla showrooms waving to you. Right? That's sick. Right? Spending, you know, 10 million, 100 million
[00:36:55.039 --> 00:37:00.880] dollars on a sick PR project for Tesla that may turn into something in the future? So worth it.
[00:37:01.760 --> 00:37:05.159] Raising that kind of money? Ridiculous.
[00:37:06.699 --> 00:37:08.559] The reason it makes sense for Tesla is because they're a highly profitable automaker.
[00:37:10.539 --> 00:37:11.059] Questions?
[00:37:15.440 --> 00:37:18.920] So to continue on and look at the chasm that exists
[00:37:18.920 --> 00:37:21.099] from where it is today to comma nine
[00:37:21.099 --> 00:37:24.119] and that what we see now is body
[00:37:24.119 --> 00:37:26.460] is kind of the beginning of the compute
[00:37:26.460 --> 00:37:33.260] core for that. What are your insights around the grassroots oriented applications for comma
[00:37:33.260 --> 00:37:37.360] body to start stepping towards a dude who gets in the car?
[00:37:37.360 --> 00:37:42.239] Yeah, I mean, okay, so there's one example of consumer robots that actually works. Who
[00:37:42.239 --> 00:37:46.079] knows what it is? Vacuum cleaners, right?
[00:37:46.079 --> 00:37:49.500] So yeah, I mean, can the Comma body vacuum?
[00:37:49.500 --> 00:37:50.980] I mean, I wanna do it a little bit differently, right?
[00:37:50.980 --> 00:37:52.500] Like I don't really wanna build a vacuum.
[00:37:52.500 --> 00:37:55.139] My idea is like, can we just like buy a vacuum off Amazon
[00:37:55.139 --> 00:37:57.679] and then like, you know, have the body grab it and vacuum,
[00:37:57.679 --> 00:37:59.739] right, it should tool use, right?
[00:37:59.739 --> 00:38:01.880] So I think once we're at that milestone,
[00:38:01.880 --> 00:38:05.219] we've also considered the Comma body for security.
[00:38:05.219 --> 00:38:08.300] But then you realize that 95% of what you want
[00:38:08.300 --> 00:38:11.159] is just a fixed mounted camera and a loud speaker.
[00:38:11.159 --> 00:38:12.940] And good software.
[00:38:12.940 --> 00:38:15.420] So you actually don't need the robot form factor for that.
[00:38:15.420 --> 00:38:18.719] But vacuums are the first robotic form factor
[00:38:18.719 --> 00:38:20.239] to pay for themselves.
[00:38:20.239 --> 00:38:22.900] So once the Comma body, once the general purposeness
[00:38:22.900 --> 00:38:24.639] of the Comma body is capable of grabbing
[00:38:24.639 --> 00:38:27.219] your existing vacuum cleaner off Amazon and using it,
[00:38:27.500 --> 00:38:29.920] then I think we can start selling the comma body into people's homes.
[00:38:30.360 --> 00:38:35.159] But fundamentally, it has to come, and someone might figure out an application, right?
[00:38:35.619 --> 00:38:40.420] I don't know if the body is going to be talked about more, but our goal with the new body, the first body was $1,000.
[00:38:41.019 --> 00:38:42.619] That's insane, right?
[00:38:42.639 --> 00:38:43.880] Like, how can we try to charge you?
[00:38:44.039 --> 00:38:46.159] We charge you $1,000 for a Comma 4, and you'll
[00:38:46.159 --> 00:38:48.079] use that shit 40% of your days.
[00:38:48.239 --> 00:38:49.199] That's so worth it.
[00:38:50.079 --> 00:38:52.219] But if you buy a Comma body, okay, it's a cool
[00:38:52.219 --> 00:38:54.139] toy. We've got to price it more like a toy.
[00:38:54.440 --> 00:38:56.059] So the new Comma body is going to be priced at $200.
[00:38:57.139 --> 00:38:57.960] And I hope
[00:38:57.960 --> 00:39:00.179] that someone out there with a Comma 4 and a Comma body
[00:39:00.179 --> 00:39:02.139] figures out some use for these things.
[00:39:02.559 --> 00:39:03.599] And we'll productionize it.
[00:39:03.840 --> 00:39:06.039] But I think Unitree is making the same play, right?
[00:39:06.039 --> 00:39:08.400] Like we bought two of those dogs, they're cool.
[00:39:08.400 --> 00:39:09.619] But they don't make money.
[00:39:09.619 --> 00:39:12.360] There's no way to like deploy this thing in a real application where it like
[00:39:12.360 --> 00:39:13.860] provides value to you, it's a toy.
[00:39:15.019 --> 00:39:17.719] So yeah, we're targeting the toy market for bodies.
[00:39:17.719 --> 00:39:19.519] Hopefully someone will figure out what to do with it.
[00:39:19.519 --> 00:39:20.860] The software will get better.
[00:39:20.860 --> 00:39:25.980] But yeah, until you figure out how to make the robot do something that makes money.
[00:39:33.579 --> 00:39:33.980] All right. Next question. Hey, George. Hey, if you were looking to put me to work,
[00:39:40.179 --> 00:39:40.380] how much would you buy me for? Negative. Honestly, negative. And here's why, right? Like,
[00:39:46.119 --> 00:39:48.119] so much of the fact that things don't scale is due to managerial overhead, right? So if someone comes to you and says,
[00:39:48.119 --> 00:39:49.519] how could you put me to work?
[00:39:49.519 --> 00:39:51.420] You're actually asking something of me.
[00:39:51.420 --> 00:39:54.320] You're asking for me to do something, right?
[00:39:54.320 --> 00:39:55.820] And that's why it's negative, right?
[00:39:55.820 --> 00:39:57.340] You gotta pay me for that, right?
[00:39:57.340 --> 00:39:59.239] The real way to be put to work
[00:39:59.239 --> 00:40:01.659] is to go work on tinygrad bounties.
[00:40:01.659 --> 00:40:02.500] Just go work on them.
[00:40:02.500 --> 00:40:05.579] Or go work on comma bounties. But yeah, no,
[00:40:05.659 --> 00:40:06.300] I think the,
[00:40:07.659 --> 00:40:08.079] yeah, I mean,
[00:40:08.159 --> 00:40:08.639] God, we're getting
[00:40:08.639 --> 00:40:09.059] to the point
[00:40:09.059 --> 00:40:09.739] where the marginal
[00:40:09.739 --> 00:40:11.280] utility of a human
[00:40:11.280 --> 00:40:11.800] is negative
[00:40:11.800 --> 00:40:12.659] and that's scary,
[00:40:12.760 --> 00:40:12.840] but.
[00:40:14.920 --> 00:40:15.440] All right,
[00:40:15.519 --> 00:40:16.119] another question?
[00:40:28.920 --> 00:40:29.139] Thank you for the talk.
[00:40:32.219 --> 00:40:32.840] I wanted to ask about the Chinese competition,
[00:40:35.260 --> 00:40:36.300] especially like Baidu and these companies.
[00:40:37.079 --> 00:40:37.360] What about them?
[00:40:39.699 --> 00:40:39.760] I think they make a search engine and a maps app.
[00:40:40.820 --> 00:40:41.440] That's kind of confusing to use.
[00:40:44.900 --> 00:40:46.119] I think they are making 500,000 trips a week.
[00:40:46.260 --> 00:40:46.800] Where?
[00:40:47.840 --> 00:40:48.119] Baidu, in China.
[00:40:48.559 --> 00:40:50.239] In what city?
[00:40:50.980 --> 00:40:51.500] Wuxi, I think.
[00:40:52.019 --> 00:40:52.400] I haven't been there.
[00:40:53.139 --> 00:40:53.179] I haven't tried it.
[00:40:53.639 --> 00:40:53.880] I mean, again,
[00:40:55.039 --> 00:40:55.079] probably what it is,
[00:40:56.000 --> 00:40:56.079] I'm surprised by that.
[00:40:56.780 --> 00:40:57.300] I mean, Baidu's stack was open source
[00:40:57.300 --> 00:40:57.800] at the beginning.
[00:40:59.179 --> 00:40:59.539] Autoware.
[00:40:59.659 --> 00:41:00.159] I don't know if it's
[00:41:00.159 --> 00:41:01.079] still open source.
[00:41:01.440 --> 00:41:02.239] It's Autoware, right?
[00:41:02.340 --> 00:41:02.920] The Baidu one?
[00:41:04.519 --> 00:41:06.079] Baidu Apollo. Yeah, yeah, yeah.
[00:41:06.079 --> 00:41:09.440] Who was, Autoware was another one. Yeah, yeah, yeah. No, they were open source. Yeah, it
[00:41:09.440 --> 00:41:13.360] was Apollo. I remember their moonshot rhetoric. I wonder what's still open source. I also
[00:41:13.360 --> 00:41:17.800] wonder how teleop it is. So, like, I think the Chinese are generally more open with showing
[00:41:17.800 --> 00:41:22.099] you their teleop stuff. By the way, Waymo admits to teleop and you've never seen a picture,
[00:41:22.360 --> 00:41:25.960] which means that the teleop's a lot more than you think it is, right? I think the Chinese actually
[00:41:25.960 --> 00:41:28.000] show you, well, yeah, yeah, we hire
[00:41:28.000 --> 00:41:30.099] people from this rural province to come
[00:41:30.099 --> 00:41:32.239] sit in this gaming chair and drive you around.
[00:41:32.599 --> 00:41:32.860] Okay.
[00:41:33.780 --> 00:41:36.079] So, yeah, no, I think that a lot of this, like, teleop
[00:41:36.079 --> 00:41:37.360] stuff is just,
[00:41:37.940 --> 00:41:40.219] it's not self-driving cars.
[00:41:40.380 --> 00:41:42.420] It's cool, and it might be a cool service,
[00:41:43.000 --> 00:41:44.199] but it's not the same thing as
[00:41:44.199 --> 00:41:46.000] solving the AI problem of self-driving cars.
[00:41:46.000 --> 00:41:52.000] There's a lot of other fun problems to solve on the way, but yeah, I mean, that's kind of my take on Waymo and all of these kind of like robo-taxi kind of things.
[00:41:52.000 --> 00:41:58.000] It's actually cool and it actually has carved out its own niche that has shipped, but it's not AI.
[00:41:58.000 --> 00:42:07.079] All right, we got time for two more questions? Anyone? Thank you.
[00:42:07.079 --> 00:42:10.760] Hi, George.
[00:42:10.760 --> 00:42:20.940] You've been posting a lot lately on, well, first you started Tiny and you've been selling
[00:42:20.940 --> 00:42:27.500] the Tiny Box, and then you've been posting a lot on external GPU.
[00:42:27.500 --> 00:42:33.320] I don't know what you would call that but some way to connect a GPU to a MacBook.
[00:42:33.320 --> 00:42:33.860] Yep.
[00:42:33.860 --> 00:42:38.840] And so can you talk about, I guess on a small scale with that second thing,
[00:42:38.840 --> 00:42:42.119] what you're hoping to accomplish and then more broadly,
[00:42:42.119 --> 00:42:46.500] what do you want the ultimate goal
[00:42:46.500 --> 00:42:50.440] of Tiny to be?
[00:42:50.440 --> 00:42:55.219] The mission of Tiny is to commoditize the petaflop.
[00:42:55.219 --> 00:43:01.119] So what that means is right now there's an extreme premium for NVIDIA petaflops.
[00:43:01.119 --> 00:43:04.960] We'd like to bring the general price of the petaflop to only be slightly above the
[00:43:04.960 --> 00:43:07.000] cost of the silicon and the cost of the power.
[00:43:07.000 --> 00:43:14.000] That's mission one. Mission two is to rethink about how software is made.
[00:43:14.000 --> 00:43:21.000] Tiny has cut through every dependency in the NVIDIA stack. We're speaking directly over PCI to the NVIDIA GPU.
[00:43:21.000 --> 00:43:27.280] Without NVIDIA's user space, without NVIDIA's driver, without anything from NVIDIA,
[00:43:27.360 --> 00:43:28.539] without cuBLAS, without any of that.
[00:43:28.920 --> 00:43:33.880] So I think that in order to get very reliable robots
[00:43:33.880 --> 00:43:36.500] in the future, you're going to need an operating system
[00:43:36.500 --> 00:43:40.800] that matches that level of robustness, of reliability.
[00:43:41.400 --> 00:43:43.059] And I don't think that we're going to get that
[00:43:43.059 --> 00:43:44.480] through things that look like the current stack.
[00:43:44.900 --> 00:43:47.260] Tiny is radically 100x simpler
[00:43:47.260 --> 00:43:48.380] than the existing stack.
[00:43:48.659 --> 00:43:49.360] And that's what you're going to need
[00:43:49.360 --> 00:43:51.119] to get reliability of things that like,
[00:43:51.320 --> 00:43:53.559] you know, a human is never going to glitch out
[00:43:53.559 --> 00:43:54.900] and slam their head as hard as they can
[00:43:54.900 --> 00:43:56.039] into the table unless there's something
[00:43:56.039 --> 00:43:57.179] seriously wrong with that human.
[00:43:57.460 --> 00:43:58.800] Like that's not like some probability.
[00:43:58.940 --> 00:43:59.840] That will just never happen.
[00:44:00.699 --> 00:44:00.800] Right?
[00:44:01.179 --> 00:44:02.139] With computers,
[00:44:03.059 --> 00:44:04.059] oh sorry, sorry,
[00:44:04.179 --> 00:44:07.920] your kernel got stuck in a thing where it was servicing
[00:44:07.920 --> 00:44:11.679] this interrupt for 300 milliseconds and the motor was at full torque.
[00:44:11.679 --> 00:44:12.679] Okay.
[00:44:12.679 --> 00:44:13.679] Yeah.
[00:44:13.679 --> 00:44:16.619] So I think we need high reliability deterministic computation.
[00:44:16.619 --> 00:44:18.659] That's kind of what tinygrad is.
[00:44:18.659 --> 00:44:19.659] For robots.
[00:44:19.659 --> 00:44:21.659] Runs in comma already.
[00:44:21.659 --> 00:44:30.380] All right. So George will be available also in the Q&A.
[00:44:30.380 --> 00:44:36.179] So we're going to finish up with his talk. Thank you so much, George, for that.
[00:44:38.900 --> 00:44:41.900] thanks