TECH010: THE REAL ROBOTICS TIMELINE W/ KEN GOLDBERG
23 December 2025
Ken and Preston examine whether robotics has lost its way, echoing Rodney Brooks’ concerns. They dissect the gap between AI language models and physical robotics, focusing on dexterous manipulation, tactile sensing, and visual feedback.
Ken shares breakthroughs at Ambi Robotics, the data limitations holding robotics back, and the case for pragmatic engineering over hype-driven narratives.
IN THIS EPISODE, YOU’LL LEARN
- Why Ken agrees that robotics may have “lost its way”
- The critical gap between AI language skills and robotic manipulation
- How robot mobility is advancing, but dexterity still lags
- Why tying shoelaces is still too complex for robots
- The role of tactile sensing vs. vision in robotic surgery
- How camera placement in robotic hands affects manipulation
- Why the robot data gap could be 100,000 years behind language models
- Why simpler grippers often outperform human-like robotic hands
- The engineering behind DexNet and Ambi Robotics’ success
- How real-world testing exposed unexpected robotic limitations
Disclosure: This episode and the resources on this page are for informational and educational purposes only and do not constitute financial, investment, tax, or legal advice. For full disclosures, see link.
TRANSCRIPT
Disclaimer: The transcript that follows has been generated using artificial intelligence. We strive to be as accurate as possible, but minor errors and slightly off timestamps may be present due to platform differences.
[00:00:00] Intro: You are listening to TIP.
[00:00:03] Preston Pysh: Hey everyone, welcome to this Wednesday’s release of Infinite Tech. Today we’re talking AI and robotics, and where there are still key areas of development that need some work. My guest is Ken Goldberg, a leading robotics researcher whose work bridges academic AI, real-world automation, and large-scale commercial robotic systems.
[00:00:22] Preston Pysh: One of the things we discussed that’s super interesting is the assumption that large language models automatically unlock physical intelligence, and this is an area where Ken is really well-versed and does a great job explaining what that actually means.
[00:00:44] Preston Pysh: We cover what has improved, like mobility and automation, and what’s still painfully hard, like dexterity, sensing, and real world manipulation. This is a grounded conversation about engineering reality versus expectation, and Ken is a true expert in this field, as you’ll see in the conversation.
So without further delay, I hope you guys enjoy this chat.
[00:00:59] Intro: You are listening to Infinite Tech by The Investor’s Podcast Network, hosted by Preston Pysh. We explore Bitcoin, AI, robotics, longevity, and other exponential technologies through a lens of abundance and sound money. Join us as we connect the breakthroughs shaping the next decade and beyond, empowering you to harness the future today.
And now here’s your host, Preston Pysh.
[00:01:33] Preston Pysh: Hey everyone, welcome to the show. I am here with Ken Goldberg and I am so excited to have this conversation because you are an expert in this field of robotics and AI and it’s something that we talk about all the time on the show. And it’s just exciting to have somebody like yourself here today to talk about it.
[00:01:49] Preston Pysh: So welcome to the show, Ken.
[00:01:52] Ken Goldberg: Thank you, Preston. I’m excited to talk to you too.
[00:02:01] Preston Pysh: So before we started chatting, you sent over an article that I think is very pertinent for setting the stage for most of the conversation we’re having today. It was an article in The New York Times in which Rodney Brooks says, quote unquote, the field has lost its way.
[00:02:37] Ken Goldberg: Well, I think he’s a very respected individual. He’s a good friend of mine, and I agree with him very much, and I think he’s provocative. He’s put it in his own words. But I sent that to you because I think it’s very relevant for us to start this conversation about what’s real and what’s hyped.
[00:02:56] Ken Goldberg: And in robotics and AI, I have to be careful about the word hype, but I want to say there are, call it, inflated expectations out there, and I understand where they come from. You know, I think people are excited about technology. Everyone. I am too. We all grew up with science fiction and we love it, and we love new things, and there have been some breakthroughs.
[00:03:18] Ken Goldberg: I mean, there’s no doubt that the advances in artificial intelligence, in particular deep learning and then generative AI with the transformer model, have been transformative in the field. The AI systems are doing things that no one thought would be possible by now. And I will be the first to admit they’re capable of creativity.
[00:03:37] Ken Goldberg: They’re immensely valuable. But people then take the next logical step and say, okay, these systems have solved language, so therefore they’ll solve robotics too. And that is where I have a lot of concerns. We can get into the details, but Rod and I agree that it is not at all obvious that the advances in language AI will extend to robotics.
[00:04:07] Preston Pysh: What would you say is the number one thing you’re seeing that’s just grossly out of touch with reality, where the robotics piece is not as far along as advertised? The talking point is, in five years we’re going to have humanoid robots doing everything, right? So what are the big pieces that people who aren’t intimately familiar with this space are missing on that particular topic?
[00:04:33] Ken Goldberg: Okay, so let me tell you first of all where some of the advances have been made. One of them is in quadrupeds, that’s walking dogs, basically, and bipeds, that’s two-legged walking machines, and navigation, what I would call mobility. So the ability to get around with robots with legs has made immense progress.
[00:04:52] Ken Goldberg: That’s been very exciting and there’s no doubt about it. Those machines are capable of doing backflips, as you know, side flips, parkour, all kinds of things that I certainly can’t do. Also, huge advances in drones.
[00:05:03] Ken Goldberg: And so over the past decade we’ve seen drone technology take off from something that was very experimental, and it’s a number of advances that have made that possible.
[00:05:14] Ken Goldberg: In both cases, a lot of it has to do with motors and the hardware, but also advances in simulation and the ability, for example, for drones, to stabilize themselves and then to be able to control very accurately the motors on the four or six rotors that are there. And the same is true for robots that have legs or quadrupeds or bipeds.
[00:05:39] Ken Goldberg: So these are big, undeniable, major advances. And if you just look at the field, you say, okay, all this is coming, and the next thing is we’re going to have home robots taking care of us. And, you know, this is around the corner according to Elon Musk. And I’m sure I’m going to get some pushback from some of your listeners who are going to say I don’t know what I’m talking about.
[00:05:57] Ken Goldberg: Okay. I’ve had that happen from a number of very confident, quote, experts from Silicon Valley. But I’ve been working in this field for 45 years and I’ve studied it very closely, and I understand where the gaps remain, in particular for manipulation.
[00:06:15] Ken Goldberg: And manipulation is being able to pick things up, you know, all kinds of things that happen to be in your environment, and then being able to manipulate them, you know, do things with them.
[00:06:24] Ken Goldberg: That skill is very nuanced and tricky.
[00:06:28] Ken Goldberg: And it’s not clear that the current methods for doing AI are going to get us there.
[00:06:34] Preston Pysh: I’ve heard Elon in particular say in interviews that mimicking the hand, you know, the tendons, and being able to have that tactile ability is extremely difficult. That’s the way he has put it.
[00:06:48] Preston Pysh: But I suspect that when it really comes down to it, a lot of the demos you see online, like a video where a robot picks up a pen, don’t show what’s actually happening behind the scenes, whether that was a programmed publicity stunt or something the robot can just do quite well.
[00:07:09] Preston Pysh: There seems to be a large gap there. So talk to us about where you see that gap and what it is in reality, in your humble opinion.
[00:07:18] Ken Goldberg: Okay, and this is understandable. Again, I don’t want to say people are naive; I get where they’re coming from. They see something that looks human-like, and so they attribute human-like qualities to it.
[00:07:31] Ken Goldberg: And skills. I understand that. And by the way, when Elon Musk says the hands are hard: people are designing hands that look very much like human hands. That is, they have 22 degrees of freedom, they can move all these joints independently very quickly, and they look almost identical to human hands. So we can reproduce that.
[00:07:50] Ken Goldberg: In fact, there are like a hundred different hands being produced by different companies in China right now. So the advances in the hand itself are very sophisticated, but the control of the hand is where the challenge is. You can have this hand waving around, but getting it to actually tie your shoelace, that is where the challenge is.
[00:08:15] Ken Goldberg: And this is because there are so many nuances in the interactions these fingers have with the environment. We are sensing the environment, we are exerting forces on the environment, and this is very subtle and very nuanced, and we perceive it through a variety of mechanisms. We have
[00:08:36] Ken Goldberg: something like 15,000 sensors in each hand. Yeah, I know. It’s remarkable. We don’t even think about it, because it’s subconscious. Yeah. But then we also have sensors in every one of our joints. So we are able to perceive very subtle forces. Slip. And in particular, one very nuanced thing is deformation.
[00:08:57] Ken Goldberg: So if you look at your fingertips, they’ve evolved in a really interesting way; those pads are extremely helpful. If you put, let’s say, thimbles on your fingers, right? Yeah. Like when you’re sewing, that makes it much more difficult to do anything. Yeah. You can imagine. Or just heavy gloves.
[00:09:13] Ken Goldberg: Gloves, yeah, those as well. But we can do these things very subtly. We have learned this ability to interact with the forces of objects, where the objects are constantly being moved and deformed. So if you think of the shoelace, the object is being deformed. The fingertip is being deformed as well.
[00:09:33] Ken Goldberg: This mutual deformation is something that’s really nuanced and subtle, and we don’t even know how to simulate it accurately. So we can’t even simulate the forces and torques and deformation that are occurring, and then we don’t have the sensing capabilities to perceive these nuances. Like you can feel a shoelace if it’s a little bit slipping out of your fingertip.
[00:09:55] Ken Goldberg: No robot can do that. So what happens is that when you now have this hand and you actually try to execute something, sometimes it works, but a lot of times it doesn’t. Yeah. And now you have the issue of reliability. That is what we’re seeing. And by the way, you can see robots all day long picking up stuff off a table and moving it somewhere.
[00:10:14] Ken Goldberg: That’s actually not so difficult if you just want to pick up, especially a stuffed animal. By the way, stuffed animals are very easy because you almost can’t go wrong. You just put your gripper anywhere near them and close it and you’ll pick up this thing. Okay? Those are like, you know, they’re sitting ducks, right?
[00:10:29] Ken Goldberg: They’re, no, this is super easy, low-hanging fruit, let’s call it. And that makes it very easy to pick up and move things. But now when you want to start doing things like inserting things, like repairing a stuffed animal by opening it up and getting things, you know, pulling out the stuffing or sewing it back up, this is totally different and much, much more difficult.
[00:10:50] Preston Pysh: Yeah, your example of a shoelace is really profound. Because until you take a step back and just think, if I had to design or build a robot to tie a shoelace, I can’t even imagine how incredibly difficult something like that would be. It is such a complex task, and I’ve never even thought about how difficult something like that is.
[00:11:13] Ken Goldberg: Yeah, and here’s the thing: the shoelace we all learn when we’re young, and we kind of do it without even thinking about it. Yeah. It’s subconscious, right? I can be on the phone and tie my shoelace without even thinking. But think about this one. I don’t know about you, but do you know how to tie a bow tie?
[00:11:26] Preston Pysh: A tie, but not a bow tie.
Ken Goldberg: Yeah, a bow tie. Okay. I thought you might, because you seem like a fashionable guy.
Preston Pysh: I have tried it. It’s very tricky.
Ken Goldberg: Yeah, it’s a very tricky business and it’s subtle. You have to be able to feel and pull in all these different directions. Forget it, there’s no robot that’s going to be able to do that for a long time.
[00:11:43] Ken Goldberg: I would love to have it happen, because I would love to have a robot tie my bow tie. And here’s another one that’s very simple: just buttoning your shirt. Yeah. It’s actually a little tricky for humans, if you think about how you have to fiddle with it a little to get a button on and off, especially a small button.
[00:12:00] Ken Goldberg: So that’s way beyond robotics. You’ll never see a demo of a robot buttoning up a shirt. We’re actually working on it in my lab, but it’s really hard.
[00:12:09] Preston Pysh: Wow. Yeah. It’s things that you just really take for granted. Now, when you get into solving that problem, it seems like you mentioned this earlier, that it’s almost a sensing issue, that we need a lot of developments on whatever type of sensors you have in the fingertips or whatever you’re using for the manipulation. Is that the biggest hurdle right now, is just kind of replicating how our fingertips can have so much sensing capability?
[00:12:37] Ken Goldberg: So that’s one. But here’s something that’s somewhat encouraging for me at least, which is that if you look in the realm of robot surgery, and by the way, there’s a lot of misconceptions about that. I give talks where people say, well, a robot took out my nephew’s appendix. And I’ll say that was not a robot, sir. That was a surgeon using a robot as a tool. Yeah.
[00:12:59] Ken Goldberg: To do that operation. So they call it a robot, but it’s really a tele-robot. More literally, a puppet.
[00:13:06] Ken Goldberg: A very important and very useful and expensive puppet, but it’s a puppet. And so that’s very important to understand. So surgeons can do remarkable tasks. They can sew up a wound. They can take out an appendix or a gallbladder.
[00:13:23] Ken Goldberg: With these tools. Now, they do not have tactile sensing. Actually, in the very latest versions of it, they started to introduce some, but for many years they didn’t. And surgeons are still able to do amazing things. So this is evidence that maybe we don’t need to know how to do tactile sensing.
[00:13:41] Ken Goldberg: It’s just a hypothesis, but it gives us an existence proof that dexterous, complex manipulation of very complex deformable surfaces, and taking out an appendix is harder than tying a bow tie, can be done without tactile sensing. Now, what’s fascinating is that the way surgeons seem to do it is essentially by accommodating the lack of tactile sensing and using vision.
[00:14:07] Ken Goldberg: Their eyes. They have cameras in there, they’re watching what’s happening, and they have a feedback loop based on vision. So they can see very small deformations of the tissues and they infer what’s going on. This is remarkable because, I think, it points to the most exciting path to manipulation. Trying to reproduce tactile sensing is extremely difficult for all kinds of reasons.
[00:14:35] Ken Goldberg: But there’s another path, which is to understand the visual-tactile interactions. And if we can do that, I think we might be able to get away with just using cameras.
[00:14:45] Preston Pysh: Interesting. So this week, or last week, I’m sorry, I saw an article talking about the difference between Elon Musk’s approach, particularly on the hands, and what Figure AI is doing.
[00:14:58] Preston Pysh: Figure puts a camera right here in the palm of the hand. And Elon Musk is refusing to put a camera in the hand. The person who posted this was saying it’s akin to him not using LiDAR in the cars, because in the end it’s going to come down to cost, and he wants to force his team to figure it out without additional sensors, for all intents and purposes, for manufacturing costs. He’s playing this longer game. What are your thoughts on that?
[00:15:33] Ken Goldberg: That’s a brilliant point, Preston. I’m really glad you made it. The analogy really works there. Elon Musk is, in some sense, you know, very confident.
[00:15:43] Ken Goldberg: He’s done amazing things. Understandably, he should be confident. But sometimes that can blind you. In this case, his decision not to use LiDAR has, I think, put a limitation on the Tesla driving systems. LiDAR can be very helpful for filling in the edge cases, conditions where cameras can be distorted or blinded by light flares, or especially in rain.
[00:16:10] Ken Goldberg: And as for the cost, I don’t know, I think it will come down over time. I’m not in the car business, so I defer to his expertise there. But in the same way, you know, you might remember when he first started Tesla, he wanted all the cars to be made in robotic factories.
[00:16:30] Ken Goldberg: And he had a decree. We will have no humans, you know? Everything must be done by robots. And I remember engineers coming in from Tesla to my lab and saying, can you help us? We’re trying to do this thing and we can’t get it to work with a robot. And he was just, you know, unrelenting. And then finally he said I was wrong.
[00:16:49] Ken Goldberg: Yeah, he was wrong.
[00:16:50] Preston Pysh: Yeah. He said,
[00:16:50] Ken Goldberg: I was mistaken. Humans are underrated. Do you remember that quote?
[00:16:54] Preston Pysh: Yes. Yeah.
[00:16:55] Ken Goldberg: So that was really interesting because it was one of the rare times he admitted it, but it was also a great example of the idea that you can’t do everything right with robots. And even if you will it, you know, he can will things into existence by demanding this, it doesn’t always work that way.
[00:17:11] Ken Goldberg: And so the LiDAR story is very analogous. And I think you’re right about the cameras. You know, having cameras in the hand makes a lot of sense. It’s not how humans or animals work, right? They don’t have eyes in their hands. But cameras are something we understand very well.
[00:17:29] Ken Goldberg: We have very high-quality cameras. They’re very fast, they’re very accurate, and they’re really low in cost comparatively. So I’m for more cameras. You know, put a lot of cameras in there because the other issue is when you walk, it’s one thing. You have a camera on the head. You can sort of see what’s around you, right?
[00:17:46] Ken Goldberg: Or drive. By the way, driving, I should have mentioned this earlier, but driving is much easier than manipulation because driving, you’re just trying to avoid objects, avoid hitting anything. In manipulation, you must make contact with objects. You must manipulate them. So it’s very different.
[00:18:02] Preston Pysh: This is really fascinating, because to your point, when you talk about this idea: if I’m holding a shoelace and the tip of my finger is indented, and I can see the compression of that, I can feel it. When I’m tying my shoe, I’m relying on that sense of touch.
[00:18:28] Preston Pysh: But if you were going to build a sensor that can do that and you’re hitting a roadblock, or can’t find something that provides that tactile feedback, I could look at a camera and say, okay, it went in by half a millimeter, therefore it’s about this much pressure. You can substitute that sensing capability with an image or a video of being able to see it. So it is kind of interesting that we have Figure going that path.
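As a toy version of the inference Preston describes, assuming the fingertip pad behaves like a linear spring (real pads are nonlinear, and the stiffness value here is made up purely for illustration):

```python
# Toy vision-to-force inference: treat the fingertip pad as a linear
# spring, F = k * x, where x is the indentation measured from an image.
# The stiffness is an assumed, illustrative value; real pads are nonlinear.
k_pad = 800.0        # assumed pad stiffness in N/m
indent = 0.5e-3      # 0.5 mm indentation seen by the camera
force = k_pad * indent
print(f"estimated contact force: {force:.2f} N")  # 0.40 N, a light touch
```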
[00:18:46] Ken Goldberg: Well, okay, so let me add on to this. You just made a very nice, nuanced point. If you were just looking at your fingertip and you saw the shoelace pressing into it, then by looking at the shadow structures and other cues, you could probably figure out whether it was slipping away or firmly grasped. Absolutely. That’s what surgeons do. Yeah. And they, by the way, work with surgical thread, which is really thin, and they have to use a needle. It’s very complicated, right? Yeah. But they’re doing a lot of this with their eyes, with their intuition. Now, it’s not just a matter of putting cameras around, because that alone doesn’t solve it.
[00:19:19] Ken Goldberg: You now need to be able to understand that imagery, the video, and interpret it. Yeah. And that’s also extremely difficult. Humans have this incredible ability, and we shouldn’t underestimate it. It’s just amazing what humans can do.
[00:19:35] Preston Pysh: Yeah. So from an inference standpoint: if I hold the shoelace this way, I can also intuitively infer that if it were held 90 degrees from that, it’s going to have this same slipping sensation. And that’s something that’s really hard to train a robot on, whereas humans can just figure it out very easily. Is that what you mean by that?
[00:19:56] Ken Goldberg: That’s what I mean. Yeah. And here’s the thing. We don’t have good language for describing this. Yeah. You know, we’re trying to, because it’s all intuitive for us.
[00:20:02] Ken Goldberg: If you ask me, tell me how to tie a shoelace, right? I’d be like, you know, it’s not easy. We don’t have the language. And that’s related, by the way, to the other issue I’ll come to: the data gap. The gap between the amount of data we have for language versus robots. Maybe this is a good time.
[00:20:18] Preston Pysh: Yeah, let’s talk about that.
[00:20:19] Ken Goldberg: Okay. All right. Well, there’s a way of quantifying all this, and it’s something I call the robot data gap. It’s the following: take all of the data that was used to train the language models. It’s vast, and it’s hard to wrap your head around.
[00:20:34] Ken Goldberg: How much data is that? Well, my students and I were able to calculate it. Actually, Kevin Black, who’s a researcher at Physical Intelligence, a very smart guy, had the first insight about this, and then we’ve been taking it a little further. But basically it’s this.
[00:20:52] Ken Goldberg: If you add up all the hours it would take an average human to read all the text that’s available to train the language models, so it’s all the books that are out there, it’s all of Wikipedia, it’s everything that’s on the internet, all those tokens, if you will, and then figure that a human reads at an average speed of 238 words per minute.
[00:21:13] Ken Goldberg: You can do the math and you end up with a hundred thousand years.
[00:21:18] Ken Goldberg: So you could sit down and read everything that’s used, and a hundred thousand years later you’d be done. Okay. Now, we don’t have such data for robot manipulation. Oh, it doesn’t exist. Yeah. It’s not like we can just find it on the internet.
[00:21:32] Ken Goldberg: Yeah. The data is very different there. We want to start with vision images and end with control signals to the robot. This doesn’t exist, so we basically have to generate this data. But what we’re up against, right, is that hundred thousand years. We’re a hundred thousand years behind the language models.
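To put rough numbers on that figure, here is a back-of-envelope version of the calculation. The corpus size and the words-per-token ratio below are ballpark assumptions for illustration, not numbers from the episode:

```python
# Back-of-envelope version of Ken's hundred-thousand-year figure.
tokens = 15e12            # assumed LLM training corpus, ~15 trillion tokens
words_per_token = 0.75    # rough rule of thumb for English text
wpm = 238                 # average reading speed Ken cites

minutes = tokens * words_per_token / wpm
years = minutes / (60 * 24 * 365)   # reading nonstop, around the clock
print(f"~{years:,.0f} years of nonstop reading")  # on the order of 100,000
```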
[00:21:53] Ken Goldberg: So again, I’m sort of exaggerating to make a point, which is certainly there are a number of ways to accelerate that and I think we can eventually get there. By the way, I’m not saying this will never happen. Please don’t get me wrong. I believe it will happen, but my big question is when. I think it’s really important to be prepared for the reality.
[00:22:12] Ken Goldberg: There are a lot of people who say, hey, this makes sense. I want this. We should have this soon. And, you know, remember there are a lot of cases where people have talked that way in the past. Fusion energy, nuclear fusion. Makes a lot of sense. Sort of; the technology’s pretty obvious, but you have to contain this plasma.
[00:22:36] Ken Goldberg: That seems like a technical issue. We can figure that out. Well, 50 years later we’re still working on it and it’s hard. It’s one of these very, very nuanced problems. Another one is curing cancer. When I was a kid, they used to say, we’re going to have a war on cancer, just like we got to the moon. Ten years, solve cancer. We haven’t solved it.
[00:23:04] Ken Goldberg: So there are problems that are extremely difficult and they take much longer than anyone expects. And it seems like robotics is like that. We don’t know. And listen, I’d be the first to celebrate if someone wakes up, I wake up and I read someone has solved it, right? It could happen.
[00:23:26] Preston Pysh: Yeah.
[00:23:27] Ken Goldberg: And then you’ll look back on this podcast and say, Goldberg was completely wrong. It could totally happen. But I want to be a voice that says, hey, it might not happen. Let’s think about that and be a little bit realistic, because I know a lot of people are thinking it’s inevitable, that it’s going to happen, you know, hopefully by next year, according to Elon Musk and many of his followers.
[00:23:47] Ken Goldberg: They have to be ready for that maybe not to happen. And I’m worried about a backlash, that people will say, hey, this whole robotics thing was hocus pocus, and we’re going to move out of this field in droves.
[00:24:02] Preston Pysh: I don’t want to put words in your mouth, so correct me if I’m stating this wrong. I don’t think you’re saying it’s not going to happen. You’re just really suspect of the timeline that everybody seems to be on.
[00:24:10] Ken Goldberg: Yeah. That’s it, exactly. And that’s where I line up with Rod Brooks, for very similar reasons. We have experience. We’ve both been working in this field a long time; he’s been at it slightly longer than me, 50, 60 years. We have a lot of experience with trying to solve these problems, and they’re much more nuanced than they seem on the surface, especially because a child can pick things up and manipulate them. Yeah. It seems obvious.
[00:24:33] Ken Goldberg: Why can’t robots? It’s very counterintuitive, but when you work with these things and you really see their limitations, you start to understand that this is a very complex problem.
[00:24:43] Preston Pysh: Ken, say you’re a program manager, kind of looking at all the different swim lanes to get there.
[00:24:49] Preston Pysh: The hand seems to be on the critical path, if you will, for getting there. Is there anything else that you would define as being on that critical path, or is the hand so far out there in difficulty compared to everything else that it’s really the limiting factor?
[00:25:06] Ken Goldberg: Great question. I would say that when you say hand, it’s not only the hand; it’s the manipulation ability.
[00:25:12] Preston Pysh: Yeah.
[00:25:13] Ken Goldberg: Right?
[00:25:13] Ken Goldberg: Because by the way, I do have another thing to say here, another opinion, which is that we will get much more out of very simple grippers than we will out of hands that look like human hands.
[00:25:23] Ken Goldberg: Again, if you look at surgery, the tools that surgeons use to perform an appendectomy are very simple grippers like this. And they can do immensely complicated things. So I believe you don’t need complex hands.
[00:25:38] Ken Goldberg: So I’m not saying that’s the path to go. I believe you can do simple grippers. In fact, my company Ambi Robotics uses an even simpler gripper, which is a suction cup.
[00:25:47] Ken Goldberg: And you can do incredible things with them. So it’s not necessarily the hardware, but it’s the software. It’s the control of this nuanced interaction that is very challenging. I think many of the other aspects are addressable. You know, we have the ability to tell a robot, go pick up the orange, you know, the orange jumper off the table.
[00:26:10] Ken Goldberg: We can solve that now. Computer vision systems are good enough to know that a jumper is a sweater and there’s an orange one and it’ll pull that up. No problem. But it’s being able to actually pick it up and maybe put it on you and then button it up for you.
[00:26:24] Preston Pysh: Yeah. That’s where it’s difficult.
[00:26:25] Preston Pysh: Yeah. Yeah. I mean, maybe what we see in the interim is humanoid robots that go to market with simplified hands, or where the range of activities they can actually perform is very limited relative to a real human being.
[00:26:43] Ken Goldberg: I don’t know whether that’s how they’ll go to market or not.
Preston Pysh: I want to talk a little bit more about your company, Ambi Robotics. This is really fascinating. You guys have gone to market primarily focusing on logistics and warehouse-type activities for robotics. Is that correct?
[00:26:59] Ken Goldberg: Correct. So this started about seven years ago.
[00:27:03] Ken Goldberg: We had a breakthrough in robot grasping, and that was just simply the ability to pick things out of a bin.
[00:27:10] Preston Pysh: Okay.
[00:27:11] Ken Goldberg: So it’s not manipulating, you know, doing surgery, but it’s just picking things out of a bin. That was a very old problem. It’s been known as the bin-picking problem, and people have been looking at that for decades.
[00:27:21] Ken Goldberg: But we made an advance, and this was especially the work of Jeff Mahler, who was a PhD student of mine, who was the lead researcher on this. And we can go into more details on the technical aspect of this, but the system was called Dex-Net, and it was based on collecting data.
[00:27:35] Ken Goldberg: Lots of examples. And it was somewhat analogous in many ways to ImageNet, which was a breakthrough for computer vision. So we did something similar. We synthesized this dataset, we added noise in a very specific way, and the system started working remarkably well. It could pick up almost any object that you put into a bin.
[00:27:55] Ken Goldberg: It would just pull it out and you would throw in a whole pile of objects. We were digging around in our garages and closets and throwing everything we could into it, and it was consistently just being able to pull these things out. And so that was a very exciting moment for us. We got some publicity.
[00:28:11] Ken Goldberg: It was in The New York Times and other places, and then we were approached by a number of companies and we decided to form our own company.
[00:28:18] Preston Pysh: That’s awesome. I’m curious where you’ve seen just good old-fashioned engineering matter more than additional data or larger models, and then to the converse of that, when did you have data actually really surprise you?
[00:28:32] Ken Goldberg: Okay, good. Well, that’s a great example, because that was a case where data really did surprise us. We were able to generate 6 million example grasps because we had collected 10,000 object models, and then we could generate grasps on those models. And we trained a network to learn, essentially, where to grasp an object.
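To make that data-generation idea concrete, here is a minimal sketch in the spirit of that pipeline: sample a candidate two-finger grasp on an object model, randomly perturb the contact normals and the friction coefficient (noise added in a specific way), and label the grasp by the fraction of perturbed trials that still pass a simple antipodal test. This is an illustration of the general approach, not the actual Dex-Net code; the test, noise scales, and thresholds are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def antipodal_ok(n_a, n_b, mu):
    """Crude two-finger grasp test: the contact normals must oppose
    each other to within the friction cone half-angle arctan(mu)."""
    return -np.dot(n_a, n_b) >= np.cos(np.arctan(mu))

def robustness(n_a, n_b, mu=0.5, trials=200, noise=0.1):
    """Label a grasp by the fraction of randomly perturbed trials
    (noisy normals, noisy friction) that still pass the test."""
    successes = 0
    for _ in range(trials):
        pa = n_a + rng.normal(0.0, noise, 3)
        pb = n_b + rng.normal(0.0, noise, 3)
        pa /= np.linalg.norm(pa)
        pb /= np.linalg.norm(pb)
        mu_sample = max(0.05, mu + rng.normal(0.0, 0.1))
        successes += antipodal_ok(pa, pb, mu_sample)
    return successes / trials

# One candidate grasp: nearly opposed contact normals get a robust label.
print(robustness(np.array([0.0, 0.0, 1.0]), np.array([0.05, 0.0, -1.0])))
```

Repeating this over many sampled grasps on many object models yields the kind of large, labeled synthetic dataset a grasp-quality network can be trained on.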
[00:28:53] Ken Goldberg: So that was a data-driven approach. But I will tell you that when you take that and you have to move that into an experimental system or into a commercial system, then you need a huge amount of what I call good old-fashioned engineering. And this is where you have to really sweat the details. You have to make sure that the sensors are calibrated correctly, that your robot arms are calibrated and accurate.
[00:29:16] Ken Goldberg: You have to be able to do the computation to move the arms very quickly. You have to control the surfaces of the grippers, the suction cups, a myriad of details like that, the lighting. We had a little scale underneath the system that would recognize when an object was removed from the bin. There’s like a digital scale.
[00:29:35] Ken Goldberg: It was just another piece of engineering that had to go in there. All of that was just our demonstration system in the lab, an experimental system. Then we moved into Ambi Robotics. And by the way, I want to give credit to the other students who were involved. Matt Matl was another computer scientist working closely with Jeff Mahler.
[00:29:53] Ken Goldberg: And then I also had two other PhD students from mechanical engineering, Stephen McKinley and David Gealy. All four of these guys were extremely brilliant engineering students. And they were very good friends; they remain good friends. They all worked very closely and spent a huge amount of time camping and hanging out together too.
[00:30:16] Ken Goldberg: But they were perfectly complementary, because we had the computing skills and the mechanical skills. The mechanical guys knew how to design machines that could work reliably over a long period of time, and that’s when we moved into building the AmbiSort system, which sorts packages for e-commerce.
[00:30:35] Ken Goldberg: And this was a little bit — we didn’t go in with this plan — but what we saw very quickly was that e-commerce was growing and we needed, there was a huge demand for sorting packages, right? It’s just very challenging to get packages out to the customer fast.
[00:30:49] Ken Goldberg: So we started using that technique, Dex-Net.
[00:30:51] Ken Goldberg: We evolved it and commercialized it, and we could make it work very fast. And then all kinds of other elements had to come in. We had a gantry system: the robot picks an object out of a bin, the object has to be scanned for a zip code to figure out which bin it should go into, and then it’s put into the right bin.
[00:31:10] Ken Goldberg: Avoid jamming the whole time. Make the system reliable, safe, and easy to use. All this is what I call good old-fashioned engineering. Yeah. And so I’ve become a big advocate for this because, after all, this is a body of research and ideas and insights that has been developed over 400, 500 years in engineering.
[00:31:30] Ken Goldberg: It’s still what we teach at Berkeley and all the major universities. We teach the engineering principles, and my point is, let’s not forget about those. Those are still extremely valuable for engineering and for robots and getting them to actually work in practice. Yeah, and anyone working in robots I think will acknowledge that.
[00:31:50] Ken Goldberg: Although the public perception is, oh, we’re just using AI now and that’s solving everything. It’s not. It’s solving certain little pieces of it. And as I said, there are certain pieces that remain very difficult. And this comes back to what I was saying earlier, Preston, about my fear: because there’s so much expectation around humanoids right now, if companies can’t deliver on that ability, there might be a big backlash.
[00:32:12] Ken Goldberg: And that’s going to hurt companies like Ambi Robotics who are not trying to do that. Ambi Robotics is trying to solve a real practical problem and do it efficiently and cost-effectively. And actually, you know, basically something that’s very valuable for everyone who shops at Amazon or any of the online companies.
[00:32:36] Ken Goldberg: We’ve sorted a hundred million packages so far. Wow. Wow. And I’m very proud of that because these machines, as we’re talking, are out there sorting packages and they’re very reliable. There are no humanoids doing this. By the way, some have said we’ll have a humanoid doing that, but it’s going to be a long time before a humanoid with hands is even close to the efficiency of the systems that we have with suction cups.
[00:32:59] Preston Pysh: After shipping robots that work every day in the warehouse, what’s one belief that you held earlier in your academic career that you’ve had to revise based on that?
Ken Goldberg: Lots.
[00:33:14] Ken Goldberg: I would tell you one of the things that is very interesting is that you think, okay, I have this great new technology that’s the breakthrough that really solves an important problem.
[00:33:23] Ken Goldberg: Therefore, I can rush out into the commercial world and build a company around it. Well, it turns out that the technology is only a very small core part. It’s an enabler, but then there are all these things that have to come around it that are equally, if not more, important. And when you go to the customers and say, hey, we have this new AI thing,
[00:33:41] Ken Goldberg: They’re like, wait a second, I don’t care about that. How much money is it going to save me? That’s all they care about. Yeah. Yeah. And that’s business. That’s business. Both my grandfathers were entrepreneurs, and so was my father. So I grew up in these kinds of environments, and it’s tough stuff out there.
[00:33:56] Ken Goldberg: One grandfather was very successful in electronics, and my other grandfather was in the housing business building homes. But my father struggled. He was a metallurgist and he had a company doing chrome plating, and it was very difficult. And, you know, he was buffeted by things way beyond his control.
[00:34:13] Ken Goldberg: Like, you know, the recession of the seventies really hurt his business very badly, so he struggled. So there are a lot of factors, and it has to do with competition and timing. What I would also say about industry, and this is going to come back to the data aspect, is that you can do things in a lab where you think you’ve really explored the full range of a problem.
[00:34:37] Ken Goldberg: So let me give you this example. We were addressing the bin-picking problem, remember? And we were dropping all kinds of objects in there. In fact, when people would come visit the lab, I’d say, well, where are your car keys? Drop ’em in here. If the robot picks them out, we’ll keep the car. But it would always do it.
[00:34:57] Ken Goldberg: It was no problem picking out someone’s car keys. And we tried it on all kinds of things. Again, toys. We made 3D-printed, weirdly shaped objects, all kinds of things we could think of. We tried to basically consider everything, and we were just trying to push the envelope, right?
[00:35:30] Ken Goldberg: Well, the envelope was the key word because it turned out that one thing we didn’t ever really experiment with was bags. Oh. And bags are extremely common in shipping. Yeah, you probably, you know, if you receive bags from your e-commerce, from Amazon or others, you get bags of all kinds of forms.
[00:35:51] Ken Goldberg: Now, bags are often plastic or paper, but the issue with bags is they’re loose. And so they have objects in them, but there’s a lot of slack. They tend to fold in interesting ways. So we weren’t testing those really in the lab. That wasn’t something that we would’ve thought about too much, but that’s so much more common in real shipping.
[00:36:12] Ken Goldberg: So my point is we had to adapt all of our systems to the reality of the consumer market, which in this case is bags. And that was something we didn’t have a lot of data on. So we had to adapt our systems to work on real bags. And real bags are very difficult to even simulate and model, because of the way they fold.
[00:36:33] Ken Goldberg: And by the way, the folding matters because if you go to pick up with a suction cup right on top of a fold, as you lift it, the fold will unfold and you’ll lose the suction and drop the object. So we started collecting data as we started putting these robots to work.
[00:36:58] Ken Goldberg: As our customers were putting these systems into production at Ambi Robotics, right, we also had an agreement that we would maintain these systems at very high performance levels because we are constantly monitoring them. So we have a dashboard at the central headquarters in Berkeley where the team keeps an eye on every machine that’s in operation out there.
[00:37:20] Ken Goldberg: And so what we do is we get data on every single pick operation and what happens: how long it takes, whether it dropped the object, whether it was classified correctly, all kinds of things like that, right?
[00:37:34] Ken Goldberg: And we use that so we can immediately tell when the performance drops, let’s say the picks per hour, which is how it’s often measured. We can spot that early, call the company, and say: what’s going on? Did something change? Did a camera get knocked? Is the suction cup getting worn?
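As a sketch, drop detection of the kind Ken describes can be as simple as comparing a short recent window of picks per hour against a longer trailing baseline. The window sizes and the 20 percent threshold below are illustrative assumptions, not Ambi Robotics’ actual values:

```python
from collections import deque

class PickRateMonitor:
    """Alert when recent picks-per-hour falls well below a trailing
    baseline (illustrative thresholds, not a real production system)."""
    def __init__(self, baseline_n=1000, recent_n=100, drop_frac=0.2):
        self.baseline = deque(maxlen=baseline_n)  # pick durations, seconds
        self.recent = deque(maxlen=recent_n)
        self.drop_frac = drop_frac

    def record(self, pick_seconds):
        self.baseline.append(pick_seconds)
        self.recent.append(pick_seconds)

    def picks_per_hour(self, window):
        return 3600 / (sum(window) / len(window)) if window else 0.0

    def degraded(self):
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough recent data yet
        base = self.picks_per_hour(self.baseline)
        now = self.picks_per_hour(self.recent)
        return now < (1 - self.drop_frac) * base

monitor = PickRateMonitor()
for s in [6.0] * 900 + [9.0] * 100:  # picks slow from 600/hr to 400/hr
    monitor.record(s)
print(monitor.degraded())  # True: time to check the camera or suction cup
```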
[00:37:14] Ken Goldberg: And so we’re constantly on top of it. Part of it is that it’s a source of big pride for us. We’re really customer-focused and we want to make sure our machines work completely reliably. But the amazing side effect of this is that we’ve been able to collect data from all these real systems and real environments over the last four years.
[00:37:32] Ken Goldberg: And we now have the equivalent of 22 years of robot data. So remember I talked about the hundred thousand years? Yeah. Yeah. Now we have 22 years. It’s a start. Okay. But it’s real robot data. It’s extremely valuable. It’s high quality. Yeah. It’s a gold standard for data.
[00:37:51] Ken Goldberg: And so we’re now using that to refine our systems and make them better, higher performing, more reliable, but also allowing us to branch out into new, related types of products. So we introduced a new product called AmbiStack that stacks boxes very efficiently, very densely.
[00:38:14] Ken Goldberg: And that’s a new product; we sold out our first batch of these systems this year.
[00:38:17] Preston Pysh: Amazing. On this idea of robot data, or covering this hundred-thousand-year gap that you’re talking about: for a company that would be trying to overcome this, because the data just isn’t there,
[00:38:31] Preston Pysh: are they just having to construct a bunch of physical, real-world—going back to the hand example—would they have to have a bunch of physical hands with just a bunch of physical objects to then just be doing this? Or is this something that you think we could simulate in a virtual environment to accelerate that speed, or kind of a combo of both?
[00:38:52] Ken Goldberg: Good. So for grasping, it turns out simulation works pretty well because there you just need to know the geometry of the environment fairly accurately and then the geometry of the object and the gripper. And then you can actually model that fairly well. Now grasping is just lifting an object off the table—
[00:39:10] Ken Goldberg: okay, or out of a bin. That’s very different than tying the shoe that we talked about earlier. There it turns out that we can’t simulate that so well. As I mentioned, we don’t know how to model and simulate the deformation, the minute forces that are going on in the process of interacting with that object.
[00:39:30] Ken Goldberg: So that’s a challenge. This is a little nuanced, and I know that your audience might say, well, what is Goldberg talking about? He said this couldn’t be solved. Now he says it can be solved. Well, it depends. There are certain categories of problems that can be addressed, and I think that picking objects out of a bin is something we’ve made an enormous amount of progress on in the last five years.
[00:39:49] Ken Goldberg: So I’m very optimistic about that. I think we’re getting faster, more reliable, and those systems are—you know, that’s the cutting edge of robotics and it’s real. Yeah. But then tying shoes and doing things around a home, by the way, or in a factory where you’re actually trying to put together, you know, electronic parts or car bodies or installing upholstery and wiring inside a car,
[00:40:14] these are extremely difficult. By the way, even in Detroit or anywhere in the world, there are still humans doing those jobs because they’re very hard. So those are hard to simulate. And I do think everything’s pointing toward deformation as a key obstacle. I’ve talked with people who are physicists and experts in deformation, and they agree this is a very hard problem; we don’t even understand the physics of friction and deformation very well.
[00:40:41] Preston Pysh: Interesting.
[00:40:43] Preston Pysh: You’ve said your views on AI creativity have changed. Walk us through some of the timeline and what’s changed and just kind of your overall opinion today.
[00:40:52] Ken Goldberg: Okay. Well, on a very different note, I have been working as an artist in parallel with my work as a researcher and engineer.
[00:41:01] Ken Goldberg: You know, I like to say my day job is teaching at Berkeley and running a lab there, but I have another passion, which is making art. And I’ve worked on this for almost the same amount of time, and I make installations and projects. We did a project called the TeleGarden where we had a robot that was controlled by people over the internet.
[00:41:21] Ken Goldberg: And the robot could tend a living garden. We put this online in 1995, which was the very early days of the internet, and I’m very proud of that project because it stayed online for nine years. Huh. Twenty-four hours a day, people could come in and explore this garden and plant seeds and water them.
[00:41:49] Ken Goldberg: So it was an artistic project, but it was also an engineering proof of concept and it had to work reliably. And so, you know, it really pushed us. I sometimes say people think doing engineering is hard—try art. It’s really hard because you have to deal with the public and they’re going to interact and do all kinds of crazy things.
[00:42:08] Ken Goldberg: So we had to spend a lot of time designing that system. But I’ve continued my interest in art, and I have a new show coming up. It’s a joint project with my wife, Tiffany Shlain, who’s an artist, and she and I are collaborating on an exhibition that’s going to open in San Francisco on January 22nd.
[00:42:35] Ken Goldberg: Okay. All right. So this is a big passion of mine, and it’s using technology like AI and robots to ask questions about technology. And I’m very interested in this contrast between the digital and the natural world, where they seem very symmetric and similar, but there are very profound differences between them.
[00:42:56] Ken Goldberg: So that’s what I think about or try to express in my artwork. And so your question about creativity—yeah, I always said, you know, AI won’t be creative in the sense that you can ask it questions, but it’s not going to actually come up with original ideas.
[00:43:17] Ken Goldberg: But I actually have shifted my view on that. Yeah. And I give this example where I asked ChatGPT in the early days, hey, gimme a hundred uses for a guitar pick. And I just thought it would start repeating the same thing over and over again.
[00:43:42] Ken Goldberg: But then it started listing these as fast as I could read them, or faster, and then it came up with one that stopped me: it said, a miniature sail for a toy boat. And when I saw that, I was like, oh my God, that is a genius idea. And I would not have thought of it.
[00:44:05] Ken Goldberg: And immediately when you see something that’s original and creative like that, you spot it and you say, ah, why didn’t I think of that? Those are those rare ideas, and AI is capable of that now. And so it’s very exciting. Yeah, it is exciting. It’s super exciting.
[00:44:27] Ken Goldberg: So I’m not negative about AI at all. I love it. I use it. I advocate for it. My daughters, my wife—everyone uses it. And so I’m a hundred percent for it. I do think it’s going to help with robotics, but the question is, is it going to do everything that people are hoping?
[00:44:45] Ken Goldberg: And that’s where I hope that this conversation, Preston, will put things into context for your audience.
[00:44:52] Preston Pysh: So I don’t know if you’re going to like this question or not, but I’m going to throw it over because I’m curious what you would think of this. Figure AI recently sold their humanoid robot to put into the home, and there was a lot of pushback from people,
[00:45:03] Preston Pysh: you know, at least in the comments that I saw online, saying this was just a giant marketing scheme. Or maybe they’re trying to raise their next round, I don’t know. But for the audience, I’ll just frame it: it’s a humanoid robot, and all the demos that I’ve seen to date are extremely suspect as to its ability to actually do anything in the home.
[00:45:24] Preston Pysh: When you dig into it more, you’re putting this thing in your house, and I guess it has the ability to fall back to a human who would actually be manipulating the robot inside the house, which I think has all sorts of security and privacy issues and everything else, right?
[00:45:41] Preston Pysh: But the reason I bring this up is because I’m pretty sure he is the founder and CEO of the company, and he was suggesting that in order for AI to really start to accelerate its learning, it needs to start being embodied in physical form and put itself into a challenging learning environment.
[00:45:58] Preston Pysh: And what he means by some of this, and Ken, correct me if I’m wrong in the way I’m describing it, is that there are all these ambiguous situations that happen in the household with respect to social dynamics, the way the family interacts and what they ask of the robot.
[00:46:23] Preston Pysh: Like, hey, go get me a cup of water. And then the person who’s asking for it always likes it half full, or they like it warm or they like it cold or whatever. And so that learning that the robot would be forced to undergo from a social dynamic would assist in its ability to get smarter collectively.
[00:46:48] Preston Pysh: Because I’m sure all this information is then going back up into the mothership and getting networked. But his argument, and the point of my question, is this: do you also agree that for AI to take this next step, this next quantum leap from where it is today, it really needs to be able to immerse itself in, and basically put itself into, physical form?
[00:47:26] Ken Goldberg: Well, I think that is helpful, and certainly understanding the dynamics of human interaction, especially in a home, the social dynamics, is very subtle and, as you said, very important. Just understanding tone of voice, gestures. Like my daughter will say something that means, don’t bother.
[00:46:48] Preston Pysh: Teenager.
[00:46:58] Ken Goldberg: Yeah. Or just rolls her eyes. The subtle body language is super complex when you think about it. Super complex.
[00:47:26] Ken Goldberg: And we pick up on it in a myriad of ways we don’t even recognize, right? Like when I’m teaching, I can pick up if students are starting to lose interest or get tired or bored. I just feel it. I look around, but I’m always watching.
[00:47:48] Ken Goldberg: So all that—you have to be in real homes to be able to do that. And I think that makes sense. I’m not opposed to having a humanoid in a home that might be helpful for doing certain things, like fetching water or being able to pick up things around the house.
[00:48:04] Ken Goldberg: Remember grasping—I said that is actually something I think robots can do. So if you said, hey, pick up all the things that are on the floor, we would all like that. We have a Roomba, but the next step is to be able to actually pick things up and put them away.
[00:48:17] Ken Goldberg: And that I think we can get there. I actually think that’s going to come in the next decade, and it’s very valuable because, by the way, if you’re a senior citizen, you really want things off the floor.
[00:48:23] Ken Goldberg: And if you’re a young parent, or if you have a teenager—can you clean your room?—it’d be great to have a robot go in and just pick up all the clothes. We call that the teenager problem.
[00:48:42] Ken Goldberg: By the way, we have a paper on this, which is how to get a robot to efficiently pick up clothes. And it’s not the obvious thing because if you just program a robot to go in and pick up, it’ll pick up one sock, take it to the bin, pick up the next sock, take it to the bin.
[00:48:57] Ken Goldberg: You actually need to be able to pick up lots of socks together. And so how do you do that? That’s called multi-object grasping. It’s a very complex and nuanced topic, and so we’re studying that in the lab.
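A quick toy calculation shows why multi-object grasping pays off; the trip time and per-grasp capacity here are made-up numbers for illustration, not figures from the paper Ken mentions:

```python
import math

# Toy version of the "teenager problem": one sock per trip to the bin
# versus grasping several nearby socks at once.
n_socks = 30
secs_per_trip = 12       # assumed travel time to the bin and back
socks_per_grasp = 4      # assumed capacity of a multi-object grasp

one_at_a_time = n_socks * secs_per_trip
multi_object = math.ceil(n_socks / socks_per_grasp) * secs_per_trip
print(f"one at a time: {one_at_a_time}s, multi-object: {multi_object}s")
# one at a time: 360s, multi-object: 96s (same room, a quarter of the trips)
```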
[00:49:18] Ken Goldberg: So just coming back to your bigger point, I think robots will do something useful in the house. I think that’s possible. They could be useful for security also, maybe. Or some form of companionship somewhere down the road. Or as you get older, and I can appreciate this more and more, I might want a robot that could help me shower or get changed or help me get out of bed in the morning.
[00:49:28] Ken Goldberg: I think that would be nice. I’d rather have that than a stranger in my house.
[00:49:34] Preston Pysh: Right? Just not one that’s networked back to some other person on the controls.
[00:49:48] Ken Goldberg: Yeah. I mean, the privacy issues are huge. You’re right. And that’s something a lot of engineers don’t appreciate.
[00:49:48] Ken Goldberg: I faced this at Berkeley a few years ago where I was talking about privacy and I made an art project about privacy with surveillance cameras. And some of my friends were like, I don’t care about privacy. I have nothing to hide.
[00:50:10] Ken Goldberg: And I said, oh, really? Can I see all the letters of recommendation you wrote for the last ten years? Oh, no. Or how about all the research proposals you’re working on? No. So there’s all kinds of stuff you don’t want to share.
[00:50:31] Ken Goldberg: Same as in your home. It’s not that you’re doing something criminal or embarrassing, but there’s a lot you don’t care to share because it’s important and confidential.
[00:50:46] Ken Goldberg: So I’m not opposed to humanoid robots. I think it’ll be interesting to see what happens in the next few years. We’ll probably start to see these. It’ll be very interesting to see that rollout from Figure.
[00:50:55] Ken Goldberg: And Brett Adcock is a very compelling businessman. Like Elon Musk, he has a lot of optimism and confidence, and he is definitely building something that’s working to some degree. So it’ll be interesting to see.
[00:51:09] Ken Goldberg: And I’m not a naysayer. I’m not saying all this is going to fail. I’d just say, be patient. The real science-fiction stuff is going to take longer than we think.
[00:51:23] Preston Pysh: Last question I got for you, Ken. What’s the most exciting or surprising thing that you’ve seen in the lab or in the space in general that you almost gasped when you saw it in the past year?
[00:51:36] Ken Goldberg: Okay, so actually I have a good answer for that. You know, I’m so proud of Ambi Robotics for being able to sort packages around the clock at very high speeds. But I recently saw a company called Dyna Robotics.
[00:51:52] Ken Goldberg: And they demonstrated folding napkins with a robot, and they did it for 24 hours. They just had the camera set up. And this is, by the way, just two parallel-jaw grippers. Two grippers, no head, but it has cameras.
[00:52:13] Ken Goldberg: And they were folding napkins over and over again for 24 hours, reliably. The napkins would get tangled, and it would figure out how to untangle them and keep going. And the folds were actually pretty nice.
[00:52:22] Ken Goldberg: And then I saw a live demo of a new version that can now fold shirts. I saw this at a conference in Korea in September, and it was folding shirts round the clock. You could bring your own T-shirt and it would fold it.
[00:52:38] Ken Goldberg: That, to me, is very exciting. Everyone wants something to fold their clothes.
[00:52:42] Preston Pysh: That is for sure. I’m so bad at folding.
[00:52:48] Ken Goldberg: You have a folding board?
[00:52:51] Preston Pysh: Yes, I do.
[00:52:52] Ken Goldberg: Okay. So you’re going to be a great customer for this. But you have high standards, right? That’s where it gets tricky.
[00:53:03] Ken Goldberg: The Dyna Robotics guys—Jason and Linden—are pulling something off, and I think it’s worth watching. They’re showing one task done very reliably, rather than trying to do general-purpose robotics in a home.
[00:53:24] Ken Goldberg: Folding laundry, making coffee—bottom-up progress from specific tasks. That’s a path forward. But again, it’s going to take longer than most people think.
[00:53:45] Preston Pysh: Ken, I can’t thank you enough for making time. Your expertise is off the charts. If you have anything you’d like us to include in the show notes, just let us know.
[00:53:59] Ken Goldberg: Okay. I’ll send you some links. I’m also glad you’re going to connect with my good friend Rich Wallace. He’s a pioneer in chatbots and an unsung hero.
[00:54:31] Preston Pysh: All right, Ken. Thank you so much for coming on the show.
[00:54:34] Ken Goldberg: My pleasure, Preston.
[00:54:36] Outro: Thanks for listening to TIP. Follow Infinite Tech on your favorite podcast app, and visit TheInvestorsPodcast.com for show notes and educational resources.
[00:54:45] Outro: This podcast is for informational and entertainment purposes only and does not provide financial, investment, tax, or legal advice. The content is impersonal and does not consider your objectives, financial situation, or needs. Investing involves risk, including possible loss of principal, and past performance is not a guarantee of future results.
[00:55:01] Outro: Listeners should do their own research and consult a qualified professional before making any financial decisions. Nothing on this show is a recommendation or solicitation to buy or sell any security or other financial product. Hosts, guests, and The Investor’s Podcast Network may hold positions in securities discussed and may change those positions at any time without notice. References to any third-party products, services, or advertisers do not constitute endorsements, and The Investor’s Podcast Network is not responsible for any claims made by them. Copyright by The Investor’s Podcast Network. All rights reserved.
HELP US OUT!
Help us reach new listeners by leaving us a rating and review on Spotify! It takes less than 30 seconds, and really helps our show grow, which allows us to bring on even better guests for you all! Thank you – we really appreciate it!
BOOKS AND RESOURCES
- Official website: Ken Goldberg.
- Website: Ambi Robotics.
- Research Article: Dex-Net in Science Robotics January 2019.
- Executive Education profile: Prof. Ken Goldberg.
- Ken Goldberg interview by Kara Manke. UC Berkeley News. 27 Aug 2025: Are We Truly on the Verge of the Humanoid Robot Revolution?
- Goldberg on Moravec’s Paradox.
- Goldberg on AI and Creativity.
- TEDx Talk (Recorded Sept 9, 2023, ~12 min): “Robots: What’s Taking So Long?”
- Op-Ed by Ken Goldberg, Boston Globe, 30 May 2023: Let’s Give AI a Chance.
- Research Papers are available for download.
- Related books mentioned in the podcast.
- Ad-free episodes on our Premium Feed.
Some of the links on this page are affiliate links or relate to partners who support our show. If you choose to sign up or make a purchase through them, we may receive compensation at no additional cost to you.
NEW TO THE SHOW?
- Join the exclusive TIP Mastermind Community to engage in meaningful stock investing discussions with Stig, Clay, Kyle, and the other community members.
- Follow our official social media accounts: X (Twitter) | LinkedIn | Instagram | Facebook | TikTok.
- Check out our Bitcoin Fundamentals Starter Packs.
- Browse through all our episodes (complete with transcripts) here.
- Try our tool for picking stock winners and managing our portfolios: TIP Finance Tool.
- Enjoy exclusive perks from our favorite Apps and Services.
- Get smarter about valuing businesses in just a few minutes each week through our newsletter, The Intrinsic Value.
- Learn how to better start, manage, and grow your business with the best business podcasts.
SPONSORS
- Simple Mining
- Human Rights Foundation
- Unchained
- HardBlock
- Linkedin Talent Solutions
- Onramp
- Amazon Ads
- Alexa+
- Shopify
- Vanta
- Public.com*
- Abundant Mines
- Horizon
*Paid endorsement. Brokerage services provided by Open to the Public Investing Inc, member FINRA & SIPC. Investing involves risk. Not investment advice. Generated Assets is an interactive analysis tool by Public Advisors. Output is for informational purposes only and is not an investment recommendation or advice. See disclosures at public.com/disclosures/ga. Past performance does not guarantee future results, and investment values may rise or fall. See terms of match program at https://public.com/disclosures/matchprogram. Matched funds must remain in your account for at least 5 years. Match rate and other terms are subject to change at any time.
References to any third-party products, services, or advertisers do not constitute endorsements, and The Investor’s Podcast Network is not responsible for any claims made by them.