Why the Next AI Revolution Will Happen Off-Screen: Samsara CEO Sanjit Biswas
Episode 72 | Training Data


Sanjit Biswas is one of the rare founders who has scaled AI in the physical world – first with Meraki, and now with Samsara, a $20B+ public company with sensors deployed across millions of vehicles and job sites. Sanjit discusses why physical AI is fundamentally different, from running inference on two- to ten-watt edge devices to managing the messy diversity of real-world data—weather, road conditions, and the long tail of human behavior.


Summary

Lessons scaling physical AI:

Real-world data is both the biggest challenge and the greatest asset: Sanjit highlights that the diversity and messiness of real-world data make it difficult to train and deploy AI at scale, but this same breadth enables unique, high-impact solutions that pure software cannot match.

Edge computing is essential for physical AI: Unlike cloud-based AI, operating in the physical world often means running inference on low-power, distributed devices. Compression, model distillation, and task-specific architectures are required to make AI practical and cost-effective at the edge.

Go-to-market execution is as critical as engineering: Sanjit stresses that building great technology isn’t enough—successful impact depends on getting products installed, training frontline workforces, and driving adoption across massive, traditional industries, which requires thousands of people and meticulous sales execution.

Technical curiosity can uncover overlooked opportunities: Transitioning from a domain expert at Meraki to a newcomer at Samsara, Sanjit’s curiosity about how infrastructure really works led him to recognize and address neglected, high-value problems in physical operations.

AI’s value lies in actionable insights, not just data: Customers face data overload, and AI’s transformative power is its ability to distill vast sensor streams into practical, actionable recommendations—improving safety, efficiency, and even workforce morale through real-time coaching and recognition.

Transcript

Introduction

Sanjit Biswas: If you think about it, there’s, like, a third shift between midnight and 8:00 am roughly, right, that people tend not to work because they’re sleeping. Imagine if operations like logistics could still run during that shift, right? And then same thing. Imagine you’re a field service technician and you need a part. Like, how amazing would it be if the part could just get delivered to you? Like, that is something that’s going to be a nice augment to operations. So it’s interesting, because typically when you see automation kick in, again volume increases, right? Because costs come down. There’s way more demand out there than people realize, because sometimes you’ll say, “Yeah, I could use that part, but I don’t need it delivered if it’s going to cost 50 bucks for someone to drive it to me.” If it costs five bucks or no bucks, like, how awesome would that be? So we kind of view it as, like, it will increase the speed that the world operates at.

Sonya Huang: In this episode we talk with Sanjit Biswas, founder and CEO of Samsara. Sanjit formerly founded Meraki, and has a legendary reputation amongst Sequoia-backed founders, so I’m excited to welcome him today for a conversation about physical AI. Samsara is a $20-billion market cap public company with sensors deployed in millions of vehicles, streaming data that captures 90 billion miles annually. Sanjit shares his insights about the constraints of physical AI, from running inference on two- to ten-watt edge devices, to why the messy diversity of real-world data is both the biggest challenge and opportunity for embodied AI. If you’re building in robotics or physical AI, this conversation offers a rare perspective from somebody who’s actually scaled it. Enjoy the show.

Main conversation

Sonya Huang: Sanjit, thank you so much for joining us today. You are a legendary Sequoia founder, and it is a delight to have you back at Sequoia.

Sanjit Biswas: Thanks for having me. It’s great to be back.

Sonya Huang: I want to start with your background. So you went from MIT’s Roofnet project to co-founding Meraki through its $1.2-billion acquisition. And you are now the founder and CEO of Samsara, a $23-billion market cap company with the best ticker on the public markets. What’s the throughline? Tell me about your personal passions and experiences, and what the throughline is between all of that.

Sanjit Biswas: Yeah, so I’m an engineer by background, so studied EE and CS. I went to undergrad out here at Stanford, went to MIT for grad school. And that’s where we worked on this project called Roofnet. So the throughline for me has been about building cool products, cool technologies that have real-world impact.

And Roofnet—this is, like, over 20 years ago—the idea was: could you build really big wireless networks? Because kind of early 2000s, WiFi was not mainstream, it was a brand new technology, internet access was just becoming mainstream and was still pretty expensive. And so we saw this opportunity to take WiFi chips and all that technology and use it to build really big networks.

And so we kind of had this idea that internet access should be everywhere, right? It should be in the air.

Sonya Huang: Yeah.

Sanjit Biswas: And how would you do that? And we’d need to build a big network. So that was Roofnet. And then with Samsara, it’s a bit of a different sort of focus. We focus on the world of physical operations. So think all the infrastructure companies, whether it’s energy utilities, construction, logistics, like, all these real-world physical industries. And the idea has been real-world impact through things like risk reduction, improving efficiency, improving sustainability, just using all this data and now AI.

Sonya Huang: Yeah. Physical AI feels like it’s finally going through an inflection moment. You’ve been building Samsara for the better part of a decade now. What did you see at the time, and how has the field changed? Why now?

Sanjit Biswas: Yeah, so if I rewind 10 years to when we were founding the company, we had a couple of sort of intuitive bets or guesses, and the “why now” for us at that point was connectivity. So we had been through the Meraki journey, and we’d seen internet access go from being kind of rare and expensive to being ubiquitous—so this is 2015 for reference. We saw basically the ability to process large amounts of data really coming online. So the cloud had matured, we were seeing the beginnings of the GPU wave.

And if you remember back to 2015, Nvidia was a player, and they were doing a lot of interesting, like, embedded GPUs. So if you picked up a Nintendo Switch back then, it had amazing graphics, but it fit in your hand. And so we saw compute was getting really good, and then we saw sensors, really specifically cameras were getting really good, because this is probably seven, eight years after the iPhone launch. So cameras had gotten extraordinary. And you combine all these three things together. You’ve got connectivity, you’ve got compute and you’ve got sensors/cameras. And we said this is the sort of, like, makings for total sea change when it comes to ability to process data in real-world context.

Sonya Huang: Wonderful. Okay, I’m excited to nerd out more about technical questions on the frontier of physical AI. Before we get into it, maybe can you just say a word on Samsara for our listeners? I guess how much of the business is—you know, I think of it as very much a—having had roots in commercial trucking. How much of a business is that today, and what do you see the ultimate vision of Samsara being?

Sanjit Biswas: Yeah. So we really focus on the broad world of physical operations. So think about all these different kinds of industries. Trucking is definitely one of them, it’s about 20, 25 percent of our business. So logistics and big trucks on the road. A lot of our business now is related to field service and construction, so other big kind of frontline industries. But we also are starting to work in, like, public sector. So we work with local governments, we work with student transit. So we just signed, like, the largest yellow school bus operator in North America, which is pretty cool. And we work in industries like aviation. So think about, like, labor-intensive, asset-heavy industries that really power the infrastructure of the planet.

Sonya Huang: Wonderful.

Pat Grady: Can I go back to your comment on—so at the time there was a “why now” around bandwidth, compute and cameras.

Sanjit Biswas: Yeah.

Pat Grady: And it sounds like you may not have necessarily had a crystal ball on what was going to happen with AI, but you kind of felt like you’re on the right side of history, and that with those raw ingredients you’d be able to do increasingly sophisticated stuff over time.

Sanjit Biswas: Yeah.

Pat Grady: What I’m curious about, I feel like there are a lot of founders today who are kind of in a similar position where nobody has a crystal ball. We don’t really know what’s coming, but you kind of know that whatever capabilities you’re going to have tomorrow are very different and better than whatever capabilities you have today.

Sanjit Biswas: Yeah.

Pat Grady: So I guess the question is, like, since you kind of had a directional sense for where the world was going, how did that influence the way you built the business? Like, was there anything specifically you did just kind of in anticipation of this inevitable direction that the world was going?

Sanjit Biswas: Well, actually the historical context is important. So our first company, Meraki, which was funded by Sequoia, we were domain experts, so we knew a ton about networking because that’s what we’d been working on in terms of our PhD.

With Samsara, it was kind of the opposite. We knew nothing about this domain. Like, we’d never driven a commercial truck before, never worked in a warehouse. And so we were sort of eyes open about it. What we did have was that intuitive sense of the compounding rate of those underlying technologies. So we said, okay, there’s this really interesting, like, problem space, this world of physical operations. It’s kind of overlooked—especially 10 years ago, no one was really talking about infrastructure the way they are now. But there are—you know, things are changing very quickly behind the scenes in terms of tooling.

So that intuition is exactly sort of what we were powered by. And we said even if it’s not mainstream yet or it’s not ready yet, certainly in five to ten years—which is about now—it will be possible to do this stuff.

So I think for a lot of the current founders, it’s kind of like if you look at AI model capabilities, even when, like, you know, the ChatGPT moment happened, these models weren’t perfect. They’ve gotten a lot better in the last two, three years, and they’re going to get even better in the next two to three years. I think technical people understand that in a way that consumers and customers often may not see yet.

Sonya Huang: So I think given the embedded systems background, and you’re one of the unique people that’s operated at the intersection of the hardware and software worlds, I’m curious, what are the things that make building AI in the physical world different than running AI in big data centers?

Sanjit Biswas: A couple of things. Well, it actually is a lot of fun. So the physical world is very diverse. You know, you see a lot of companies now working on physical intelligence and world models. And it’s because the training data set is really broad and vast.

So if you think about our products, we have products like dash cams that end up on the roads on millions of vehicles. They see, like, 99 percent of the US roads. It’s just an incredible data set. You’ve got urban, you’ve got rural, you’ve got residential, you’ve got weather. And so we see all these interesting, exceptional cases. So the training data is really interesting. And then what we can apply all the inference and basically pattern matching to is also interesting. So I think that’s the most fun part.

The most challenging part, though, is how messy it is and how distributed it is. So for our products, it’s not practical for us to just stream all the data to the cloud. It would be like a crazy bandwidth bill. You need pretty massive data centers if you think about millions of video streams, like, constantly running inference. And so we have a much more distributed architecture where we actually run in the cameras themselves. And that changes your compute and power footprint. You know, we’re talking about two to ten watts not, like, kilowatts, right? But you can do a lot more because you’ve got millions of them.
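A quick back-of-envelope calculation makes the point about the “crazy bandwidth bill.” All the numbers below (fleet size, per-camera bitrate, summary bitrate) are made up for illustration; they are not Samsara’s figures:

```python
# Compare backhauling raw video from every camera vs. sending only the
# compact event/token summaries produced by on-device inference.
cameras = 1_000_000          # hypothetical fleet size
video_kbps = 1_000           # ~1 Mbps per compressed video stream (assumption)
summary_kbps = 2             # tiny event stream after edge inference (assumption)

raw_gbps = cameras * video_kbps / 1e6      # total if everything streams to the cloud
edge_gbps = cameras * summary_kbps / 1e6   # total if the edge sends summaries only

print(f"raw video backhaul:  {raw_gbps:,.0f} Gbps")   # 1,000 Gbps
print(f"edge summaries only: {edge_gbps:,.0f} Gbps")  # 2 Gbps
```

Under these toy assumptions the gap is 500x, which is the basic economic argument for running inference on the two- to ten-watt device itself.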

Sonya Huang: Hmm. And so what is it—like, how do you run—I’m thinking some of these large LLMs, even the image models people are working with right now, are very large. Are you running just very bespoke small models on the two-to-ten watts? That doesn’t give you much of a footprint to work with.

Sanjit Biswas: It doesn’t give you a lot of room. And that’s a fun engineering problem. So if you think about it, these state-of-the-art models, they are very large. So you’re talking about, like, you know, hundreds of millions of parameters or billions of parameters. That is simply not possible.

So our footprint is much more similar to what you could run on, like, your mobile phone, right? So it’s not tiny, it’s not a microcontroller. It runs Linux, it’s got, like, hundreds of megs of memory, maybe gigs. But it’s not like a big data center, right?

So what we tend to do is we will train models in the cloud, we’ll basically distill them down or use teacher models. So we’ll use a big model to basically instruct a small model that’s really designed for our use case. Because we don’t need to be able to answer, you know, what the capital of France is, right? Like, that’s not something the dash cam has to encounter. But we do need to be able to understand what is the risk profile on the road. So we train it with the data that’s relevant for the task.
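The teacher-student distillation Sanjit describes can be sketched roughly like this. This is a toy illustration in plain Python, not Samsara’s pipeline; the risk-class labels and logit values are invented. The core idea is that the small on-device student is trained to match the softened output distribution of a large cloud teacher:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to probabilities, softened by a temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between the teacher's softened distribution and the
    student's. Minimizing this pushes the small, task-specific student
    toward the big teacher's behavior on the same input."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's current guess
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical example: a large cloud model scores three road-risk
# classes (say: safe / phone-use / harsh-weather) for one video frame.
teacher = [4.0, 1.0, 0.2]
aligned_student = [3.8, 1.1, 0.3]   # already imitates the teacher well
drifted_student = [0.1, 2.0, 1.5]   # disagrees with the teacher

assert distillation_loss(teacher, aligned_student) < distillation_loss(teacher, drifted_student)
```

In a real training loop this loss (usually mixed with a standard label loss) is what gets backpropagated through the student; the temperature softens the teacher’s distribution so the student also learns from the relative scores of the non-top classes.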

Pat Grady: How much of the data—you see 99 percent of US highways or US roads, which is amazing. How much of that data can you make use of? How much of that data do you make use of?

Sanjit Biswas: Yeah. We can make use of a lot of it. And we basically have the ability to train over, like, this entire data set. There is a very practical question of, like, okay, you run a tokenizer at the edge, you send all these to the cloud, what do you do with it?

Pat Grady: Yeah.

Sanjit Biswas: And what’s cool about that is what we do with it this year is so much more interesting than what we could do with it two, three years ago. So two to three years ago—and these products really started around this idea of reducing risk. So if you think about the problem we’re trying to solve, it’s that, you know, our operations customers, they operate on these roads every day. It’s actually the riskiest thing that they do. It’s more so than construction or working in oil and gas, driving on the highway, getting to and from the job site is where they incur, like, most of their, like, fatality or kind of high-severity risk.

So the question is: how do you go to take all these images and tokens and turn it into a risk signal, right? A couple years ago we said the biggest risk we are seeing right now is mobile phone usage. Like, people are on their mobile device while driving a big truck. And that’s super risky. So we built a detector for that.

Pat Grady: Mm-hmm.

Sanjit Biswas: You do that and you say, “Okay, we can solve this problem. We can detect mobile phones. What else drives risk?” Now we’re seeing things like weather, right? And weather’s always been a risk factor. It’s not a brand new one, but it’s now something we can detect using these pretty sophisticated models. Training a weather detector using, like, old school convolutional networks, like an AlexNet-style model, you would have gotten a lot of things wrong. You couldn’t tell the road conditions. Once you use more sophisticated models like the ones we have today, you can really figure it out. So that’s the cool thing is there are these unlocks that happen every couple of years as model capabilities increase and our data set increases. So these two things, like, really work in our favor.

Pat Grady: Is there an upcoming unlock that you are most looking forward to?

Sanjit Biswas: In terms of our product set, or just in general?

Pat Grady: Or a new capability that’s going to unlock some new use case or some new feature for your product.

Sanjit Biswas: You know, I feel like we are seeing just such incredible foundational model capabilities that are making it possible to just inference over huge amounts of data. So historically what we did is we understood, like, what was happening in the moment. So like I said, mobile phone detection, or not wearing a seatbelt or following distance. Now we can start to really look over the course of a trip, and we’re not only detecting, like, negative, like, risky downside events, but we can actually detect good behaviors, too. And I’m really excited about that, because frontline workers, 80, 90 percent of the time, are doing a great job. No one’s able to recognize it because no one sees it. So what’s awesome is we can now see that someone’s doing awesome and give them a high five or, like, some kind of recognition or kudos. That is, like, making people’s day. And it’s a cool, like, silver lining side effect of having all this stuff running. So anyway, it’s kind of an unexpected upside sort of thing.

Pat Grady: Yeah. Yeah.

Sonya Huang: And do you think it’ll be video reasoning models that sort of empower that? I know you can’t run giant models at the edge, but are you doing stuff server side that takes advantage of LLMs?

Sanjit Biswas: Yeah, I should have mentioned that. So the models are connected. We have a ton of inference running at the edge. It’s running continuously, because when you’re driving there’s, like, continuous risk. And then we’re taking those tokens, we’re streaming them up. And in addition, you know, we have images, we have video, we have other kinds of telemetry. And then we can go and run all kinds of sophisticated things in the cloud. So if we need to understand when an accident happened what really happened, we can run a full video language model, like a reasoning model essentially in the cloud. And that can say, oh, this was actually defensive driving and this guy got cut off, or these were the conditions. So that is really cool. We couldn’t have done that five years ago.

Sonya Huang: Do you believe in world models? Loaded question.

Sanjit Biswas: I do. I’m cautiously optimistic about them, but I think you need a tremendous amount of data.

Sonya Huang: Yeah. Are you guys training your own world models?

Sanjit Biswas: We are not building our own world model. And I think that requires a very specific kind of focus. But in the same way, we don’t train our own, like, base foundation models, but we are looking forward to using them at some point.

Sonya Huang: Yeah. And I imagine you have an incredibly rich data set that might be useful.

Sanjit Biswas: We do, yeah. We see about 90 billion miles on our system every year. So it’s a lot of driving.

Sonya Huang: Yeah. It seems like the sensor footprint you’ve built out is like a tech nerd’s dream, right? Most people dream of a connected world and, you know, you should be able to have so much telemetry on all these different attributes of the physical world. But as far as I can tell, you’re one of the only companies that’s really gone out and, you know, put sensors on the physical world in a really meaningful way. Why do you think that is? And what’s the key to actually being able to make that dream happen, versus have it just be a, you know, tech nerd’s dream?

Sanjit Biswas: Yeah. First of all, it takes a village to actually get this stuff out there. And I think that’s maybe one other big difference between just pure software and physical world is we have to get the products installed. So they’re installed on millions of vehicles. We have to train frontline workforces on, like, what this stuff is and what it’s doing. And then we have to provide value to all these customers kind of from day one, right? Like, they have to get something out of it.

You combine all those together and you get this big footprint. But it’s been hard, because you need thousands of people at our scale now to do this and to do the change management, the installs and all that kind of stuff. You know, there are a few companies that have data sets of this scale, but it’s like Tesla and then probably us, right? And then Waymo. There’s thousands of Waymos, but not millions. And maybe it will be millions in the future, but we’re not there yet. So that gives you a sense of how much effort, just, like, sheer willpower, is required to get this stuff out there.

Pat Grady: Speaking of which, I think there are a lot of founders right now who are technical founders like yourself who’ve built something cool, and are now encountering this crazy, supercharged race to scale that the AI wave seems to have brought. And so I guess the question is you are a technical founder. I think both Samsara and Meraki have been known for go-to-market execution. And so maybe the question is, like, how important has go-to-market execution been to your success? And as a technical founder, was it obvious to you at the beginning that it was going to be that important, or kind of what was your journey like in appreciating the importance of go-to-market execution? If that makes sense.

Sanjit Biswas: Yeah, I’m replaying, like, 20 years in my head really fast.

Pat Grady: [laughs]

Sanjit Biswas: So when we started Meraki, at that point in time, like, I had never sold anything in my life. In fact, like, as an engineering nerd, like, I avoided any situation where there was, like—you know those fundraisers where you have to sell candy bars at school? Like, I was like, “Does anyone need a website for this thing?” Like you’re just trying to find some way out of it. So I really was not like a salesperson in terms of background. And no one in my family had done sales, so it was very foreign.

The thing that turned me onto it was this idea of this is what it takes to get the product out there, and if the product’s not out there, it’s not having impact. So if you’re driven by impact, if that’s what motivates you, it’s fun to see people using it, right?

And then this is what makes it sustainable. So with Meraki, we were growing the company between 2006 and 2012, when it was acquired. In the middle of that was the Great Financial Crisis, right? There wasn’t a lot of funding at the time. Like, risk capital was just, like, turned off. So we basically had to make the company operate at break even or thereabouts. And that’s what really convinced us, like we have to figure out how to have sustainable sales execution, and a model that’s highly predictable. And as engineers we’re like, “Hey, this is actually a big engineering problem, right?”

And then that stuck with us with Samsara. We were talking about impact at scale. We raised capital along the way, but actually, we reinvested way more just from the revenue of the company and the gross margin. So if you look at our numbers—we’re public now, so you can kind of go back through the balance sheet—you can see we’ve invested probably close to $3 billion just in getting the stuff out there, right? R&D, customer success, all that stuff. That is only possible with a lot of sales, right?

Pat Grady: Yeah.

Sanjit Biswas: So once you understand the why, you can kind of buy into it and say, “I’m going to figure this out.” It was not natural for us, but it was a pivot that ended up being something we had to do. And I’m really glad we figured it out and have been getting better at it each year.

Pat Grady: Yeah.

Sonya Huang: At Meraki, you were the domain expert. At Samsara, you were not when you started the company. Why go and pick that domain?

Sanjit Biswas: I think it was curiosity. And this is a little bit of, like, going back to sort of curious nerd roots. Like, you just find yourself, like, reading books and wondering how stuff works. So after Meraki, we actually didn’t have a plan to start another company. There was a while I thought I was going to go back to grad school, finish the PhD kind of thing. My co-founder, John Bicket, he’s way smarter. He’s like, “That’s never going to work, but you go do that.”

And in that period of time, I realized that academic research had a very long feedback loop, kind of a slow cycle. But there were a lot of other interesting problems that caught my attention. So I got interested, I think, in energy at that time. So I was, like, learning about how the electrical grid worked—or at the time didn’t work, because photovoltaics and renewables were coming online. I started getting curious about nuclear, about satellites and things like that. So it’s kind of fun to be able to just open your mind up to everything when you’ve been, like, laser focused on one thing. And then over and over I found myself and then John found himself, like, attracted to this world of infrastructure. And so it was just curiosity about this part of the world that felt pretty overlooked.

Sonya Huang: Really cool. What do you think of autonomy? And that might be a loaded question.

Sanjit Biswas: Yeah.

Sonya Huang: But, you know, two years ago I avoided getting in Waymos. Now I don’t think twice. I feel safer in a Waymo than not in one. What’s your point of view?

Sanjit Biswas: Super excited about it. Very bullish. I think it’s been a long time coming. When I was an undergrad at Stanford, they were doing the first, like, DARPA Grand Challenge cars. So this is, like, 20-plus years ago now. And like you said, Waymos have gone from kind of like prototype tests to, like, I prefer Waymo, right? It’s super consistent. You know, there’s lots of things to like about it.

So our view on it is autonomy happens, and it actually increases the operational intensity of the world, right? So if you think about it, there’s, like, a third shift between midnight and 8:00 am, roughly, right, that people tend not to work because they’re sleeping. Imagine if operations like logistics could still run during that shift, right? And then same thing. Imagine you’re a field service technician and you need a part. Like, how amazing would it be if the part could just get delivered to you? Like, that is something that’s going to be a nice augment to operation. So we’re a fan of it. Our view on it is we think it’s an “and,” not an “or” exclusive. And it’s interesting, because typically when you see automation kick in, again volume increases, right? Because costs come down. There’s way more demand out there than people realize. Because sometimes you’ll say, “Yeah, I could use that part, but I don’t need it delivered if it’s going to cost 50 bucks for someone to drive it to me.” If it costs five bucks or no bucks, like, how awesome would that be? So we kind of view it as, like, it will increase the speed that the world operates at.

Sonya Huang: You think it’s going to happen on roads only, or you have customers with warehouses and forklifts and all the above? You think autonomy will hit all those sectors?

Sanjit Biswas: So I think autonomy already hit the warehouse. We have a lot of customers with big, like, you know, logistics warehouses. And really, about 10 years ago they started getting automated in a meaningful way. And it’s pretty rare for me to go into, like, a heavily, you know, industrialized environment without seeing automation. And that’s everything from, like, lift systems to big arms moving things. And it actually is welcomed by the people in the warehouse, because it helps reduce injury. So if you think about it, frontline workers are putting themselves at risk when they do their job every day. It is not a great outcome to get hurt lifting a pallet or doing something like that. So that is, I think, a good sort of preview of what we’re going to see out on the road. And then I think after that, there’s a construction site and job site.

Sonya Huang: Yeah. Humanoids. Yes or no?

Sanjit Biswas: Cautiously optimistic. Little bit scary, I won’t lie. They feel like they’re in that kind of creepy, uncanny valley like when you see them walking around without heads or hands or something.

Sonya Huang: Have you seen NEO from 1X?

Sanjit Biswas: I have. That’s a friendly one, yeah. But I think it reminds me of where self driving was about 10 years ago. So probably not a tomorrow, but it does feel inevitable. So as the capabilities increase, it’s going to be really exciting.

Sonya Huang: Yeah.

Pat Grady: How does the role that Samsara plays in the world change as we have more and more autonomy over time?

Sanjit Biswas: Well, I kind of think of it as digital transformation. So if you zoom way out, that’s what customers are excited about is how do we digitize these operations that have been around 50, 100 years in some cases? And most of our customers, they welcome new technology. So they adopted computers for, you know, route planning, like, in the 1970s or something like that. So they’re not against technology. It’s, is it going to help? Is it going to be relevant?

So our take is you’re going to want, like, a platform to see all of your operations, for all of these different operations to interact. So you can see your frontline workers, you could see all your vehicles, you could see your assets, know what needs maintenance. All of these problems will be evergreen. You’re going to want to maintain your assets, like, 20, 30 years from now. Maybe they’re robots and maybe they move on their own, but they still need maintenance, for example.

And then same thing. When you’ve got customer-facing or end-customer-facing teams, you’re still going to need to orchestrate hopefully thousands of people, right? And they may have help from robots and humanoids and all kinds of stuff behind the scenes, but how do you kind of run the entire operation? So that’s what we focus on is the big picture, as opposed to any specific product or technology.

Sonya Huang: How do you see the future of humans and AI interacting in the physical world and in the industries that you serve?

Sanjit Biswas: Well, I think they’re getting closer and closer. So 10 years ago when we started Samsara, most of our customers did run on a lot of, like, pen and paper process. Like, 2015, it’s not the distant past, right? Like, it really has been a change that they’ve gone from pen and paper to apps. I think as AI kicks in, you see many of them, like, using voice bots for freight brokerage, right? Like, that’s a brand new phenomenon, really in the last year, and they’ve taken to it very quickly. It’s automating tasks.

So I kind of think of it as where is the high-task intensity, a lot of, like, repetitive task work. And can AI help? Absolutely. So that’s where we’re seeing, like, very high rates of adoption.

I think the stuff that’s not changing, at least not yet, is the physical work itself is still being done by people, because it requires a lot of exception handling. So construction is a great example. So much diversity in construction. We are not to the point where you can automate it the way you could automate, like, car manufacturing, for example.

Sonya Huang: Do you think AI is—you know, you mentioned it’s something that prevents risky behavior in humans. Are you also seeing it kind of coach humans in these operational environments to actually perform better?

Sanjit Biswas: Yeah. And first, just thinking about risk. Coaching makes a big difference. So there’s risk detection, like, you know, please put down your mobile phone. But then if it’s a habit of yours, we actually want to coach you to help break the habit, right? And if you kind of look at the impact we’re able to have with customers, we often reduce risk, like, by 75 percent. So, you know, three quarters of the risk comes out of the system. Maybe half of that can come from the automated, like, in-the-moment, in-cab alert, and then the other half comes from coaching. And then that same coaching can be applied to, like, fuel efficiency. You can actually train drivers to operate heavy equipment in really smart ways. And you can gamify it, right? So that’s the kind of, like, cool opportunity that AI has is to process just enormous amounts of data, more data than any human could do. You look at patterns across thousands or millions of vehicles and then turn it into actionable insight. That’s coaching. So you can apply it to safety, you can apply it to efficiency. It’s pretty cool.

Sonya Huang: What’s the organizing principle of your product portfolio? You started from dash cams, it’s expanded out from there.

Sanjit Biswas: Yeah.

Sonya Huang: Maybe just tell us the history of how the product portfolio has expanded and how you see the future.

Sanjit Biswas: Yeah, so we actually started with GPS tracking or telematics. So 2015, dash cams were not quite viable yet.

Sonya Huang: Because of the cost?

Sanjit Biswas: Yeah, cost. Both, like, the backhaul cost of bandwidth, but also the cost of the cameras and things like that. But what was surprising to us was in 2015, most of the operational environments we went into, no one had any idea where their field teams were, not in real time. And there was this, like, disconnect, because Uber and DoorDash had started to happen, right? And so it was weird. The gig economy had real-time tracking, but then the long-haul logistics economy was still getting, like, breadcrumbs every—I think it was like five to fifteen minutes. And this probably predates most of the people who listen to the show, but there was this platform, MapQuest, that predated Google Maps, right? So late-’90s MapQuest, like, vintage maps, right?

Pat Grady: Sonya wasn’t around for that. You’d have to print out the MapQuest directions, and then take your piece of paper to figure out where you were going.

Sanjit Biswas: And it was kind of grainy, it looked like, you know, Minecraft-level graphics. The amazing part was our customers back then were using MapQuest printouts, and their system for GPS tracking was built on top of MapQuest. So I would go on site and I would say, “Whoa, we can help with this.”

So that was product number one, was GPS tracking. That basically got us off the ground and got us into customers. And from there we started figuring out, well, really the bigger challenge for them was managing risk. Because at that point in time it was mid-2010s, people did have phones in their pockets, and they actually asked us, “We’re getting into a lot of accidents. Do you have a dash cam you recommend that works well with your system?” So we said, “If we built one for you, would you use it?” And they said, “Yeah, absolutely.”

So John, my co-founder, I remember he went to Amazon, ordered a webcam, plugged it into the USB port, and over the weekend wrote some code to get a basic webcam working. We brought it back to the customers the next week. They tried it, they loved it. And then we were watching the videos with them, and you could see as people were getting into accidents, they had their phone out, right? And so we said, can we build a detection for that?

So that’s where the AI part of the dash cam came from, is very iterative. And that has now become our largest product. But it’s sold with the first product. So you asked about the kind of portfolio strategy. It’s concentric circles, it’s keep doing what we started with: core use case, adjacent use case, what else can we do? What else can we do? What else can we do? And now we have about 10 products out there.

Sonya Huang: Really cool. You mentioned kind of the backhaul and network bandwidth [inaudible] binding constraints. I’m curious if you think the growing adoption of Starlink and just internet everywhere is going to change what it’s possible to do in the physical world.

Sanjit Biswas: Absolutely. So we started Samsara right around the 3G-4G transition, and the unlock was actually YouTube. So if you remember 2015, everyone was starting to watch YouTube and baseball games and stuff on their phone. That drove data consumption way up for the carriers. The marginal cost per gigabyte came way down, and we were able to piggyback on that, right? And so that was really cool.

I think something similar is happening now, not just with 5G, which is like the networks have invested even more. But now with satellite, like, the cost of building Starlink is enormous. Like, I don’t know how much is being spent on it. It’s like many tens of billions in launch capacity and so on. But the marginal cost to add another device to Starlink is pretty low, right? And that’s like the cost for any network effect. So we’re excited about that, because it’ll help us get that last, like, one percent of coverage. And a lot of our customers are in super remote rural areas. Like, we have a lot of customers in energy like oil and gas. There are no roads where they operate. And so there’s not that much cellular coverage either.

Sonya Huang: Do you think that does away with some of the constraints of running AI on the edge? Meaning, like, today you can only stream back some percentage of data, because you do a lot of onboard compute. In a world of just internet everywhere, where it’s just a lot faster and cheaper to send all data back and forth, could you be doing a lot of it server side? And could you be doing a lot more?

Sanjit Biswas: You could do more of it. But it’s funny how, like, when stuff gets cheaper, you find a way to do more, right? And so I think it’s like a compression problem. And if the workload was static, like if you were just trying to get GPS data into the cloud? Yes, just stream it all, right? Like, it’s not a big deal. If you’re trying to get one-frame-per-second video from an outward-facing camera into the cloud, no problem. But if you want HD video from a 360° view of a truck, like, eight cameras, that’s a lot of video. And then same thing if you want it with all the other telemetry that we get, it becomes pretty big. So I think you could potentially do it, but if you can push some of that to the edge and kind of compress it down, everyone benefits from it.
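The scaling Sanjit describes can be made concrete with rough arithmetic. Here's a minimal sketch of the daily backhaul per vehicle for the three workloads he names; all per-stream rates below are illustrative assumptions, not Samsara's actual figures:

```python
# Back-of-envelope daily backhaul per vehicle for three workloads.
# All rates are illustrative assumptions, not Samsara's real numbers.

GPS_BYTES_PER_FIX = 50           # small position/telemetry record
FIXES_PER_SEC = 1

STILL_BYTES = 100_000            # ~100 KB JPEG, one frame per second
HD_CAM_BITS_PER_SEC = 4_000_000  # ~4 Mbps per HD camera stream
NUM_CAMERAS = 8                  # 360° coverage around a truck

SECONDS_PER_DAY = 24 * 3600

def daily_gb(bytes_per_sec: float) -> float:
    """Sustained bytes/sec over a full day, expressed in gigabytes."""
    return bytes_per_sec * SECONDS_PER_DAY / 1e9

gps_gb = daily_gb(GPS_BYTES_PER_FIX * FIXES_PER_SEC)
stills_gb = daily_gb(STILL_BYTES)
video_gb = daily_gb(NUM_CAMERAS * HD_CAM_BITS_PER_SEC / 8)  # bits -> bytes

print(f"GPS only:     {gps_gb:.3f} GB/day")   # ~0.004 GB/day
print(f"1 fps stills: {stills_gb:.1f} GB/day")  # ~8.6 GB/day
print(f"8x HD video:  {video_gb:.0f} GB/day")   # ~346 GB/day
```

Under these assumed rates, raw GPS is trivial to stream, still frames are manageable, and multi-camera HD video is roughly five orders of magnitude heavier than GPS, which is why on-device filtering and compression stay valuable even as bandwidth gets cheaper.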

Sonya Huang: Do you think controls and autonomy could ultimately be running in the cloud, or do you think that’s something people always want to run on device?

Sanjit Biswas: That one I think you’re probably going to see edge compute for a long time. And actually, if we kind of go a little technical for a second, one of the challenges there has been around power and compute and cost, right? So if you think about, like, a Tesla Full Self-Driving computer, it’s a couple thousand bucks, it takes many hundreds of watts of energy, and they’re, like, the first company to be making it really practical at scale. Waymo’s probably a bit more. And so I do think that we will continue to see those sorts of approaches, because safety is, like, such a big deal. Like, you’ve got humans in the cab, you’ve got humans on the road. You don’t want, like, you know, a network outage to affect people’s lives.

Sonya Huang: Yeah. If we’re sitting here in 2030, what do you think is the biggest way that AI has transformed the physical world and physical operations?

Sanjit Biswas: I think a couple of thoughts. One is we’re pretty early, right? We’re at the end of 2025. The sort of AI adoption curve in physical operations, we’re still at the base of it. And so by 2030 I think we’ll have run up the curve where it’ll be much more mainstream in the same way that using apps is much more mainstream now than it was five, ten years ago. So I think you will see the current technologies basically experience a lot more diffusion, like get out there.

I think we’re going to see net new technologies. Like, I’m super excited about augmented reality and wearables. That’s going to make a huge difference to frontline workforces where they have to have their hands free. And it brings AI into their ear. A lot of folks have AirPods in, right? But having, like, sort of visual feedback, being able to run, like, a VLM to understand what’s going on in the environment, that will be possible in 2030. It’s not quite possible yet, but you can just feel it. It’s right on the cusp.

Sonya Huang: Maybe it’ll be glasses, maybe it’ll be some of these new devices that are under wraps that we’ll probably communicate with. Yeah.

Pat Grady: What’s your favorite personal use of AI?

Sanjit Biswas: Personal use of AI? Well, I love the sort of voice models. Like, I talk to AI. Like, whenever I’m driving to or from work, like, I’m chatting with it. And it’s not always about anything specific. Like, it’s kind of whatever’s on my mind. So I love that. I’ve become a big fan of, like, ChatGPT Pulse, for example. Like, it’s just cool that it tells you about—for me, events that are happening in the Bay Area. I’ve got three kids and stuff. It kind of knows their interests, right? So the whole idea that AI could know you better than you know yourself, to some extent, is really profound. So I love that on the personal side. It kind of exposes us to new experiences that we wouldn’t have known about, like, you know, a music performance or something like that that my kids would like.

Sonya Huang: How much of the value that you give customers do you think is thanks to AI versus all the other technology that you’re building?

Sanjit Biswas: It’s an interesting question. We don’t really, like, split it out, because there’s value in the data, but if no one looks at the data, it doesn’t have impact. So one of the things we’ve heard from customers is this concern about data overload. If you have sensor streams from every vehicle and every frontline worker and every asset, what do you do with it all, right? And so AI is pretty awesome in terms of really helping just distill that down to something actionable. So that’s why I don’t think you can split the two apart anymore.

But it’s transformative, it’s game changing. And I spend a lot of time on the road. Like, last week I was in Texas, like, all over: big food distributor, big oil and gas company, spent time with, like, the Home Depot. And it’s just cool hearing how they’re using the data in such creative ways, and ones that they didn’t have on their sort of roadmap when they started with us. But they’re like, hey, if we can use this data to help do time card punches, like, get someone started on their shift and they don’t have to walk to the office, that’s awesome. Or if we can share this data with our end customer and let them know we’re almost there, we’re delayed. That’s pretty cool. So it’s really neat to see these emergent use cases.

Sonya Huang: Really cool. It’s great to dream on all the ways that AI can kind of seep into all these different workflows and everyday lives.

Sanjit Biswas: Yeah. And it’s never any one thing. That’s what I love about this is, like, every quarter we get exposed to some new use case. And a lot of it is just you spend time with the customer, you understand their operation, and then you come up with, like, “Hey, if we did that with—like, would a voice bot that made ETA delivery phone calls be useful to you?” And many of our customers don’t even know that’s possible. They’ve never engaged with a voice bot before. So we’ll do a demo for them, we’ll do a prototype and they’ll say “This is amazing.” Right? And so that’s kind of fun to be able to kind of go back and forth.

Sonya Huang: Yeah.

Pat Grady: That’s very cool.

Sonya Huang: I’m curious about your point of view on—you know, there’s so much talk of US versus China, geopolitics, and how our industrial base really needs to catch up. Our robotics, our manufacturing, our physical AI really needs to catch up. I’m curious if you’ve seen that actually accelerate customer conversations or have an impact on your business in any way?

Sanjit Biswas: Not in my customer conversations. I do think there’s this palpable sense of we need to modernize, and how do we just rethink the way the infrastructure runs? So many of our customers are involved in data center buildouts right now. They’re the energy utility, they’re the construction companies. There’s a lot going on there, and I think that has everyone thinking, okay, what does this mean for us, and what should be different about our business? So there’s a lot of introspection going on. I haven’t gotten the sense it’s like US versus China.

Sonya Huang: Yeah.

Sanjit Biswas: But it’s more of, like, could we do this differently now? We’re firmly in the 21st century, right? Like, it’s what should be different now than the way that previous generations ran these operations?

Sonya Huang: Wonderful. You’ve been a multi-time legendary founder. Any advice for young technical founders who are out building in AI right now?

Sanjit Biswas: I think it’s an amazing time to build. Whether you’ve done this before or you’re starting for the first time, like, just the tools that are available, it’s incredible. And, like, to some extent, you know, everything’s getting magnified or amplified, right? So I think about whether it’s Codex or Cursor or all these, like, automated coding tools, if you have an idea now you can, like, sort of manifest it into something real so much more easily than when we started Samsara. Even when we started Meraki, like, back then we were racking—we’d, like, buy servers from Dell and, like, take them to the data center and set them up. Like, can you imagine just how slow that feels now?

Sonya Huang: I can’t even imagine.

Sanjit Biswas: It’s actually hard to imagine. But that is happening, and we will look back 10 years from now and say, “Can you believe we did X?” Right? And I don’t even know what X is, but it will feel so different. So it’s fun to be on these, like, exponential curves. And the best place to be on that is to be building, right?

Pat Grady: Yeah.

Sonya Huang: Really cool. Thank you so much for taking the time to share your story and what you all are up to on the AI side at Samsara.

Sanjit Biswas: Thanks. Thanks for having me.
