This episode reveals how Structure Sensors are reshaping healthcare technology, emphasizing their role in improving patient care through accurate 3D scanning. Ravi Shah shares insights into the importance of collaboration and data sharing in the medical field, discussing the intricacies of their hardware, software, and the journey toward providing better solutions for clinicians.
• Exploring Ravi's journey from tech to healthcare
• The critical role of 3D scanning in patient outcomes
• Importance of open platforms and data sharing
• How Structure Sensors ensure accuracy and consistency
• Practical advice for clinicians integrating 3D technology
• The promise of technology in narrowing healthcare gaps
• Insights into user-friendly developments for practitioners
• Future aspirations for quality care
Special thanks to Advanced 3D for sponsoring this episode.
00:00:00.620 --> 00:00:03.810 Welcome to Season 10 of the Prosthetics and Orthotics Podcast.
00:00:03.810 --> 00:00:11.788 This is where we chat with experts in the field, patients who use these devices, physical therapists and the vendors who make it all happen.
00:00:11.788 --> 00:00:20.131 Our goal: to share stories, tips and insights that ultimately help our patients get the best possible outcomes.
00:00:20.131 --> 00:00:23.207 Tune in and join the conversation.
00:00:23.207 --> 00:00:27.310 We are thrilled you are here and hope it is the highlight of your day.
00:00:27.310 --> 00:00:44.332 Hello everyone, this is Brent Wright and Yoris Peels with another episode of the Prosthetics and Orthotics podcast, and I am super excited to have Ravi Shah on the show today to share a little bit about his journey with Structure Sensors.
00:00:44.332 --> 00:00:54.027 There's a lot of history there, a lot of stuff to get into, so I'm really, really excited to hear some of that journey and where you are headed as well.
00:00:54.027 --> 00:00:55.110 So welcome to the show.
00:00:55.843 --> 00:00:56.670 Thank you so much, Brent.
00:00:56.670 --> 00:00:58.962 I really appreciate the warm welcome.
00:00:58.962 --> 00:01:01.390 At least for me, I think it's an absolute pleasure to be invited to the show.
00:01:01.390 --> 00:01:10.923 I love the work that you do in the industry and I really love listening to the different perspectives that you bring to what's happening and how we can make everything a little bit better.
00:01:10.923 --> 00:01:13.030 So, thank you, it's really an honor to be invited.
00:01:13.760 --> 00:01:21.548 Yeah well, I'm really excited to dive into some of this stuff, and, you know, what's interesting about doing this podcast is that my mind changes.
00:01:21.548 --> 00:01:25.073 So I guess you kind of have to be open-minded a little bit.
00:01:25.073 --> 00:01:38.805 But I think this says a little bit about life as well: if you have at least an open mind and try to listen and learn and be okay with changing your mind, I think that's a good way to live.
00:01:39.346 --> 00:01:44.406 Yeah, no, a hundred percent, I think, especially working with a lot of clinicians and in the healthcare context.
00:01:44.406 --> 00:01:45.548 I don't have that background.
00:01:45.548 --> 00:01:55.322 I come from tech, and so I find that I don't know very much, and every day I'm learning a little bit, and so for me, I find that opinions are really fluid.
00:01:55.322 --> 00:02:02.746 I think that, especially given the state of where healthcare is today, the use of all these workflows, any kind of 3D workflow, is really nascent.
00:02:02.746 --> 00:02:14.522 And so, in terms of technology, we're really at just the beginning of it and I'm really excited to see where we're going to go and if we can make it a little bit better across the board and we can help all parts of the industry get a little bit better.
00:02:14.522 --> 00:02:16.508 You know, I like to say, a rising tide lifts all boats.
00:02:16.508 --> 00:02:20.765 I've always held that to be true, and so I'm excited to see where we're going.
00:02:21.587 --> 00:02:28.949 Yeah, well, we always ask the question: okay, so how did your journey land you at Structure?
00:02:28.949 --> 00:02:39.742 So I'd love to hear it. You mentioned already, you foreshadowed, that you have a background in tech and some pretty cool stuff there, and then you've kind of brought your talents to Structure.
00:02:39.742 --> 00:02:41.545 So, yeah, share a little bit about that journey.
00:02:42.127 --> 00:02:43.147 Yeah, no, thank you, Brent.
00:02:43.147 --> 00:02:46.252 I'm sort of a giant nerd. I guess it's not "sort of";
00:02:46.252 --> 00:02:47.794 it's really who I am.
00:02:47.794 --> 00:02:55.251 I was actually at Google for about 12 years before I joined Occipital, the original company that created the Structure Sensor.
00:02:55.251 --> 00:02:58.444 You know, at Google I spent my time doing a little bit of everything.
00:02:58.444 --> 00:03:13.950 I've done everything from answering support phones to working with partnerships, to working with engineering and being a product manager, and so I've sort of run the gamut there. I joined Occipital in 2017.
00:03:13.971 --> 00:03:15.697 I was really looking for a set of technologies that would make people's lives better.
00:03:15.697 --> 00:03:36.427 You know, I had worked in mobile and Android for a really long time before that and we thought we were making the world better because we were making mobile devices ubiquitous, and when you can give somebody this beautiful device that allows them to ask a question, get an answer instantly at least for me, that was sort of like Andrew Carnegie's vision of the library, like yeah, it was the great equalizer.
00:03:36.427 --> 00:03:40.890 You could change the way that people thought and they worked, and we saw it all over the world.
00:03:40.890 --> 00:03:55.945 We worked everywhere, from the streets of London to the slums of Delhi, to understand how people use these devices and how we could really get them into the hands of everyone, and, at least for me, I was looking for the next set of technologies that I thought would help humans and sort of give them superpowers.
00:03:55.945 --> 00:04:02.530 And when I looked ahead, it wasn't just like what we had on mobile today, where, you know, you could ask a question and get an answer.
00:04:02.530 --> 00:04:23.221 It was a set of technologies on the horizon like, namely, computer vision and machine learning and these little sensors that are in your devices, and all of a sudden we could get to the point where, instead of having to ask a question, our devices could just answer those questions for us, and it was a whole nother set of superpowers, and I was really looking for a way to move in that direction with my career.
00:04:23.221 --> 00:04:32.608 I knew I didn't want to do it at Google, I like to tell people, google grew up and I didn't, and so I was introduced to this little startup called Occipital in Boulder, where we had moved, and it was just beautiful.
00:04:32.608 --> 00:04:42.653 They had a lot of beautiful technology, just a lot of products and a lot of tech there, and so I actually joined, originally, to lead the AR/VR team in 2017.
00:04:42.653 --> 00:04:56.122 I joined Structure in late 2019, along with Paolo, who is now my partner in crime.
00:04:56.141 --> 00:04:57.826 We found this beautiful, beautiful team that was selling sensors.
00:04:57.826 --> 00:05:13.290 Nobody internally really knew who we were selling them to, and so we dug in and we realized that healthcare had sort of adopted these sensors and there were like 100 apps or more that were really just designed to treat humans, like you know, to analyze the body and build custom therapies.
00:05:13.290 --> 00:05:18.432 And you know, paolo, bless his heart, he really pushed for this to be our focus.
00:05:18.432 --> 00:05:36.733 You know, I was a little skeptical at first because, you know, the company really had been focused on things like AR, VR, robotics, but over time I came to understand that it really was the thing that we did, and so, yeah, we spent our time really digging into it and we found that there was this huge user base that we didn't serve at all.
00:05:36.733 --> 00:05:43.660 Honestly, I think people made these beautiful apps on our platform and we sort of left them to their own devices and sold sensors, and that was it.
00:05:43.660 --> 00:05:48.072 And so Paolo and I really spent the next couple of years trying to focus the team on those use cases.
00:05:48.072 --> 00:05:56.189 I know the Mark II that had launched wasn't a really good fit for the healthcare use case, and so we spent a long time trying to refocus the team back towards that.
00:05:56.189 --> 00:06:00.312 And so, you know, we launched the Structure Sensor Pro in 2021.
00:06:01.600 --> 00:06:19.404 And in 2022, we convinced the board: hey, having a really healthcare-focused startup in the middle of a room-scanning and interior-understanding spatial computing startup just wasn't a good fit. Neither would thrive.
00:06:19.404 --> 00:06:21.427 And so we had two businesses attached at the hip.
00:06:21.427 --> 00:06:42.600 We convinced the board to let us spin out, and so we spun out in 2022, really with that mission in mind, that idea that we could build a beautiful healthcare platform that if we could help make it really easy to capture information about the human body and analyze it and give tools to clinicians, they would be able to diagnose people anywhere around the world.
00:06:42.600 --> 00:06:46.182 And really we looked at who's using our products and we were everywhere.
00:06:46.182 --> 00:06:55.202 We were ubiquitous. Whether it's MSF scanning children in austere environments who've stepped on landmines, to fit them for prosthetics, or the Mayo Clinic, you can find a Structure Sensor.
00:06:55.202 --> 00:07:15.946 And so this idea that we could take that platform and use it to improve the lives of humans and make sure that, no matter who you are, where you are, you get the same quality of diagnosis, really fueled our desire to go out on our own, and that's really what we've been focused on since we spun out is how do we make this a big, beautiful, ubiquitous thing where it doesn't matter who you are around the world, you know?
00:07:15.966 --> 00:07:22.668 I like to say that there's this huge, huge gap in terms of quality of care when you look at clinicians.
00:07:22.668 --> 00:07:46.012 I used to live in the Bay Area, and my daughter's pediatrician was the chair of the department of pediatrics at Stanford, and so she would know things about our children that I couldn't even fathom anybody deriving from just a quick look, and it was because she wrote, published and consumed peer-reviewed research 20 to 30 hours a week.
00:07:46.012 --> 00:07:51.451 We moved to Boulder and our doctors are fantastic, but the clinicians don't have that luxury.
00:07:51.451 --> 00:08:04.233 They have to treat 30, 40 kids a day and when you're working you know 10, 12 hours a day, you don't have time to be reading everything that's in the latest medical journals.
00:08:04.800 --> 00:08:20.184 The reason why I think the technology is so powerful, this confluence of computer vision and machine learning and those types of things, is we can provide those insights to everyone. If we as a tool set can sort of make that quality of care ubiquitous, so that everybody, anywhere,
00:08:20.184 --> 00:08:23.565 no matter what resources they have access to, can get that same diagnosis,
00:08:23.565 --> 00:08:26.807 we've done something really good, and so that's sort of our mission.
00:08:26.807 --> 00:08:33.783 I know everybody thinks we make sensors, but really we make a lot of software.
00:08:33.783 --> 00:08:34.471 We try to collaborate with others.
00:08:34.471 --> 00:08:41.625 If folks have a sensor of their own, we try to see if we can support it in our platform as well, and we're really just trying to see how do we elevate the quality of care for humans.
00:08:42.648 --> 00:08:44.412 I think that's super cool, and you know.
00:08:44.412 --> 00:08:54.649 One of the things, though, that is interesting, that I would love for you to dive in and give us the nerdy answer to is why look at something that's separate, right?
00:08:54.649 --> 00:08:59.908 So, like, people are like, well, you've got your Androids and you've got your iPhones and you have your different versions.
00:08:59.908 --> 00:09:12.985 I mean, these things are good, cameras are good, hardware is good, but we know there's so much variety, right? You've got good stuff, bad stuff, stuff that gets deprecated, stuff that doesn't play well together, all that.
00:09:12.985 --> 00:09:15.207 So why even do something that's separate?
00:09:15.639 --> 00:09:21.144 I think that's a really good question, you know, and the real answer is that we'll work with anything.
00:09:21.144 --> 00:09:36.210 Just, very truthfully, I think that this idea that we're trying to build something really ubiquitous implies that we need to support whatever device you have, and so we'll give you the best 3D reconstruction, landmark detection measurements, things like that, that we can, regardless of what device you come to us with.
00:09:36.210 --> 00:09:42.121 You know, we even go as far as working on every device via photogrammetry internally.
00:09:42.121 --> 00:09:46.610 We don't have something that we feel is good enough for clinical use yet, but we're still working on that.
00:09:46.610 --> 00:09:52.331 But the real answer for why we make our own hardware is that there are actually a few big reasons.
00:09:52.331 --> 00:09:55.903 One is because we can ensure consistency and quality.
00:09:55.903 --> 00:10:02.124 You know, I think the level of accuracy that you can consistently get from a dedicated device is high.
00:10:02.124 --> 00:10:04.227 And we work very closely with Apple.
00:10:04.227 --> 00:10:07.433 You know, I think we have team members, former team members there.
00:10:07.433 --> 00:10:08.664 I have former colleagues there.
00:10:08.664 --> 00:10:11.590 I have former bosses and employees who work at Apple.
00:10:11.590 --> 00:10:17.849 My CTO and partner, Paolo, he has a lot of colleagues there too, you know, in the calibration teams and things like that.
00:10:18.320 --> 00:10:24.171 The problem is that when you look at even the active depth sensing on these devices, we're not the consumers.
00:10:26.745 --> 00:10:27.567 They're not made for clinical purposes.
00:10:27.567 --> 00:10:32.904 The consumer of even the front-facing true depth camera really is the face ID team at Apple.
00:10:32.904 --> 00:10:34.027 It's designed for them.
00:10:34.027 --> 00:10:51.043 It's a black box, and so, whether it's hardware changes from year to year that change the baseline or the projector, or parts that are becoming more efficient, or minor iOS revisions changing the calibration parameters and things like that,
00:10:51.043 --> 00:10:54.772 it's like the Wild West if you're trying to consume from many devices.
00:10:54.772 --> 00:10:57.730 And so, like I said, we actually do set up robot arms, we calibrate.
00:10:57.730 --> 00:11:11.042 We have a proprietary calibration system that we built that actually we use to calibrate iOS devices and things like that in order to get the best possible results, but nothing beats having the consistency and the accuracy of having your own hardware.
00:11:11.042 --> 00:11:13.288 So that's one piece and the other piece really is.
00:11:13.830 --> 00:11:30.288 You know, I've worked at this intersection of hardware and software my whole career and what I found is that the discipline required to make your own hardware is important because it allows you to understand how it's used in the real world, Like what are the physical interfaces and how that works.
00:11:30.288 --> 00:11:40.460 And when you make your own hardware, I really fundamentally believe that your software is better, because you have a full understanding of that entire ecosystem, of that full usage from top to bottom.
00:11:40.460 --> 00:11:41.865 The opposite is true as well.
00:11:45.464 --> 00:11:51.565 I think making our own SDK, and even making our own internal apps and helping people build their apps and going out in the wild and tweaking them, makes us better at hardware.
00:11:51.565 --> 00:11:58.066 We can build the best thing in my mind if we touch all of those things, and it's more than just 3D sensing.
00:11:58.086 --> 00:12:01.240 You know, at least for us, we do a lot of work as we look beyond that.
00:12:01.240 --> 00:12:04.285 We have a lot of work done in volumetric reconstruction.
00:12:04.285 --> 00:12:11.015 We have a lot of technology that we've built in-house for how do you marry different types of sensing modalities?
00:12:11.015 --> 00:12:24.191 So it starts with 3D, but in a few years we're going to have other types of insights. We're going to layer temperature, microbial load and other types of sensing over this, and all of a sudden you start to see intersectionality between these things.
00:12:24.191 --> 00:12:32.585 Just by adding temperature to a Structure Sensor, being able to 3D reconstruct and then extracting landmarks from that using our own hardware,
00:12:32.764 --> 00:12:36.013 Now we can diagnose things like diabetic ulcers.
00:12:36.013 --> 00:12:39.888 We can find them before they're a problem, you know.
00:12:39.888 --> 00:12:41.412 We can find pressure injuries.
00:12:41.412 --> 00:12:46.708 We can find things that are non-obvious, we can help with wounds and things like that. By adding microbial load,
00:12:46.708 --> 00:12:48.232 we can go even deeper into wounds.
00:12:48.232 --> 00:12:51.591 Looking at dielectric properties of the skin, we can start finding melanomas.
00:12:51.672 --> 00:12:57.510 And so not only can we do more by making our own hardware, you know, we can add sensing that just doesn't exist in the world.
00:12:57.510 --> 00:13:01.711 We can start finding intersectionality between these different modalities.
00:13:01.711 --> 00:13:08.990 And if we can do that, we'll hopefully give clinicians the tools to find insights about the human body that we've not found before.
00:13:08.990 --> 00:13:21.484 You know, mapping the human phenotype doesn't sound as exciting as mapping the human genotype, but I think there hasn't been enough work done there to really find all those intersectionalities, and I think we can help just make the world a better place if we do that.
00:13:21.484 --> 00:13:32.967 So yeah, I'm sorry it was a very long answer, but you know, we make our own hardware because it's important: it makes us better, it makes our SDK better, it makes the apps better, it makes everything better.
00:13:32.967 --> 00:13:37.020 But also we can do a heck of a lot more. We can make a significantly more accurate, more consistent experience for clinicians,
00:13:37.020 --> 00:13:39.562 and we can just start doing things that nobody's really done before.
00:13:40.482 --> 00:13:55.812 It's also interesting, one thing here, that I went to two different museums and there were three different art pieces that were all kind of interactive 3D scanning art pieces, and all of them used the Microsoft Kinect, right, which was discontinued, I think, 15 years ago, if I'm not mistaken.
00:13:56.712 --> 00:14:05.138 It was discontinued the same day that they stopped making projectors for the original structure sensor because PrimeSense was bought by Apple in 2013.
00:14:09.960 --> 00:14:10.523 Exactly, that was that one moment.
00:14:10.523 --> 00:14:13.256 But still, 15 years later, they're still using these Kinect things to do this, because there isn't an alternative, right?
00:14:13.256 --> 00:14:31.705 There isn't an alternative that somebody at a museum, or an artist, or some organizer can turn to, to have a kind of hackable sensor thing. And every single time I've paid attention to this, and I've seen this more than a few dozen times now in museums where you have some kind of thing where you wave your arms or something, it's always the Kinect sensor. And I just checked on Amazon.
00:14:31.705 --> 00:14:34.705 You can still buy them: Xbox One Kinect sensor.
00:14:34.705 --> 00:14:37.946 You can just buy them, and that must be what's keeping this alive.
00:14:37.946 --> 00:14:46.383 So I think, no, I think you're right.
00:14:46.403 --> 00:14:48.187 And, by the way, it's a pleasure to meet you.
00:14:48.187 --> 00:14:49.590 Sorry I was late.
00:14:49.590 --> 00:14:50.711 No, not at all, actually.
00:14:50.711 --> 00:14:51.634 Life is unpredictable.
00:14:51.860 --> 00:14:53.104 And I love what you guys are doing here.
00:14:53.104 --> 00:14:54.948 And did you, you know?
00:14:54.948 --> 00:14:58.629 Because you could have kind of presented yourselves as, like, the universal scanning platform for everything.
00:15:01.426 --> 00:15:04.336 I mean, because a lot of people have tried doing that and that hasn't really worked for anyone, right?
00:15:04.938 --> 00:15:07.144 Yeah, no, I mean I guess just very transparently.
00:15:07.144 --> 00:15:13.740 I think, like I was telling Brent earlier, we're really nascent with regard to the use of 3D technology in healthcare.
00:15:13.740 --> 00:15:27.288 Now, like you said, the Kinect has been gone forever and is still being used for just crazy things out in the wild, and people are doing new, beautiful things with it, and so I think, from a healthcare perspective, the industry moves really slowly.
00:15:27.288 --> 00:15:29.436 Like I think there's this concept in venture capital that drives me nuts.
00:15:29.436 --> 00:15:33.629 I hate it every time I hear it, but they talk about Eroom's law, which is the inverse of Moore's law.
00:15:33.629 --> 00:15:48.648 Instead of silicon, where you get this increased transistor density and speed every couple of years, it's the opposite: medical devices and pharmaceuticals get more expensive and take longer to produce every few years.
00:15:49.100 --> 00:15:55.506 Our real mission is to sort of realign those innovation cycles, and so in healthcare, 3D is still relatively unheard of.
00:15:55.779 --> 00:16:14.085 You know, in an industry like orthotics and prosthetics or in those areas, you see a lot of it, but it's still just the start, and so we're not going to get anywhere meaningful if we try to build an ecosystem or a platform that tries to suck all the air out of the room, like that's never been something I've been interested in.
00:16:14.466 --> 00:16:26.448 Personally, I think if we're going to make the world a better place, it really is going to be by being as open as humanly possible and supporting as many devices as possible and supporting the workflows that people actually have and the devices they have in hand.
00:16:26.448 --> 00:16:38.503 Whether it's something that has one of our sensors attached or not, we're going to do the best that we can with it, and whether it's our cloud system or the SDK, actually, we're pretty ubiquitous. We'll give you measurements on meshes that came from anything.
00:16:38.503 --> 00:16:51.282 Like, somebody could take, I don't know, a block that they see and model it themselves by hand and send it to us, and we'll still give you landmark recognition if we can do it, and I think that's the right way to approach things.
00:16:51.282 --> 00:17:02.134 I think the more open we are and the more we work with other people's technology, the faster things will get better for clinicians, which just means better outcomes for actual humans at the end of it.
00:17:02.625 --> 00:17:09.712 And if you're looking at who's buying and actually using your sensors, who are they? Is orthopedics a big part of that?
00:17:09.712 --> 00:17:11.511 What kind of doctors are using it?
00:17:11.511 --> 00:17:12.630 What kind of patients are using it?
00:17:14.009 --> 00:17:30.869 So I think our biggest single market is really the combination of orthotics and prosthetics, and I hate lumping them together because I think the needs and uses are so nuanced that I could spend the rest of my career understanding just like the workflow of one clinician and not really understand it in its entirety.
00:17:31.070 --> 00:17:32.294 But that is really a big part of it.
00:17:32.294 --> 00:17:34.527 But we do everything, you know, I think we do.
00:17:34.527 --> 00:17:35.469 We have partners.
00:17:35.469 --> 00:17:39.858 You know, we have several partners that we've helped through the FDA 510(k) premarket process.
00:17:39.858 --> 00:17:42.875 Some of them do things like wound care in the VA.
00:17:42.875 --> 00:17:45.252 I think we're approved for that use case.
00:17:45.252 --> 00:17:46.864 We actually do neurosurgery.
00:17:46.864 --> 00:17:59.316 We have a partner called Skia in Korea that we work with for surgical navigation, and they've actually outperformed Medtronic's StealthStation in 80 out of 80 neurosurgery trials, and we're just about done with their FDA premarket notification for that.
00:17:59.496 --> 00:18:02.859 You know we do oh gosh, a thousand little things across the world.
00:18:02.859 --> 00:18:17.508 You know everything from actual healthcare use cases down to non-healthcare use cases, things downstream, like if you go and you want to buy a custom pair of goggles from Smith Optics, we power that along with a lot of other consumer-facing applications.
00:18:17.508 --> 00:18:22.137 Bath Fitter uses us to measure bathrooms really quickly to figure out what will fit.
00:18:22.137 --> 00:18:25.067 I think the use cases are pretty broad.
00:18:25.067 --> 00:18:32.355 In a healthcare context, though, I think that's about 99% of our use, and really what interests the team and what gets us up in the morning is how can we make healthcare better?
00:18:32.355 --> 00:18:33.617 That's cool.
00:18:33.964 --> 00:18:40.115 And then, so one thing is, you're kind of at a crossroads with this healthcare thing, because you could do three things, essentially.
00:18:40.115 --> 00:18:43.559 You could make, like, a super cheap one, you know, $50 for everything.
00:18:43.559 --> 00:18:50.384 Or you could do, yeah, like the wound thing, looking into the wound, or whatever, look at your DNA, I don't know, like the super mega advanced one, right.
00:18:50.384 --> 00:18:54.287 Or you could make 17 different kinds of, you know, different versions for everyone.
00:18:54.287 --> 00:18:58.469 You know, what are you choosing to do, or what are you kind of, like, focusing on?
00:18:58.929 --> 00:19:00.390 Yeah, and I think that's a really good question.
00:19:00.390 --> 00:19:03.932 You know, one of the things I found uh, you know I'm a product guy at heart.
00:19:03.932 --> 00:19:22.520 I've um shipped a lot of really good products but also a lot of clunkers there, and one of the things I've learned is having a strong opinion makes a better product, and so you know, if you're not, if it doesn't feel painful to bring it out, if it doesn't feel like you had to make some compromises in your vision to get the thing out, you end up with something that doesn't have an opinion.
00:19:22.520 --> 00:19:24.281 I like to liken it to the Simpsons.
00:19:24.281 --> 00:19:32.595 There's an episode where Homer's long-lost half-brother finds him. He owns a car company and hires Homer to design a car, and he creates the Homer car.
00:19:32.595 --> 00:19:33.136 It's terrible.
00:19:33.136 --> 00:19:34.196 It's like an everything car.
00:19:34.196 --> 00:19:39.180 He just threw every feature he could ever imagine into it, but it was the worst possible car that ever existed because of that.
00:19:39.220 --> 00:19:41.500 And we take that same lesson internally.
00:19:41.560 --> 00:19:57.059 So one of the reasons we actually do try to support as many sensors as possible is because we know that there are parts of the world, and people, for whom buying a standalone sensor is an unattainable thing, and so for us it's really about building dedicated hardware for the people who need it and the use cases that need it.
00:19:57.240 --> 00:20:16.987 You know, there are many, many use cases that need consistent, accurate, device-to-device performance, right? Like a lot of these partners, especially those who go all the way through the 510(k) premarket process and things like that, and even use cases before that. Like, we talked to people making head orthoses, or braces and stuff like that, scanning children and trying to custom fit those.
00:20:16.987 --> 00:20:22.518 You have to have not only a high level of accuracy, but you have to be robust to movement and things like that.
00:20:22.518 --> 00:20:26.981 And if you're off by a couple of millimeters in those use cases it's a horrible experience for the child.
00:20:26.981 --> 00:20:41.150 So I think we make our own hardware because sometimes you need it, sometimes you need that level of accuracy and consistency, and then for the use cases where you don't, we're happy to support whatever device you have on hand.
00:20:41.150 --> 00:20:41.854 And why is 3D scanning so hard?
00:20:41.854 --> 00:21:03.019 I mean, I used to do these reports, right, about the 3D printing market, and it was like, Ultimaker now has the Ultimaker 3 Plus, you know, or they now have a bigger system, you know, like incremental improvement. And then I'd have the same report for scanning, right, 2015 in scanning, and half the players would be bankrupt and things wouldn't work anymore. It's been a bloodbath, comparatively, even next to 3D printing, which is very competitive in places.
00:21:03.019 --> 00:21:04.949 So why is it that difficult?
00:21:04.949 --> 00:21:06.834 Is it that intersection thing you were talking about before?
00:21:07.256 --> 00:21:07.476 Yeah.
00:21:07.476 --> 00:21:08.467 So I think part of it.
00:21:08.467 --> 00:21:33.001 I actually don't think a lot of the core technology has changed, at least on the geometric side of computer vision. Just like everything else, there are papers written, like, eight years ago that really guide you. And one of our computer vision leads, our calibration lead, when we were interviewing her, we jokingly talked about Google's revolutionary 3D telepresence system, Project Starline.
00:21:33.001 --> 00:21:36.875 We talked about how she had actually developed basically the exact same concept back in 2001.
00:21:36.875 --> 00:21:43.713 And so a lot of these core technologies haven't changed.
00:21:43.713 --> 00:21:50.573 The problem is that I think a lot of companies approach this as, oh, you capture something and now you go figure out what to do with it.
00:21:50.573 --> 00:22:02.132 And I think that if that's all you think about when it comes to 3D capture, like you're sort of doing a disservice to the customer because it takes an entire workflow for it to be a thing.
00:22:02.132 --> 00:22:04.373 Just capturing something doesn't matter.
00:22:04.373 --> 00:22:12.576 And so I think the reason we invest so much time and effort into the software side of things is that the power doesn't come from us and the things that we make.
00:22:12.576 --> 00:22:19.169 It comes from what people do with it and that SDK and that ability to build your own apps and find new ways of doing things is really important.
00:22:19.169 --> 00:22:23.788 And then I think it's that intersection of geometric computer vision and machine learning.
00:22:23.788 --> 00:22:35.906 That's where a lot of the innovation comes from, because a lot of companies now are actually eschewing these original geometric computer vision techniques, thinking they're just going to machine-learn their way to every answer.
00:22:35.906 --> 00:22:37.128 And that's a big mistake.
00:22:37.128 --> 00:22:44.715 I think geometric computer vision is really good at solving certainties and machine learning whether it's computer vision or otherwise is really good at solving for uncertainties.
00:22:44.715 --> 00:22:54.628 And if you can constrain the problem and understand the certainties and then use machine learning to solve the things that are uncertain, you have this beautiful, beautiful set of technologies that interplay.
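The interplay Ravi describes, geometry for the certainties and learning for the uncertainties, can be sketched in a few lines. This is an illustrative toy only, not Structure's actual pipeline: the simulated scan, the plane model, and the polynomial "learned" correction are all invented for the example. A closed-form plane fit (the geometric, certain step) explains most of a simulated depth scan, and a small fitted model (a stand-in for a machine-learned component) absorbs the residual sensor warp the geometry cannot express.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated depth samples of a flat surface: z = 0.5x - 0.2y + 1,
# plus a smooth systematic distortion (the "uncertainty" a real sensor adds).
x = rng.uniform(-1.0, 1.0, 200)
y = rng.uniform(-1.0, 1.0, 200)
z_true = 0.5 * x - 0.2 * y + 1.0
z_meas = z_true + 0.05 * np.sin(3.0 * x)   # unmodeled sensor warp

# Step 1 (geometric certainty): closed-form least-squares plane fit,
# a classic geometric computer vision step with no learning involved.
A = np.column_stack([x, y, np.ones_like(x)])
plane, *_ = np.linalg.lstsq(A, z_meas, rcond=None)
residual = z_meas - A @ plane

# Step 2 (learned uncertainty): fit a tiny basis model to the residual
# the plane could not explain -- a stand-in for a learned correction.
F = np.column_stack([np.sin(3.0 * x), np.cos(3.0 * x)])
coef, *_ = np.linalg.lstsq(F, residual, rcond=None)
corrected = z_meas - F @ coef

plane_err = np.sqrt(np.mean(residual ** 2))
combined_err = np.sqrt(np.mean((corrected - A @ plane) ** 2))
print(f"plane-only RMS error:     {plane_err:.4f}")
print(f"geometry + learned error: {combined_err:.4f}")
```

Constraining the problem geometrically first means the learned stage only has to model a small, structured residual, which is exactly the division of labor described above.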
00:22:54.628 --> 00:22:57.538 So I think we're going to start seeing a lot more innovation in this space.
00:22:57.778 --> 00:23:01.353 I think it just takes time for adoption until you have these end-to-end workflows.
00:23:01.353 --> 00:23:12.255 And it's expensive to do end-to-end workflows. One of the things we find is that a lot of people make medical devices on our platform because they couldn't otherwise.
00:23:12.255 --> 00:23:14.307 You know, there are these companies that
00:23:14.307 --> 00:23:24.146 have to carry an R&D team for 10 years, and then they have to go through the FDA process, and they've burned $50 to $100 million by the end of that, and they go bankrupt.
00:23:24.146 --> 00:23:32.998 And on our platform, we find that we shave almost a decade off of that time, and they save something like 98% of the cost in getting to market.
00:23:33.118 --> 00:23:47.817 And so I think that's where the power is: these end-to-end use cases. The better we make the platform, the more capabilities we give people, and the more we make the Structure Sensor like a tricorder, the more adoption we'll see, just because there are more possibilities and more workflows.
00:23:47.817 --> 00:23:50.634 So I think you're seeing a lot of it in the 3D printing space,
00:23:50.634 --> 00:24:02.278 because you generate this tactile object, right? You have a problem and you're like, oh, my door broke and I need to fix something, or there's a unique part I need, or I'm prototyping something, or I'm creating a custom therapy.
00:24:02.278 --> 00:24:04.961 There's a direct tactility to the thing that you generate.
00:24:04.961 --> 00:24:13.460 I think on the capture side it's a lot harder to wrap your head around, because without that other side of the workflow,
00:24:13.480 --> 00:24:15.224 it doesn't mean anything, it's just a mesh. Right.
00:24:15.224 --> 00:24:25.737 One thing that I'd love for you to expand on too: I know you said that your technology goes a lot into the prosthetic and orthotic field, and you're looking at it from the outside in.
00:24:25.737 --> 00:24:29.519 And I always love to ask this question because I've kind of grown up in the field.
00:24:29.519 --> 00:24:32.148 You know, I started when I was 15.
00:24:32.148 --> 00:24:36.987 I knew this was what I wanted to do since I was in second grade, so it's been a long time, right?
00:24:36.987 --> 00:24:38.973 So sometimes you get blinders on.
00:24:38.973 --> 00:24:43.676 So, looking into the field from the outside, and don't worry about hurting my feelings, okay?
00:24:43.676 --> 00:24:46.769 I mean, what do you see?
00:24:46.769 --> 00:24:53.108 You know, I think we've talked before like we have a passionate group of people trying to change people's lives.
00:24:53.108 --> 00:24:59.309 We are also stubborn and don't like to learn. So, what do you see?
00:24:59.510 --> 00:25:00.855 And where do we go, right?
00:25:00.855 --> 00:25:02.528 I think it's challenging.
00:25:02.528 --> 00:25:04.835 I think the industry is really in this time of flux.
00:25:04.835 --> 00:25:13.476 There are a lot of providers. You know, I always look at things the way I would at Google: what are the trends, and who are we selling to?
00:25:13.476 --> 00:25:19.578 Those kinds of things, because I'm a robot, and that's the best way for me to actually try to understand how humans actually use products.
00:25:19.578 --> 00:25:43.327 But the demographic is aging and so there's almost like a generational shift, like in the industry, which is really interesting because I think in the conversations that I've had with clinicians and the time I've spent, I see a lot of people who really are like truly artisans, like they've mastered their craft and their way of treating not just like understanding their patients but treating them and they all have their tricks and their things that go into making like a beautiful orthotic or a beautiful prosthetic.
00:25:43.327 --> 00:25:57.771 And I think the biggest thing that I see for the industry is that there's almost a tacit admission that the more we can share and the more we can understand the best ways to treat people, the more consistent and higher quality we can make orthotics.
00:25:57.771 --> 00:26:11.992 You know, I talked to some clinicians who said, I actually don't even take just one scan anymore, I take multiple so I can understand how the foot changes when loaded, so that I can make the right thing. Those types of things, I think, are starting to enter the common vernacular.
00:26:11.992 --> 00:26:15.626 Actually, from the outside looking in, I'm really excited, honestly.
00:26:16.568 --> 00:26:17.912 I don't have any shade to throw.
00:26:17.912 --> 00:26:19.034 You know people don't.
00:26:19.034 --> 00:26:27.085 People get set in their ways, myself included, but that doesn't mean that there's not a lot of innovation happening in the industry and there's not a lot of people pushing to make it better.
00:26:27.385 --> 00:26:47.580 One of the things that has been a challenge for us is that, because we don't make the apps themselves, we don't necessarily always have a direct tie to the clinicians. So we've tried to get that, and I found that because the industry was probably a little more distributed, more like a lot of little satellite private practices, for a very long time.