
Show Notes

This episode reveals how Structure Sensors are reshaping healthcare technology, emphasizing their role in improving patient care through accurate 3D scanning. Ravi Shah shares insights into the importance of collaboration and data sharing in the medical field, discussing the intricacies of their hardware, software, and the journey toward providing better solutions for clinicians.

• Exploring Ravi's journey from tech to healthcare 
• The critical role of 3D scanning in patient outcomes 
• Importance of open platforms and data sharing 
• How Structure Sensors ensure accuracy and consistency 
• Practical advice for clinicians integrating 3D technology 
• The promise of technology in narrowing healthcare gaps 
• Insights into user-friendly developments for practitioners 
• Future aspirations for quality care


Special thanks to Advanced 3D for sponsoring this episode.

Show Transcript

WEBVTT

00:00:00.620 --> 00:00:03.810
Welcome to Season 10 of the Prosthetics and Orthotics Podcast.

00:00:03.810 --> 00:00:11.788
This is where we chat with experts in the field, patients who use these devices, physical therapists and the vendors who make it all happen.

00:00:11.788 --> 00:00:20.131
Our goal: to share stories, tips and insights that ultimately help our patients get the best possible outcomes.

00:00:20.131 --> 00:00:23.207
Tune in and join the conversation.

00:00:23.207 --> 00:00:27.310
We are thrilled you are here and hope it is the highlight of your day.

00:00:27.310 --> 00:00:44.332
Hello everyone, this is Brent Wright and Yoris Peels with another episode of the Prosthetics and Orthotics podcast, and I am super excited to have Ravi Shah on the show today to share a little bit about his journey with Structure Sensors.

00:00:44.332 --> 00:00:54.027
There's a lot of history there, a lot of stuff to get into, so I'm really, really excited to hear some of that journey and where you are headed as well.

00:00:54.027 --> 00:00:55.110
So welcome to the show.

00:00:55.843 --> 00:00:56.670
Thank you so much, Brent.

00:00:56.670 --> 00:00:58.962
I really appreciate the warm welcome. At least for me,

00:00:58.962 --> 00:01:01.390
I think it's an absolute pleasure to be invited to the show.

00:01:01.390 --> 00:01:10.923
I love the work that you do in the industry and I really love listening to the different perspectives that you bring to what's happening and how we can make everything a little bit better.

00:01:10.923 --> 00:01:13.030
So, thank you, it's really an honor to be invited.

00:01:13.760 --> 00:01:21.548
Yeah well, I'm really excited to dive into some of this stuff and you know, what's interesting about doing this podcast is, like my mind changes.

00:01:21.548 --> 00:01:25.073
So I guess you kind of have to be open-minded a little bit.

00:01:25.073 --> 00:01:38.805
But, like, and I think this just, you know, says a little bit about life as well: if you have at least an open mind and try to listen and learn and be okay with changing your mind, I think that's a good way to live.

00:01:39.346 --> 00:01:44.406
Yeah, no, a hundred percent, I think, especially working with a lot of clinicians and in the healthcare context.

00:01:44.406 --> 00:01:45.548
I don't have that background.

00:01:45.548 --> 00:01:53.234
I come from tech, and so I find that I don't know very much, and every day I'm learning a little bit, and so, you know, for me,

00:01:53.234 --> 00:01:55.322
I find that opinions are really fluid.

00:01:55.322 --> 00:02:02.746
I think that, especially given the state of where healthcare is today, like the use of all these workflows, any kind of 3D workflow is really nascent.

00:02:02.746 --> 00:02:14.522
And so, in terms of technology, we're really at just the beginning of it and I'm really excited to see where we're going to go and if we can make it a little bit better across the board and we can help all parts of the industry get a little bit better.

00:02:14.522 --> 00:02:16.508
You know, I like to say, a rising tide lifts all boats.

00:02:16.508 --> 00:02:20.765
I've always held that to be true, and so I'm excited to see where we're going.

00:02:21.587 --> 00:02:28.949
Yeah, well, we always ask the question: okay, so how did your journey land you at Structure?

00:02:28.949 --> 00:02:39.742
So I'd love to hear about it. You mentioned already, you foreshadowed, that you have, uh, you know, a background in tech and, uh, some pretty cool stuff there, and then you've kind of brought your talents to Structure.

00:02:39.742 --> 00:02:41.545
So, yeah, share a little bit about that journey.

00:02:42.127 --> 00:02:43.147
Yeah, no, thank you, Brent.

00:02:43.147 --> 00:02:46.252
I'm sort of a giant nerd. I guess it's not

00:02:46.252 --> 00:02:47.794
sort of, it's really who I am.

00:02:47.794 --> 00:02:55.251
I was actually at Google for about 12 years before I joined Occipital, the original company that created the Structure Sensor.

00:02:55.251 --> 00:02:58.444
You know, at Google I spent my time doing a little bit of everything.

00:02:58.444 --> 00:03:13.950
I've done everything from answering support phones to working with partnerships, to working with engineering and being a product manager, and so I've sort of run the gamut there. I joined Occipital, actually, in 2017.

00:03:13.971 --> 00:03:15.697
I was really looking for a set of technologies that would make people's lives better.

00:03:15.697 --> 00:03:36.427
You know, I had worked in mobile and Android for a really long time before that and we thought we were making the world better because we were making mobile devices ubiquitous, and when you can give somebody this beautiful device that allows them to ask a question, get an answer instantly at least for me, that was sort of like Andrew Carnegie's vision of the library, like yeah, it was the great equalizer.

00:03:36.427 --> 00:03:40.890
You could change the way that people thought and they worked, and we saw it all over the world.

00:03:40.890 --> 00:03:55.945
We worked everywhere, from the streets of London to the slums of Delhi, to understand how people use these devices and how we could really get them into the hands of everyone, and, at least for me, I was looking for the next set of technologies that I thought would help humans and sort of give them superpowers.

00:03:55.945 --> 00:04:02.530
And when I looked ahead, it wasn't just like what we had on mobile today, where you know you could ask a question, you can answer it.

00:04:02.530 --> 00:04:23.221
It was a set of technologies on the horizon like, namely, computer vision and machine learning and these little sensors that are in your devices, and all of a sudden we could get to the point where, instead of having to ask a question, our devices could just answer those questions for us, and it was a whole nother set of superpowers, and I was really looking for a way to move in that direction with my career.

00:04:23.221 --> 00:04:32.608
I knew I didn't want to do it at Google, I like to tell people, google grew up and I didn't, and so I was introduced to this little startup called Occipital in Boulder, where we had moved, and it was just beautiful.

00:04:32.608 --> 00:04:42.653
They had a lot of beautiful technology, there's just a lot of products and a lot of tech there, and so I actually joined, originally, to lead the AR/VR team in 2017.

00:04:42.653 --> 00:04:56.122
I joined Structure in late 2019, along with Paolo, who is now my partner in crime.

00:04:56.141 --> 00:04:57.826
But we found this beautiful, beautiful team that was selling sensors.

00:04:57.826 --> 00:05:13.290
Nobody internally really knew who we were selling them to, and so we dug in and we realized that healthcare had sort of adopted these sensors and there were like 100 apps or more that were really just designed to treat humans, like you know, to analyze the body and build custom therapies.

00:05:13.290 --> 00:05:18.432
And you know, paolo, bless his heart, he really pushed for this to be our focus.

00:05:18.432 --> 00:05:36.733
You know, I was a little skeptical at first because, you know, the company really had been focused on things like AR, VR, robotics, but over time I came to understand that it really was the thing that we did, and so, yeah, we spent our time really digging into it and we found that there was this huge user base that we didn't serve at all.

00:05:36.733 --> 00:05:43.660
Honestly, I think people made these beautiful apps on our platform and we sort of left them to their own devices and sold sensors, and that was it.

00:05:43.660 --> 00:05:48.072
And so Paolo and I really spent the next couple of years trying to focus the team on those use cases.

00:05:48.072 --> 00:05:56.189
I know the Mark II that had launched wasn't a really good fit for the healthcare use case, and so we spent a long time trying to refocus the team back towards that.

00:05:56.189 --> 00:06:00.312
And so, you know, we launched the Structure Sensor Pro in 2021.

00:06:01.600 --> 00:06:19.404
And in 2022, we convinced the board: hey, having a really healthcare-focused startup in the middle of a room-scanning, interior-understanding, spatial-computing startup just wasn't a good fit. Neither would thrive.

00:06:19.404 --> 00:06:21.427
And so we had two businesses attached at the hip.

00:06:21.427 --> 00:06:42.600
We convinced the board to let us spin out, and so we spun out in 2022, really with that mission in mind, that idea that we could build a beautiful healthcare platform that if we could help make it really easy to capture information about the human body and analyze it and give tools to clinicians, they would be able to diagnose people anywhere around the world.

00:06:42.600 --> 00:06:46.182
And really we looked at who's using our products and we were everywhere.

00:06:46.182 --> 00:06:55.202
We were ubiquitous: whether it's MSF scanning children in austere environments who've stepped on landmines, to fit them for prosthetics, or the Mayo Clinic, you can find a Structure Sensor.

00:06:55.202 --> 00:07:15.946
And so this idea that we could take that platform and use it to improve the lives of humans and make sure that, no matter who you are, where you are, you get the same quality of diagnosis, really fueled our desire to go out on our own, and that's really what we've been focused on since we spun out is how do we make this a big, beautiful, ubiquitous thing where it doesn't matter who you are around the world, you know?

00:07:15.966 --> 00:07:22.668
I like to say that there's this huge, huge gap in terms of quality of care when you look at clinicians.

00:07:22.668 --> 00:07:46.012
I used to live in the Bay Area, and my daughter's pediatrician was actually the chair of the department of pediatrics at Stanford, and so she had insane insight, like she would know things about our children that I couldn't even fathom anybody deriving from just a quick look, and it was because she wrote and published and consumed peer-reviewed research 20 to 30 hours a week.

00:07:46.012 --> 00:07:51.451
We moved to Boulder and our doctors are fantastic, but the clinicians don't have that luxury.

00:07:51.451 --> 00:08:04.233
They have to treat 30, 40 kids a day and when you're working you know 10, 12 hours a day, you don't have time to be reading everything that's in the latest medical journals.

00:08:04.800 --> 00:08:20.184
The reason why I think the technology is so powerful, this confluence of computer vision and machine learning and those types of things, is we can provide those insights to everyone, like if we as a toolset can sort of make that quality of care ubiquitous for everybody, anywhere.

00:08:20.184 --> 00:08:23.565
No matter what resources they have access to, they can get that same diagnosis.

00:08:23.565 --> 00:08:26.807
We've done something really good, and so that's sort of our mission.

00:08:31.596 --> 00:08:33.783
I know everybody thinks we make sensors, but really we make a lot of software.

00:08:33.783 --> 00:08:34.471
We try to collaborate with others.

00:08:34.471 --> 00:08:41.625
If folks have a sensor of their own, we try to see if we can support it in our platform as well, and we're really just trying to see how do we elevate the quality of care for humans.

00:08:42.648 --> 00:08:44.412
I think that's super cool, and you know.

00:08:44.412 --> 00:08:54.649
One of the things, though, that is interesting, that I would love for you to dive in and give us the nerdy answer to is why look at something that's separate, right?

00:08:54.649 --> 00:08:59.908
So, like, people are like, well, you've got your Androids and you've got your iPhones and you have your different version.

00:08:59.908 --> 00:09:12.985
I mean, these things are good, cameras are good, hardware is good, but we know, like, there's so much variety, right? You've got good stuff, bad stuff, stuff that gets deprecated, stuff that doesn't work well or play together, all that.

00:09:12.985 --> 00:09:15.207
So why even do something that's separate?

00:09:15.639 --> 00:09:21.144
I think that's a really good question, you know, and the real answer is that we'll work with anything.

00:09:21.144 --> 00:09:36.210
Just, very truthfully, I think that this idea that we're trying to build something really ubiquitous implies that we need to support whatever device you have, and so we'll give you the best 3D reconstruction, landmark detection measurements, things like that, that we can, regardless of what device you come to us with.

00:09:36.210 --> 00:09:37.182
You know, I think we have.

00:09:37.182 --> 00:09:42.121
You know, we even go as far as working on every device via photogrammetry internally.

00:09:42.121 --> 00:09:46.610
We don't have something that we feel is good enough for clinical use yet, but we're still working on that.

00:09:46.610 --> 00:09:52.331
But the real answer for why we make our own hardware is there's a few actually big reasons why.

00:09:52.331 --> 00:09:55.903
One is because we can ensure consistency and quality.

00:09:55.903 --> 00:10:02.124
You know, I think the level of accuracy that you can consistently get from a dedicated device is high.

00:10:02.124 --> 00:10:04.227
And we work very closely with Apple.

00:10:04.227 --> 00:10:07.433
You know, I think we have team members, former team members there.

00:10:07.433 --> 00:10:08.664
I have former colleagues there.

00:10:08.664 --> 00:10:11.590
I have former bosses and employees who work at Apple.

00:10:11.590 --> 00:10:17.849
My CTO, my partner Paolo, he has a lot of colleagues there too, you know, in the calibration teams and things like that.

00:10:18.320 --> 00:10:22.826
The problem is that when you look at even the active depth sensing on these devices,

00:10:22.826 --> 00:10:24.171
We're not the consumers.

00:10:26.745 --> 00:10:27.567
They're not made for clinical purposes.

00:10:27.567 --> 00:10:32.904
The consumer of even the front-facing true depth camera really is the face ID team at Apple.

00:10:32.904 --> 00:10:34.027
It's designed for them.

00:10:34.027 --> 00:10:51.043
It's a black box, and so, whether it's hardware changes that come from year to year that change the baseline or the projector, or parts that are becoming more efficient, or minor iOS revisions changing the calibration parameters and things like that.

00:10:51.043 --> 00:10:54.772
It's like the Wild West if you're trying to consume from many devices.

00:10:54.772 --> 00:10:57.730
And so, like I said, we actually do set up robot arms, we calibrate.

00:10:57.730 --> 00:11:11.042
We have a proprietary calibration system that we built that actually we use to calibrate iOS devices and things like that in order to get the best possible results, but nothing beats having the consistency and the accuracy of having your own hardware.
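(As a rough illustration of what that calibration work is protecting against, and this is a generic pinhole-camera sketch rather than anything from Structure's proprietary pipeline: a calibration is typically validated by projecting known target points through the estimated intrinsics and extrinsics and measuring how far they land from the detected corners, the so-called reprojection error.)

```python
import numpy as np

def mean_reprojection_error(points_3d, pixels_observed, K, R, t):
    """Mean reprojection error, in pixels, for a pinhole camera model.

    points_3d:       (N, 3) known target points in world coordinates
    pixels_observed: (N, 2) detected corner locations in the image
    K:               (3, 3) camera intrinsics
    R, t:            world-to-camera rotation (3, 3) and translation (3,)
    """
    cam = points_3d @ R.T + t                      # world -> camera frame
    proj = cam @ K.T                               # apply intrinsics
    pixels_predicted = proj[:, :2] / proj[:, 2:3]  # perspective divide
    return float(np.linalg.norm(pixels_predicted - pixels_observed, axis=1).mean())

# A drifting factory calibration (new projector, new baseline, a silent OS tweak)
# shows up here as a reprojection error that creeps above a pixel.
```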

00:11:11.042 --> 00:11:13.288
So that's one piece and the other piece really is.

00:11:13.830 --> 00:11:30.288
You know, I've worked at this intersection of hardware and software my whole career and what I found is that the discipline required to make your own hardware is important because it allows you to understand how it's used in the real world, Like what are the physical interfaces and how that works.

00:11:30.288 --> 00:11:40.460
And when you make your own hardware, I really fundamentally believe that your software is better, because you have a full understanding of that entire ecosystem, of that full usage from top to bottom.

00:11:40.460 --> 00:11:41.865
The opposite is true as well.

00:11:45.464 --> 00:11:51.565
I think making our own SDK and even making our own internal apps and helping people build their apps and going out in the wild and tweaking them makes us better at hardware.

00:11:51.565 --> 00:11:58.066
We can build the best thing in my mind if we touch all of those things, and it's more than just 3D sensing.

00:11:58.086 --> 00:12:01.240
You know, at least for us, we do a lot of work as we look beyond that.

00:12:01.240 --> 00:12:04.285
We have a lot of work done in volumetric reconstruction.

00:12:04.285 --> 00:12:11.015
We have a lot of technology that we've built in-house for how do you marry different types of sensing modalities?

00:12:11.015 --> 00:12:24.191
So it starts with 3D, but in a few years we're going to have other types of insights, like we're going to layer temperature, microbial load and other types of sensing over this and all of a sudden you start to see intersectionality between these things.

00:12:24.191 --> 00:12:32.585
Just by adding temperature to a Structure Sensor, being able to 3D reconstruct and then extracting landmarks from that using our own hardware,

00:12:32.764 --> 00:12:36.013
Now we can diagnose things like diabetic ulcers.

00:12:36.013 --> 00:12:39.888
We can find them before they're a problem, you know.

00:12:39.888 --> 00:12:41.412
We can find pressure injuries.

00:12:41.412 --> 00:12:46.708
We can find things that are non-obvious, we can help with wounds and things like that by adding microbial load.

00:12:46.708 --> 00:12:48.232
We can go even deeper into wounds.

00:12:48.232 --> 00:12:51.591
Looking at dielectric properties of the skin, we can start finding melanomas.

00:12:51.672 --> 00:12:57.510
And so not only can we do more by making our own hardware, you know, we can add sensing that just doesn't exist in the world.

00:12:57.510 --> 00:13:01.711
We can start finding intersectionality between these different modalities, and really,

00:13:01.711 --> 00:13:08.990
If we can do that, we'll hopefully give clinicians the tool to just find insights about the human body that we've not found before.

00:13:08.990 --> 00:13:21.484
You know, mapping the human phenotype doesn't sound as exciting as mapping the human genotype, but I think there hasn't been enough work done there to really find all those intersectionalities, and I think we can help just make the world a better place if we do that.

00:13:21.484 --> 00:13:32.967
So yeah, I'm sorry it was a very long answer, but you know we make our own hardware because it's important, it makes us better, it makes our SDK better, it makes the apps better, it makes everything better.

00:13:32.967 --> 00:13:33.873
But also, we can do a heck of a lot more.

00:13:33.873 --> 00:13:37.020
We can make a significantly more accurate, more consistent experience for clinicians.

00:13:37.020 --> 00:13:39.562
We can just start doing things that nobody's really done before.

00:13:40.482 --> 00:13:55.812
It's also interesting, one thing here is that I went to two different museums and there were three different art pieces that were all kind of interactive 3D scanning art pieces, and all of them used the Microsoft Kinect, right, which has been discontinued, I think, 15 years ago, if I'm not mistaken.

00:13:56.712 --> 00:14:05.138
It was discontinued the same day that they stopped making projectors for the original structure sensor because PrimeSense was bought by Apple in 2013.

00:14:09.960 --> 00:14:10.523
Exactly, that's at one moment.

00:14:10.523 --> 00:14:13.256
But still, 15 years later, they're still using these Kinect things to do this, because there isn't an alternative, right?

00:14:13.256 --> 00:14:31.705
There isn't an alternative that somebody at a museum or an artist or some organizer can turn to to have a kind of hackable sensor thing, and every single time I've paid attention to this, I've seen this more than a few dozen times now in museums where you have some kind of thing where you wave your arms or something, it's always the Kinect sensor, and I just checked on Amazon.

00:14:31.705 --> 00:14:34.705
You can still buy them, the Xbox One Kinect sensor.

00:14:34.705 --> 00:14:36.570
You can just buy them, and they must be.

00:14:36.570 --> 00:14:37.946
That's what's keeping this alive.

00:14:37.946 --> 00:14:46.383
So I think, no, I think you're right.

00:14:46.403 --> 00:14:48.187
And, by the way, it's a pleasure to meet you.

00:14:48.187 --> 00:14:49.590
Sorry I was late.

00:14:49.590 --> 00:14:50.711
No, not at all, actually.

00:14:50.711 --> 00:14:51.634
Life is unpredictable.

00:14:51.860 --> 00:14:53.104
And I love what you guys are doing here.

00:14:53.104 --> 00:14:54.948
And did you, you know?

00:14:54.948 --> 00:14:58.629
Because you could have kind of presented yourself as like the universal scanning platform for everything.

00:14:58.629 --> 00:15:01.426
Right, why choose that?

00:15:01.426 --> 00:15:04.336
I mean because a lot of people have tried doing that and that hasn't really worked for anyone, right?

00:15:04.938 --> 00:15:07.144
Yeah, no, I mean I guess just very transparently.

00:15:07.144 --> 00:15:13.740
I think, like I was telling Brent earlier, really we're really nascent with regard to the use of 3D technology in healthcare.

00:15:13.740 --> 00:15:27.288
Now, like you said, the Kinect has been gone forever, still being used for just crazy things out in the wild, and people are doing new beautiful things with it, and so I think, from a healthcare perspective, both the industry moves really slow.

00:15:27.288 --> 00:15:29.436
Like I think there's this concept in venture capital that drives me nuts.

00:15:29.436 --> 00:15:33.629
I hate it every time I hear it, but they talk about Eroom's law, which is the inverse of Moore's law.

00:15:33.629 --> 00:15:40.841
With silicon, you get this increased transistor density and speed every couple of years.

00:15:40.841 --> 00:15:41.682
It's the opposite.

00:15:41.682 --> 00:15:48.648
Like, medical device and pharmaceutical development is more expensive and takes longer every few years.

00:15:49.100 --> 00:15:55.506
Our real mission is to sort of realign those innovation cycles, and so in healthcare, 3D is still relatively unheard of.

00:15:55.779 --> 00:16:14.085
You know, in an industry like orthotics and prosthetics or in those areas, you see a lot of it, but it's still just the start, and so we're not going to get anywhere meaningful if we try to build an ecosystem or a platform that tries to suck all the air out of the room, like that's never been something I've been interested in.

00:16:14.466 --> 00:16:26.448
Personally, I think if we're going to make the world a better place, it really is going to be by being as open as humanly possible and supporting as many devices as possible and supporting the workflows that people actually have and the devices they have in hand.

00:16:26.448 --> 00:16:38.503
Whether it's something that has one of our sensors attached or not, we're going to do the best that we can with it, and whether it's our cloud system or the SDK actually we're pretty ubiquitous Like we'll give you measurements on meshes that came from anything.

00:16:38.503 --> 00:16:41.027
Like somebody could take a I don't know.

00:16:41.027 --> 00:16:51.282
They could take, like, a block that they see and model it themselves by hand and send it to us, and we'll still give you landmark recognition if we can do it, and I think that's the right way to approach things.
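(To make "measurements on meshes that came from anything" a little more concrete, here is a minimal sketch of one such measurement: a limb circumference taken at a given height by slicing the vertices near a plane and walking their 2D convex hull. The function name, the axis convention, and the convex-hull shortcut, which over-reads concave cross-sections, are illustrative assumptions, not Structure's SDK.)

```python
import numpy as np
from scipy.spatial import ConvexHull

def circumference_at_height(vertices, z, band=2.0):
    """Approximate circumference of a scanned limb at height z (mesh units).

    vertices: (N, 3) mesh vertices, with the z axis assumed to run along the limb
    band:     half-thickness of the slab of vertices used as the cross-section
    """
    ring = vertices[np.abs(vertices[:, 2] - z) < band][:, :2]  # slice, project to XY
    if len(ring) < 3:
        raise ValueError("no vertices near the requested height")
    hull = ConvexHull(ring)
    return hull.area  # for 2D input, ConvexHull.area is the hull's perimeter
```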

00:16:51.282 --> 00:17:02.134
I think the more open we are and the more we work with other people's technology, the faster things will get better for clinicians, which just means better outcomes for actual humans at the end of it.

00:17:02.625 --> 00:17:09.712
And if you're looking at who's buying and actually using your sensors, who are they? Is orthopedics a big part of that?

00:17:09.712 --> 00:17:11.511
What kind of doctors are using it?

00:17:11.511 --> 00:17:12.630
What kind of patients are using it?

00:17:13.105 --> 00:17:14.009
It's actually pretty broad.

00:17:14.009 --> 00:17:30.869
So I think our biggest single market is really the combination of orthotics and prosthetics, and I hate lumping them together because I think the needs and uses are so nuanced that I could spend the rest of my career understanding just like the workflow of one clinician and not really understand it in its entirety.

00:17:31.070 --> 00:17:32.294
But that is really a big part of it.

00:17:32.294 --> 00:17:34.527
But we do everything, you know, I think we do.

00:17:34.527 --> 00:17:35.469
We have partners.

00:17:35.469 --> 00:17:39.858
You know we have several partners that we've helped through the FDA 510K pre-market process.

00:17:39.858 --> 00:17:42.875
Some of them do things like wound care in the VA.

00:17:42.875 --> 00:17:45.252
I think we're approved for that use case.

00:17:45.252 --> 00:17:46.864
We actually do neurosurgery.

00:17:46.864 --> 00:17:59.316
We have a partner called Skia in Korea that we work with for surgical navigation, and actually they've outperformed Medtronic's StealthStation in 80 out of 80 neurosurgery trials, and we're just about done with their FDA pre-market notification for that.

00:17:59.496 --> 00:18:02.859
You know we do oh gosh, a thousand little things across the world.

00:18:02.859 --> 00:18:17.508
You know everything from actual healthcare use cases down to non-healthcare use cases, things downstream, like if you go and you want to buy a custom pair of goggles from Smith Optics, we power that along with a lot of other consumer-facing applications.

00:18:17.508 --> 00:18:22.137
Bath Fitter uses us to measure bathrooms really quickly to figure out what will fit.

00:18:22.137 --> 00:18:25.067
I think the use cases are pretty broad.

00:18:25.067 --> 00:18:32.355
In a healthcare context, though, I think that's about 99% of our use, and really what interests the team and what gets us up in the morning is how can we make healthcare better?

00:18:32.355 --> 00:18:33.617
That's cool.

00:18:33.964 --> 00:18:34.205
And then.

00:18:34.205 --> 00:18:40.115
So one thing is like you're kind of in a divorce thing with this healthcare thing, because you could do three things Essentially.

00:18:40.115 --> 00:18:43.559
You could make like a super cheap one you know $50 for everything.

00:18:43.559 --> 00:18:50.384
Or you could do, yeah, like the wound and looking into the wound, like whatever look at your DNA, I don't know whatever like the super mega advanced one, right.

00:18:50.384 --> 00:18:54.287
Or you could make 17 different kind of you know different like versions for everyone.

00:18:54.287 --> 00:18:58.469
You know what are you choosing to do, or what are you kind of like, like like focusing on.

00:18:58.929 --> 00:19:00.390
Yeah, and I think that's a really good question.

00:19:00.390 --> 00:19:03.932
You know, one of the things I found uh, you know I'm a product guy at heart.

00:19:03.932 --> 00:19:22.520
I've um shipped a lot of really good products but also a lot of clunkers there, and one of the things I've learned is having a strong opinion makes a better product, and so you know, if you're not, if it doesn't feel painful to bring it out, if it doesn't feel like you had to make some compromises in your vision to get the thing out, you end up with something that doesn't have an opinion.

00:19:22.520 --> 00:19:24.281
I like to liken it to the Simpsons.

00:19:24.281 --> 00:19:32.595
There's an episode where Homer's long-lost stepbrother finds him and owns a car company, and he hires Homer to design a car, and he creates the Homer car.

00:19:32.595 --> 00:19:33.136
It's terrible.

00:19:33.136 --> 00:19:34.196
It's like an "everything" car.

00:19:34.196 --> 00:19:39.180
He just threw every feature he could ever imagine in it, but it was the worst possible car that ever existed.

00:19:39.220 --> 00:19:41.500
Because of that. And we take that same approach internally.

00:19:41.560 --> 00:19:57.059
So one of the reasons we actually do try to support as many sensors as possible is because we know that at least there are parts of the world and people who you know buying a standalone sensor is an unattainable thing, and so for us it's really about building dedicated hardware for the people who need it and the use cases that need it.

00:19:57.240 --> 00:20:16.987
You know there are many, many use cases that need accurate, device-to-device consistency, right, like a lot of these partners, especially, who go all the way through the 510k pre-market and things like that, and even use cases before that, right, like we talked to people making head orthoses and braces and stuff like that, scanning children and trying to custom fit those.

00:20:16.987 --> 00:20:22.518
You have to have not only a high level of accuracy, but you have to be robust to movement and things like that.

00:20:22.518 --> 00:20:26.981
And if you're off by a couple of millimeters in those use cases it's a horrible experience for the child.

00:20:26.981 --> 00:20:40.567
So I think we make our own hardware because sometimes you need it, sometimes you need that level of accuracy and consistency, and then for the use cases where it doesn't matter, we're happy to support

00:20:40.586 --> 00:20:41.150
whatever device you have.

00:20:41.150 --> 00:20:41.854
And why is 3D scanning so hard?

00:20:41.854 --> 00:21:03.019
I mean, I think I used to do these reports, right, about the 3D printing market, and then it was like the Ultimaker now has the Ultimaker 3 Plus, you know, or they have now a bigger system, you know, like incremental improvement, and then I'd have the same report for scanning, right, 2015 in scanning, and then half the players would be bankrupt and it wouldn't work anymore. Like it's been a bloodbath, comparatively, even compared to 3D printing, which is very competitive in places.

00:21:03.019 --> 00:21:04.949
So why is it that difficult?

00:21:04.949 --> 00:21:06.834
Is that intersection thing you were talking about before?

00:21:07.256 --> 00:21:07.476
yeah.

00:21:07.476 --> 00:21:08.467
So I think part of it.

00:21:08.467 --> 00:21:33.001
I actually don't think a lot of the core technology has changed, at least on the geometric side of computer vision. Just like everything else, there are papers written, like, eight years ago that really guide you. And one of our computer vision leads, our calibration lead, when we were interviewing her, we jokingly talked about Google's revolutionary 3D telepresence system, Project Starline.

00:21:33.001 --> 00:21:36.875
We talked about how she actually developed that in 2001.

00:21:36.875 --> 00:21:43.713
It was the exact same, basically, concept, and so a lot of these core technologies haven't changed.

00:21:43.713 --> 00:21:50.573
The problem is that I think a lot of companies approach this as: oh, you capture something and now you go figure out what to do with it.

00:21:50.573 --> 00:22:02.132
And I think that if that's all you think about when it comes to 3D capture, like you're sort of doing a disservice to the customer because it takes an entire workflow for it to be a thing.

00:22:02.132 --> 00:22:04.373
Just capturing something doesn't matter.

00:22:04.373 --> 00:22:12.576
And so actually I think the reason why we invest so much time and effort into the software side of things is the power doesn't come from us and the things that we make.

00:22:12.576 --> 00:22:19.169
It comes from what people do with it and that SDK and that ability to build your own apps and find new ways of doing things is really important.

00:22:19.169 --> 00:22:23.788
And then I think it's that intersection of geometric computer vision and machine learning.

00:22:23.788 --> 00:22:35.906
That's where a lot of the innovation comes from, because a lot of companies now are actually eschewing these original geometric computer vision techniques, thinking they're just going to machine-learn their way to every answer, and that's really bad.

00:22:35.906 --> 00:22:37.128
That's a big mistake.

00:22:37.128 --> 00:22:44.715
I think geometric computer vision is really good at solving certainties and machine learning whether it's computer vision or otherwise is really good at solving for uncertainties.

00:22:44.715 --> 00:22:54.628
And if you can constrain the problem and understand the certainties and then use machine learning to solve the things that are uncertain, you have this beautiful, beautiful set of technologies that interplay.
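(A toy illustration of that division of labor, purely illustrative and not anything from Structure's stack: solve the part that geometry answers exactly, here a least-squares reference plane through a patch of points, and hand only the leftover, ambiguous signal to a learned model.)

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through (N, 3) points via SVD: returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]            # normal = direction of least variance

def signed_heights(points, centroid, normal):
    """Signed distance of each point from the fitted plane: the 'certain', geometric part."""
    return (points - centroid) @ normal

# The residual height map is what you would hand to a learned classifier,
# the 'uncertain' part (say, wound vs. intact skin), instead of asking a
# network to rediscover basic geometry on its own.
```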

00:22:54.628 --> 00:22:57.538
So I think we're going to start seeing a lot more innovation in this space.

00:22:57.778 --> 00:23:01.353
I think it just takes time for adoption until you have these end-to-end workflows.

00:23:01.353 --> 00:23:12.255
So I think that and it's expensive to do end-to-end workflows Like one of the things we find is that a lot of people make medical devices on our platform because they couldn't otherwise.

00:23:12.255 --> 00:23:14.307
You know there's these companies that they spend.

00:23:14.307 --> 00:23:24.146
They have to carry an R&D team for 10 years and then they have to go through the FDA process and they've burned $50 to $100 million by the end of that and they go bankrupt.

00:23:24.146 --> 00:23:32.998
And on our platform, we find that we shave almost a decade off of that time and they save like 98% of the cost in getting to market.

00:23:33.118 --> 00:23:47.817
And so I think that's where the power is, is these end-to-end use cases, and the better that we make the platform and the more capabilities we give people, the more that we make the structure sensor more like a tricorder, the more adoption we'll see, just because there's just more possibilities and more workflows.

00:23:47.817 --> 00:23:50.634
So I think you're seeing a lot of it on the 3D printing space.

00:23:50.634 --> 00:24:02.278
Because you generate this tactile object right, you have a problem and you're like, oh, my door broke and I need to fix something in it, or there's a unique part I need, or I'm prototyping something or whatever it is Like I'm creating a custom therapy.

00:24:02.278 --> 00:24:04.961
There's a direct tactility to the thing that you generate.

00:24:04.961 --> 00:24:13.460
I think on the capture side it's a lot harder to wrap your head around, because without that other side of the workflow,

00:24:13.480 --> 00:24:15.224
it doesn't mean anything, it's just a mesh. Right.

00:24:15.224 --> 00:24:25.737
One thing that I'd love for you to kind of just expand on too is, I mean, I know you said that your technology goes a lot into the prosthetic and orthotic field or industry, just from the outside looking in.

00:24:25.737 --> 00:24:29.519
And I always love to ask this question because I've kind of grown up in the field.

00:24:29.519 --> 00:24:32.148
You know, I started when I was 15.

00:24:32.148 --> 00:24:36.987
I knew that this is what I wanted to do since I was in second grade, so it's been a long time right.

00:24:36.987 --> 00:24:38.973
So sometimes you get blinders on.

00:24:38.973 --> 00:24:43.676
So you looking into the field and don't worry about hurting my feelings, okay.

00:24:43.676 --> 00:24:46.769
I mean, what do you see?

00:24:46.769 --> 00:24:53.108
You know, I think we've talked before like we have a passionate group of people trying to change people's lives.

00:24:53.108 --> 00:24:59.309
We are also stubborn and don't like to learn, so but like, what do you see?

00:24:59.510 --> 00:25:00.855
and where do we go from here, right?

00:25:00.855 --> 00:25:02.528
I think it's challenging.

00:25:02.528 --> 00:25:04.835
I think the industry is really in this time of flux.

00:25:04.835 --> 00:25:13.476
There's a lot of providers you know one of the things that and I always look at things like how I would at Google, like I like to look at like what are the trends and who are we selling to?

00:25:13.476 --> 00:25:19.578
And those kinds of things, because I'm a robot and that's the best way for me to actually try to understand how humans actually use products.

00:25:19.578 --> 00:25:43.327
But the demographic is aging and so there's almost like a generational shift, like in the industry, which is really interesting because I think in the conversations that I've had with clinicians and the time I've spent, I see a lot of people who really are like truly artisans, like they've mastered their craft and their way of treating not just like understanding their patients but treating them and they all have their tricks and their things that go into making like a beautiful orthotic or a beautiful prosthetic.

00:25:43.327 --> 00:25:57.771
And I think the biggest thing that I see, at least for the industry, is that there's almost like a tacit admission that, you know, the more that we can share and the more that we can understand the best way to treat people, the more consistent and higher quality the orthotics we can make, whether it's, like, understanding.

00:25:57.771 --> 00:26:11.992
You know, like I talked to some clinicians who are like I actually don't even just take one scan anymore, I need to actually take multiple so I can understand how the foot changes when loaded, so that way I can make the right thing, like those types of things I think are starting to enter the common vernacular.

00:26:11.992 --> 00:26:15.626
Actually, from the outside looking in, I'm really excited, honestly.

00:26:16.568 --> 00:26:17.912
I don't have any shade to throw, I think.

00:26:17.912 --> 00:26:19.034
You know people don't.

00:26:19.034 --> 00:26:27.085
People get set in their ways, myself included, but that doesn't mean that there's not a lot of innovation happening in the industry and there's not a lot of people pushing to make it better.

00:26:27.385 --> 00:26:47.580
One of the things that has been a challenge for us is because we don't make the apps themselves, we don't necessarily always have a direct tie to the clinicians, and so we've tried to get that, and I found that because the industry was probably a little bit more distributed and more like a lot of little satellite private practices for a very long time.

00:26:47.580 --> 00:26:49.748
It takes a long time to get people to share.

00:26:49.748 --> 00:26:57.213
You know, I think there's all sorts of crazy rumors. Like, I've heard people tell me, "Didn't you guys go bankrupt?" And I'm like, what? I've heard all sorts of weird stuff.

00:27:01.040 --> 00:27:07.270
The best thing that I have started to see over the last few years is just a lot more open communication both between clinicians and between people in the industry.

00:27:07.270 --> 00:27:23.588
Like having this idea that everything's closed off and everybody has their way of doing things is great, but also the more that we all can learn, the better the treatments are at the end of the day.

00:27:23.588 --> 00:27:24.432
So I don't know, I like the industry a lot.

00:27:24.432 --> 00:27:24.772
It's fascinating.

00:27:24.772 --> 00:27:27.221
It's like nothing I've ever seen in my life and I think that's how it is for every group of clinicians that you talk to.

00:27:27.221 --> 00:27:29.406
They have their own discipline and it's a very different thing.

00:27:29.406 --> 00:27:34.393
And as somebody who goes to the doctor, I'm like, oh, I go to the doctor that specializes in this.

00:27:34.393 --> 00:27:37.676
15 years ago I had no idea that there's this much nuance.

00:27:37.676 --> 00:27:40.839
I was just like I guess they're all doctors, they just sort of specialize.

00:27:40.839 --> 00:27:44.353
But no, I think you know the culture and the dialogue and everything is different and I love it.

00:27:44.353 --> 00:27:47.568
So yeah, sorry, I didn't have anything really critical to say.

00:27:47.950 --> 00:27:50.777
No, I mean that's good, I think.

00:27:50.777 --> 00:27:57.933
The other thing that I'd love, and for our listeners I think it's important, is: how exactly does your sensor work?

00:27:57.933 --> 00:28:06.756
Yeah, so like you run everything, you know, you hear lasers, structured light, mapping, all this stuff.

00:28:06.756 --> 00:28:10.393
So for our listeners, let's get into it.

00:28:10.413 --> 00:28:22.201
It's really simple, yeah, and so actually we have a stereo pair of IR cameras, and we actually have an IR projector that has a pattern that's sent out, and one of the reasons we make our own hardware is we can do all sorts of weird stuff with it.

00:28:22.201 --> 00:28:24.232
But basically it's the stereo pair.

00:28:24.232 --> 00:28:26.570
We get what's called like a disparity.

00:28:26.570 --> 00:28:35.573
So we understand, like, there's a depth-matching engine on the silicon that we have, so that it can match the depth really fast, and we get that depth map from that.
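(The textbook relationship behind that disparity-to-depth step, sketched generically here rather than as the actual on-silicon matcher being described, is just depth = focal length x baseline / disparity for a rectified stereo pair.)

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (pixels) into metric depth for a rectified stereo pair.

    disparity_px: (H, W) per-pixel disparity between the left and right IR images
    focal_px:     focal length in pixels
    baseline_m:   distance between the two cameras, in meters
    """
    depth = np.full(disparity_px.shape, np.inf)
    valid = disparity_px > 0                  # zero disparity = no match / at infinity
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth

# The same formula explains why accuracy falls off with range: far away, the
# disparity shrinks toward zero, so tiny matching errors become large depth errors.
```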

00:28:35.573 --> 00:28:47.415
And so one of the reasons we build our own hardware and our own firmware and SDK and the calibration process secretly that sits in between, is that it's actually not just the sensor that produces this.

00:28:47.415 --> 00:28:50.071
We know that our computer vision SLAM is really good because we've been doing it forever.

00:28:50.071 --> 00:28:58.532
We inherited a 10-year-old stack of probably the largest body of research in computer vision outside of Meta/Facebook and Google in the world.

00:28:58.532 --> 00:29:01.876
Right Like, we have a massive amount of technology behind that.

00:29:01.876 --> 00:29:04.080
Our calibration team is unique.

00:29:04.080 --> 00:29:08.996
Right Like, not a lot of people actually have taken that art and pushed it forward.

00:29:08.996 --> 00:29:13.775
Like I said, you know, the only companies that really do it are Meta and Google and Apple these days.

00:29:13.775 --> 00:29:26.294
And so it's really all of those components, it's not just the depth that we're getting, it's not just the data that we get from inertial units, it's not just the way that we calibrate it by itself, it's not just the way that we reconstruct in real time.

00:29:26.294 --> 00:29:33.474
It's all of those things together, everything from our factory calibration, like I said, to the actual underlying technology of the SDK.

00:29:33.474 --> 00:29:35.993
That makes the sensor work.

00:29:35.993 --> 00:29:41.797
And so, like I said, that's one of the reasons we make our own hardware we can do weird tricks, like we can take a really high resolution image.

00:29:41.797 --> 00:30:07.248
We're about to launch a version of our firmware that will enable you to do what we're going to call STL1 mode, just because everyone loves the STL1. Instead of a much higher resolution, super precise thing that you can tune to scan within a certain range, it'll scan from 27 centimeters to five meters, a little bit more like the original Structure Sensor, but with a little bit better accuracy than you'd get out of that device.

00:30:07.248 --> 00:30:12.685
And so, yeah, I hesitate to say one thing makes our sensor work.

00:30:12.705 --> 00:30:17.775
Yeah, there's a stereo pair of IR cameras, there's a really nice projector, there's a big old honking battery.

00:30:17.775 --> 00:30:20.009
There's a lot of logic that we put on chip.

00:30:20.009 --> 00:30:22.255
You know the silicon that we use.

00:30:22.255 --> 00:30:25.050
We can load in our own machine learning models and stuff in there too.

00:30:25.050 --> 00:30:26.588
So there's a lot that goes into it.

00:30:26.588 --> 00:30:28.275
But it's like I said, it's more than that.

00:30:28.275 --> 00:30:31.726
Like, the further you get up the stack, the more that you see that it really is all of those things together.

00:30:31.726 --> 00:30:36.277
Sorry, it's so dorky.

00:30:37.334 --> 00:30:39.828
No problem, we do dorky quite well here, I know.

00:30:39.828 --> 00:30:44.938
So if I'm an individual practitioner, do you have any 3D scanning tips?

00:30:44.938 --> 00:30:46.641
Should I get started?

00:30:46.641 --> 00:30:47.643
Why do I get started?

00:30:47.643 --> 00:30:48.202
When do I get started?

00:30:48.202 --> 00:30:48.624
That kind of thing?

00:30:49.085 --> 00:31:05.095
Yeah, no, I think that a 3D workflow, really, just like anything else, it doesn't matter what you have in front of you, takes time to adjust to, and so one of the biggest challenges I think that we've had in the past is people jump headfirst into 3D and then come back and go whoa, whoa, whoa, like it didn't do exactly the thing that I wanted.

00:31:06.289 --> 00:31:08.137
So I think it really is about finding the right.

00:31:08.137 --> 00:31:21.484
You know, if you're a practitioner, do you have somebody fabricating these things for you already, making sure that you have the right applications and then taking time to make sure that you have it set up right and get going?

00:31:21.484 --> 00:31:37.342
Whether, like I said, you're using just your phone or, like, a dedicated iPad with a Structure Sensor, if you're going to use these things in a clinical setting, it's really about wrapping your head around what you're trying to accomplish and making sure you have the right tools for it. And so, at least for me, if you're using a Structure Sensor, I like to have an iPad with USB-C.

00:31:37.342 --> 00:31:39.785
Apple doesn't even make any new ones that don't.

00:31:39.785 --> 00:31:43.516
I like it because they charge the sensor as well, so you only have one thing to charge.

00:31:43.516 --> 00:31:47.757
You know, I like to make sure that I get a good,

00:31:47.757 --> 00:31:52.134
I hate that we call it the calibrator, but, like, a good registration out of the gate with the new sensor.

00:31:52.295 --> 00:31:57.894
One of the challenges I think that we have is it doesn't matter how many videos and tutorials we put out, people try to calibrate it

00:31:57.894 --> 00:32:00.856
just like an old sensor, but it actually works differently.

00:32:00.856 --> 00:32:12.883
We shifted to 940 nanometers so that way we could reject more sunlight and so you could work in more environments, like in dark environments or outside.

00:32:12.883 --> 00:32:21.016
But that means that if you want to calibrate it and use that calibrator process, you probably need a little bit of sunlight or a halogen lamp, which breaks people's brains a little bit.

00:32:21.016 --> 00:32:22.861
But we're getting people there.

00:32:22.861 --> 00:32:25.296
We're putting out more videos, we're doing more on-site trainings.

00:32:25.296 --> 00:32:32.695
We're trying to get people over that hump and when you're scanning the patient, just make sure you have a really good scan before it actually goes out.

00:32:32.875 --> 00:32:44.372
A lot of new practitioners, sometimes they take a scan and they haven't yet gotten to the point, when they're transitioning to technology, of recognizing whether the 3D capture was good or not, because they're so used to foam boxes or something else.

00:32:44.372 --> 00:32:45.776
So, making sure you have a really good scan.

00:32:45.776 --> 00:32:47.140
We make some tools for that.

00:32:47.140 --> 00:32:51.680
Actually, we started with the AFO process, for, like, the full foot and ankle.

00:32:51.680 --> 00:32:54.113
We have what's called scan quality indicators.

00:32:54.113 --> 00:32:59.065
So in the SDK now it'll tell you if the mesh is good before you finish it and then send it off.

00:32:59.065 --> 00:33:13.017
We're expanding that so that way the developers on our platform can use it for other things, you know, planar surfaces, elbows, head, and so what we do is we actually use our landmark detection to make sure, along with understanding the mesh, to make sure you've gotten a really good scan.

00:33:13.017 --> 00:33:15.858
And so, yeah, I think we're trying to automate that process.

00:33:15.858 --> 00:33:23.796
We're not all the way there yet for every body part, but we'll get there.

00:33:23.796 --> 00:33:26.203
But in the interim, just making sure you have a good scan before it goes off, I think, is really important as well.
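(As a hypothetical sketch of what a pre-flight check like those scan quality indicators could look like, where every name and threshold below is invented for illustration and the real indicators live in Structure's SDK: the idea is simply to refuse to send a scan downstream if coverage is thin, holes are large, or the expected anatomical landmarks never showed up.)

```python
from dataclasses import dataclass

@dataclass
class MeshStats:
    vertex_count: int
    edge_count: int
    boundary_edge_count: int   # edges on only one face, i.e. the borders of holes

def scan_quality_ok(mesh: MeshStats, landmarks: dict,
                    required=("heel", "lateral malleolus", "medial malleolus"),
                    min_vertices=20_000, max_boundary_ratio=0.05):
    """Hypothetical gate run before a scan is sent off for fabrication."""
    issues = []
    if mesh.vertex_count < min_vertices:
        issues.append("scan too sparse: move closer or scan more slowly")
    if mesh.boundary_edge_count / max(mesh.edge_count, 1) > max_boundary_ratio:
        issues.append("large holes in the mesh: some areas were never captured")
    missing = [name for name in required if name not in landmarks]
    if missing:
        issues.append("landmarks not detected: " + ", ".join(missing))
    return len(issues) == 0, issues
```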

00:33:26.223 --> 00:33:27.288
You know, I hesitate to be really prescriptive.

00:33:27.288 --> 00:33:36.298
Everybody has their own workflow, everybody has their own process and, you know, I'm sure some of our app partners are sitting there like banging their head against the wall saying, oh God, you didn't tell them to do X, Y or Z.

00:33:36.298 --> 00:33:39.723
I really do think it depends on your workflow, but we're always happy to help.

00:33:39.723 --> 00:33:40.531
That's another thing.

00:33:40.531 --> 00:33:42.038
I think that's changed since we spun out.

00:33:42.038 --> 00:34:01.997
We try to be really accessible to our customers, and so if you ever need help or you need advice, we're available, whether it's our partner team or our support team, and if people are having challenges, we're always happy to hop on a video call and walk through things with you. Cool, and let's say I'm an app maker or some kind of person who wants to develop a tool with you guys, I guess I call you guys or something like that?

00:34:02.116 --> 00:34:08.981
Is there a really cool way to get to learn, to see some kind of documentation, SDK, stuff like that?

00:34:09.001 --> 00:34:10.442
You just email us at partners@structure.io.

00:34:10.442 --> 00:34:11.983
I know there's a phone number on our website.

00:34:11.983 --> 00:34:21.730
You can call there and leave a message and we'll get back to you.

00:34:21.730 --> 00:34:28.235
One of the things that we really are hoping to do is do a little bit more handholding of people through the process.

00:34:28.849 --> 00:34:41.557
You know, I think traditionally we had a lot of people come to us and make apps, which meant that there was a lot of variety in those experiences because we didn't have a strong enough opinion on how you should scan or how to make those experiences.

00:34:42.099 --> 00:34:58.583
So one of the things we've been trying to do, I think, a little bit more is helping people through that, whether it's hey, I have a really old app and I want to support the new sensor you know, secretly, a lot of the times we just do it we say, hey, can you like give us your scanning code and we'll just update it for you and we sneak in some tweaks here and there to make it better.

00:34:58.583 --> 00:35:31.161
So, yeah, if you just reach out to us, whether it's on our website via the developer portal that you see there or, like I said, email us at partners@structure.io or call the number and leave a message, we can help you through that process: getting an app up, getting it up to speed, updating old apps, making it as good as we can, talking through your use case and figuring out how to get the best results. All of those things we're pretty open to, and our partner engineering team is growing every year, so we're hopefully staffing to help everybody out as those inquiries come in.

00:35:31.592 --> 00:35:40.918
And then the other thing is okay, so I've used your app with an iPad, right, and then also, like I've been scanned by it and okay, are you going to come up with like a handle to put in the middle of it?

00:35:40.918 --> 00:35:44.675
Or 'cause I just, I don't know, I'm a newbie, I only did it like two or three times.

00:35:44.675 --> 00:35:45.956
I just thought it was a bit weird.

00:35:45.956 --> 00:35:47.255
It was kind of like I don't know.

00:35:47.255 --> 00:35:48.634
I just thought, is it me?

00:35:48.634 --> 00:35:49.175
Is it me?

00:35:49.335 --> 00:36:03.467
Is it, like, yeah, are you going iPad, or what? Yeah, so actually, hilariously, we have partners who have always used phones because they make their own cases.

00:36:03.467 --> 00:36:06.492
So actually we have two answers to that. One:

00:36:06.492 --> 00:36:14.115
We actually have started working from a mechanical perspective on the right cases for phones, because in a handheld experience it's actually a lot easier that way.

00:36:14.115 --> 00:36:15.414
We're working on that right now.

00:36:15.414 --> 00:36:16.318
We have some prototypes.

00:36:16.318 --> 00:36:18.297
We want them to be good before we go out there.

00:36:18.297 --> 00:36:31.081
Alternatively, and actually I think a lot of people don't know this: if you make a custom bracket or case or something with a handle, you can use the Structure Calibrator app to enter your own, and we'll help you with that.

00:36:31.081 --> 00:36:41.112
If you send us the CAD, we have team members who'll say, oh, enter this for XYZ dimensions, so that way you can get started and use our Calibrator tools so you're not stuck with our own.

00:36:41.351 --> 00:36:49.239
Some people use their own workflow, but we do recognize that some people like that more tactile, handheld experience.

00:36:49.239 --> 00:36:58.038
We've been stubborn about it in the past, but I think we're moving towards supporting that and I think it's a little bit easier with iPhones to get that sort of grippy handheld experience.

00:36:58.038 --> 00:37:01.978
We're also working on more ruggedized cases rather than brackets for iPads.

00:37:01.978 --> 00:37:12.219
Probably won't be able to support every iPad, because it's just a lot of SKUs, but I think we'll support a few and I think that'll give you a little bit more of a grippy experience because it'll be in a case.

00:37:12.550 --> 00:37:31.157
And the other thing I've always wondered about is like I've been sitting there trying to scan my foot with the phone and stuff using you guys also other things and I'm like couldn't we put the phone on a gimbal or some kind of like thing that just moves around my foot and you know, like the dude just says, hey, hold your foot still, press the button, and then it just like goes around or something.

00:37:31.599 --> 00:37:36.797
We should do that, we should. Okay, but is no one asking for that? Because that would just save them a lot of time.

00:37:36.958 --> 00:37:39.909
I agree, because actually I'm surprised.

00:37:39.909 --> 00:37:43.117
So in a lot of other use cases we've had that like we do.

00:37:43.117 --> 00:37:46.478
You know we've had partners that do fitness towers that like rotate around you.

00:37:46.478 --> 00:37:54.992
Or, you know, we have a partner in Japan that does baggage scanning with our cross-platform, like, the Linux-based sensor from back in the day, that rotates around the bag.

00:37:54.992 --> 00:38:02.739
I've always been interested in why that hasn't happened here, and honestly, part of it too is, like, mechanical devices like that are really expensive.

00:38:04.291 --> 00:38:06.177
Trying to generate something that would capture like that.

00:38:06.177 --> 00:38:07.534
I think it's a really good idea.

00:38:07.534 --> 00:38:18.632
It'd be really easy to stick someone's, like, whatever it is, in the case of the foot, in there, get it to subtalar joint neutral and then have this thing rotate.

00:38:18.632 --> 00:38:22.137
Like, I think that would be absolutely beautiful, especially because then you can maintain consistency.

00:38:22.137 --> 00:38:22.311
Like.

00:38:22.311 --> 00:38:34.940
One of the things that sucks about these devices is, once you get outside of 30 centimeters, they suck. Like, the depth goes to hell. Like, we have, like, thousands of trials on a robot arm against a variety of objects.

00:38:34.940 --> 00:38:43.333
That shows that the further away you get, the worse it gets, and so maintaining that consistency, whether it's this or with a sensor, would be fantastic.
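
For context, the fall-off Ravi describes is typical of triangulation-based depth sensors in general: depth noise grows roughly with the square of the range. A minimal sketch of that standard relationship, using made-up focal length, baseline, and disparity-noise numbers for illustration rather than anything Structure has published:

```python
# Illustrative only: a generic triangulation depth-noise model, not
# Structure's published specs. Depth error grows roughly with the square
# of the range, which is why scans degrade past a few tens of centimetres.

def depth_noise_mm(range_mm: float,
                   focal_px: float = 580.0,         # assumed focal length (pixels)
                   baseline_mm: float = 75.0,       # assumed projector-camera baseline
                   disparity_noise_px: float = 0.1  # assumed matching noise (pixels)
                   ) -> float:
    """Approximate 1-sigma depth error for a stereo / structured-light sensor."""
    return (range_mm ** 2) / (focal_px * baseline_mm) * disparity_noise_px

for z in (300, 500, 800, 1200):  # millimetres from the sensor
    print(f"{z:>5} mm -> ~{depth_noise_mm(z):.2f} mm depth noise")
```

With these illustrative numbers, quadrupling the distance multiplies the noise by roughly sixteen, which is why a rig that holds the sensor at a fixed, close range would help consistency.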

00:38:43.333 --> 00:38:44.715
But no, we should collaborate on that.

00:38:44.715 --> 00:38:45.195
Let's make one.

00:38:45.396 --> 00:38:45.755
I love it.

00:38:46.476 --> 00:38:47.498
The PK5000.

00:38:47.498 --> 00:38:47.938
There we go.

00:38:47.938 --> 00:38:50.559
So, robbie, I do have a question for you, though.

00:38:50.559 --> 00:38:58.527
A lot gets made of it, and you kind of alluded to it, that it's fragmented in orthotics and prosthetics.

00:38:58.527 --> 00:39:00.512
Yoris says this all the time.

00:39:00.512 --> 00:39:02.893
You guys are big, small, all over the place, don't talk to each other.

00:39:02.893 --> 00:39:15.018
All this stuff. With the digital stuff coming, right... and I think one of the things that our listeners are just now starting to understand is there is data input, right.

00:39:15.070 --> 00:39:17.635
The scan is data endpoint, it's zeros and ones.

00:39:17.635 --> 00:39:18.137
It's input.

00:39:25.572 --> 00:39:32.251
I think one of the powers of having data input is that you have the power of being able to collect a lot of data and then also be able to build off of that.

00:39:32.251 --> 00:39:47.793
I think one of the points of resistance that we see, and I hear this quite a bit, especially with the smaller companies that are very good at, say, a specific thing that they're really good at, is they really don't want that data out, right.

00:39:47.793 --> 00:39:53.614
They want that data to be their own, and I just feel like there's this tug of war on data.

00:39:53.614 --> 00:39:58.123
What is your feeling on some of that and how can we make heads or tails of it?

00:39:58.809 --> 00:40:05.096
Yeah, I mean, I'll be honest, you're never going to get everybody to share everything, and nor should they feel like...

00:40:05.096 --> 00:40:11.215
People should feel like they have control over their workflow and their patients, their own data and what their process is.

00:40:11.215 --> 00:40:20.822
That being said, I think, at least for us, we've gone out of our way to really take a HIPAA-first approach to collection of data in our systems.

00:40:20.822 --> 00:40:22.405
Confidentiality and those things are important.

00:40:22.405 --> 00:40:34.820
We see a lot of people who are like, oh, we've made a Dropbox connector, and I'm like, how do I know that, once it gets to this Dropbox that I paid for, that it's HIPAA-compliant, that you've gone through a SOC 2 audit or anything?

00:40:34.820 --> 00:40:37.635
So I get the hesitancy very transparently.

00:40:37.635 --> 00:40:48.376
I actually, I'll be super honest, am terrified for the security practices of the industry, more than anything, having lived in a really high-security environment and actually trying to exist in one today.

00:40:48.376 --> 00:40:51.442
You know, our head of infrastructure actually comes from...

00:40:51.442 --> 00:40:54.045
He used to build HIPAA infrastructure and audit it.

00:40:54.045 --> 00:40:55.235
He did the same thing with banks.

00:40:55.235 --> 00:41:01.097
He built a lot of that infrastructure and audited it, and so when he sees like what our benchmarking looks like, he's terrified.

00:41:01.318 --> 00:41:03.081
I'd say that you know, honestly, I get it.

00:41:03.081 --> 00:41:15.635
I get that there's a hesitancy to share things, but at least the way that I look at it is, the more data that we have and the more that we understand about the human body, the better information we can give you out of that. Like, the more insights we can give you, the better that we can make,

00:41:15.635 --> 00:41:28.041
or that people can make, the actual fabricated custom therapy. And you know, whether it's industries like orthotics and prosthetics or wound care, the state of the art doesn't move forward if people don't share.

00:41:28.041 --> 00:41:40.197
Like, that's really the crux of it, is that if you want to have the best possible experience for your patients and you want the best outcomes, the best way to do it is to collaborate and to be a little bit more open. And, like I said, I think it's a personal thing. Like, that's my philosophy.

00:41:40.197 --> 00:41:46.583
I know not everybody shares it, but you know, a lot of our team does have backgrounds in academia.

00:41:46.704 --> 00:41:48.306
You know we have PhDs and folks on staff.

00:41:48.306 --> 00:41:55.594
We secretly contribute to open source and other things like that.

00:41:55.594 --> 00:41:59.603
That openness is really the thing that pushes the boundaries of what we can accomplish, and so I encourage people to share as much as they're comfortable.

00:41:59.603 --> 00:42:08.307
Being married to little bits of data feels like something that I'm...

00:42:08.307 --> 00:42:09.856
I don't take that mindset, because I know that nobody...

00:42:09.856 --> 00:42:17.518
When your data goes somewhere else, it's very rare that somebody is, like, going through your data to deconstruct your... Like, that level of attention doesn't exist.

00:42:17.518 --> 00:42:20.800
But the more that you can share, I think just the better that the industry is going to get.

00:42:21.070 --> 00:42:26.842
You know it's interesting that you say that, but I've heard some stories from even back in the day, early digital stories.

00:42:26.842 --> 00:42:32.963
The story would be like, hey, if this person ever sends in a socket, we want to see it.

00:42:32.963 --> 00:42:35.670
You know what I'm saying.

00:42:37.295 --> 00:42:38.277
So there's a hesitancy.

00:42:40.690 --> 00:42:42.750
They're like, I want to know what they're doing, just to kind of get a little bit of an idea.

00:42:42.750 --> 00:42:45.579
Even though we're not supposed to look, we're looking.

00:42:46.090 --> 00:42:50.097
Well, I guess my answer to that is be very cautious who you work with.

00:42:50.097 --> 00:42:53.076
Yeah, yeah, just make sure you're not working with bad actors.

00:42:53.076 --> 00:42:58.478
Yes, definitely. Like, there is a lot of that that happens across the world, right?

00:42:58.478 --> 00:42:59.500
It doesn't matter where you are.

00:42:59.500 --> 00:43:01.423
So just be cautious who your partners are.

00:43:01.423 --> 00:43:03.034
Pick people that you like.

00:43:03.034 --> 00:43:04.157
I do this all the time.

00:43:04.157 --> 00:43:06.034
My partner Paolo and I.

00:43:06.034 --> 00:43:07.900
We pick companies to work with that we like.

00:43:08.590 --> 00:43:13.623
You know, I think that's one of the beauties of spinning out on our own is we get to be picky.

00:43:13.623 --> 00:43:15.838
We can be picky and we can choose those things.

00:43:15.838 --> 00:43:20.773
Work with people that you trust.

00:43:20.773 --> 00:43:21.617
I think, is really the crux of it.

00:43:21.637 --> 00:43:28.141
There you go. And then trust that they are going to keep your data safe, and also ask them, like, what happens to my data when it's with you, right?

00:43:28.141 --> 00:43:31.516
Does it... just because it's in some Dropbox somewhere,

00:43:31.516 --> 00:43:32.519
Do you have access to it?

00:43:32.519 --> 00:43:33.282
What do you do with it?

00:43:33.282 --> 00:43:35.172
Like, do you publish a privacy policy?

00:43:35.172 --> 00:43:38.958
Do we know if you're training your people on it or you're training your data sets?

00:43:38.958 --> 00:43:41.342
These are questions that I would ask.

00:43:41.561 --> 00:43:43.905
We have to take this stuff really seriously, right?

00:43:43.905 --> 00:44:02.197
I don't want to end up in a situation where we're audited at Structure and we're not compliant, right? With things like face data in Illinois, right, it gets really strict in areas like that. And so making sure that people's privacy is really protected, making sure your data is secure, making sure nobody has inappropriate access to it internally, is a big deal.

00:44:02.197 --> 00:44:02.637
You know.

00:44:02.637 --> 00:44:03.978
We audit our systems.

00:44:03.978 --> 00:44:20.159
We figure out who can access what; it's all logged on purpose, because, at the end of the day, we want to make sure we have a system that people can trust, and I encourage people to ask the same of whoever they work with across the board, whether it's people who make tools or people who provide them with the fabricated custom therapy or whatever it is.
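
As a rough illustration of the "who can access what, it's all logged" idea, here is a minimal sketch of append-only access auditing; the field names and file-based storage are hypothetical, not a description of Structure's actual system:

```python
# Hypothetical sketch of access auditing: every read of patient data is
# appended to a log that is never edited afterwards, so "who accessed
# what, and when" can be answered later. Not Structure's implementation.
import json
import time
from pathlib import Path

AUDIT_LOG = Path("access_audit.jsonl")  # illustrative append-only log file

def record_access(user_id: str, record_id: str, action: str) -> None:
    """Append one audit entry describing a data access."""
    entry = {
        "timestamp": time.time(),
        "user_id": user_id,
        "record_id": record_id,
        "action": action,  # e.g. "view_scan", "export_mesh"
    }
    with AUDIT_LOG.open("a", encoding="utf-8") as handle:
        handle.write(json.dumps(entry) + "\n")

# Example: log a clinician viewing a scan before the file is served.
record_access("clinician_42", "scan_2024_0017", "view_scan")
```

The design point is simply that access is recorded at the moment it happens, in a place the accessor cannot quietly rewrite, which is what makes an audit meaningful.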

00:44:20.159 --> 00:44:23.664
I would just encourage people to try to make sure you're working with people that you trust.

00:44:24.445 --> 00:44:27.896
That's awesome and a great sentiment, Ravi, and I love what you guys are doing.

00:44:27.896 --> 00:44:36.650
I think it's a really, really great tool, and kind of like a Lego block for people to build software solutions and a big part of their practice on top of, so

00:44:36.650 --> 00:44:38.514
I think it's absolutely fantastic what you guys are up to.

00:44:38.514 --> 00:44:39.717
So thank you so much for being here today.

00:44:39.737 --> 00:44:41.000
I love that you figured that out, Yoris.

00:44:41.000 --> 00:45:00.088
Actually, I will tell you, that's something I only sort of say with our people who are on our board, is, you know, we sort of try to make a Lego platform. Like, you take the pieces you need, whether it's this type that you came to or that.

00:45:00.369 --> 00:45:03.139
Yeah, but it's also, I think, the way, like so many people are doing.

00:45:03.139 --> 00:45:04.295
We have this conversation every time.

00:45:04.295 --> 00:45:15.577
Some people want the one-click workflow, but that's, like, you know... for some people, if you do exactly the right thing for exactly the right people, it may work, but it's going to be super limiting for practitioners.

00:45:15.577 --> 00:45:23.010
But also, you know, the way you set up your workflow internally in your company or whatever, or the way you design, or where it goes, all that kind of stuff.

00:45:23.010 --> 00:45:34.018
So I think that kind of interchangeability is, I think, a key thing that could make it very, very different than if you were just to say, let's capture all the value and make one solution for $19 a month.

00:45:34.018 --> 00:45:34.860
You know that kind of stuff.

00:45:35.190 --> 00:45:37.237
So I'm going to tell you, I hate quoting Bill Gates.

00:45:40.710 --> 00:45:43.463
This is like a please don't throw me, don't just mute me, and thank me for doing this.

00:45:43.463 --> 00:45:43.603
Where's that button?

00:45:43.603 --> 00:45:43.898
Where's that button, uris?

00:45:44.090 --> 00:45:54.114
But in the 90s he said something that's always stuck with me, which is: if you're building an ecosystem or a platform and you're sucking all the air out of the room, it's not an ecosystem or a platform.

00:45:55.536 --> 00:46:08.882
If you build something and you take just a little bit, like a couple points, and everything else is value for everyone else, then you will build something healthy and beautiful, and I think that's really true.

00:46:08.882 --> 00:46:12.304
That's always been the case in my career.

00:46:12.304 --> 00:46:16.586
Any team I've worked on that was like, oh, let's take over and make all the profits,

00:46:16.586 --> 00:46:21.731
It's failed miserably and I didn't want to be a part of it and we were invested.

00:46:21.751 --> 00:46:35.298
Conversely, you know, as we build what we're building, it really is about building value for the clinician and the patient and whoever's building the medical device itself or analyzing that. Like, there needs to be room for all of those players.

00:46:35.298 --> 00:46:39.835
You know we also have to pay our engineers, so we'll charge money for it.

00:46:39.835 --> 00:46:46.775
I think the more that... and our philosophy is, if we can drive a lot of value for a lot of people, we'll be successful.

00:46:46.775 --> 00:46:51.512
If we drive a lot of value for a few people, we're not really that successful.

00:46:51.512 --> 00:46:59.476
And we're doubly unsuccessful if we hold their feet over the fire and try to charge huge amounts of money and really kill them.

00:46:59.476 --> 00:47:00.972
Yeah, I think, uh, that's really.

00:47:01.693 --> 00:47:15.572
I think that's the only way to build a useful platform that's going to do something good for the world. Now, we saw in additive, we saw so many people fall into probably the same trap simultaneously, where they're trying to build like a closed garden really, or walled garden, or actually one of those terrariums.

00:47:15.572 --> 00:47:17.175
You know, those terrariums are all the rage now.

00:47:17.175 --> 00:47:29.632
They have like a glass pot and it's like closed, and they're like the frog in the middle of the terrarium, and then everybody else is moss, and the frog is going to poo all over you and eat you, and that's it, right? And that's an ecosystem, right?

00:47:29.632 --> 00:47:31.498
But it's like it's all about the frog, right?

00:47:31.498 --> 00:47:32.989
Or it's all about the glass thing.

00:47:32.989 --> 00:47:41.545
Keeping you inside, and that's it. They're like... it's more about being sticky, it's more about getting it for them, and it just never works. Or they're just not big enough, right?

00:47:41.545 --> 00:47:43.335
That's the other thing: there's just, like, not enough volume.

00:47:43.978 --> 00:47:45.072
Yeah, I mean, I, I hate to...

00:47:45.072 --> 00:47:58.338
It takes a lot of engineers to build even what we have, and we know we're just a fraction of what's needed for a workflow.

00:47:58.338 --> 00:48:03.963
You know, having 30 engineers or more staffed even for the little bit of it that we do, and so I can't even wrap my head around it.

00:48:03.963 --> 00:48:17.170
You know, I know that a large player entered the ecosystem, and I won't name names, and they invested $20 million, and it just exploded on them because it was so hard to do.

00:48:17.170 --> 00:48:22.061
And that just goes to show you that when you go in and you try to own the entire thing end-to-end, it's just, like, one, nobody wants to work with you, right?

00:48:22.061 --> 00:48:29.699
And two, I think it requires a massive amount of resources to do and it also requires an expertise that I think is really hard.

00:48:29.699 --> 00:48:31.215
There's no organizational focus.

00:48:31.215 --> 00:48:36.454
So, yeah, I don't know why people try to own things end-to-end or suck the air out of the room.

00:48:36.454 --> 00:48:37.275
Good luck to them.

00:48:37.275 --> 00:48:41.802
That's never been successful for me, but maybe somebody else can do that.

00:48:43.684 --> 00:48:43.884
All right.

00:48:43.884 --> 00:48:45.465
Well, thank you so much for everything, Ravi.

00:48:45.465 --> 00:48:50.115
It was wonderful having you on the Prosthetics and Orthotics podcast and yeah, Brent, you enjoyed this as well.

00:48:50.115 --> 00:48:50.577
I know you did.

00:48:50.697 --> 00:48:56.014
Oh, this is good and thanks for getting into the weeds and thank you for being an advocate for our field and bringing awareness to it.

00:48:56.014 --> 00:48:57.155
I think it's super important.

00:48:57.916 --> 00:49:33.518
Look, I think that anybody who has to consistently deal with the way that the world is shifting and treat patients in this world, where you have to navigate insurance codes and everybody makes everything harder for you and there's uncertainty around what will happen with people's Medicare reimbursement... if you have to contend with all of that and you still get up every day and treat patients, I have, like, infinite respect for you, because you're doing something fundamentally good. And if we can do anything that we can from a technology perspective to make your life better and the outcomes for your patients better, we're going to do it.

00:49:33.518 --> 00:49:35.436
So thank you so much for having me.

00:49:35.436 --> 00:49:38.563
I feel really blessed that you guys invited me.

00:49:38.563 --> 00:49:39.849
I'm really grateful, thank you.

00:49:40.871 --> 00:49:42.994
Thank you very much as well, and thank you very much for listening.

00:49:42.994 --> 00:49:43.574
Have a great day.
