AXSChat Podcast

Eyes Through AI: Bridging the Visual World

Antonio Santos, Debra Ruh, Neil Milliken

What does independence truly mean for someone who can't see? This question lies at the heart of Envision's revolutionary approach to assistive technology for blind and low-vision individuals.

Karthik Kannan's journey began with a simple career counseling visit to a school for blind and low-vision students in Chennai, India. While his sighted nephew dreamed of building cities on Mars, the blind students shared far more fundamental aspirations: reading a book independently, visiting the beach alone, or simply living by themselves someday. This stark contrast revealed how much energy people with visual impairments spend overcoming basic hurdles that sighted people take for granted.

That revelation sparked a mission to build a bridge between the visual world and those who cannot access it conventionally. Initially developing a crash-prone mobile app that converted images to speech, Karthik watched in amazement as the blind community persevered through its flaws because of its transformative potential. The app's organic growth led to Envision winning the 2019 Google Play Award for best accessibility experience, opening doors to integrate its technology with Google Glass.

The marriage of AI and wearable technology has proven particularly powerful for accessibility. Where using a smartphone means juggling a cane in one hand and possibly a guide dog's harness in the other, smart glasses provide hands-free access to visual information. Recent advances in conversational AI have further simplified the experience, allowing users to interact naturally through speech rather than learning complicated interfaces.

Though currently priced around $1,800, these glasses are expected to become dramatically more affordable as smart wearables enter the mainstream. The impact has been so profound that many in the blind community place AI accessibility innovations in the same category as the invention of Braille.

Join us to explore how technology can truly serve humanity when designed with inclusion at its core. Have you considered how AI might bridge gaps in your own life or community?

Support the show

Follow axschat on social media.
Bluesky:
Antonio https://bsky.app/profile/akwyz.com

Debra https://bsky.app/profile/debraruh.bsky.social

Neil https://bsky.app/profile/neilmilliken.bsky.social

axschat https://bsky.app/profile/axschat.bsky.social


LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/

Vimeo
https://vimeo.com/akwyz

https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh

Speaker 1:

Hello and welcome to AXSChat. I'm delighted that we're joined today by Karthik Kannan, who is the founder and CTO of a company called Envision. You might have heard of Envision because they're doing stuff with glasses and smart wearables to help people with vision loss and people who are blind. So, Karthik, welcome to AXSChat. Great to have you with us. Can you tell us a little bit about yourself, your company and the things that you're developing, and then we'll develop from there? But really interested to hear about your journey and how you started in this field.

Speaker 2:

So, hey, everybody. Thank you so much, first of all, Neil, Debra and Antonio, for having me here. This has been a long time in the making and I'm really excited to be talking to you all today about all things AI and accessibility. And, like you said, my name is Karthik Kannan. I am the founder and CTO of a company called Envision, and I'm originally from India, and that's where Envision's journey started off as well, eight years ago — which, wow, now that I recall it, feels like yesterday, but it's also forever. It's a long, long time ago.

Speaker 2:

About eight years ago, I was invited to a school for the blind and low vision in my hometown, Chennai, in India. The school's principal was my mother's friend, and she really didn't have a budget for a career counselor, and I was, you know, basically coming there for free, right? So she said, okay, I'd like to invite you over to the school along with my friend, who was an industrial design student. His name is also Karthik and he's also one of the founders of Envision — we both share the same first name. So we went to the school in Chennai, and we went there to talk to these kids about what exactly a designer or a researcher or an engineer does. And I usually just zone out when somebody comes and talks about their life, because it's not often that interesting. So I was like, okay, I'm just going to go there and talk to these really tired kids at the end of the day, we're going to have a 15-minute chat, I'm going to tick a box and I'm going to come back. And I go there and I talk to these kids, and during the conversation I tell them that an engineer or a designer is essentially somebody who solves problems for a living, right? I wake up in the morning, I have a huge to-do list of problems, and if I'm able to get through that to-do list, maybe I make something that's useful for other people. That's basically what an engineer or a designer or a researcher, in a nutshell, does — solve problems for a living. And I asked them: what kind of problems would they like to solve when they grew up? When they left school, what kind of problems would they like to solve in their lifetimes?

Speaker 2:

And the interesting bit was, I had asked the same question to my nephew the night before. He's a 13-year-old kid, and the answers he gave me were the usual outlandish things that young kids say when they're not really encumbered by the realities of life. He said, oh, I would like to build a city on Mars, and I would like to build a spaceship that would travel around the sun and come back in, like, half a day, and I would visit all these different planets. He had this huge canvas to paint on, right? His imagination was completely unbridled, and I was expecting very similar answers from these kids. I was expecting them to say, I'd like to cure cancer, or I'd like to go to Mars, and stuff like that. But the answers these kids gave were completely different. They said they would like to go out independently, they would like to read a book by themselves, they would like to go and visit the beach on a sunny day by themselves, and they hoped to someday live by themselves. That was very eye-opening for me, because I did not expect them to give me answers about things that sighted people stop having to think about after the age of 15 or 20 — things you can do by yourself, completely independently, without the need for somebody else to assist you. But I realized that day that a lot of these kids would spend a huge chunk of their lives basically trying to overcome these hurdles, and that's what their energy is going to be spent on. And so that was the starting point for Envision.

Speaker 2:

And when we started to think about what independence means to people who are blind or low vision, we realized that independence is access to information. Information is all around us, but at the same time — a reality check for us sighted folks — we can't really put a braille display on every single bit of visual information that's out there, right? So we need to build a bridge between the visual world and the world of a blind or low vision person, and that is AI. And so, when we started looking into AI, we started to see that for a lot of tasks, AI was starting to get better than humans, even about eight years ago, when AI was not a mainstream thing. Like, 2017 was the first year AI got better than humans at recognizing faces; 2018 was the year AI started getting better than humans at reading and recognizing pieces of text. And so we just extrapolated and hoped that this technology would continue to evolve and could be a bridge between the sighted world and the world of a blind or low vision person. And so that's when we started Envision.

Speaker 2:

It was almost an accident. It was never meant to be a company. I was in between jobs, my co-founder was doing his master's thesis, and the two of us thought we'll build an app for fun, we'll put it out there, and after six months we're going to shut it down and move on. It was going to be an experiment. But within about three or four months, this started to snowball into one of the more popular apps for blind and low vision people in Europe, and we didn't even do any bit of marketing. It was not even available on the App Store or Play Store. We just had a beta link, and people had to jump through multiple hoops in order to install the app, but it just started to take off organically. The blind and low vision community that was using the app at that time, I would say, in a very nice way, goaded us into starting Envision as a company, and so that's what we've been doing for the last eight years.

Speaker 1:

The application — what were the sort of things that it was doing? Because we know that the disability community are really very much prepared to jump through those hoops that you just described if they know that something's going to give them utility or, as you say, access to independence. So, you know, we're not worried if it's not on the App Store, we'll persevere with stuff. But what were the things that you were offering in the early days in terms of the app — and you now also produce hardware, so maybe we'll get onto that in a bit — but what were the things that really helped grow the app organically?

Speaker 2:

Well, I think it was the fact that, despite the app being a not-so-robust prototype — I remember we used to have about a 50% crash rate on the app.

Speaker 2:

So every time you opened the app it was almost like a Hail Mary — it works or it doesn't work, as good as a coin toss. That's how it was, but people still persevered, because what it did was give people an option to take pictures of things around them, and it would convert those pictures into speech. So you could take a picture of a piece of text and the app would read it out to you. You could take a picture of a scene and the app would tell you what it was in a single line, right? That's what the app did in the early days. Eventually, the app did get better, and we started to include being able to read text, recognize faces, recognize objects, recognize personal belongings — the app did about half a dozen things in the early days. And go on, Debra.
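What Karthik describes here is essentially a picture-to-speech pipeline: optical character recognition feeding a text-to-speech engine. The sketch below is a minimal, hypothetical illustration of that idea — it is not Envision's code, and it assumes the open-source libraries pytesseract (for OCR) and pyttsx3 (for offline speech) purely as stand-ins for whatever models the app actually used.

```python
# Minimal sketch of a picture-to-speech flow, assuming pytesseract and pyttsx3
# as stand-in components; Envision's actual models and pipeline are not shown here.
from PIL import Image
import pytesseract
import pyttsx3


def read_image_aloud(image_path: str) -> str:
    """Extract printed text from a photo and speak it out loud."""
    text = pytesseract.image_to_string(Image.open(image_path)).strip()
    if not text:
        text = "No readable text was found in this picture."

    engine = pyttsx3.init()  # local text-to-speech engine
    engine.say(text)
    engine.runAndWait()
    return text


if __name__ == "__main__":
    # "label.jpg" is a hypothetical photo the user has just taken.
    read_image_aloud("label.jpg")
```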

Speaker 3:

Well, no, I just really appreciate the story and how the story unfolded, and I also really appreciate the questions that you were asking the children. That was very powerful, what you said, because whenever you're excluded from society, like people with disabilities are, your dreams maybe are going to be a little bit different from people that think, well, the world was made for me, I get to be included in everything. And that's why I think it's a little ridiculous that we do it this way in society. But I also know that we're afraid of AI, because AI is going to kill us all, for sure. But it sounds to me like you're actually embracing AI, to see how AI and technology could be used to enhance an individual's life, which we think is a really good idea. And I'm just going to say — this is my opinion.

Speaker 3:

I don't know if Google would agree with me, but I had some people at Google tell me one time that Google was really embarrassed about how the smart glasses failed. It was just something that they considered a failure, and I said, I really hope that's not true, because actually we have to go out and try new technology. We've got to figure it out, and I am grateful that Google did this, because it gave us the ability to learn from it and say, well, what else could we do? Oh, that's interesting. And Apple — yay, good for you, Apple, jumping on — but I am always going to be grateful to Google for starting it.

Speaker 3:

So I was just wondering, based on those things, and really at a time when our community feels so attacked and that maybe we're going to be excluded even more: what do you hope for AI and for your smart glasses? Because all we're really asking is to be included, and it seems like AI and some of the directions you're going in could really change what it means, maybe, to be human and to interact with AI. So sorry for the long question, yeah.

Speaker 2:

So, as the Envision app kept getting more mature over time, the biggest thing that we kept hearing was: it's great to have all of this on a phone, but we would love to actually have it on some kind of a wearable. Because, like you said, phones have been around for quite some time, but these phones are again built for people who are sighted. They're able to use their phones more easily, they're able to point at things — let's say you want to take a picture of a bird sitting in a tree, you know exactly where to point your phone, you take a picture with it, and it feels like such a natural experience. Whereas people who are blind or low vision have a cane in one hand and a guide dog in the other, and they're pointing their phones around, and that was not a great experience. And so, you know, Envision won the Google Play Award in 2019 for the best accessibility experience, and I milked that award.

Speaker 2:

I went to California to meet with the folks at Google and I just carried that award around like a baby, trying to get into the hallways of Google and use it for credibility — really milk it — and say, hey, I know you guys have built smart glasses in the past and I know it wasn't really a success at that time. It was a little too ahead of its time, and it didn't have people building things for it. I know that probably in this giant building somewhere someone's building glasses, and I would love to talk to that person and show them what those glasses can mean for people who are blind or low vision, what those glasses can mean for someone who has accessibility needs.

Speaker 2:

After jumping through a few hoops, I was in the room with the people who were working on the next iteration of the Google Glass, and they gave me about eight NDAs to sign. To this date I still don't know exactly what was in those NDAs — I probably signed my life away to Google, I don't know. But in return they let me walk away with three pairs of glasses, and at that time Google's main idea was to sell these glasses to companies like DHL and FedEx, because their workers scan packages with phones and handheld devices all day, and they thought, okay, if we give them these glasses, they can just scan packages easily with that.

Speaker 2:

And here I am telling them, that's a great idea, but there are also tens of thousands of people out there who would use this to change their lives. And so that's how we got access to them. And I think, in a way, smart glasses and AI are starting to really open up the world for people who are blind or low vision, and for people with other disabilities in general. Apart from Envision, I know another company that's using these glasses, for example, to help people who have motor impairments navigate their electric wheelchairs — just using their head movements, they're able to navigate these wheelchairs, right? So I know companies that are doing a lot of work in this space as well.

Speaker 4:

No, this is really interesting, and I was looking at some of the comments made by Debra and yourself on Google, because I was in a group on Google+ when Google Glass was launched, and at the time there was a good number of people saying: please make this work for people who are visually impaired — this is incredible for people who are visually impaired, please consider that. Unfortunately, Google+ no longer exists, but that was something: when the product was launched, there was a good number of people asking Google to do exactly that. So it's interesting to see how this all evolved. But I've been working with some organizations that are working with Sony, and also working with Apple, to bring the glasses to industrial environments, to factories, to places where employees can interact with virtual devices within the factory environment, or even to use the glasses for instructional activities — for example, there's a switchboard, and they can use the glasses' visualization to show them how to fix that switchboard. So it's quite a fantastic opportunity, but I'm also interested to know how we can scale this and democratize access to the glasses for everyone. The mobile phone is something we can say is accessible, in some ways, to a large part of the population, but the glasses are not that much. So how can we change that?

Speaker 2:

I think it is going to change. One of the really interesting trends I have seen — very heartening, positive trends over the last few years — is that smart glasses are becoming more and more mainstream. Of course, there is a whole debate around privacy, and I also feel like wearables coming into our lives means that we have to get used to a different kind of reality, because wearables are a lot more present and can blend into our environments more easily than smartphones. It's not a very distinct rectangle that we carry around and that people know we're using — smart glasses can be used quite discreetly. So there are a lot of those societal things that we have to contend with. But the flip side is that, because these glasses are going to be more mainstream, I definitely think the prices are going to come down dramatically in the next few years — and I'm talking about the next three to five years. It's going to happen so quickly that blind or low vision people, instead of having to pay thousands of dollars for access to a pair of Envision glasses — today about $1,800 for a pair — will probably pay maybe a few hundred dollars in a few years' time, and that's basically where we're taking this.

Speaker 2:

And as a company, we're also rethinking how we do our business. How do we fund ourselves, how do we build a more sustainable business model in that new ecosystem? That's a challenge, but it's a challenge that I'm gladly willing to take on, because it means more people are going to have access — like what Debra was saying earlier as well. We see smart glasses as a two-way thing: they can be used for bad, but they can also be used for good. And it's the same thing with AI, right? In the early days, I used to joke that we're pretty much using the same technology that autonomous drones use to target military bases — we're using the same technology to help a blind or low vision person navigate an environment more independently. It's all two sides of the same coin, and I hope we continue to use AI and smart glasses more for the good.

So how can smart glasses and prescription glasses come together — how do you see that happening?

Speaker 2:

Um, so even now, with the current version of the Envision glasses that we have on the Google Glass, we sell specific frames that have inserts in them, so you can actually take them over to an optometrist and have them changed.

Speaker 2:

And that's also the same model that I'm hearing other companies who are working on mainstream glasses adopt, right? So the first step in smart glasses is going to be just standalone glasses with cameras on them, maybe, and speakers and microphones, and being able to connect to the internet. And eventually — I think within this decade — we probably will have glasses that are also able to project things into your environment, what they call augmented reality, so you could have, like, a puppet dancing on your table with those glasses. That is a little bit down the line, and when that comes, I'm sure there are going to be other options to put in prescription lenses. But the way it works right now, if you're using one of our glasses, you can just take them to an optometrist, swap out the lenses, and then you can have your own prescription lenses or dark lenses on them — and that's also the model that other mainstream smart glasses manufacturers like Meta or Google are going to be following.

Speaker 3:

Well, and so you were saying it's $1,000 — but that's amazing, that I could get a pair of glasses that could really change my life. So I just want to say, I know at $1,000 we'll want to bring the price down, but it also seems to me like this: you're going to have to do the same thing Google and everybody else was trying to do. We get these, and we need people to program stuff. And I know we've got AI — we're only feeding AI.

Speaker 3:

So am I correct that around these glasses comes all this other stuff? And I'm thinking about this — just in case I'm doing a poor job of explaining myself, which it sounds like in my head — I'm thinking of my daughter with Down syndrome. My daughter is 37 years old with Down syndrome, and I'm imagining her being able to wear the glasses and have them say, now, remember — reminding her how to brush her teeth, because she forgets steps along the way. So I just think there are so many ways we'll be able to use these glasses to help assist humanity, which is what technology is for. And I know we use it for some weird, horrible things that I don't like, but at the same time: why is technology not created for humans at every stage of our life? I still find it very illogical that society does that when we know how not to do that.

Speaker 3:

So I'm really, really hopeful about what you're doing. We've been watching it with interest. I think it's going to solve a lot of problems. I think it makes people nervous right now, thinking of embedding something in our bodies — I know you can embed things and use them to pay, but that really freaks some people out, because we think of all those stupid dystopian movies we watch. But I like what you're doing, because we finally made it okay to wear glasses. At first, when we started wearing glasses, we made fun of each other — oh, four eyes. Well, now we're like, oh, you protect these beautiful eyes. So I'm just wondering, where do you think we're going to go with all this, and how do we help you?

Speaker 2:

You know, building more representation in these companies and supporting companies that are building in this space, right? I think there are two parts, and the first one is the advocacy part, where you are trying to constantly advocate for accessibility at every level. That's a very, very important step. I believe in advocacy as much as I believe in technology, because it's humans who are going to drive this change as much as technology — in fact, I would say humans can do more here than technology ever can. Because building a product is one thing, but can we have the people at these companies that are building products also care about accessibility? Accessibility should not be an afterthought, it should be baked into the process from the beginning. So that's one thing, right.

Speaker 2:

And the second thing is, I think we need more companies like Envision that are focused on accessibility. One of the things that we try to do, simply by making sure we exist in this market and trying our best to thrive in it, is to set an example to other startups that are shying away from accessibility and say: no, you can do good and you can survive as a business.

Speaker 2:

You can make money as a business, and that's an important signal to send to people — to young, imaginative entrepreneurs. Instead of just going and building another 10-minute grocery delivery app, use those brains to build something that might actually benefit humanity, and know that in the process you can succeed. And I think we're trying to do that as much as we can from our end. And just being able to find support amongst people in the accessibility community for companies like Envision — that would mean a lot, because then we would get access to important partners, we would be able to sustain ourselves and make a difference, right?

Speaker 1:

Yeah. So I mean, one of the things — and I'm fully behind you in your encouragement of others to move into this industry — one of the things I constantly talk about is making good money from doing good things. So we need to make that happen, right, as an accessibility industry, but also industry as a whole needs to be able to profit from doing the good things rather than the bad things. Let's flip that model on its head — it's like societal value generates shareholder value, right? So I think that's important. A couple of questions have come up, and Debra touched on one already: you've got the functionality in the glasses, but are you also working on hosting an app ecosystem within the glasses? And then the other thing that I'm also interested in is things like AI and proximity, and what we can do with some of the other wearables in terms of creating greater environmental awareness — I mean the personal environment, not sustainability, forests, etc., in this particular instance.

Speaker 2:

Could you maybe elaborate a bit more on your first question?

Speaker 1:

Right. So you've got the glasses and they're able to give you information about your environment, but do you have specific apps within your Envision glasses ecosystem? Are other developers able to make stuff for your platform?

Speaker 2:

Yeah. So yes, we do actually partner actively with other companies, because at Envision we always believe in looking at other companies in the space not as competition but as complementary companies, and we always find a business model that kind of works for the two of us. So we do have a couple of partners. For example, we have a company called Aira — what Aira does is run a 24/7 call center for people who are blind or low vision, staffed with trained agents, and blind or low vision people can call those agents and get help from someone who's trained to assist a blind or low vision person. And today we are able to recognize over 90 different currencies, and that's simply because we have partnered with a company called Cash Reader, another popular company in the blind and low vision ecosystem.

Speaker 2:

And eventually, we want to try and partner with more and more companies in this space. We're talking to a few more, some of them working on navigation. We have partnered with a company that builds accessible QR codes, and a lot of Unilever products in Britain today, for example, come with these accessible QR codes that a blind or low vision person can scan, even from a distance, in order to identify what that product is. So we've been doing that kind of work, and we'll continue to do more of it. And I think one of the really interesting things — just to touch upon your second question — is that we're seeing a very different kind of AI coming up today. The AI that we used to work with in the past was made up of discrete AIs: there would be an AI that would recognize text, there would be an AI that would recognize obstacles, and there would be an AI that would understand language. But today, especially over the last couple of years, we're seeing ChatGPT and those kinds of AIs evolve, where you can just talk to it and it's able to understand what you're looking for and give you a response.

Speaker 2:

And I think that kind of AI is extremely powerful, and that's going to help people get a better sense of their environment more easily, because all of a sudden the traditional way of interacting with a computer is gone. Traditionally with software, we know, okay, we have a cursor and there is a window, and every app has a different layout, and we have to kind of memorize that layout in our heads and how to use it: this button does this; if I need to add somebody to this call, I click this button and do all these steps. That's basically the way software has evolved. But today I can simply say, hey, can you do this for me, ChatGPT? And ChatGPT will actually give you a response. It will search the web, it will do all of those things, and that's what we're leveraging at Envision as the next step.

Speaker 2:

Right, just basically saying, okay, we've got all these powerful AI models that can interact with the environment, and we're encapsulating them in a nice conversational interface, so that even someone who has never used a smartphone in their life can use it. We have an 85-year-old at Envision who is a very active user of the glasses. She has never used a smartphone, and when we showed her this new conversational interface, which we call Ally, she just put on the glasses and started having a conversation with it like she was talking to a neighbor. She was able to get her prescription bottle read, she was able to ask who was in front of her, she was able to just do this back and forth. And I think that's basically where we think the future of recognizing what's around us is heading — really understanding the environment and putting it all into a nice little interface that people can talk to, versus having to memorize all these different things. It's quite incredible.
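The pattern Karthik describes — wrapping vision models in a conversational layer — can be sketched in a few lines. The example below is not Envision's Ally; it is a hypothetical illustration that assumes the OpenAI Python SDK and a vision-capable chat model as one possible backend, with a saved photo standing in for a frame from the glasses' camera.

```python
# Hypothetical sketch of a "talk to what the camera sees" interaction.
# Assumes the OpenAI Python SDK (openai>=1.0) and an OPENAI_API_KEY in the
# environment; the model name is an assumption, not Envision's actual stack.
import base64
from openai import OpenAI

client = OpenAI()


def ask_about_image(question: str, image_path: str) -> str:
    """Send a plain-speech question plus a camera frame to a vision chat model."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed vision-capable model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


# Example: the kind of question a wearer might ask out loud.
print(ask_about_image("What does this prescription bottle say?", "frame.jpg"))
```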

Speaker 4:

I would like to look at one aspect. I'm sure you talk every day with many users who use the app, and you get a lot of feedback from the community, and sometimes we see in the market some resistance to AI. From your observations, can you tell us how people with disabilities look at and see AI? What perspectives and feedback are you getting from users on the benefits of AI?

Speaker 2:

I think a lot of people see this as one of the most transformational aspects of their lives. I was talking to somebody the other day, and he was telling me how, in the blind and low vision community, the way they see the timeline is: you've got the invention of Braille, and then you've got AI coming into accessibility. They mention these two events in the same breath — that's how transformational AI has been. Which is why, apart from being advocates for AI for accessibility, they're also massive advocates for open source AI — AI that is not controlled by one company but is available openly for companies to remix and reuse, because that's so powerful. Just democratizing access to this tool makes a massive difference to their lives.

Speaker 2:

Apart from Envision, people are using other AI models — the more tech-savvy blind and low vision people, at least.

Speaker 2:

Right, they're using those AI models on their laptops, on their phones.

Speaker 2:

They go through all these different steps to put a very state-of-the-art, open-source AI model on their phones, and they're interacting with it for free. That's transformative for people who are blind or low vision, or people who just want access to AI to improve their lives in some way — definitely a transformational thing.

Speaker 2:

You know, along the same lines as some of the more transformational technologies that have come before, and I hope it stays that way. We want to continue to contribute to this open source AI. I want to see open source AI thrive, simply because it means we can take it and give it to people who are both disabled and, you know, economically not at an advantage to be able to use the latest ChatGPT — pay 20 bucks for it or pay $200 for it. That shouldn't be the case. It should be open, it should always be available for free, and if people want to go through the hoops of getting access to the open source, they should be able to.
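As a concrete illustration of the "open-source model on your own device" idea mentioned above, the snippet below loads a freely available image-captioning model with the Hugging Face transformers library. The specific model name is just one public example chosen for the sketch, not a recommendation from the guest, and on a phone this would typically run through an on-device runtime rather than plain Python.

```python
# Hedged sketch: describe a photo locally with an open-source model.
# Assumes `pip install transformers pillow torch`; the model name is one example.
from transformers import pipeline

# Downloads the model once, then runs entirely on the local machine.
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

result = captioner("street_scene.jpg")   # hypothetical photo from a phone camera
print(result[0]["generated_text"])       # one-line description of the scene
```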

Speaker 1:

Yeah, thank you. This has been a fascinating conversation and I'd love to continue it at a later date. We've unfortunately reached the end of our time today, so I need to thank MyClearText for helping keep us captioned and accessible, and our friends at Amazon for helping keep us on air, and, of course, yourself, Karthik, for coming on and being a wonderful guest. So look forward to continuing the conversation when we have our Q&A on Bluesky. So thank you once again for being with us.

Speaker 2:

Yeah, thank you so much. I really appreciate you all having me here, and, yeah, this has been really fascinating. I wish the conversation was longer, but I'm glad to come back anytime, and I'm looking forward to the chat.

Speaker 3:

We really enjoyed you. Thank you.
