AXSChat Podcast
Podcast by Antonio Santos, Debra Ruh, Neil Milliken: Connecting Accessibility, Disability, and Technology
Welcome to a vibrant community where we explore accessibility, disability, assistive technology, diversity, and the future of work. Hosted by Antonio Santos, Debra Ruh, and Neil Milliken, our open online community is committed to crafting an inclusive world for everyone.
Accessibility for All: Our Mission
Believing firmly that accessibility is not just a feature but a right, we leverage the transformative power of social media to foster connections, promote in-depth discussions, and spread vital knowledge about groundbreaking work in access and inclusion.
Weekly Engagements: Interviews, Twitter Chats, and More
Join us for compelling weekly interviews with innovative minds who are making strides in assistive technology. Participate in Twitter chats with contributors dedicated to forging a more inclusive world, enabling greater societal participation for individuals with disabilities.
Diverse Topics: Encouraging Participation and Voice
Our conversations span an array of subjects linked to accessibility, from technology innovations to diverse work environments. Your voice matters! Engage with us by tweeting using the hashtag #axschat and be part of the movement that champions accessibility and inclusivity for all.
Be Part of the Future: Subscribe Today
We invite you to join us in this vital dialogue on accessibility, disability, assistive technology, and the future of diverse work environments. Subscribe today to stay updated on the latest insights and be part of a community that's shaping the future inclusively.
Turning Any Webcam Into An Accessibility Tool For Work And Games
What if a simple webcam could unlock your computer and games without touching a mouse? We sit down with SensePilot co-founder Mike Hazlewood to unpack how head tracking and facial gestures become fast, precise inputs for everyday work and high-stakes play. Built for Windows and running entirely on-device, SensePilot keeps latency low, privacy intact, and enterprise approvals realistic—no cloud uploads, no special hardware.
Mike traces the journey from a 2024 hackathon to a 2025 launch, where a bold idea met real-world testing. A friend with a spinal cord injury wanted to play Call of Duty again; designing for that level of precision made everything else—from Excel to email—more usable. Collaborations with SpecialEffect in the UK and a Ukrainian NGO supporting veterans revealed just how varied needs are, from ALS and muscular dystrophy to RSI and carpal tunnel. That diversity drove SensePilot’s granular approach: tune trigger strengths, build unique profiles for desktop vs. gaming, and even switch profiles inside a single title for driving, flying, or on-foot movement.
We also dig into the bigger picture of accessible technology and AI. On-device processing lowers security barriers and keeps assistive tools resilient when networks fail. Thoughtful AI support can speed text input and streamline workflows without replacing human judgment. The key is specificity—narrow, task-focused agents outperform generic models for accessibility testing and coding, while keeping the person’s intent front and center.
Looking ahead, Mike shares a vision for mainstream inclusion: optional head-tracking onboarding inside games like Microsoft Flight Simulator, letting anyone try hands-free immersion with one click. No wearables, no extra gear—just a webcam and curiosity. If accessible input becomes a standard feature, everyone wins: gamers gain immersion, and people with disabilities gain flexible, independent control.
If this resonates, subscribe, share with a friend, and leave a review. Curious to try hands-free control? Grab the free trial at sensepilot.tech and tell us which game or task you’ll tackle first.
Follow AXSChat on social media.
Bluesky:
Antonio https://bsky.app/profile/akwyz.com
Debra https://bsky.app/profile/debraruh.bsky.social
Neil https://bsky.app/profile/neilmilliken.bsky.social
axschat https://bsky.app/profile/axschat.bsky.social
LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/
https://www.linkedin.com/in/neilmilliken/
Vimeo
https://vimeo.com/akwyz
X (Twitter)
https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh
Meet Mike And SensePilot
Neil Milliken: Hello and welcome to AXSChat. I'm really delighted that we're joined today by Mike Hazlewood. Mike is one of the founders of a small company called SensePilot. I had the pleasure of meeting Mike and interviewing him for a fireside chat at the Zero Project conference in Vienna the other week, and I'm really impressed by the work that Mike's doing. So as a consequence, I thought I'd invite him on to tell the audience about SensePilot. So welcome, Mike. Can you tell us a little bit about your journey, what SensePilot is, and we'll take it from there?
From Hackathon To Prototype
Gaming As The Benchmark For Access
Real-World Testing And Partners
Beta, Users, And Conditions
Launch, Team, And Funding
Mike Hazlewood: Yeah, certainly. Thank you for having me. So what SensePilot is and does: it's software that turns a webcam into an accessibility tool. We're picking up a live webcam feed of the face and mapping out the head movements and facial gestures, and then you can map all of the different facial gestures to different mouse clicks and keyboard outputs. What that allows you to do is control your computer, but also do things like video games. So we've worked a lot on stabilizing it: how can you aim accurately in a first-person shooter, and things like that. We started on this at a hackathon in May 2024. Of note, this is a family company; co-founder Linus is my brother-in-law. So we started at this hackathon just trying to see what was possible. Linus's previous experience was in the AAC space, augmentative and alternative communication, so we'd worked a lot with different eye gaze systems and different head tracking systems, and we had an idea and wanted to see what was possible. In that hackathon, we came together and brought in other people we could get. My wife, Linus's sister, is also a software developer, and their cousin has a background in social media and marketing. So we built a very small little team, just to build a mini company, see what was possible, and equally see if we liked talking to each other in a more professional context versus a familial context. We built a rough prototype, built a website, and just started to investigate the problem space. Is there a need for this sort of thing? How would people use it? How would we get it to people? Those sorts of ideas. After we finished that, we started working with a friend who had suffered a spinal cord injury, so his fine motor skills are impacted. He uses a trackball mouse for work, so he was able to use that for navigating the computer.
But his main complaint was: after work hours, I've played with every single sensitivity setting; I can't play Call of Duty. And that was the exact use case. So we looked at that and said, okay, if we design around gaming, getting him playing Call of Duty competitively, that's one of the most difficult things you can do, but it benefits everything else on the computer. If you're using the head tracking controls for browsing the web, for checking your emails, for Excel or anything else, all of that benefits from what you need for gaming, where you need accuracy, stability, and responsiveness. After that, we began working with a Ukrainian NGO with soldiers who'd lost limbs as a result of the war, trying to understand how they use this and what they use it for. It was also quite a good test for a software system, because logistically, getting hardware assistive technology to them is a challenge at the moment. Essentially, if they've got access to the internet and a tablet, we can send them a download; they can get into it, access it, try it, and we can learn: how are they setting it up? What other things do we need to add? We then began working with SpecialEffect here in the UK, a charity that focuses on gaming for people with physical disabilities, because of their knowledge and experience of all the different switches, joysticks, and all the different permutations of assistive tech in use, which was challenging for us as a young startup to accrue. And they were fantastic. We'd send them drawings, we'd send them very rough designs, and their focus was: design around independence. There may be some outside interaction needed.
Try to minimize that, and make it so that somebody can still navigate the user interface. Try to design around certain elements: if you imagine your sensitivity settings are wrong, how do you get yourself out of that? How do you set everything up for yourself? So we focused on that as a core premise. That takes us to September 2024, when we started the company. We'd been speaking with occupational therapists and end users throughout all of this. In 2025, we launched our beta test, which was 80 users, a mixture of end users and occupational therapists located the world over. Obviously, the things an end user needs are different from what an occupational therapist who's doing an assessment needs; they form part of the decision-making of how to choose an access method or an assistive technology. So, people located the world over, with a wide variety of conditions. Among those who were happy to share with us, we had spinal cord injuries, ALS, muscular dystrophy, but also people with carpal tunnel syndrome, arthritis, and repetitive strain injuries. They typically wouldn't be looking for these sorts of control systems; they'd be geared more towards hardware, like a more ergonomic mouse. So it's trying to understand: can you do whatever you want to do on the computer, work, play, whatever? And what control inputs does everyone need? If somebody has uncontrolled or quite dystonic movements, what control mechanisms do they need to be able to do whatever they want to do? What games do they want to play? What do they want to do for controlling the wider computer? So it was taking that feedback, iterating on it, and understanding: okay, we need to add a few other control mechanisms, a few other things. And then we were able to launch in May 2025.
So we're very new. We're still finding our feet and seeing where we can get to. It's the two of us now, myself and Linus. The other two unfortunately had to go back to day jobs; the help and input they gave is very much appreciated, but it's understandable, because running a startup is all-consuming. So that takes us to where we are. We've been primarily grant-funded, and we've been trying to get the tech out to as many people who need it as possible.
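The gesture-to-input mapping Mike describes can be sketched roughly as follows. This is purely illustrative, since SensePilot's implementation is not public: the gesture names, the `GestureBinding` structure, and the threshold check are all assumptions, but they show the general pattern of facial gestures with tunable trigger strengths mapped to mouse and keyboard outputs.

```python
# Illustrative sketch only; SensePilot's actual internals are not public.
# The idea: each facial gesture carries a tunable trigger strength, and
# an action fires when that gesture's measured intensity crosses it.
from dataclasses import dataclass


@dataclass
class GestureBinding:
    action: str              # e.g. "left_click" or "key:W" (invented names)
    trigger_strength: float  # 0.0-1.0; fire when intensity reaches this


def fired_actions(bindings: dict[str, GestureBinding],
                  intensities: dict[str, float]) -> list[str]:
    """Return the actions whose gesture intensity crosses its threshold."""
    return [b.action
            for name, b in bindings.items()
            if intensities.get(name, 0.0) >= b.trigger_strength]


# A hypothetical desktop profile: a small smile left-clicks,
# a stronger brow raise right-clicks.
profile = {
    "smile":      GestureBinding("left_click", trigger_strength=0.3),
    "brow_raise": GestureBinding("right_click", trigger_strength=0.6),
}

print(fired_actions(profile, {"smile": 0.45, "brow_raise": 0.2}))
# -> ['left_click']
```

Lowering a trigger strength is what makes very small movements usable as inputs, which matters for the users with limited movement Mike mentions later.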
Speaker: Yeah.
Mike Hazlewood: But also trying to figure out how else we can use it, so looking at different use cases and things.
Speaker: Mike, I know that Antonio has a question. Go ahead, Antonio.
Platform Choices And On-Device Privacy
Antonio Vieira Santos: How have you navigated the complexity of the systems we have to deal with? You know, there are old computers, new computers, different operating systems, external cameras, built-in cameras, some of them receiving recent updates, others not being updated for years. How do you navigate that complexity to make it happen?
Mike Hazlewood: Yeah, certainly. So when we started, we were building for Windows and macOS, but we decided that actually we've got pretty limited resources, so we need to focus somewhere. If we're looking at gaming, most of that happens on PCs within the Windows environment, and that represents the bulk of the personal computing market. So that was: okay, let's just focus on Windows for now. We are getting requests to put it onto macOS. macOS does have some similar head tracking and facial gesture controls built in, but people who have tested those out are still asking us, can you build something, basically? We may come to that, but we were looking at it as: if we build on two operating systems with the resources we have, we're essentially going to be slowing down our own output. If we want to push a new feature update, speech recognition or something like that, we'd then have to figure out how to get it across both simultaneously. So it was more: let's focus on and navigate one specific operating system first. We have been testing on Windows 10 and 11 just to make sure, because there are still a lot of Windows 10 devices out there, even with support declining; I think support ended in October last year. So we're just making sure everything we do still works on those systems. I should add as well that everything we do is local on the computer. All of the facial recognition happens locally on the person's computer, so there's no cloud connectivity needed. That really helps the device itself because it's quite lightweight on CPU usage, but it also means it's more responsive and great for data privacy. And if you're using it as an AAC access method, you shouldn't be relying on the internet for your voice.
Immersion, Games, And Built-In Use
Debra Ruh: Sorry, I was caught up. I'm fascinated with this conversation, because, as I mentioned off-air, I just posted on LinkedIn: do we need to focus on people with disabilities now that we have the ability to reduce any kind of gaps or accessibility needs with AI? The technology is expanding so quickly that it is really supporting our community as well. Just grounding it a little bit: I have ADHD, I'm neurodiverse, and in school, when I didn't know I had these, I was still considered sort of a problem student who didn't pay attention and wasn't applying myself, because (shut up, Neil) I was a little disruptive in class. But that is actually advantageous as an entrepreneur, as you've learned, right? Being this intense. And I am so fascinated with this new idea that we now have the technology to solve it. So do you think, with gaming environments becoming more immersive, SensePilot technology could eventually bridge physical and virtual navigation, really expanding inclusion in both worlds? The gaming world is a gigantic world. I don't consider myself a gamer, except I've always been having fun gaming on social media, in a way; so maybe a lot of this is gaming, but I'm just curious. It feels to me like that's where you're going. And I do hope that brands listening to this understand that investing in groups that are trying to do this for our community is critical. Grants are great, but I'm also hoping some of the brands listening are hearing the funding opportunities here. Over to you, Mike.
Mike Hazlewood: Yeah, certainly. We've been testing this with lots of different games, which means we've found a way to play games at work; we've given ourselves an excuse. But we've also found that even though we can use physical joysticks, just the standard joysticks, when you combine them with the head tracking, it makes everything that little bit more immersive. And especially with no wearables: you don't have to get a separate VR headset, it's just ready to go. So we are trying to see how else we can use this. Obviously, it's designed so everyone can just get in there, have a play with it, and see how it works. The idea is: could it be built in within a game? If we take Microsoft Flight Simulator as an example, that game is designed to be a really immersive experience. You're sitting in the cockpit, and there are all sorts of buttons and things around you. But imagine that during onboarding, it says: hey, do you want to try this? We just need access to your camera. Don't worry, everything stays on your computer. And you give it a go and think, actually, this is quite cool. I can look out the window, I can see these buttons down here; the camera moves accordingly. So we're looking at some of those, really to help with mainstream uptake of these sorts of things, for immersivity purposes. And equally, with the hands-free control, obviously not everyone's going to be using that. Not everyone wants to fly a plane by raising their eyebrows, or things like that. So how do you work with the studios to say: okay, this is something we want to do for both aspects? Having it built in within games is something we're trying to chase up and looking to get into.
And equally, similarly, to have it embedded within AAC systems as well.
Enterprise Approval And Cost
Neil Milliken: Yeah. You mentioned before, Mike, that it's on-device and it doesn't send stuff to the cloud. One of the things I've had significant trouble with in enterprises is getting assistive tech approved through security and everything else. The fact that it doesn't call out to the cloud and isn't an information security risk makes it much easier to deploy this stuff. So hopefully that means it's going to be something that organizations are able to deploy quicker, because corporate processes for approval are pretty slow. And I think that the fact, as you said, that it doesn't rely on an internet connection in order to work is really super important as well. Because over the last seven or eight years, we've had an awful lot of assistive tech that is cloud-based, so if your connection goes down, your AT goes down. One of the fundamentals of assistive tech is that it should be robust, but cloud-based assistive tech isn't. And we're seeing a shift now back towards (not backwards, but back towards) on-device processing and so on, because it gives that continuity of service that you wouldn't otherwise get. So I think that's something that's really important. And I just like the fact that you're doing this with bog-standard hardware, right? Because the hardware that's normally used for head tracking, eye tracking, and everything else is phenomenally expensive. So what you're doing is democratizing assistive tech.
Mike Hazlewood: Yeah. The whole ethos is that you can use whatever device you have: whatever you can get at Currys, Best Buy, MediaMarkt, wherever you procure your standard tech equipment from. That's what you can just use. So yeah, that's the beauty of software-based systems; it allows you to get out to places a little bit easier.
Antonio Vieira Santos: So can you walk us through the process for someone who wants to install the software on their machine, what they need to do and how it works?
Mike Hazlewood: Yeah. So the first step is to go to our website, sensepilot.tech, and click download. There's a form that pops up; it just asks for an email address, purely to send you the download link. You'll see it on the screen anyway, but it's also to send you a few different tutorials and best-practice tips. Once you download it, there's a 30-day free trial. So it's just: get to use it, see if it's for you, understand what it can do. If it's not for you, that's okay. Then it's going through and starting to see, and really it's: how do you move the cursor on the screen? Getting your sensitivity settings right, and then how to left-click. Those are the only two things a person may need some assistance with. Once you've got those, you can navigate the computer and the interface. So you map out: how do I right-click, how do I scroll, how do I pause the cursor temporarily if I don't want to control the device? And then you can start digging deeper into setting up different profiles. Your desktop access profile is going to be very different from a video gaming profile. And equally, within video games, there are lots of different control mechanisms within the same game: different control inputs for driving versus running around versus flying, all within the same game. So it's setting those up and seeing: how do I play with those? How do I interface with those? We have speech inputs and things as well. It really is completely personalizable. People have asked us, oh, could you send us a copy of the profiles that you've got set up?
It really doesn't make sense to do that, because that profile is completely unique to you and what you want to do. Obviously, everyone's facial biomechanics are different as well. It's setting how much you want to move. If you've only got very small movement, we pick up really small twitches: smiles, blinking, and things like that. So you can use those as inputs as well. It's setting the trigger strengths to whatever you want, basically, because everyone is so vastly different; there's no blanket "this is how you're going to do this". So really, it's just getting in there and seeing what it does and can do.
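The profile setup Mike walks through (a desktop profile, plus separate in-game profiles for driving, flying, and on-foot movement, switchable mid-session) could be modelled along these lines. This is a hypothetical sketch: the profile names, the `sensitivity` and `smoothing` settings, and the `ProfileManager` class are all invented for illustration, not SensePilot's actual configuration format.

```python
# Hypothetical sketch of per-task profiles with mid-session switching.
# Profile names and setting keys are invented, not SensePilot's schema.
profiles = {
    "desktop":      {"sensitivity": 0.8, "smoothing": 0.6},
    "game:driving": {"sensitivity": 1.5, "smoothing": 0.2},
    "game:flying":  {"sensitivity": 1.2, "smoothing": 0.4},
    "game:on_foot": {"sensitivity": 2.0, "smoothing": 0.1},
}


class ProfileManager:
    """Holds the named profiles and tracks which one is active."""

    def __init__(self, profiles: dict, initial: str = "desktop"):
        self.profiles = profiles
        self.active = initial

    def switch(self, name: str) -> dict:
        """Switch profiles, e.g. when the player gets into a vehicle."""
        if name not in self.profiles:
            raise KeyError(f"unknown profile: {name}")
        self.active = name
        return self.profiles[name]


mgr = ProfileManager(profiles)
print(mgr.switch("game:flying"))
# -> {'sensitivity': 1.2, 'smoothing': 0.4}
```

The point of the structure is the one Mike makes: because each profile is tuned to one person's biomechanics and one task, profiles are not meaningfully shareable between users.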
Antonio Vieira Santos: And then how do you make sure the system is updated? How do you roll out the updates?
Mike Hazlewood: So we push them out automatically. As soon as there's something updated, obviously after testing it, we'll push out the update; it's a very quick process. We do have a version that we're testing at the moment, because in some work or school environments, having administrator rights to the device is difficult. So we're testing a version where you'll just see a little badge somewhere on the screen saying "update available, click here to update", and then you go through it that way, basically.
Debra Ruh: Mike, I know that during our Super Bowl commercials here in the United States, there were multiple organizations, including Amazon, that did cute little commercials about us not being afraid of AI. The Amazon one really was cute. But how do you see intelligent AT working in partnership with your user rather than replacing human skills? I know in some ways that's a hard question, but so many people are obviously paranoid about AI, when actually I don't believe you're taking away human skills; you're trying to enhance them. I was just wondering if you would explore that for a moment.
Mike Hazlewood: Yeah. With AI-based AT, it's always a challenge of how much you're assisting in things. Certainly, I'm quite excited by a lot of the AI agents I see coming out, for what they can do in speeding things up. Imagine someone typing out, letter by letter, on an on-screen keyboard; it's quite a slow process. Obviously the person is going to get faster and faster over time, but you can help out with better predictive text, understanding the intent and the context. If you're typing out an email, it can automatically say: okay, this should be a professional context, this should be a casual context, and these would be the kinds of words and typical sayings you'd use. So those sorts of interesting things, learning from you. I'm also looking at it like: okay, I've got an anniversary or a birth date in the calendar, and it pops up and says, hey, you should probably order some flowers, here are three options, let the human take over from here. Just making sure the human can keep going through and still get these things done. So I think a lot of it is very empowering. Obviously, data is a big question. Certainly, as Neil was talking about, some of the systems being used for AT; I'm thinking of the Meta Ray-Ban glasses, which have an amazing impact for blind and low-vision individuals and for people with mobility needs. They can remain connected, they can understand what's going on around them, but large companies are stopping them from being brought in, obviously for security reasons.
I've heard of one site, quite a big company, that has completely banned them, because they don't know who's using them and they don't know where the data's going. And if you're taking pictures of items or documents they might deem sensitive, then that's problematic. So there's a bit of a difficulty there with certain environments, I guess.
Debra Ruh: And I think it's important that we still talk about it, because I actually helped a friend of mine yesterday. He's 78 years old, and he was like, I want a flip phone, I don't want that smartphone. And I was like, that is not a problem. You want a flip phone? Not a problem. We can get you a flip phone. But there's still a lot of fear associated with it. I really like the hashtag the creators of Claude are using: keep thinking. Why don't we keep thinking? Because if we keep thinking, working with AI, we can do amazing things. But back over to you, Neil.
Neil Milliken: I think the key is to keep thinking, because it's quite easy to stop thinking when you hand over your reasoning processes to AI. There is a seductiveness in the ease with which you can produce stuff using AI, which can lead to dependency: an emotional and psychological dependency on some of these tools. So yes, as we see further deployments of AI into more and more bits of work and life in general, one of the things we need to be doing as a society is teaching people reasoning and questioning skills, right? To be able to think things through, to take a step back and consider. And one of my concerns is that when you make something that is able to move so fast, even if you have those reasoning skills, you don't get time to deploy them. With the acceleration of processes, putting the human in the loop becomes really, really difficult, because the loop is moving so fast. So I think there's some work to be done there to ensure that we don't accelerate ourselves out of the process just because it's happening so fast.
Practical AI Workflows For Startups
Mike Hazlewood: Yeah. We've got some quite interesting ones. As I said, we're grant-funded, and the grant applications themselves can be absolute monsters. So it's: how do we streamline these? Because it's also quite repetitive. Obviously, you've got to come back through and tweak every answer and really make sure it makes sense: is this answering the question it's supposed to? But we've done some others too. We'd spend a lot of time on Reddit trying to speak to different community members and to add value to conversations. But you're at the mercy of that Reddit algorithm, so you're scrolling and scrolling and scrolling, and a lot of it doesn't quite make sense. We've set up a few things where it basically pulls in new threads, puts them through an engine, and sends them to us in Slack. It asks the question: is this a conversation that SensePilot could add value to? It sends us a message, and then it's up to us as the humans to go through and say, yeah, that makes sense, or, okay, no, I have nothing to say here, this doesn't really involve me at all. That's been quite a good streamlining flow for us as a startup, to shorten how much time we spend on those.
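The triage flow Mike describes could be sketched like this, with the parts we don't know stubbed out: how posts are fetched, how relevance is judged (he only calls it "an engine"), and the Slack delivery are all injected as placeholders. The keyword check, the function names, and the example posts are illustrative assumptions, not SensePilot's actual tooling.

```python
# Illustrative sketch of the Reddit-to-Slack triage flow; the fetcher,
# the relevance "engine", and the notifier are stand-ins, not real APIs.
from typing import Callable, Iterable


def triage(posts: Iterable[dict],
           is_relevant: Callable[[dict], bool],
           notify: Callable[[str], None]) -> int:
    """Forward posts the engine flags; a human decides whether to reply."""
    flagged = 0
    for post in posts:
        if is_relevant(post):
            notify(f"Possible fit: {post['title']} ({post['url']})")
            flagged += 1
    return flagged


# Stand-in relevance check: a keyword match instead of a model.
KEYWORDS = ("head tracking", "hands-free", "webcam control")


def keyword_engine(post: dict) -> bool:
    text = (post["title"] + " " + post.get("body", "")).lower()
    return any(k in text for k in KEYWORDS)


sent: list[str] = []  # stand-in for a Slack channel
posts = [
    {"title": "Any hands-free mouse options?", "url": "https://example.com/1"},
    {"title": "Best GPU for 2025", "url": "https://example.com/2"},
]
print(triage(posts, keyword_engine, sent.append))
# -> 1
```

The design point is the one Mike makes: the automation only filters and forwards; the decision to join a conversation stays with a human.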
Specific Agents And Accessibility
Neil Milliken: Absolutely. The tools can really augment your capabilities. I'm not totally anti-AI; what I'm really mindful of is doing it in ways where we are thinking about how best to deploy it, and not just splashing it everywhere. The agents thing, apart from the issue of trust, is really interesting as assistive tech. There was a post on LinkedIn just the other day from a young blind developer, a screen reader user, who said: I'm using all of these AI tools, I'm trying to use them to code and to audit, and frankly, they're not doing a very good job, no matter how well I prompt. What she'd done was then go from the general-purpose LLM to creating her own agents for very, very specific tasks, and then she was getting better results. So I think that specificity in how you configure and deploy AI is really crucial to getting the best results out of it; otherwise, what she was experiencing was inaccessible slop. So I'm cautiously optimistic, and I think we all should be cautiously and curiously optimistic about the topic. We're reaching the end of our half hour; time flies really quickly. I need to thank Amazon for supporting us and helping keep us on air. Do you want to repeat where people can find you on the internet?
Mike Hazlewood: Yeah, absolutely. You can visit our website at sensepilot.tech, and you can find us across social media channels as SensePilot or SensePilot Tech. We're on LinkedIn, Instagram, TikTok, YouTube, Facebook; I think that's it.
Neil Milliken: Well, we wouldn't be anywhere without Facebook, especially if you've got the glasses on. So thank you very much, Mike. It's been a pleasure to have you on today.
Mike Hazlewood: Well, thank you very much for having me.