
AXSChat Podcast
Podcast by Antonio Santos, Debra Ruh, Neil Milliken: Connecting Accessibility, Disability, and Technology
Welcome to a vibrant community where we explore accessibility, disability, assistive technology, diversity, and the future of work. Hosted by Antonio Santos, Debra Ruh, and Neil Milliken, our open online community is committed to crafting an inclusive world for everyone.
Accessibility for All: Our Mission
Believing firmly that accessibility is not just a feature but a right, we leverage the transformative power of social media to foster connections, promote in-depth discussions, and spread vital knowledge about groundbreaking work in access and inclusion.
Weekly Engagements: Interviews, Twitter Chats, and More
Join us for compelling weekly interviews with innovative minds who are making strides in assistive technology. Participate in Twitter chats with contributors dedicated to forging a more inclusive world, enabling greater societal participation for individuals with disabilities.
Diverse Topics: Encouraging Participation and Voice
Our conversations span an array of subjects linked to accessibility, from technology innovations to diverse work environments. Your voice matters! Engage with us by tweeting using the hashtag #axschat and be part of the movement that champions accessibility and inclusivity for all.
Be Part of the Future: Subscribe Today
We invite you to join us in this vital dialogue on accessibility, disability, assistive technology, and the future of diverse work environments. Subscribe today to stay updated on the latest insights and be part of a community that's shaping the future inclusively.
AXSChat Podcast
Adrian's Quest to Simplify Sign Language Translation through AI
This week we talk with Adrian Pickering from Robotica. Robotica's main goal is to make all of the world's information and entertainment available in sign languages, starting with British Sign Language. Adrian explains the challenges of translating sign languages, given the chronic shortage of interpreters and translators. Robotica uses AI to create avatars that perform cued speech, a visual communication system far simpler than a full sign language, to bridge the communication gap. They are also working on avatars for British Sign Language, using motion capture and AI to break performances down into individual component parts for better understanding. The ultimate goal is to provide sign language translations for many aspects of life, such as public transport, healthcare, and utility bills, to empower sign language users and ensure inclusivity. Robotica aims to create deaf jobs, not take them, and to make translations available everywhere through responsible and respectful AI. Check out Robotica to learn more about their groundbreaking work in sign language accessibility.
Follow axschat on social media.
Bluesky:
Antonio https://bsky.app/profile/akwyz.com
Debra https://bsky.app/profile/debraruh.bsky.social
Neil https://bsky.app/profile/neilmilliken.bsky.social
axschat https://bsky.app/profile/axschat.bsky.social
LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/
Vimeo
https://vimeo.com/akwyz
https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh
AXSCHAT Adrian Pickering
NEIL:Hello and welcome to AXSChat. I'm delighted that we are joined today by Adrian from Robotica. Adrian, we had a great conversation off air, and yesterday as well, which is why I was so keen to get you on as soon as possible. Please tell me and our audience a little about yourself and what Robotica is doing around sign language, because I got really excited when we were talking yesterday. So, welcome to the show. And please, tell all.
ADRIAN:Thank you, Neil. Thank you very much for inviting me. It's an absolute privilege to be here with you today. Basically, Robotica has one very, very simple goal, one ambition, and that's to make all of the world's information and all of the world's entertainment available in sign languages. Simple as that.
NEIL:It sounds simple, but simple things can be hard. So, that's a massive ambition, and it's not trivial either. We know there are lots of different sign languages, but you're starting with British Sign Language; am I correct?
ADRIAN:That's right, yes, British Sign Language is our main focus. There are about 300 officially recognised sign languages around the world, and probably many, many more in use that haven't been recognised. British Sign Language is used in the United Kingdom, where I live. It was recognised as an official language of the UK in the BSL Act last year, and that means there is official recognition and it's going to be, at last, taken seriously as an indigenous language of the people of Great Britain and Northern Ireland. Although Northern Ireland has a different sign language, of all things.
NEIL:Yes. So, as you mentioned, there is this huge variety of different sign languages, but we have to start somewhere. And whilst the UK and the US, in spoken language, are two nations separated by a single language, when it comes to sign language they are actually completely separate languages. So, you can't just assume you can move British Sign Language to an American audience; you have got double the work when it comes to translating all of that information that's out there in English on the Internet. So, tell us a little bit about how you're making that available, because I know we have this huge scarcity of people who are qualified to be sign language interpreters.
ADRIAN:Neil, you're absolutely right. There is a chronic shortage of interpreters and translators. There are about 900 to a thousand registered sign language interpreters (RSLIs) in the UK, and they are trying to serve all of the British Sign Language community, and there are fewer than 100 deaf translators, an incredibly small number. Translating between British Sign Language and English is extraordinarily mentally taxing; cognitively, it's probably about the hardest job anyone can do, unlike translating between, say, English and Finnish or Korean and Portuguese. We haven't evolved to have language centres that work directly and efficiently with sign languages, so people who are working and thinking and talking in sign languages are using their brains so much more than those of us who are thinking and talking in spoken languages, or when we are writing. It's a lot more cognitively difficult, and this is why typically an interpreter might work for 20 or 30 minutes at a time and then require the same amount of time again as a kind of rest and recovery period, because it's just utterly, utterly exhausting, really challenging work. So, if the interpreters in the UK were working flat out, only maybe one in 80 or one in 100 of the people requiring sign language translations would actually receive an interpretation. It's a tiny, tiny number of interpreters and a really, really important problem. And the reality is, if you think about when you're learning to read or write, how difficult it is when you're four or five or six years old, those of us who have hearing have the advantage that we can match the sounds up with the letters. If you're trying to learn to read and write without that advantage, it's much, much harder. As a consequence, children who are deaf start learning to read and write with a disadvantage and have to work so much harder, and what that quite often means is deaf children leaving school with a reading age ten years lower than their peers. Deaf people entering the workplace are then typically underemployed for their capability, because they are prejudiced against by this start in life, because their reading might not be considered as good. And yet they are typically bilingual; they will be very, very strong in their sign language, British Sign Language here, and they are no less capable of communication. What we are trying to do is make the information that's available to those of us who hear, and to those of us who read, available equally, on a completely even footing, to those who prefer British Sign Language.
DEBRA:Well, Adrian, that's quite an undertaking. Something else that we see, and I know that I, like all of us on this call, want to be fully accessible, is corporations that want to employ people who are deaf still being really afraid to actually do it, because they don't understand the sign language component. What that often translates to, and I know only because they have told me this, is that it is just so expensive. It is so expensive, even though you just mentioned one reason why we should pay these professionals appropriately: because the work is so cognitively taxing. And as a society, this is really hard, the sign language stuff. So we say, we are going to make avatars and we'll solve it that way, because all we have to do as a society, and I'm being sarcastic, is just replace me speaking to you, when of course we know sign language is so much more than that. So, there seems to be a lot of confusion. I personally didn't realise that there were 300 recognised sign languages. This is so complicated. And we are saying to employers, you need to make sure you're hiring all of us, especially including people who are deaf, people who are hard of hearing, people who use sign language. But there just seems to be a lot of confusion about this specific thing in the marketplace. You're talking about the disadvantages to a child, you know, ten years behind their peers. How in the world do you begin to wrap your arms around this complicated topic? I understand why we must, Adrian, and I am so glad Neil invited you, because it's a very complicated topic. And you made an important point that I just want to highlight: you said, we are the community. So, first of all, you do it with community. But I was just curious, because of all of the moving parts, how do you begin to wrap your arms around it?
ADRIAN:We started with a much simpler aim. British Sign Language, like American Sign Language, French Sign Language, Portuguese Sign Language, they are all languages in their own right. They have their own grammar, their own syntax, their own vocabulary, their own rules, their own way of operating, and they have their own cultures as well, which is absolutely critical within the language. We started out working on a much, much simpler problem, a system called cued speech, which is not a language at all, but to some people it looks a bit like signing because it uses hand gestures to make a visual version of the sounds we make with our mouths. What I mean by that is, we all, I'm sure, to some degree rely a little bit on lip reading, whether we realise it or not, and as we become older we typically rely on it more and more. But a lot of sounds look the same on the lips. If I were to say to you B, P, M, and then I was to say them silently, which one of those am I saying here? Anyone? You can't really tell.
DEBRA:So, we see?
ADRIAN:So, we recognise that a lot of these sounds look the same when they are mouthed, and what we do is add eight different hand shapes to disambiguate all the different lip shapes. So, with B, P and M, we have got B, sorry, P, I'm getting it wrong, the pressure of being on a webcam. We have got P with one finger. Then we have got B with four fingers. M with five fingers. And so now, anyone who has training in cued speech knows that that is P and not B or M. And because cued speech relies on a spoken language, it means that if you already have that language skill, you can learn cued speech, depending on your own capability and the amount of time you put in, in between a few days and a few weeks. Compare that to a sign language, which takes years. To become a professional interpreter, for instance, might take you seven, eight, nine years, the equivalent amount of training and expense of getting a degree and a postgraduate qualification. It's really, really intensive work. Cued speech you can learn very quickly, but it does depend on you having a language. And the reason cued speech has really important value is that it can help when deaf children are born into hearing families. It can help in the early stages; it's not for all deaf children and it's certainly not for all deaf families. But imagine your infant has been diagnosed as profoundly deaf, and you're hearing and have had no exposure to any sign languages. You want to communicate with your baby as soon as you can, and what this does is buy you time to start to learn to sign, if sign language is the right way for you, because you can pick up this cueing in a matter of days, whereas you will learn to sign with your child over many, many years. So, this gives you an almost instant ability to bond and communicate with your baby, and it's also said to be really supportive of learning phonics and helping deaf children learn both to speak and to read. It's something that's quite niche in the United Kingdom; it's used much more in the United States, in France and in Belgium. It's a really, really useful way of helping people communicate, but it's not a language and it's certainly not a sign language. We started out with it because there is really great value there, it can do a lot of good, and it's also a lot easier: just as it's easier for people to learn, it's also much easier for computers to learn. There are only eight hand shapes, as opposed to the many different gestures of a sign language, and that's only talking about hands. So, just eight hand shapes and the different patterns you can make with them, and that's basically it. So, we started out making avatars that could perform cued speech. We had our first ones about three years ago, deployed in airports, so that people with hearing loss could have something that would lip speak for them, supported by cued speech and on-screen captions, so they could know what was going on, and they could ask questions and the FAQ bot would respond in whatever language they had chosen, on screen, cueing it in that language with lip speaking as well. So, that's where we started: a much, much simpler problem.
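[Editor's note: a minimal sketch of why cued speech is so much more tractable for software than a full sign language. Only the P/B/M finger counts come from Adrian's example; the function name and the idea of a lookup table are illustrative assumptions, not Robotica's implementation.]

```python
# Minimal sketch of the cued speech idea: consonants that look identical
# on the lips are disambiguated by a small, fixed set of handshapes.
# Only the P/B/M example from the conversation is filled in; a real chart
# covers all consonants with eight handshapes (this mapping is illustrative).

CUE_HANDSHAPES = {
    # phoneme -> handshape (number of extended fingers, per Adrian's example)
    "p": 1,  # P is cued with one finger
    "b": 4,  # B is cued with four fingers
    "m": 5,  # M is cued with five fingers
}

def cue_for(phoneme: str) -> int:
    """Return the handshape that disambiguates a mouthed phoneme."""
    try:
        return CUE_HANDSHAPES[phoneme.lower()]
    except KeyError:
        raise ValueError(f"No cue defined for phoneme {phoneme!r}") from None

if __name__ == "__main__":
    # "B", "P" and "M" look the same on the lips; the handshape tells them apart.
    for sound in ("p", "b", "m"):
        print(f"/{sound}/ -> handshape {cue_for(sound)}")
```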
That meant we could work on the problems of the avatar, the expression, and the basic mechanics before getting into the much more challenging area of the AI. And the AI is really important because there just aren't enough people to perform these translations. If there were a thousand times as many interpreters and translators, it still wouldn't scratch the surface of the amount of content being created every day. We are sitting here today creating content, valuable content that people want to consume. Every single minute of every single day, more than 500 hours of video are uploaded to YouTube. 500 hours. How many translators do you need to turn that into sign language? And then you've got to multiply that by 300, for 300 different sign languages. The only way to deliver the availability of content to everybody, on demand, is through AI. Humans are always going to be the best at this. Humans will never, ever be replaced by computers, by avatars, in this space. No one is ever going to want a computer telling them their medical diagnosis, explaining a court ruling, or anything else like that. We'll always want people to do those things; computers will never have that empathy and they'll never have that human touch. Everything else, though, everything that would not otherwise get translated, that is what AI is for.
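[Editor's note: Adrian's back-of-envelope case is easy to check. The rough calculation below uses his YouTube figure; the interpreter throughput is a deliberately generous assumption of ours, not a number he gives.]

```python
# Back-of-envelope: why human translation alone can't keep pace with content.
# The 500 hours/minute figure is from the conversation; interpreter throughput
# is an assumed, generous value (translators need rest periods equal to time worked).

UPLOAD_HOURS_PER_MINUTE = 500      # YouTube uploads, per Adrian
SIGN_LANGUAGES = 300               # roughly, officially recognised worldwide
INTERPRETER_HOURS_PER_DAY = 4      # assumed daily output per translator

daily_content = UPLOAD_HOURS_PER_MINUTE * 60 * 24          # 720,000 hours/day
per_language = daily_content / INTERPRETER_HOURS_PER_DAY   # 180,000 translators
all_languages = per_language * SIGN_LANGUAGES              # 54,000,000

print(f"Content uploaded per day: {daily_content:,.0f} hours")
print(f"Translators needed for one sign language: {per_language:,.0f}")
print(f"...for all {SIGN_LANGUAGES} sign languages: {all_languages:,.0f}")
# For scale: the UK has roughly 1,000 registered interpreters in total.
```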
ANTONIO:So, Adrian, we have been listening to you and your passion for your work, but I think we also need to know your why, and what road has brought you here. I don't think you have been working on this all your life. Can you tell us a little bit about that as well?
ADRIAN:I mean, I don't have a glamorous or a touching story on this, I'm afraid. I graduated as a neuroscientist but very quickly fell into writing computer software. I was one of these geeky kids writing home computer games on an Atari at home when I was seven or eight years old, and I knew I'd end up in computing. Sure enough, after I graduated I was very quickly a programmer and did bits of AI, mostly working in biotech and genetics and healthcare, but a bit of everything overall. And I grew a career as a software consultant. I went as far as I could in that career, as far as I wanted to go, of basically turning up and helping some of the biggest companies in the world get a little bit richer, and it wasn't rewarding. I mean, it was financially great, don't get me wrong, and I loved the technical challenge, but I found myself increasingly not caring about going to work in the morning. And I thought, well, actually, I've got a bit of a safety net now; I want to have the job where I want to go to work every day. By coincidence, I was talking at a conference and a former customer of mine came up and had a chat with me, and said, I'm working for, well, I'll name them because we're allowed to, I'm working for BT. BT is a leading telco in the UK, and it's not just a leading telco, they are also a leading technical innovator. You could do online shopping through a system they invented called Prestel in the 1980s, long before the World Wide Web. They are one of the leading innovators and one of the leading patent filers in the world. And he said, the way BT normally solves a problem is we throw a load of money at it and a load of engineers at it, and the problem is solved. We need lots of fibre optic cable laid throughout the UK, so we'll go out and hire 70,000 engineers, train them up and lay the cable, job done; all it takes is three years, complete. And they said, well, we have got lots of communications, that is our business: telephone, Internet, text messaging, fax, television, everything, and we want to start making this available in sign language. We looked into the problem and found out it takes seven to ten years to train somebody up, it costs a hundred thousand dollars or more, and there aren't that many people who want to do it or are capable of doing it. So, we realised that technology is the only way to do this, and we don't think anyone in the world is doing it yet. A few people have tried, and everyone has got so far and not quite been able to get over the threshold of something that people want to use and can use. And he said, would you like to be part of the solution on this? And I thought, wow, this sounds like exactly what I'm looking for. I looked into the problem with him and with BT, and I persuaded BT that my former customer, Michael, should actually come and join me and co-found Robotica, and that without the constraints of a big, long-established organisation, we could go quicker, we could do things without the approval of senior management, and we could just get on and get things done. They absolutely gave their blessing; they said, we'll let him go, as long as he can do it. So, that's what we did, and they were also our first customer.
We delivered some work for them, signing a sports documentary through from proof of concept, and then started doing work around internal newsletters, which had previously been video with captions, and we added sign language to that for them using our AI. So, that's kind of how we got into it. The more we got into it, the greater our appreciation was of the culture, and so it became increasingly important that we weren't just there to solve a technical problem; we were there to solve a human problem. Probably our most important hire is a lady called Catherine, who is our deaf champion, and everything we do, absolutely everything we do, its purpose and its implementation, has to come from her. So, she says we need this, we go ahead and make it; she guides that, and only if she gives it the green light does it ever see the light of day. Last year, at one of the conferences we go to, there is an incredible local conference called the Norfolk Deaf Festival, in our home location. It gets a couple of hundred people coming from around the country. Really, really busy, very exciting, lots of fun. When we first went there, we got a lot of feedback, because at the time we were working on TV shows, first of all because that's where the money is, TV has a lot of money, and secondly because it's glamorous and we had good connections there, doing work for BT, doing work for Sky television. But overwhelmingly, what people said to us is, well, that's great and everything, but we want independence. We don't want to be relying on other people for the stuff that you don't rely on other people for. We want to be able to use public transport without the need to go and ask somebody what is going on. We want to be able to go around the hospital. We want to have wayfinding, navigation, without having to rely on spoken or written words. We want sign language for our day-to-day lives. So, we have refocused to make sure that's what we are doing. We are concentrating on the National Health Service in the UK, we are concentrating on public transport, and we are concentrating on the, dare I say it, boring things, stuff like your utility bills. When you get a letter through saying someone needs to come and read your meter, and they are going to be coming around on this day at this time, make sure you're in and make sure the meter is accessible; it's boring, but it matters. So what we are doing is putting a QR code onto those letters. People scan the QR code with their phone, or if they get it as an email it's just a hyperlink, and they get a translation of their information, including their specific date, their address, their time and everything else, or their NHS appointment or their council tax bill, on their own device, in the way they can consume it most comfortably.
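[Editor's note: the QR-code delivery Adrian describes is mechanically simple, which is part of its appeal. A minimal sketch, assuming a hypothetical translation service URL and using the widely available `qrcode` Python package; the endpoint and parameters are invented for illustration.]

```python
# Minimal sketch of putting a sign language translation link on a letter.
# Assumes a hypothetical translation service; the URL scheme is invented.
# Requires: pip install qrcode[pil]

import qrcode
from urllib.parse import urlencode

def translation_qr(customer_id: str, letter_id: str, out_path: str) -> None:
    """Generate a QR code that opens a personalised BSL translation."""
    # Hypothetical endpoint: resolves to a signed video of the letter's
    # content, including customer-specific details such as date and address.
    params = urlencode({"customer": customer_id, "letter": letter_id})
    url = f"https://translate.example.com/bsl?{params}"
    img = qrcode.make(url)  # library defaults are fine for a printed letter
    img.save(out_path)

if __name__ == "__main__":
    translation_qr("cust-0042", "meter-reading-2024-03", "letter_qr.png")
```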
NEIL:Yes, that's super important, you know. When you've got that life-impacting information and it's not in your first language, it always adds a layer of anxiety and an element of peril, so being able to remove that is super. I think, you know, we understand the scale of the challenge, and in fact it's even greater than the challenge of captions. We know that the deaf community originally was really quite anti AI-driven captions, and that over time attitudes have started to change because the quality of the captions and the AI translations has got better and better. But there are huge, huge data sets for the AI to learn from when it comes to captions, in a way that there aren't for sign language. So, aside from the challenges from the deaf community that maybe the quality is not there yet, which we understand around adoption, how do you deal with the fact that, when you're creating the AI that drives the avatars, you haven't got this rich pool of data to teach the AI and the avatars in the first place? And, you know, BSL is niche, but thinking about moving into other sign languages, which come from smaller cultures, the problem is even harder. So, how are you addressing that?
ADRIAN:I mean, you're completely right, particularly about the amount of data out there, and it's even more of a problem than you might initially expect, because when we are looking at other translations, say between English and Korean, you have got a written form, and the written form is, I won't say precise, because there is always ambiguity in language, but when you write a word and then you write it again, if it's typed, it's exactly the same word each time. If you speak it and then speak it again, it's slightly different. And it's the same with sign: if the same person signed the same sign 20 times, you would get 20 quite different versions of it. Even if, to the eye, it's imperceptibly different, it's still different in data terms. So, what we have done is make our own kind of written form. It's not truly a written form of British Sign Language, but it's a data-driven approach that breaks it down. Think of British Sign Language, and other sign languages as well, but I'll talk just about British Sign Language, in terms of the analogy I use: an orchestral performance. You can understand what the performance is from just the melody; you know, if you overhear da, da, da, da, you instantly know what it is. But that's far from the whole performance. And with British Sign Language, there are many, many layers, lots of different instruments, lots of different tempos, lots of different rhythms and percussion, that all need to be in place to make a complete performance. So, we see the actual individual signs, and that's not just hand movements; it's the hands, the face, the shoulders, the chest, the neck. Everything that's going on, essentially from the hips upwards, is necessary to make a signed performance. What we are doing at the moment is taking baby steps. We are saying our first target is to make something that is understandable, and we think there are about three layers in this orchestra for that; we are about three or four layers in at the moment. We think there are at least 12 layers that we need to put in there to make a complete performance. But our threshold in the first instance is: can this information be understood? We are not competing with translators and interpreters; we are competing with captions. This is an alternative option for people, so they can use it instead of captions if it's more comfortable for them, if it's their preference. So, what we are doing at the moment is motion capturing deaf translators performing different versions of signs, different translations, and then using AI to break that up into individual component parts: parts around the fingers, parts around the arms, parts around the body, parts around the body language, parts around the facial expressions. Then we layer them on top of each other. Even little things, right at the most basic level: you can turn a lot of signs in British Sign Language into questions by adding a lean forward, or raised eyebrows, or quizzically scrunching your eyes up a little bit and twisting your head around. So, that might be another layer or two on top of the sign layer. And then you've got things like how expressive it is, the speed of the performance, the magnitude of the articulation.
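[Editor's note: Adrian's orchestra analogy maps naturally onto a layered data model: a base layer of manual signs, with independent channels for face, body and prosody composed on top. The sketch below illustrates that idea only; all names and layers are hypothetical, since Robotica's internal representation is not public.]

```python
# Illustrative sketch of the "orchestra" layering idea: a sign performance
# modelled as independent channels composed over a base manual-sign layer.
# Everything here is hypothetical, not Robotica's actual data model.

from dataclasses import dataclass, field

@dataclass
class Layer:
    channel: str        # e.g. "hands", "face", "torso", "head"
    track: list[str]    # timed sequence of poses/expressions for that channel

@dataclass
class SignPerformance:
    gloss: str                          # the base signed phrase
    layers: list[Layer] = field(default_factory=list)

    def add(self, channel: str, track: list[str]) -> "SignPerformance":
        self.layers.append(Layer(channel, track))
        return self

def as_question(p: SignPerformance) -> SignPerformance:
    # Non-manual markers can turn a statement into a question:
    # a forward lean, raised eyebrows, a slight head tilt.
    return (p.add("face", ["raised_eyebrows"])
             .add("torso", ["lean_forward"])
             .add("head", ["tilt"]))

if __name__ == "__main__":
    statement = SignPerformance("YOU GO HOME").add("hands", ["YOU", "GO", "HOME"])
    question = as_question(statement)
    print([(layer.channel, layer.track) for layer in question.layers])
```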
So, there are a lot of considerations, but we have said, well, we'll go and solve this iteratively. We don't think we can go straight in and say, here's a performance that is comparable to a human being; that will take a long time. What we'll say is, here is a performance that we hope you can understand, that we hope is better for you, as a user of British Sign Language, than having captions alone. Again, with data, there is such a shortage. Traditional machine learning approaches are really, really hungry for data, really greedy for data, and there's just not enough British Sign Language data at all; even in the written form, there wouldn't be enough sign language data to make a convincing machine learning model. So, we have gone back to our roots, my roots as a neuroscientist and the roots of one of my colleagues as a neuroscientist, and with the engineering and linguistic skills we have across the team, we've said, well, let's go back and reinvent how we think machine learning needs to work for this type of data. We have drawn on that and concluded that the methods other people are using won't work for us; they are not very good with the diversity of data that we need, and they are not very good with the small data sets we have. So, we have engineered specifically for British Sign Language, and I expect to be able to scale that out smoothly into other sign languages.
ANTONIO:Adrian, we know that technology, artificial intelligence, is changing how we communicate. How do you see the most recent changes in technology, in terms of AI, also impacting sign language itself?
ADRIAN:I mean, the changes will without a doubt be revolutionary, just as the Internet changed everything. The AI that we are seeing coming out right now, things like ChatGPT and Midjourney that we might have seen, will change the world in very much as big a way as the Internet did 25 or 30 years ago. And most of those changes we can't even anticipate. Most of the jobs in 20 years' time, we won't even have names for today, and that was exactly the case 20 years ago. So, a lot of the changes we just don't know. The important thing, though, is that with these technological advances, communication barriers are starting to dissolve. We are all around the world right now, as we talk to each other, absolutely live and synchronously; this is something that technology has brought us. Prior to this, we might have had four people dialling in on a telephone; prior to that, it would have been something where we would have to press a button to talk; and prior to that, it would have been posting a letter that would go across on a ship and take weeks to get there. So, technology is really bringing down these barriers, but it's not without its problems, and AI does need to be responsible. It needs to be ethical and it needs to be respectful.
DEBRA:Adrian, I'm so impressed with what you're doing, because, and this is often what we do on AXSChat, we are looking for people who are solving the problems in a different way, because the three of us really believe that a lot of the efforts being made, a lot of the money being spent, is not moving us forward, and we just don't like that. It seems like a lot of the efforts you're making, to make sure that sign language users can be more meaningfully included, could actually be picked up at some point and applied to all the other communication issues we have all over the world, because we all speak different languages. So, I'm fascinated with the efforts you're making, because it feels like the right direction, and, like you said, artificial intelligence is going to change everything. I'm fascinated with how you're doing it and how you're starting to map it out, and then, of course, wondering, have you all thought about the possibilities? I know right now you need to focus on solving this ridiculously gigantic problem, but it sure seems like there are a whole lot of other benefits to society that are going to be realised along the way. Do you think I am thinking about that the right way?
ADRIAN:Yes, absolutely you are. I mean, we are one very small piece of a really, really big puzzle, and human hands are the biggest part of all of these puzzles. Technology is an enabler, something that facilitates. And actually, because we are in such a niche but new industry, we are in the unusual position that we really want there to be competitors out there, because we are not competing with other people making AI sign language translations. We are competing with apathy, with people saying, oh, we don't need sign language, or sign language is too expensive, or there's just not enough return on investment. That's what we are competing with. So, actually, we want more people out there fighting the good fight. The problem we have, as a company, is getting past this barrier of, oh, we don't need it. Actually, yes, you do. You probably do if you are serving a hundred thousand customers. If you're a B2C organisation serving a hundred thousand customers, you've probably got a couple of hundred people in your customer base who rely on sign language and whom you are not serving. I was at a banking conference a couple of months ago, and some of the stories there were heartbreaking. There are people who, when they signed up for loan agreements, had video relay or in-person interpreters to help them understand the agreement. When the terms and conditions changed, they just got a letter through the post, and, look, people have lost their homes due to this.
NEIL:Wow.
ADRIAN:It's unacceptable. And the excuse was, I mean, there are kind of two positions some of these lending organisations have taken. Some of them have said, well, actually, the payout we have to make if we get sued is smaller than the cost of doing it, which, you know, may be financially smaller, but it's not smaller in human cost.
DEBRA:Right.
ADRIAN:Others say, well, there's no real alternative; there are not enough translators and interpreters. So, we are taking away that second excuse, and, bit by bit, we'll be taking away the first excuse as well. Part of our values is that we'll never take a deaf job. We are aiming to create deaf jobs, not take them. We are making a platform that will allow deaf translators to scale their work, facilitated by AI rather than replaced by AI. They will be making a hundred times more translations than they currently do, and they will be their translations, virtual versions of those deaf translators. This is about responsible and respectful AI. What we're keen to do is make sure that translations are available everywhere, and this is one of the things where the Lending Standards Board in the UK has been really, really positive for sign language users. They're saying this is not just about tokenism; this is about making sure all your communications are available in whatever form your customers need. And if you sign somebody up using a particular medium or channel, you need to be able to support them for their entire customer journey in that same channel, and if their needs change, you need to support them in that change as well.
NEIL:That's a huge wake-up call for service providers and a huge ask of the people providing those services, hence why what you're doing is so important. We are well over time, but it was well worth it because it's such an interesting topic. So, thank you, Adrian. Everybody, you should check out Adrian Pickering and you should check out Robotica. It's easy to find: search for Robotica Sign Language and you'll find it via Google. Thank you also to MyClearText for helping us with the captions, and to Amazon for sponsoring us and helping keep us on air. We look forward to continuing the discussion on social media. Once again, thank you very much, Adrian.
ADRIAN:Thank you for having me. Thank you.