AXSChat Podcast

AXSChat Podcast with Heather Dowdy - Senior Program Manager, AI & Accessibility at Microsoft

June 08, 2021 Antonio Santos, Debra Ruh, Neil Milliken talk with Heather Dowdy

 

With more than a decade of experience developing and demonstrating accessible technology in mobile, web and artificial intelligence, Heather Dowdy is passionate about connecting the dots across disability, race, and tech. She currently manages Accessibility Skilling at Microsoft, with a focus on making the future of work more inclusive. Heather is responsible for the global partnership strategy to scale skilling opportunities that lead to employment for people with disabilities.

Before relocating to Redmond, she was the Senior Product Manager of Accessibility Engineering at Motorola Mobility in Chicago. Heather has served as Chair of the Accessibility Working Group (AWG) of the Mobile Manufacturers’ Forum (MMF) and Chair of the Board of Directors of the World Institute on Disability (WID).

As the oldest daughter of Deaf parents, Heather is fluent in American Sign Language (ASL). She loves using the design thinking process to create solutions that empower marginalized communities and improve usability for everyone. A strong believer in empowering communities through education and benevolence, Heather is the co-founder of Microsoft Ninja Camp and a board member of Deaf Kids Code, which provides STEM leadership training to high school students with disabilities. Heather obtained her Bachelor of Science in Electrical Engineering from the University of Illinois at Urbana-Champaign. A native Chicagoan, she can now be found exploring the beauty of the Pacific Northwest with her husband and three young children. Heather also enjoys reading, journaling, and interacting with different cultures while traveling abroad.

Support the show

Follow axschat on social media
Twitter:

https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh

LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/

Vimeo
https://vimeo.com/akwyz




WEBVTT

1
00:00:04.440 --> 00:00:16.560
Neil Milliken: Welcome, welcome to AXSChat. I'm really pleased that we're joined today by Heather Dowdy. I've been a fan of Heather's for quite some time. Heather is working at Microsoft in the field of AI for Good, so,

2
00:00:16.830 --> 00:00:25.020
Neil Milliken: Heather, it's a real pleasure to have you here today. We've had some conversations before around the importance of some of the topics

3
00:00:25.230 --> 00:00:34.680
Neil Milliken: that we need to consider for AI when we're talking about inclusion, but please tell us about you, your role, and how you came to be working in the field of

4
00:00:35.700 --> 00:00:37.410
Neil Milliken: accessibility and inclusion.

5
00:00:39.120 --> 00:00:46.800
Heather Dowdy: Thank you, thank you AXSChat for having me. I'm really excited for this conversation because I'm excited to talk to our community.

6
00:00:47.040 --> 00:00:56.880
Heather Dowdy: There's so much to really talk about when you think about the future of technology, and quite honestly I come to this work personally, professionally, and as an ally.

7
00:00:57.330 --> 00:01:02.820
Heather Dowdy: So I'm a CODA, a child of Deaf adults. I began signing in sign language at six months old.

8
00:01:03.270 --> 00:01:13.740
Heather Dowdy: And I was raised by young Black Deaf parents on the South Side of Chicago, so lots of intersections there. We had all types of technology in our home

9
00:01:14.070 --> 00:01:25.830
Heather Dowdy: growing up that helped my parents hear and communicate with others, and I honestly fell in love with what technology could do, which is open up doors and provide access.

10
00:01:26.340 --> 00:01:40.860
Heather Dowdy: And I went to school for engineering and was able to fall in love with a career in access technology. I've been in mobile accessibility, web accessibility, and now AI accessibility.

11
00:01:42.420 --> 00:01:43.530
Neil Milliken: Thank you and.

12
00:01:44.970 --> 00:01:55.200
Neil Milliken: I think it's really important that we accept the contribution of allies and families and that lived experience of working with

13
00:01:56.460 --> 00:02:06.150
Neil Milliken: disabilities from both sides as part of our community, because sometimes there is the tendency to focus on

14
00:02:06.780 --> 00:02:21.210
Neil Milliken: the disability community themselves rather than the wider ecosystems that we all live in. And then, let's face it, some of us start off as allies and then end up on the other side, in the community, too.

15
00:02:21.630 --> 00:02:37.950
Neil Milliken: Right, and in effect, if we live long enough, pretty much all of us will. So, you know, here at AXSChat we're definitely a broad church, we want to welcome everyone, so it's great to hear a bit about your background.

16
00:02:39.510 --> 00:02:43.230
Neil Milliken: I also spent time in mobile, so I'm still fascinated, right.

17
00:02:44.070 --> 00:02:49.710
Neil Milliken: Well, the change is more incremental and it's not quite so revolutionary, but it's still

18
00:02:50.850 --> 00:03:02.820
Neil Milliken: the conduit, if you like, for a lot of the stuff that's happening in AI, actually. So a lot of the stuff that AI is powering, we're experiencing through these things.

19
00:03:03.360 --> 00:03:13.470
Neil Milliken: So yeah, tell us a bit about what you're doing in the AI programs that you're working on and how that's impacting the community.

20
00:03:14.970 --> 00:03:30.210
Heather Dowdy: Oh, I also want to comment on what you just said about being an ally. It's so true, we need allies in the accessibility community. I like to think that I have a certain privilege, and privilege isn't necessarily inherently a negative word.

21
00:03:30.450 --> 00:03:38.400
Heather Dowdy: It's how you use your privilege, and so having one foot in the Deaf world and one foot in the hearing world allows me to observe a lot.

22
00:03:39.390 --> 00:03:46.770
Heather Dowdy: And it allows me to comment on a lot. And the reason why I bring that up is because, you know, it's taken some time to realize that

23
00:03:47.010 --> 00:03:56.070
Heather Dowdy: that has stuck with me throughout my career. I don't get to leave that experience at the door just because I'm talking about a new technology.

24
00:03:56.340 --> 00:04:03.570
Heather Dowdy: It stuck with me in mobile, when we were looking at hearing aid compatibility, to know this is a big game changer.

25
00:04:03.810 --> 00:04:14.730
Heather Dowdy: You know, and in web accessibility it stuck with me, realizing, wow, we get so much information on the Internet, how do we make sure that is accessible to everyone. And now with

26
00:04:15.240 --> 00:04:22.500
Heather Dowdy: AI technology, it's just the newest frontier, so I like to try to stay on the latest frontier when it comes to accessibility.

27
00:04:22.950 --> 00:04:33.900
Heather Dowdy: And for me, in looking at AI, you're right, it's a new technology, but the way we approach it is very much the same as any other technology, as mobile and web.

28
00:04:34.170 --> 00:04:41.490
Heather Dowdy: And it is to really think about who's at the table, who's participating in designing that technology.

29
00:04:41.790 --> 00:04:57.540
Heather Dowdy: Who is participating when it comes to making sure that the community understands the impacts, both positive and negative, of the new technology. And so what I love about AI is that it's seen as a disrupter.

30
00:04:58.260 --> 00:05:05.880
Heather Dowdy: And, quite honestly, I think we need a little bit of that. There are some things, including the unemployment rate in the disability community,

31
00:05:06.180 --> 00:05:26.520
Heather Dowdy: that haven't changed in 30 years in the US, since we started counting it. You know, a 50% unemployment rate for 30 years, we've got to do something different. And so really, as the community, we need not be afraid of approaching a new technology like AI, just as we've done with others.

32
00:05:30.150 --> 00:05:33.600
Neil Milliken: Excellent, and I think that's so true.

33
00:05:36.630 --> 00:05:40.830
Neil Milliken: And those figures, you know, although I know them, are shocking.

34
00:05:42.240 --> 00:05:47.820
Neil Milliken: You know, that 50% unemployment is not just shocking, it's unacceptable.

35
00:05:48.840 --> 00:05:57.450
Heather Dowdy: And that was before COVID, and so during the COVID pandemic, we see that people with disabilities

36
00:05:58.140 --> 00:06:15.300
Heather Dowdy: are at the forefront of job losses. So when we think about a really inclusive recovery, what is it going to take? We need to leapfrog. I'm like, let's go, let's leapfrog and not necessarily try to play catch-up.

37
00:06:16.950 --> 00:06:21.300
Debra Ruh: And Heather, I was listening to your interview the other day

38
00:06:22.170 --> 00:06:33.870
Debra Ruh: on Verizon Media with Margaux Joffe, who, like you, I'm a big fan of her work, and you had talked a little bit, and I also am very blessed to have had you on my show, Human Potential at Work.

39
00:06:34.380 --> 00:06:48.870
Debra Ruh: And there you talked about some of the same themes, but I know you were discouraged by a professor from being an engineer because of the color of your skin, so I do think it's very interesting, and maybe because you were a woman, I don't know.

40
00:06:48.930 --> 00:06:58.080
Debra Ruh: But I think that accomplice conversation, the intersectionality conversations, are just so powerful, and so I love that you

41
00:06:58.470 --> 00:07:15.210
Debra Ruh: are representing our community, representing women, representing African Americans and people that are Brown and Black. But why is it so important that all of these voices and intersections be considered in these topics, and do you believe that that is really happening?

42
00:07:16.920 --> 00:07:32.580
Heather Dowdy: Wow, great question, Debra. I love the fact that I can explain parts of my identity through the word intersectionality, but before I knew that word, it was just my lived experience, it was who I am. And quite honestly,

43
00:07:34.140 --> 00:07:51.360
Heather Dowdy: I'd have to show up in places and try to cover certain parts or minimize other parts, and that is not working. That is not working to serve the community, that's not working for me every day. And so a lot of my

44
00:07:52.800 --> 00:08:00.060
Heather Dowdy: doing, or coming to terms with that, really did happen with getting my engineering degree, when I was told by

45
00:08:01.020 --> 00:08:09.000
Heather Dowdy: a white woman who was a dean of the college, that I would never get my engineering degree from there. I had just finished taking her class

46
00:08:09.570 --> 00:08:18.330
Heather Dowdy: in engineering, and I was declaring my major, and she took one look at me and just was like, you're better off majoring in Caribbean studies, just by one look at me.

47
00:08:19.710 --> 00:08:31.500
Heather Dowdy: And so, of course, that really pissed me off, and so I went to the minority engineering department and told them what she said, and I was telling this to

48
00:08:32.250 --> 00:08:40.890
Heather Dowdy: a Black man that was a dean there, and he looked at me and said, well, who told you you were going to be an engineer anyway? So, you know, to get it from both ends

49
00:08:41.220 --> 00:08:53.190
Heather Dowdy: was like, wow, I will show you. But for me it really grounded me in why I wanted to do this. I knew that I was serving people that look like my parents, I knew that I had a bigger why.

50
00:08:53.610 --> 00:09:07.050
Heather Dowdy: And that really did sustain me. But the reason why that story and stories of intersectional identities are important is because it allows us to see where we could be harming people

51
00:09:07.350 --> 00:09:14.280
Heather Dowdy: and where we could be leaving people out. Intersectionality shows up in a lot of places. It really does inform

52
00:09:14.820 --> 00:09:28.740
Heather Dowdy: how I also perceive people with disabilities. It really does allow me to say, wow, that could be one part of their identity, but there is so much more there. Can we really, you know, get to know the whole person?

53
00:09:31.650 --> 00:09:35.100
Neil Milliken: And I'm obviously

54
00:09:36.450 --> 00:09:39.300
Neil Milliken: very disappointed, but not shocked by.

55
00:09:40.410 --> 00:09:52.500
Neil Milliken: your experience. I come from a different position, I come from a position of privilege in many ways, I come from a middle-class background, I'm white Anglo-Saxon. At the same time,

56
00:09:53.550 --> 00:10:05.340
Neil Milliken: I was also told not to do things, not to expect things, that I was lazy, and why not, why not, you know, why wasn't I applying myself to it.

57
00:10:06.360 --> 00:10:12.150
Neil Milliken: So I think the reason we're sat here today is because we were told that we couldn't do stuff

58
00:10:12.210 --> 00:10:21.540
Neil Milliken: as much as because we were told that we could do stuff. What I hope is that the message that we all give to people is:

59
00:10:22.470 --> 00:10:34.980
Neil Milliken: don't take no for an answer. If you really believe that you want to do something, then follow your passion, because you can do it, because you're the one that determines whether or not you can. Now, there are systemic barriers,

60
00:10:36.300 --> 00:10:50.130
Neil Milliken: and those barriers are different for different social groups. And so, you know, it's like running the hundred meters and I get a 60-meter head start on you because of where I was born and the color of my skin.

61
00:10:51.060 --> 00:10:57.000
Neil Milliken: And then you can't compete in the hundred meters. What I'd like to do is make sure that we all start from the same place.

62
00:10:57.810 --> 00:11:10.650
Antonio Santos: A bit on that topic: if we go back to some of the life stories of some of our guests, in some cases the resilience was on the parents' side.

63
00:11:11.160 --> 00:11:26.400
Antonio Santos: If it wasn't for the parents pursuing, saying, I don't accept no as an answer, their children wouldn't have had a chance of achieving what they were able to achieve and be doctors today.

64
00:11:27.690 --> 00:11:29.040
Heather Dowdy: That is a really great

65
00:11:29.040 --> 00:11:37.170
Heather Dowdy: point, Antonio, and I would also add that, in my experience, I think about the parents of

66
00:11:37.770 --> 00:11:43.560
Heather Dowdy: children and students with disabilities of color, from marginalized communities, where these parents

67
00:11:43.980 --> 00:11:55.050
Heather Dowdy: aren't necessarily able to understand what access options there are. And so then you even have a disparity there amongst, okay, which parents understand what resources

68
00:11:55.320 --> 00:12:02.850
Heather Dowdy: are available to them, because they aren't available to everyone at the same level as they should be. And to me, that really does drive me,

69
00:12:03.450 --> 00:12:06.690
Heather Dowdy: particularly in one of my passions, in education,

70
00:12:07.350 --> 00:12:22.860
Heather Dowdy: that education-to-employment pipeline. There's so many things there, but if we don't really think about intersectionality or sit with people's stories, we really won't be doing a service to our entire community. I think for so long,

71
00:12:23.460 --> 00:12:33.210
Heather Dowdy: as a disability community, we thought, well, if we talked about race that would kind of detract from disability, or if we focus on this one area, or maybe we should just focus on certain

72
00:12:34.380 --> 00:12:47.670
Heather Dowdy: visible disabilities. And that hasn't gotten us as far as we would like, and we can no longer wait until we get to that particular mountaintop to say everybody else should be included as well.

73
00:12:53.790 --> 00:12:57.060
Neil Milliken: I'm not sure why. Were you

74
00:12:57.720 --> 00:12:58.320
Debra Ruh: On mute.

75
00:12:58.380 --> 00:13:05.580
Debra Ruh: Because, yes, I totally agree with what she's saying, and there's so many things we've got to figure out when it comes to society, period.

76
00:13:05.970 --> 00:13:16.260
Debra Ruh: But specifically, this is a huge undertaking that we're doing as a society, and we must have voices like Heather's, and we must have

77
00:13:16.650 --> 00:13:27.900
Debra Ruh: people like Heather and all of the rest of us fighting for others to come in. But there's so many. I know that before we started the show, Antonio had dropped a link in our little chat box

78
00:13:28.170 --> 00:13:34.980
Debra Ruh: talking about some of these issues, and even last night, Heather, we were teasing you because you look so beautiful today, but

79
00:13:36.060 --> 00:13:44.970
Debra Ruh: you were talking about what you were looking at last night that actually kept you awake. So there's so many issues

80
00:13:45.300 --> 00:13:51.180
Debra Ruh: that, you know, in some ways seem like the same issues. It's all about

81
00:13:51.750 --> 00:14:04.380
Debra Ruh: leaving parents out, leaving individuals out, leaving people out. But I know that as a parent, when I was raising my daughter with Down syndrome, there was so much that I didn't have any information on, I didn't know where.

82
00:14:05.550 --> 00:14:14.640
Debra Ruh: And then we were trying to deal with a very complicated IEP system, an Individualized Education Program, for example, here in the States, and

83
00:14:14.970 --> 00:14:21.780
Debra Ruh: I felt so sorry for some of my peers that didn't have the same communication skills I had; they didn't have the

84
00:14:22.020 --> 00:14:29.400
Debra Ruh: almost, like you're saying, what do you want to do. Plus, also, I'm a white woman, so I have a little bit more privilege

85
00:14:29.610 --> 00:14:48.360
Debra Ruh: when it comes to the teachers and influence, sadly. So it's like the societal barriers are so complicated and so entrenched. I mean, it shocks me, it shocks and saddens me that a white woman told you you could not be an engineer; it saddens me that a Black man didn't stand with you.

86
00:14:48.570 --> 00:14:49.500
Heather Dowdy: So, to have your.

87
00:14:50.280 --> 00:14:50.790
Debra Ruh: I mean your.

88
00:14:50.850 --> 00:14:57.990
Debra Ruh: gender, to have your peers say you can't do it. Thank goodness you're strong enough to say, oh well, let me tell you,

89
00:14:59.190 --> 00:15:09.600
Debra Ruh: watch me do it. And so I'm stubborn like that too, so is Neil, Neil's so stubborn as well. Go ahead and tell me I can't do it, and then get out of my way, because watch me do it.

90
00:15:09.990 --> 00:15:22.950
Debra Ruh: Watch me do it. But it's not fair, it's really not good for society. There's so many reasons why, what we're doing to people. Because what if you didn't, Heather? What if you didn't have the confidence to say,

91
00:15:23.490 --> 00:15:32.130
Debra Ruh: oh yeah, watch me? Then we would have lost something very valuable. You would have lost, your family would have lost, society loses. So

92
00:15:32.400 --> 00:15:43.260
Debra Ruh: it's chilling what we're doing, and the problem seems so big and overwhelming. Sometimes, I'm sure, a lot of us feel like, I don't even know where to begin, these problems are just too big.

93
00:15:43.560 --> 00:15:50.340
Heather Dowdy: Oh, my goodness. I mean, talking about the education-to-employment pipeline particularly,

94
00:15:50.700 --> 00:15:58.710
Heather Dowdy: I'm like, what can we do to really make sure that we fill that pipeline, that we really get students with disabilities jobs?

95
00:15:59.070 --> 00:16:07.770
Heather Dowdy: But even in that, and talking about some of the systemic barriers, you know, did you know that over 33% of

96
00:16:08.250 --> 00:16:11.910
Heather Dowdy: the students that actually fill the school-to-prison pipeline

97
00:16:12.270 --> 00:16:21.720
Heather Dowdy: are Black students with disabilities? So you have that intersection of race and disability there, and we really do have to ask the question,

98
00:16:21.960 --> 00:16:30.720
Heather Dowdy: why is that, what's happening? You know, how is disability being perceived, do we understand enough about the intersection

99
00:16:31.620 --> 00:16:39.090
Heather Dowdy: of race and disability? And I would also challenge all of us to think that these issues impact us personally.

100
00:16:39.960 --> 00:16:55.710
Heather Dowdy: When I don't hear your story, you know, I miss out on the goodness of you and all that you bring and how we connect. And so instead of just thinking about it as a problem over there, it is a problem that affects us as a community, yeah.

101
00:16:56.160 --> 00:17:05.340
Neil Milliken: I think Debra put in the chat that 60% of youth in US prisons, and the US has the largest prison population per capita in the world, are

102
00:17:06.810 --> 00:17:15.150
Neil Milliken: people with disabilities. And in the UK, I think the stats are something like 40% of all people in prison are people with

103
00:17:16.590 --> 00:17:17.760
Neil Milliken: learning disabilities.

104
00:17:19.110 --> 00:17:20.940
Neil Milliken: dyslexia, you know. And

105
00:17:22.560 --> 00:17:31.950
Neil Milliken: I think that the way that the education system is set up, it's so inflexible and so structured to going down a certain path,

106
00:17:32.880 --> 00:17:52.230
Neil Milliken: makes it very easy for people that aren't meeting a certain neurotype or a racial type or are from a certain socio-economic background to quickly fall through the gaps. And not only do we miss out on their potential, but we go further than that, we block it.

107
00:17:52.950 --> 00:17:56.610
Neil Milliken: And then we criminalize them

108
00:17:57.480 --> 00:18:15.690
Neil Milliken: through all sorts of things. And I think that there are some people doing really interesting work in the criminal justice system to try and address that. We've obviously had Judge Lerner-Wren on AXSChat a couple of times, big fan of hers, but I'm also

109
00:18:17.820 --> 00:18:38.820
Neil Milliken: aware of the work of Project 507 in the UK, which Whitney Iles, who I know from another thing I do, runs. She looks at the intersection of trauma and ethnicity in social justice, and that's a whole other fascinating area.

110
00:18:40.050 --> 00:18:42.120
Neil Milliken: So, because.

111
00:18:43.170 --> 00:19:00.300
Neil Milliken: Trauma-informed behavior informs how people interact with the authorities, and that then causes all sorts of chain reactions to happen. So if you're conditioned because of the social trauma, and trauma can be inherited,

112
00:19:01.560 --> 00:19:13.740
Neil Milliken: because you're from, you know, different socio-economic groups, then you're much more likely to react to authority in a different way, and they're going to assume guilt, and all of these things can impact on people.

113
00:19:14.460 --> 00:19:28.020
Neil Milliken: And they impact on statistics. So statistically, we have all of these things, and those feed into the data that we're feeding into the AI that is now making decisions about our lives.

114
00:19:28.830 --> 00:19:31.800
Heather Dowdy: Yes, wow. Can I amplify what you

115
00:19:31.800 --> 00:19:35.130
Heather Dowdy: said? It's more than we missed out; we're harming people.

116
00:19:35.400 --> 00:19:35.820
Yes.

117
00:19:37.080 --> 00:19:48.510
Heather Dowdy: And I'd also like to amplify what you said in terms of trauma. That's an area that our team was looking at very recently; we published a report on mental health and

118
00:19:49.260 --> 00:19:55.740
Heather Dowdy: societal bias in the Black community and AI, and just the potential there. And a lot of

119
00:19:56.040 --> 00:20:06.300
Heather Dowdy: what we found out there did have a tie to racial trauma. And so can we understand mental health conditions through the intersection of

120
00:20:06.570 --> 00:20:16.860
Heather Dowdy: identity, the lens of race? Could there be things there that inform us? And again, that goes to why language matters, because it's like, oh, now you're talking about mental health,

121
00:20:17.100 --> 00:20:24.360
Heather Dowdy: the number one disability in the world. That's where it fits into the disability conversation. So I just love everything that you said.

122
00:20:25.980 --> 00:20:26.340
Neil Milliken: Thank you.

123
00:20:28.170 --> 00:20:30.390
Neil Milliken: I'm immediately jumping to

124
00:20:31.800 --> 00:20:33.210
Neil Milliken: an email introduction to it.

125
00:20:35.220 --> 00:20:43.140
Heather Dowdy: Every time we talk, right? Right, where do we want to go, and then we get off on this whole thread, because there's so much here, there's so much.

126
00:20:44.310 --> 00:20:47.670
Neil Milliken: Yeah, and I think that

127
00:20:48.690 --> 00:20:48.930
Neil Milliken: That.

128
00:20:49.980 --> 00:21:02.940
Neil Milliken: that leads on to the next thing that I'm really interested in. And I think I'm going into really soft focus, I know I've not got glasses on, but I'm really soft focus today, it's like someone smeared Vaseline on the lens, which is a good thing for all of you folks watching.

129
00:21:04.920 --> 00:21:22.380
Neil Milliken: Thinking about, if we've got these historical data sets that contain the innate biases of society that then create harm, how do we address that, and at scale,

130
00:21:23.970 --> 00:21:36.810
Neil Milliken: when you can't necessarily go and suddenly just drag in 3,000 people to start addressing those data sets? So, for example, I think before we started, we were talking about, you know,

131
00:21:38.340 --> 00:21:53.010
Neil Milliken: camera recognition technology, right, and some of the challenges of that. So if most of the data is of people doing what are considered normal day-to-day things, you know, white cis het males,

132
00:21:54.780 --> 00:22:03.030
Neil Milliken: and we need pictures, you know, we need pictures of people of different ethnicities, of different backgrounds and everything else,

133
00:22:03.600 --> 00:22:20.610
Neil Milliken: can we use things like generative adversarial networks to start, and I know there are experiments going on to try and do this, to start building up images and data sets that are much more diverse? And do you think that that's a valid approach?

134
00:22:22.740 --> 00:22:37.140
Heather Dowdy: Inclusive data sets that are representative of people with disabilities and the lived experience are crucial, because you mentioned that AI runs on data, but it's like the one point that I always land on, and

135
00:22:38.130 --> 00:22:49.440
Heather Dowdy: why we want to use AI. We don't necessarily need a hammer where it's AI everything, but really think about why we're using the technology. And at its core,

136
00:22:49.830 --> 00:23:05.070
Heather Dowdy: AI is technology that processes lots of data, whether images or audio, and allows us to make decisions. It shouldn't replace humans; it should complement our decision making.

137
00:23:05.640 --> 00:23:17.250
Heather Dowdy: And so as we really think about that framework, we really have to think about, okay, some of the design justice principles when it comes to developing AI responsibly:

138
00:23:17.850 --> 00:23:29.760
Heather Dowdy: who is harmed, who benefits, and who's participating, because that also lets you know who's been excluded. As we say in the community, if you're not included, you're excluded.

139
00:23:30.690 --> 00:23:41.610
Heather Dowdy: And so those design justice principles are a lens that can be applied to everything when it comes to lots of technology.

140
00:23:42.960 --> 00:23:57.120
Heather Dowdy: Particularly focused on AI technology, it's important that we do have a balance of privacy and transparency when collecting data, because, quite honestly,

141
00:23:57.510 --> 00:24:06.450
Heather Dowdy: our community can be a little bit fearful: what do you need that data for, is it going to trace back to me specifically as a person living with a disability?

142
00:24:06.870 --> 00:24:21.330
Heather Dowdy: But what I want to counter that with is that we can be transparent on how the data is being used and how it's collected. But where I have to land is that we have to collect more inclusive data sets. We need the community,

143
00:24:21.990 --> 00:24:33.570
Heather Dowdy: which already has inherent trust, to be able to collect those data sets, because that is what's going to help voice assistants, for instance, recognize impacted speech,

144
00:24:34.200 --> 00:24:42.150
Heather Dowdy: people living with cerebral palsy, for example. We need more data, we need those models trained on those acoustic data sets, for sure.

145
00:24:43.800 --> 00:24:55.050
Debra Ruh: Heather, I know that Antonio has a question, but I also was tracking, you know, what was happening with TikTok in the United States and the politics and stuff, but

146
00:24:55.470 --> 00:25:05.310
Debra Ruh: I remember when Microsoft said they were going to buy the American version of TikTok. I was just sort of surprised as an individual, and I was thinking,

147
00:25:05.610 --> 00:25:15.330
Debra Ruh: I wonder why Microsoft wants TikTok. But then I read this very interesting article about it, and they said the reason why, and I don't know what happened, if Microsoft did buy it, but

148
00:25:15.810 --> 00:25:21.870
Debra Ruh: the reason why Microsoft was intrigued with TikTok was because of all those data sets out there, and that

149
00:25:22.290 --> 00:25:34.320
Debra Ruh: just fascinated me, because I thought, well, we are telling TikTok who we are by all these cute little things we're doing, right? So

150
00:25:34.650 --> 00:25:44.670
Debra Ruh: I was just so fascinated, and I didn't know if you wanted to comment, and I know Antonio also has a question, but I wanted to just throw that out real quick because it seemed timely to where you were in the conversation.

151
00:25:47.010 --> 00:26:06.960
Heather Dowdy: TikTok is addictive, I have to add, in a positive way, because I like learning and it's a way to just really get these really nice tidbits in one to two minutes, and it's really engaging. I have to say that for a lot of reasons, but I do think that we have to get creative about where we get data sets from,

152
00:26:08.100 --> 00:26:14.220
Heather Dowdy: because we need scale, which is something Neil mentioned. When you think about

153
00:26:15.330 --> 00:26:22.830
Heather Dowdy: voice recognition that's been trained on billions of data points, and you compare that to, perhaps,

154
00:26:23.790 --> 00:26:27.840
Heather Dowdy: you know, voice recognition for people with impacted speech, like I said,

155
00:26:28.200 --> 00:26:39.120
Heather Dowdy: or even sign language, you know, we're talking about much smaller data sets. And the more data that we train AI on, the more accurate and confident it is.

156
00:26:39.510 --> 00:26:53.040
Heather Dowdy: You know, and so it's just important for us to get creative in how we do it at scale, but I don't think that should paralyze us as a community from starting to figure out how we can collect data sets that are inclusive today.

157
00:26:55.140 --> 00:27:03.390
Antonio Santos: Following on that conversation about data sets, we've seen startups trying to come into this space, trying to make

158
00:27:04.110 --> 00:27:18.990
Antonio Santos: trying to make it more robust and more reliable. But even so, we are talking about assistants. I can tell you a story: when I was studying sociology, there was a program, very well known at the time, called SPSS.

159
00:27:20.160 --> 00:27:31.860
Antonio Santos: So we were using that for analytics and graphics and to do a lot of research linked to the data that we were collecting. But sometimes, when we were

160
00:27:32.460 --> 00:27:42.510
Antonio Santos: at university, we had deadlines and we had to conduct surveys with different groups and institutions, and I saw cases with my colleagues where people were in a rush

161
00:27:43.050 --> 00:27:53.880
Antonio Santos: to complete the task, or to deliver a certain number of surveys, where they ended up filling in the surveys themselves on behalf of other people, because they needed to

162
00:27:54.600 --> 00:28:05.100
Antonio Santos: provide some details and some information to the teacher. And then they were feeding the technology, in this case the analytics tool SPSS, with that data.

163
00:28:05.610 --> 00:28:16.650
Antonio Santos: So there was nothing wrong with the tool itself; the method and the approach, that's where the problem was. So I feel that, you know, we are building all these

164
00:28:17.340 --> 00:28:29.370
Antonio Santos: solutions, we are using, you know, engineers developing systems, but the root cause is a little bit different; sometimes it's not even related to the technology itself. So,

165
00:28:30.360 --> 00:28:38.550
Antonio Santos: knowing as well that sometimes some of these systems are delivered by people from a computer science background, who

166
00:28:39.540 --> 00:28:51.480
Antonio Santos: are focused on targets, on, oh, I really need to deliver something, I need to be productive. How can we make sure that, you know, the other components that are somehow in what I call the

167
00:28:53.010 --> 00:29:08.460
Antonio Santos: area of work of computer engineers, who are not skilled or knowledgeable in terms of the social aspects of it, how can we fill this knowledge gap that exists here in order to avoid mistakes?

168
00:29:10.980 --> 00:29:21.900
Heather Dowdy: Great question, Antonio, and a lot of that is, again, who's at the table, who's participating, and that's why I'm so passionate about education, because we need more people with disabilities

169
00:29:22.230 --> 00:29:30.180
Heather Dowdy: who are entering into data science fields and other fields that leverage AI to be at the table, to really make sure

170
00:29:30.870 --> 00:29:43.440
Heather Dowdy: that we have and deliver an inclusive experience. And I will also say, what I'm encouraged by when I look at AI, and through my experience in the field, is that there is room for inclusive teams.

171
00:29:44.220 --> 00:29:57.300
Heather Dowdy: It's not just computer scientists and data scientists at the table; sometimes we have someone else at the table, a psychologist at the table, and I love that type of diversity of thought, because,

172
00:29:57.810 --> 00:30:05.040
Heather Dowdy: you know, sometimes that means that, hey, you're going to slow down a little bit, naturally, because everyone is not thinking homogeneously.

173
00:30:06.090 --> 00:30:17.610
Heather Dowdy: Which is not a negative thing, but you're going to have a better experience and outcome as a result. But again, we really do need more people with disabilities in this field.

174
00:30:19.140 --> 00:30:19.710
Debra Ruh: Yes.

175
00:30:20.730 --> 00:30:24.540
Neil Milliken: So, I mean, you just mentioned

176
00:30:26.190 --> 00:30:37.980
Neil Milliken: homogeneity, and that is something that AI tends to focus on; it's looking for commonalities, and yet diversity is the opposite.

177
00:30:38.880 --> 00:30:40.170
Neil Milliken: So how do we.

178
00:30:41.400 --> 00:30:43.170
Neil Milliken: How do we approach.

179
00:30:44.310 --> 00:30:46.530
Neil Milliken: using AI

180
00:30:47.760 --> 00:30:57.750
Neil Milliken: and creating AI that can recognize the benefits of stuff at the edge? Because what it tends to do is filter out and look for the middle ground,

181
00:30:58.830 --> 00:31:12.990
Neil Milliken: and actually we're at the extremes quite often, and some of the most beneficial things come from understanding the extremes, because when you include the edges, you're building for everyone and for the center.

182
00:31:13.680 --> 00:31:19.500
Neil Milliken: So how do we fundamentally redesign our approach to AI? That's a big question, I'm sorry.

183
00:31:20.370 --> 00:31:30.810
Heather Dowdy: Yeah, you do that, right? You gave me the question and then labeled it the big question. I'm not mad at you, Neil. It is a big question, one of my favorite questions, because

184
00:31:31.200 --> 00:31:42.630
Heather Dowdy: it's what we call balancing intelligence, you know, based on history, with the future, with discovering something new. So you're right, and

185
00:31:44.100 --> 00:31:58.920
Heather Dowdy: AI is modeling based on historical data, but history hasn't always served marginalized communities, including people with disabilities, well. So do we really want to repeat history by modeling based on historical data?

186
00:31:59.400 --> 00:32:11.850
Heather Dowdy: And the answer is, it really depends, and that's why we really have to remember that using AI technologies is supposed to complement our decision making. So if we can learn insights

187
00:32:12.090 --> 00:32:26.490
Heather Dowdy: about root causes, and perhaps where, for instance, people with disabilities get stuck in that employment pipeline, can we then use that to build something new, but not necessarily replicate what's been done in the past?

188
00:32:28.050 --> 00:32:31.290
Neil Milliken: Excellent point and I guess then.

189
00:32:32.550 --> 00:32:33.480
Neil Milliken: you know,

190
00:32:33.660 --> 00:32:36.150
Neil Milliken: some of it comes down to explainability

191
00:32:36.480 --> 00:32:49.410
Neil Milliken: of AI, right, and AI explainability is a big topic because, for most people, this stuff happens in a black box. You know, it's not transparent as to what's happening or how decisions are being taken,

192
00:32:51.450 --> 00:33:07.680
Neil Milliken: especially when it comes to systems that make decisions that can affect your life. So, you know, things that qualify you for insurance, for Medicare, or deciding whether or not you make the cut for a job.

193
00:33:10.650 --> 00:33:22.290
Neil Milliken: Do you think that there is work still needed to regulate or to build frameworks for this that companies and countries and organizations need to adopt?

194
00:33:24.330 --> 00:33:34.860
Heather Dowdy: I think frameworks are extremely important. We have to talk about what we're doing with the data, particularly when it comes to the disability community.

195
00:33:35.280 --> 00:33:49.680
Heather Dowdy: Or, you know, me walking up to you on the street and saying, hey, give me your social and your height and all of these other things; you should want to know what I'm going to do with that data, and I should be able to explain, as you said,

196
00:33:50.520 --> 00:33:58.440
Heather Dowdy: in a way that honors your privacy and in a transparent way. And so the responsible AI framework that Microsoft has

197
00:33:59.190 --> 00:34:10.110
Heather Dowdy: is one that we firmly believe is important when really considering how to deploy the technology, which is why you see Microsoft

198
00:34:10.770 --> 00:34:20.280
Heather Dowdy: kind of have to reassess things once they're in the field and they are used differently than how we intended them to be designed.

199
00:34:20.520 --> 00:34:29.580
Heather Dowdy: And so that again goes back to a responsible AI framework, because with any technology, you design it once and then sometimes it has a life of its own.

200
00:34:30.450 --> 00:34:38.580
Heather Dowdy: And so how do you control that, you know? But if you really do kind of put those things in place in the beginning,

201
00:34:38.850 --> 00:34:54.930
Heather Dowdy: within that responsible framework, or responsible AI framework, you can begin to continue to have the conversation. You cannot just develop it and leave it; you have to continue to evaluate and have the conversation of how it's being used after it's deployed.

202
00:34:56.550 --> 00:35:04.680
Debra Ruh: Well said. And I have a quick question to ask you, Heather, and Neil as well, and Antonio, I don't know if he wants to comment, but, and I know we're almost out of time, but

203
00:35:05.250 --> 00:35:12.390
Debra Ruh: I'd had a conversation with you, Heather, I've had a lot of really brilliant conversations with you, and one thing that you were

204
00:35:13.200 --> 00:35:18.990
Debra Ruh: talking to me about was, you know, how can corporations help the community of people with disabilities?

205
00:35:19.260 --> 00:35:26.040
Debra Ruh: Because often, when we're business owners and things like that, we don't really even know how to pitch what we're doing to corporations.

206
00:35:26.280 --> 00:35:36.540
Debra Ruh: And so we know that we want the community involved, but what role can Microsoft and Atos play to make sure that the disenfranchised communities

207
00:35:37.380 --> 00:35:44.700
Debra Ruh: understand how to join the conversations and how to have the conversations in a way that they can be heard? And

208
00:35:45.420 --> 00:35:55.740
Debra Ruh: often we don't think that's the case, but I know that's something that has been very important to you, Heather, so I was just wondering if we could talk about that real briefly before we end.

209
00:35:56.310 --> 00:36:06.450
Heather Dowdy: Sure, and I'm glad someone asked me the question, because, you know, the answer might surprise everyone else, but I don't think it'll surprise you, and it's amplifying storytelling.

210
00:36:07.080 --> 00:36:20.130
Heather Dowdy: And that, essentially, is what we're doing with the Microsoft AI for Accessibility program: we are providing lots of examples where AI can be used for good to solve societal challenges

211
00:36:20.820 --> 00:36:34.230
Heather Dowdy: for the disability community. And when corporations like Microsoft and Atos get behind the disability community talking about the impact of the product, service, technology, what have you,

212
00:36:34.650 --> 00:36:39.240
Heather Dowdy: that is a totally different framework than Microsoft telling you that story.

213
00:36:39.840 --> 00:36:53.130
Heather Dowdy: You really get the human need and why that technology is needed, and you really get the impact. And so it really is simple, which is why I love amplifying other people's stories. Last week we had

214
00:36:53.730 --> 00:37:03.180
Heather Dowdy: one of the projects that we're working on with Northwestern and Mental Health America and the University of Toronto, where they're looking at democratizing

215
00:37:03.510 --> 00:37:15.600
Heather Dowdy: access to mental health solutions through something that's pretty low-tech, text messaging, but it's backed up by a really fancy AI algorithm that can deliver

216
00:37:15.870 --> 00:37:26.670
Heather Dowdy: mental health and behavioral interventions right when you need it, based on how you engage with that app, so to really figure out, okay, what is going to work to continue to engage

217
00:37:26.940 --> 00:37:31.230
Heather Dowdy: this person and develop this new behavioral pattern and habit.

218
00:37:31.650 --> 00:37:46.320
Heather Dowdy: And this is for the benefit of lots of people that are wary of going to a therapist, and so thinking about how we can reach people where they are. But then to hear the impact of the story,

219
00:37:46.800 --> 00:37:58.260
Heather Dowdy: of Mental Health America talking about how anxiety screenings went up 400% during the pandemic and depression screenings went up about 800%.

220
00:37:58.800 --> 00:38:13.980
Heather Dowdy: And the fact that, you know, mental health was the number one and growing disability before the pandemic, but those numbers tell you, hey, we've got a heat map going on here, hey, we need some solutions today.

221
00:38:14.790 --> 00:38:35.130
Heather Dowdy: And so I love the concept of really thinking about what accessibility has always done, which is when you solve for one, you extend to many. And I love that about this community, and I love that about the stories that we get to tell with the community using technology.

222
00:38:39.840 --> 00:38:47.010
Neil Milliken: And Debra's not the only believer in storytelling, so you're in good company.

223
00:38:49.170 --> 00:38:53.910
Neil Milliken: It's immensely powerful and a way to connect to people; it's authentic.

224
00:38:55.170 --> 00:39:06.780
Neil Milliken: And I actually need to give my authentic thanks to you for being our guest, for being part of our community, for being a great ally, for working in tech. I need to thank MyClearText for keeping us captioned,

225
00:39:08.010 --> 00:39:19.080
Neil Milliken: and Barclays Access for also supporting us over the years to keep us on air. We're now in our seventh year, 13-plus billion impressions on Twitter and counting.

226
00:39:20.730 --> 00:39:23.640
Neil Milliken: So thank you, Heather, I'm really looking forward to the chat

227
00:39:24.900 --> 00:39:28.140
Neil Milliken: on Tuesday night on Twitter, or Tuesday lunchtime for you.

228
00:39:29.280 --> 00:39:30.690
Neil Milliken: Thank you, have a great weekend.

229
00:39:31.440 --> 00:39:34.290
Heather Dowdy: Thank you so much, so much gratitude, thank you all.

230
00:39:34.830 --> 00:39:36.240
Debra Ruh: Thank you, Heather, thank you.