AXSChat Podcast

AXSChat Podcast with Dr. Rachele Hendricks-Sturrup, Health Policy Counsel at the Future of Privacy Forum.

April 23, 2021 Antonio Santos, Debra Ruh, Neil Milliken

 Dr. Rachele Hendricks-Sturrup is a health scientist and Health Policy Counsel at the Future of Privacy Forum. Dr. Hendricks-Sturrup’s work involves exploring and addressing ethical, legal, and social issues and implementation barriers at the forefront of health policy and innovation. Her research centers on generating best practices for the use and processing of health and genetics data across the patient-consumer spectrum.


Follow axschat on social media
Twitter:

https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh

LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/

Vimeo
https://vimeo.com/akwyz




WEBVTT

1
00:00:03.270 --> 00:00:12.900
Neil Milliken: Welcome to AXSChat. I'm delighted that you're joining us for another week, and I'm equally delighted that we're interviewing Rachele Hendricks-Sturrup today. Rachele is

2
00:00:13.290 --> 00:00:24.480
Neil Milliken: an academic working for the Future of Privacy Forum, which has a really interesting role as a nonprofit organization that is looking into things like

3
00:00:25.140 --> 00:00:37.020
Neil Milliken: the ethical and legal dilemmas of technology and society. These are big questions. Tell us a little bit more about yourself and how you came to be in this super interesting role, please.

4
00:00:37.230 --> 00:00:40.710
Rachele Hendricks-Sturrup: Yes, absolutely. First of all, thank you for having me; it's a pleasure to be here.

5
00:00:41.910 --> 00:00:52.560
Rachele Hendricks-Sturrup: As you mentioned, I work for the Future of Privacy Forum, a think tank located in Washington, DC. There I work as the Health Policy Counsel and lead, focusing on

6
00:00:53.040 --> 00:01:05.670
Rachele Hendricks-Sturrup: issues at the forefront of technology and innovation and how they affect the world of consumer health and consumer genetics, and also how that might intersect with

7
00:01:06.120 --> 00:01:25.080
Rachele Hendricks-Sturrup: traditional clinical settings where health and genetics are at the forefront as well. So, really broadly here in the US, and I say that because the term I'm about to use is related to the 21st Century Cures Act here in the US, I focus on matters related to things like precision medicine,

8
00:01:26.910 --> 00:01:41.430
Rachele Hendricks-Sturrup: mobile health, and really the full gamut of what all of these technologies can do for health consumers and patients, and for other stakeholders like the companies who create them.

9
00:01:42.630 --> 00:01:56.610
Neil Milliken: Excellent. Technology is this amazing thing, and we've seen the rapid acceleration of the deployment of technology, and the deployment of ever more,

10
00:01:58.380 --> 00:02:03.390
Neil Milliken: how do I put this, invasive, in terms of knowing more about us. So the technology

11
00:02:04.530 --> 00:02:24.240
Neil Milliken: has so much more knowledge about us now; it's able to predict things that we don't even understand about ourselves. We're taking biosignal readings, we're embedding things into our bodies, we're collecting all kinds of

12
00:02:25.680 --> 00:02:34.080
Neil Milliken: data: the quantified human being, through the things we're wearing on our wrists and from everything else that we're doing.

13
00:02:35.010 --> 00:02:50.850
Neil Milliken: How do we keep on top of all of this and make sure that this data we're gathering, the picture and the predictability patterns that we can create from all of this, are used for the right reasons?

14
00:02:51.750 --> 00:02:58.200
Neil Milliken: These are the kinds of ethical questions that you're pondering as an organization, right?

15
00:02:58.860 --> 00:03:14.280
Rachele Hendricks-Sturrup: Absolutely. So I spend a lot of time thinking about what you just described, what we might call an N-of-1 study, where a single person can generate billions of different data points about themselves and then transform into what you just called the quantified person.

16
00:03:15.600 --> 00:03:19.830
Rachele Hendricks-Sturrup: Thinking about that, and then also thinking broadly about how data

17
00:03:20.010 --> 00:03:29.610
Rachele Hendricks-Sturrup: is now collected across millions in society: millions of data points about their movement, data points about their health behavior,

18
00:03:30.540 --> 00:03:39.240
Rachele Hendricks-Sturrup: also data about their genetics and data about their location, and all of that is coming together

19
00:03:39.960 --> 00:03:53.160
Rachele Hendricks-Sturrup: to allow data users or processors to create segments of populations, to assist them in understanding those populations and also reaching out to those populations. So really,

20
00:03:53.550 --> 00:04:02.610
Rachele Hendricks-Sturrup: ourselves as individuals, and then together as a society or as a population, we are becoming increasingly quantified, and today we have the infrastructure

21
00:04:02.880 --> 00:04:12.270
Rachele Hendricks-Sturrup: that allows that quantified person to exist, whereas we didn't have that, say, a couple of decades ago. So as we collect all of these data points and

22
00:04:12.930 --> 00:04:24.240
Rachele Hendricks-Sturrup: try to figure out what they can tell us when we transform them from data into evidence: what does it mean? Who's going to use the data, and why?

23
00:04:24.900 --> 00:04:42.690
Rachele Hendricks-Sturrup: What are some of the limitations to using this data, given that the data is really just a snapshot, a cross-sectional analysis of a person at a certain time under a certain environmental condition or setting?

24
00:04:43.710 --> 00:04:54.810
Rachele Hendricks-Sturrup: How important is it to recognize the limitations of those data sets, to make sure that any choices we make down the line don't perpetuate any particular bias, any

25
00:04:55.200 --> 00:05:07.620
Rachele Hendricks-Sturrup: particular opinion that may be just that, an opinion and not necessarily a fact? And how do we educate society on how to ask the right questions, rather than

26
00:05:08.640 --> 00:05:14.370
Rachele Hendricks-Sturrup: educating them about what exactly they should know more about? How should we think about this?

27
00:05:14.670 --> 00:05:22.440
Rachele Hendricks-Sturrup: Or, as we collect all of these data points, how should we partner that with more qualitative data, so that we can understand the context around

28
00:05:22.770 --> 00:05:39.780
Rachele Hendricks-Sturrup: the quantified data that we have collected and that we're trying to understand, to ask a research question or any question that might help drive other aspects of society, like the economy, healthcare, creating better standards of care, empowering the consumer,

29
00:05:40.830 --> 00:05:52.860
Rachele Hendricks-Sturrup: or making certain processes around healthcare or entertainment or whatever it might be more efficient for the user? How do we ask the right questions to make sure we're helping ourselves and not hurting ourselves and others?

30
00:05:53.370 --> 00:06:01.890
Neil Milliken: So I think those are really good questions to be asking, and one of the things I've been observing is that there is an awful lot of concentration on

31
00:06:02.130 --> 00:06:14.820
Neil Milliken: being able to use data in certain ways, but what we're tending to do is look in a fairly siloed way at some of this stuff. So you have these great big data lakes, but we're asking questions that are quite narrow.

32
00:06:15.570 --> 00:06:26.520
Neil Milliken: And we're not really understanding all of the sideways or systemic impacts, so I think what we're really lacking at the moment in some of this is systems thinking,

33
00:06:27.390 --> 00:06:37.470
Neil Milliken: because we're looking to solve one problem, and if all you've got is a hammer, everything starts to look like a nail. So we're applying this sort of

34
00:06:37.920 --> 00:06:47.310
Neil Milliken: data analytics approach without really thinking about all of those other things that we might need to consider. So when you mentioned precision medicine,

35
00:06:48.030 --> 00:06:52.350
Neil Milliken: for example, everyone's saying this is brilliant, we can

36
00:06:52.680 --> 00:06:59.700
Neil Milliken: cure stuff before it starts happening. But what about the impact that it's going to have on people's ability to get health insurance and health care

37
00:07:00.060 --> 00:07:06.780
Neil Milliken: and things like that? Are we actually then creating an underclass? Are we creating Gattaca, if you like?

38
00:07:07.650 --> 00:07:15.540
Neil Milliken: What about the fact that other people are saying, you know what, we can cure ADHD, we can

39
00:07:16.110 --> 00:07:29.760
Neil Milliken: screen it out? Well, maybe we don't want to be screened out; maybe that's eugenics. So how can an organization like the Future of Privacy Forum

40
00:07:30.210 --> 00:07:45.780
Neil Milliken: influence these conversations and get the people that are plunging headlong into developing this tech, because they're not going to slow down, to actually engage and consider these questions before the genie's really out of the box?

41
00:07:46.590 --> 00:07:57.240
Rachele Hendricks-Sturrup: That's an excellent question, and thank you for your comments. I think you really drove home the point that I'm trying to make here, which is that we have to be better data stewards.

42
00:07:57.540 --> 00:08:05.550
Rachele Hendricks-Sturrup: At the Future of Privacy Forum we champion ourselves as data stewards, basically helping individuals and others,

43
00:08:06.030 --> 00:08:16.740
Rachele Hendricks-Sturrup: other data stakeholders, understand that it's not that we're trying to prevent the collection of the data in the name of privacy; it's that we're trying to imagine the future of privacy as:

44
00:08:17.340 --> 00:08:27.300
Rachele Hendricks-Sturrup: how will the data be used, and how can we educate people on using the data in ways that are good for individuals and good for society?

45
00:08:27.600 --> 00:08:34.620
Rachele Hendricks-Sturrup: So, again, chauffeuring the data towards good applications, beneficial applications, so that way,

46
00:08:34.950 --> 00:08:43.410
Rachele Hendricks-Sturrup: as the data is generated, processed, and used, we're collaborating around using the data and not compromising ourselves in the process, or compromising the livelihood of

47
00:08:43.710 --> 00:08:51.030
Rachele Hendricks-Sturrup: ourselves and others for the benefit of innovation, or in the name and sake of innovation or precision medicine, as you said.

48
00:08:52.290 --> 00:09:03.300
Debra Ruh: Wow, this is an amazing conversation. I know I had the pleasure to speak on a panel with you, Rachele, and I was just blown away; you give me a lot of hope for the future.

49
00:09:04.170 --> 00:09:19.620
Debra Ruh: And I wonder about things like, I know that we're very concerned about the lack of data sets for diverse groups all over the world; we're very concerned about that, and we should be, and there are a lot of efforts. That's how we met; we met

50
00:09:20.370 --> 00:09:27.390
Debra Ruh: on a Women in AI and Ethics panel with Mia Dand, who I love. And

51
00:09:28.530 --> 00:09:42.810
Debra Ruh: I see a lot of groups really struggling to make sure their data sets are appropriately inclusive, but how do you tie all that together? I mean, I love that you're teaching me about data stewards and data stakeholders.

52
00:09:43.170 --> 00:09:49.500
Debra Ruh: How do we make sure the right data, and I don't even know if that is the right word, is collected?

53
00:09:49.860 --> 00:09:59.460
Debra Ruh: And how do we make sure all the groups are represented appropriately? It just sometimes feels like such a gigantic problem

54
00:09:59.910 --> 00:10:17.460
Debra Ruh: with so many moving parts, which I think you and Neil were already exploring this morning. But I'm just curious, do you have any ideas on how we do that? Because it still feels like we're doing it very siloed and not working together.

55
00:10:17.940 --> 00:10:20.910
Rachele Hendricks-Sturrup: Yeah, definitely. Well, that's a very nuanced question.

56
00:10:21.960 --> 00:10:24.360
Rachele Hendricks-Sturrup: As you stated, representation matters.

57
00:10:25.380 --> 00:10:39.810
Rachele Hendricks-Sturrup: It's very important not to make material decisions based on data that lacks representation, or that is biased or skewed to a particular population yet applied to the broad population.

58
00:10:40.530 --> 00:10:51.720
Rachele Hendricks-Sturrup: It's important to recognize the limitations of the data, as I mentioned earlier, but also to equally understand that representation in the data is very important, especially if

59
00:10:52.500 --> 00:11:07.530
Rachele Hendricks-Sturrup: the use of that data set will affect the population broadly. But then, at the same time, on the flip side of things, you have to understand the sensitivities around who does not want to be included in the data set and why.

60
00:11:08.760 --> 00:11:16.650
Rachele Hendricks-Sturrup: There could be trust issues around how the data will be used, controversy around how the data will be used,

61
00:11:16.980 --> 00:11:25.920
Rachele Hendricks-Sturrup: a certain precedent that maybe was set by the data user or handler that has stirred up some controversy in society,

62
00:11:26.460 --> 00:11:37.680
Rachele Hendricks-Sturrup: leading to individuals not feeling comfortable contributing data or information about themselves. And then data sharing practices are also a concern among

63
00:11:38.160 --> 00:11:47.070
Rachele Hendricks-Sturrup: many individuals; it's probably a global concern, really, data sharing practices, in the sense that if I give you my data, how can you assure me that you won't share it with

64
00:11:47.700 --> 00:11:53.130
Rachele Hendricks-Sturrup: another third party that may or may not have nefarious intent around using my data? Or,

65
00:11:53.640 --> 00:12:01.590
Rachele Hendricks-Sturrup: if I've given you broad consent over uses of my data, how will I know that any secondary uses will be in my best interest,

66
00:12:02.490 --> 00:12:12.720
Rachele Hendricks-Sturrup: or in the best interest of society, or my immediate loved ones, or whomever? So there are a lot of unanswered questions, and we are really just at the cusp,

67
00:12:13.200 --> 00:12:25.530
Rachele Hendricks-Sturrup: at the cusp, of trying to figure this out. And I think we're at a very critical moment in society, where we are now understanding that it's not just the quantified individual that's important; it's the quality of the qualified

68
00:12:26.190 --> 00:12:34.080
Rachele Hendricks-Sturrup: individual that's important, and really understanding, again, what are some of the humanistic aspects that accompany

69
00:12:35.100 --> 00:12:43.200
Rachele Hendricks-Sturrup: not just the collection and generation of the data, but the uses of the data. How do we bring the human back to the center, rather than

70
00:12:44.100 --> 00:12:49.350
Rachele Hendricks-Sturrup: rather than just taking off with this reductionist quantified-person approach?

71
00:12:50.100 --> 00:12:58.470
Rachele Hendricks-Sturrup: That's, I think, the greater challenge that we now face, because now it's allowing, or it's pressuring,

72
00:12:59.340 --> 00:13:05.370
Rachele Hendricks-Sturrup: us to look at ourselves in the mirror, and unfortunately, for most, that's one of the hardest things to do.

73
00:13:05.910 --> 00:13:11.070
Rachele Hendricks-Sturrup: Here we are, coming back full circle, looking ourselves dead in the face and asking ourselves: okay,

74
00:13:11.820 --> 00:13:23.580
Rachele Hendricks-Sturrup: as qualified individuals, how do we responsibly guide all this quantitative information about ourselves to make sure that we're not harming ourselves today and generations down the line?

75
00:13:30.120 --> 00:13:43.020
Antonio Santos: When I look at the world of data, whether it's being used for artificial intelligence or even for industrial manufacturing, I just see so many contradictions:

76
00:13:43.860 --> 00:13:53.490
Antonio Santos: from policymakers, from different sectors, from advocacy groups who sometimes collide in terms of interests,

77
00:13:54.420 --> 00:14:07.380
Antonio Santos: and then from industry or companies who are saying, we want to use data, but somehow we are always afraid of making a mistake because somebody is going to tell us that we have done something wrong there.

78
00:14:09.030 --> 00:14:10.860
Antonio Santos: And just as a side note,

79
00:14:12.000 --> 00:14:25.890
Antonio Santos: when we were discussing GDPR in Europe, in my mind I had decided that the first organizations that were going to be hit by GDPR were not going to be private; they were going to be government,

80
00:14:26.340 --> 00:14:33.840
Antonio Santos: because, and I had my reasons for that, usually in Europe the public sector is where the problems with

81
00:14:34.680 --> 00:14:45.060
Antonio Santos: systems exist, because they don't use updated systems; sometimes government doesn't fund organizations enough to have access to the latest applications. So when I saw

82
00:14:45.810 --> 00:14:57.030
Antonio Santos: all the politicians pushing GDPR, I was really expecting that to happen, and that was what actually happened: public hospitals that are funded by government,

83
00:14:57.840 --> 00:15:09.540
Antonio Santos: that use applications approved by government, were the first ones to get fined under GDPR. It's quite a contradiction, because governments are actually the ones who apply those rules.

84
00:15:09.990 --> 00:15:25.020
Antonio Santos: So it's kind of a contradiction in itself. And more recently, about a week ago, there was a discussion at a big event in Europe, a dialogue between industry leaders and government,

85
00:15:26.040 --> 00:15:32.430
Antonio Santos: and people within this were saying, okay, the German government is trying to put together an application

86
00:15:33.630 --> 00:15:38.490
Antonio Santos: for a kind of passport, a passport for COVID, for 11 months,

87
00:15:39.330 --> 00:15:50.910
Antonio Santos: and after 11 months you're not able to build an app, a system, because even within government you're not able to make yourself clear in relation to the use of data.

88
00:15:51.330 --> 00:16:05.880
Antonio Santos: So if the regulators are not able to come forward with a simple idea of how to use data, how can we expect private companies to do it? So, in the end, how can we sort out all this mess?

89
00:16:06.900 --> 00:16:14.760
Rachele Hendricks-Sturrup: Yeah, you definitely highlighted a lot of pressing contradictions that I think many of us have seen; I have certainly seen

90
00:16:16.500 --> 00:16:24.660
Rachele Hendricks-Sturrup: it. And so, again, another thing we do at FPF is we try to engage; engagement is another thing that I think is

91
00:16:25.680 --> 00:16:34.350
Rachele Hendricks-Sturrup: incredibly important today: bringing everyone to the table to discuss exactly what you just said, people with your perspective, like, okay,

92
00:16:35.070 --> 00:16:40.200
Rachele Hendricks-Sturrup: I'm bringing the perspective of someone from civil society, and this is what I've observed: I've observed

93
00:16:40.590 --> 00:16:46.050
Rachele Hendricks-Sturrup: engagement between government and industry in this contradictory way that doesn't work, and ultimately

94
00:16:46.500 --> 00:16:57.540
Rachele Hendricks-Sturrup: myself and others within civil society are the ones, or at least some of the entities, that suffer in the process of it all. So, obviously, guys, this is not working.

95
00:16:58.260 --> 00:17:09.480
Rachele Hendricks-Sturrup: Bringing voices like yours to the table with industry and government present, bringing everyone to the table so that they're present and they have a safe space to engage,

96
00:17:10.470 --> 00:17:23.160
Rachele Hendricks-Sturrup: and they have someone moderating, moderating to control for any power dynamics that might exist amongst the groups, collecting information to find out where we all stand: first of all, what can we all agree on?

97
00:17:24.510 --> 00:17:31.350
Rachele Hendricks-Sturrup: We might not agree on how to get there, but what do we all agree on in terms of where we need to go, what we want, and what we need to do?

98
00:17:31.860 --> 00:17:51.150
Rachele Hendricks-Sturrup: We're here to level-set and fact-find around that, so that we can figure out what everyone is saying but not saying, or saying and not saying. A lot of times, when I engage with stakeholders with multiple perspectives or lived experiences,

99
00:17:52.200 --> 00:18:02.490
Rachele Hendricks-Sturrup: I often find that we're all saying the same thing three, five, six, seven different ways, or we all want the same thing, but we simply don't agree on how to get there.

100
00:18:03.660 --> 00:18:17.070
Rachele Hendricks-Sturrup: So again, making sure that we have a representative group at the table, going back to what Debra mentioned, representation is key, making sure that we're not leaving critical or key stakeholders out of the conversation,

101
00:18:18.300 --> 00:18:34.410
Rachele Hendricks-Sturrup: and making sure that we are essentially flying the spaceship together, not just in the old-fashioned, power-strained way that we have been, I'd say, for the last hundreds of years.

102
00:18:35.490 --> 00:18:48.960
Rachele Hendricks-Sturrup: So how do we not repeat mistakes of the past in terms of leaving people behind, and rather leverage the data that we can collect, leverage the infrastructure that we have to collect all of the data, or leverage

103
00:18:49.770 --> 00:18:53.610
Rachele Hendricks-Sturrup: the organizations that have emerged, like the Future of Privacy Forum,

104
00:18:54.810 --> 00:19:03.870
Rachele Hendricks-Sturrup: leveraging all of that to make sure that, again, we are moving towards a state of good versus a state of more inequality,

105
00:19:04.380 --> 00:19:13.140
Rachele Hendricks-Sturrup: or a state of more violence against certain groups, or a state of misunderstanding between one group and another. And there's a real art to that,

106
00:19:13.710 --> 00:19:26.280
Rachele Hendricks-Sturrup: and that's, I think, one of the hardest things to do. It takes a lot of skill in conflict resolution and a lot of skill in engagement, essentially, and it also takes

107
00:19:26.880 --> 00:19:35.130
Rachele Hendricks-Sturrup: a human who knows how to connect with other humans, to make sure that everyone feels that they're in, again, a safe space, a place where

108
00:19:35.490 --> 00:19:52.020
Rachele Hendricks-Sturrup: their opinion is valued, their perspectives are heard, and their lived experiences are actually validated and taken into account, and not dismissed because of power dynamics that might exist in the room. We need this sort of mediation,

109
00:19:53.610 --> 00:19:57.090
Rachele Hendricks-Sturrup: pairing that with data, in a stewarded fashion.

110
00:19:58.050 --> 00:20:07.470
Antonio Santos: What frustrates me is the fact that this is taking so long, and in the end I don't think it will be that difficult.

111
00:20:08.670 --> 00:20:20.040
Antonio Santos: But it's taking so long to bring clarity to the way we use data, and this takes away different opportunities. And in the situation that we are in now,

112
00:20:21.030 --> 00:20:35.640
Antonio Santos: I believe this could have helped us mitigate the spread of the pandemic if we had been able to use the data in the right way, because in the end we haven't used the data in the right way, because of privacy concerns.

113
00:20:36.810 --> 00:20:44.730
Antonio Santos: I think this could have been a great way for us to operate in a smarter way, in the way we reacted to it.

114
00:20:46.770 --> 00:20:58.770
Rachele Hendricks-Sturrup: I 100% agree. I think right now we have an opportunity to, again, look inward, look at ourselves and ask: where do we want to go?

115
00:20:59.790 --> 00:21:05.850
Rachele Hendricks-Sturrup: Will we really get where we want to go by being divided, or by

116
00:21:07.530 --> 00:21:15.660
Rachele Hendricks-Sturrup: misunderstanding one another, or being dedicated to misunderstanding one another? I mean, some people are dedicated to misunderstanding certain groups; we all know that.

117
00:21:16.530 --> 00:21:26.550
Rachele Hendricks-Sturrup: And that's a whole different discussion, but at the same time, for those of us who have an interest in the public good, or who have an interest in the economy,

118
00:21:26.880 --> 00:21:41.400
Rachele Hendricks-Sturrup: or an interest in social welfare and all of these things, we have an opportunity now to not just come to the table and engage with one another, but also to look at the data that we have collected and ask: how can we use this data?

119
00:21:42.450 --> 00:21:53.340
Rachele Hendricks-Sturrup: And how can we make sure that the data is of sufficient quality first, before using it in ways that benefit, or in ways that could support,

120
00:21:54.750 --> 00:21:56.040
Rachele Hendricks-Sturrup: the stewardship efforts that

121
00:21:56.040 --> 00:21:57.060
Rachele Hendricks-Sturrup: we do

122
00:21:58.710 --> 00:22:09.030
Rachele Hendricks-Sturrup: engage in at the moment, to make sure that we ourselves today, and generations after, can benefit from what we're doing today, and we can also teach them how to

123
00:22:09.510 --> 00:22:19.110
Rachele Hendricks-Sturrup: properly steward data, qualitative and quantitative data, teaching them how to properly do that because we've done it already. So transferring those lessons learned

124
00:22:19.770 --> 00:22:33.690
Rachele Hendricks-Sturrup: today, and into future generations, I think, is most important. And so today we've approached that challenge, and I think now's the time to seize the moment and really execute, and teach generations how to do that as well.

125
00:22:34.950 --> 00:22:43.530
Debra Ruh: I agree, I agree. And Rachele, you talked about it a little bit with Antonio, but are there lessons

126
00:22:45.120 --> 00:22:48.630
Debra Ruh: that we have learned in society because of the pandemic and COVID

127
00:22:49.140 --> 00:23:01.200
Debra Ruh: that can now be applied to the way that we are collecting, managing, and using data? I love the point that you made about the quantified person; I've never heard that term.

128
00:23:01.530 --> 00:23:05.850
Debra Ruh: So I just thought that was fascinating. But I often see

129
00:23:06.480 --> 00:23:16.380
Debra Ruh: US corporations, just because I live in the US and I'm probably noticing it more, in front of our Congress explaining how they misuse our data

130
00:23:16.770 --> 00:23:27.270
Debra Ruh: and promising, oh no, we won't, and I don't believe anything they're saying, because I just feel they are going to continue to misuse our data. I think consumer confidence is at an all-time low. But

131
00:23:27.570 --> 00:23:38.100
Debra Ruh: have we, as a society, and through the work that you're doing, are you seeing lessons learned that are going to be valuable as we move forward?

132
00:23:39.330 --> 00:23:46.020
Rachele Hendricks-Sturrup: I have seen lessons learned emerge, and it really comes from, I would say,

133
00:23:46.500 --> 00:23:57.750
Rachele Hendricks-Sturrup: at least when it comes to learning how to engage individuals and bring diverse perspectives to the table, I have seen that happen, or those types of lessons learned created, on the academic side.

134
00:23:58.320 --> 00:24:09.960
Rachele Hendricks-Sturrup: A lot of academicians have been dedicated to doing that, as well as other nonprofit organizations that have a data-driven focus. And I have also

135
00:24:11.070 --> 00:24:28.230
Rachele Hendricks-Sturrup: seen advocacy organizations and others from civil society, like citizen scientists, for example, create these sorts of lessons learned. So even drawing on some resources that they've created that are publicly available online is a good place to start.

136
00:24:29.280 --> 00:24:44.490
Rachele Hendricks-Sturrup: Obviously, at FPF we've put out our own work, and we continue to put out our own work in this regard, and we will continue to do this, especially amid COVID-19. I've spent a lot of time recently looking at vaccine passports,

137
00:24:45.930 --> 00:25:07.920
Rachele Hendricks-Sturrup: really understanding the nuances there. And I think one important thing to realize is that, as we sort of anchor our new way, or desire, to digitize vaccine passports or vaccine records and use them in a way that's broader than what we're used to,

138
00:25:09.270 --> 00:25:24.750
Rachele Hendricks-Sturrup: it's important to understand, first of all, that there are still a lot of unanswered questions around COVID, in terms of what vaccine efficacy might look like in the next five years. Also,

139
00:25:26.100 --> 00:25:37.560
Rachele Hendricks-Sturrup: with vaccine passports we're creating this digital solution for a one-time pandemic, for a particular cross-sectional event in time.

140
00:25:38.100 --> 00:25:51.720
Rachele Hendricks-Sturrup: And then, on top of that, some are even considering the use of vaccine passports for day-to-day access to just day-to-day events and activities, like going to the park, going to the bank, going to the grocery store,

141
00:25:52.590 --> 00:26:05.460
Rachele Hendricks-Sturrup: going to the doctor, going to work, whatever it might be. So do we really want to create this sort of surveillance society around, or in the name of, vaccine passports,

142
00:26:06.510 --> 00:26:20.550
Rachele Hendricks-Sturrup: or in the name of addressing COVID-19? Vaccine records have been around for a long time; a lot of us are familiar with having to show them when we travel internationally, or when we sign our kids up for school,

143
00:26:21.330 --> 00:26:38.190
Rachele Hendricks-Sturrup: or when we go to school on campus. So showing vaccine records is not new, but a vaccine passport just to gain access to day-to-day activities is a bit invasive to many, and many have argued that. And again,

144
00:26:39.690 --> 00:26:54.270
Rachele Hendricks-Sturrup: thinking about how whatever we do today, or whatever we implement today, could become the new societal norm in the future: is that where we really want to go, and is this really in the best interest of privacy, in the best interest of societal growth,

145
00:26:55.500 --> 00:26:59.550
Rachele Hendricks-Sturrup: and also in the best interests of generations to come, who

146
00:27:00.660 --> 00:27:08.640
Rachele Hendricks-Sturrup: might fight back against the privacy-invasive nature of any type of passport?

147
00:27:09.840 --> 00:27:15.180
Rachele Hendricks-Sturrup: We still just have so many questions, and I think understanding, again,

148
00:27:15.600 --> 00:27:25.110
Rachele Hendricks-Sturrup: how digitization and data play a role in all of this: who's storing all of this vaccine passport data, who's storing all the geolocation data that might accompany it?

149
00:27:25.680 --> 00:27:39.000
Rachele Hendricks-Sturrup: It's questionable, and we cannot just implement something without asking these questions first and engaging the right stakeholders, so that we can sort of vote on whether or not this is actually where we want to go, or what we want to do.

150
00:27:41.220 --> 00:27:43.050
Neil Milliken: Yeah, and I think that,

151
00:27:44.520 --> 00:27:56.220
Neil Milliken: with COVID, we've seen very draconian laws put in place that in normal times would have people protesting in the streets at the very thought of them.

152
00:27:56.670 --> 00:28:04.200
Neil Milliken: And yet we've allowed all of this just to be applied to us, and the amount of surveillance that states can now

153
00:28:05.160 --> 00:28:13.740
Neil Milliken: impose on citizens is unprecedented; we have the technology to do it at a much deeper level than ever before as well. So I think that

154
00:28:14.340 --> 00:28:24.630
Neil Milliken: those kinds of things have almost been done unquestioningly, because we've sleepwalked into this, because we're in this kind of crisis.

155
00:28:25.710 --> 00:28:37.560
Neil Milliken: What we're seeing in the UK right now is changes to the laws about protest too, so even when COVID finishes we're still going to have difficulty being able to

156
00:28:38.670 --> 00:28:48.810
Neil Milliken: protest these laws, because actually there's a new law being put in place to restrict our ability to do what we've always done, which is gather in person

157
00:28:49.890 --> 00:28:51.780
Neil Milliken: to raise our concerns. So,

158
00:28:52.980 --> 00:28:58.080
Neil Milliken: some of them do have a time limit on them, but essentially the precedent has been set.

159
00:28:59.130 --> 00:29:02.100
Neil Milliken: The other thing is, I think that we've also

160
00:29:03.960 --> 00:29:05.670
Neil Milliken: really fixated on digital.

161
00:29:06.750 --> 00:29:16.440
Neil Milliken: And where the UK failed really miserably in terms of track and trace was that we trusted digital over

162
00:29:17.550 --> 00:29:24.390
Neil Milliken: existing systems. So actually, we had a really very good paper-based system of

163
00:29:25.980 --> 00:29:37.680
Neil Milliken: epidemiological tracking and tracing in the UK, and it was well set up, but it was done manually by the doctors' surgeries, and they collected all of it.

164
00:29:38.970 --> 00:29:55.020
Neil Milliken: But we have an administration that is fixated on digitizing everything, and so they didn't consult the doctors, they didn't go the traditional way; they went out and contracted organizations that had no past history of doing this, and as a result,

165
00:29:56.220 --> 00:30:03.630
Neil Milliken: none of it worked. So I think that in our ethical considerations we also need to be thinking about:

166
00:30:05.100 --> 00:30:15.810
Neil Milliken: is technology always the answer to some of this? Why do we have to digitize just because everyone's saying yes, we must digitize? So,

167
00:30:16.890 --> 00:30:20.580
Neil Milliken: why don't we take a step back and understand what we've got,

168
00:30:21.750 --> 00:30:32.940
Neil Milliken: and determine whether or not we should digitize and what we should be doing, and, like Antonio was saying, in terms of privacy and so on, which bits of information are necessary? And I know that

169
00:30:35.010 --> 00:30:36.600
Neil Milliken: some former colleagues of mine

170
00:30:38.310 --> 00:30:47.220
Neil Milliken: are working on things like blockchain in terms of allowing only the relevant bits of information to be given to people,

171
00:30:48.090 --> 00:30:52.560
Neil Milliken: to enable certain transactions to take place, so that you have

172
00:30:53.100 --> 00:31:01.560
Neil Milliken: your digital identity on the blockchain and only share the bits that are required for that entity to deliver a service, et cetera. So I think

173
00:31:01.830 --> 00:31:16.530
Neil Milliken: that there are people thinking about this stuff and thinking about where it's necessary to apply tech, but that, as a society, we need to take stock of this and think a bit more deeply. So I'm really impressed by the work that's

174
00:31:17.670 --> 00:31:19.710
Neil Milliken: being done by your organization on this.

175
00:31:20.340 --> 00:31:41.460
Rachele Hendricks-Sturrup: Thank you, and I agree. I think a lot of confidence was placed in digitization efforts like digital contact tracing, but we failed to first and foremost understand what society actually wanted or needed, or what society felt comfortable

176
00:31:42.510 --> 00:31:44.490
Rachele Hendricks-Sturrup: with using.

177
00:31:46.080 --> 00:32:01.680
Rachele Hendricks-Sturrup: A lot of the user experience and user interface work hadn't been there initially, or if it was, it clearly just wasn't enough. So again, there are still a lot of questions that we need to ask; there's still a lot of

178
00:32:03.240 --> 00:32:13.080
Rachele Hendricks-Sturrup: introspection that we need to do on the topic before we can move forward and just set the train running on digitization efforts.

179
00:32:15.240 --> 00:32:22.290
Neil Milliken: Yeah. Thank you so much for your insights today; we've reached the end of our half hour. I just need to thank

180
00:32:23.040 --> 00:32:36.240
Neil Milliken: Atos, Microlink, and MyClearText for keeping the lights on, helping keep us going, and sustaining all of the work that we've been doing for the last seven years. We really look forward to you joining us on Twitter next week.

181
00:32:36.840 --> 00:32:39.630
Rachele Hendricks-Sturrup: Absolutely, it was a pleasure, thank you for having me.

182
00:32:41.850 --> 00:32:43.050
Debra Ruh: Great job, great job.