AXSChat Podcast

AXSChat Podcast with Nataliya Kosmyna, Ph.D. in Computer Science, working on Brain-Computer Interfaces

March 19, 2021. Antonio Santos, Debra Ruh, and Neil Milliken talk with Nataliya Kosmyna.

Nataliya is passionate about the idea of creating a partnership between AI and human intelligence, a fusion of machine and human brain. She obtained her Ph.D. in 2015 in the domain of non-invasive Brain-Computer Interfaces (BCIs) as part of the EHCI team of Université Grenoble-Alpes, France. Most of her projects focus on EEG-based BCIs in the context of consumer-grade applications. Before joining the MIT Media Lab's Fluid Interfaces group in 2017, she was a post-doc in the Hybrid team (VR/AR) at Inria Rennes, France. Nataliya has published in, and served as a program committee member for, conferences and journals such as Nature Scientific Reports, CHI, ACM IDC, UbiComp, INTERACT, TOCHI, Frontiers in Human Neuroscience, PLOS ONE, IEEE EMBC and ACM AutomotiveUI. She has given two TEDx talks. For the past 12 years Nataliya has worked on designing solutions to control drones, rolling robots, and home appliances using brain activity. These projects were presented to the general public and tested by more than 4,000 people between 2015 and 2019. Nataliya has won multiple awards for her work, among them the L'Oréal-UNESCO For Women in Science award, which she received in 2016. She was also named one of the 10 Top French Talents of 2017 by MIT Innovators Under 35.

Support the show

Follow axschat on social media
Twitter:

https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh

LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/

Vimeo
https://vimeo.com/akwyz




WEBVTT

1
00:00:03.270 --> 00:00:11.670
Neil Milliken: Hello, and welcome to AXSChat. I'm delighted that we're joined today by Nataliya Kosmyna. Nataliya is a researcher at

2
00:00:12.360 --> 00:00:24.210
Neil Milliken: MIT, and she's leading work on brain-computer interfaces. She did a PhD in Grenoble on that topic and is doing some really amazing work in the fields around cognitive

3
00:00:25.140 --> 00:00:39.420
Neil Milliken: conditions such as dementia and ALS. I've seen her talk before, and I think you're going to be really interested in what she has to say. So welcome, Nataliya, it's good to have you here. Can you tell us a little bit about

4
00:00:40.290 --> 00:00:46.290
Neil Milliken: your background, the work that you're doing, and then we'll be sure to ask you loads more questions.

5
00:00:47.670 --> 00:01:01.080
Nataliya Kosmyna: Hello, hello everyone, it's so nice to be here. Thank you for the invitation, it is really exciting. As Neil already mentioned, I did my PhD in France on what we call brain-computer interfaces.

6
00:01:01.440 --> 00:01:11.430
Nataliya Kosmyna: It's a set of devices, systems, implantable or non-implantable, usually worn on the head of a person, where we can pick up

7
00:01:11.850 --> 00:01:30.720
Nataliya Kosmyna: the brain activity of a person, and we can analyze this brain activity and use it in a whole different set of applications. Imagine replacing your mouse with your brain signals directly. During my PhD I was lucky enough to try different applications:

8
00:01:31.980 --> 00:01:38.790
Nataliya Kosmyna: drone control, smart home control, video games. And after

9
00:01:39.360 --> 00:01:49.650
Nataliya Kosmyna: graduating I moved to the United States, to MIT, and continued working on brain interfaces with a slightly different angle,

10
00:01:49.980 --> 00:02:02.550
Nataliya Kosmyna: on improving and supporting different cognitive states and conditions of people who have different health challenges and needs, for example dementia, Alzheimer's and Parkinson's.

11
00:02:04.920 --> 00:02:06.450
Neil Milliken: That's fantastic. And

12
00:02:07.980 --> 00:02:15.150
Neil Milliken: you mentioned that there are two different ways of interfacing with the brain, and that you can have either invasive

13
00:02:16.470 --> 00:02:23.700
Neil Milliken: interfaces or non-invasive ones. My understanding is that your work mainly focuses on the non-invasive.

14
00:02:24.690 --> 00:02:31.350
Nataliya Kosmyna: Yeah, totally. I only do non-invasive interfaces, exclusively. For someone who

15
00:02:31.860 --> 00:02:47.550
Nataliya Kosmyna: might be knowledgeable in the topic: I use a modality called electroencephalography, or EEG. You might have done this type of test, for example in hospital, for insomnia or for different sleep disorders that a hospital

16
00:02:48.570 --> 00:02:55.110
Nataliya Kosmyna: tries to treat. But there are actually a lot of other modalities, first of all non-invasive ones. A very

17
00:02:55.380 --> 00:03:11.640
Nataliya Kosmyna: common and actually famous one is fMRI; you might, again, have been to the hospital and done those tests. So fMRI is another of those modalities but, as you might have guessed, they are very expensive; they are used for clinical or heavy research

18
00:03:12.870 --> 00:03:24.750
Nataliya Kosmyna: applications, because we can't just easily bring a person in and put them in this machine, and obviously, especially with our current situation, they are being used heavily to treat medical conditions.

19
00:03:25.530 --> 00:03:43.980
Nataliya Kosmyna: So yeah, I mostly work with headsets, headbands, glasses, headphones, so head-worn devices of different form factors. But there is also a whole set of labs and researchers who do amazing work on invasive systems, yeah.

20
00:03:44.070 --> 00:03:46.620
Neil Milliken: So, so the invasive systems are

21
00:03:47.640 --> 00:03:58.290
Neil Milliken: slightly scary to me; I'm not so keen on having my head cut open. But we've worked with

22
00:03:59.790 --> 00:04:06.060
Neil Milliken: the Institute for Bioengineering of Catalonia before, where they're working with

23
00:04:07.170 --> 00:04:16.680
Neil Milliken: patients who have Parkinson's and other conditions, where they are using tiny wires into the brain to

24
00:04:17.760 --> 00:04:18.330
Neil Milliken: look at

25
00:04:19.560 --> 00:04:21.810
Neil Milliken: controlling tremors and so on. So

26
00:04:23.760 --> 00:04:30.960
Neil Milliken: I'm aware of the benefits. It's a lot more scary, I think, so I'm kind of glad you're not cutting people open.

27
00:04:32.460 --> 00:04:32.940
Nataliya Kosmyna: No.

28
00:04:34.080 --> 00:04:34.500
Nataliya Kosmyna: I mean.

29
00:04:35.850 --> 00:04:42.240
Nataliya Kosmyna: That's exactly the point. I think there are a lot of advantages and drawbacks in both invasive and

30
00:04:42.240 --> 00:04:42.450
Nataliya Kosmyna: non-

31
00:04:42.480 --> 00:04:49.710
Nataliya Kosmyna: invasive. Let's be super honest, there are advantages to invasive, and I hear a lot of people saying, oh, it's invasive, it's implanted in my head,

32
00:04:49.920 --> 00:04:55.710
Nataliya Kosmyna: inside of my skull, it means the signal is perfect, we can record things in a perfect way.

33
00:04:56.040 --> 00:05:04.740
Nataliya Kosmyna: That's not exactly true, because we are kind of facing the same problem. You don't put electrodes all inside of your brain to, like, wrap your whole brain in

34
00:05:05.310 --> 00:05:11.400
Nataliya Kosmyna: electrodes. Also, a lot of people don't know, as you mentioned, actually,

35
00:05:11.970 --> 00:05:20.310
Nataliya Kosmyna: that brain surgeries are really extremely dangerous; the brain is very averse to being opened to actually introduce something inside.

36
00:05:20.670 --> 00:05:30.180
Nataliya Kosmyna: And as of today, there are actually not so many implants, biodegradable implants, that can survive in the brain more than three years, so more than

37
00:05:30.690 --> 00:05:38.610
Nataliya Kosmyna: 36 months. What it means is that there is around a 40% chance that you are creating mini strokes

38
00:05:38.880 --> 00:05:51.270
Nataliya Kosmyna: when you insert that implant. But then you need to take it out; it can't just stay in there, no, you need to take it out, and once again there is a huge chance that you're going to create a mini stroke by breaking the vessels.

39
00:05:52.050 --> 00:05:58.500
Nataliya Kosmyna: And what is even more important: what applications would you use it for, in addition to this health and medical danger?

40
00:05:59.490 --> 00:06:11.340
Nataliya Kosmyna: There is also maybe not a need yet. Some conditions, as you mentioned, like epilepsy, are known to be treated very well with deep brain stimulation, and that is very well studied; it's a very,

41
00:06:12.240 --> 00:06:21.660
Nataliya Kosmyna: I would almost say, common surgery, almost. However, some of them are much more experimental, and I consider that in some cases, especially as the research goes on and goes by,

42
00:06:22.290 --> 00:06:29.670
Nataliya Kosmyna: we don't need to, you know, drill the skull to actually do a lot of interesting things. Yeah, we just don't need to do that.

43
00:06:30.090 --> 00:06:51.270
Neil Milliken: Yeah, so I guess some of the challenges, even when you're wiring people directly, are still about the signal-to-noise ratio: getting clear signals and understanding and interpreting what the signals are. So I know Debra is going to have some questions for sure, but

44
00:06:52.350 --> 00:07:00.090
Neil Milliken: with the stuff that you're putting on people's heads, what are you doing with that? So you said you're working with

45
00:07:00.870 --> 00:07:10.200
Neil Milliken: dementia patients, and you're also working with people with ALS, and in the UK we would call that motor neurone disease, for the UK audience.

46
00:07:10.860 --> 00:07:22.980
Neil Milliken: So that's a degenerative condition, and people in the later stages pretty much have no movement. So what are the things that you're doing with the tech for these diseases?

47
00:07:24.540 --> 00:07:35.670
Nataliya Kosmyna: Yeah, so as you have mentioned, and once again the language and names of some diseases are sometimes slightly different from country to country, so yeah, it's good you specified

48
00:07:36.060 --> 00:07:44.970
Nataliya Kosmyna: that, because I actually do have partnerships with the UK, with France back home, and also the United States, and sometimes the naming is slightly different. But you're right:

49
00:07:45.390 --> 00:07:51.510
Nataliya Kosmyna: most of the conditions I do work with are neurodegenerative diseases, or families of diseases, or

50
00:07:52.200 --> 00:08:04.860
Nataliya Kosmyna: challenges that are known to be neurodegenerative. And I work, let's say, on a large spectrum of these diseases, meaning that I work on very early onset

51
00:08:05.760 --> 00:08:14.610
Nataliya Kosmyna: dementia conditions, the dementia with Lewy bodies condition, where we are trying to identify, as of now,

52
00:08:15.360 --> 00:08:22.320
Nataliya Kosmyna: so it is research, it is a clinical trial, whether there are actually markers, biomarkers,

53
00:08:22.980 --> 00:08:30.450
Nataliya Kosmyna: based on your brain signals, but not only brain, actually, also other physiological signals coming from your body, in particular from your head,

54
00:08:30.960 --> 00:08:41.400
Nataliya Kosmyna: to define and detect the cognitive decline before you would need to see the doctor because, let's say, you lose consciousness or you,

55
00:08:41.730 --> 00:08:51.810
Nataliya Kosmyna: you know, you stop and you fall. So, basically, predicting it at a much earlier onset, so you can be treated, potentially, with the

56
00:08:52.080 --> 00:09:04.380
Nataliya Kosmyna: medications that are being tested or are in place right now but, unfortunately, are only really effective at a very early onset, and sometimes it's just a bit too late, unfortunately too late. And, as you can imagine,

57
00:09:05.130 --> 00:09:14.040
Nataliya Kosmyna: because I'm talking about headsets, I can maybe even later show you one here, the kind I'm actually talking about; they are very light, as they avoid...

58
00:09:24.840 --> 00:09:25.860
Neil Milliken: We seem to have lost Nataliya for a second.

59
00:09:29.760 --> 00:09:37.740
Debra Ruh: We did. I hope she comes back, because this is so interesting, so interesting. I'm sure she will, she'll get

60
00:09:39.030 --> 00:09:40.920
Debra Ruh: there, her Wi-Fi will get unstuck.

61
00:09:41.310 --> 00:09:41.700
yeah.

62
00:09:43.380 --> 00:09:46.920
Debra Ruh: And she probably doesn't need us to, but we'll put it in the chat window.

63
00:09:49.260 --> 00:09:49.590
yeah.

64
00:09:51.750 --> 00:09:55.710
Neil Milliken: For sure. We've all been used to having a...

65
00:09:56.430 --> 00:09:58.050
Debra Ruh: Wonderful where I live, yeah.

66
00:09:59.400 --> 00:10:10.530
Debra Ruh: As we wait for her to come back on, you know, just so we can keep everything going. Okay, so she dropped off, but I'm sure she'll be back, yeah.

67
00:10:11.130 --> 00:10:17.250
Debra Ruh: You know, one thing: of course, my husband's walking this right now. It is really

68
00:10:17.700 --> 00:10:32.160
Debra Ruh: a horrifying walk with dementia. And I thought it was interesting what she was saying, that if we can find out before they start having symptoms... What I realized with my husband is that he was having symptoms years before,

69
00:10:33.060 --> 00:10:50.370
Debra Ruh: but I thought he was just taking his retirement too seriously; I didn't realize what I was seeing. And so I think it's fascinating what she's saying, that if they can find out beforehand, you know, they can do more. But the reality is you don't know what it is; you think,

70
00:10:51.930 --> 00:11:04.260
Debra Ruh: you know, you look at yourself, I'm getting forgetful, or things like that. So I think that's fascinating. I also want to say, and I think a lot of people would know this, that we're going to learn so much about our brains

71
00:11:05.280 --> 00:11:16.770
Debra Ruh: over the next couple of generations. Being 60-plus, I remember when we were losing men at about 45 to 52 to heart attacks.

72
00:11:17.100 --> 00:11:26.340
Debra Ruh: I remember my husband's grandfather, he died from a heart attack. I mean, I know so many, you know, family members and different people that were dying from heart attacks. But then

73
00:11:26.640 --> 00:11:37.170
Debra Ruh: we got really good with the heart, and so now we know how to solve a lot of those problems, and luckily men aren't dying from those heart attacks. But with the brain, I think this is

74
00:11:38.040 --> 00:11:45.390
Debra Ruh: the decade of the brain, and we're going to learn so much about it and what it really means. Oh good, here she's coming right back, yeah.

75
00:11:46.440 --> 00:11:46.800
So.

76
00:11:49.050 --> 00:11:50.190
Debra Ruh: Welcome back.

77
00:11:51.090 --> 00:12:04.080
Nataliya Kosmyna: I am so sorry, I don't know what happened, but I guess this is our COVID time, living in a Zoom box. I mean, I'm fully charged. Really sorry about that.

78
00:12:06.960 --> 00:12:17.280
Debra Ruh: It's the technology we have; we get it so much here. Oh, we were just talking; we understood what was happening, and so I was just filling in. But

79
00:12:18.450 --> 00:12:24.450
Debra Ruh: I just thought it was so interesting what you were saying, about if you can catch it earlier.

80
00:12:24.540 --> 00:12:29.730
Debra Ruh: And of course that's always the case. But, sadly, my husband

81
00:12:31.140 --> 00:12:35.970
Debra Ruh: was diagnosed with early onset dementia, and he recently...

82
00:12:37.110 --> 00:12:45.180
Debra Ruh: he's been walking this path a long time. But, as I was saying to the audience and Neil and Antonio while the technology had kicked you out,

83
00:12:45.990 --> 00:12:56.550
Debra Ruh: when I look back now I realize what was happening, but at the time, as we were walking this for years, I didn't realize; I thought my husband was just

84
00:12:57.030 --> 00:13:03.060
Debra Ruh: taking retirement too seriously. I just didn't understand what was happening. But my husband got

85
00:13:04.470 --> 00:13:15.240
Debra Ruh: his dementia because he had a traumatic brain injury as a young child: at 11 years old he was hit by a car, and even though they didn't diagnose it as a brain injury at the time, 50-plus years ago,

86
00:13:15.900 --> 00:13:27.660
Debra Ruh: that's 55 years ago, that's what it was. And so I think the work that you're doing is so powerful, but I am, you know, curious.

87
00:13:28.170 --> 00:13:35.220
Debra Ruh: You know, walking where I am with my husband, it's so chilling for him and for those that love him.

88
00:13:35.910 --> 00:13:44.070
Debra Ruh: And now I'll just make a comment: I took him to a neurologist about a month ago, and when I went to the neurologist he said, what are you doing here?

89
00:13:44.880 --> 00:13:54.060
Debra Ruh: I said, what do you mean, what am I doing here? My husband has dementia and I'm trying to be a good wife and make sure he has a neurologist. And he was like, there's nothing we can do. So,

90
00:13:54.660 --> 00:14:00.120
Debra Ruh: unless he gets aggressive toward you, there's just nothing we can do, there's nothing we can do.

91
00:14:00.660 --> 00:14:08.820
Debra Ruh: Yeah, and I accept that, and that's why the work that you're doing is so beautiful and powerful. But one more comment, and then I definitely want you to talk about this.

92
00:14:09.180 --> 00:14:24.300
Debra Ruh: The comment you made about the brain, I'm fascinated with it, that the brain doesn't want you to put stuff in there. I thought that was such a fascinating point. So as we're manipulating things, the brain is saying, yeah, no, you're not. So

93
00:14:25.050 --> 00:14:30.720
Debra Ruh: I just, you know, I look at a situation like my husband's and just assume

94
00:14:31.380 --> 00:14:40.620
Debra Ruh: there's just absolutely nothing anybody can do, and I accept that, I accept that. But I look forward to people having choices because of the work you're doing,

95
00:14:40.830 --> 00:14:50.190
Debra Ruh: so that other families don't have to walk this. And also, they say with COVID we're going to see more cases of dementia because of the brain bleeds. Anyway, let me turn the microphone over to you.

96
00:14:51.060 --> 00:14:58.500
Nataliya Kosmyna: Yes, thank you, thank you so much, Debra, for this comment, and thank you so much for sharing. It's a personal thing, and

97
00:14:58.800 --> 00:15:14.310
Nataliya Kosmyna: I do work, and I'm going to mention this, I think, later, with ALS as well, and here the caregiver is such a crucial part of the whole equation. Always, in this type of system, you kind of have two

98
00:15:14.880 --> 00:15:24.480
Nataliya Kosmyna: stakeholders. Ultimately you have the actual user, I mean, I call the patient the actual user, but you ultimately also have the caregiver, and the caregiver

99
00:15:24.720 --> 00:15:40.380
Nataliya Kosmyna: is so important to engage. And caregivers like you, they do take on so much, they do care. It doesn't mean that others do not care, but sometimes work, life, things like COVID just take such a toll on you mentally that you just start, you know,

100
00:15:41.400 --> 00:15:52.680
Nataliya Kosmyna: feeling down, and obviously this mood can take over. But what I wanted to actually say, and I'm going to get back to the dementia work, is why some

101
00:15:53.070 --> 00:16:01.620
Nataliya Kosmyna: of the work I'm doing might not find a cure, or be helpful for a cure, but what it can help with is early diagnosis and also some support,

102
00:16:02.070 --> 00:16:11.910
Nataliya Kosmyna: for you as a caregiver, for the patient, the actual user, but also for the neurologists you actually mentioned. But before I dive back into this, and I'm holding that

103
00:16:12.300 --> 00:16:21.510
Nataliya Kosmyna: thought about the brain: you're right, the brain is actually amazing. I get this question a lot of times: oh, why,

104
00:16:22.320 --> 00:16:31.710
Nataliya Kosmyna: do you think we can do consciousness transfer, like, you know, Altered Carbon, for those who have seen the TV series (I am fascinated, I am a big fan), or something like that?

105
00:16:32.580 --> 00:16:39.900
Nataliya Kosmyna: The interesting thing is that, because we have so many artificial organs already, we can even 3D print them,

106
00:16:40.440 --> 00:16:51.390
Nataliya Kosmyna: it's really exciting. The thing is that the brain is developed within our body; it develops while we are actually developing, while in the body, so it has this notion of:

107
00:16:52.290 --> 00:17:02.880
Nataliya Kosmyna: this is the body, and this is what I need to take care of. And the brain has a lot of functionality that is embedded; it's a really complicated, multi-layered process.

108
00:17:03.270 --> 00:17:10.860
Nataliya Kosmyna: However, some of the processes are not, I would say, as good as we would like; some of them are fine, but for some of them

109
00:17:11.310 --> 00:17:21.480
Nataliya Kosmyna: we're not as good, for example, at multitasking; we are sometimes not as good at managing some things. You know, it's like in a lot of books, and I'm going to mention those maybe at the end,

110
00:17:22.320 --> 00:17:30.900
Nataliya Kosmyna: excellent books about emotions, How Emotions Are Made, and Natural-Born Cyborgs by Andy Clark, actually from the UK, an excellent book.

111
00:17:31.770 --> 00:17:37.770
Nataliya Kosmyna: I always tell my students, and it's an interesting thing about the development, meaning that

112
00:17:38.550 --> 00:17:50.850
Nataliya Kosmyna: the brain will offload, will try to offload, a lot of things, like caring mentally about yourself, to other people; that's basically why you're getting a partner, ultimately, they say. It's so fascinating, that book.

113
00:17:51.240 --> 00:17:59.880
Nataliya Kosmyna: It means that sometimes, with its functionality, you're right, Debra, there seems to be nothing that can be done. However, I do

114
00:18:00.150 --> 00:18:11.580
Nataliya Kosmyna: firmly believe, and obviously I'm very biased with the research I'm doing, but I do see it with a lot of patients I'm working with, that if there is a motivation, even a motivation of doing something,

115
00:18:12.390 --> 00:18:25.080
Nataliya Kosmyna: it actually pushes you to go and, maybe not improve the condition, but actually expand the quality of life and also the duration of life.

116
00:18:25.410 --> 00:18:37.020
Nataliya Kosmyna: The fact that you are keeping the brain busy, that you are doing some mental tasks. Like, this researcher asked you to do mental math each day, and you're like, oh my God, mental math, why do I need to do that?

117
00:18:37.410 --> 00:18:40.440
Nataliya Kosmyna: Yeah, it's a part of our system, of the questions that we use.

118
00:18:41.010 --> 00:18:48.780
Nataliya Kosmyna: We have dementia patients who are like, why would you do that? How is it related? Yeah, you're kind of reconnecting the dots, kind of firing back.

119
00:18:49.110 --> 00:18:57.030
Nataliya Kosmyna: You're hoping that the connection will be brought back and connected. Yeah, who knows, that's exactly what we're looking for; maybe we are heavily wrong.

120
00:18:57.270 --> 00:19:03.330
Nataliya Kosmyna: But that's exactly why a lot of this is traditional clinical trials, with a lot of conditions in place, with clinicians;

121
00:19:04.080 --> 00:19:09.510
Nataliya Kosmyna: we are collaborating and working together. For example, in the case of dementia, before I got cut off,

122
00:19:10.350 --> 00:19:16.770
Nataliya Kosmyna: I was saying that what we do is ship the device to your home, and obviously with COVID it's even

123
00:19:17.010 --> 00:19:25.350
Nataliya Kosmyna: more significant, if you have no possibility, or don't feel comfortable enough, to drive to the clinic that might be miles and miles and miles away.

124
00:19:25.890 --> 00:19:37.560
Nataliya Kosmyna: And some patients, you know, are older and more fragile; they're motivated, but they're just not really up to going and taking the risk, which is totally understandable.

125
00:19:38.130 --> 00:19:47.430
Nataliya Kosmyna: And you would have the device at home. It's, once again, a usual pair of glasses, very lightweight; you can see one in the back of my background, I put it there on purpose.

126
00:19:48.210 --> 00:19:52.080
Nataliya Kosmyna: And you would do these little mental exercises, but while you're doing them

127
00:19:52.980 --> 00:20:00.810
Nataliya Kosmyna: we'll actually be measuring your brain activity, measuring your physiological signals, so we would be able to see from day to day, from week to week:

128
00:20:01.170 --> 00:20:11.580
Nataliya Kosmyna: is there a difference, is there a decline, or does it look all normal (you know, you just had a bad day, it was just a bad day), or is it actually not a bad day but a bad week, or

129
00:20:12.420 --> 00:20:18.840
Nataliya Kosmyna: a bad month, and you need to go and take care of this. And you would have this information as the primary user;

130
00:20:19.110 --> 00:20:24.180
Nataliya Kosmyna: the caregiver will also have access to this information, but more importantly, you can show it explicitly to the doctor,

131
00:20:24.570 --> 00:20:32.820
Nataliya Kosmyna: to the neurologist, who also might not be able to see it right away, because it's like, yeah, our primary tests were

132
00:20:33.420 --> 00:20:40.740
Nataliya Kosmyna: all good. But actually, when they have the ability to see this carefully collected information, they might

133
00:20:41.010 --> 00:20:48.210
Nataliya Kosmyna: get a second opinion from another specialist, and think, yeah, maybe they need to actually do another set of tests and take it

134
00:20:48.540 --> 00:20:57.060
Nataliya Kosmyna: to another level. That's what we are aiming for. And once again, this is research, this is really a clinical trial, a lot of people involved, and

135
00:20:57.660 --> 00:21:04.740
Nataliya Kosmyna: it's a lot of work from the actual user, from the caregiver. So I actually want to take this

136
00:21:05.040 --> 00:21:19.290
Nataliya Kosmyna: one-second opportunity (I usually do this in my papers' acknowledgements) to say thank you to all the caregivers as well, not only the actual users, because what we are doing might not help, unfortunately, you or your family member.

137
00:21:19.710 --> 00:21:22.740
Nataliya Kosmyna: And we are very clear about this, you know, upfront. But

138
00:21:23.550 --> 00:21:39.570
Nataliya Kosmyna: it might help the next family, and it might bring something else on board to develop, in addition to that magical formula, if this set of tasks actually shows us something. So yeah, that's my comment on this one.

139
00:21:40.590 --> 00:21:48.930
Debra Ruh: And that's such a powerful comment, because as I've walked down this path with my husband, and it's hard, but

140
00:21:49.800 --> 00:22:01.050
Debra Ruh: it's hard for him too. But, like you said, it's really hard for everybody, you know. And I wonder sometimes, as I watch my husband walking this, like,

141
00:22:02.010 --> 00:22:09.840
Debra Ruh: I'm going to try to say all this and not get upset, but sometimes he forgets how to swallow, and it just freaks me

142
00:22:10.290 --> 00:22:26.850
Debra Ruh: out. And so I'm, like, using language, swallow, swallow, and I understand that language doesn't work in the same way anymore with my husband, and I just continue to realize how inadequate language is

143
00:22:27.930 --> 00:22:34.950
Debra Ruh: in these situations. But I'm just so glad that you're doing this research. I know that

144
00:22:35.370 --> 00:22:43.890
Debra Ruh: there's so much for us to learn, and I'm fascinated, fascinated with your work and the science and the way our brains work. But

145
00:22:44.340 --> 00:22:51.210
Debra Ruh: for the caregivers and our loved ones that have dementia, ALS, Parkinson's,

146
00:22:52.020 --> 00:22:58.470
Debra Ruh: we have no data, we don't know how to help. When this first started happening with my husband, you know,

147
00:22:58.980 --> 00:23:06.630
Debra Ruh: I'm going to help, I'm going to solve it, you know, that's what we want to do. And so the only thing I could find to help him

148
00:23:06.870 --> 00:23:14.100
Debra Ruh: was that I could give him more omega-3 (everybody should be taking omega-3, it's very good for our brains). And I would go in prepared

149
00:23:14.370 --> 00:23:22.410
Debra Ruh: to the neurologist, and I would say, well, how about this, or this? And they're like, oh sure, you can try that. But it was never them suggesting anything to me;

150
00:23:23.040 --> 00:23:38.280
Debra Ruh: it was me, maybe because, once again, we don't have anything. But I just wanted to remind everybody, this is very early stage, we have so much to learn, which is why I think society should support the work that you

151
00:23:38.430 --> 00:23:38.640
Nataliya Kosmyna: You.

152
00:23:38.850 --> 00:23:39.540
Debra Ruh: guys are doing.

153
00:23:39.840 --> 00:23:51.120
Nataliya Kosmyna: Thank you, thank you, Debra, for mentioning this, and it's totally, it's 100% true. So what I do want to say is that that's exactly, I guess, one part of the work we're doing. I mean, we have

154
00:23:51.540 --> 00:24:00.270
Nataliya Kosmyna: so many devices. Like in my apartment, I'm just looking right now, oh my God, how much stuff I have, even regardless of my Media Lab equipment here

155
00:24:00.900 --> 00:24:11.430
Nataliya Kosmyna: that I moved home from the Media Lab, how much stuff I have. But we are talking here about the Internet of Bodies, the next level. Yeah, instead of using

156
00:24:12.540 --> 00:24:25.920
Nataliya Kosmyna: our Alexa for something else, why can't we use it for something good, for something that can also track and monitor the state of the patient without violating their privacy, and actually have these continuous, continuous measurements?

157
00:24:26.850 --> 00:24:32.160
Nataliya Kosmyna: My mother is a neurologist, and I guess (I usually do mention this in my background)

158
00:24:32.460 --> 00:24:46.500
Nataliya Kosmyna: obviously she has inspired me to do this type of work, though she hoped I would follow in her footsteps and just become an MD, which is not easy. However, what I want to say is that

159
00:24:47.640 --> 00:24:59.130
Nataliya Kosmyna: when you show your doctor (and I have tried this a lot of times) continuous recordings over days and weeks, it's going to be very hard to tell you, Debra, I know,

160
00:24:59.700 --> 00:25:08.340
Nataliya Kosmyna: that what you're saying is imagined, that because you are a wife or mother or sister or whatnot, you know, you're exaggerating. It's like, no, you have

161
00:25:09.000 --> 00:25:20.610
Nataliya Kosmyna: tons of data recorded. And in addition to that, you know, we don't just record this data blindly; me and my lab, we actually do heavy AI, we do machine learning. So,

162
00:25:21.120 --> 00:25:32.040
Nataliya Kosmyna: ...for those who might not be so knowledgeable: we do processing of the data, we learn specific features and information inside this data, and so we can actually make...

163
00:25:33.180 --> 00:25:42.510
Nataliya Kosmyna: ...choices and decisions, even in real time, based on what's happening. So it's not going to be just us saying, oh, I think he has a problem or she has a problem.

164
00:25:42.930 --> 00:25:53.370
Nataliya Kosmyna: You have the data to back it up. It's like, you know, thinking about your, I don't know, Apple Watch. I don't know if you do have one, but an Apple Watch has tracking of your heart rate status, etc.

165
00:25:53.610 --> 00:26:02.520
Nataliya Kosmyna: It's the same here: you're going to have an app, we literally have an app. Once again, if we do have time, maybe I can show it to our listeners, maybe...

166
00:26:02.940 --> 00:26:12.060
Nataliya Kosmyna: ...tweet some screenshots. We do have an app where you can actually see what's happening over months and months and months, and this is the whole purpose of it. It's...

167
00:26:12.750 --> 00:26:21.420
Nataliya Kosmyna: ...objective, as much as it can get. Is it perfect? No. Is it research? Yes. Is it early stage? Yes. And to be super, super clear about this: this is no mind reading.

168
00:26:21.660 --> 00:26:28.830
Nataliya Kosmyna: Neither this system nor anything else, just as a disclaimer, can read minds. Unfortunately. Yet. However...

169
00:26:29.160 --> 00:26:40.770
Nataliya Kosmyna: ...we can try to keep it objective, record it, put the analysis in place, yes, and can hopefully then show it to a specialist, and so the specialist can make some actual...

170
00:26:41.370 --> 00:26:52.050
Nataliya Kosmyna: ...choices, and then maybe pharmaceutical companies can jump in and help with the molecules to potentially improve the state, or at least keep it at the same stage.

171
00:26:52.770 --> 00:27:03.390
Nataliya Kosmyna: I do believe that there is a possibility for that, but that's exactly the purpose. And I think, also, to speak of COVID, I guess there was one positive thing.

172
00:27:03.870 --> 00:27:10.920
Nataliya Kosmyna: I mean, it's hard to say how COVID can be positive, but I think that's what I saw, even with grants and submissions, that...

173
00:27:11.820 --> 00:27:26.760
Nataliya Kosmyna: ...NASA, the NSF, and all these huge, huge, you know, governmental organizations do support and are now actually encouraging submissions where you talk about telemedicine, where you talk about...

174
00:27:27.180 --> 00:27:37.020
Nataliya Kosmyna: ...wearables. You can have them in clinics, and this is huge for all this research, for all the labs, that they do not shy away. And when you do that, you don't use...

175
00:27:39.210 --> 00:27:45.750
Nataliya Kosmyna: ...you know, a thousands-of-dollars device, and you can actually distribute them. Is this perfect? No. But imagine you have...

176
00:27:46.020 --> 00:27:59.370
Nataliya Kosmyna: ...not just hours. It's not like the one hour or two hours a year that you, Debra, might experience with your husband; you have four hours minimum of daily recording, imagine, on a daily basis.

177
00:27:59.790 --> 00:28:06.060
Nataliya Kosmyna: That's huge on the data side, even if half the time there was a cat jumping on his head or something happened...

178
00:28:06.510 --> 00:28:12.360
Nataliya Kosmyna: ...and it's not being placed super correctly, and that is fine. We are here with our AI, machine...

179
00:28:13.170 --> 00:28:18.630
Nataliya Kosmyna: ...learning algorithms, to jump in, to try to correct it, to work with other specialists. This is not a one-person...

180
00:28:19.260 --> 00:28:29.610
Nataliya Kosmyna: ...a one-person job. There's a whole team behind it, especially the clinicians. This is heavily, you know, a clinical trial I'm actually talking about. But that's exactly why, you know, we are bringing this...

181
00:28:30.150 --> 00:28:31.560
Nataliya Kosmyna: And I was talking.

182
00:28:33.570 --> 00:28:33.870
Debra Ruh: yeah.

183
00:28:34.020 --> 00:28:35.040
Neil Milliken: You're giving insights here.

184
00:28:36.330 --> 00:28:36.690
Debra Ruh: too.

185
00:28:36.870 --> 00:28:49.770
Neil Milliken: But what you're talking about here is managing the data at that mass, being able to evidence stuff, being able to then spot patterns, and I think that that's where, you know...

186
00:28:50.460 --> 00:29:03.690
Neil Milliken: ...one of the reasons why the NHS is a jewel is that we have a data collection element to what we do in the UK, which enables us to spot patterns in epidemiology...

187
00:29:04.110 --> 00:29:09.930
Neil Milliken: ...that lots of other countries can't, because they don't have the same kind of joined-up network.

188
00:29:10.380 --> 00:29:20.730
Neil Milliken: So what you're doing with the data gathering and then the processing of the data is super interesting, because it will enable you to spot patterns, and therefore, you know, that's going to have benefits. Over to you, Antonio.

189
00:29:24.150 --> 00:29:37.470
Antonio Santos: When you started in this field, and you went on a journey that took you to the United States, when you look back at the development of your work...

190
00:29:38.460 --> 00:29:47.790
Antonio Santos: ...what types of sciences and areas of knowledge have you seen converging in order to be able to succeed in your work?

191
00:29:48.900 --> 00:29:52.320
Nataliya Kosmyna: Yeah, that's an excellent question, thank you, Antonio, so much.

192
00:29:53.010 --> 00:30:01.290
Nataliya Kosmyna: I have so many students who are sending me, you know, these hesitant emails saying, oh, so what do I do to, like, do BCI-related innovation?

193
00:30:01.650 --> 00:30:09.540
Nataliya Kosmyna: Full disclaimer, actually, for all of my students, who are not on this call, and hopefully we'll be able to get the link to send them later on:

194
00:30:09.990 --> 00:30:20.190
Nataliya Kosmyna: ...no one has a neuroscience background among my students, not a single person. So just as a full disclaimer: I have artists...

195
00:30:20.910 --> 00:30:32.010
Nataliya Kosmyna: ...I have designers, I have programmers, of course developers. But it's just a full disclaimer: you do not need it, if you think that you would want to be a part of...

196
00:30:32.670 --> 00:30:46.470
Nataliya Kosmyna: ...my team, to get a degree, like a master's degree, a PhD degree, an internship. I have some people who have never written a "Hello, world" in any programming language, and this is perfect.

197
00:30:47.010 --> 00:30:58.620
Nataliya Kosmyna: The skill sets are so different. So I think, first of all, it's a big message: do not shy away. This is a huge, huge, huge, you know, message...

198
00:30:59.280 --> 00:31:08.790
Nataliya Kosmyna: ...from me, first of all. Second one, I do think that, like, literally yesterday even, full disclaimer, I sent a message on our Slack channel:

199
00:31:09.450 --> 00:31:22.140
Nataliya Kosmyna: ...who knows, you know, a person in history? I actually need to talk about the historical background of some stuff. Literally, this is an example. And ethical considerations are super strong.

200
00:31:23.010 --> 00:31:32.490
Nataliya Kosmyna: You know, the historical background, and how some tech was approached and where it slipped away, is something you do want to talk about, and actively talk about...

201
00:31:32.910 --> 00:31:40.230
Nataliya Kosmyna: ...in active discussion, not just, oh yeah, thank you for your time, thank you, and then we never implemented any of those ideas and never took note of any of those.

202
00:31:41.250 --> 00:31:52.470
Nataliya Kosmyna: And I think what is really converging is, I would say, specifically in brain-computer interfaces, and it might be more obvious for other types of...

203
00:31:52.920 --> 00:32:05.520
Nataliya Kosmyna: ...tech: it's definitely actively using cloud infrastructure. That's kind of obvious, but not so much in brain-computer interfaces yet. Definitely having a good way of...

204
00:32:06.060 --> 00:32:11.550
Nataliya Kosmyna: ...you know, storing the data. But one thing is that you obviously need to have HIPAA-regulated, you know, storage.

205
00:32:11.850 --> 00:32:19.650
Nataliya Kosmyna: And I know quite a few initiatives in the US and in Europe on those. So basically ensuring that the data is stored properly and securely is the big question.

206
00:32:20.520 --> 00:32:30.960
Nataliya Kosmyna: Definitely, this is emerging, and I think those are excellent, because we can learn from so many lessons before, from the wearable tech, where it didn't go well.

207
00:32:31.380 --> 00:32:41.610
Nataliya Kosmyna: And you can imagine how hard it is to go back, because someone might take it another way. What I think, what I love about what's happening right now, is that this is changing: Silicon Valley is...

208
00:32:42.390 --> 00:32:52.410
Nataliya Kosmyna: ...actively pushing for this as well in the VC world, that from the start you really need to start putting the user first, putting users first. Profit is, of course...

209
00:32:52.770 --> 00:33:07.290
Nataliya Kosmyna: ...important, but we have seen, we are still seeing now, I'm not going to spend time going into this, but what happens with a lot of great teams, great labs and great ideas when the user has not been put first...

210
00:33:07.590 --> 00:33:09.750
Nataliya Kosmyna: ...and when you don't really hear researchers.

211
00:33:10.080 --> 00:33:19.470
Nataliya Kosmyna: And once again, COVID is, okay, it is a good example, but it's a good example of when you might want to listen to researchers.

212
00:33:19.920 --> 00:33:30.210
Nataliya Kosmyna: That's an excellent thing, you know. And a good point: anyone can become one, and you don't need to have a specific degree, you can totally not have perfect grades, you know.

213
00:33:30.690 --> 00:33:38.130
Nataliya Kosmyna: So yeah, that's another big, big message for everyone who is listening and is just, you know, hesitant and thinking about those things.

214
00:33:39.150 --> 00:33:45.300
Antonio Santos: Over the last couple of months we have seen some discussions, particularly on.

215
00:33:45.960 --> 00:33:56.430
Antonio Santos: ...AI ethics and computer science, and sometimes we observe a few conflicts between people on the side of ethics and computer engineers.

216
00:33:56.880 --> 00:34:10.050
Antonio Santos: How can we manage these two forces, these two areas of knowledge? How can we make sure that each of them is more open to collaborate and able to listen to the other more?

217
00:34:12.120 --> 00:34:23.850
Nataliya Kosmyna: I guess it's going to be the most boring answer, but honestly, I think the answer, the word, is regulations. Actual regulations. And I'm really serious about this, like not...

218
00:34:24.150 --> 00:34:33.840
Nataliya Kosmyna: ...10 years after the tech is here. Really, wherever you are, you know, in the US, in Europe, in the UK, wherever you are.

219
00:34:34.560 --> 00:34:42.660
Nataliya Kosmyna: I understand, oh, I totally understand how many challenges we have, like vaccinations, all this stuff. But you cannot imagine how much news...

220
00:34:42.960 --> 00:34:52.620
Nataliya Kosmyna: ...is slipping by about what's being implemented and patented, and what funding is being put in, while the main news is elsewhere. It's important to...

221
00:34:52.890 --> 00:35:00.750
Nataliya Kosmyna: ...keep us up to date on vaccinations, but you really should have those people dedicated in the government who hire or consult...

222
00:35:01.140 --> 00:35:13.110
Nataliya Kosmyna: ...people like me, researchers: go to the labs, go to the teams and talk about, hey, we saw that company X did this, is this good, what can we do? Because we know. Politicians do not have...

223
00:35:14.400 --> 00:35:22.560
Nataliya Kosmyna: ...scientific degrees, and that is fine. Their job is putting the policy forward, voting with the rest of their...

224
00:35:22.860 --> 00:35:30.030
Nataliya Kosmyna: ...country, and this is their job; that's why they are in the government. But it is part of my job to tell you where the dangers are.

225
00:35:30.360 --> 00:35:37.020
Nataliya Kosmyna: But I also need a platform, and they need to look for those people. They need to acknowledge that they cannot sometimes...

226
00:35:37.410 --> 00:35:50.280
Nataliya Kosmyna: ...ask the best questions. But if they had a good advisor, an advisory board, to help them with the questions... Some of those very politicized hearings that we saw recently, with, you know, a lot of...

227
00:35:51.960 --> 00:36:01.710
Nataliya Kosmyna: ...people questioned on TV, the questions could have been sharper, they could have been much more realistic, so that the rest of the country...

228
00:36:02.010 --> 00:36:12.420
Nataliya Kosmyna: ...who have other priorities, like how to feed their families, can also understand what has been happening, you know, behind the curtains. So I would say, first and foremost, it is really pro...

229
00:36:12.870 --> 00:36:20.610
Nataliya Kosmyna: ...active regulation being in place. And do not wait 10 years; that's not the speed of Silicon Valley, it's not our speed.

230
00:36:21.000 --> 00:36:27.090
Nataliya Kosmyna: And you can imagine how much COVID moved this forward. If you wait again five years, 10 years for something...

231
00:36:27.600 --> 00:36:37.710
Nataliya Kosmyna: ...it's going to be gone, it's going to be too late. You can't do anything past a certain point; it's really hard to reverse the course. So I think that's all important. And, I think, just to...

232
00:36:38.130 --> 00:36:52.350
Nataliya Kosmyna: ...just to add a second comment: we should just not forget one important thing, in what I just mentioned about the VCs and for-profits. Well, the companies are for-profit, for profit is the...

233
00:36:52.980 --> 00:37:04.620
Nataliya Kosmyna: ...point. Like, you cannot expect all this to happen by magic, completely altruistically. They are for-profit, they want to make money, and if they don't get it from you, the user, and your data...

234
00:37:04.890 --> 00:37:12.690
Nataliya Kosmyna: ...they're going to be making you the product, and this should just be clear. You know, it's not that you can't use the product; you can just, you know, embrace...

235
00:37:13.290 --> 00:37:18.450
Nataliya Kosmyna: ...the fact and embrace reality. You need to work with this reality, and you still need to work with them.

236
00:37:18.780 --> 00:37:28.470
Nataliya Kosmyna: And that is where, as I said, particularly regulations come in, to work with them on transparency. But you should just not forget, it's, you know, it's not all straightforward.

237
00:37:29.370 --> 00:37:34.650
Nataliya Kosmyna: Do not believe everything you read; just understand where the money came from and follow the trail.

238
00:37:35.280 --> 00:37:41.310
Neil Milliken: So I think that's a really important point, and I fully agree with you about the need for regulation now.

239
00:37:42.480 --> 00:37:52.800
Neil Milliken: Because we've seen the challenges that we're now facing, and the challenges of the last few years, where we haven't had regulation and we've allowed tech to regulate itself.

240
00:37:53.220 --> 00:38:12.510
Neil Milliken: And the impact of venture capital funding tech, where that's driven certain behaviors. Now, I think that, you know, Antonio's a sociologist, I'm a historian, we're both working in tech, so we have our own innate biases in thinking that our backgrounds give us a different perspective...

241
00:38:13.740 --> 00:38:14.130
Neil Milliken: That.

242
00:38:14.190 --> 00:38:22.500
Neil Milliken: ...there's some value to those backgrounds, and I think that's probably aligned with the stuff that you were saying. But I also think that...

243
00:38:23.700 --> 00:38:33.840
Neil Milliken: ...there is room for profit in tech, because one of the things that I want to see is, I would like it to be more profitable to do good things than bad.

244
00:38:34.620 --> 00:38:39.840
Neil Milliken: So one of the things that I really want to do is make stuff like you're doing really profitable...

245
00:38:40.560 --> 00:38:49.950
Neil Milliken: ...because it encourages organizations to invest in doing the things that are good for society, rather than the things that are deleterious.

246
00:38:50.790 --> 00:38:59.400
Neil Milliken: That's a strange, that's not a normal model of capitalism, but I think that society is changing and we're warming to this idea.

247
00:38:59.940 --> 00:39:09.090
Neil Milliken: But we need, you know, companies as role models. So, I mean, there are so many topics, we could continue talking on this for weeks.

248
00:39:09.870 --> 00:39:21.540
Neil Milliken: We're going to definitely continue the conversation on social media; I know our audience is going to massively engage on this. I need to thank Barclays Access, MyClearText and Microlink for keeping us on air...

249
00:39:22.530 --> 00:39:33.690
Neil Milliken: ...and going for the last seven years, and keeping us accessible with all the captions. Thank you so much, Nataliya, it's been a real pleasure, and I'm really looking forward to chatting on social media on Tuesday.

250
00:39:34.230 --> 00:39:36.720
Nataliya Kosmyna: Thank you so much for having me. Thank you.