<!doctype html>
<html lang="en">
<head>
<!-- Global site tag (gtag.js) - Google Analytics -->
<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-WWPB63L');</script>
<!-- End Google Tag Manager -->
<!-- need an ID !!!! remember to apply an ID !!!!!!!!!!!! -->
<!-- <script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script> -->
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'GA_MEASUREMENT_ID');
</script>
<script>
// Toggles the archived news section (id="extraNews") between hidden and shown.
function showHideNews() {
var x = document.getElementById("extraNews");
if (x.style.display === "none") {
x.style.display = "block";
} else {
x.style.display = "none";
}
}
</script>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1"/>
<meta name="description" content="" />
<meta name="author" content="Tao Chen|Design intelligent sensing and mobile for societal good">
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css" integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T" crossorigin="anonymous">
<link href="style.css" rel="stylesheet">
<script type="text/javascript" src="https://latex.codecogs.com/latexit.js"></script>
<link rel="icon" type="image/png" href="data/images/tao_icon.png">
<title>Tao Chen | AI for sound, sensor, and health</title>
</head>
<body>
<!-- Google Tag Manager (noscript) -->
<noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-WWPB63L"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<!-- End Google Tag Manager (noscript) -->
<div class = "global_container">
<div class = "section" >
<h1></h1>
<h1></h1>
<div class="row no-gutters">
<div class ="col-5 col-md-3">
<div class = "white_boxed">
<!-- <img class="scale_img" src="data/images/tao_2023.jpg"> -->
<img class="scale_img" src="data/images/tao_chen_miami.jpg">
</div>
</div>
<div class ="col-7 col-md-7">
<div class = "white_boxed">
<h5>Tao Chen</h5>
<p> </p>
<p> Postdoctoral Associate <br>
Department of Computer Science <br>
University of Pittsburgh <br>
Pittsburgh, United States </p>
<p>Email: tachen.cs@gmail.com </p>
<p> <a href="https://scholar.google.com/citations?user=C6RUzpEAAAAJ&hl=en" target="_blank" rel="noopener">Google Scholar</a> / <a href="https://www.linkedin.com/in/tao-chen-8327301a3/" target="_blank" rel="noopener">LinkedIn</a> / <a href="https://github.com/tachen-cs" target="_blank" rel="noopener">Github</a> / <a href="https://orcid.org/0000-0003-4565-5548" target="_blank" rel="noopener">ORCID</a></p>
<!-- <p> <i>Design intelligent sensing and mobile for societal good.</i> </p> -->
<!-- <p> <i>AI for sound, sensing, and multi-modal signals.</i> </p> -->
<!-- <p> <i>AI for sound, wearable, and multi-modal agent.</i> </p> -->
<!-- <p> <i>Building sensory AI for perceiving and interpreting physical world.</i> </p> -->
<p> <i>Building sensory AI that serves people and the physical world.</i> </p>
<p> <font color="#c03c3b"> I am on the job market this season and would appreciate hearing about any suitable opportunities. </font> </p>
</div>
</div>
</div>
<h1></h1>
<h4>News </h4>
<hr color=#F1ECEC size="1">
<div class ="boxed">
<p>[Jan, 2025] LeekyFeeder conditionally accepted by SenSys'25. </p>
<p>[Jan, 2025] Our group had an exciting opportunity to visit Senegal, Africa, to deploy low-cost hearable health monitors for <a href="data/images/afria.jpg" target="_blank" rel="noopener">supporting local underserved communities</a>. </p>
<p>[Nov, 2024] Asclepius won the <font color="#FF0000"><strong>Best Paper Award</strong></font> at MobiCom 2024, and I was invited to present it at both the main conference and the AgeTech workshop. </p>
<p>[Nov, 2024] "Earable Multimodal Sensing and Stimulation: A Prospective Towards Unobtrusive Closed-Loop Biofeedback" accepted by IEEE Reviews in Biomedical Engineering. </p>
<p>[Nov, 2024] Vision paper "Towards Next-Generation Human Computer Interface Based On Earables" now accepted by IEEE Pervasive Computing.</p>
<p>[Jul, 2024] Zero-shot IoT sensing with foundation models is now accepted by ECAI'24.</p>
<p>[Mar, 2024] EarVoice is now accepted by MobiSys'24 :) </p>
<p>[Jan, 2024] Mobile Acoustic Field (MAF) is accepted by CHI'24. </p>
<!-- <p>[Dec, 2023] One acoustic paper is currently under one-shot revision at NSDI'24. </p> -->
<p>[Nov, 2023] <a href="https://asclepius-system.github.io/" target="_blank" rel="noopener">Asclepius</a> and MagWear are both accepted by MobiCom'24. See you in DC next year! </p>
<!-- <p>[Sep, 2023] Invited to serve the TPC of ICPADS'23, MobiQuitous'23. </p> -->
<!-- <p>[Jun, 2023] Invited to serve the AEC of MobiCom'23, NDSS'24, SOSP'23. </p> -->
<p>[Jun, 2023] Happy to receive the Best Poster Presentation Award (12/300+) at <a href="https://www.oacd.health.pitt.edu/content/postdoctoral-data-dine-symposium" target="_blank" rel="noopener">Pitt Postdoctoral Symposium 2023</a>!</p>
<!-- <p>[May, 2023] Pitt SCI covers <a href="https://www.sci.pitt.edu/news/earable-computers" target="_blank" rel="noopener">our research on earable</a>! -->
<!-- <p>[Feb, 2023] SoundSticker is accepted by ToSN, checkout the <a href="https://soundsticker.github.io/" target="_blank" rel="noopener">website</a> for the demo and audio clips! </p> -->
<!-- <p>[Dec, 2022] Invited to serve as the TPC of IEEE Cloud Summit 2023! </p> -->
<p>[Dec, 2022] Humbled to receive the ACM SIGMOBILE Student Community Grant award! </p>
</div>
<h1></h1>
<h4> Research </h4>
<hr color=#F1ECEC size="1">
<div class ="boxed">
<!-- <p>
I am an audio expert focusing on <strong>human-centric acoustics</strong>. My research aims to unlock the untapped potential of sound through cutting-edge AI, sensing, and computational technologies.
Sound, a fundamental element in our world, is a potent force that offers rich insights into our physiology, environment, and interactions.
My work tries to uncover these new dimensions in the comprehension and utilization of sound within the mobile sphere.
This includes analyzing <strong>in-body sounds</strong> for healthcare advancements, enhancing interactions through <strong>sound-sensitive devices</strong> (like earphones) <strong> and speech </strong>,
and protecting <strong>environmental acoustics</strong>.
Ultimately, my research endeavors to transform the role of sound in technology and everyday life, making it more integral and interactive. </p>
<p>
I am currently building next-gen wearable AI on earphones.
</p>
-->
<!-- 2025 version -->
<!-- health, safety features-->
<p>
I am an experimental computer systems researcher. My research centers on building human-centric mobile systems, aiming to bring novel perception and ambient intelligence to wearable devices through cutting-edge AI, sensing, and computational technologies.
<!-- I take pride in building systems that function effectively in the physical world. My approach is usually end-to-end, encompassing sensors, embedded systems, signal processing, and AI/ML models.
and the perceive and interact with the physical world.-->
Passionate about sensor + ML technologies, I currently work in the following areas to create intelligent systems and applications:
<br>
<br> <b>- Digital health:</b> Asclepius (MobiCom'24, <font color="#FF0000"><strong>Best Paper</strong></font>), MagWear (MobiCom'24), zero-shot activity logging (ECAI'24), earable survey (RBME'24)
<br> <b>- Audio, sensory, and physical AI:</b> EarVoice (MobiSys'24), SpotSound (MobiCom'23), SoundSticker (ToSN), Metamorph (NDSS'20), TapLeak (ICDCS'20)
<br> <b>- Spatial computing:</b> MAF (CHI'24), LeekyFeeder (SenSys'25)
</p>
<p>
<i>I am always open to collaborations. If any of this interests you, feel free to drop me an email; I am happy to chat!</i>
</p>
</div>
<h1></h1>
<h4> Short Bio </h4>
<hr color=#F1ECEC size="1">
<div class ="boxed">
<!-- I care about IoT systems, especially acoustic things. The classical air-conducted sounds, acoustic signals spread among body, robotics, and even underwater are all my interests.
My toolkits include theoretical acoustic modeling, hardware design, signal processing and machine learning methodologies. -->
<!-- My research is about building systems on the intersection of Mobile Computing and IoT system. I feel motivated to enable new inteillgient automation using sensors data and communication parades, meanwhile protecting people from the "seemingly" innocuous automation. Currently, I am intersted in acoustic signals and devices. -->
<!-- My broad research interests lie in the areas of mobile computing, signal processing and embedded system design.
My vision is to develop human-centric communication, networking, sensing and computation technologies on commercial mobile, wearable and customized hardware devices for strengthening human convenience, intelligence, health and safety in our society.
Currently, I focuses on developing novel systems and applications interacted with <strong>acoustic signals and things</strong>, ranges from earable, voice-controlled interfaces, to commodity smartphones. -->
<!-- I am a sensory AI expert and building end-to-end systems for sensing and interpreting the digital world: 1.digital health (Neural Computing), 2.sensory/embodied AI, 3. interaction tech -->
<!-- I am a sensing system researcher who builds sensory AI systems to perceive and interact with the physical world. I build sensory AI systems end to end and am interested in any sensor technologies.
My sensory AI systems now focus on the following key topics:
digital/wearable health (perception people)
Sensory/Embodied AI (understand the world to help people, sensor for robot)
HCI (sensor for human) -->
<p>
I am currently a postdoc researcher at the University of Pittsburgh, working with
Prof. <a href="https://shanggdlk.github.io/" target="_blank" rel="noopener">Longfei Shangguan</a>. Prior to my postdoc,
I received my Ph.D. from City University of Hong Kong, under the supervision of Prof.
<a href="http://www.cs.cityu.edu.hk/~zhenjili/" target="_blank" rel="noopener">Zhenjiang Li</a>.
</p>
<p> I am a quick learner who adapts easily to new things and gets the job done. </p>
<!-- <p>
I am driven by a commitment to design intelligent sensing and mobile systems that make meaningful contributions to societal well-being.
Currently, my primary focus revolves around advancing healthcare for the greater good, with a keen exploration of health equity and accessibility.
Simultaneously, I am dedicated to enhancing mobile and wearable security through the implementation of cutting-edge technologies.
</p> -->
<!-- <p> I am a passionate experimental researcher and a full-stack engineer, currently working on projects to explore <b>health equity, accessibility, and convenience</b> as a postdoc researcher at the University of Pittsburgh, working with Prof. <a href="https://shanggdlk.github.io/" target="_blank" rel="noopener">Longfei Shangguan</a>. Prior to my postdoc,
I received my Ph.D. from City University of Hong Kong, under the supervision of Prof. <a href="http://www.cs.cityu.edu.hk/~zhenjili/" target="_blank" rel="noopener">Zhenjiang Li</a>.
</p> -->
<!-- <p> I find myself at the crossroads of wearable sensing and intelligent mobile systems, always eager to explore the interconnected worlds of software/hardware co-design, signal processing, and applied machine learning. My passion particularly lies in the realms of acoustic and speech signals and the devices associated with them, such as hearables, smart speakers, phones, etc.
</p>
<p> As a researcher, I enjoy pushing the boundaries of various research disciplines. I invent novel sensing technologies and mobile systems for healthcare and human-computer interactions and stand firmly committed to harnessing technology for the greater good, ensuring robust security and safeguarding privacy in our rapidly evolving cyber-physical world.
As an adept engineer, I excel in system integration, creating comprehensive end-to-end systems with a blend of multi-disciplinary skills, such as fast prototyping, customized circuit design, embedded system programming, signal processing, and machine learning. These skills led my research works to proof-of-concept demos and prototypes. I am a fast learner, easily adapt to new things, and get the job done.
</p>
-->
<!-- -->
</div>
<h1></h1>
<h4>Main Publications </h4>
<p class="right_pad"><font color="gray"> (^ denotes students I supervise)</font></p>
<hr color=#F1ECEC size="1">
<div class = "no_boxed">
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2024-mobicom-asclepius.jpg' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[MobiCom'24] </font>Exploring the Feasibility of Remote Cardiac Auscultation Using Earphones</p>
<p class="small"> <strong>Tao Chen</strong>, Yongjie Yang, Xiaoran Fan, Xiuzhen Guo, Jie Xiong, Longfei Shangguan</p>
<p class="small"><a href="https://sigmobile.org/mobicom/2024/" target="_blank" rel="noopener"><em>MobiCom 2024</em> </a>, D.C., USA, Oct 2024 </p>
<!-- <p class="small"><a href="https://sigmobile.org/mobicom/2024/" target="_blank" rel="noopener"><em>MobiCom 2024</em> </a>, D.C., USA, Oct 2024 <font color="#FF0000"><strong>Best Paper Award</strong></font> </p> -->
<p class="small">
<a href="https://asclepius-system.github.io/" target="_blank" rel="noopener">[Project]</a>
<a href="data/papers/2024-mobicom-asclepius/mobicom24-final263.pdf" target="_blank" rel="noopener">[Paper]</a>
<a href="data/papers/2024-mobicom-asclepius/TaoChen_asclepius_mobicom_9min_final 2.pdf">[Slides]</a>
<a href="https://tachen-cs.github.io/">[Video]</a>
</p>
<p class="small"> <font color="#FF0000"><strong>Best Paper Award (1 out of all submissions)</strong></font> </p>
<p class="small"> <font color="#FF0000"><strong>ACM SIGMOBILE Research Highlight</strong></font> </p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2024-earvoice-mobisys.jpg' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[MobiSys'24] </font>Enabling Hands-Free Voice Assistant Activation on Earphones</p>
<p class="small"> <strong>Tao Chen</strong>, Yongjie Yang, Chonghao Qiu, Xiaoran Fan, Xiuzhen Guo, Longfei Shangguan</p>
<p class="small"><a href="https://www.sigmobile.org/mobisys/2024/" target="_blank" rel="noopener"><em>MobiSys 2024</em> </a>, Tokyo, Japan, June 2024 </p>
<p class="small">
<a href="https://asclepius-system.github.io/" target="_blank" rel="noopener">[Project]</a>
<a href="data/papers/2024-mobisys-earvoice/earvoice.pdf" target="_blank" rel="noopener">[Paper]</a>
<a href="https://tachen-cs.github.io/">[Slides]</a>
<a href="https://tachen-cs.github.io/">[Video]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/papers/2024-rmbe-earable/earable_survey.png' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[RBME] </font>Earable Multimodal Sensing and Stimulation: A Prospective Towards Unobtrusive Closed-Loop Biofeedback</p>
<p class="small"> Yuchen Xu, Abhinav Uppal, Min Suk Lee, Kuldeep Mahato, Brian L. Wuerstle, Muyang Lin, Omeed Djassemi, <strong>Tao Chen</strong>, Rui Lin, Akshay Paul, Soumil Jain, Florian Chapotot, Esra Tasali, Patrick Mercier, Sheng Xu, Joseph Wang, Gert Cauwenberghs</p>
<p class="small"><a href="https://www.embs.org/rbme/" target="_blank" rel="noopener"><em>IEEE Reviews in Biomedical Engineering</em> </a>, IF:17.20 </p>
<p class="small">
<a href="data/papers/2024-rmbe-earable/RBME3508713_final_proof.pdf" target="_blank" rel="noopener">[Paper]</a>
<a href="https://tachen-cs.github.io/">[Video]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/papers/2024-pervasive-compute/p.PNG' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[Pervasive] </font>Towards Next-Generation Human Computer Interface Based On Earables</p>
<p class="small"> Yongjie Yang<sup>^</sup>, <strong>Tao Chen</strong>, Longfei Shangguan</p>
<p class="small"><a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7756" target="_blank" rel="noopener"><em>IEEE Pervasive Computing</em> </a> </p>
<p class="small">
<a href="https://tachen-cs.github.io/" target="_blank" rel="noopener">[Paper]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/papers/2024-ecai-zero/0shot.png' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[ECAI] </font>Leveraging Foundation Models for Zero-Shot IoT Sensing</p>
<p class="small"> Dinghao Xue<sup>^</sup>, Xiaoran Fan, <strong>Tao Chen</strong>, Guohao Lan, Qun Song</p>
<p class="small"><a href="https://www.ecai2024.eu/" target="_blank" rel="noopener"><em>European Conference on Artificial Intelligence</em> </a> </p>
<p class="small">
<a href="https://arxiv.org/abs/2407.19893" target="_blank" rel="noopener">[Paper]</a>
<a href="https://github.com/schrodingho/FM_ZSL_IoT" target="_blank" rel="noopener">[Code]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2024-chi-maf.jpg' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[CHI'24] </font>MAF: Exploring Mobile Acoustic Field for Hand-to-Face Gesture Interactions</p>
<p class="small"> Yongjie Yang<sup>^</sup>, <strong>Tao Chen</strong>, Zipan Huang, Xiuzhen Guo, Longfei Shangguan</p>
<p class="small"><a href="https://chi2024.acm.org/" target="_blank" rel="noopener"><em>CHI 2024</em> </a>, Hawaiʻi, USA, May 2024 </p>
<p class="small">
<a href="https://asclepius-system.github.io/" target="_blank" rel="noopener">[Project]</a>
<a href="data/papers/2024-chi-maf/chi24-maf.pdf" target="_blank" rel="noopener">[Paper]</a>
<a href="https://tachen-cs.github.io/">[Slides]</a>
<a href="https://tachen-cs.github.io/">[Video]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2024-mobicom-magwear.jpg' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[MobiCom'24] </font>Exploring Biomagnetism for Inclusive Vital Sign Monitoring: Modeling and Implementation</p>
<p class="small">Xiuzhen Guo, Long Tan, <strong>Tao Chen</strong>, Chaojie Gu, Yuanchao Shu, Shibo He, Jiming Chen, Longfei Shangguan</p>
<p class="small"><a href="https://sigmobile.org/mobicom/2024/" target="_blank" rel="noopener"><em>MobiCom 2024</em> </a>, D.C., USA, Oct 2024 </p>
<p class="small">
<a href="https://tachen-cs.github.io/">[Paper]</a>
<a href="https://tachen-cs.github.io/">[Slides]</a>
<a href="https://tachen-cs.github.io/">[Video]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2022-mobicom-spotsound.jpg' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[MobiCom'23] </font>Towards Spatial Selection Transmission for Low-end IoT devices with SpotSound</p>
<p class="small">Tingchao Fan<sup>^</sup>, Huangwei Wu<sup>^</sup>, Meng Jin, <strong>Tao Chen</strong>, Longfei Shangguan, Xinbing Wang, Chenghu Zhou </p>
<p class="small"><a href="https://sigmobile.org/mobicom/2023/" target="_blank" rel="noopener"><em>MobiCom 2023</em> </a>, Madrid, Spain, Oct 2023 </p>
<p class="small">
<a href="https://dl.acm.org/doi/10.1145/3570361.3592496">[Paper]</a>
<a href="https://tachen-cs.github.io/">[Slides]</a>
<a href="https://tachen-cs.github.io/">[Video]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2022-tosn-soundsticker.jpg' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[ToSN] </font>The Design and Implementation of a Steganographic Communication System over In-Band Acoustical Channels</p>
<p class="small"><strong>Tao Chen</strong>, Longfei Shangguan, Zhenjiang Li, Kyle Jamieson </p>
<p class="small"><a href="https://dl.acm.org/journal/tosn" target="_blank" rel="noopener"><em>ACM Transactions on Sensor Networks</em> </a> </p>
<p class="small">
<a href="https://soundsticker.github.io/" target="_blank" rel="noopener">[Project]</a>
<!-- <a href="https://dl.acm.org/doi/10.1145/3587162" target="_blank" rel="noopener">[Paper]</a> -->
<a href="data/papers/soundsticker.pdf" target="_blank" rel="noopener">[Paper]</a>
<a href="https://tachen-cs.github.io/">[Slides]</a>
<a href="https://www.youtube.com/watch?v=RKyVIddkluA" target="_blank" rel="noopener">[Video]</a>
</p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2020-ndss-adversarialExample.jpg' height="100px" width="50px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[NDSS'20] </font>Metamorph: Injecting Inaudible Commands into Over-the-air Voice Controlled Systems</p>
<p class="small"><strong>Tao Chen</strong>, Longfei Shangguan, Zhenjiang Li, Kyle Jamieson </p>
<p class="small"><a href="https://www.ndss-symposium.org/" target="_blank" rel="noopener"><em>NDSS Symposium 2020</em> </a>, San Diego, CA, February 2020 </p>
<p class="small">
<a href="https://acoustic-metamorph-system.github.io/" target="_blank" rel="noopener">[Project]</a>
<a href="data/papers/2020-metamorph-ndss/Metamorph_CameraReady.pdf" target="_blank" rel="noopener">[Paper]</a>
<a href="data/papers/2020-metamorph-ndss/Metamorph-Slides.pdf" target="_blank" rel="noopener">[Slides]</a>
<a href="https://www.youtube.com/embed/4NSpwiXMbtc">[Video]</a>
</p>
<!-- <p class="small"> <i><font color="#c03c3b">One of the top 4 security conferences</font> </i> </p> -->
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2020-icdcs-tapleak.jpg' height="100px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[ICDCS'20] </font>Mobile Phones Know Your Keystrokes through the Sounds from Finger's Tapping on the Screen</p>
<p class="small">Zhen Xiao<sup>^</sup>, <strong>Tao Chen</strong>, Yang Liu, Zhenjiang Li</p>
<p class="small"><a href="https://icdcs2020.sg/" target="_blank" rel="noopener"><em> IEEE ICDCS </em> </a>, Singapore, December 2020 </p>
<p class="small">
<a href="data/papers/2020-icdcs-tapleak/icdcs.pdf" target="_blank" rel="noopener">[Paper]</a>
<a href="data/papers/2020-icdcs-tapleak/2020-ICDCS-TapLeak-slides.pdf" target="_blank" rel="noopener">[Slides]</a>
<a href="https://www.youtube.com/embed/4NSpwiXMbtc">[Video]</a>
</p>
<p class="small"> <i><font color="#c03c3b">[TMC]</font> Journal version in IEEE Transactions on Mobile Computing</i></p>
</div>
</div>
</div>
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/2020-survery.png' height="100px">
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p class="small"><font color="#c03c3b">[IOTJ] </font>Adversarial Attacks and Defenses on Cyber-Physical Systems: A Survey</p>
<p class="small">Jiao Li, Yang Liu, <strong>Tao Chen</strong>, Zhen Xiao, Zhenjiang Li, Jianping Wang</p>
<p class="small"><a href="https://ieee-iotj.org/" target="_blank" rel="noopener"><em> IEEE Internet of Things Journal </em> </a></p>
<p class="small">
<a href="https://ieeexplore.ieee.org/document/9006862" target="_blank" rel="noopener">[Paper]</a>
<a href="data/papers/2020-icdcs-tapleak/2020-ICDCS-TapLeak-slides.pdf" target="_blank" rel="noopener">[Slides]</a>
<a href="https://www.youtube.com/embed/4NSpwiXMbtc">[Video]</a>
</p>
</div>
</div>
</div>
</div>
<h1></h1>
<h4>Other Publications </h4>
<hr color=#F1ECEC size="1">
<div class ="boxed">
<p class="small">
<a href="https://dl.acm.org/doi/abs/10.1145/3560905.3568084" target="_blank" rel="noopener">Towards Remote Auscultation With Commodity Earphones</a></p>
<p class="small"><strong>Tao Chen</strong>, Xiaoran Fan, Yongjie Yang, Longfei Shangguan </p>
<p class="small"><a href="https://sensys.acm.org/2022/" target="_blank" rel="noopener"><em>SenSys 2022</em> </a>, Boston, USA, November 2022 </p>
<br>
<p class="small">
<a href="https://www.ndss-symposium.org/wp-content/uploads/2020/02/NDSS2020posters_paper_14.pdf" target="_blank" rel="noopener">Poster: Room-Scale Over-the-Air Audio Adversarial Examples </a></p>
<p class="small"><strong>Tao Chen</strong>, Longfei Shangguan, Zhenjiang Li, Kyle Jamieson </p>
<p class="small"><a href="https://www.ndss-symposium.org/" target="_blank" rel="noopener"><em>NDSS Symposium 2020</em> </a>, San Diego, USA, February 2020 </p>
</div>
<h1></h1>
<h4> Selected Awards </h4>
<hr color=#F1ECEC size="1">
<div class ="boxed">
<p> <b>ACM SIGMOBILE Research Highlight</b>, 2025</p>
<p> <a href="data/images/bestpaper_mobicom24.jpg" target="_blank" rel="noopener"><font color="#FF0000"><strong>Best Paper Award (1 out of all submissions)</strong></font></a> , ACM MobiCom, 2024</p>
<p> <b>Best Poster Presentation Award (12/300+)</b>, Pitt Postdoctoral Research Symposium, 2023</p>
<p> <b>SIGMOBILE Student Community Grant Award</b>, ACM SIGMOBILE, 2022</p>
<p> <b>Student Travel Grant</b>, NDSS, 2020</p>
<p> <b>Research Tuition Scholarship</b>, CityU, 2020</p>
<p> <b>Robosub Best New Team</b>, AUVSI and ONR, 2016</p>
<p> <b>Singapore AUV Challenge First Place</b>, IEEE OES Singapore, 2016</p>
<p></p>
</div>
<h1></h1>
<h4> Talks, Lectures & Presentations </h4>
<hr color=#F1ECEC size="1">
<div class ="boxed">
<p> <b>Exploring the Feasibility of Remote Cardiac Auscultation Using Earphones </b>
<br> - ACM MobiCom 2024
<br> - MobiCom4AgeTech workshop 2024
<br> - Carnegie Mellon University, hosted by Prof. Mahadev Satyanarayanan </p>
<p> <b>Audio and Beyond: Building Multimodal Sensory AI Systems</b>
<br> - University of Pittsburgh
<br> - Samsung Research America
<br> - Starkey Hearing Technologies
<br> - Dolby Laboratories Inc
<br> - Apple
<br> - AIZIP </p>
<p> <b>Sensing and Interpretation: Push the Limit of Accessible Consumer Health</b>
<br> - University of Pittsburgh
<br> - Samsung Research America </p>
<p> <b>Enabling Hands-Free Voice Assistant Activation on Earphones</b>
<br> - Carnegie Mellon University, hosted by Prof. Mahadev Satyanarayanan </p>
<p> <b>Exploring Biomagnetism for Inclusive Vital Sign Monitoring: Modeling and Implementation</b>
<br> - Carnegie Mellon University, hosted by Prof. Mahadev Satyanarayanan </p>
</div>
<h1></h1>
<h4> Services </h4>
<hr color=#F1ECEC size="1">
<div class ="boxed">
<p> <b>Editor:</b>
<br> Electronics: Recent Advances in Signal Processing for Flexible and Wearable Electronics (Special Issue) </p>
<p> <b>TPC:</b>
<br> 2025: HumanSys; MobiSys (AEC),
<br> 2024: ISWC; MobiCom (AEC); NDSS (AEC); IPSN (Poster); IEEE ICPADS; EAI MobiQuitous; EarComp; IEEE MSN; ACM BigCom; IEEE ISPA; SENSORDEVICES;
<br> 2023: MobiCom (AEC); SOSP (AEC); IEEE ICPADS; EAI MobiQuitous; EarComp; IEEE Cloud Summit;
<br> 2022: IEEE ICPADS; EAI MobiQuitous; ACM SenSys (shadow); </p>
<p> <b>Reviewer:</b>
<br> ACM CHI 2025, 2023;
<br> ACM BigCom 2024;
<br> ACM ICDCS 2024;
<br> ACM ASSETS 2023;
<br> IEEE VIS 2023;
<br> ACM MobiCom 2025, 2024, 2023, 2022, 2021;
<br> The Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT/UbiComp), 2025, 2024, 2023, 2021;
<br> IEEE Transactions on Mobile Computing;
<br> ACM Transactions on Privacy and Security;
<br> ACM Transactions on Sensor Networks;
<br> IEEE Pervasive Computing;
<br> IEEE Internet of Things Journal;
<br> Frontiers Artificial Intelligence of Things; </p>
<p> <b>Organization:</b>
<br> 2022: Session Chair, MobiQuitous </p>
</div>
<h1></h1>
<h4> Miscellaneous </h4>
<hr color=#F1ECEC size="1">
<!-- <div class ="boxed">
</div> -->
<div class = "no_boxed">
<div class="row no-gutters align-items-start">
<div class ="col-3 col-md-3">
<div class = "white_boxed">
<img class="scale_img_100" src='data/images/auv-robosub copy.jpg'>
</div>
</div>
<div class ="col-9 col-md-9">
<div class = "white_boxed">
<p>I had wonderful experiences designing <strong>underwater robots (AUVs)</strong> with <a href="data/images/auv-team.jpg" target="_blank" rel="noopener">a group of friends</a> during my undergraduate years, where I was in charge of the hardware and sonar systems.
Our AUV <a href="data/images/auv.jpg" target="_blank" rel="noopener">NEMO</a> won <strong>Best New Team</strong> in <a href="https://robonation.org/programs/robosub/" target="_blank" rel="noopener">Robosub</a> <a href="https://robosub.org/programs/2016-robosub/" target="_blank" rel="noopener">2016</a> (San Diego, CA) and <strong>First Place</strong> in <a href="https://sauvc.org" target="_blank" rel="noopener">SAUVC</a> <a href="https://sites.google.com/site/singaporeauvc/news/untitledpost" target="_blank" rel="noopener">2016</a> (Singapore). </p>
<p> Here are some memorable videos about our robot and team in SAUVC (<a href="https://www.youtube.com/watch?v=cbLnWk_7bbQ" target="_blank" rel="noopener">day3</a>, <a href="https://www.youtube.com/watch?v=rAqqL6XG_6Q" target="_blank" rel="noopener">day2</a>, <a href="https://www.youtube.com/watch?v=r6EPuCuPEJ8" target="_blank" rel="noopener">day1</a>) and Robosub (<a href = "https://www.youtube.com/watch?v=N81MyGFAt6I&feature=youtu.be" target="_blank" rel="noopener">team</a>).</p>
</div>
</div>
</div>
</div>
<!-- https://www.onr.navy.mil/en/Media-Center/Press-Releases/2016/RoboSub-2016 -->
<h1></h1>
<script src="https://code.jquery.com/jquery-3.3.1.slim.min.js" integrity="sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo" crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.14.7/umd/popper.min.js" integrity="sha384-UO2eT0CpHqdSJQ6hJty5KVphtPhzWj9WO1clHTMGa3JDZwrnQq4sF86dIHNDz0W1" crossorigin="anonymous"></script>
<script src="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/js/bootstrap.min.js" integrity="sha384-JjSmVgyd0p3pXB1rRibZUAYoIIy6OrQ6VrjIEaFf/nJGzIxFDsf4x0xIM+B07jRM" crossorigin="anonymous"></script>
</body>
</html>