Modality-wise-Multple-Instance-Learning

This repository contains the code for the paper "Psychophysiological Arousal in Young Children Who Stutter: An Interpretable AI Approach," published in IMWUT 2022.

Modality Invariant-MIL (MI-MIL) Approach

The MI-MIL approach takes the modality-specific bag representations $B_m = \{x_1^m, x_2^m, \ldots, x_k^m\}$ (where $k = 19$ and $m$ ranges over EDA, HR, RSP-amp, and RSP-rate) of a 20 s physiological sensing window as input. As shown in the figure below, MI-MIL has four components: (1) a modality-specific embedding block, (2) a modality-specific self-attention pooling block, (3) a modality fusion block, and (4) a classifier block. The first two blocks are applied to each modality $m$ independently, while the latter two combine cross-modality information to generate the inference.

[Figure: MI-MIL architecture overview]
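For orientation, below is a minimal PyTorch sketch of this four-block pipeline. It is not the paper's exact implementation: the layer sizes, the per-instance feature dimension, the tanh-based attention scoring, and concatenation as the fusion step are all assumptions made for illustration; see the code in this repository for the actual architecture.

```python
import torch
import torch.nn as nn

class MIMIL(nn.Module):
    """Illustrative sketch of the MI-MIL pipeline (assumed layer sizes, not the paper's)."""

    def __init__(self, modalities=("EDA", "HR", "RSP-amp", "RSP-rate"),
                 in_dim=32, emb_dim=64, attn_dim=32, n_classes=2):
        super().__init__()
        self.modalities = list(modalities)
        # (1) modality-specific embedding block: one embedder per modality
        self.embed = nn.ModuleDict({
            m: nn.Sequential(nn.Linear(in_dim, emb_dim), nn.ReLU())
            for m in self.modalities
        })
        # (2) modality-specific self-attention pooling block: scores each
        # of the k instances in a bag, then takes a weighted sum
        self.attn = nn.ModuleDict({
            m: nn.Sequential(nn.Linear(emb_dim, attn_dim), nn.Tanh(),
                             nn.Linear(attn_dim, 1))
            for m in self.modalities
        })
        # (3) modality fusion block: here, concatenation of pooled embeddings
        self.fusion = nn.Sequential(
            nn.Linear(emb_dim * len(self.modalities), emb_dim), nn.ReLU())
        # (4) classifier block
        self.classifier = nn.Linear(emb_dim, n_classes)

    def forward(self, bags):
        """bags: dict mapping modality name -> tensor of shape (batch, k, in_dim)."""
        pooled = []
        for m in self.modalities:
            h = self.embed[m](bags[m])                 # (batch, k, emb_dim)
            a = torch.softmax(self.attn[m](h), dim=1)  # (batch, k, 1) instance weights
            pooled.append((a * h).sum(dim=1))          # (batch, emb_dim)
        z = self.fusion(torch.cat(pooled, dim=-1))
        return self.classifier(z)

# Example: a batch of 4 bags with k = 19 instances per modality
bags = {m: torch.randn(4, 19, 32) for m in ("EDA", "HR", "RSP-amp", "RSP-rate")}
logits = MIMIL()(bags)  # shape (4, 2)
```

The per-instance attention weights computed in step (2) are what make the model interpretable: they indicate which segments of the 20 s window drove the prediction for each modality.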

ACM Reference Format:

Harshit Sharma, Yi Xiao, Victoria Tumanova, and Asif Salekin. 2022. Psychophysiological Arousal in Young Children Who Stutter: An Interpretable AI Approach. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 6, 3, Article 137 (September 2022), 32 pages. https://doi.org/10.1145/3550326

Contact

Harshit Sharma, SCAI, Arizona State University, hsharm62@asu.edu

