Investigation of the robustness of state-of-the-art methods for anxiety detection in real-world conditions
I am new to ACCESS, with a little past experience running code on NCSA's Blue Waters. As a self-taught programmer, I would value the chance to learn from an experienced mentor.
Here's an overview of my project:
Anxiety detection is an actively studied topic, but existing methods struggle to generalize and perform outside controlled lab environments. I propose to critically analyze state-of-the-art detection methods, quantify the failure modes of existing applied machine learning models, and introduce techniques that make these models robust to real-world challenges. The study will begin with a sensitivity analysis of the best-performing existing models, followed by tests of current hypotheses about why these models fail in the real world. We expect this to yield a deeper understanding of why models fail and to let us use explainability to design better in-lab experimental protocols and machine learning models that perform well in real-world scenarios. The findings will dictate future directions, which may include improving personalized health detection, carefully designing experimental protocols that enable transfer learning to expand the reach of anxiety detection models, and using explainability techniques to inform better sensing methods and hardware.
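The first step described above, a sensitivity analysis of a trained model, can be sketched as a perturbation study: add noise to one input feature at a time and measure how much the model's predictions shift. The sketch below is illustrative only; the model, feature count, and noise scale are all hypothetical stand-ins, not part of the proposed study.

```python
import numpy as np

def sensitivity_analysis(model_fn, X, noise_scale=0.1, n_trials=20, seed=0):
    """Estimate per-feature sensitivity of model_fn on inputs X.

    For each feature, repeatedly add Gaussian noise to that feature
    alone and record the mean absolute change in the model's output.
    Larger values indicate features the model is more sensitive to.
    """
    rng = np.random.default_rng(seed)
    base = model_fn(X)                      # unperturbed predictions
    n_features = X.shape[1]
    sensitivity = np.zeros(n_features)
    for j in range(n_features):
        deltas = []
        for _ in range(n_trials):
            Xp = X.copy()
            Xp[:, j] += rng.normal(0.0, noise_scale, size=X.shape[0])
            deltas.append(np.mean(np.abs(model_fn(Xp) - base)))
        sensitivity[j] = np.mean(deltas)
    return sensitivity

# Hypothetical "detector": a logistic score over two features,
# deliberately weighted much more heavily on feature 0.
def toy_model(X):
    return 1.0 / (1.0 + np.exp(-(3.0 * X[:, 0] + 0.3 * X[:, 1])))

X = np.random.default_rng(1).normal(size=(200, 2))
scores = sensitivity_analysis(toy_model, X)
print(scores)  # feature 0 should dominate
```

In the real study, `model_fn` would be a published anxiety-detection model and the perturbations would mimic real-world noise sources (motion artifacts, sensor drift) rather than plain Gaussian noise.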