CV4Smalls Workshop was very well received!

Workshop Webpage: Computer Vision with Small Data: A Focus on Infants and Endangered Animals. Enjoy the coverage on social media: X: CV News: CV..

Sarah receives Sony Faculty Innovation Award

We are also featured on Northeastern Global News

We are hosting a WACV’24 Workshop in Hawaii

We are thrilled to extend an invitation to you for the inaugural workshop on Computer Vision with Small Data (CV4Smalls): A Focus on Infants and Endangered Animals. This workshop is hosted as part of the 2024 Winter Conference on Applications of Computer ..

Our ICPR’22 paper won the Best Paper Award

Congratulations to Michael Wan and the rest of the co-authors: M. Wan⋆, S. Zhu⋆, L. Luan⋆, P. Gulati⋆, X. Huang⋆, R. Schwartz-Mette, M. Hayes, E. Zimmerman, and S. Ostadabbas, “InfAnFace: Bridging the infant–adult domain gap in facial landmark e..

Shuangjun’s paper accepted in TPAMI

S. Liu, X. Huang, N. Fu, C. Li, Z. Su, and S. Ostadabbas, “Simultaneously-Collected Multimodal Lying Pose Dataset: Towards In-Bed Human Pose Monitoring under Adverse Vision Conditions,” IEEE Transactions on Pattern Analysis and Machine Intelligence ..

Sarah Talks at NYU School of Engineering

Learning Strong Inference Models in Small Data Domains: Towards Robust Human Pose Estimation. Speaker: Sarah Ostadabbas, Northeastern University. Date: Dec 9. Abstract: Recent efforts in machine learning (especially with the new waves of deep learning ..

Center for SPIRAL recognized by Northeastern

On behalf of the SPIRAL faculty, we are pleased to announce that we have just been recognized by Northeastern University as the Center for SPIRAL. Professors Ostadabbas and Ioannidis will serve as co-directors and are available to discuss any center-rela..

We received FY21 Northeastern TIER 1 Award

In collaboration with Danielle Levac (Physical Therapy), Karen Quigley (Psychology), and Lisa Barrett (Psychology), we received a TIER 1 Mentored Award on “Novel methods to quantify the affective impact of virtual reality for motor skill learning.”..

Our A-Eye project Selected for GapFund360

Our collaborative project with the NU Physics Department and PI Swastik Kar, “A-Eye: A Nanotechnology and AI-assisted Artificial Cone Cell Capable of Color and Spectral Recognition,” was selected for Phase I GapFund360 funding: NU COE News Link

ACLab at CVPR2019 with 2 Posters

Shuangjun presented two posters on behalf of ACLab members: “Infant Contact-less Non-Nutritive Sucking Pattern Quantification via Facial Gesture Analysis” (at the CVPR2019 Workshop on Augmented Human) and “Introduction to Indoor GeoNe..

Shuangjun’s paper accepted at MICCAI 2019

Let’s congratulate Shuangjun on having his paper, “Seeing Under the Cover: A Physics Guided Learning Approach for In-Bed Pose Estimation,” accepted at MICCAI 2019, the 22nd International Conference on Medical Image Computin..

Sarah gives a talk at ECE, UMass Dartmouth

Friday, October 26. Title: Human Pose Estimation: Deep Learning with Small Data. Abstract: Although human pose estimation for various computer vision (CV) applications has been studied extensively in the last few decades, some pose problems such as in-bed..

ACLab is featured in ECCV2018 News

ACLab attended ECCV2018 with 3 presentations:

* “A Semi-Supervised Data Augmentation Approach using 3D Graphical Engines,” Poster at the 9th International HBU ECCV Workshop.
* “Inner Space Preserving Generative Pose Machine,” Poster at the ECCV Ma..

2018 MMDF Workshop Report is Out

Our very productive 2018 MMDF workshop resulted in insightful discussion and productive suggestions for moving the field of multimodal data fusion forward in the coming years. These insights have been distilled into a Workshop Report which we believe wi..

Sarah will be a mentor at TOM:Boston Makeathon

In conjunction with the Ruderman Family Foundation and TOM (Tikkun Olam Makers), the Nurse Innovation & Entrepreneurship Program of Northeastern University is happy to announce a second MakeAthon in the works! Sarah will be one of the mentors at th..