Meta’s Reality Labs Research (RL-R) brings together a team of researchers, developers, and engineers to create the future of Mixed Reality (MR), Augmented Reality (AR), and Wearable Artificial Intelligence (AI). Within RL-R, the ACE team solves complex challenges in behavioral inference from sparse information. We leverage multimodal, egocentric data and cutting-edge machine learning to deliver robust, efficient models that serve everyone. Our research provides core building blocks for intuitive and helpful Wearable AI, empowering people to harness this emerging technology in their daily lives.

In this role, you will work closely with Research Scientists and Engineers from across RL-R to develop novel, state-of-the-art algorithms for wearables that incorporate social behavior dynamics and multimodal sensing platforms. You will design and implement data collection strategies, benchmarks, and metrics to validate and improve model efficiency, scalability, and stability. Your expertise in psychology and human-human interaction will be crucial in developing AI algorithms that infer human behavior patterns from wearable devices. You will also have the opportunity to work with multiple egocentric sensor modalities to advance our understanding of human behavior in various contexts.
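To give a concrete flavor of this kind of modeling work, here is a minimal late-fusion sketch in PyTorch (one of the tools named in the skills list below). The modality names, feature dimensions, and architecture are illustrative assumptions for this posting, not Meta's actual models.

```python
# Hypothetical sketch: late-fusion classifier over two wearable modalities
# (e.g., IMU and audio embeddings). All names and dimensions are illustrative.
import torch
import torch.nn as nn

class LateFusionBehaviorModel(nn.Module):
    def __init__(self, imu_dim=64, audio_dim=128, hidden=256, n_states=4):
        super().__init__()
        # Per-modality encoders map raw features into a shared hidden space.
        self.imu_encoder = nn.Sequential(nn.Linear(imu_dim, hidden), nn.ReLU())
        self.audio_encoder = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
        # Fusion head predicts a discrete user state from concatenated embeddings.
        self.head = nn.Linear(2 * hidden, n_states)

    def forward(self, imu, audio):
        z = torch.cat([self.imu_encoder(imu), self.audio_encoder(audio)], dim=-1)
        return self.head(z)

model = LateFusionBehaviorModel()
logits = model(torch.randn(8, 64), torch.randn(8, 128))  # batch of 8 windows
print(logits.shape)  # torch.Size([8, 4])
```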
Locations
Redmond, WA, US
Salary
Salary not disclosed
Skills Required
multimodal machine learning (advanced)
data collection and analysis (advanced)
coding in Python, C++, and PyTorch (intermediate)
Large Language Models (intermediate)
Multimodal Deep Learning (advanced)
Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, or a relevant technical field
PhD in Informatics, Social/Behavioral Sciences, Computer Science, Human-Computer Interaction, or a related field
2+ years of research scientist experience in industry (post-PhD)
Documented understanding of social behavior dynamics, including expertise in psychology and human-human interaction
2+ years designing field experiments and data-collection campaigns, with strong observational skills for studying human behavior
Proven track record of solving complex challenges with multimodal ML, demonstrated through grants, fellowships, patents, or publications
2+ years with multimodal sensing platforms, including data collection, processing, and analysis
PhD in Behavioral Science, Computer Science, or a related field, plus 3+ years with biosignals, behavioral signals, or egocentric data from wearable sensors
2+ years of coding experience, documented in publications or open-source repositories
Experience working in Wearables or Augmented Reality/Virtual Reality
Responsibilities
Characterize human behavior in the wild, deriving behavioral signals for user states as quantitative insights from ethnographic observations
Identify use cases and experiences that leverage behavioral signals to provide user value in wearable AI assistance
Design and implement data collection strategies, benchmarks, and metrics to validate and improve model interpretability, scalability, and stability
Provide research results that accelerate the development and application of state-of-the-art AI algorithms to infer human behavior patterns from wearable devices
Translate results of human data collection into datasets that can be effectively leveraged by ML tools and into language readily interpretable by foundation models (see the sketch after this list)
Collaborate with researchers and engineers across broad disciplines through all stages of project development
Contribute to research that can eventually be applied to Meta products and services
Create tools, infrastructure, and documentation to accelerate research
Learn constantly, dive into new areas with unfamiliar technologies, and embrace the ambiguity of Augmented Reality/Virtual Reality problem solving
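As an illustration of the dataset-translation responsibility above, the following is a hypothetical PyTorch sketch that windows synchronized wearable recordings into labeled training samples. Every field name, shape, and label scheme here is an assumption for illustration, not a description of Meta's data pipelines.

```python
# Hypothetical sketch: turning collected recordings into an ML-ready dataset
# of fixed-length windows of synchronized IMU + audio features with labels.
import torch
from torch.utils.data import Dataset, DataLoader

class WearableSessionDataset(Dataset):
    """Sliding windows over one recording session, labeled by the final step."""
    def __init__(self, imu, audio, labels, window=50):
        self.imu, self.audio, self.labels, self.window = imu, audio, labels, window

    def __len__(self):
        return self.labels.shape[0] - self.window + 1

    def __getitem__(self, i):
        j = i + self.window
        # Window of each modality, labeled by the state at the window's end.
        return self.imu[i:j], self.audio[i:j], self.labels[j - 1]

# Synthetic stand-in data: 500 timesteps, 6 IMU channels, 13 audio features.
ds = WearableSessionDataset(torch.randn(500, 6), torch.randn(500, 13),
                            torch.randint(0, 4, (500,)))
loader = DataLoader(ds, batch_size=32, shuffle=True)
imu_w, audio_w, y = next(iter(loader))
print(imu_w.shape, audio_w.shape, y.shape)  # [32, 50, 6] [32, 50, 13] [32]
```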
Benefits
Bonus included in compensation
Equity included in compensation
Health benefits package (details on the Meta benefits page)