
AstroAI Workshop 2025

Presenter: Ilay Kamai

Title: Key Stellar Parameter Predictions with Multi-Modal Neural Networks: Extending Deep Learning to Spectroscopy and Photometry

Date/Time: Thursday, July 10th, 2:00 - 2:15 PM

Abstract: Understanding the fundamental properties of stars—such as rotation, mass, radius, age, and composition—is central to astrophysics, enabling insights into stellar evolution, galactic structure, and the conditions for planet formation. Traditionally, such parameters are inferred from either photometric light curves or spectroscopic data, with each modality offering a partial view of stellar physics. Yet, effectively combining information from both remains a significant challenge.

In previous work, we introduced LightPred, a deep-learning framework that extracts stellar rotation periods from Kepler light curves using a dual-branch LSTM–Transformer architecture. LightPred enabled the creation of the largest catalog of stellar rotation periods to date—over 80,000 main-sequence stars—outperforming classical methods like the autocorrelation function. It also revealed strong correlations between model uncertainty and stellar properties, an important step toward interpretability.
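
The dual-branch design lends itself to a compact sketch. The PyTorch code below is a minimal illustration of one LSTM branch and one Transformer branch reading the same light curve, with their pooled outputs fused for period regression. Layer sizes, pooling, fusion, and the (period, log-variance) output parameterization are assumptions for illustration, not the published LightPred implementation.

```python
# Minimal sketch of a dual-branch LSTM + Transformer regressor for light curves.
# All hyperparameters and the fusion strategy are illustrative assumptions.
import torch
import torch.nn as nn

class DualBranchRegressor(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Branch 1: LSTM over the flux time series (local dynamics).
        self.lstm = nn.LSTM(input_size=1, hidden_size=d_model,
                            num_layers=2, batch_first=True)
        # Branch 2: Transformer encoder over the same series (long-range structure).
        self.embed = nn.Linear(1, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Fuse pooled branch outputs; predict a period plus a log-variance,
        # since the abstract links model uncertainty to stellar properties.
        self.head = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.GELU(),
                                  nn.Linear(d_model, 2))

    def forward(self, flux):                      # flux: (batch, time, 1)
        lstm_out, _ = self.lstm(flux)             # (batch, time, d_model)
        trans_out = self.transformer(self.embed(flux))
        fused = torch.cat([lstm_out.mean(dim=1), trans_out.mean(dim=1)], dim=-1)
        return self.head(fused)                   # (batch, 2)

pred = DualBranchRegressor()(torch.randn(8, 512, 1))  # toy batch of 8 light curves
```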

Building on this foundation, we now present a novel multi-modal neural network that integrates both stellar light curves and spectra into a unified architecture for comprehensive stellar characterization. Our model features modality-specific encoders trained via a hybrid self-supervised and supervised approach, a shared encoder module, and task-specific fine-tuning heads. At its core is DualFormer, a new module designed to maximize mutual information between modalities while keeping the learned representations disentangled and physically meaningful.
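
For readers who think in code, the overall layout described above can be sketched as modality-specific encoders feeding a shared encoder, with a task-specific head on top. An InfoNCE-style contrastive loss is used here as a common stand-in for "maximizing mutual information between modalities"; the DualFormer module itself is not reproduced, and all dimensions, encoder choices, and the loss are illustrative assumptions.

```python
# Minimal sketch of the multi-modal layout: per-modality encoders, a shared
# encoder, a fine-tuning head, and a contrastive objective over matched
# light-curve/spectrum pairs. Shapes and losses are assumptions, not the
# authors' DualFormer implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiModalStellarNet(nn.Module):
    def __init__(self, d=128):
        super().__init__()
        self.lc_encoder   = nn.Sequential(nn.Linear(512, 256), nn.GELU(), nn.Linear(256, d))    # light curves
        self.spec_encoder = nn.Sequential(nn.Linear(4096, 256), nn.GELU(), nn.Linear(256, d))   # spectra
        self.shared       = nn.Sequential(nn.Linear(d, d), nn.GELU(), nn.Linear(d, d))          # shared encoder
        self.binary_head  = nn.Linear(d, 1)   # example fine-tuning head (binary-star classification)

    def forward(self, lc, spec):
        return self.shared(self.lc_encoder(lc)), self.shared(self.spec_encoder(spec))

def info_nce(z_lc, z_sp, temperature=0.07):
    """Symmetric contrastive loss: matched light-curve/spectrum pairs are positives."""
    z_lc, z_sp = F.normalize(z_lc, dim=-1), F.normalize(z_sp, dim=-1)
    logits = z_lc @ z_sp.t() / temperature
    targets = torch.arange(z_lc.size(0))
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

model = MultiModalStellarNet()
z_lc, z_sp = model(torch.randn(16, 512), torch.randn(16, 4096))
loss = info_nce(z_lc, z_sp)
```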

A key innovation is a linear projection layer that distills information from the joint embedding space. Projecting onto the eigenspace of this layer yields neural stellar diagrams—data-driven analogs of classical diagrams such as the Hertzsprung–Russell and Kiel diagrams—that emerge naturally and capture stellar properties with high fidelity. We showcase the model’s utility by fine-tuning it on binary star classification, achieving 95% accuracy, precision, and recall. Beyond binary identification, the model can be fine-tuned for stellar parameter inference (including challenging quantities like stellar ages) and enables clustering of distinct stellar populations in the learned representation space. Our work represents a step forward in data-driven stellar astrophysics: from rotation period inference using photometry alone, to a holistic, multi-modal framework capable of uncovering new physical representations from large and diverse stellar datasets.
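
The eigenspace projection admits a short sketch as well, assuming the diagram coordinates come from projecting embeddings onto the leading singular directions of the projection layer's weight matrix (for a non-square layer, the right singular vectors are the eigenvectors of W^T W). The exact construction in the work above may differ, and the data below is synthetic.

```python
# Minimal sketch of the "neural stellar diagram" idea: project joint embeddings
# onto the top directions of the linear projection layer's weight matrix and
# scatter the resulting 2-D coordinates. Illustrative assumption, toy data only.
import torch
import matplotlib.pyplot as plt

def neural_diagram(embeddings, projection_weight, k=2):
    """embeddings: (n_stars, d); projection_weight: (out, d) weight of the linear layer."""
    # Right singular vectors of W (eigenvectors of W^T W) span the input
    # directions the layer is most sensitive to.
    _, _, vh = torch.linalg.svd(projection_weight, full_matrices=False)
    return embeddings @ vh[:k].t()                 # (n_stars, k) coordinates

coords = neural_diagram(torch.randn(1000, 128), torch.randn(64, 128))
plt.scatter(coords[:, 0], coords[:, 1], s=3)
plt.xlabel("eigen-axis 1")
plt.ylabel("eigen-axis 2")
plt.title("Data-driven analog of an HR/Kiel diagram (toy data)")
plt.show()
```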
