Palo Alto, CA | Hybrid or Onsite

EraDrive’s goal is to revolutionize how spaceflight is performed and managed through autonomy. EraDrive builds autonomy software and hardware that let spacecraft see, decide, and act with minimal human control. Our software is already flying in space, and we are growing it into a product family that covers rendezvous, on-orbit inspection, and space-based sensing. You will be one of EraDrive’s first engineering hires and a founding member of the team.

This role sits at the intersection of classical Guidance, Navigation, and Control (GNC) and learning-based autonomy; you will decide where physics ends, where learning begins, and how the two are safely combined. For example, you will design and ship the stack that combines classical orbital mechanics and state estimation with modern AI/ML-based perception and decision logic. If you enjoy turning physics, optimization, and machine learning into reliable, flight-ready code, this is your role.

What you will do

  • Design and implement autonomy, navigation and control algorithms for satellite rendezvous and proximity operations, on-orbit inspection, space domain awareness, and formation flying applications

  • Define architecture-level decisions for how classical GNC and learning-based systems are combined and validated

  • Develop navigation pipelines using batch estimators and sequential estimation methods such as Kalman filters

  • Develop control pipelines using analytical closed-form solutions and numerical methods such as convex optimization

  • Work with AI/ML-based perception, guidance and navigation modules that provide robust inputs to the guidance and control stack

  • Define and extend synthetic data and simulation pipelines used to train, validate, and stress test algorithms under realistic dynamical and visual conditions

  • Build and run simulation campaigns, including scenario libraries and Monte Carlo tests, to evaluate performance over full mission profiles and edge cases

  • Integrate algorithms into onboard software running on constrained compute, using clean C++ and Python interfaces and respecting timing and resource limits

  • Support hardware-in-the-loop and software-in-the-loop testing, closing the loop from models to real sensors and back into algorithm and model updates

  • Own your modules end to end, from first prototype through internal review, regression testing, and delivery

What you bring

  • Strong background in orbital mechanics, spacecraft dynamics, or a closely related field

  • Experience designing and implementing guidance, navigation, or control algorithms in C++ and Python

  • Familiarity with state estimation and tracking algorithms, for example Kalman filters, nonlinear filters, or optimization-based estimators

  • Familiarity with optimal control and trajectory planning algorithms

  • Experience building and using simulation tools to validate algorithms and models across large sets of scenarios

  • Ability to work in a small team, take clear ownership, and communicate tradeoffs honestly

Bonus

  • Experience with multi-agent relative navigation or proximity operations in space or robotics

  • Hands-on experience with machine learning for perception, estimation, or task planning, for example computer vision models, object detection or pose estimation networks, or foundation models

  • Experience with training pipelines, synthetic data generation, or domain adaptation for real-world deployment

  • Experience integrating learned perception models with classical estimation or control loops

  • Exposure to embedded or real-time systems for autonomous vehicles

  • Publications or open-source work in autonomy, perception, or estimation