Woosik Lee

I am a PhD candidate at the University of Delaware's College of Engineering, where I work on robotics and state estimation.

I have been conducting research in the Robot Perception and Navigation Group (RPNG) since 2017 under the direction of Prof. Guoquan (Paul) Huang. I received my Bachelor's and Master's degrees in Mechanical Engineering from Korea University in 2015 and 2017, respectively.

Keywords: Robotics, State estimation, Multi-sensor, Calibration, Consistency

Email / CV / Scholar / Github / LinkedIn

profile photo
Research

My primary interest is enabling autonomy for robots through robust and accurate state estimation. My research focuses mainly on visual (camera) and inertial (IMU) sensing modalities, but I have also explored other sensor types to improve robustness and overall trajectory accuracy. I am also a developer of OpenVINS, a filter-based visual-inertial estimator, and of MINS, a Multisensor-aided Inertial Navigation System capable of flexibly fusing five sensing modalities (IMU, wheel encoders, camera, GNSS, and LiDAR).
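To make the filtering idea above concrete, below is a minimal, hypothetical Python sketch of the propagate/update pattern that filter-based estimators such as OpenVINS and MINS build upon. It is a toy 1D constant-acceleration Kalman filter, not the actual API of either system; the class and parameter names are illustrative only, and real estimators operate on-manifold (quaternions, SE(3)) and additionally handle cloning, calibration, and delayed measurements.

```python
# Illustrative sketch only (not the OpenVINS/MINS API): a toy linear Kalman
# filter showing high-rate "IMU-like" propagation fused with low-rate
# asynchronous position updates (e.g., GNSS-like).
import numpy as np

class ToyFilter:
    def __init__(self):
        # State: [position, velocity] in 1D for simplicity.
        self.x = np.zeros(2)
        self.P = np.eye(2)

    def propagate(self, accel, dt, accel_noise=0.1):
        # Constant-acceleration motion model driven by an acceleration input.
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt**2, dt])
        self.x = F @ self.x + B * accel
        Q = np.outer(B, B) * accel_noise**2
        self.P = F @ self.P @ F.T + Q

    def update_position(self, z, meas_noise=0.5):
        # Fuse a low-rate position measurement via the standard EKF update.
        H = np.array([[1.0, 0.0]])
        S = H @ self.P @ H.T + meas_noise**2   # innovation covariance (1x1)
        K = self.P @ H.T / S                   # Kalman gain
        r = z - H @ self.x                     # residual
        self.x = self.x + (K * r).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

# Example: propagate at 100 Hz, update with a position fix at 5 Hz.
f = ToyFilter()
for k in range(100):
    f.propagate(accel=0.1, dt=0.01)
    if k % 20 == 0:
        f.update_position(z=f.x[0] + 0.01)
```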

NeRF-VINS: A Real-time Neural Radiance Field Map-based Visual-Inertial Navigation System
By leveraging NeRF's ability to synthesize novel views, which is essential for addressing limited viewpoints, the proposed NeRF-VINS optimally fuses IMU and monocular image measurements together with synthetically rendered images within an efficient filter-based framework, 2023
pdf

MINS: Efficient and Robust Multisensor-aided Inertial Navigation System
A consistent tightly coupled Multisensor-aided Inertial Navigation System (MINS) that is capable of fusing the most common navigation sensors. Supports:
- An efficient and robust filtering-based multi-sensor estimator
- Fuse IMU, wheel encoders, cameras, LiDARs, and GPS
- Flexible in adding homogeneous and heterogeneous sensors
- Online calibration of all onboard sensors' extrinsics and intrinsics
- Handle asynchronous and delayed measurements
- Lightweight with on-manifold state interpolation and dynamic cloning
- Resilient to sensor failures, measurement depletion, and outliers
- Robust proprioceptive sensor-based initialization
- Sensor simulation toolbox for algorithm evaluation
2023
bibtex / pdf / code

Optimization-based VINS: Consistency, Marginalization, and FEJ
A comprehensive analysis of the application of the First-estimates Jacobian (FEJ) design methodology in nonlinear optimization-based Visual-Inertial Navigation Systems (VINS), 2023
bibtex / pdf / tech report / talk / slides

Monocular Visual-Inertial Odometry with Planar Regularities
A novel real-time monocular VIO system that is fully regularized by planar features within a lightweight multi-state constraint Kalman filter (MSCKF), 2023
bibtex / pdf / video / talk / code / dataset / slides / poster

Visual-Inertial-Aided Online MAV System Identification
An online MAV system identification algorithm to tightly fuse visual, inertial and MAV aerodynamic information within a lightweight multi-state constraint Kalman filter (MSCKF) framework, 2022
bibtex / pdf / tech report / video / talk / slides

Tightly-coupled GNSS-aided Visual-Inertial Localization
A tightly-coupled GNSS-aided visual-inertial navigation system that leverages the complementary sensing modalities of a visual-inertial sensing pair, which provides high-frequency local information, and a GNSS receiver, which provides low-frequency global observations, 2022
bibtex / pdf / talk / slides / poster

Decoupled Right Invariant Error States for Consistent Visual-Inertial Navigation
Two novel algorithms which preserve the system consistency by leveraging the invariant state representation and ensure efficiency by decoupling features from covariance propagation, 2022
bibtex / pdf / tech report

Efficient multi-sensor aided inertial navigation with online calibration
A versatile multi-sensor aided inertial navigation system that can efficiently fuse multi-modal measurements of IMU, camera, wheel encoder, GPS, and 3D LiDAR along with online spatiotemporal sensor calibration.
bibtex / pdf / video

iCalib: Inertial Aided Multi-Sensor Calibration
A holistic non-linear least squares (NLS) based multi-sensor calibration system, which exploits high-rate inertial navigation and can handle a multitude of asynchronous sensors commonly found on robots (e.g., IMU, cameras, LiDARs, and wheel odometry) while requiring only initial calibration guesses, 2021
bibtex / pdf / talk / slides

OpenVINS: A Research Platform for Visual-Inertial Estimation
An open platform, termed OpenVINS, for visual-inertial estimation research. Supports:
- On-manifold sliding window Kalman filter
- Online IMU/camera intrinsic and extrinsic calibration
- Camera to inertial sensor time offset calibration
- SLAM landmarks with different representations and consistent FEJ
- Modular type system for state management
- Extendable visual-inertial system simulator
- Extensive toolbox for algorithm evaluation
2020
bibtex / pdf / video / talk / code / slides

Versatile 3D Multi-Sensor Fusion for Lightweight 2D Localization
A 2-stage localization system which incorporates both offline prior map building and online multi-modal localization. In particular, we develop an occupancy grid mapping system with probabilistic odometry fusion, accurate scan-to-submap covariance modeling, and accelerated loop-closure detection, which is further aided by 2D line features that exploit the environmental structural constraints, 2020
bibtex / pdf / video / slides

Visual-Inertial-Wheel Odometry with Online Calibration
A novel visual-inertial-wheel odometry (VIWO) system for ground vehicles, which efficiently fuses multi-modal visual, inertial, and 2D wheel odometry measurements in a sliding-window filtering fashion. Both the intrinsic and the extrinsic (spatiotemporal) parameters of the wheel encoders are calibrated online, 2020
bibtex / pdf / tech report / video / slides

Intermittent GPS-aided VIO: Online Initialization and Calibration
An efficient and robust GPS-aided visual inertial odometry (GPS-VIO) system that fuses IMU-camera data with intermittent GPS measurements. To perform sensor fusion, spatiotemporal sensor calibration and initialization of the transform between the sensor reference frames are addressed, 2020
bibtex / pdf / tech report / video / talk / dataset / slides

OpenVINS: A Research Platform for Visual-Inertial Estimation
The open sourced codebase provides a foundation for researchers and engineers to quickly start developing new capabilities for their visual-inertial systems, 2019
bibtex / pdf / code / poster

LIC-Fusion: LiDAR-Inertial-Camera Odometry
A tightly-coupled multi-sensor fusion algorithm termed LiDAR-inertial-camera fusion (LIC-Fusion), which efficiently fuses IMU measurements, sparse visual features, and extracted LiDAR points, 2019
bibtex / pdf / arXiv / video / slides

Practical Modeling of GNSS for Autonomous Vehicles in Urban Environments
A method for quantitatively modeling the uncertainty of GNSS and accumulated wheel encoder data collected by vehicles in urban environments, 2019
bibtex / pdf

Position estimation using multiple low-cost GPS receivers for outdoor mobile robots
A method of using multiple low-cost GPS receivers for position estimation, improving accuracy while reducing system cost, 2015
bibtex / pdf

A Big Data System Design to Predict the Vehicle Slip
A system that stores vehicle-generated sensor data in a big-data framework and analyzes real-time sensor output to predict vehicle slip, 2015
bibtex / pdf


Special thanks to Jon Barron and Patrick Geneva for the website's source code.