iLoco
Details
In this work, we present iLoco, a real-time, plug-and-play visual SLAM system that leverages the iPhone’s built-in RGB-D camera and IMU for accurate localization. iLoco estimates pose using ORB feature matching for RGB-D visual feature extraction and tracking, while GTSAM tightly integrates inertial measurements with visual odometry for improved robustness. The system is engineered as a “slap-on” solution, requiring minimal setup and no external calibration, making it well suited to rapid prototyping, educational demonstrations, and accessible SLAM research. iLoco prioritizes ease of use and adaptability, allowing a wide range of users, from students to developers, to harness real-time SLAM on everyday mobile devices.
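To give a flavor of the feature-tracking step: ORB produces binary descriptors that are matched by Hamming distance between frames. The sketch below is purely illustrative and is not iLoco's actual code; the function names, toy 8-bit "descriptors", and threshold are assumptions (real ORB descriptors are 256 bits, and production systems use an optimized matcher such as OpenCV's Hamming brute-force matcher).

```python
# Minimal sketch of binary-descriptor matching as used with ORB features.
# ORB descriptors are binary strings compared via Hamming distance; the
# toy data and names below are illustrative only, not from iLoco.

def hamming(a: int, b: int) -> int:
    """Hamming distance between two equal-length binary descriptors."""
    return bin(a ^ b).count("1")

def match(descs_prev, descs_curr, max_dist=64):
    """Brute-force nearest-neighbor matching with a distance cutoff.

    Returns (index_in_prev, index_in_curr, distance) triples.
    """
    matches = []
    for i, d in enumerate(descs_prev):
        j, best = min(
            ((j, hamming(d, e)) for j, e in enumerate(descs_curr)),
            key=lambda t: t[1],
        )
        if best <= max_dist:
            matches.append((i, j, best))
    return matches

# Toy 8-bit "descriptors" for illustration (real ORB uses 256 bits).
prev_frame = [0b10110010, 0b01101100]
curr_frame = [0b10110011, 0b11110000]
print(match(prev_frame, curr_frame))  # → [(0, 0, 1), (1, 1, 4)]
```

The matched keypoint pairs would then feed the visual-odometry front end, whose relative-pose estimates are fused with IMU preintegration factors in the GTSAM factor graph.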
Our project report is available here: iLoco_project_report.pdf
Project information
- Project date: May 2025
- Project URL: GitHub