I’ve started working on a new project, teaming up with Jack and Emery to continue the work they began last year with their capstone. The long-term goal of this project is to create a drone equipped with a laser scanning system capable of scanning a room (or other space) and recreating it digitally.
This project has several phases, each laden with technical challenges and complexity.
Phase 1: Build the drone
This was mostly completed last year by Emery and Jack. Check out their blogs (Jack and Emery) to learn more about that process.
Phase 2: Proof of concept room scanning (SONAR)
Last year Emery prototyped a low-resolution proof-of-concept scanner. The prototype originally used an ultrasonic sensor and later switched to LiDAR. Its two servos move in tiny increments while the LiDAR takes distance measurements. During this phase we got some experience with point clouds and applied surfaces. Because of how the servos are arranged, the LiDAR doesn’t rotate about a consistent axis, which caused some anomalies in the recorded points.
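At its core, this kind of scanner is doing a spherical-to-Cartesian conversion: each (pan angle, tilt angle, distance) reading becomes one point in the cloud. Here’s a minimal sketch of that conversion; the angle conventions here are an assumption on my part, and the actual servo geometry may differ:

```python
import math

def spherical_to_cartesian(pan_deg, tilt_deg, distance):
    """Convert one pan/tilt scanner reading into a Cartesian point.

    pan_deg:  rotation about the vertical axis (azimuth), in degrees
    tilt_deg: elevation above the horizontal plane, in degrees
    distance: range reported by the LiDAR, in whatever units you want out
    """
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    x = distance * math.cos(tilt) * math.cos(pan)
    y = distance * math.cos(tilt) * math.sin(pan)
    z = distance * math.sin(tilt)
    return (x, y, z)
```

Sweeping both servos through their increments and running each reading through a function like this is what produces the raw point cloud.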
Phase 3: High fidelity room scanning (LiDAR)
This is the part that I’m working on. The idea is to recreate the prototype room scanner with high-precision stepper motors and LiDAR. In this phase, we’ll also refine our coordinate system and surface mapping techniques. Early versions in this stage will use techniques similar to those used previously – rotating in fine increments and taking measurements. Later, we’ll need to rethink this design to allow for significantly faster measuring.
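For inspecting results along the way, it helps that point clouds can be dumped to a plain-text .xyz file, which viewers like MeshLab and CloudCompare open directly. A minimal writer (the function name is just illustrative, not part of our codebase) could look like:

```python
def write_xyz(points, path):
    """Write an iterable of (x, y, z) tuples as a plain-text .xyz file,
    one space-separated point per line."""
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"{x:.4f} {y:.4f} {z:.4f}\n")
```

Having a dead-simple export format makes it easy to eyeball each scan before worrying about surface reconstruction.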
Phase 4: Precise positioning of the drone
We’ll probably be developing Phases 3 and 4 concurrently as we have time. To use the scanner on the drone, we need to know the drone’s location very precisely (within a few centimeters at least). The drone’s position must be added to each coordinate that the scanner measures; otherwise, all of our points become trash as soon as the drone moves. Commercial GPS is only accurate to within 2-3 meters: great for a car, but not nearly good enough for us. At least for the early stages of this phase, we don’t need to worry about the drone’s global location; we only need its distance to some fixed reference point. We’ve been looking into different solutions involving ultra-wideband (UWB) technology, which promises accuracy within 10 cm. Other options include optical flow sensors and inertial navigation systems.
For the same reasons, we’ll also need to know the drone’s pitch, roll, and yaw (or compass direction) very accurately. These will likely come from onboard sensors, including a magnetometer, a gyroscope, and several accelerometers.
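Taken together, Phases 3 and 4 amount to a frame transform: each point the scanner measures in the drone’s body frame has to be rotated by the drone’s attitude and translated by its position before it can join the world-frame point cloud. A rough sketch of that transform, assuming a standard yaw-pitch-roll (Z-Y-X) rotation convention, which is my assumption rather than a settled design decision:

```python
import math

def body_to_world(point, position, roll, pitch, yaw):
    """Transform a scanner point from the drone's body frame into the
    world frame: rotate by attitude (Z-Y-X yaw-pitch-roll convention,
    angles in radians), then translate by the drone's position."""
    x, y, z = point
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # R = Rz(yaw) @ Ry(pitch) @ Rx(roll), written out element by element
    wx = (cy * cp) * x + (cy * sp * sr - sy * cr) * y + (cy * sp * cr + sy * sr) * z
    wy = (sy * cp) * x + (sy * sp * sr + cy * cr) * y + (sy * sp * cr - cy * sr) * z
    wz = (-sp) * x + (cp * sr) * y + (cp * cr) * z
    px, py, pz = position
    return (wx + px, wy + py, wz + pz)
```

This is also why the accuracy requirements are so strict: a small error in attitude or position gets baked into every single point measured from that pose.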
Phase 5: Putting it all together
This phase is when we will integrate the LiDAR scanning system with the drone. Factors to consider include battery life, balancing the scanner, scanning speed, scanner reliability, drone reliability, etc.
It is likely that Phase 3 (at least initially) will produce a scanning system too slow to be practical for drone mounting. That system will ultimately need to be redesigned to work at high speed so that the drone can map in real time as it moves.
Phase 6: Future Steps
I’m lumping all of the far-reaching future goals of this project into one phase. If/when we get closer to implementing these, I’ll break them down into new phases.
Autonomous Flight
A very long-term goal of this project is for the drone to be able to fly autonomously. This will probably involve integrating the real-time LiDAR scanning with the drone’s navigation systems so that it is capable of obstacle avoidance.
Cooperative Drones
Once we make the drone autonomous, we’d like to add a second drone to the system. The drones would work collaboratively to map an area, and inter-drone communication would be crucial for avoiding mid-air collisions. The framework for the second drone should be scalable to allow for any number of drones in the fleet.
Miniaturization
As our drone technology becomes further refined, we’d like to recreate this project on a smaller, more agile drone.
Other Sensors/Cameras
This project will become more versatile with the inclusion of other sensors and camera mapping technology. If we can do real-time point cloud mapping, we’d like to supplement the point cloud with photos to create a detailed 3D photoscape.
I’ll be updating this blog with my design process and progress as this project evolves.