Earlier today I met with Professor Golub and discussed this project. We talked at length about options for different stitching and warping techniques. His responses to my questions are paraphrased and elaborated upon below.
- After hearing my project plan described in detail, is there anything you think I’ve overlooked or should otherwise consider in more depth?
We discovered that the photos taken by the cameras have a 4:3 aspect ratio, which means they have a 160-degree viewing angle in the horizontal direction and a 120-degree viewing angle in the vertical direction. This may have ramifications for my camera alignment.
The cameras have wide-angle lenses, not fisheye. There are subtle differences between the two that will be important to keep in mind.
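One way to see why the wide-angle vs. fisheye distinction matters is to check which projection model the 160° × 120° numbers are consistent with. The sketch below (my own check, not from the meeting) compares the two idealized models: an equidistant (fisheye-style) projection scales FOV linearly with sensor size, while an ideal rectilinear lens scales with the tangent of the half-angle. Real wide-angle lenses fall somewhere in between.

```python
import math

# Horizontal FOV and sensor aspect ratio from the camera spec (4:3, 160 deg).
HFOV_DEG = 160.0
ASPECT = 3.0 / 4.0  # vertical sensor dimension / horizontal

# Equidistant (fisheye-style) model: FOV scales linearly with sensor size.
vfov_equidistant = HFOV_DEG * ASPECT  # 120 deg, matching the spec sheet

# Rectilinear (ideal wide-angle) model: FOV scales with tan of the half-angle.
half_h = math.radians(HFOV_DEG / 2)
vfov_rectilinear = math.degrees(2 * math.atan(ASPECT * math.tan(half_h)))

print(vfov_equidistant)   # 120.0
print(vfov_rectilinear)   # ~153.5
```

The quoted 120° vertical FOV matches the linear (equidistant) scaling, so the lenses probably aren't perfectly rectilinear either; the actual mapping will matter when computing warps.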
He suggested embracing the DIY nature of this project and including an explanation of the math and science that underpins it. It could, he added, be used as a learning tool in a high school setting. As a demonstration, students could hold hands and form a circle around the camera system – circles under a certain size might have visible gaps.
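The "circles under a certain size" demonstration can be estimated with some 2-D geometry: because the cameras sit on a ring rather than at a single point, the edge rays of adjacent cameras only cross at some distance from the rig, and anything closer falls in a blind wedge. The sketch below is a simplified model of my own (the ring radius and camera count are hypothetical placeholders, not measured values).

```python
import numpy as np

def coverage_gap_radius(ring_radius, hfov_deg, n_cameras):
    """Distance from the rig center beyond which adjacent outward-facing
    cameras' views overlap; anything closer sits in a blind wedge.
    Simplified 2-D model: cameras on a ring, each facing radially outward."""
    spacing = 2 * np.pi / n_cameras
    half_fov = np.radians(hfov_deg) / 2
    # Camera 0 at angle 0; its counter-clockwise edge ray.
    p0 = np.array([ring_radius, 0.0])
    d0 = np.array([np.cos(half_fov), np.sin(half_fov)])
    # Camera 1 at angle `spacing`; its clockwise edge ray.
    p1 = ring_radius * np.array([np.cos(spacing), np.sin(spacing)])
    a1 = spacing - half_fov
    d1 = np.array([np.cos(a1), np.sin(a1)])
    # Solve p0 + t0*d0 = p1 + t1*d1 for the crossing point of the edge rays.
    t = np.linalg.solve(np.column_stack([d0, -d1]), p1 - p0)
    crossing = p0 + t[0] * d0
    return float(np.linalg.norm(crossing))

# Hypothetical rig: 10 cm ring radius, 160-deg lenses, 4 cameras.
print(coverage_gap_radius(0.10, 160, 4))  # ~0.17 m
```

So with these made-up numbers, a circle of students tighter than roughly 20 cm from the rig would start to show gaps – a tidy classroom illustration of why the cameras need overlap.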
- Do you have any advice or suggestions regarding the homography transformations and photo stitching process?
We spent most of our time discussing this issue. He suggested that I look into Autopano-SIFT and PanoTools, two of the programs behind Hugin. I could use Autopano-SIFT to find overlapping pixels (control points) and then hard-code the transformations; I'd likely need to warp the images using PanoTools. He also suggested I experiment with cubemaps (known commonly as skyboxes in video game development). That approach would involve 4 cameras in the horizontal direction, cubemap warping, trimming each image to a 1:1 (square) ratio, and then converting the cubemap into equirectangular format.
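To convince myself the cubemap-to-equirectangular step is tractable, here is a minimal nearest-neighbor sketch of the conversion for a 4-face horizontal ring. It assumes the faces have already been trimmed square and sit at 90° yaw increments; PanoTools/Hugin do this properly with interpolation and lens correction, so this is only a proof of concept.

```python
import numpy as np

def ring_cubemap_to_equirect(faces, out_h, out_w):
    """Resample a 4-face horizontal cubemap ring into an equirectangular
    panorama (nearest-neighbor). faces[k] is a square RGB image whose view
    is centered at yaw = -180 + 90*k degrees. Pixels above/below the ring's
    vertical coverage are left black."""
    size = faces[0].shape[0]
    out = np.zeros((out_h, out_w, 3), dtype=faces[0].dtype)
    # Longitude/latitude of each output pixel center.
    lon = (np.arange(out_w) + 0.5) / out_w * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(out_h) + 0.5) / out_h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Each face covers a 90-degree slice of yaw.
    face_idx = ((lon + np.pi + np.pi / 4) // (np.pi / 2)).astype(int) % 4
    local = lon - (face_idx * np.pi / 2 - np.pi)
    local = (local + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi/4, pi/4]
    # Gnomonic projection onto the face plane at unit distance.
    u = np.tan(local)
    v = np.tan(lat) / np.cos(local)
    valid = np.abs(v) <= 1  # rows outside the ring's vertical span stay black
    x = np.clip(((u + 1) / 2 * (size - 1)).round().astype(int), 0, size - 1)
    y = np.clip(((1 - v) / 2 * (size - 1)).round().astype(int), 0, size - 1)
    for k in range(4):
        m = valid & (face_idx == k)
        out[m] = faces[k][y[m], x[m]]
    return out
```

Feeding in four solid-color test faces makes it easy to verify that each 90° yaw slice of the output pulls from the right face and that the poles stay black.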
- If I can’t get internal automated stitching to work, could you suggest any external stitching programs that work in bulk?
Hugin should work well but usually requires 20-25% overlap, which will require an additional camera. Look into the Ricoh Theta DIY and the VR/Unity communities for premade solutions to similar problems. You'll want to strive for automated stitching – doing it manually will be incredibly tedious.
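The overlap requirement can be estimated directly from the camera count and horizontal FOV. The sketch below is my own back-of-the-envelope check (the camera counts are hypothetical), and it optimistically assumes the full 160° is usable – in practice the distorted edges of a wide-angle frame contribute less, which is presumably why an extra camera helps.

```python
# Fraction of each frame shared with its neighbor when N cameras are
# spaced evenly around a ring, assuming the full horizontal FOV is usable.
def overlap_fraction(hfov_deg, n_cameras):
    spacing = 360.0 / n_cameras
    return max(0.0, (hfov_deg - spacing) / hfov_deg)

for n in (3, 4):
    print(n, overlap_fraction(160, n))
# 3 cameras: 0.25   (right at the edge of Hugin's 20-25% guideline)
# 4 cameras: 0.4375 (comfortable margin)
```

This matches the advice: three 160° cameras sit exactly at the 25% boundary with no slack for edge distortion, while four give a comfortable margin.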
- Do you have any suggestions for locations on campus to film?
- On McKeldin by the fountain
- There is a little ‘island’ on Campus Dr. that faces the M. From there you could capture the cars, students, and sky – nicely encapsulating the busyness of campus.
- Physical Sciences side of the Stadium and Regents Drive intersection.
- Dock at Lake Artemesia
- Do you have any thoughts or suggestions about weather/theft proofing?
“Stay and watch it.” It’ll be very difficult to build a theft deterrent without obstructing the view of the cameras.
Based on this conversation, I have some new information to look through and additional candidate stitching applications. Depending on the results of future testing, I may decide to add an additional camera to the system, but other than that my plans have not really changed.