A German Roboter-Forum thread http://www.roboter-forum.com/showthread ... n&p=232830 references a paper claiming the inexpensive Neato and Xiaomi lidars are not true lidars that measure the travel time of a light pulse to the target. Instead, it is claimed that the spacing between the emitter and sensor barrels allows parallax triangulation against the camera sensor, as described in a paper at https://www.diva-portal.org/smash/get/d ... TEXT01.pdf
The original Neato lidar was reported in IEEE Spectrum http://ieeexplore.ieee.org/Xplore/login ... ision=-203
[edit] Another copy of the same paper, I think: https://pdfs.semanticscholar.org/4d8f/5 ... 677254.pdf
Some DIY lidar info at http://www.seattlerobotics.org/encoder/ ... vision.htm
If the robot lidar uses parallax, then disassembling the lidar and placing the sensor and emitter barrels together would eliminate the parallax and ruin the measurements; timing the travel of light pulses would not be affected by that change. It would also be possible to build a simpler DIY lidar with a wider baseline of several inches and perhaps cheaper cameras.
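To make the parallax idea concrete, here is a minimal sketch of the similar-triangles relation such a sensor could use. The baseline, focal length, and pixel offsets are illustrative guesses of mine, not values from the paper or the Neato/Xiaomi hardware.

[code]
# Rough sketch of the parallax/triangulation relation, not the actual
# Neato/Xiaomi firmware math; all numbers below are illustrative guesses.

def triangulated_range(pixel_offset, baseline_m=0.025, focal_len_px=800.0):
    """Estimate target range from the lateral offset (in pixels) of the
    laser spot on the image sensor.

    baseline_m   -- assumed emitter-to-camera spacing (about one inch)
    focal_len_px -- assumed camera focal length expressed in pixels
    pixel_offset -- measured spot displacement in pixels
    """
    if pixel_offset <= 0:
        raise ValueError("spot at or beyond infinity; no range estimate")
    # Similar triangles: range = focal_length * baseline / offset
    return focal_len_px * baseline_m / pixel_offset

# A nearby target shifts the spot a lot, a distant one hardly at all.
for px in (80, 8, 2):
    print(f"offset {px:3d} px  ->  range {triangulated_range(px):.2f} m")
[/code]

Pushing the barrels together drives the baseline toward zero, so the spot offset vanishes and the formula gives no usable range, which is why the disassembly test above would distinguish the two designs.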
Commercial Neato lidar controller http://www.getsurreal.com/ and DIY Arduino controller http://www.roboter-forum.com/showthread ... test-bench
As the robot vacuum lidars have only about an inch of spacing between the barrels, I have to wonder whether triangulation is practical, but I have not analyzed it in detail. I expect the processing is embedded in a custom digital IC on the lidar circuit board and is not directly available for analysis.
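As a rough check on that question, differentiating range = f*b/offset shows the range error grows with the square of the range for a fixed spot-detection resolution, which is what would cap a small-baseline unit at room-scale distances. The focal length and sub-pixel resolution below are my own assumptions, not measured values.

[code]
# Back-of-the-envelope check on whether a ~1 inch baseline is workable.
# Assumed numbers (focal length in pixels, sub-pixel resolution) are guesses,
# not taken from the Neato/Xiaomi hardware.

BASELINE_M   = 0.025   # ~1 inch emitter/sensor separation
FOCAL_PX     = 800.0   # assumed focal length in pixel units
SUBPIXEL_RES = 0.1     # assumed smallest detectable spot shift, in pixels

def range_error(range_m):
    """Approximate range uncertainty at a given range.

    Differentiating range = f*b/offset gives
    d_range ~ range^2 / (f*b) * d_offset,
    so the error grows with the square of the range.
    """
    return (range_m ** 2) / (FOCAL_PX * BASELINE_M) * SUBPIXEL_RES

for r in (0.3, 1.0, 3.0, 6.0):
    print(f"range {r:4.1f} m  ->  error ~{range_error(r) * 100:.2f} cm")
[/code]

Under these assumed numbers the error stays at the centimeter level across a room but blows up quickly beyond several meters, which at least makes a one-inch baseline look plausible for an indoor vacuum.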
Programs to operate the installed lidar over the robot's USB port:
Neato Control http://www.robotreviews.com/chat/viewto ... 20&t=18173
Python scripts http://www.kelrobot.fr/forum/programme- ... tml#p14011
[edit] Additional Neato reference received http://www.hizook.com/blog/2009/12/20/u ... o-robotics
[edit] The parallax method explains how these lidars can measure down to about one foot from the robot, where light-speed timing would be difficult, and why range is limited by the small separation between the barrels. Human depth perception likewise does not work by parallax beyond roughly 30 feet (which limits the usefulness of 3D movies for longer shots); beyond that, perspective geometry and other pattern cues take over.
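A quick arithmetic check of my own on why timing is hard at one foot: the round trip is only a couple of nanoseconds, and centimeter accuracy would need tens-of-picoseconds timing.

[code]
# Quick arithmetic on why direct time-of-flight is hard at very short range.
C = 299_792_458.0  # speed of light, m/s

def round_trip_time(range_m):
    """Time for a light pulse to reach the target and return."""
    return 2.0 * range_m / C

one_foot = 0.3048
print(f"round trip at 1 ft: {round_trip_time(one_foot) * 1e9:.2f} ns")

# Timing resolution needed to resolve 1 cm of range:
print(f"1 cm of range ~= {round_trip_time(0.01) * 1e12:.0f} ps of timing")
[/code]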
[edit] The lidar robots also have problems with reflective surfaces, mirrors, glass doors, and chrome furniture legs. The camera-based guidance overcomes these limitations.