Show HN: Stray Scanner – collect raw Lidar scans on iOS devices
Hey everyone! Here is an app I've been working on over the past couple of years for my own purposes. The original idea came from wanting to collect RGB-D scans for research and development. One option was to buy a depth camera, but I figured I might as well use the time-of-flight sensor on my phone.
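For anyone curious how that works on iOS: ARKit exposes the LiDAR depth map on every frame, synchronized with the RGB image and the camera pose. A minimal sketch of the idea (not the app's actual code, just the core API calls):

    import ARKit

    // Minimal sketch: requesting per-frame LiDAR depth through ARKit.
    class DepthRecorder: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            let config = ARWorldTrackingConfiguration()
            // sceneDepth is only available on devices with the LiDAR sensor.
            if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
                config.frameSemantics.insert(.sceneDepth)
            }
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            guard let depth = frame.sceneDepth else { return }
            // depth.depthMap is a CVPixelBuffer of 32-bit float metres,
            // frame.capturedImage is the synchronized RGB image,
            // frame.camera holds the intrinsics and the 6-DoF pose.
            let depthMap: CVPixelBuffer = depth.depthMap
            _ = depthMap // write to disk, encode to video, etc.
        }
    }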
The app has been on the App Store for a while, and I originally wrote about it here: https://keke.dev/blog/2021/03/10/Stray-Scanner.html. The app can be downloaded for free here: https://apps.apple.com/us/app/stray-scanner/id1557051662.
Over the years, tons of users from all over the world have reached out asking for the source code, requesting features, or wanting to adapt it in some way. They are usually computer vision researchers at universities or corporate research labs, or engineers working on commercial projects.
Sometimes the requested changes don't make sense for other users. Sometimes they do, but I don't have the time to develop them.
I've now decided to release the source code so that people can hack it to do whatever they want with it. Hopefully people will find it useful, and maybe some will contribute changes back so the app becomes more useful for others as well.
This is wonderful.
What I have been mulling over of late is LIDAR drones and LIDAR accessories for bicycles - it would be great to have a lidar mapper on my MTB.
Here is my MTB https://imgur.com/gallery/aca4tSU
I want to mount a lidar on this to map the trails it rides...
What recommendations do you have for accomplishing this cleanly and cheaply, given your release?
In terms of advice, I would for sure start by adding GPS logging to the app. The app already logs the ARKit odometry information, which could be a good starting point, but I don't know how well that would perform with all that vibration.
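Something roughly like this could work for the GPS part (an untested sketch with Core Location; the logger class and in-memory format here are just placeholders, not how the app currently stores data):

    import CoreLocation

    // Rough sketch of GPS logging that could be written out alongside the
    // ARKit odometry. How it gets persisted is up to whoever implements it.
    class GPSLogger: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()
        private(set) var samples: [(timestamp: TimeInterval, lat: Double, lon: Double, hAcc: Double)] = []

        func start() {
            manager.delegate = self
            manager.desiredAccuracy = kCLLocationAccuracyBest
            manager.requestWhenInUseAuthorization()
            manager.startUpdatingLocation()
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            for loc in locations {
                samples.append((loc.timestamp.timeIntervalSince1970,
                                loc.coordinate.latitude,
                                loc.coordinate.longitude,
                                loc.horizontalAccuracy))
            }
        }
    }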
It would for sure be a fun experiment, but given the extreme conditions, I think it might be easier to build a custom SLAM sensor rig with a high-quality IMU and several synchronized cameras.
That sounds awesome. I do some mountain biking myself and have sometimes thought it would be cool if you could replay your rides in 3D through a sparse SLAM point cloud. Of course, it would be very hard to run SLAM onboard an MTB, as there is so much vibration and the lighting changes constantly.
If you could have an accelerometer reading, like from the phone running the app, mitigate vibration offsets in the measurements, that would be cool. This is all beyond my capabilities, but I'd pay some HNer if they could build a LIDAR scanner I can mount on a bike...