Show HN: Real-world speedrun timer that auto-ticks via vision on smart glasses

github.com

4 points by tash_2s 2 months ago · 3 comments · 1 min read

I built a hands-free HUD for smart glasses that runs a real-world speedrun timer and auto-splits based on what the camera sees. Demo scenario: making sushi.

Demo: https://www.youtube.com/watch?v=NuOVlyr-e1w

Repo: https://github.com/RealComputer/GlassKit

I initially tried a multimodal LLM for scene understanding, but the latency and consistency were not good enough for this use case, so I switched to a small object detection model (fine-tuned RF-DETR). It just runs an inference loop on the camera feed. This also makes on-device/offline use feasible (today it still runs via a local server).
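The auto-split logic on top of that inference loop can be sketched roughly like this. This is a minimal illustration, not the GlassKit implementation: the step names, trigger labels, and debouncing scheme are all assumptions I'm making for the example. The core idea is that each step completes when the detector reports that step's trigger label above a confidence threshold for a few consecutive frames, since single-frame detections are noisy.

```python
import time


class AutoSplitTimer:
    """Sketch of a detection-driven split timer (hypothetical, not GlassKit's API).

    Each step is a (step_name, trigger_label) pair. A split fires once the
    trigger label is detected above `threshold` on `debounce_frames`
    consecutive frames, which filters out one-frame false positives.
    """

    def __init__(self, steps, threshold=0.5, debounce_frames=3,
                 clock=time.monotonic):
        self.steps = steps            # ordered list of (step_name, trigger_label)
        self.threshold = threshold
        self.debounce = debounce_frames
        self.clock = clock            # injectable for testing
        self.current = 0              # index of the step we're waiting on
        self.hits = 0                 # consecutive frames with the trigger seen
        self.splits = []              # completed (step_name, elapsed_seconds)
        self.start = clock()

    def on_frame(self, detections):
        """Feed one frame's detections: a list of (label, confidence) pairs.

        Returns the completed step's name when a split fires, else None.
        """
        if self.current >= len(self.steps):
            return None               # run finished
        name, trigger = self.steps[self.current]
        seen = any(label == trigger and conf >= self.threshold
                   for label, conf in detections)
        self.hits = self.hits + 1 if seen else 0
        if self.hits >= self.debounce:
            self.splits.append((name, self.clock() - self.start))
            self.current += 1
            self.hits = 0
            return name
        return None
```

The caller would run the detector on each camera frame and pass its output to `on_frame`; the HUD re-renders whenever a split fires. Injecting the clock keeps the timing logic testable without real time passing.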

usefulposter 2 months ago

I can imagine corps using this HUD with a vision model for worker supervision. (Manna, anyone?)

    Cheeseburger Assembly
    
        x Toast bun
        x Place bun
        x Add ketchup
        x Add onions
        Add pickle
        Place cheese slice
        Add patty
        Place bottom bun
        Wrap and flip burger

    TIME REMAINING: 00:09.01
    UNITS THIS HOUR: 60
    ACCURACY: 99.8%
    KEEP UP THE GOOD WORK!
    tash_2s OP 2 months ago

    Totally. I think this kind of thing sits right on that line: it can help someone (hands-free guidance, training, accessibility, staying in flow), and it can also slide into a pretty dystopian "score the worker" surveillance HUD. My intent here is the former: personal "real-world speedrun" / practice tooling, not a manager dashboard or productivity policing.

tash_2s OP 2 months ago

Cooking feels like a perfect fit for smart glasses (hands busy, lots of short steps), but I have not seen many apps that work reliably in a real kitchen. It feels like the hardware is finally getting to the point where this should become practical soon.
