
Steve the Robot

Steve is a tracked robot with a Pi 5 brain, OAK-D eyes, and a lot of vibecadding. Also: a shameless shout-out to my silicon co-pilot (hi, ChatGPT).

Background

Circa 2017, baby Steve rolled for the first time—loose wires, scuffed treads, big dreams. Extremely cute, slightly alarming, and absolutely the moment the mission creep began.

Baby Steve (2017): the primordial wobble.

Vibecadding: draw → print → test → repeat

Vibecad = rapid, low-ceremony iteration. Sketch the contour, print the bracket, test the fit, tweak, reprint. Less ceremony, faster feedback loops.

Brains & Eyes: Pi 5 + OAK-D, and a saner wiring loom

Compute: Raspberry Pi 5. Vision: Luxonis OAK-D (RGB + stereo depth). Drive: TB6612FNG motor driver. Clean 5 V rail via UBECs, fused harness, common ground.
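
For flavor, here's a minimal differential-drive sketch in the spirit of that setup, not Steve's actual firmware. It assumes the TB6612FNG's PWMA/PWMB pins are tied high and speed comes from PWMing the AIN/BIN direction pins; every GPIO number below is a placeholder for whatever the real loom uses.

```python
# Minimal TB6612FNG drive sketch with gpiozero (pin numbers are placeholders).
from gpiozero import Motor, DigitalOutputDevice
from time import sleep

standby = DigitalOutputDevice(17)         # TB6612 STBY: high = driver awake
left    = Motor(forward=22, backward=23)  # AIN1 / AIN2 on the left tread
right   = Motor(forward=24, backward=25)  # BIN1 / BIN2 on the right tread

standby.on()
left.forward(0.6)     # 60% duty cycle
right.forward(0.6)
sleep(2)

# Gentle pivot: one tread forward, one back.
left.forward(0.4)
right.backward(0.4)
sleep(1)

left.stop()
right.stop()
standby.off()
```

A nice side effect of gpiozero 2.x is that it speaks the Pi 5's new GPIO stack out of the box, so there's no RPi.GPIO wrestling.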

Control UI — live camera, depth toggle, drive sliders, and a big red “stop being spicy.”
OAK-D depth stream — halls, corners, and the occasional chair-leg jump scare.
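
And roughly how that depth stream comes off the OAK-D: a minimal depthai sketch, not Steve's actual pipeline (the preview resolution, camera sockets, and the ~10 m display scaling are all assumptions).

```python
# Minimal RGB + stereo depth grab with the depthai v2 API.
import cv2
import depthai as dai

pipeline = dai.Pipeline()

rgb = pipeline.create(dai.node.ColorCamera)
rgb.setPreviewSize(640, 360)
rgb.setInterleaved(False)

mono_l = pipeline.create(dai.node.MonoCamera)
mono_r = pipeline.create(dai.node.MonoCamera)
mono_l.setBoardSocket(dai.CameraBoardSocket.CAM_B)   # left mono camera
mono_r.setBoardSocket(dai.CameraBoardSocket.CAM_C)   # right mono camera

stereo = pipeline.create(dai.node.StereoDepth)
mono_l.out.link(stereo.left)
mono_r.out.link(stereo.right)

xout_rgb = pipeline.create(dai.node.XLinkOut)
xout_rgb.setStreamName("rgb")
rgb.preview.link(xout_rgb.input)

xout_depth = pipeline.create(dai.node.XLinkOut)
xout_depth.setStreamName("depth")
stereo.depth.link(xout_depth.input)

with dai.Device(pipeline) as device:
    q_rgb = device.getOutputQueue("rgb", maxSize=4, blocking=False)
    q_depth = device.getOutputQueue("depth", maxSize=4, blocking=False)
    while True:
        frame = q_rgb.get().getCvFrame()       # BGR preview frame
        depth_mm = q_depth.get().getFrame()    # uint16 depth in millimetres
        depth_vis = cv2.convertScaleAbs(depth_mm, alpha=255.0 / 10000)
        cv2.imshow("rgb", frame)
        cv2.imshow("depth", depth_vis)
        if cv2.waitKey(1) == ord("q"):         # q to quit
            break
```

From there it's one JPEG-encode away from the control UI's live feed.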

What worked (and what didn’t)

Future Work

This build absolutely had a silicon co-pilot: ChatGPT. From “why do my treads only go forward” to “decode the TB6612 pinout again,” Steve is a collaboration between a human with zip ties and a language model with opinions. I regret nothing.