Gadgets
From The Verge, 5 days ago
Nanoleaf bets its future on robots, red light therapy, and AI
Nanoleaf is shifting from smart lighting toward embodied AI, wellness, and robotics as open standards commoditize the lighting category.
Recently, an open-source project called OpenClaw surfaced on a maker community platform. Built on affordable edge-computing hardware, the project demonstrated a local AI agent controlling a physical robotic arm. It wasn't just predicting text; it was moving motors, reading sensors, and interacting with its physical environment in real time. From a psychological and sociological perspective, this transition from abstract AI to embodied local AI forces us to re-evaluate trust, privacy, and the sanctity of our personal space.
We're at a rare inflection point. Robots are moving from research labs and factory floors into everyday life. Right now, they're being dropped into human spaces and, often, missing the mark. Yet embodied AI is becoming more intelligent, manipulation more capable, and perception more attuned. These shifts are giving robotics a new expressive range, the ability to move, interact, and take shape in ways that feel natural in human environments. It's a moment full of possibility.
The "Butter-Bench" test, as detailed in a yet-to-be-peer-reviewed paper, is a "benchmark that evaluates practical intelligence in embodied LLM." In the test, the robot had to navigate to an office kitchen, wait while butter was placed on a tray attached to its back, confirm the pickup, deliver the butter to a marked location, and finally return to its charging dock. The results of the Butter-Bench experiment, the researchers conceded, were dubious.
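The task described above is a fixed sequence of stages the robot must complete in order. As a rough illustration only (the paper's actual harness is not public here, and all names below are hypothetical), the stage progression can be sketched as a simple ordered state machine:

```python
from enum import Enum, auto

class Stage(Enum):
    """Hypothetical stages of the Butter-Bench delivery task, in order."""
    NAVIGATE_TO_KITCHEN = auto()
    AWAIT_BUTTER_LOAD = auto()   # butter is placed on the back-mounted tray
    CONFIRM_PICKUP = auto()
    DELIVER_TO_MARK = auto()
    RETURN_TO_DOCK = auto()
    DONE = auto()

ORDER = list(Stage)  # Enum preserves declaration order

def advance(stage: Stage) -> Stage:
    """Step to the next stage of the task, stopping once DONE is reached."""
    idx = ORDER.index(stage)
    return ORDER[min(idx + 1, len(ORDER) - 1)]

# Walk the full task from start to finish and record the trace.
stage = Stage.NAVIGATE_TO_KITCHEN
trace = [stage.name]
while stage is not Stage.DONE:
    stage = advance(stage)
    trace.append(stage.name)
print(trace)
```

In the real benchmark each transition would be gated on sensor checks or human confirmation rather than advancing unconditionally; the sketch only captures the ordering of subtasks the robot is scored on.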