It appears Google did decide to skip the 13th floor. What could we be waiting for?
A scenario-friendly Bluetooth proximity API. Since Glass moves freely with the wearer, its APIs could point you toward the nearest beacon by combining signal strength with accelerometer/compass readings. But in lieu of that, Android 4.4’s support for BLE proximity would be great for a project I’m working on.
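The signal-strength half of that idea is doable today: BLE scan callbacks report an RSSI for each beacon, and a log-distance path-loss model turns that into a rough range estimate. Here's a minimal sketch of that math in plain Java; the class and method names are mine, and the calibration values (transmit power measured at 1 m, path-loss exponent) are typical assumptions you'd tune per beacon and environment, not anything Glass or Android ships with.

```java
public class RssiDistance {

    /**
     * Estimate distance in meters from a BLE RSSI reading using the
     * log-distance path-loss model:
     *   distance = 10 ^ ((txPowerAt1m - rssi) / (10 * n))
     *
     * @param rssi             measured signal strength in dBm (e.g. -72)
     * @param txPowerAt1m      calibrated RSSI at 1 m, in dBm (beacon-specific;
     *                         -59 is a common assumed default)
     * @param pathLossExponent environment factor n (~2.0 free space,
     *                         higher indoors with obstructions)
     */
    public static double estimateDistance(int rssi, int txPowerAt1m,
                                          double pathLossExponent) {
        return Math.pow(10.0, (txPowerAt1m - rssi) / (10.0 * pathLossExponent));
    }

    public static void main(String[] args) {
        // At the calibration point, the model returns 1 m by definition.
        System.out.println(estimateDistance(-59, -59, 2.0)); // 1.0
        // 20 dB weaker with n = 2.0 works out to 10 m.
        System.out.println(estimateDistance(-79, -59, 2.0)); // 10.0
    }
}
```

RSSI is noisy, so in practice you'd smooth several readings (a moving average or Kalman filter) before trusting the estimate; the compass/accelerometer fusion for *direction* would be a separate layer on top of ranging like this.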
Less picky head-on. I’ve resorted to turning off head detection every time I have people try scenarios, because Glass simply won’t turn on for them. And if I forget to turn it back on, I accumulate a fabulous collection of upside-down pictures & videos of my desk. There should be a middle ground.
Hands-freeish, voice-free. Google’s recent “don’t be a glasshole” PR work reminds us they designed Glass to be used with voice. But there are times when barking a command just doesn’t work (like when I wanted to live-tweet at Hackfort).
I’d like Glass to ship with a small pocket keypad, like a screenless Blackberry, for touch-typing into the timeline.
Preparing for the alternative hardware story. I believe wearables’ future is in software and services, not this early hardware that has poor battery life, is uncomfortable, can’t be used with your own sunglasses, jacks facial symmetry, and is legitimately made fun of on SNL.
But the software is visionary, and, like Android proper, could work on many devices that target particular industries, situations, fashions, and needs. I think we’ll start to see this reflected in the software (i.e., builds of the OS that run on a spec/VM rather than specifically on XE hardware) soon.