On Friday, Ausdroid got a chance to go hands-on – or perhaps, eyes-on, if you prefer – with Google Glass, courtesy of SafetyCulture Pty Ltd.
SafetyCulture is an Australian developer of occupational health and safety (OHS) software, operating out of Townsville, Queensland. Their main mobile product, iAuditor 2, runs on iOS and Android devices and provides workplace auditing and reporting functionality. They’re looking to develop workplace safety applications to run on Glass, offering hands-free operation.
I met with Luke Anear, SafetyCulture’s CEO, during a break in the afternoon schedule at the One More Thing iOS developer conference in Melbourne. Luke told me that SafetyCulture managed to obtain their Glass via a US-based employee, who’d purchased the Explorer Edition and sent it to Australia. It’s early days for their Glassware development though, and Google hasn’t yet let any holders of Glass devices in on their plans for a retail release.
Yes, Luke let me wear their prized Glass.
Fit and Finish
As someone who wears glasses, I’m not a great candidate for trying out the Explorer Edition of Glass – it’s not designed to be worn on top of other glasses, and the design of the projection system means I can’t focus on the display without wearing my glasses. Additionally, the nose pads can’t support the device at the required height – because my own glasses are already there.
In general use, the Glass display should sit in the upper-right corner of your right eye’s field of view. This detail seems to be missed by a lot of companies publishing Glassware app concepts, as they’re usually attempting to convey some kind of data overlaid on whatever the user is looking at, AR-style.
For me, the presence of my glasses meant the display sat almost dead centre in the field of view of my right eye. This wasn’t so much of a problem standing around in a conference centre, but if you were walking down the street it would be quite distracting. Luke mentioned he’s heard stories of people taking the Explorer Edition units apart and bolting them onto their own glasses, but for some reason we decided it wouldn’t be appropriate to try this…
If you’re worried your prescription eyewear will interfere with your plans to wear Glass everywhere in your life, fear not! Google has said it will make units that work with prescription eyewear in the future, but hasn’t given much detail. We did see some prescription Glass units pop up at Google I/O, but calibrating the projector is reportedly still a hurdle for mass production.
Unlike some of the bright colours we’ve seen in the media, SafetyCulture’s Glass is dark grey, so you don’t feel quite so obvious when wearing them in public – several conference attendees actually had to do a double-take to verify that they were indeed looking at the fabled Google Glass.
All The Cards
The main Glass interface borrows a lot of design cues – or lack-of-design cues – from Google’s card-style user interface that began with Google Now and is now spreading to other Android apps and web services. You’ll generally only see one item on screen at any one time, and slide your finger across the side-mounted touchpad to swipe between “cards”.
The cards show only one item at a time – generally a large piece of text, a smaller one and sometimes an image. Surprisingly, the background of the display was a dark red, almost maroon colour – this might be configurable.
A quick swipe through a number of on-screen cards revealed some search results, photos that had been taken with the device and a navigation card showing driving directions to one of Melbourne’s outer suburbs.
Navigation brings up an interesting hot-button topic – Luke says he’s found Glass an unobtrusive and helpful navigation aid while driving. Nearly everyone I’ve discussed my Glass experience with has almost immediately pronounced it a massive distraction, and thinks it should be banned from use while driving.
This seems to be based on a perception that the display sits in the middle of your field of view, like a Terminator. If properly worn however, the display should be no more obtrusive than a GPS navigation screen. A taxi I was in the other day had two phones, a GPS display, the meter and the taxi dispatch computer all within the driver’s field of view.
There’s sure to be a middle ground here – perhaps the new Location API in Google’s Play Services will be able to detect when you’re driving and reduce the amount of information displayed to the driver.
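To make the idea concrete, here’s a speculative sketch of how a Glassware service might throttle its card feed once activity recognition says you’re driving. This is my own pseudologic in Python, not Google’s actual API – the activity names and card whitelist are assumptions for illustration:

```python
# Speculative sketch: filter the Glass card feed based on a detected
# activity, in the spirit of the Play Services activity-recognition idea.
# Activity labels and filtering rules here are assumptions, not a real API.

FULL_FEED = ["navigation", "messages", "search", "photos", "news"]
DRIVING_WHITELIST = {"navigation"}

def cards_for_activity(activity, cards=FULL_FEED):
    """Return the subset of cards appropriate for the wearer's activity."""
    if activity == "in_vehicle":
        # A driver sees only turn-by-turn directions, nothing else.
        return [c for c in cards if c in DRIVING_WHITELIST]
    return list(cards)

print(cards_for_activity("in_vehicle"))  # ['navigation']
print(cards_for_activity("walking"))    # the full feed
```

The interesting design question is where that filtering lives – on the device, or server-side before cards are even pushed.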
Navigation is further complicated by the sensors available on the device – standard GPS, an accelerometer and a gyroscope, just like you’ll find on a phone. If you’ve played a game like Ingress (itself a prime candidate for a Glass upgrade, if you ask fans), you’ll know how fallible GPS can be if you happen to wander near the shadow of a tall building. It makes me wonder how accurate apps that purport to bring us AR-style overlays of building schematics and features in a room can really be.
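To put rough numbers on that worry: a horizontal GPS error of e metres shifts an overlay aimed at a target d metres away by an angle of roughly atan(e/d). This is my own back-of-the-envelope arithmetic, not anything from Glass documentation:

```python
import math

def overlay_error_degrees(gps_error_m, target_distance_m):
    """Angular misalignment of an AR overlay caused by GPS position error."""
    return math.degrees(math.atan2(gps_error_m, target_distance_m))

# A typical 10 m urban-canyon GPS error, against a doorway 20 m away:
print(round(overlay_error_degrees(10, 20), 1))  # 26.6 degrees
```

Given that the Glass display reportedly covers only a dozen or so degrees of the wearer’s view, a misalignment that size puts the overlay nowhere near the real-world feature it’s meant to label.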
A long-tap takes you into “OK Glass” mode, in which you can give voice commands to the unit, like “Google Safety Culture” or “take a picture”. Given the noise around the conference centre, this didn’t work so well – I had to ask it twice, but the second time was successful and showed recent news headlines about SafetyCulture.
Surprisingly, scrolling through search results can be fairly time-consuming – you’ll only see one result at a time, and you’ll need to move your finger on the touchpad to move between results on separate cards. It’s difficult to see how this would be presented any differently given the Glass UI’s presentation style, and it also means that those troublemakers who’ve promised to yell gross-out search queries near people wearing Glass will probably find their attempts at mischief going unnoticed.
I noticed a quirk of the user interface when scrolling through search result cards. I tapped, and long-tapped, on a result but nothing seemed to happen – at least, not on the Glass display. It’s possible that the page was opening on Luke’s phone (which was paired with the Glass unit), but it didn’t seem to be doing anything at a glance either. Maybe opening search results is Not Yet Implemented…
What started as a quick meet-and-greet quickly turned into a show-and-tell for a growing crowd as conference attendees noticed the device that was being shown off.
By the time I departed, there was a small circle of developers gathered around Luke and a lot of excited developers discussing the possibilities. I ran into Luke again later in the night and he told me he’d had to put the device away at the conference’s after-party due to the amount of attention he’d attracted.
My brief experience with Glass was a mixed bag. On the one hand it’s nice to play with new technology, but I also found interacting with the unit a little troublesome – while you see a 2-dimensional UI presented on the X/Y plane in front of your eye, your finger moves along the touchpad on the Z-axis. It’s easy to assume left = forward and right = back, but controlling full-screen Android apps (which we know Glass units have been hacked to run) seems like it’ll be a bit difficult.
I find myself agreeing with some recent sentiments that the Glass UI is perfectly suited to a smartwatch. At least a watch only requires you to raise your hand a little to interact with it, and it would have a more traditional touch interface.
With such fervour and excitement around the appearance of a developer device at a conference for a completely different platform, it seems high time for Google to share Glass with the rest of the world, instead of only its US-based developers.