Filming and cooking with Google Glass

Have you ever cooked a new recipe from a phone or tablet? And did you wash your hands a million times to check the steps or ingredients again, because you didn’t want to get egg or garlic on the screen? Although online recipes and videos are super-helpful in the kitchen, sometimes our devices get in the way of a smooth experience.

That’s why we were keen to experiment with cooking with Google Glass and see whether it eases the process, both for cooking and for shooting video. Oana Grecea’s website ReteteTV targets busy people who want new ideas in the kitchen, which is why she is also curious to see whether Glass is the future of cooking.

Compared to my first brief trial with Glass, I discovered more uses for it and was most impressed by the “take photo by winking” feature (how does it know???!). I also found a very useful filming tool: the device Glass connects to can act as a remote monitor, mirroring what Glass sees – just like on movie sets. We needed it because Glass’s lens actually sees somewhere above the eyeline and to the right of the head. While that’s a good placement for the lens and doesn’t distract you, it also means the shot is not exactly your point of view. So until we figured out the correct position on Oana’s face, the footage from her POV was very badly composed, and we used the monitor to check whether the food was in the centre of the frame. If it wasn’t, I would push her head slightly forward, as if I were operating a human camera 🙂

Something we didn’t appreciate, however, was the default video length of 10 seconds, which meant Oana had to shout “extend video” every now and again and sometimes missed a shot because she was concentrating on the cooking. Luckily we didn’t rely on Glass alone to capture everything!

Anyway, it’s still early days for Glass, and although my first encounter with it was disappointing, I saw more potential the second time and was surprised by my own enthusiasm. Right now I wouldn’t recommend it for shooting recipes, as Jamie Oliver tried, because DSLRs offer far better quality and are easier to handle. Battery life, storage and the unpleasant obligation to share to Google+ don’t help either :). But this whole programme of early Glass testers will gather a huge amount of insight into how people will use it and how Google should cater for them. For instance, there’s an app being developed that guides you through a recipe with no need to constantly wash your hands and touch a screen to scroll (see the rough sketch below). For shooting, however, I would only use Glass if the lens could be angled closer to where the eye actually points, creating a real feeling of POV and giving you more control over what you shoot or photograph. That’s not just a cooking problem: filming at a concert, you could easily miss the stage because the lens aims too high.
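
Out of curiosity, here is a rough idea of how such a hands-free helper could work from the developer’s side. This is only a minimal sketch using Google’s Mirror API for Glassware, not the actual app mentioned above (whose inner workings I don’t know): it pushes each step of a recipe to the Glass timeline as a card that Glass can read aloud, so the instructions sit in front of your eye instead of on a screen you’d have to touch with messy hands. The push_recipe_to_glass function, the authorised HTTP object and the recipe text are all my own illustrative assumptions.

```python
from googleapiclient.discovery import build

# Recipe steps are made up purely for illustration.
RECIPE_STEPS = [
    "Step 1: Crack three eggs into a bowl and whisk.",
    "Step 2: Crush a clove of garlic and warm it in the pan.",
    "Step 3: Pour in the eggs and cook on low heat for five minutes.",
]


def push_recipe_to_glass(authorized_http):
    """Push each recipe step to the wearer's timeline as a card Glass can read aloud.

    `authorized_http` is assumed to be an httplib2.Http object already
    authorised for the https://www.googleapis.com/auth/glass.timeline scope.
    """
    mirror = build("mirror", "v1", http=authorized_http)
    for step in RECIPE_STEPS:
        card = {
            "text": step,                   # shown on the Glass display
            "speakableText": step,          # spoken when "Read aloud" is chosen
            "menuItems": [
                {"action": "READ_ALOUD"},   # built-in Mirror API menu action
                {"action": "DELETE"},
            ],
            "notification": {"level": "DEFAULT"},  # a chime so the cook notices
        }
        mirror.timeline().insert(body=card).execute()
```

Moving between cards is still a swipe on the touchpad at your temple rather than fully voice-driven, but at least the phone and tablet stay clean.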

How would you use Google Glass, and what kind of apps would you want to see on it?