One of our Android developers (let's keep his name a secret) was eating his third cheeseburger while reading something on his phone. The burger was mega juicy, and when he needed to swipe to the next page, he was stuck! Think about how many times a day you need to swipe and eat at the same time, and keep your hands clean too.
Ta-da, and here comes the initial (brilliant) idea: an IoT project that supports touchless navigation. We even had a name for it: Handlessapp (sounds creepy, but you get the main idea).
We liked the idea, but we had no hardware background at the time... and then came Google I/O 2018, the announcement of Android Things, and the giveaway of a Starter Kit.
We made the first version of our device sense the user's gestures and navigate through the app, using APIs to control hardware peripherals. To interact with the content, i.e. to move between pages in the newsfeed, the user waves a hand up, down, left, or right.
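To give a feel for the idea, here is a minimal sketch of how detected gestures could be mapped to newsfeed navigation actions. The enum names and the `toNavAction` function are hypothetical, and the actual sensor driver and UI wiring are left out; this only shows the gesture-to-action mapping layer.

```kotlin
// Hypothetical sketch: map a detected hand gesture to a navigation
// action, keeping the mapping separate from the sensor driver and UI.
enum class Gesture { UP, DOWN, LEFT, RIGHT }

enum class NavAction { SCROLL_UP, SCROLL_DOWN, PREVIOUS_PAGE, NEXT_PAGE }

// Translate a raw gesture into the navigation action the app performs.
fun toNavAction(gesture: Gesture): NavAction = when (gesture) {
    Gesture.UP -> NavAction.SCROLL_UP
    Gesture.DOWN -> NavAction.SCROLL_DOWN
    Gesture.LEFT -> NavAction.PREVIOUS_PAGE
    Gesture.RIGHT -> NavAction.NEXT_PAGE
}

// Usage: toNavAction(Gesture.RIGHT) moves to the next page in the feed.
```

Keeping the mapping in a pure function like this makes it easy to unit-test without any hardware attached.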
We went further in brainstorming, wondering where on Earth this app could be 'at hand.'
Here are some checkpoints:
- in the bathroom, when you brush your teeth and check the news or a weather forecast with no wet fingers on your screen;
- in the kitchen: no dirty hands while cooking and checking a recipe;
- repairing your car;
- doing manicure;
- playing the piano.
OK, kidding aside, this IoT app can help people recovering from surgery or a heart attack with basic newsfeed navigation. We could go on for ages, so let's stop here. For now.