Beyond Logo Menu

2016 trends series: Non-traditional UIs are changing interaction design as we know it

Written by: Alex Hunting

Published on: March 4, 2016

We have collated 10 trends that we think are reshaping digital in 2016. This is the first instalment of our series, which looks at what is happening now in Q1 whilst identifying trends to look out for over the rest of the year.

The digital industry is fond of coining new words and terms, often for things that aren’t so new when you scratch the surface. But it is also an industry where rethinking the status quo is a constant.

You may already be familiar with many of the trends we highlight in this series, but the cycle of innovation and iteration that our industry is known for means existing concepts can suddenly find radically new trajectories. In this series, we highlight ten ideas that are gaining rapid traction today, and which will reshape how we digitally interact with the world around us.

To help you organize your priorities for the trends in this series, we have sorted them as “trends to prepare for now” and “trends to start thinking about”.

We’ll kick off the first in the series with a trend to start thinking about:

Non-traditional UIs are changing interaction design as we know it

We are moving further away from traditional input and display modes for information.

Already, people are bypassing apps and completing tasks through interaction with notifications – especially with the rise in popularity of wearable devices. Increased innovation around voice and gesture recognition will continue to render traditional UIs even less important in app design as users complete tasks on their mobiles without even picking them up.


At WWDC 2015, Apple claimed that Siri currently serves up 1 billion requests a week (almost a tenth of the mobile searches Google is estimated to be serving) and is 40% faster and more accurate than last year.

In 2015 Google introduced Project Soli, a new interaction sensor which tracks hand gestures using radar technology.

The sensor can track sub-millimetre motions at high speed and accuracy. This means that once the chip reaches mass adoption, users could control their phone by simply rubbing their fingertips together in mid-air.

Brands will need to start thinking about how they can adapt their online and app-based services to ensure consumer interactions are simplified through a more invisible, yet intuitive, UI layer. Getting this user experience right will be particularly important as more and more notifications compete for our attention.

You can find the whole series here: