natural user interface (NUI)
What is a natural user interface (NUI)?
A natural user interface -- or NUI -- is a user interface designed to feel as natural and intuitive to use as possible. The goal of a natural user interface is to reduce the cognitive load of human-computer interaction.
NUIs aim to remove as many artificial controls as possible so that interacting with a digital or electronic device resembles interacting with the real world. They let people work with computers through intuitive actions such as touch, gesture and handwriting, much as they would handle an object in physical space. NUIs seek to provide an experience users can pick up without first learning an artificial skill, such as typing on a keyboard or programming.
An NUI requires less understanding of how the underlying system works; users rely on skills they already have to interact with it.
What's natural depends on the user's context, so it's difficult to create a natural user interface that feels natural to every user.
Features of a natural user interface
Natural user interfaces have a few defining features, including the following:
- Targets the user's natural abilities. A natural user interface should build on abilities that come naturally to the user, such as speaking, touching and gesturing.
- Shallow learning curve. Users should not have to learn a lot about a system to use it. Interaction with the system should be intuitive.
- Minimalism. It's common for natural user interfaces to employ minimal design to reduce cognitive load and make the interface easier to use.
- Input recognition. NUIs process input through a variety of sensors and other recognition technologies, such as cameras, microphones and touchscreens; a simple sketch of this follows the list.
- Immediate feedback. NUIs provide immediate feedback on user actions to make the experience feel responsive and natural. Feedback can be visual, auditory or haptic.
- Adaptation. Many NUIs can learn from user input and adapt the user experience.
- Multimodal. NUIs support multiple modes of interaction simultaneously, such as voice, touch and gesture.
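The sketch below illustrates input recognition and immediate feedback in a hypothetical touch-driven web app. It uses standard browser APIs (Pointer Events and the Vibration API); the element ID, the CSS class and the feedback timings are illustrative assumptions, not part of any particular NUI product.

```typescript
// Minimal sketch: recognize a touch/click and give immediate visual and
// haptic feedback. Assumes a browser environment; the "touch-surface"
// element, "pressed" CSS class and 20 ms vibration are made-up examples.
const surface = document.getElementById("touch-surface");

surface?.addEventListener("pointerdown", (event: PointerEvent) => {
  // Visual feedback: briefly highlight the touched element.
  surface.classList.add("pressed");
  setTimeout(() => surface.classList.remove("pressed"), 150);

  // Haptic feedback on devices that support the Vibration API.
  if ("vibrate" in navigator) {
    navigator.vibrate(20);
  }

  // In a real NUI, the recognized input would now be mapped to an action.
  console.log(`Input recognized at (${event.clientX}, ${event.clientY})`);
});
```

The same pattern extends to other modalities: each sensor feeds a recognizer, and every recognized input triggers prompt feedback so the interaction feels direct.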
Command-line interface vs. graphical user interface vs. natural user interface
All three terms refer to technologies for interacting with a computer. The natural user interface aims to improve on the graphical user interface (GUI) and the command-line interface (CLI).
With a CLI, the user interacts with the computer through text-based commands that follow a specific syntax and must be memorized. There are no graphics or icons, just text and a command line, and input requires an artificial device such as a keyboard. CLIs are commonly used for scripting and automation tasks and require fewer system resources than the other two interface types.
With a GUI, the user operates the computer through graphical elements such as windows, icons, menus and buttons, typically with a keyboard and mouse. Users do not need to memorize strict syntax or codified text inputs, and the interface provides visual feedback beyond plain text output. GUIs require more system resources than CLIs but fewer than NUIs.
With an NUI, the user interacts with the computer through natural actions such as touch, gesture, voice and eye movement. NUIs aim to require little to no instruction before use -- the user should intuitively know how to interact with them. The interface provides multiple forms of feedback on user actions, including visual, auditory and haptic feedback. NUIs often require more system resources than GUIs and CLIs because they rely on various sensors and sensor data to capture user input and translate it into computer commands.
These three user interfaces can be used in conjunction with one another. For example, the Windows operating system has a graphical user interface with buttons, icons and windows. It also includes a program called Command Prompt that opens a command-line interface, which the user can use to run programs, create and delete files, and navigate through folders. macOS has a similar program called Terminal.
Android and iOS have elements of both graphical and natural user interfaces. The user navigates menus and icons as in a GUI but does so by swiping, tapping the screen or speaking to the device -- all features of NUIs.
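As a rough sketch of how a swipe becomes a command, the snippet below measures horizontal finger travel with standard browser Touch Events and maps it to navigation. The 50-pixel threshold and the goToNextScreen/goToPreviousScreen handlers are illustrative assumptions.

```typescript
// Sketch: translate a horizontal swipe into a navigation command.
// Assumes a browser with Touch Events; the threshold and handlers are made up.
let startX = 0;

function goToNextScreen(): void { console.log("next screen"); }
function goToPreviousScreen(): void { console.log("previous screen"); }

document.addEventListener("touchstart", (e: TouchEvent) => {
  startX = e.touches[0].clientX; // where the finger first lands
});

document.addEventListener("touchend", (e: TouchEvent) => {
  const deltaX = e.changedTouches[0].clientX - startX;
  if (deltaX < -50) {
    goToNextScreen(); // swipe left
  } else if (deltaX > 50) {
    goToPreviousScreen(); // swipe right
  }
  // Movements under the threshold are treated as taps, not swipes.
});
```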
Examples of natural user interfaces
Natural user interfaces are used in a variety of applications and devices. Examples of technology used in natural user interfaces include the following:
- Touch screens. Touch screens let users interact with smartphones and tablets more intuitively than a keyboard-and-mouse GUI does. Users can swipe, tap and pinch the screen to navigate, zoom and otherwise interact with a device.
- Virtual assistants. A virtual assistant -- such as Amazon Alexa or Apple's Siri -- lets users communicate with applications by speaking to them, and it can use input from past conversations to improve future interactions. Natural language applications such as ChatGPT also offer voice options that let users interact by speaking instead of typing (see the voice input sketch after this list). The Ai Pin by Humane is another example of an assistant device that employs a natural user interface.
- AR/VR. Augmented and virtual reality technologies use natural user interfaces. For example, Microsoft Kinect lets users control video games with spatial gestures instead of a controller. The AR game Pokemon GO lets users point their phone at a spot in physical space to make a Pokemon appear on the screen, then capture it with a swiping motion. Headsets such as the Apple Vision Pro use eye tracking to improve the user experience.
- Facial recognition. Facial recognition is used in multiple technologies to improve user experience. For example, it can be used to authenticate users unlocking their smartphones.
- Brain-spine interface. A brain-spine interface helps restore electrical signaling between the brain and other parts of the body. Onward is one example of a company that develops brain-spine interfaces to help paraplegic patients regain the ability to walk by reconnecting the brain and legs. The user activates the program via a touchscreen on a tablet and then performs physical therapy exercises to recover the ability to walk with the help of the device.
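For the virtual assistant example above, the following sketch shows the kind of voice input an NUI relies on. It assumes a browser that exposes the Web Speech API (in some browsers SpeechRecognition is only available with a webkit prefix); the "lights on" command handling is an illustrative assumption rather than any real assistant's API.

```typescript
// Sketch: capture a spoken phrase with the Web Speech API and react to it.
// Browser support varies; the command handling below is a made-up example.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

if (SpeechRecognitionImpl) {
  const recognition = new SpeechRecognitionImpl();
  recognition.lang = "en-US";

  recognition.onresult = (event: any) => {
    const transcript: string = event.results[0][0].transcript;
    console.log(`Heard: ${transcript}`);
    if (transcript.toLowerCase().includes("lights on")) {
      console.log("Turning the lights on (placeholder action)");
    }
  };

  recognition.start(); // starts listening through the microphone
}
```

A production assistant would layer wake-word detection, natural language understanding and spoken responses on top of this raw speech-to-text step.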