Haptic feedback is a powerful way to enhance physical interactions, but it can also be challenging to grasp. This tutorial will help you get started in the field by introducing the basics of human perception, common terminology and technology. It is the first part of the hapticlabs.io platform, providing accessible knowledge and tools for anyone interested in the topic. Please don't hesitate to → reach out, share your thoughts and questions, and make sure to sign up to the newsletter to stay up to date.
Let's start from the basics: in almost all interactions we have with physical objects, not just one but multiple senses are involved. This is referred to as a multi-modal experience, where each modality corresponds to a human sense, allowing us to perceive, overlay and interpret all the information we need within a split second. In the vast majority of cases, three modalities are involved: sight for visual feedback, hearing for auditory feedback and touch for haptic feedback.
In our daily lives, we are often guided by visual feedback through form, colour or displays, followed by auditory feedback from smartphones and speakers. We are usually oblivious to haptic feedback and only recognise its absence when our sensory system is somehow impaired: When closing a zipper with cold hands or searching for a light switch in the dark.
Haptic feedback describes all the information we receive from things in skin contact with our body. A first important fact to remember is that the field of 'haptics' contains two very different areas: tactile and kinaesthetic perception. Imagine holding a cup of hot coffee: you can feel the weight and judge if you need a refill, the size and shape of the cup, and its position, orientation and movement in relation to your body. This feedback is based on what is called Kinaesthetic perception and allows you to take a sip even in complete darkness.
It relies on receptors sitting inside your muscles, joints and tendons (though receptors in the skin help out as well). They monitor, for example, the angle of a joint, whether your finger is currently bent or straight, or how tense your muscles are.
Revisiting the cup example, you can also recognise the texture of a rough ceramic, a vibration if somebody is giving you a refill and you know how much force you need to put into the grip to prevent the cup from slipping through your fingers. In addition, you also feel the pressure and the location of the handle resting on your finger.
All of this information is part of the Tactile perception based on receptors located in your skin.
You can also judge the temperature, and if it passes 40°C you will definitely feel pain arising. Although temperature and pain are part of tactile perception as well, both are neglected in the following due to their few application areas.
The example of the cup falls into an area referred to as passive haptic feedback. It is based on physical attributes of objects such as texture, weight and rigidity, or on analog mechanics such as a push button or a lever. To experience it, we need to actively explore the objects, for example by running a finger over the surface, lifting them in the air or giving them a squeeze.
Nowadays, we experience more and more active haptic feedback produced by electric components, such as the rumble of motors or an electric toothbrush's buzzing. Here it is enough to be in physical skin contact with the object to perceive the feedback.
Lately, the worlds of active and passive feedback have also begun to blend, following the introduction of new technologies like adjustable surface textures or changing rigidity.
Next, let's have a look at how active feedback is integrated into applications.
Even though we often talk about haptic feedback, what we are actually referring to is tactile feedback. It involves any feedback that is perceived through our skin such as the well-known vibration from our smartphones or a smartwatch.
Active tactile feedback can be induced not only through vibrations, but also through other physical manipulations of the skin tissue. Think about tapping someone on the shoulder (indentation), running with a finger along the cheek (movement) or slightly pinching your forearm (skin stretch).
Force feedback displays use kinaesthetic perception and can be divided into resistive feedback (limiting the user's movement) and active feedback (supporting/guiding the user's movement). Examples are motorised sliders or steering wheels for gaming. Kinaesthetic feedback only recently became more prominent through VR glove applications, due to its high complexity in terms of mechanics and control systems. Depending on the actuating force, tactile displays can sometimes also induce kinaesthetic feedback and vice versa.
Having covered the basic terminology, we can now take a look at the underlying physiology. First of all, haptic perception is not equally distributed across the body. While our face, hands and feet in particular are able to recognise fine facets, it is a completely different story for the back or our limbs. Different body parts vary in sensitivity and in their ability to perceive certain tactile sensations. The → Two-point discrimination test is a good starting point if you want to learn more about it.
Not only tactile, but also kinaesthetic resolution differs from joint to joint. Our shoulders have a much higher resolution, as even the slightest movement has a big impact on the hand's position.
The animal world sometimes has a completely different take on the topic. Welcome to your new favourite species: The → Star-nosed mole
The varying perception across our skin is due to the varying number of receptors. First of all, we need to distinguish between hairy and non-hairy (glabrous) skin areas. The latter has a much higher density of receptors and is therefore more sensitive to tactile feedback in general. This does not mean that one should not consider the calf or the top of the head as a placement, but it will certainly result in a different sensation compared to glabrous areas.
Another big factor is the amount of tissue penetrated by the feedback. For example, this influences whether the feedback is transmitted to the bones, which generally results in an uncomfortable sensation or, in the worst case, is even audible when placed on the skull or neck area (bone conduction).
The type of receptor responsible for tactile perception is called a mechanoreceptor. There are four types, which differ in depth and in the input they recognise: Meissner corpuscles, Ruffini endings, Merkel discs and Pacinian corpuscles.
Let's put it into action: take the back end of a pen and move it along your arm without applying pressure. During the movement you can easily tell its exact position.
Now let the pen rest. After a few seconds you will have a harder time locating its exact position, or might not even be able to feel it anymore. Give it a try with your eyes closed to emphasise the tactile perception. What you just discovered is the adaptation rate. Next, put some pressure on the pen and wait a few seconds. This time the feeling will not subside, but stay the same. This is because different receptors are triggered at different depths.
Repeat the last step of varying the pressure and focus on how far the feedback spreads. It is highly focused with little pressure, but the stronger you push, the more it spreads. This is called the receptive field and again differs depending on the receptors being triggered.
Let's talk about how we can trigger these receptors besides using pens. The components producing the active feedback are called haptic actuators. The most common types are electro-magnetic: inside, a magnetic mass is accelerated in a linear or circular motion using electricity. This movement results in a vibration or an impulse and is generally called vibrotactile feedback. New technologies arise almost every day. Ultrasonic waves, piezoelectric actuators, shape-shifting materials or pneumatics, just to name a few, offer a new world of opportunities, but most are still far from commercial application.
The most commonly used type of vibrotactile actuator is called a Linear Resonant Actuator (LRA) and is found in almost all phones, smart textiles and vibrating objects.
Inside sits a coil inducing a magnetic field, pushing a magnet against a spring. By rapidly reversing the polarity of the coil, the magnet is accelerated into an oscillating linear motion, resulting in the characteristic high-frequency vibration.
One thing to remember is that LRAs are tuned to the resonance frequency of the internal spring (usually around 200 Hz). At this frequency the actuator consumes the least energy while producing the highest force. Outside of this narrow range, the output rapidly declines.
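Since an LRA is essentially a mass-spring system, its resonance frequency can be estimated with the textbook formula f = (1/2π)·√(k/m). The sketch below is purely illustrative; the stiffness and mass values are made-up example numbers, not the specs of any real actuator.

```python
import math

def resonance_frequency_hz(stiffness_n_per_m: float, moving_mass_kg: float) -> float:
    """Resonance frequency of an ideal mass-spring system in Hz."""
    return math.sqrt(stiffness_n_per_m / moving_mass_kg) / (2 * math.pi)

# A ~1 g moving mass with a ~1580 N/m spring (hypothetical values)
# lands near the typical 200 Hz:
f = resonance_frequency_hz(stiffness_n_per_m=1580.0, moving_mass_kg=0.001)
print(round(f))  # → 200
```

This also shows why the resonance is fixed at the factory: changing it would require a different spring or moving mass.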
To learn more about different actuator types stay tuned for the actuator library coming soon.
The last part missing in our setup is the driving signal. Let's stay with the classic LRA for now, but the same idea also applies to other vibrotactile actuators.
First of all, LRAs, and in fact most actuators, are AC-driven, so you can't just attach them to a battery or directly to a microcontroller. The second aspect is that the input signal is linked to the characteristics an actuator can produce. Makes sense, right? Let's take a closer look: the haptic driver is a component that creates a signal that is optimal for the actuator and copes with its mechanical restrictions. To produce a sharp 'click', for example, it amplifies the input signal at the beginning (called overdrive) and reverses the polarity at the end to stop the vibration abruptly (called braking). Take a close look at the animation showcasing a driving signal with and without overdrive and braking.
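To make the overdrive and braking idea concrete, here is a minimal sketch of such a driving signal: a sine burst at the actuator's resonance frequency with a boosted start and a polarity-reversed tail. The segment lengths, gain and sample rate are illustrative assumptions, not the behaviour of any particular driver chip.

```python
import math

def drive_signal(freq_hz=200.0, sample_rate=8000, overdrive_cycles=2,
                 steady_cycles=6, brake_cycles=2, overdrive_gain=1.5):
    """Sine burst with an overdriven start and a polarity-reversed braking tail."""
    samples_per_cycle = int(sample_rate / freq_hz)
    total_cycles = overdrive_cycles + steady_cycles + brake_cycles
    signal = []
    for n in range(total_cycles * samples_per_cycle):
        cycle = n // samples_per_cycle
        s = math.sin(2 * math.pi * freq_hz * n / sample_rate)
        if cycle < overdrive_cycles:
            s *= overdrive_gain   # overdrive: boosted amplitude to spin up the mass quickly
        elif cycle >= overdrive_cycles + steady_cycles:
            s = -s                # braking: reversed polarity to stop the vibration abruptly
        signal.append(s)
    return signal

sig = drive_signal()
```

Plotting `sig` reproduces the shape of the animation: a tall leading burst, a steady middle section and an inverted tail.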
Without decent expertise in coding, you are pretty much limited to the integrated presets of available drivers, which give you a taste of what can generally be achieved.
Nonetheless, it is worthwhile to know what happens underneath. If we zoom in on the input signal, we can spot a waveform. Three parameters shape the overall feedback characteristics: the frequency influences how fast it is vibrating, the intensity relates to the amplitude, or how strong the feedback is, and the duration determines how long the vibration takes place. To further shape the feedback, dynamic parameters are added, such as fading in the intensity or changing it over time to create a pulsating effect. The intensity in particular is heavily influenced by the physical attributes of the actuator, such as its weight and size, but also by its implementation into the object.
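These three static parameters plus a time-varying envelope can be sketched as follows. The function names and values are hypothetical, purely to illustrate how a fade-in or pulsating effect modulates the intensity over time.

```python
import math

def vibration(freq_hz, intensity, duration_s, sample_rate=8000, envelope=None):
    """Sine vibration defined by frequency, intensity and duration,
    optionally shaped by a time-varying envelope function."""
    n_samples = int(duration_s * sample_rate)
    out = []
    for n in range(n_samples):
        t = n / sample_rate
        env = envelope(t) if envelope else 1.0   # dynamic parameter: scales the amplitude over time
        out.append(intensity * env * math.sin(2 * math.pi * freq_hz * t))
    return out

# Fade-in: intensity ramps up over the first 0.1 s
fade_in = lambda t: min(t / 0.1, 1.0)
# Pulsating: a slow 5 Hz modulation of the intensity
pulse = lambda t: 0.5 * (1 + math.sin(2 * math.pi * 5 * t))

buzz = vibration(freq_hz=200, intensity=0.8, duration_s=0.5, envelope=pulse)
```

Swapping `pulse` for `fade_in` (or `None`) changes the character of the feedback without touching frequency or duration.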
We are currently working on a prototyping toolkit to offer full customisability down to the details. Take a look at the → Toolkit vision
One important fact left out until now is that actuators are always integrated into, and mostly enclosed inside, an object. The material, how the actuator is fastened to it (glued/screwed/press-fit) and the environment it is used in all have a direct impact on the perceived characteristics. For now, the most important thing to remember when designing haptic feedback is to build prototypes and evaluate them as early and as often as possible, in multiple iterations. Every object requires a tailored interaction: just because something worked nicely in a rubber bracelet does not mean it will be usable in a hardshell helmet.
More on this topic will follow soon.
You made it!
Congrats! This was a brief glimpse at the world of haptic feedback and hopefully prepared you to start your own journey. Make sure to sign up to the newsletter and come back later to discover new additions to the platform. Feel free to reach out for any feedback, comments or questions. Let's stay in → touch!