You’ve seen them everywhere: the grocery store self-checkout, the signing pad at the bank, the screen of a smartphone. But how do they work? And why is it so difficult to make a signature look like it does on paper?
Most smartphones use capacitive screens. They’re effective, they’re accurate, and they’re relatively simple. The only real drawback is that they don’t work with things that won’t conduct – gloves need special conductive fabric at the fingertips if you’re looking to text without freezing your thumbs, and if you want a stylus for a capacitive screen, you need to hunt down one advertised as such.
Resistive touchscreens are commonly used in hospitals and restaurants because they’re very resistant to contaminants and can be cleaned easily. They also don’t rely on a finger to conduct charge, so they can be used with a stylus or gloved hands. It’s a win! The only downside is that they aren’t as accurate as capacitive touchscreens, but most UI designers understand this and factor it into their software design.
Infrared displays are the most sensitive, as anything opaque can trigger them. Essentially, a grid of infrared LEDs shoots beams across the screen to detectors (“eyes”) on the opposite edge, and when a detector notices the light has stopped coming in, the machine knows roughly where the interruption is along that axis. Repeat the process vertically and it can pinpoint where the interruption is! Kind of like those old spy movies where someone has to make it through a mess of lasers to get to a diamond.
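The beam-grid logic can be sketched in a few lines. This is a minimal illustration, not real driver code – assume a hypothetical screen that simply reports which numbered beams went dark on each axis:

```python
def locate_touch(blocked_cols, blocked_rows):
    """Estimate a touch point from interrupted infrared beams.

    blocked_cols: indices of vertical beams whose detectors went dark
    blocked_rows: indices of horizontal beams whose detectors went dark
    Returns (x, y) as the center of the blocked region, or None.
    """
    if not blocked_cols or not blocked_rows:
        return None  # need an interruption on both axes to pinpoint
    x = sum(blocked_cols) / len(blocked_cols)
    y = sum(blocked_rows) / len(blocked_rows)
    return (x, y)

# A fingertip wide enough to block columns 4 and 5, and row 7:
print(locate_touch([4, 5], [7]))  # -> (4.5, 7.0)
```

Note that averaging the blocked beams is also why a snowflake or a dust clump works just as well as a finger: the grid only knows that something opaque is in the way.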
Downsides to infrared displays include dust and snow, which can block the beams themselves and register as touches that aren’t your finger.
There is more than one kind of acoustic touchscreen. One kind constantly sends an acoustic pulse, while the other only sends one once it’s triggered by contact with the screen. The constantly-sending kind is more vulnerable to environmental factors, such as snow landing on the screen, while the touch-first kind loses detection if the thing it’s detecting stops moving – it fixes the snow problem, but can be its own issue.
Drawing pads have been around for much longer than you might think – they actually predated the touchscreen! Also called “Graphics Tablets”, they were commonly used for Computer-Aided Design (or CAD) programs because they were more accurate than mice, and some kinds that included pucks (a specialized type of mouse) could also register things like absolute location and rotation. If you pick a regular mouse up and then set it back down without letting its optical eye see the surface, it won’t know where it’s been moved to. A puck picked up and set back down on a graphics tablet will!
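The difference boils down to relative versus absolute positioning, which can be sketched like this (the class and method names here are made up for illustration):

```python
class RelativeMouse:
    """A mouse reports only movement deltas; the computer accumulates
    them into a position. Lifting the mouse sends no event at all,
    so the cursor stays wherever it was."""
    def __init__(self):
        self.x, self.y = 0, 0

    def move(self, dx, dy):
        self.x += dx
        self.y += dy


class TabletPuck:
    """A graphics-tablet puck reports absolute tablet coordinates
    (and, on some models, rotation) every time it touches down."""
    def __init__(self):
        self.x, self.y, self.angle = 0, 0, 0.0

    def put_down(self, x, y, angle=0.0):
        # Position jumps straight to wherever the puck landed.
        self.x, self.y, self.angle = x, y, angle
```

A mouse lifted and replaced keeps its old accumulated position; a puck put down on the far corner of the tablet immediately reports the far corner.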
Signatures and More
Back before chip cards were considered reliable enough to skip the signature, signing on a grocery store touch-pad was a mess. Even if you compared that scrawl to your actual signature, surely it would be worthless, given how little it matches the one on paper? Right? Realistically, what is the bank going to do with a track record of inconsistent signatures on receipts?
The screen simply isn’t sensing enough of the contact – it was never designed to make a perfect copy of your signature. The sensors would require inhumanly precise control of stylus pressure as you wrote for the screen to pick up every input exactly as you meant it. The software is generally designed with jabbing and button-pushing in mind, and resistive screens are just naturally worse at smooth, flowing movements than capacitive ones.
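The pressure problem is easy to see in miniature. Here’s a toy sketch, assuming a made-up normalized pressure scale and an illustrative threshold value, of why a light mid-stroke moment leaves a gap in the captured signature:

```python
# Hypothetical minimum pressure a worn resistive panel will register.
PRESSURE_THRESHOLD = 0.35  # normalized 0..1; illustrative value only

def register_stroke(samples):
    """Keep only the samples pressed hard enough for the panel to sense.

    samples: list of (x, y, pressure) tuples from one signature stroke.
    Returns the (x, y) points the screen actually records.
    """
    return [(x, y) for (x, y, p) in samples if p >= PRESSURE_THRESHOLD]

# A stroke where the writer eased off in the middle, as pens naturally do:
stroke = [(0, 0, 0.9), (1, 1, 0.2), (2, 1, 0.8)]
print(register_stroke(stroke))  # -> [(0, 0), (2, 1)]
```

The light middle sample vanishes, so a smooth curve on paper comes out as disconnected fragments on screen – exactly the jagged receipt-pad signature everyone recognizes.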
Signing with fingers is slightly better, since it’s usually a capacitive screen and not a resistive one – but few people get to practice writing with their fingertips outside of the screen itself, so it also comes out wrong.
One Square Senses, Another Doesn’t
Possibly one of the most frustrating experiences with a touch screen is touching one square only to have a spot a half-inch to the left register the contact. You end up pressing four when you meant to press six! Even worse is when you try to compensate, but the distortion is so great that you’d need to touch an area off-screen to get it to do what you want it to. Magical.
What’s the issue?
Calibration. Even when the touch sensor and the display are physically separate layers, good software lets the manufacturer calibrate them after assembly: tell the computer that an input here means selecting “X”. Even if the touch input isn’t directly above “X”, calibration tells the software to behave as though it is. Bad calibration, or general wear and tear, can gradually muddy the software’s input recognition, and you get drift.
Press Harder (Than Needed)
Another common issue, especially with the resistive screens mentioned above, is that some screens require you to press harder than you’d like to. This is a common problem: the screen is too worn out to register light touches. Pressing too hard makes it worse and gradually damages the display. Another facet of this problem is that other types of screens don’t respond better to excess pressure either – an infrared screen can’t suddenly see your finger any better if you push harder. An acoustic screen might register new touches around your finger as you create more points of contact, but pushing so hard that the plastic deforms is a sure way to make the issue permanent.
Why does it happen? The short answer is entropy. The long answer is that these screens gradually wear out, and manufacturers want to stave that off as long as possible. All of these elements are extremely thin and lightweight – it’s in their design. The thicker the outer plastic, the less responsive the screen, which means users press harder. But thinner plastic may mean that frustrated users, or users accustomed to using too much force elsewhere, overdo it and break the system immediately. Designers are forced to choose between ease of use and longevity, and it’s a difficult choice!