The Evolution of Touch Screens: How We Got from Styluses to Fingerprint Sensors
What’s the deal with touch screens? For anyone who used computers in the ‘90s, they’re incredibly different from what we used to have to deal with. And if you use modern phones, the difference between those and traditional computers might be even more stark. What changed? Why are touch screens so popular now? Let’s look at the history of touch screens and see how we got from styluses to fingerprint sensors!
E-ink
Before we get to touch sensing proper, a quick word about electronic ink, or e-ink, since e-readers are where many people first met a touchable screen. E-ink is a display technology, not a touch technology: the screen is filled with millions of microcapsules containing charged black and white pigment particles, and an electric field pulls one color or the other to the surface. Because the display only draws power when the image changes, an e-reader can last for weeks on a single charge. The trade-off is a slow refresh rate, which is why your smartphone isn't going to come with an e-ink screen anytime soon. Most modern smartphones pair capacitive touch layers with fast LCD or OLED panels instead, and that leaves you with one less thing to worry about when buying a new gadget!
Resistive
The earliest widely deployed touch screens were resistive. A resistive screen stacks two flexible layers, each coated with a resistive conductive film, separated by tiny spacer dots. When a finger or stylus presses down, the layers make contact, and the controller reads the voltage at that point to work out an X and Y coordinate. These screens were simple, cheap, required very little power, and worked with gloves or any stylus, but they could typically register only one touch at a time. (The name comes from the resistive layers, not from any resistance to touch.) In time, resistive touch screens would be replaced by more advanced types.
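To make the voltage-divider idea concrete, here's a minimal sketch of how a 4-wire resistive controller might read a position. The drive() and adc_read() helpers are hypothetical stand-ins for a real microcontroller HAL, and real controllers detect whether the panel is pressed with a separate measurement that this sketch glosses over.

```python
# A minimal sketch of a 4-wire resistive read. drive() and adc_read()
# are hypothetical HAL helpers; the constants are assumptions.

ADC_MAX = 4095          # 12-bit ADC full scale (assumed)
PRESS_THRESHOLD = 100   # naive "is anything touching?" cutoff (simplification)

def drive(pin: str, level: int) -> None:
    """Hypothetical GPIO helper: drive an electrode high (1) or low (0)."""
    ...

def adc_read(pin: str) -> int:
    """Hypothetical ADC helper: sample the voltage on an electrode."""
    ...

def read_touch() -> tuple[float, float] | None:
    # Read X: put a voltage gradient across the X layer, then use the
    # Y layer as a passive probe into the resulting voltage divider.
    drive("X+", 1)
    drive("X-", 0)
    raw_x = adc_read("Y+")

    # Read Y: swap the roles of the two layers.
    drive("Y+", 1)
    drive("Y-", 0)
    raw_y = adc_read("X+")

    # Real controllers check for contact separately (e.g. via a pull-up);
    # here we naively treat near-zero readings as "not pressed".
    if raw_x < PRESS_THRESHOLD or raw_y < PRESS_THRESHOLD:
        return None

    # Normalize to 0.0..1.0 screen coordinates.
    return raw_x / ADC_MAX, raw_y / ADC_MAX
```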
Capacitive
One of the earliest touch screens, built by E.A. Johnson in the 1960s for UK air traffic control, was actually a capacitive design. The original surface capacitive type is not in widespread use anymore, but it paved the way for future designs. It worked by coating a glass panel with a single conductive layer and applying a small voltage from electrodes at each corner. Because the human body conducts electricity, touching the screen draws a tiny current from each corner; by comparing how much current flows from each one, the controller can calculate where the touch happened. As time went on, manufacturers experimented with other ways to build touch screens, such as resistive layers, infrared beams, and acoustic waves. Eventually these devices became affordable enough for consumers, and they started showing up everywhere.
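Here's a toy sketch of that corner-current arithmetic, assuming idealized, already-calibrated current readings (real controllers apply calibration and linearization on top of this).

```python
# A toy illustration of the surface-capacitive idea: the closer the touch
# is to a corner, the more current that corner supplies.

def locate_touch(i_tl: float, i_tr: float, i_bl: float, i_br: float):
    """Estimate a normalized (x, y) from the four corner currents.

    i_tl, i_tr, i_bl, i_br are the currents measured at the top-left,
    top-right, bottom-left, and bottom-right electrodes (made-up inputs
    for this sketch).
    """
    total = i_tl + i_tr + i_bl + i_br
    if total == 0:
        return None  # no touch: no current is being drawn

    # Current drawn through the right-hand corners grows as the finger
    # moves right; likewise for the bottom corners as it moves down.
    x = (i_tr + i_br) / total
    y = (i_bl + i_br) / total
    return x, y

# Example: a touch near the bottom-right corner draws most of its
# current through that electrode.
print(locate_touch(0.1, 0.2, 0.2, 0.5))  # -> roughly (0.7, 0.7)
```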
Infrared
Infrared touch screens work without any special coating on the glass at all. A bezel around the display holds rows of infrared LEDs along two edges and matching photodetectors along the opposite edges, creating an invisible grid of light beams just above the screen. When a finger or stylus breaks a horizontal beam and a vertical beam, the controller reads the intersection as a touch. One of the first personal computers to ship with this technology was HP's HP-150 in 1983. Because the beams sit above the glass, any opaque object works as a pointer, but the approach needs a clear line of sight between the emitters and the sensors at the edges of the screen: dust, insects, or bright sunlight can register as phantom touches. Early infrared screens also appeared on upright desktop monitors, which led to the famous gorilla arm problem: holding your arm outstretched toward the display quickly becomes exhausting, pushing users back to keyboards, voice commands, and other input methods.
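The controller-side logic is pleasantly simple; here's a hedged sketch of it, with made-up beam states standing in for the photodetector readings.

```python
# A small sketch of infrared-grid touch detection. The beam states are
# invented inputs: True means the beam reached its photodetector, False
# means something blocked it.

def find_touch(rows: list[bool], cols: list[bool]):
    """Return the (col, row) of a single touch, or None.

    rows[i] is the horizontal beam at row i; cols[j] is the vertical
    beam at column j. A touch blocks at least one of each.
    """
    blocked_rows = [i for i, clear in enumerate(rows) if not clear]
    blocked_cols = [j for j, clear in enumerate(cols) if not clear]
    if not blocked_rows or not blocked_cols:
        return None

    # A fingertip usually blocks a few adjacent beams; take the center
    # of each blocked run as the touch coordinate.
    row = sum(blocked_rows) / len(blocked_rows)
    col = sum(blocked_cols) / len(blocked_cols)
    return col, row

# Example: a finger blocking column beams 4-5 and row beams 2-3.
cols = [True] * 10
rows = [True] * 8
cols[4] = cols[5] = False
rows[2] = rows[3] = False
print(find_touch(rows, cols))  # -> (4.5, 2.5)
```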
Projected capacitive
Most touch screens you use today are projected capacitive (PCT). A PCT panel etches a grid of transparent row and column electrodes under the cover glass, and the controller continuously measures the capacitance at each row-column intersection. A finger near the surface distorts the local electric field and changes that capacitance, which is how the screen can track several fingers at once. Far from being replaced, PCT is the technology behind virtually every modern smartphone and tablet, and its sensors and controller circuitry have come a long way since its debut.
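Here's a simplified sketch of what one mutual-capacitance scan might look like; the baseline and threshold values are invented, and real controllers add heavy filtering and finger tracking on top.

```python
# A simplified sketch of a mutual-capacitance scan over a PCT grid.

BASELINE = 100.0   # untouched capacitance per node (assumed units)
THRESHOLD = 15.0   # how big a drop counts as a touch (assumed)

def scan_for_touches(frame: list[list[float]]):
    """Return grid coordinates where capacitance dropped below baseline.

    frame[r][c] is the measured mutual capacitance at the intersection
    of row electrode r and column electrode c. A nearby finger shunts
    some of the field away, so the measured value drops.
    """
    touches = []
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if BASELINE - value > THRESHOLD:
                touches.append((r, c))
    return touches

# Two fingers on a 4x4 grid: two nodes read noticeably low.
frame = [[100.0] * 4 for _ in range(4)]
frame[0][1] = 80.0
frame[3][2] = 78.0
print(scan_for_touches(frame))  # -> [(0, 1), (3, 2)]
```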
Surface acoustic wave (SAW)
In addition to capacitive touchscreens, there's another common type known as surface acoustic wave (SAW) technology. Transducers at the edges of a glass panel send ultrasonic waves rippling across its surface, and arrays of tiny reflectors fan those waves out into a grid that receiving transducers pick up on the opposite edges. When you tap or place your finger on the glass, your fingertip absorbs part of the wave, and the controller works out where you touched from where the dip in wave energy falls in the received signal. (A historical footnote: Sam Hurst, a name that comes up often in touch-screen history, actually invented the resistive touch screen around 1971; his company Elographics, later Elo TouchSystems, went on to commercialize several touch technologies, including SAW.) Because SAW needs no coating over the glass, it offers excellent clarity and durability, and today's SAW sensors are very fast, making them ideal for kiosks, ATMs, and other applications that require quick reaction times. The catch: water droplets or grime on the surface also absorb the waves, which can confuse the sensor.
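To illustrate the timing idea, here's a toy sketch that maps a dip in received wave energy to a position along one axis; the wave speed, sample rate, and waveform are all invented numbers.

```python
# A sketch of the SAW idea: the receiver sees a burst of wave energy
# over time, and a touch shows up as a dip. Where the dip falls in time
# tells you where along the axis the finger is.

def locate_dip(samples: list[float], wave_speed: float, dt: float):
    """Map the deepest dip in received wave energy to a position.

    samples: received energy over time for one axis (made-up data).
    wave_speed: how fast the wave travels across the glass (m/s, assumed).
    dt: time between samples (s, assumed).
    """
    quietest = min(range(len(samples)), key=lambda i: samples[i])
    # The portion of the wave that passed the finger arrives at a time
    # proportional to the distance it traveled.
    return quietest * dt * wave_speed

# 1.0 = full energy; the dip near index 6 is the touch.
received = [1.0, 1.0, 0.98, 1.0, 0.99, 0.97, 0.55, 0.96, 1.0, 1.0]
pos = locate_dip(received, wave_speed=3000.0, dt=1e-6)
print(f"touch at about {pos * 100:.1f} cm along this axis")
```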
Inductive coupling
Capacitive screens sense the electrical properties of your finger and resistive screens sense pressure, but inductive coupling senses a special stylus instead. This is the approach behind pen digitizers like Wacom's EMR tablets and the S Pen in Samsung's Galaxy Note line. A grid of coils behind the display generates an electromagnetic field; a resonant circuit inside the pen soaks up energy from that field and radiates a signal back, which the same coils then detect. Because the pen is powered by the field itself, it needs no battery, and because the coupling works at a distance, the screen can track the pen hovering above the glass and even report extras like tip pressure. Manufacturers often layer an ordinary capacitive sensor on top of an inductive digitizer, which is how note-taking devices give you both finger touch and a precise, pressure-sensitive pen.
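Here's a rough sketch of how a digitizer along these lines might turn per-coil signal strengths into a pen coordinate, using a simple weighted centroid; the coil readings and spacing are invented.

```python
# A sketch of inductive (EMR-style) position sensing: each coil in the
# grid reports how strongly it hears the pen's resonant signal, and the
# controller takes a weighted centroid along each axis.

def pen_position(coil_signals: list[float], coil_pitch_mm: float):
    """Estimate the pen's position along one axis, or None if absent.

    coil_signals[i] is the signal amplitude at coil i (made-up units);
    coil_pitch_mm is the spacing between adjacent coils (assumed).
    """
    total = sum(coil_signals)
    if total == 0:
        return None  # pen is out of range

    # Weighted average: coils nearer the pen hear it louder.
    centroid = sum(i * s for i, s in enumerate(coil_signals)) / total
    return centroid * coil_pitch_mm

# Pen hovering between coils 3 and 4, slightly closer to 3.
signals = [0.0, 0.1, 0.8, 2.0, 1.6, 0.3, 0.0, 0.0]
print(f"pen at about {pen_position(signals, coil_pitch_mm=5.0):.1f} mm")
```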
Passive optical imaging (POI)
The idea that a screen could be used as an input device by tracking body movement optically dates back decades, but it wasn't until 2008 that researchers proposed passive optical imaging (POI) as a means of interaction. The approach bathes the display area in infrared light; an untouched screen scatters very little of it back, but a fingertip or stylus in contact with the surface reflects IR toward imaging sensors watching from the edges of the panel or behind it. By studying where and how strongly that light comes back, a POI touch screen can determine where your finger is, track its movement, and even recognize tap patterns.
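Optical systems turn those reflections into coordinates in different ways; one common arrangement puts two sensors in the top corners and triangulates. Here's a toy sketch of that geometry, with illustrative angles and a unit-width screen.

```python
import math

# One common optical-touch arrangement: sensors in the two top corners
# each report the angle at which they see the fingertip, and the touch
# point is where those sight lines cross. Screen width is 1.0 unit;
# angles are measured downward from the top edge; y grows downward.

def triangulate(angle_left: float, angle_right: float):
    """Intersect sight lines from the top-left and top-right corners.

    angle_left:  angle (radians) below the top edge, seen from (0, 0).
    angle_right: angle (radians) below the top edge, seen from (1, 0).
    """
    # Left sight line:  y = x * tan(angle_left)
    # Right sight line: y = (1 - x) * tan(angle_right)
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    x = tr / (tl + tr)
    y = x * tl
    return x, y

# A touch near the middle of the screen.
print(triangulate(math.radians(45), math.radians(45)))  # -> (0.5, 0.5)
```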
Acoustic pulse recognition (APR)
When Apple first released the iPhone in 2007, one of its most lauded features was a capacitive touchscreen that used an array of sensors and software to recognize finger taps, giving users a more intuitive way to interact with their phones than traditional styluses. However, capacitive touchscreens have their drawbacks: they don't work as well through gloves or with an ordinary stylus, and grime and moisture on the glass can degrade accuracy. Since 2007, smartphone developers have kept designing new methods for recognizing touchscreen inputs. One of them is acoustic pulse recognition (APR), which is essentially sound-wave fingerprinting: transducers at the edges of the glass listen to the sound of each tap, and the controller matches it against a pre-recorded library of what taps sound like at each position on the screen. Because APR needs only a plain sheet of glass and a few transducers, it may be able to resolve many touchscreen issues without making the hardware any bigger or more expensive.
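Here's a toy version of the matching step, assuming a pre-recorded library of tap signatures (the signatures below are invented; real APR works on calibrated transducer recordings).

```python
# A toy version of the APR matching step: compare the sound of a new tap
# against pre-recorded tap signatures and pick the closest one.

def correlation(a: list[float], b: list[float]) -> float:
    """Plain dot-product similarity between two equal-length signals."""
    return sum(x * y for x, y in zip(a, b))

def classify_tap(tap: list[float], signatures: dict):
    """Return the screen position whose stored signature best matches."""
    return max(signatures, key=lambda pos: correlation(tap, signatures[pos]))

# Pretend library: what a tap "sounds like" at two screen positions.
signatures = {
    (120, 80):  [0.9, 0.2, -0.4, 0.1],
    (400, 300): [0.1, -0.6, 0.8, 0.3],
}
new_tap = [0.85, 0.25, -0.35, 0.05]   # noisy version of the first signature
print(classify_tap(new_tap, signatures))  # -> (120, 80)
```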
Proximity sensor
A proximity sensor detects nearby objects without physical contact. These sensors can be found in phones, laptops, gaming systems, and even pens. Some are infrared, emitting a pulse of invisible light and measuring how much bounces back; others are sound-based, timing ultrasonic echoes instead. Your phone uses one to switch the screen off when you raise it to your ear during a call, which is why your cheek doesn't end up dialing numbers. Proximity sensing isn't touch sensing, strictly speaking, but it rounds out the toolkit: whether one of these technologies is right for your business depends on what your users need to touch, hover over, or simply approach.
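As a small illustration of the in-call behavior, here's a sketch of proximity-driven screen blanking with hysteresis; read_proximity() and both thresholds are hypothetical.

```python
# A sketch of the classic phone behavior: use an IR proximity reading to
# blank the screen during a call. Two thresholds (hysteresis) keep the
# screen from flickering when the reading hovers near the cutoff.

NEAR_THRESHOLD = 0.8   # reflected-light level meaning "at your ear" (assumed)
FAR_THRESHOLD = 0.5    # must drop below this before waking again (assumed)

def read_proximity() -> float:
    """Hypothetical sensor driver: 0.0 = nothing near, 1.0 = touching."""
    ...

def update_screen(screen_on: bool) -> bool:
    """One polling step: return the new screen state."""
    level = read_proximity()
    if screen_on and level > NEAR_THRESHOLD:
        return False   # phone raised to ear: blank the screen
    if not screen_on and level < FAR_THRESHOLD:
        return True    # phone pulled away: wake the screen
    return screen_on   # in between: leave it alone
```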
Optical imaging (OI) sensors
OI technology is a sensor type used in touchscreens and large interactive displays. There are two flavors: passive imaging and active imaging. Passive OI sensors don't illuminate the screen themselves; cameras or line sensors along the edges watch the ambient light and pick out the shadow or silhouette your finger or stylus casts as it moves, and software turns that path into touch input. Active OI sensors add their own illumination, usually infrared LEDs built right into the bezel, so they can see the fingertip's reflection directly and keep working regardless of room lighting.
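Here's a toy version of the imaging step common to both flavors: compare the current frame against a reference frame and take the centroid of whatever changed. The frames below are tiny invented grids of brightness values.

```python
# A toy version of the optical-imaging step: subtract a reference frame
# from the current frame and take the centroid of the changed pixels.

def find_finger(frame, background, threshold=30):
    """Return the (row, col) centroid of pixels that changed, or None."""
    hits = [
        (r, c)
        for r, row in enumerate(frame)
        for c, pixel in enumerate(row)
        if abs(pixel - background[r][c]) > threshold
    ]
    if not hits:
        return None
    rows = sum(r for r, _ in hits) / len(hits)
    cols = sum(c for _, c in hits) / len(hits)
    return rows, cols

background = [[50, 50, 50], [50, 50, 50], [50, 50, 50]]
frame      = [[50, 50, 50], [50, 120, 50], [50, 50, 50]]  # bright spot: a fingertip
print(find_finger(frame, background))  # -> (1.0, 1.0)
```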