Was the iPhone the first phone with a touchscreen?
The short answer is “no.” The iPhone wasn’t the first phone to boast a touchscreen. But Apple really was a pioneer in this field, years before Steve Jobs gave the world the device that made such big waves.
The story of this technology, in fact, spans several decades, with ups and downs. If you’re interested, this piece outlines how touchscreens, which now come in all manner of sizes and shapes, came to be, evolved, and reached maturity. As a bonus, we also offer a peek at the next big thing(s) in the interface realm.
Touchscreen technology: the history
Some of the first real-world applications of touchscreen technology had to do with air traffic control. In 1965, E.A. Johnson of the UK’s Royal Radar Establishment developed the first capacitive touchscreen, and it was used to simplify the work of air traffic controllers.
In 1971, Dr. Samuel Hurst invented the first resistive touchscreen. This type of screen accepts input not only from fingers but also from styluses and other objects.
Infrared technology, which can be considered a touchscreen surrogate, was used in PLATO IV, an educational terminal made at the University of Illinois in 1972.
The first transparent capacitive touchscreen was made in 1973 at CERN, the research center now best known for its Large Hadron Collider. There, its engineers further advanced the technology that would later become ubiquitous.
The first touchscreen device available to regular consumers — sold in stores, that is — was Hewlett-Packard’s HP-150, a mass-produced personal computer first marketed in 1983. Its interface technology was infrared, the same as in the aforementioned PLATO IV. The device remained a niche item, probably because it was ahead of its time.
Through the 1980s, touchscreens slowly found their way into setups controlling industrial production processes, while in the mass-market domain they were used more and more widely in ATMs and all sorts of self-service kiosks.
In 1993, Apple released the Newton PDA, which had a resistive touchscreen and could recognize handwriting. This was a truly revolutionary product, the predecessor of what we call tablets today. While neither the first touchscreen device nor the first PDA, the Newton pioneered an interface approach that would become the gold standard in the industry.
IBM, also a major player back then, released its Simon Personal Communicator in 1994. It was the first commercially available mobile phone with a touchscreen (a resistive one, operated with a stylus). While not massively successful — only about 50,000 units were sold — it was the predecessor of modern smartphones.
Fast forward to 2006, when LG presented the KE850 Prada, which was actually the first mobile phone with a capacitive touchscreen controllable with your fingers. In 2007, Apple released the first iPhone — only the second device of this type on the market. The iPhone did have one pioneering feature, though: a multitouch interface, which, as it turned out, made all the difference.
A side note: multitouch technology was invented back in 1982 at the University of Toronto, but at the time it found no application in commercially available products.
What’s next in the interface realm?
Here are some developments that enhance touchscreen interface technology and push it to the next level.
- Haptic feedback. Basically, haptic feedback grants screens the ability to push back against a touch, giving users tactile sensations. This is applicable in games, virtual reality, and some professional scenarios (medicine, materials handling).
- Flexible touchscreens. While folding phones are not new, and Apple has reportedly been working on one too, truly flexible touchscreens are still some way off. But there’s little doubt that we will eventually have communication devices that wrap around the wrist.
- Hybrid control technology. Touchscreens can be just one component of a more complex interface that includes voice and gesture recognition. Such setups will probably first become common in professional environments and then find their way to the consumer market.
- Eye-tracking control. This technology has already been developed and commercialized, but so far it hasn’t gained wide recognition. The advent of AR glasses will probably facilitate its adoption.
- Tongue interface. This is an interesting twist: using your tongue to control devices. The idea is to attach a pad to the palate that registers touches from the tongue and interprets them as control signals. So far, it’s mostly a research project, with institutions like MIT housing the enthusiasts developing it, but given the overall speed at which new technologies are commercialized and adopted, it may show up in popular devices quite soon.