A basic procedure in thinking about the future is to look back into our shared history and understand how people handled similar changes in the past. This hindsight allows us to learn from the way humans have approached change and to apply that knowledge to future situations. Not only does it help us avoid the problem highlighted in the age-old adage, “Those who forget the past are condemned to repeat it,” but it also reminds us of the first principle of the future: The future will be populated by human beings, who do not change as quickly as our technology does.
Viewed through the long history of human change, technology has not made as deep an impact as we like to claim. We continually make massive changes to our society and culture, technology included, but fundamental human needs have not changed appreciably throughout our collective history.
To gain clarity about the future of IoT, we have to look back to how we connected ourselves to our technology in the past. Because IoT promises revolutions in control, synchronization, and technological extensions of human agency, lessons from the past will demonstrate how best to build our connected future.
The roots of IoT lie in two fundamental human qualities: First, we are social beings, and second, we tend to anthropomorphize the objects in our environment to help us make sense of them and bring them into our social world. While these qualities are self-evident, taken together they suggest why human beings are so quick to organize everything in their lives along the framework provided by social structure. We use almost everything in our world to further individuality and social expression and to communicate with others. In fact, we are so social that we even try to turn inanimate objects into social beings. We imbue everything with social meaning. We encode our laws into things like speed bumps and police uniforms. We talk about objects like computers, cars, and teddy bears as if they have intention and agency of their own. We associate them with emotions and memories. Even in the adult world, a teddy bear or a house has a life of its own.
With this in mind, it is not difficult to see that what IoT offers is a fuller expression of this anthropomorphization. IoT allows us to push our desire to include objects in our social world as human analogs and make them into true, independent social actors. The basis for the development of IoT is really nothing more than a technologization of what we’ve been doing for millennia. This leads us to our next point: The future development of IoT will have to be grounded in an extension of what humans already do.
In the distant past, it was very difficult to organize human action beyond a small number of people—anything more than a village. Even though skills like making stone tools or metal smelting could be transmitted between individuals and groups fairly quickly, it still took many hundreds of years to transmit these skills across the globe. For instance, the people living in what is now the United Kingdom were technologically hundreds of years behind their counterparts on the continent well into the Bronze and Iron Ages. Later, after travel technologies closed the distances between populations, the pace of technology sharing increased.
However, synchronizing even daily efforts beyond a household remained difficult. Until medieval monasteries developed early mechanical clocks to mark the canonical hours, few people used the notion of time to synchronize their lives. These clocks were first used to coordinate the “hours” of the day that regulated the prayer times of monastic communities; the clock signaled when it was time to go to the chapel to worship. With the addition of the bell tower, everyone within earshot could coordinate themselves as well. Even though farmers and townspeople were not bound by the monastic prayer schedule, they used this resource to organize the day and synchronize their schedules.
The information from a simple bell tower and mechanical clock provided a new way to coordinate action and understand time. While the technology was mechanical, the impact was social and conceptual. The introduction of time into daily life began a larger revolution: more people wanted clocks so they could better coordinate with those who already had them. This desire was not to own the latest “must-have” commodity, but to coordinate trade and farming chores, and to meet up and chat. Even though all of these new clocks were unconnected, people were connected through their use. Time was the information that connected them all.
Now nearly everything computerized has a clock, and these clocks connect us to everything else that has one. Moreover, clocks provide the basis for whole categories of device features: devices can turn on and off using timers, perform scheduled actions, and even coordinate with each other, simply because they share the information provided by time. By being synchronized through the medium of time, humans and devices are connected both practically and socially. Importantly, this connection allows these devices to act with some autonomy.
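This kind of time-based coordination can be sketched in a few lines of Python. The `Device` class and its on/off schedule below are hypothetical, invented purely for illustration: the two devices never exchange a single message, yet they stay in step because each consults the same shared clock.

```python
import datetime

class Device:
    """A hypothetical device that decides its state purely from the clock."""

    def __init__(self, name, on_hour, off_hour):
        self.name = name
        self.on_hour = on_hour    # hour of day to switch on
        self.off_hour = off_hour  # hour of day to switch off

    def state_at(self, now: datetime.datetime) -> str:
        # No messages, no central controller: the shared schedule plus
        # the shared clock are enough to determine the device's state.
        if self.on_hour <= now.hour < self.off_hour:
            return "on"
        return "off"

# Two independent devices with the same schedule stay synchronized
# only because both read the same clock.
porch_light = Device("porch-light", on_hour=18, off_hour=23)
garden_light = Device("garden-light", on_hour=18, off_hour=23)

evening = datetime.datetime(2024, 6, 1, 19, 30)
assert porch_light.state_at(evening) == garden_light.state_at(evening) == "on"

morning = datetime.datetime(2024, 6, 1, 7, 0)
assert porch_light.state_at(morning) == garden_light.state_at(morning) == "off"
```

The design point is the one made above: time itself is the connecting information, so devices that never communicate directly can still act in concert.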
IoT provides a number of new possibilities for synchronization and social collaboration based on different means of connection. These connections are no longer merely conceptual, but are driven by a constant machine-to-machine exchange of information. Synchronization need no longer rely solely on time; the content of this exchange is data.
In a more direct sense, the history of IoT is also the history of computing and of the manipulation of the data that computing both requires and produces. Major milestones have certainly contributed a great deal: ARPANET, the addressing and routing protocols that grew out of the mid-1970s TCP/IP work, Xerox PARC's pioneering of networked computing, and the eventual creation of the World Wide Web. Smaller developments have contributed as well: improvements in antenna and sensor technology, the conceptual groundwork laid by media theorists like Marshall McLuhan, and even Kevin Ashton's coining of the term “the Internet of Things.”
Understood this way, the foundations for IoT were laid when independent scientists and inventors developed the various pieces of telegraph communication in the 19th century, proving that even global connectivity was possible. Then came the advances in basic computing made by a succession of visionaries: from Alan Turing's breakthroughs in the 1930s to the innovations of the 1970s (Xerox's PARC labs, the mouse, Ethernet, Interpress [a forerunner of PostScript], and early touch screens, among others), decades of computing development have worked to give users a technological “brain.” Sensor technologies provided “eyes” and “ears.” Networking protocols and technologies provided a means of communication. Programming languages provided the language. But in the end, it is the human desire for objects that can be more like us that has guided us to this point.
From this perspective, the development of IoT is the joining of two important threads of history: the history of technological development throughout the 20th century, and the history of the human desire to make the world more human.