There is a famous puzzle called "Missionaries and Cannibals." In this game participants have to ferry a group of missionaries and cannibals from one side of a river to the other on a raft. The only catch is that cannibals can never outnumber missionaries on either river bank or on the raft. The puzzle demands planning and forethought as missionaries and cannibals are shuttled back and forth across the river, always keeping the missionaries from being outnumbered, until everyone is safely landed on the far side. Some people never figure the puzzle out, and, as the numbers of missionaries and cannibals increase, it becomes harder and harder.
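For readers curious about the mechanics, the puzzle can be stated as a simple state-space search. Below is a minimal, purely illustrative Python sketch of a brute-force solver; the function names and the raft capacity of two are my assumptions, and the study described next used an interactive on-screen version of the game, not anything like this code.

```python
# Minimal breadth-first search over puzzle states, for illustration only.
from collections import deque

def solve(missionaries=3, cannibals=3, boat_capacity=2):
    """Return a list of (m, c) raft loads that solves the puzzle, or None."""

    def bank_safe(m, c):
        # Cannibals may never outnumber missionaries where missionaries are present.
        return m == 0 or m >= c

    def valid(m, c):
        # A state is valid if counts are in range and both banks are safe.
        return (0 <= m <= missionaries and 0 <= c <= cannibals
                and bank_safe(m, c)
                and bank_safe(missionaries - m, cannibals - c))

    start = (missionaries, cannibals, 0)  # (left-bank m, left-bank c, raft side)
    goal = (0, 0, 1)
    frontier = deque([(start, [])])
    seen = {start}

    while frontier:
        (m, c, side), path = frontier.popleft()
        if (m, c, side) == goal:
            return path
        sign = -1 if side == 0 else 1  # a crossing moves people off the raft's bank
        for dm in range(boat_capacity + 1):
            for dc in range(boat_capacity + 1 - dm):
                if dm + dc == 0:          # the raft cannot cross empty
                    continue
                if dm > 0 and dc > dm:    # the raft itself must also be safe
                    continue
                nxt = (m + sign * dm, c + sign * dc, 1 - side)
                if valid(nxt[0], nxt[1]) and nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [(dm, dc)]))
    return None  # no safe sequence of crossings exists

print(solve())  # the classic three-and-three puzzle needs eleven crossings
```

Even this small sketch hints at why the puzzle scales badly: the number of bank configurations the solver must check grows with the number of missionaries and cannibals, and some larger configurations have no safe sequence of crossings at all.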
Recently, Dutch researchers brought a number of people into their laboratory to study how they played this game. The participants were split into two groups and the game was played on a computer. One group were given software prompts showing possible future moves. The other group were left alone to figure out how to transport the missionaries and cannibals. The group given software prompts soon got the hang of things and merrily started sending cannibals to the other side of the river. They outperformed the group without the software prompts; they were quicker and more effective. But as the puzzle became harder, and as time passed, things began to change. The group relying upon help from the computer began to click around aimlessly and became confused. In contrast, the group that had used their own brains were better able to think ahead: they were better at plotting strategy, showed stronger conceptual thinking, worked more efficiently and with greater focus, and more knowledge was imprinted in their brains. Eight months later, the same participants were recalled to the laboratory. The results still held: the group who had relied on their own brains, without software assistance, were smarter. The price of software assistance was the dumbing down of the human brain.
Critics of emerging technology and the digital world often focus upon the effects of information overload. What is less considered is information underload. This is when intellectual and physical tasks are taken from people and handed over to software or machines. Information underload doesn’t happen when we abstain from technology or the online world: it happens when we hand over skills and tasks that we are able to do ourselves to computers, to software, to automated machines, to our cell phones, or, increasingly, to algorithms and artificial intelligence. This information underload is deskilling our brains. The consequences can be devastating, even fatal, but to see them clearly we must first take a detour into northern Canada.
The navigational talents of the Inuit were first documented by outsiders in 1822. In the barren reaches of the Arctic Circle, the Inuit showed extraordinary wayfaring skills. They had a profound understanding of winds, snowdrifts, the different types and colors of snow, currents, stars, tides, and animal behavior. The elders of this ancient Inuit community would pass this spatial knowledge on to their young. Then, around the year 2000, things began to change. Traditional dogsleds were given up for snowmobiles equipped with GPS satellite navigation. Routes that had taken years to learn were now navigated with ease, and the young Inuit no longer needed to learn the wayfaring skills of their elders.
Promoters of the new technology might ask, “Who cares if traditional navigation skills are lost?” In fact there is a great deal to care about when we surrender navigational autonomy to a GPS system. For one thing, the adoption of GPS by the Inuit led to a significant number of fatalities in their small community. Satellite navigation got them from point A to point B, but it didn’t help them to read the landscape. It didn’t take into account thin ice, or newly formed snow cliffs over which some tumbled to their deaths. And it didn’t take into account sudden gusts of freezing weather that rendered the GPS inoperable, resulting in death for those who could no longer find their way home. But the Inuit are a small community in a faraway land, so perhaps the question “who cares?” still applies. After all, we don’t need to navigate our way through snowdrifts in a barren landscape.
Since the early 1970s we have known that certain neurons in the hippocampus fire every time we pass a particular place. These neurons are called place cells and they are tremendously important. When place cells activate in the hippocampus, the brain uses them to map out new territory. For this mapping to occur we need visual, auditory, and tactile cues. This is much like the difference between reading an actual book and reading from a screen: the tactile, physical feel of a book, and even its smell, helps us to remember what we have just read. Place cells are not the only neurons that fire when we navigate. In 2005, neuroscientists discovered further neurons, called grid cells, that create a precise geometric grid of space in our brain. While the place cells map specific locations, the grid cells provide an abstract map. These two types of neurons, firing in the hippocampus, work together with our bodily motion and our sense of direction to act as a sophisticated navigation system. The tunnel vision created by following the screen of a satellite navigation unit or a cell phone destroys this finely tuned system.
But perhaps we are not bothered about this either. Then consider this: Alzheimer’s disease primarily targets the hippocampus, the very part of the brain that is protected by navigating without electronic aids. Among the first symptoms of Alzheimer’s disease and dementia is a decline in spatial memory. Sufferers forget where they are, how to get home, or how to reach a familiar place. As we use more electronic aids to get from point A to point B, our hippocampus weakens, much as it does in people showing the early symptoms of Alzheimer’s disease. The use of GPS systems is not the only thing that reduces the capacity of the hippocampus. A lack of exercise or sleep weakens it, as does a junk-food diet, shallow intellectual work without memorization, and eating too frequently. This has led some neuroscientists to conclude that modern society is geared to shrinking the hippocampus, and that dementia will consequently be diagnosed at an earlier and earlier age. When we learn to navigate, our brains become stronger, and not just for the purposes of navigation. We are better able to memorize, we retain facts, we become smarter, and, crucially, we grow new neurons in the hippocampus that protect us against age-related cognitive decline.
In 2005 the RAND Corporation predicted that $81 billion would be saved by switching to electronic medical records. In 2011, a team of British public health researchers reviewed over one hundred studies of I.T. in healthcare and found that the perceived benefits were largely theoretical. In 2013, the RAND Corporation issued a chastened report concluding that the adoption of I.T. in healthcare had resulted in only marginally better care. In the intervening period, 2005 to 2013, an I.T. supply contractor called the Cerner Corporation pocketed $3 billion for healthcare services; this same corporation had funded the original RAND report back in 2005. Those years are a cautionary tale about what happens when talented, intelligent doctors outsource their brains to electronic records, note-taking, and devices.
The first warning was sounded by a State University of New York study, which found that doctors who adopted electronic note-taking and record-keeping became deskilled, showed decreased clinical knowledge, and gave less personalized care. They began to copy and paste standardized notes from templates. They reused old text in new records. This contrasted unfavorably with handwritten notes, which were unique to each patient. One significant, unintended consequence of electronic record-keeping was the loss of specialist knowledge. With handwritten notes, each specialist had a unique style of handwriting and the ink was often a different shade. As junior doctors flipped through handwritten notes they learned to identify the penmanship of specialists; these notes would draw their attention and add value to their medical education. This claim is not theoretical: a University of Washington study showed how doctors homed in on the distinctive penmanship of known specialists. These paper notes also deepened doctors’ understanding of individual patients, allowing for better diagnosis and treatment. The physicality of handwritten notes guided doctors too. They were able to flip through pages of notes, as they would a book, gaining a quick and meaningful sense of the patient’s medical history. In contrast, the switch to electronic records has meant that doctors often look only at the last two or three entries, rather than the long view.
Another unintended consequence of computers in healthcare was the introduction of a third party into the examination room. Most of us have been in a situation with friends or family in which someone has reached for their cell phone to take a call or a message. Aside from being rude and ill-mannered, this deliberate action brings an unwanted third party into the room. Researchers found that bringing a computer or tablet into medical consultations was not progress; rather, it was a symbol of medical illiteracy. The computer acted like a third party competing for the attention of the doctor, affecting his ability to be fully present in the examination. It changed communication between doctor and patient, while shortening the time spent physically examining the patient. Moreover, software on these computers is used to diagnose patients by way of a tick-list of symptoms or an algorithm, which deskills doctors further.
Doctors who rely upon computer-based systems begin to lose their intuition, which in turn hinders their response to emergencies, unexpected events, and unique cases. For medics, intuition is not a vague concept. Cognitive scientists have shown that in such moments doctors do not work through conscious, step-by-step reasoning. Instead, the brain draws on prior knowledge and experience to make split-second decisions that enable them to “see” what is wrong. They need courage and improvisation alongside expert, confident audacity. This is a talent of a very high order, whereby intuition, thinking, and action converge into one. Computers relentlessly chip away at it.
This loss of skill brings with it a loss of independence and autonomy, and it means the worker, whether a doctor or otherwise, adjusts to the machine or software. This is very different to people who use tools, whether a stethoscope or a chisel, to help with their chosen vocation. Tools in the hands of people expand cognition. We know from recent neuroscientific studies that using tools for a chosen craft grows new brain cells, enlarges parts of the brain, and increases cognitive abilities. The carpenter who uses a hammer, the rug-maker who uses a needle, the researcher who uses a pencil, and the doctor who wields a stethoscope are all using and expanding their intellects as much as each other. No profession is superior to any other in this respect. Meanwhile, those professions that cede their skill to software and a computer are merely dumbing themselves down in the name of efficiency.
This use of hardware and software is about to take a more sinister step beyond helping humans to do their jobs: it is about to become autonomous. It is well known that Google and other technology giants are working on sterile self-driving cars, and drones controlled by operators in faraway lands are routinely killing civilians. What is less well known is the development of the LAR, a benign acronym for the rather less benign Lethal Autonomous Robot. The LAR is a robot developed by the military, capable of killing whether by air, in the form of a drone, or on land, in the form of a robot. It already exists. The crucial component of the LAR is its autonomy: it is able to select and decide whom to kill without any human intervention or oversight. The robot is set free and left to kill humans at its pleasure. It has no emotions, no form of morality, and no notion of right or wrong. For the human target it is death by algorithm: the software and artificial intelligence in the robot decide that you are acting suspiciously and therefore you must die. One has to wonder why Google, an internet search company holding a large amount of our data, is actively developing military-grade robots.
One restraint on overseas wars is the public’s distaste for casualties among its own soldiers. No country likes to see its troops brought back in body bags. The Lethal Autonomous Robot removes this restraint. It can go on a killing spree. It can be destroyed by an enemy. In either case, no one cares: there are no dead troops to bring home. The first autonomous kill by a robot will be heard around the world. It will change war and society for a very long time.
The question of morality and machines is not limited to these military robots. Autonomous vacuum cleaners are already for sale. As they go about their business they might kill some spiders, but no one cares about spiders. Then we have autonomous lawnmowers, which kill slightly higher creatures: their blades chop down small mammals or frogs that get in their way. Still, only some people care about small mammals. But what about autonomous self-driving cars? If a cat runs into the road, perhaps the algorithm determines it is better to run over the cat than to risk being rear-ended and facing a messy insurance settlement. So the cat dies. Now a young child runs into the road. If your self-driving car swerves to avoid the child, you crash fatally into a lamppost. So who gets to die? And who programs the algorithm in the software to make these moral choices?
The question of software and ethics is found in activities more imbecilic than driving a car. Digital social media, such as Facebook, has altered the very nature of friendship and conversation. Friendship is messy. It requires honorable, timeless traits such as sacrifice, trust, time, courtesy, generosity, and loyalty. It requires people to suppress anger, to resist easy emotional outbursts, and to overlook faults. But these time-consuming aspects of friendship are too messy and inefficient for Facebook and Google. Facebook wants the statistical discovery of friends by search and algorithm. It wants to remove the effort from friendship. It wants to reduce friendship to quick replies or “likes” to newly posted photographs. The technocratic ideals of speed, productivity, and standardization are applied to the ideal of friendship. Google’s soon-to-be-released messaging app, Allo, uses artificial intelligence to reply to your friends’ messages in order to save you time. Your reply is then answered by the artificial intelligence on your friend’s phone. The result is bizarre: the apps on the two phones are having a conversation with each other with little to no input from either of you. This is not progress or friendship; it is stupidity. Compare this with the very real human example from the Sunna of the Prophet, peace and blessings be upon him. He would turn his whole body and his whole attention to the person he was conversing with. This is very far removed from Facebook and instant-messaging friends, or the interruption of the cell phone mid-conversation. It is very different to doctors who look at their tablet while the patient sits unexamined in front of them. And this level of attention is different to that of the person who navigates by a screen, rather than being fully aware of his environment.
The answer to these problems is not to let software or hardware replace any skill. Every time you lose a skill there is a detrimental cognitive effect. Every time you let software take the place of the intellect or of human endeavor, you dumb yourself down. There has to be striving and friction to help our brains. Our relationships with others need to be in the real world, with all the trials and tribulations this involves. We need to use tools that we control, whether that tool be a needle, a chisel, a pencil, or a stethoscope, and not give an inch of our knowledge or skill set to a machine. A map is a tool; a GPS unit is not. And, perhaps most of all, not an ounce of our morals, ethics, and virtues should be given up to an algorithm programmed by a technocrat in an unknown land.