Artificial allies

June 1, 2018 -- Chiara Luce Catipon

Despite potential risks, machines and humans can work together to improve the human condition

“Mom, Siri says to turn left,” said my then four-year-old niece to my sister. We were heading to the beach, and helping her mom follow the smartphone assistant’s driving directions was already second nature to her.

Artificial intelligence (AI) is everywhere and is shaping the way people of all ages act. In fact, according to a recent Gallup poll, 85% of Americans use artificial intelligence in daily life. Whether following GPS directions, listening to medical advice, selecting a movie on a streaming service, or checking miles walked and the probability of rain, chances are that a machine has processed data for these needs.

While some hail AI as an all-purpose solution, others view it through an apocalyptic lens. The truth lies somewhere in the middle.

The rise of AI
Although the term was officially coined by John McCarthy in 1956, it was Alan Turing, the British mathematician considered the father of computer science, who laid the foundation for artificial intelligence in a 1936 paper.

Paul Sung, principal software engineer at Snaplogic in Northern California, offers some insight. AI, he says, remained in academic and then military circles for years, until Google published two influential papers around 15 years ago.

“Google developed algorithms to crawl into all the webpages and connect them, looking for significant keywords. In the late 1990s, Google developed a software for text processing, algorithms and page links, able to link billions of webpages. Then in the early 2000s, two students simplified the software. Google developed the Google file system and added the storage for messages. There is a huge number of webpages all over the world. They need to be accessed immediately, so with this software, Google was able to accomplish that,” says Sung.
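The keyword-and-link idea Sung describes is, at its heart, the insight behind Google's PageRank algorithm. A minimal sketch in Python (a toy illustration, not Google's actual code) shows how repeated passes over a link graph let pages that attract many links rise to the top:

```python
def page_rank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        # Each page keeps a small base score, then receives a share of the
        # score of every page that links to it.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy web of four pages: "home" attracts the most links, so it ranks highest.
web = {
    "home": ["about"],
    "about": ["home"],
    "blog": ["home", "about"],
    "shop": ["home"],
}
ranks = page_rank(web)
print(max(ranks, key=ranks.get))  # prints: home
```

At web scale this same idea runs over billions of pages, which is why the distributed file system and storage work Sung mentions mattered so much.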

Big data world
Today, “Big Data” has the biggest impact on AI and drives its development. “Before, it was only a theory. We knew how neurons work, but now we are able to imitate them. Think of Google Translate 10 years ago; it was like a joke. But now it is becoming more and more accurate, because it uses situational connections. This is machine learning,” says Sung.

Now that computers are better able to store, link, process and transmit immense data at super speed, AI has been applied to various fields such as transportation, education, agriculture and healthcare, to name a few.

Perhaps the most mainstream applications are online consumer research and social networking. Whenever a user clicks on a page of interest or “likes” something, the computer stores that information and filters through data to recommend a purchase or webpage approximating that preference. This can include anything from a flight ticket to a newsfeed.
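The click-and-“like” filtering described above can be sketched in a few lines of Python. This is a hypothetical toy example, not any real site's recommender: it builds a profile from the tags of items a user has liked, then scores new items by how many of those tags they share.

```python
from collections import Counter

def recommend(liked_items, catalog):
    """liked_items: list of tag-lists the user has liked.
    catalog: dict of candidate item -> tag-list."""
    # Profile: how often each tag appears among the user's likes.
    profile = Counter(tag for tags in liked_items for tag in tags)
    # Score each candidate by summing the profile weights of its tags.
    scores = {item: sum(profile[t] for t in tags)
              for item, tags in catalog.items()}
    return max(scores, key=scores.get)

likes = [["travel", "beach"], ["travel", "flights"]]
catalog = {
    "Cheap flights to Hawaii": ["travel", "flights", "beach"],
    "Laptop review": ["tech", "hardware"],
}
print(recommend(likes, catalog))  # prints: Cheap flights to Hawaii
```

Real recommenders are far more elaborate, but the principle is the same: every click adds to the profile, and the profile steers what is shown next, whether that is a flight ticket or a newsfeed item.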

Another common use of AI is in transportation. Mobile apps like GPS help not only individual drivers find their destination — even parking spots — but also provide large-scale information that helps city planners design traffic flow and optimize services, such as public transportation schedules and automated enforcement.

More and more cars are equipped with cruise control and sensors that alert the driver to objects in a blind spot or to the car’s proximity to a parked vehicle. Of course, the latest developments are self-driving cars and trucks, while drones are being deployed to deliver blood and supplies to war-torn cities.

In agriculture, too, AI and drones are helping farmers. Blue River Technology, for example, has created drones with specialized cameras that fly over large fields, collecting data that is then analyzed to decide where to spray herbicides strategically. According to the company’s website, this has cut costs by 90%. The drones also measure soil quality, so farmers can address moisture loss and nutrient deficiency. In the cattle industry, monitors attached to a cow’s neck help farmers track early signs of illness.

In education, Intelligent Tutoring Systems (ITS) support teachers in personalizing instruction. An ITS tracks student performance on specific tasks, such as reading or math, gives students immediate feedback, and provides corrective practice opportunities. According to Carnegie Learning’s website, this has proven helpful in first-year college remedial math courses.

Students can also use software that breaks a textbook down into more manageable study guides, and teachers can design curricula that integrate video and audio components for visually or hearing impaired students.

Healthcare is another area benefiting from advancements in AI. A U.S. News article describes how El Camino Hospital reduced post-surgery patient falls by 39% by identifying risk factors, using patient record information and tracking the frequency of nurse calls.

Communication between doctors is also being streamlined to lower the risk of complications, heart attack or stroke, and drug interactions. AI is helping make faster diagnoses in emergency rooms, and robots have even been shown to be more precise in hip and knee replacement surgeries, according to one online report.

The University of Pittsburgh Medical Center compares genetic information from patients’ tumors to molecular data in the National Institutes of Health database, which contains 10,000 tumor samples across 33 cancer types. In one IBM Watson study, the program’s cancer-treatment recommendations agreed with those of MD Anderson physicians 90% of the time.

The job market
All this leads to the question of whether machines will take over our jobs. A November 2017 McKinsey Global Institute report predicts that by 2030, half of current work activities could be automated, and that in 60% of today’s occupations, a full third of activities could be automated.

Berkeley Haas School of Business professor Laura Tyson, however, foresees not unemployment driven by future technology but massive job dislocation. She stresses what many see as the need for greater investment in human capital, as the divide grows between the owners of huge conglomerates and the low- and middle-skilled workers who will need retraining because of automation.

Humans, machines together
No amount of machine sophistication or intelligence can ever replace humans. The Bureau of Labor Statistics reports that for 2016 to 2026, 11 of the top 25 fastest-growing occupations are healthcare-related, fields where human skills are essential.

LinkedIn, one of the largest job-referral and networking sites, reports that soft skills remain an important part of being a fit for any role, and more than 60% of hiring managers say they have a hard time screening for them. The top skills noted were adaptability, culture fit, collaboration, leadership, growth potential and prioritization.

Therefore, just as AI that interprets CT scan images ever more accurately helps doctors diagnose patients, intelligent human and machine interaction can be embraced as a valid contribution to human growth in an increasingly digital age.

As for my niece, we used the time saved by the GPS directions to have a human-to-human conversation about what we were going to do at the beach.
