Will AI take away your job? [Book Review]

Tech Magazine (2020)

Abstract

Will AI take away your job? The answer is probably not. AI systems can be good predictive systems and are very good at pattern recognition. AI systems take a highly repetitive approach to sets of data, which can be useful in certain circumstances. However, AI does make obvious mistakes, because AI has no sense of context. As humans, we have years of experience in the real world. We have vast amounts of contextual data stored in our brains that make it possible to predict and to know the boundaries of the real world, so that even if we have never been in a particular situation, we are still able to deal with it. It is in unknown situations that AI falls down. In engineering, AI is being developed to monitor and control certain systems; this could extend to nuclear power plants, for example. If we examine control systems for nuclear power plants on vessels, such a system would likely be programmed with the safety of the reactor as its main priority. Automation is crucial in engineering systems like this, as the likelihood and cost of human error are high and human reaction times are slow in comparison to the system. However, there is always the possibility of unforeseen external factors that override plant safety and that cannot be programmed for. If we look at a plant on board a submarine, for example, we see additional factors such as ship safety. The power plant might breach safety parameters, in which case an automatic system may shut down the reactor. However, there might be a greater urgency, such as a flood in another compartment or an attack upon the vessel, that would override shutting down the reactor. The AI system can neither observe nor utilise information from the external environment to make this type of judgement, whereas the operator may be completely aware of it. To programme context into AI systems would be near impossible. We still need operators because we still need to understand the environment around us, which may contain an infinite number of possibilities. As a further example, if we examine the ICO (Information Commissioner's Office) guidance, it is clear that in clinical settings a human being will always be required to validate findings and to provide context. Sometimes having a lot of data just doesn't replace social interaction, intuition and experience.

Author's Profile

Dr Marie Oldfield
London School of Economics
