Legislators and opinion-formers need to start thinking about how autonomous machines like driverless trucks, surgical robots and smart homes that keep an eye on their occupants could affect society, according to the Royal Academy of Engineering.
In a new report, the Academy points out that the technology to develop such systems is either already available or closer to reality than many people think — and the legal system needs to catch up fast.
‘We’re very used to automatic systems, such as the braking assistance technology now standard in most cars’, said Prof Will Stewart of Southampton University, one of the contributors to the report. ‘But traditionally, engineers have designed these things so that they’re used with a human operator. As we move towards autonomous systems, we’re taking the human further and further away from the machine.’
The report’s authors looked at two particular types of system — autonomous transport, which they believe is perhaps ten years away and most likely to appear first in heavy lorries; and smart homes, particularly for the elderly, who could benefit from health-monitoring systems and even devices that provide ‘companionship’, such as robotic pets.
‘We expect to see a new generation of systems that will become tools that are in some respects almost like people, but will also pose some of the same ethical and management issues as people do,’ Stewart said. ‘We expect great benefits — but also some new attitudes to our creations.’
Autonomous trucks are a good example; as Lambert Dopping-Hepenstal, a member of the Academy’s engineering ethics working group and science and technology director of BAE Systems Military Air Solutions, pointed out, autonomous vehicles already operate in mines and warehouses. Such trucks would use lasers and radar to monitor their surroundings and neighbouring cars, and would have the Highway Code programmed into them.
‘They’d be much more predictable than trucks driven by humans; they wouldn’t pull out suddenly, they would always pull in if there was a problem; they’d give way where they were supposed to,’ Dopping-Hepenstal said. ‘But also, there are bound to be problems. If there’s an accident involving one of these things, who’s responsible? The system’s engineer? The manufacturer?’
One problem, explained Chris Elliott, a consultant engineer and barrister who also contributed to the report, is that the legal framework isn’t set up to deal with this sort of situation.
‘The law is built around cause and effect, but it’s bad at assessing systems, where each individual part is harmless but the whole might be harmful,’ he said. ‘It’s still in the age of automation, where the role of a human operator is well-defined.’
The legal and ethical systems have to catch up, he added, and the engineers developing these systems need guidance on how their machines might be licensed and approved.
Elliott is particularly concerned about a possible ‘yuck factor’ that might hinder public acceptance of autonomous systems. Heavy trucks and robot surgeons inherently carry some risk, and the prospect of one of these systems making a mistake and killing someone has to be discussed before they are developed.
Smart homes also present an ethical problem: systems that monitor an elderly person, watching for activity at the time they normally wake up, checking whether they take their regular medication and even tracking vital signs, would undoubtedly reduce the risk of their condition deteriorating, said Dopping-Hepenstal.
‘But there are also questions of privacy, and whether that’s too much observation. We need to know what people are comfortable with. That’s a big issue now, and it’s going to get even bigger.’
Stuart Nathan