How email patterns can predict impending doom

TECHNOLOGY

EMAIL logs can provide advance warning of an organisation reaching crisis point. That’s the tantalising suggestion to emerge from the pattern of messages exchanged by Enron employees.

After US energy giant Enron collapsed in December 2001, federal investigators obtained records of emails sent by around 150 senior staff during the company’s final 18 months. The logs, which record 517,000 emails sent to around 15,000 employees, provide a rare insight into how communication within an organisation changes during stressful times.

Ben Collingsworth and Ronaldo Menezes at the Florida Institute of Technology in Melbourne identified key events in Enron’s demise, such as the August 2001 resignation of CEO Jeffrey Skilling. They then examined the number of emails sent, and the groups that exchanged the messages, in the period around these events. They did not look at the emails’ content.

Menezes says he expected communication networks to change during moments of crisis. Yet the researchers found that the biggest changes actually happened around a month before. For example, the number of active email cliques, defined as groups in which every member has had direct email contact with every other member, jumped from 100 to almost 800 around a month before the December 2001 collapse. Messages were also increasingly exchanged within these groups and not shared with other employees.

Menezes thinks he and Collingsworth may have identified a characteristic change that occurs as stress builds within a company: employees start talking directly to people they feel comfortable with, and stop sharing information more widely. They presented their findings at the International Workshop on Complex Networks, held last month in Catania, Italy.

Gilbert Peterson at the Air Force Institute of Technology in Dayton, Ohio, has also worked with the Enron emails. He says that if further research backs up Menezes’s idea, this shift in communication patterns could be used as an early warning sign of growing discontent within an organisation. “Human resources folk would probably find this extremely useful,” he says.

Confirming this link will be difficult, though, as privacy concerns mean that email logs are hardly ever made public. Menezes says his university will not allow him to analyse even an anonymised version of student email data. Such restrictions are frustrating for researchers who study social networks, as they had hoped email logs might boost the field by providing unprecedented quantitative data on how people communicate.

A few other studies have been carried out, however. Duncan Watts of Yahoo Research in New York managed to obtain anonymised university email records – though he is not able to share them with others – and used them to show that instead of varying continuously, individuals’ emailing behaviour falls into distinct clusters. Meanwhile, Peterson used the Enron email log to develop software that scans for potential saboteurs working within an organisation. Jim Giles ■

[Photo: Cliques can be electronic too – HELEN KING/CORBIS]

Can we trust military drones to decide when to fire?

TECHNOLOGY has long been distancing the soldiers who fire weapons from the people who get hit. Now robotics engineer Ron Arkin at the Georgia Institute of Technology, Atlanta, is trying to sever the link completely, with software that would allow drones to make their own decisions about wielding lethal force. He has developed an “ethical governor”, which aims to ensure that robot attack aircraft behave ethically in combat, and is demonstrating it in simulations based on recent conflicts in Afghanistan and elsewhere.

In one scenario modelled on a situation encountered by US forces in 2006, the drone identifies a group of Taliban soldiers inside a defined “kill zone”. But the drone doesn’t fire. Its maps show the group is inside a cemetery; opening fire would breach international law. In another scenario, the drone identifies an enemy vehicle convoy close to a hospital. Here the ethical governor only allows fire that will damage the vehicles without harming the hospital.

Arkin has also built in a “guilt” system which, if a serious error is made, forces a drone to start behaving more cautiously. In developing the software, Arkin drew on studies of military ethics as well as discussions with soldiers, and says his aim is to reduce noncombatant casualties. One Vietnam veteran told him of soldiers shooting at anything that moved. “We should be thinking about how to make robots perform better than that,” says Arkin.

Simulations are a powerful way to imagine the future of combat, says Illah Nourbakhsh, a roboticist at Carnegie Mellon University in Pittsburgh, Pennsylvania. But they gloss over the difficulties of building robots that can understand the world enough to make such judgements, he says, something unlikely to be possible for decades.

The Georgia Tech group has also made a system that advises a soldier of the ethics of a mission as they program it into a drone. Such tools could be used much sooner, as similar systems already exist to help doctors with medical ethics. Tom Simonite ■

NewScientist | 20 June 2009
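The “active email clique” count at the heart of the Enron study is a standard graph measure: a clique is a group in which every pair of members has been in direct contact. The sketch below illustrates the idea only — the toy message log and the brute-force search are my own invention, not the researchers’ method or data:

```python
# A clique, in the sense used by Collingsworth and Menezes, is a group
# in which every member has had direct email contact with every other
# member. The tiny log here is invented for illustration; a real log
# (Enron's had 517,000 messages) would need a proper maximal-clique
# algorithm rather than this brute-force search.
from itertools import combinations

emails = [
    ("ann", "bob"), ("bob", "cat"), ("ann", "cat"),  # a triangle
    ("cat", "dan"), ("dan", "eve"),
]

# Undirected contact graph: who has emailed whom, in either direction.
contact = {frozenset(pair) for pair in emails}
people = sorted({p for pair in contact for p in pair})

def is_clique(group):
    """True if every pair in the group has been in direct contact."""
    return all(frozenset(pair) in contact for pair in combinations(group, 2))

# Enumerate all groups of three or more people and keep the cliques.
cliques = [set(g) for r in range(3, len(people) + 1)
           for g in combinations(people, r) if is_clique(g)]
print(cliques)  # [{'ann', 'bob', 'cat'}]
```

Tracking how this count changes over time — here it would rise as new triangles of mutual contact form — is, in essence, the early-warning signal the study proposes.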