A pair of astrophysicists has published a new paper cautioning that humans may want to consider the consequences before opening any messages sent to the planet by ETs.
The intriguing thought exercise from scientists Michael Hippke and John Learned looks at various ways in which first contact may turn out to be a Trojan Horse aimed at wiping out humanity.
On a technical level, they note that a lengthy, coded missive from aliens would require computers to decipher and, in turn, this process leaves open the possibility of unleashing some kind of ET malware.
The alien computer virus might then corrupt the systems here on Earth and create chaos on the planet by taking down our power grids and other technological infrastructure.
"It is cheaper for ETI to send a malicious message to eradicate humans compared to sending battleships," the authors muse in the paper.
To that end, the astrophysicists also propose a scenario where aliens greet humanity in a peaceful fashion and include with their greeting a technological emissary in the form of AI.
With promises of curing diseases and solving the planet's problems, the temptation to 'work with' the AI may prove too great, leading to Earth eventually being taken over by a technology whose intellect is vastly superior to that of humans.
Hippke and Learned posit that even if tremendous precautions are taken to contain the AI upon its initialization, its eventual release 'into the wild' is all but certain.
And, on a more basic level, the researchers note that ETs might induce mass panic on the planet simply by sending a message threatening to destroy the sun, causing civilization to melt down in response.
Although the authors ultimately conclude that the potential benefit of reading an ET message far outweighs the risk involved, they stress that all possible outcomes from such a decision should be considered first.
Source: IB Times