SETI@home publishes results from 21 years of distributed computing
For 21 years, private computers analyzed data from space for traces of extraterrestrials. The most promising signals are now being reviewed.
The Arecibo telescope before its collapse. (Image: NAIC)
21 years, 12 billion signals, 100 candidates: The team at the University of California, Berkeley (UC Berkeley) has taken stock of the SETI@home project. As part of this project, millions of people around the world made their computers available to search for possible traces of intelligent life in space, so-called technosignatures.
The project ran for 21 years, from 1999 to 2020. Six years after its end, the team presented the results in The Astronomical Journal. According to these, the computers involved in SETI@home found 12 billion notable signals in the Arecibo data. After further analysis, 100 candidates remained, which are now being investigated further, says David Anderson, one of the project's founders.
The UC Berkeley project SETI (Search for Extraterrestrial Intelligence) was the impetus for SETI@home, which combed through data captured by the Arecibo radio telescope in Puerto Rico for signals from possible extraterrestrial civilizations. Since the computational effort was immense, the idea arose to distribute it across the private computers of internet users. This kind of distributed computing offered a way to tackle complex, demanding tasks at a time when no supercomputer was available for them.
SETI@home used idle computers
To this end, UC Berkeley computer scientist Anderson, together with astronomers Eric Korpela and Dan Werthimer, founded SETI@home. To participate, users installed a program on their computers. This program downloaded small data packets from the project's servers and processed them whenever the computer was otherwise idle.
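In outline, such a volunteer-computing client follows a simple fetch-compute-report loop. The Python sketch below is only an illustration of the principle; the server address, endpoints and the process_work_unit function are invented for the example and do not reflect the actual SETI@home client.

```python
import time

import requests  # any HTTP client would do; used here for brevity

# Hypothetical server and endpoints -- invented for this illustration.
SERVER = "https://example.org/volunteer"


def computer_is_idle() -> bool:
    """Placeholder for an idle check, e.g. no user input for several minutes."""
    return True


def process_work_unit(data: bytes) -> dict:
    """Placeholder for the signal analysis performed on one small data packet."""
    return {"bytes_analyzed": len(data)}


def main() -> None:
    while True:
        if not computer_is_idle():
            time.sleep(60)  # check again once the machine may be idle
            continue
        # 1. Fetch a small work unit from the project server.
        work = requests.get(f"{SERVER}/workunit").content
        # 2. Analyze it locally, using otherwise unused computing time.
        result = process_work_unit(work)
        # 3. Report the result back, then ask for the next unit.
        requests.post(f"{SERVER}/result", json=result)


if __name__ == "__main__":
    main()
```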
It was one of the most popular crowdsourcing projects of its time: After just a few days, SETI@home had 200,000 users worldwide, and a year later, 2 million – far more than the initiators had expected: “When we were planning SETI@home, we were trying to decide if it would be worthwhile, if we would get enough computing power to actually make new scientific discoveries. Our calculations were based on getting 50,000 volunteers. Pretty quickly we had a million,” says Anderson. “That was pretty cool, and I want to let this community and the whole world know that we have indeed made scientific discoveries.”
From this immense amount of data, the SETI@home computers filtered out 12 billion radio signals with characteristics of possible technosignatures. “Until about 2016, we didn't know what to do with these characteristics,” says Anderson. “We had no idea how to do the second part of the analysis.”
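One example of such a characteristic is a narrowband spike, a sharp peak that rises well above the noise floor of a spectrum. The following Python sketch illustrates the idea on synthetic data; the numbers and the threshold are invented for the example and are not taken from the project's actual analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data chunk: Gaussian noise plus a faint tone at frequency bin 1000.
n = 1 << 16                                   # samples in the chunk
t = np.arange(n)
chunk = rng.normal(size=n) + 0.05 * np.sin(2 * np.pi * 1000 * t / n)

# Power spectrum, normalized so the average (noise) power is roughly 1.
power = np.abs(np.fft.rfft(chunk)) ** 2
power /= power.mean()

# Flag bins that rise far above the noise floor -- "spike" candidates.
threshold = 20.0                              # arbitrary example threshold
print("candidate frequency bins:", np.flatnonzero(power > threshold))  # -> [1000]
```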
Help from Hannover
The researchers received help from Hannover: they were able to use the supercomputer of the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) to filter out electromagnetic interference and other noise. This reduced the number to a few million. From these, the researchers selected the thousand most promising signals and evaluated them manually. In the end, around 100 candidates remained.
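The basic idea behind such a reduction step can be sketched roughly as follows: a signal that keeps appearing at the same frequency regardless of where the telescope is pointing is most likely terrestrial interference, and the remaining detections are ranked by strength so that only the best need to be inspected by hand. The Python sketch below is a simplified illustration under these assumptions, not the team's actual pipeline.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Candidate:
    freq_bin: int       # frequency channel of the detection
    sky_position: int   # index of the telescope pointing
    score: float        # detection strength


def reduce_candidates(signals: list[Candidate], top_n: int = 1000) -> list[Candidate]:
    """Reject likely interference, then keep the strongest detections."""
    # Count how many distinct sky positions each frequency was detected at.
    positions_per_freq = Counter()
    seen = set()
    for s in signals:
        if (s.freq_bin, s.sky_position) not in seen:
            seen.add((s.freq_bin, s.sky_position))
            positions_per_freq[s.freq_bin] += 1
    # A celestial source is tied to one spot on the sky; a frequency seen at
    # many unrelated pointings is treated as interference (cutoff is arbitrary).
    survivors = [s for s in signals if positions_per_freq[s.freq_bin] < 5]
    # Rank the survivors and keep the most promising ones for manual review.
    return sorted(survivors, key=lambda s: s.score, reverse=True)[:top_n]
```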
The roughly 100 remaining candidates are now being reviewed. For this purpose, the researchers are using the Five-hundred-meter Aperture Spherical radio Telescope (FAST) in Guizhou in southwestern China. FAST, which went into operation in 2016, is currently the largest single-dish radio telescope. The facility at Arecibo has been out of service since its collapse in 2020.
Anderson does not expect to find a signal from extraterrestrials. “Even if we don't find ET, we can at least say that we have reached a new level of sensitivity. If there had been a signal above a certain strength, we would have found it,” he summarizes. “One conclusion of ours is that the project didn't quite work as we thought. Also, we have a long list of things we would have done differently and that future sky survey projects should do differently.”
(wpl)