Most of the SETI programs in existence today, including those at UC Berkeley, build large computers that analyze the data from the telescope in real time. None of these computers looks very deeply at the data for weak signals, nor do they look for a large class of signal types (which we'll discuss further on...). The reason is that they are limited by the amount of computing power available for data analysis. Teasing out the weakest signals requires a great deal of computing power; it would take a monstrous supercomputer to get the job done, and SETI programs could never afford to build or buy that much computing power. There is a trade-off they can make: rather than use one huge computer, they could use a smaller computer and simply take longer to do the analysis. But then the data would pile up faster than it could be processed. What if they used LOTS of small computers, all working simultaneously on different parts of the analysis? Where could the SETI team possibly find the thousands of computers they'd need to analyze the data continuously streaming from Arecibo?
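
To make that divide-and-distribute idea concrete, here is a minimal sketch of how a recording might be chopped into small, independent "work units" and analyzed in parallel, one unit per small computer. The names (split_into_work_units, analyze_for_signals) and the work-unit size are purely illustrative assumptions, not SETI@home's actual software.

```python
# Hypothetical sketch: split recorded telescope data into work units
# and analyze them in parallel, simulating many small computers.
from concurrent.futures import ProcessPoolExecutor

WORK_UNIT_SIZE = 4096  # samples per work unit (arbitrary, for illustration)

def split_into_work_units(samples):
    """Chop the recorded data into independent chunks."""
    return [samples[i:i + WORK_UNIT_SIZE]
            for i in range(0, len(samples), WORK_UNIT_SIZE)]

def analyze_for_signals(work_unit):
    """Stand-in for the expensive signal search done on each chunk."""
    # A real client would search many frequencies and drift rates here;
    # this placeholder just reports the strongest sample in the chunk.
    return max(work_unit, default=0.0)

def distribute(samples, workers=8):
    """Analyze every work unit in parallel, one per 'small computer'."""
    units = split_into_work_units(samples)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze_for_signals, units))

if __name__ == "__main__":
    fake_recording = [0.0] * 100_000  # pretend telescope samples
    results = distribute(fake_recording)
    print(f"{len(results)} work units analyzed")
```

The key point the sketch illustrates is that each work unit can be analyzed on its own, so the job scales with the number of machines available rather than requiring one enormous computer.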