Originally Posted by Asmodian
It is important to remember this is using radio telescopes; thinking of this data as normal image processing is very wrong. The way we combine data from multiple radio telescopes is quite different. Radio telescopes use interferometry, which enables amazing resolution with widely separated telescopes, but only after doing completely crazy amounts of very sophisticated signal processing (math). This is supercomputer levels of math, not something you can do in a few days on a HEDT PC or something.
That's right. This is something they probably could have done years or decades ago with the telescope tech, but it's only been recently that the computing power needed to render the final images has been available. I've been trying to find out what supercomputer they've been using to render the image, but so far I haven't found an answer. I did find this, though:
Enhancing the Sensitivity of the EHT
Data Collection at Wide Bandwidths
One way to increase the sensitivity of the EHT is to capture more energy from the black hole targets at each EHT site. Since black holes emit radiation at many frequencies, we can do this by increasing the range of frequencies that are recorded during EHT observations. This, in turn, requires electronic systems and recording systems that operate at higher speeds. Industry trends that allow faster personal computers and higher-capacity hard disk drives have enabled the EHT to leap forward to recording rates that are more than a factor of 10 faster than those of any other global array. This trend is embodied in “Moore’s Law”, a heuristic coined in 1965 by Intel co-founder Gordon Moore that has predicted the exponentially increasing power of integrated circuits over the subsequent decades.
The effect of Moore’s Law has enabled the EHT to gather, record, and process much larger bandwidths at a fraction of the cost of earlier pioneering VLBI systems. The resulting increase in observing sensitivity has helped extend the EHT’s reach to longer baselines and has resulted in higher-quality data sets with a much better signal-to-noise ratio, or SNR.
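To put the bandwidth point in rough numbers: VLBI fringe SNR scales roughly as the square root of bandwidth times integration time (the standard radiometer scaling), so wider recorded bandwidth buys you sensitivity directly. A quick back-of-envelope sketch; the bandwidth figures below are purely illustrative, not actual EHT parameters:

Code:
import math

# Standard radiometer scaling: SNR grows as sqrt(bandwidth * integration time).
# Illustrative numbers only, not real EHT system parameters.
def snr_improvement(old_bw_ghz, new_bw_ghz):
    """Factor by which fringe SNR improves when recorded bandwidth increases,
    holding integration time and system temperatures fixed."""
    return math.sqrt(new_bw_ghz / old_bw_ghz)

# e.g. going from 0.5 GHz to 4 GHz of recorded bandwidth:
print(snr_improvement(0.5, 4.0))   # ~2.83x better SNR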
The EHT equips each single-dish site with specialized electronics designed and supplied by the collaboration. Historically, analog VLBI equipment was used, but in the modern era digital electronics is prevalent and has been the mainstay of the EHT. For single-dish telescopes, the primary unit is called the VLBI “Digital Back End”, or DBE, which samples analog data from a radio receiver and feeds the formatted digital data to a data recorder.
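As I read it, the DBE's job is basically "digitize the receiver's analog output and format it for the recorder." A toy sketch of that step; real backends do this on FPGAs at gigasample rates and commonly quantize to 2 bits, and everything below (including the function name) is my own simplification, not the R2DBE's actual firmware:

Code:
import numpy as np

# Toy model of a digital back end: take an analog voltage stream and
# quantize it coarsely (VLBI systems commonly use 2-bit quantization).
rng = np.random.default_rng(0)

def digitize(analog_voltages):
    """Quantize a float voltage stream to 4 discrete levels (2 bits)."""
    sigma = np.std(analog_voltages)
    edges = np.array([-sigma, 0.0, sigma])          # simple +/- ~1 sigma decision thresholds
    return np.digitize(analog_voltages, edges).astype(np.uint8)   # codes 0..3

# Fake "receiver output": band-limited noise stand-in.
samples = rng.normal(size=1_000_000)
data = digitize(samples)
print(data[:16], data.dtype)   # 2-bit codes, ready to be packed and shipped to a recorder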
Several different types of digital backends have been used in EHT observations, including the first-generation DBE1 system, the Digital Base Band Converter (DBBC) system developed in Europe, and the ROACH Digital Backend (RDBE). The most recent incarnation, called the “R2DBE” or “ROACH2 DBE”, has been deployed at all EHT sites. The R2DBE samples and processes data at a rate of 16 gigasamples per second, perfectly matched to the recording data rate of the Mark6 digital recorder, the latest generation of EHT VLBI data recorder. ROACH stands for “Reconfigurable Open Architecture Computing Hardware” and is shared by an open-source astronomical instrumentation collaboration called “CASPER”, the Collaboration for Astronomy Signal Processing and Electronics Research.
Each Mark6 recorder receives digital data at a rate of 16 Gigabits/sec from the R2DBE and distributes it among a total of 32 hard disk drives grouped into 4 modules of 8 disks each. The EHT is scheduled to record at an aggregate rate of 64 Gigabits/sec at each site by using 4 Mark6 units in tandem. This rate is matched to the maximum bandwidth currently available from the key ALMA site (Atacama Large Millimeter/submillimeter Array), which has the largest collecting area of all the EHT sites.
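The per-disk numbers actually work out to something pretty mundane, which is sort of the point of striping across so many drives. Quick arithmetic from the figures quoted above (my own back-of-envelope, the 8-hour night is just a hypothetical):

Code:
# Back-of-envelope from the numbers quoted above, not official EHT figures.
mark6_rate_gbps = 16            # per Mark6 recorder
disks_per_mark6 = 32            # 4 modules x 8 disks
per_disk_gbps = mark6_rate_gbps / disks_per_mark6
print(per_disk_gbps, "Gbit/s per disk =", per_disk_gbps * 1000 / 8, "MB/s")   # 0.5 Gbit/s = 62.5 MB/s

site_rate_gbps = 4 * mark6_rate_gbps            # 4 Mark6 units in tandem = 64 Gbit/s
hours = 8                                        # a hypothetical observing night
terabytes = site_rate_gbps / 8 * 3600 * hours / 1000
print(site_rate_gbps, "Gbit/s per site ->", terabytes, "TB in", hours, "hours")  # ~230 TB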
Recorded disk packs from each site are shipped back to two central locations, the Max Planck Institute in Bonn, Germany, and the MIT Haystack Observatory in Westford, Massachusetts, for correlation. The DiFX, or “Distributed F-X”, software correlator is now used for EHT correlation. Among other advantages, software correlation clusters are scalable and the programs are easily customized. CPU-based processors are commodity products, so in the processing domain, as well as in recording, the EHT takes advantage of Moore’s Law advances in processing power.
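The “F-X” part refers to the order of operations: channelize each station's recorded voltages with an FFT first (F), then cross-multiply matching channels between station pairs and accumulate (X) to get visibilities. A bare-bones two-station sketch in numpy, ignoring all the genuinely hard parts a real correlator like DiFX handles (geometric delay models, fringe rotation, per-station clock corrections):

Code:
import numpy as np

# Minimal FX correlation of two stations' sampled voltage streams.
def fx_correlate(x, y, nchan=1024):
    nseg = min(len(x), len(y)) // nchan
    acc = np.zeros(nchan, dtype=complex)
    for i in range(nseg):
        seg_x = np.fft.fft(x[i*nchan:(i+1)*nchan])   # F step: channelize
        seg_y = np.fft.fft(y[i*nchan:(i+1)*nchan])
        acc += seg_x * np.conj(seg_y)                # X step: cross-multiply
    return acc / nseg                                # time-averaged cross spectrum (visibilities)

# Demo: a common signal buried in independent noise at each "station".
rng = np.random.default_rng(1)
common = rng.normal(size=2**20)
station_a = common + rng.normal(size=common.size)
station_b = common + rng.normal(size=common.size)
vis = fx_correlate(station_a, station_b)
print(np.mean(np.abs(vis)))   # correlated signal shows up as non-zero visibility amplitude

That toy version already shows why software correlation scales nicely on commodity CPU clusters: the work splits cleanly by time segment and by station pair, which is essentially what makes the "distributed" part of DiFX practical.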