Last night I watched an awesome documentary on the NatGeo Science channel about black holes. The program featured a scientist from MIT's Haystack Observatory who has made it his career goal to be the first human ever to observe a black hole. Granted, I'm no astrophysicist, but the gist of his theory seems to be that black holes warp the visible light coming from behind them, which theoretically gives the black hole a large glowing halo of sorts. His plan is to look for this halo to identify the black hole; his target is the suspected black hole at the center of the Milky Way galaxy, named Sagittarius A*. The black hole thought to be there is estimated to be about four million times as massive as our Sun. It boggles my mind to imagine something that massive packed into something so small.
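To put rough numbers on "that massive, yet so small" (my own back-of-the-envelope math, not from the show): the Schwarzschild radius r = 2GM/c^2 for a four-million-solar-mass object works out to something smaller than Mercury's orbit. A quick Python sketch:

```python
# Back-of-the-envelope: Schwarzschild radius of a ~4 million solar mass
# black hole (the mass estimate usually quoted for Sagittarius A*).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit (Earth-Sun distance), m

M = 4e6 * M_SUN                # Sgr A* mass estimate
r_s = 2 * G * M / c**2         # Schwarzschild radius: r_s = 2GM/c^2

print(f"Schwarzschild radius: {r_s:.2e} m")    # ~1.2e10 m
print(f"...which is about {r_s / AU:.2f} AU")  # ~0.08 AU
```

So four million suns' worth of mass fits inside a radius of roughly 0.08 AU, a small fraction of Mercury's orbit.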
To see that far into space, the MIT team would need to construct a telescope roughly the size of the continental United States. Obviously that's a logistical impossibility by today's standards. The solution? The team at MIT has created a system that combines data from radio telescopes throughout the United States (and I mean TONS of data, the show showed racks upon racks of what appeared to be hot-swappable SAN devices) that is fed into an MIT supercomputer to create a large "virtual telescope". From what I can gather, the technique is called very-long-baseline interferometry (VLBI).
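If I understand the principle, a telescope's angular resolution scales like (wavelength) / (dish diameter), so spreading receivers across a continent-sized baseline buys you the resolution of a continent-sized dish. Here's a quick sketch of that back-of-the-envelope math; the 1.3 mm wavelength and ~4,000 km baseline are my own assumptions for illustration, not numbers from the show:

```python
# Rough diffraction-limited resolution: theta ~ wavelength / baseline.
wavelength = 1.3e-3   # m (~230 GHz, a band used for mm-wave VLBI)
baseline = 4.0e6      # m, roughly the width of the continental US

theta_rad = wavelength / baseline
# Convert radians to microarcseconds (1 rad = 206265 arcsec).
theta_uas = theta_rad * 206265 * 1e6
print(f"Virtual telescope: ~{theta_uas:.0f} microarcseconds")  # ~67 uas

# Compare with a single 100 m dish at the same wavelength:
single_dish = (wavelength / 100.0) * 206265  # arcseconds
print(f"Single 100 m dish: ~{single_dish:.1f} arcseconds")     # ~2.7 arcsec
```

That's a factor of tens of thousands in sharpness over any single dish, which is presumably why they bother shipping all those disks around.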
Using this new telescope, the team at MIT can produce images in much greater detail and from much greater distances than ever thought possible. What do you folks at DST think about this type of technology and research? The discovery and observation of these singularities could send science as we know it back to the drawing board... which isn't a bad thing, of course, but I find myself wondering about the legitimacy of results from a "virtual telescope".
MIT News
The article is a little old, from 2008...
Here's a basic diagram of how the radio telescope clustering works. Not sure how relevant it is, but the scientist on the program stated the array contained observatories and radio telescopes across the continental US, and the amount of data arrays they showed on screen literally left my jaw hanging.
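For anyone curious what the supercomputer is actually doing with all those disks: as far as I understand it, the correlator cross-correlates the recorded signals from each pair of telescopes to find the tiny arrival-time delays between them, which is what lets separate dishes act as one instrument. A toy numpy sketch of that idea (simulated noise signal and a made-up delay, nothing from the actual MIT pipeline):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: one "sky" signal recorded at two stations, with station B
# seeing the wavefront `delay` samples after station A, plus independent
# receiver noise at each station. Numbers are purely illustrative.
n, delay = 4096, 37
sky = rng.standard_normal(n + delay)
station_a = sky[delay:] + 0.5 * rng.standard_normal(n)
station_b = sky[:n] + 0.5 * rng.standard_normal(n)  # delayed copy of A

# Cross-correlate the two recordings; the peak lag recovers the relative
# delay, the basic per-pair measurement a VLBI correlator makes.
xcorr = np.correlate(station_a, station_b, mode="full")
lags = np.arange(-(n - 1), n)
estimated = -lags[np.argmax(xcorr)]

print(f"true delay: {delay} samples, estimated: {estimated}")
```

Multiply that by every pair of telescopes in the array, at enormous sample rates, and you can see why they need racks of storage and a supercomputer.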
