# June 24, 2014: Radio interferometry, Fourier transforms, and the guts of radio astronomy (Part 1)

• Today’s post comes from Dr Enno Middelberg and is the first part of a trilogy explaining in more detail the radio interferometry techniques used to produce the radio images in Radio Galaxy Zoo.

I have written in an earlier post about the basic idea of how to increase the resolution of a radio telescope: use many telescopes, separated by kilometers, and observe the same object with all of them. Here is a little more information about how this works. [Read More]


• Thanks for starting threads on GZ blog posts, 42jkb!

While RGZ Talk does have its limitations, they are nothing compared with those of the comments to blog posts! So this is a wonderful opportunity to discuss the blog post, in detail. 😃

From the blog post:

radio waves are electromagnetic waves, and radio telescopes are sensitive to the electric field. Now we can build a radio telescope in a way that it produces as its output a voltage which is proportional to the electric field which the antenna receives

How different from optical astronomy! 😮 CCDs, PMTs (photomultiplier tubes), even photographic plates are essentially particle (photon) detectors.

Are there any classes of detector which work by producing an output proportional (in some sense) to the magnetic field of incident radio signals (not just in astronomy)?

If the two data streams have nothing in common [...] then the correlation coefficient will be zero, which is to say that they are not similar at all.

I know this is a high-level, prose summary, but won't there always be some correlations, even if there's no underlying physical relationship?

However, if the two telescopes point at the same source, the data streams will have a few bits in common, and the correlator spits out a correlation coefficient which is not zero.

But how do you tell when a non-zero correlation coefficient is due to a common source, and when it's due to random noise? Also, won't all manner of systematic effects - which have nothing to do with 'a common source' - produce non-zero correlation coefficients?

Maybe it's in the next installment, but aren't there other series of functions into which a signal can be decomposed?


• Hello,

Good to get some feedback! You're right: in the radio regime we cannot count individual photons because they carry too little energy, but we do have plenty of them. The reason is the Planck statistics of photons: the number of photons per energy interval ΔE emitted by a black body is

n(E) ΔE = g(E) · 1/(exp[hν/kT] − 1) · ΔE

which implies that in the radio regime we have thousands of photons per energy state, whereas in the optical we have more like 1/1000 of a photon per state. So in the optical and at higher frequencies all the energy is concentrated in these few photons, whereas in the radio it is spread out across a large number of them.
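As a quick numerical check of that occupation-number argument, here is a small Python sketch. The 1.4 GHz and 600 THz frequencies and the 5000 K black-body temperature are illustrative choices of mine, not values from the post:

```python
import math

H = 6.626e-34   # Planck constant (J s)
K = 1.381e-23   # Boltzmann constant (J/K)

def occupation(nu_hz, temp_k):
    """Mean photon number per mode for a black body (the 1/(exp[h nu/kT]-1) factor)."""
    x = H * nu_hz / (K * temp_k)
    return 1.0 / math.expm1(x)  # expm1 avoids precision loss when x is tiny

# Illustrative comparison: a 1.4 GHz radio observation vs. ~600 THz (optical),
# both for a 5000 K black body.
print(occupation(1.4e9, 5000))   # tens of thousands of photons per mode
print(occupation(6.0e14, 5000))  # a few thousandths of a photon per mode
```

This reproduces the "thousands of photons per state in the radio, roughly 1/1000 in the optical" contrast from the reply above.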

Are there any classes of detector which work by producing an output proportional (in some sense) to the magnetic field of incident radio signals (not just in astronomy)?

My understanding is that if you use a small dipole at the focus of your radio antenna, you will sample essentially the electric field of the radio wave (which is an electromagnetic wave). If you use a small wire loop at the focus, you will essentially sample the magnetic portion. Incident electric fields wiggle the charges in a conductor along a straight line, whereas incident magnetic fields move electric charges in circles. I don't know the advantages of either approach - you'd have to ask an engineer about that. But since the electric and magnetic fields are linked in EM radiation, I'm certain that the information content is the same, and that preferring one over the other is done for technical reasons.

Mind you, this doesn't tell us anything about the static magnetic fields in the radio sources!

I know this is a high-level, prose summary, but won't there always be some correlations, even if there's no underlying physical relationship?

Well, maybe I was cutting that too short. In general, the correlation coefficient of the signals of two antennas will be different from zero (in fact, since we're using complex correlators, the output will be a complex number, or a vector, and this vector will have a non-zero length and random orientation). However, this correlation will be due to noise arising from the receivers, the atmosphere, the ground spilling over the rim of the antenna, and so on. Hence the subsequent correlation coefficient, or the next measurement, which is also a non-zero vector, will have a similar length and an entirely unrelated orientation. And so on and so on. The distribution of vector lengths depends on the system noise.

So if one averages a lot of these data (i.e., if one stacks all these vectors head to foot), the average will approach zero, which is to say that there is no signal from the sky. If, however, after some period of time, one finds the average vector to be different from zero, then one has found a signal, or detected a source. In the case of strong sources such as calibrators, with flux densities of several janskys, one can detect a source in essentially a few seconds. In the case of faint sources we might need to integrate for days before the noise has come down to a level at which the signal sticks out.
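The vector-averaging argument can be sketched with a toy NumPy simulation (all amplitudes and sample counts below are made-up illustrative values, not real instrument parameters): per-sample correlations of independent receiver noise have random orientations and average toward zero, while even a weak common signal leaves a non-zero mean.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000                      # number of samples to average (illustrative)

# Independent receiver noise at two antennas (toy model).
noise1 = rng.normal(size=n) + 1j * rng.normal(size=n)
noise2 = rng.normal(size=n) + 1j * rng.normal(size=n)

# A weak signal common to both antennas.
signal = 0.2 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Per-sample correlator output: product of one stream with the complex
# conjugate of the other - a "vector" with a length and an orientation.
corr_noise  = noise1 * np.conj(noise2)
corr_source = (noise1 + signal) * np.conj(noise2 + signal)

# Averaging stacks the vectors head to foot: noise-only correlations average
# toward zero, while the common signal leaves a mean of about 0.2**2 * 2 = 0.08.
print(abs(corr_noise.mean()))    # close to zero
print(abs(corr_source.mean()))   # close to 0.08
```

Increasing `n` (i.e., integrating longer) shrinks the noise-only average further, which is the simulated analogue of integrating for days to detect a faint source.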

Also, won't all manner of systematic effects - which have nothing to do with 'a common source' - produce non-zero correlation coefficients?

Well, it's the goal of the system designers (i.e., electrical engineers) to reduce such effects to a minimum. But you're right that we do see systematics. For example, at the Australia Telescope Compact Array, when one attempts to detect a very faint source, one is advised to observe a point close to the target, but not the target coordinates themselves. The reason is that if the voltage delivered by the antennas has a non-zero mean when it is digitized, this leads to a spurious "detection" exactly at the phase centre, which is where the antennas are pointed and where the system thinks the target should be.
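A toy simulation of that DC-offset effect (the offset value and the simple complex-noise model are my own illustrative assumptions): two completely independent noise streams that share a non-zero mean correlate with each other, mimicking a source at the phase centre.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

dc = 0.3 + 0.0j   # hypothetical non-zero mean ("DC offset") after digitization

# Two antennas seeing *no* common sky signal, but each with the same offset.
v1 = dc + rng.normal(size=n) + 1j * rng.normal(size=n)
v2 = dc + rng.normal(size=n) + 1j * rng.normal(size=n)

corr = (v1 * np.conj(v2)).mean()

# The offsets correlate with each other, producing a spurious "detection"
# of strength |dc|^2 = 0.09, even though the noise streams are independent.
print(abs(corr))  # close to 0.09
```

Offsetting the pointing from the target coordinates, as described above, moves this artefact away from where the real source should appear.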