Beamforming or spatial filtering is a signal processing technique used in sensor arrays for directional signal transmission or reception. This is achieved by combining elements in an antenna array in such a way that signals at particular angles experience constructive interference while others experience destructive interference. Beamforming can be used at both the transmitting and receiving ends in order to achieve spatial selectivity. The improvement compared with omnidirectional reception/transmission is known as the directivity of the array.
Beamforming can be used for radio or sound waves. It has found numerous applications in radar, sonar, seismology, wireless communications, radio astronomy, acoustics and biomedicine. Adaptive beamforming is used to detect and estimate the signal of interest at the output of a sensor array by means of optimal (e.g. least-squares) spatial filtering and interference rejection.
To change the directionality of the array when transmitting, a beamformer controls the phase and relative amplitude of the signal at each transmitter, in order to create a pattern of constructive and destructive interference in the wavefront. When receiving, information from different sensors is combined in a way where the expected pattern of radiation is preferentially observed.
In passive sonar, and in reception in active sonar, the beamforming technique involves combining delayed signals from each hydrophone at slightly different times (the hydrophone closest to the target will be combined after the longest delay), so that every signal reaches the output at exactly the same time, making one loud signal, as if the signal came from a single, very sensitive hydrophone. Receive beamforming can also be used with microphones or radar antennas.
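The time alignment just described can be sketched in a few lines of NumPy. Everything below (sample rate, element spacing, arrival angle, the use of whole-sample delays and circularly wrapping shifts) is an illustrative assumption, not a real sonar front end:

```python
import numpy as np

# Sketch of receive-side delay-and-sum: a plane wave hits a 4-element
# line array; each element hears it slightly later than the last.
fs = 48_000                      # sample rate (Hz), assumed
c = 343.0                        # wave speed (m/s), assumed
spacing = 0.05                   # element spacing (m), assumed
theta = np.deg2rad(30)           # arrival angle from broadside, assumed

t = np.arange(480) / fs          # 10 ms of signal
source = np.sin(2 * np.pi * 1000 * t)

# Each element sees the wave later by m*d*sin(theta)/c; for simplicity
# the delays are rounded to whole samples and applied as circular shifts.
delays = np.round(np.arange(4) * spacing * np.sin(theta) / c * fs).astype(int)
channels = [np.roll(source, d) for d in delays]      # simulated receptions

# Beamform: advance each channel by its known delay, then sum.
aligned = [np.roll(ch, -d) for ch, d in zip(channels, delays)]
beam = np.sum(aligned, axis=0)   # coherent sum: 4x the per-channel amplitude
```

The channel closest to the source gets the largest compensating advance, which is the "combined after the longest delay" behavior described above.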
Conventional beamformers, such as the Butler matrix, use a fixed set of weightings and time-delays (or phasings) to combine the signals from the sensors in the array, primarily using only information about the location of the sensors in space and the wave directions of interest. In contrast, adaptive beamforming techniques (e.g., MUSIC, SAMV) generally combine this information with properties of the signals actually received by the array, typically to improve rejection of unwanted signals from other directions. This process may be carried out in either the time or the frequency domain.
Sonar beamforming uses a technique similar to electromagnetic beamforming, but differs considerably in implementation details. Sonar applications range from 1 Hz to as high as 2 MHz, and array elements may be few and large, or number in the hundreds yet be very small. This shifts sonar beamforming design effort significantly between the demands of system components such as the "front end" (transducers, pre-amplifiers and digitizers) and the beamforming computational hardware downstream. High-frequency, focused-beam, multi-element imaging-search sonars and acoustic cameras often implement fifth-order spatial processing that places strains equivalent to Aegis radar demands on the processors.
Sonar differs from radar in that in some applications, such as wide-area search, all directions often need to be listened to (and in some applications broadcast to) simultaneously. Thus a multibeam system is needed. In a narrowband sonar receiver the phases for each beam can be manipulated entirely in signal-processing software, whereas present radar systems typically use hardware to 'listen' in a single direction at a time.
The delay-and-sum beamforming technique uses multiple microphones to localize sound sources. One disadvantage of this technique is that adjusting the position or the number of microphones changes the performance of the beamformer nonlinearly. Additionally, due to the number of possible combinations, it is computationally hard to find the best configuration. One technique for solving this problem is the use of genetic algorithms. Such an algorithm searches for the microphone-array configuration that provides the highest signal-to-noise ratio for each steered orientation. Experiments showed that such an algorithm could find the best configuration in a constrained search space comprising ~33 million solutions in a matter of seconds instead of days.
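The search the paragraph describes can be sketched with a toy mutation-only evolutionary loop (a simplified stand-in for a full genetic algorithm). The candidate grid, frequency, angles, population sizes, and the fitness proxy used here (depth of the delay-and-sum null toward an interferer) are all illustrative assumptions, not the setup of the cited experiments:

```python
import numpy as np

# Toy evolutionary search for a microphone layout: pick K positions from
# a candidate grid so a delay-and-sum beam steered at theta0 best rejects
# an interferer at theta1. All numbers are illustrative assumptions.
rng = np.random.default_rng(0)
c, f = 343.0, 2000.0                        # sound speed (m/s), tone (Hz)
grid = np.linspace(0.0, 0.5, 24)            # 24 candidate positions (m)
K = 6                                       # microphones to place
theta0, theta1 = np.deg2rad(0.0), np.deg2rad(40.0)

def fitness(idx):
    # Residual array response toward the interferer after steering to
    # theta0; smaller magnitude means a deeper null, so negate it.
    p = grid[sorted(idx)]
    phase = 2 * np.pi * f * p * (np.sin(theta1) - np.sin(theta0)) / c
    return -abs(np.exp(1j * phase).sum())

def mutate(idx):
    # Swap one chosen position for an unused one.
    chosen = set(idx)
    chosen.remove(int(rng.choice(sorted(chosen))))
    unused = [g for g in range(len(grid)) if g not in chosen]
    chosen.add(int(rng.choice(unused)))
    return frozenset(chosen)

pop = [frozenset(rng.choice(len(grid), K, replace=False).tolist())
       for _ in range(30)]
for _ in range(60):                         # generations
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]                    # selection
    pop = survivors + [mutate(survivors[int(rng.integers(10))])
                       for _ in range(20)]
best = max(pop, key=fitness)                # indices of the chosen positions
```

A full genetic algorithm would add crossover between parents; the selection-plus-mutation loop above is kept minimal to show the shape of the search.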
For reception (but not transmission), there is a distinction between analog and digital beamforming. For example, if there are 100 sensor elements, the "digital beamforming" approach entails passing each of the 100 signals through an analog-to-digital converter to create 100 digital data streams. These data streams are then added up digitally, with appropriate scale factors or phase shifts, to obtain the composite signal. By contrast, the "analog beamforming" approach entails taking the 100 analog signals, scaling or phase-shifting them using analog methods, summing them, and then usually digitizing the single output data stream.
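The digital approach lends itself to a direct sketch. Below, an 8-element half-wavelength line array stands in for the 100 elements of the example; the complex-baseband snapshot model and the angles are illustrative assumptions, and each look direction is just one weighted sum over the digitized channels:

```python
import numpy as np

# Sketch of narrowband digital beamforming: every element is digitized,
# then each desired look direction is one complex-weighted sum.
N, d_over_lambda = 8, 0.5                      # elements, spacing (assumed)
theta_src = np.deg2rad(20)                     # actual arrival angle (assumed)
n = np.arange(N)

# Complex-baseband snapshot from a unit-amplitude plane wave.
snapshot = np.exp(2j * np.pi * d_over_lambda * n * np.sin(theta_src))

def steer(theta):
    # Weights that phase-align the array toward angle theta.
    return np.exp(-2j * np.pi * d_over_lambda * n * np.sin(theta)) / N

# Because the data are digital, many beams can be formed from one snapshot.
beams = {deg: abs(steer(np.deg2rad(deg)) @ snapshot) for deg in (0, 20, 40)}
# The beam steered at the true angle (20 deg) sums coherently to amplitude 1.
```

The dictionary comprehension is the point: the same digitized snapshot is reused for every look direction, which is exactly the parallelism the next paragraph attributes to digital beamforming.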
Digital beamforming has the advantage that the digital data streams (100 in this example) can be manipulated and combined in many possible ways in parallel, to get many different output signals in parallel. The signals from every direction can be measured simultaneously, and the signals can be integrated for a longer time when studying far-off objects and simultaneously integrated for a shorter time to study fast-moving close objects, and so on. This cannot be done as effectively for analog beamforming, not only because each parallel signal combination requires its own circuitry, but more fundamentally because digital data can be copied perfectly but analog data cannot. (There is only so much analog power available, and amplification adds noise.) Therefore, if the received analog signal is split up and sent into a large number of different signal combination circuits, it can reduce the signal-to-noise ratio of each.
Compared to carrier-wave telecommunications, natural audio contains a variety of frequencies. It is advantageous to separate frequency bands prior to beamforming because different frequencies have different optimal beamform filters (and hence can be treated as separate problems, in parallel, and then recombined afterward). Properly isolating these bands involves specialized non-standard filter banks. In contrast, for example, the standard fast Fourier transform (FFT) band-filters implicitly assume that the only frequencies present in the signal are exact harmonics; frequencies which lie between these harmonics will typically activate all of the FFT channels (which is not what is wanted in a beamform analysis). Instead, filters can be designed in which only local frequencies are detected by each channel (while retaining the recombination property to be able to reconstruct the original signal), and these are typically non-orthogonal unlike the FFT basis.
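The leakage behavior described above can be checked numerically. This sketch (with an assumed 64-point FFT and an assumed 1% magnitude threshold for calling a channel "active") shows a tone on an exact harmonic occupying one FFT channel, while a tone midway between harmonics spreads across many:

```python
import numpy as np

# Numerical check of the claim above: a tone exactly on an FFT bin lands
# in one channel, while a tone between bins leaks into many channels.
Nfft = 64
t = np.arange(Nfft)
on_bin = np.sin(2 * np.pi * 8 / Nfft * t)      # exactly bin 8
off_bin = np.sin(2 * np.pi * 8.5 / Nfft * t)   # halfway between bins 8 and 9

def active_bins(x, threshold=0.01):
    # Count channels whose magnitude exceeds 1% of the peak (assumed cutoff).
    mag = np.abs(np.fft.rfft(x))
    return int(np.sum(mag > threshold * mag.max()))

# active_bins(on_bin) is 1; active_bins(off_bin) is far larger.
```

This is why beamforming front ends often prefer filter banks whose channels respond only to local frequencies, at the cost of a non-orthogonal basis.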
A sensor array is a group of sensors, usually deployed in a certain geometric pattern, used for collecting and processing electromagnetic or acoustic signals. The advantage of using a sensor array over a single sensor lies in the fact that an array adds new dimensions to the observation, helping to estimate more parameters and improve estimation performance. For example, an array of radio antenna elements used for beamforming can increase antenna gain in the direction of the signal while decreasing the gain in other directions, i.e., increasing the signal-to-noise ratio (SNR) by amplifying the signal coherently. Another example of sensor array application is estimating the direction of arrival of impinging electromagnetic waves. The related processing method is called array signal processing. A third example is chemical sensor arrays, which utilize multiple chemical sensors for fingerprint detection in complex mixtures or sensing environments. Application examples of array signal processing include radar/sonar, wireless communications, seismology, machine condition monitoring, astronomical observation, fault diagnosis, etc.
Using array signal processing, the temporal and spatial properties (or parameters) of the impinging signals, which are corrupted by noise and hidden in the data collected by the sensor array, can be estimated and revealed. This is known as parameter estimation.
Each sensor is associated with a different delay. The delays are small but not trivial; in the frequency domain, they appear as phase shifts among the signals received by the sensors. The delays are closely related to the incident angle and the geometry of the sensor array, so given the geometry of the array, the delays or phase differences can be used to estimate the incident angle. This relationship is the mathematical basis of array signal processing. Simply summing the signals received by the M sensors and calculating the mean value,

    ȳ(t) = (1/M) Σ_{m=1}^{M} x_m(t),

does not enhance the signal, because each x_m(t) arrives with a different delay. If instead each channel is time-shifted by a well-chosen delay Δt_m that compensates for its travel-time difference before summing,

    ȳ(t) = (1/M) Σ_{m=1}^{M} x_m(t + Δt_m),

the signal components add constructively and the result is an enhanced signal. This process of time-shifting and summing is called beamforming. In addition to the delay-and-sum approach described above, a number of spectral-based (non-parametric) approaches and parametric approaches exist which improve various performance metrics. These beamforming algorithms are briefly described as follows.
If a time delay is added to the recorded signal from each microphone that is equal and opposite to the delay caused by the additional travel time, the result is signals that are perfectly in phase with each other. Summing these in-phase signals results in constructive interference that improves the SNR by a factor of the number of elements in the array. This is known as delay-and-sum beamforming. For direction of arrival (DOA) estimation, one can iteratively test time delays for all possible directions. If the guess is wrong, the signals interfere destructively, resulting in a diminished output signal, but a correct guess results in the signal amplification described above.
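The iterative delay-testing search can be sketched as follows. The array geometry, sample rate, tone, and 5-degree scan grid are illustrative assumptions, and for brevity delays are rounded to whole samples and applied as circular shifts:

```python
import numpy as np

# DOA estimation by exhaustively testing steering delays: for each
# candidate direction, undo that direction's delays and keep the one
# whose summed output has the most power.
fs, c, spacing, M = 48_000, 343.0, 0.05, 4     # assumed array parameters
t = np.arange(960) / fs                        # 20 ms of signal
tone = np.sin(2 * np.pi * 1500 * t)
true_deg = 30                                  # assumed true arrival angle

def sample_delays(deg):
    # Whole-sample travel-time delays for a plane wave from angle deg.
    return np.round(np.arange(M) * spacing
                    * np.sin(np.deg2rad(deg)) / c * fs).astype(int)

channels = [np.roll(tone, d) for d in sample_delays(true_deg)]

def steered_power(deg):
    # Advance each channel by the candidate delays, sum, measure power.
    aligned = [np.roll(ch, -d) for ch, d in zip(channels, sample_delays(deg))]
    beam = np.sum(aligned, axis=0)
    return float(np.mean(beam ** 2))

scan = np.arange(-90, 91, 5)                   # candidate directions (deg)
estimate = scan[int(np.argmax([steered_power(a) for a in scan]))]
# Output power peaks at the true angle; wrong guesses sum destructively.
```

A correct guess aligns all channels, so the output power grows as the square of the element count, while incorrect guesses leave the channels out of phase and the power low, which is exactly the contrast the paragraph describes.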