Doppler reflectometry, also known as the Doppler backscattering (DBS) diagnostic, enables the measurement of turbulent density fluctuations at intermediate length scales, typically $1 \lesssim k_\perp \rho_i \lesssim 10$; here $k_\perp$ is the turbulence’s perpendicular wavenumber and $\rho_i$ is the ion gyroradius. The backscattered power is proportional to the power of the turbulent density fluctuations. However, when the probe beam’s wavevector is not aligned perpendicular to the magnetic field, the backscattered power is attenuated. The degree of misalignment is quantified by the mismatch angle, and the resulting reduction in power is the associated mismatch attenuation. In previous work, we used a beam model to derive a quantitative dependence of the mismatch attenuation on the mismatch angle, and preliminary comparisons with results from MAST were promising. In this article, we analyse MAST DBS data for various frequency channels and at various times, demonstrating that the beam model can indeed properly account for the mismatch attenuation. Interestingly, we show that mismatch attenuation can significantly affect signal localisation, shifting the point of highest backscattered power away from the cut-off. This is especially important for spherical tokamaks, since the magnetic pitch angle varies significantly both spatially and temporally.