In a recent paper published in IEEE Signal Processing Letters, researchers Steven Kay and Yazan Rawashdeh introduce a novel random variable normalizing transformation. This transformation converts an arbitrary random variable into an approximately standard normal one using the cumulant generating function (CGF), or more precisely its Legendre transform. The work is motivated by the need to address deficiencies in the generalized likelihood ratio test (GLRT), particularly for model-order selection and multifamily detection.

Motivation

The traditional generalized likelihood ratio test (GLRT) tends to favor the most complex model, because it maximizes the likelihood function across nested hypotheses. This poses a significant challenge in applications requiring model-order selection or multifamily detection, where test statistics with different distributions must be compared on an equal footing. The authors propose a transformation that mitigates this issue by normalizing these test statistics, making them comparable on a uniform scale.

Mathematical Foundation

At the heart of this transformation is the cumulant generating function (CGF), $K_X(\eta)$, defined as:

$$ K_X(\eta) = \ln E\left[e^{\eta X}\right] $$
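
As a concrete example (a standard result, not specific to the paper), a chi-squared random variable with $k$ degrees of freedom has the CGF

$$ K_X(\eta) = -\frac{k}{2}\ln(1 - 2\eta), \qquad \eta < \tfrac{1}{2}. $$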

The key step involves computing the Legendre transform (here also the convex conjugate function) of the CGF, $K^*_X(x)$, given by:

$$ K^*_X(x) = \sup_{\eta}\left\{\eta x - K_X(\eta)\right\} $$
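
In general $K^*_X(x)$ has no closed form and must be evaluated numerically by maximizing $\eta x - K_X(\eta)$ over $\eta$. Below is a minimal sketch (my own illustration, not code from the paper) that does this for the chi-squared CGF above and checks it against the known closed form:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cgf_chi2(eta, k):
    """CGF of a chi-squared r.v. with k degrees of freedom (valid for eta < 1/2)."""
    return -0.5 * k * np.log(1.0 - 2.0 * eta)

def legendre_transform(x, cgf, eta_hi=0.5):
    """Numerically evaluate K*(x) = sup_eta { eta*x - K(eta) }."""
    # Maximize by minimizing the negated objective over the CGF's domain.
    res = minimize_scalar(lambda eta: -(eta * x - cgf(eta)),
                          bounds=(-50.0, eta_hi - 1e-9), method="bounded")
    return -res.fun

k, x = 4, 7.0
print(legendre_transform(x, lambda eta: cgf_chi2(eta, k)))  # numerical K*(x)
print(0.5 * (x - k + k * np.log(k / x)))                    # closed form, ~0.381
```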

This transformation effectively normalizes the random variable, converting it into an approximately standard normal one. For example, it transforms a chi-squared random variable with $k$ degrees of freedom into one with a single degree of freedom, thereby facilitating its use in the GLRT without the inherent bias towards complex models.
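
To make the chi-squared example explicit (a straightforward calculation under the definitions above, not reproduced from the paper): maximizing $\eta x - K_X(\eta)$ for the $\chi^2_k$ CGF gives $\hat{\eta} = \frac{1}{2}(1 - k/x)$, which yields the closed form

$$ 2K^*_X(x) = x - k - k\ln\frac{x}{k}, $$

and evaluating this at $X \sim \chi^2_k$ produces a statistic that behaves approximately like a single-degree-of-freedom chi-squared variable.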

Theorem

The central theorem of the paper characterizes the asymptotic distribution of the transformed variable for a scalar random variable $X$ with mean $\mu$ and variance $\sigma^2$. Specifically, for $X_N = \sum_{i=1}^N U_i$, where the $U_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$, define:

$$ Y_N = \begin{cases} 2\,K^*_{X_N}(X_N), & X_N \geq N\mu \\ 0, & X_N < N\mu \end{cases} $$

As $N \to \infty$, the distribution of $Y_N$ converges to a mixture: with probability 1/2 it equals zero (the event that $X_N$ falls below its mean $N\mu$), and otherwise it is distributed as a chi-squared variable with one degree of freedom.
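
A quick Monte Carlo sanity check of this limit (my own sketch; the exponential choice of $U_i$ is arbitrary, not an example from the paper): for $U_i \sim \mathrm{Exp}(1)$ we have $X_N \sim \mathrm{Gamma}(N, 1)$, whose CGF $K_{X_N}(\eta) = -N\ln(1-\eta)$ gives $K^*_{X_N}(x) = x - N + N\ln(N/x)$ in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 200, 100_000

# X_N = sum of N i.i.d. Exp(1) variables, i.e. X_N ~ Gamma(N, 1) with mean N.
X = rng.gamma(N, 1.0, size=trials)

# Closed-form Legendre transform of the Gamma(N, 1) CGF.
Kstar = X - N + N * np.log(N / X)

# One-sided transform from the theorem: zero whenever X_N is below its mean.
Y = np.where(X > N, 2.0 * Kstar, 0.0)

print("P(Y = 0) ~", np.mean(Y == 0))           # should approach 1/2
print("mean of nonzero Y ~", Y[Y > 0].mean())  # chi^2_1 has mean 1
```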

Along the same lines, $K^*_X(x)$ can be used to transform a sum of i.i.d. random variables into an approximately $\mathcal{N}(0,1)$ random variable:

$$ Z_N = \operatorname{sgn}\left(X_N - N\mu\right)\sqrt{2\,K^*_{X_N}(X_N)} $$

Asymptotically, the PDF of $Z_N$ then converges to the $\mathcal{N}(0,1)$ PDF. A proof of the theorem is given in Appendix B of the paper.
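
Continuing the same hypothetical exponential example, a quick check that the signed square root recovers both tails and lands close to standard normal:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N, trials = 200, 100_000

X = rng.gamma(N, 1.0, size=trials)    # X_N ~ Gamma(N, 1), so N*mu = N
Kstar = X - N + N * np.log(N / X)     # Legendre transform of the CGF

# The normalizing transformation: signed square root of 2 K*(X_N).
Z = np.sign(X - N) * np.sqrt(2.0 * Kstar)

print("mean ~", Z.mean(), " var ~", Z.var())        # should be near 0 and 1
print("KS distance to N(0,1):", stats.kstest(Z, "norm").statistic)
```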

Practical Implications

The proposed transformation is shown to be particularly effective in scenarios involving model-order selection and anomaly detection, where background statistics vary significantly. By normalizing the test statistics, it ensures a fair comparison across different models, enhancing the robustness and reliability of the GLRT.
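
As a hypothetical illustration of the practical point (my own sketch, not an experiment from the paper): raw GLRT statistics with $\chi^2_1$ and $\chi^2_5$ null distributions are not directly comparable, but after each passes through its own family's transformation they live on a common, approximately standard-normal scale.

```python
import numpy as np

def normalize_chi2_stat(t, k):
    """Map a statistic with a chi^2_k null distribution onto an approximately
    N(0,1) scale: Z = sgn(t - k) * sqrt(2 K*(t)), 2 K*(t) = t - k - k*ln(t/k)."""
    return np.sign(t - k) * np.sqrt(t - k - k * np.log(t / k))

# Hypothetical observed statistics from two candidate models.
t_simple, k_simple = 4.2, 1    # simpler model, chi^2_1 null
t_complex, k_complex = 9.1, 5  # more complex model, chi^2_5 null

# The raw values favor the complex model (9.1 > 4.2), but on the normalized
# scale the simpler model's statistic is the more extreme one (~1.33 vs ~1.05).
print(normalize_chi2_stat(t_simple, k_simple))
print(normalize_chi2_stat(t_complex, k_complex))
```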

Conclusion

This new normalizing transformation marks a significant advancement in signal processing and statistical hypothesis testing. It not only addresses the limitations of the traditional GLRT but also opens new avenues for accurate and reliable model comparison in various applications.