Distributed Rao-Blackwellized point mass filter for blind equalization in receiver networks

We describe new Bayesian algorithms for cooperative blind equalization in a network in which the signal broadcast by a single transmitter is received by multiple remote nodes through distinct frequency-selective channels. The algorithms are based on Rao-Blackwellized point mass filters, and approximate the posterior densities of the unknown channel parameters by Gaussian mixtures of fixed order. To keep computations tractable, the mixtures are reduced by a modified version of West's algorithm. A reduced-complexity approach that employs a single-mode approximation of some remote quantities is also considered. Via numerical simulations, we verify that the proposed algorithms outperform certain particle-filtering-based algorithms with comparable communication loads.


INTRODUCTION
Distributed estimation has attracted much interest due to the potential robustness and performance gains provided by these techniques. Most original distributed estimation methods [1, 2] were linear, but recently there has been a surge of interest in nonlinear methods [3][4][5]. Nonlinear methods generally lead to more accurate estimation results for nonlinear or non-Gaussian signal models, at the cost of heavier computational and communication burdens.
We consider in this paper the use of Rao-Blackwellized point mass filters (PMFs) [6][7][8]. PMFs are deterministic, and recursively evaluate the posterior probability of all possible current states of the hidden (discrete) random variable given the set of observations. The complexity of PMFs is proportional to the number of possible state transitions and, in many setups, is much smaller than that of particle filters, whose complexity is proportional to the required number of particles. Contrary to particle filters, PMFs may need to approximate the posterior distributions of the unknown model parameters to keep computations feasible [7]. In the considered blind equalization scenario, those posteriors are Gaussian mixtures of exponentially growing size, which are approximated by mixtures of constant size using mixture reduction algorithms [9].
In this paper, we develop novel distributed blind equalization approaches [3] based on PMFs. The proposed methods extend previous ones [6, 10] to a signal model with multiple remote receivers, and exhibit reduced complexity compared to [11]. The algorithms do not demand a fusion center: each individual receiver processes its observations independently and exchanges information to approximate the optimal joint estimate of the transmitted data given all observations. As we verify via numerical simulations, the proposed algorithms surpass certain approaches based on particle filters with comparable communication requirements.
The remainder of the text is organized as follows: in Section 2 we describe the considered problem setup, and in Section 3 we present the PMF. Section 4 introduces a distributed blind equalization algorithm based on the PMF. Section 5, in turn, describes a second approach aimed at reducing communication complexity. In Section 6 we assess the performance of the proposed methods, and we finally draw our conclusions in Section 7.

PROBLEM SETUP
Denote by x_t ∈ {±1} the differentially encoded binary symbol transmitted at time instant t, corresponding to {b_t}, an independent, identically distributed (i.i.d.) binary bit sequence. The observations y_{r,t}, 1 ≤ r ≤ R, at the r-th node of a network of R receivers are assumed to be the output of the additive-noise frequency-selective channel

y_{r,t} = h_r^T x_t + v_{r,t},   (1)

where x_t ≜ [x_t . . . x_{t−L+1}]^T, L denotes the channel order, v_{r,t} is an i.i.d. zero-mean Gaussian random process of known variance σ_r², and the unknown, random channel impulse response parameter vector h_r ∈ R^{L×1} is assumed to be distributed a priori as h_r ∼ N(h_r | 0; I/ε²), where N denotes a (multivariate) Gaussian p.d.f. and ε is the model's hyper-parameter. The random quantities x_t, h_1, . . ., h_R, v_{1,t}, . . ., v_{R,t} are presumed to be mutually independent. Under these assumptions, our objective is to develop a recursive method for obtaining the maximum a posteriori estimates b̂_t = arg max_{b_t} p(b_t | Y_t), where Y_t ≜ {y_{1,1:t}, . . ., y_{R,1:t}} and y_{r,1:t} ≜ {y_{r,1}, . . ., y_{r,t}}. In the sequel, we also employ the notation y_t ≜ y_{1:R,t} and h ≜ h_{1:R}.
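As a concrete illustration, the signal model above can be simulated as follows. This is a minimal sketch: the initialization x_t = 1 for t ≤ 0 and the specific values of R, L, T, ε, and σ_r² are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

R, L, T = 3, 2, 200        # receivers, channel order, number of symbols
eps = 1.0                  # hyper-parameter: h_r ~ N(0, I / eps^2)
sigma2 = np.full(R, 0.01)  # known noise variances sigma_r^2

# Differential encoding of an i.i.d. bit sequence {b_t} into x_t in {-1, +1}
b = rng.integers(0, 2, T)
x = np.empty(T)
x[0] = 2 * b[0] - 1
for t in range(1, T):
    x[t] = x[t - 1] * (2 * b[t] - 1)

# Unknown channels drawn from the prior h_r ~ N(0, I / eps^2)
h = rng.normal(0.0, 1.0 / eps, size=(R, L))

# y_{r,t} = h_r^T [x_t, ..., x_{t-L+1}]^T + v_{r,t}
x_pad = np.concatenate([np.ones(L - 1), x])  # assume x_t = 1 for t <= 0
y = np.empty((R, T))
for t in range(T):
    xt = x_pad[t : t + L][::-1]              # [x_t, x_{t-1}, ..., x_{t-L+1}]
    y[:, t] = h @ xt + rng.normal(0.0, np.sqrt(sigma2))
```

Each row of `y` is the observation sequence of one receiver, driven by the common symbol stream through its own channel.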

RAO-BLACKWELLIZED POINT MASS FILTER
The Rao-Blackwellized Point Mass Filter (PMF) [7] allows one to recursively evaluate the posterior discrete probability p(x_t | Y_t). Suppose that, at time instant t − 1, we have p(x_{t−1} | Y_{t−1}) and p(h | x_{t−1}, Y_{t−1}). For t = 1, these quantities reduce to the prior symbol probabilities p(x_0) and the prior probability density function p(h), respectively. The PMF first computes

p(y_t, x_t | Y_{t−1}) = Σ_{x_{t−1}} p(x_t | x_{t−1}) p(x_{t−1} | Y_{t−1}) ∫ p(y_t | x_t, h) p(h | x_{t−1}, Y_{t−1}) dh.   (2)

The updated conditional probability of the parameters h can then be evaluated as

p(h | x_t, Y_t) ∝ Σ_{x_{t−1}} p(y_t | x_t, h) p(h | x_{t−1}, Y_{t−1}) p(x_t | x_{t−1}) p(x_{t−1} | Y_{t−1}),   (3)

and the updated posterior of the states x_t as

p(x_t | Y_t) = p(y_t, x_t | Y_{t−1}) / Σ_{x_t} p(y_t, x_t | Y_{t−1}).   (4)

For the recursion (2)-(4) to be of practical use, x_t must be discretely distributed on a finite number of points and the integral in h must be tractable. This generally requires that some form of approximation be applied to p(h | x_t, Y_t).
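When there are no unknown parameters h, the recursion above reduces to the classical forward filter over a discrete state grid: predict through the transition kernel, weight by the observation likelihood, and renormalize. A minimal sketch with an assumed two-state transition kernel and assumed likelihood values:

```python
import numpy as np

def pmf_step(prior, trans, lik):
    """One point-mass filtering step for a purely discrete state:
    predict through the transition kernel, weight by the observation
    likelihood, and renormalize."""
    pred = trans.T @ prior      # p(x_t | Y_{t-1}) = sum_j p(x_t | x_j) p(x_j | Y_{t-1})
    joint = lik * pred          # proportional to p(y_t, x_t | Y_{t-1})
    return joint / joint.sum()  # p(x_t | Y_t)

# Illustrative two-state example (values assumed, not from the paper)
trans = np.array([[0.9, 0.1],
                  [0.2, 0.8]])     # trans[i, j] = p(x_t = j | x_{t-1} = i)
prior = np.array([0.5, 0.5])       # p(x_{t-1} | Y_{t-1})
lik = np.array([0.7, 0.1])         # p(y_t | x_t = i)
post = pmf_step(prior, trans, lik)
```

With the values above, the observation strongly favors the first state, so the posterior mass concentrates there.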

PMF-BASED DISTRIBUTED BLIND EQUALIZATION ALGORITHM
Using Equation (1) and exploiting conditional independence relations, it follows that, for a given pair of possible states x_t^(i), x_{t−1}^(j),

p(y_t | x_t^(i), h) = Π_{r=1}^{R} p(y_{r,t} | x_t^(i), h_r),   (5)

with

p(y_{r,t} | x_t^(i), h_r) = N(y_{r,t} | (x_t^(i))^T h_r; σ_r²).   (6)

For the integral in (3) to have a closed analytical expression, we approximate p(h | x_{t−1}^(j), Y_{t−1}) by the product-form Gaussian mixture

p(h | x_{t−1}^(j), Y_{t−1}) ≈ Σ_{m=1}^{M} w_{t−1}^(j,m) Π_{r=1}^{R} N(h_r | ĥ_{r,t−1}^(j,m); Σ_{r,t−1}^(j,m)),   (7)

where the weights w_{t−1}^(j,m) are nonnegative and sum to one. Substituting (5)-(7) into (2), it follows that

p(y_t, x_t^(i), h | Y_{t−1}) = Σ_j Σ_m p(x_t^(i) | x_{t−1}^(j)) p(x_{t−1}^(j) | Y_{t−1}) w_{t−1}^(j,m) Π_r N(y_{r,t} | (x_t^(i))^T h_r; σ_r²) N(h_r | ĥ_{r,t−1}^(j,m); Σ_{r,t−1}^(j,m))   (8)

= Σ_j Σ_m p(x_t^(i) | x_{t−1}^(j)) p(x_{t−1}^(j) | Y_{t−1}) w_{t−1}^(j,m) Π_r λ_{r,t}^(i,j,m) N(h_r | ĥ_{r,t}^(i,j,m); Σ_{r,t}^(i,j,m)),   (9)

where, to obtain (9) from (8), we exploited standard Kalman filter theory results (see [12] for details), namely [12]

γ_{r,t}^(i,j,m) = y_{r,t} − (x_t^(i))^T ĥ_{r,t−1}^(j,m),   (10)
s_{r,t}^(i,j,m) = (x_t^(i))^T Σ_{r,t−1}^(j,m) x_t^(i) + σ_r²,   (11)
ĥ_{r,t}^(i,j,m) = ĥ_{r,t−1}^(j,m) + Σ_{r,t−1}^(j,m) x_t^(i) γ_{r,t}^(i,j,m) / s_{r,t}^(i,j,m),   (12)
Σ_{r,t}^(i,j,m) = Σ_{r,t−1}^(j,m) − Σ_{r,t−1}^(j,m) x_t^(i) (x_t^(i))^T Σ_{r,t−1}^(j,m) / s_{r,t}^(i,j,m),   (13)

with ĥ_{r,0}^(j,m) = 0 and Σ_{r,0}^(j,m) = I/ε². To integrate h out of (9), it suffices to discard the Gaussian densities in h_r. Performing that operation and substituting the result into (4), we obtain

p(x_t^(i) | Y_t) ∝ Σ_j Σ_m w_t^(i,j,m),   (14)

where

w_t^(i,j,m) = p(x_t^(i) | x_{t−1}^(j)) p(x_{t−1}^(j) | Y_{t−1}) w_{t−1}^(j,m) Π_{r=1}^{R} λ_{r,t}^(i,j,m)   (15)

and

λ_{r,t}^(i,j,m) = N(y_{r,t} | (x_t^(i))^T ĥ_{r,t−1}^(j,m); s_{r,t}^(i,j,m)).   (16)

Similarly, it follows from (3) that

p(h | x_t^(i), Y_t) ∝ Σ_j Σ_m w_t^(i,j,m) Π_{r=1}^{R} N(h_r | ĥ_{r,t}^(i,j,m); Σ_{r,t}^(i,j,m)).   (17)

We wish to show that (17) has the same functional form as (7). To verify that, we introduce the unitary mapping (i, j, m) → (i, m′) ≜ (i, (j − 1)M + m), 1 ≤ m′ ≤ M N_j, where N_j denotes the number of possible predecessor states x_{t−1}^(j) given x_t^(i) (2 for binary systems). This allows (17) to be rewritten as

p(h | x_t^(i), Y_t) ∝ Σ_{m′} w_t^(i,m′) Π_{r=1}^{R} N(h_r | ĥ_{r,t}^(i,m′); Σ_{r,t}^(i,m′)),   (18)

which is similar to (7) except for the increased number of mixture components. As a consequence, if no further approximations are made to (18), the complexity of the PMF grows exponentially with time.
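The per-component measurement update exploited above (scalar observation, vector parameter) can be sketched as follows; the function name and the test values are illustrative assumptions, but the formulas are the standard Kalman update used for each Gaussian component.

```python
import numpy as np

def kalman_scalar_update(h_hat, Sigma, x_vec, y, sigma2):
    """Update one Gaussian component N(h_r | h_hat, Sigma) with the scalar
    observation y = x_vec^T h_r + v, v ~ N(0, sigma2).  Returns the updated
    moments and the predictive likelihood lambda = N(y | x_vec^T h_hat, s)."""
    s = x_vec @ Sigma @ x_vec + sigma2          # innovation variance
    gamma = y - x_vec @ h_hat                   # innovation
    k = Sigma @ x_vec / s                       # Kalman gain
    h_new = h_hat + k * gamma
    Sigma_new = Sigma - np.outer(k, x_vec) @ Sigma
    lam = np.exp(-0.5 * gamma**2 / s) / np.sqrt(2.0 * np.pi * s)
    return h_new, Sigma_new, lam

# One update starting from the prior moments h_hat = 0, Sigma = I (eps = 1)
h1, S1, lam = kalman_scalar_update(
    np.zeros(2), np.eye(2), np.array([1.0, -1.0]), 0.5, 0.1)
```

Running a bank of such updates, one per surviving hypothesis (i, j, m) and per receiver r, produces exactly the quantities needed to form the weights and the updated mixture.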

Mixture Reduction
To keep the computational complexity of the PMF tractable, we follow [6, 7] and reduce the mixture in (18) to M components. This can be carried out by several algorithms (see [9] for a review), which differ in performance and computational complexity.
We propose to use a new version of West's algorithm [13] adapted to product mixtures. The algorithm is iterative: take a mixture as in (18) with M̄ components. If M̄ = M, finish. Otherwise, if M̄ > M, determine m* = arg min_m w_t^(i,m). Then, for all m′ ≠ m*, evaluate the sum of symmetrized Kullback-Leibler divergences [14]

d(m*, m′) = Σ_{r=1}^{R} [ D_KL(N_r^(m*) || N_r^(m′)) + D_KL(N_r^(m′) || N_r^(m*)) ],   (19)

where D_KL(·||·) denotes the Kullback-Leibler divergence [13] and N_r^(m) ≜ N(h_r | ĥ_{r,t}^(i,m); Σ_{r,t}^(i,m)). Next, the index of the nearest component is determined as m̄ = arg min_{m′ ≠ m*} d(m*, m′). After that, the components with indexes m* and m̄ are merged via marginal moment matching [13]:

w̄_t^(i,m̄) = w_t^(i,m*) + w_t^(i,m̄),   (20)
h̄_{r,t}^(i,m̄) = [ w_t^(i,m*) ĥ_{r,t}^(i,m*) + w_t^(i,m̄) ĥ_{r,t}^(i,m̄) ] / w̄_t^(i,m̄),   (21)
Σ̄_{r,t}^(i,m̄) = Σ_{n ∈ {m*, m̄}} ( w_t^(i,n) / w̄_t^(i,m̄) ) [ Σ_{r,t}^(i,n) + ( ĥ_{r,t}^(i,n) − h̄_{r,t}^(i,m̄) )( ĥ_{r,t}^(i,n) − h̄_{r,t}^(i,m̄) )^T ].   (22)

After evaluating (20)-(22), the distinctive bar is dropped from the quantities on the left-hand side and component m* is deleted. The resulting mixture has M̄ − 1 components, and the procedure is repeated until M components remain.
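A sketch of the two primitives of this reduction step, the symmetrized KL distance between Gaussian components and the moment-matched merge of two weighted components, shown for a single factor r (the distance used in the reduction sums such terms over all r); function names are illustrative:

```python
import numpy as np

def skl(mu1, S1, mu2, S2):
    """Symmetrized KL divergence between N(mu1, S1) and N(mu2, S2)."""
    S1i, S2i = np.linalg.inv(S1), np.linalg.inv(S2)
    d = mu1 - mu2
    kl12 = 0.5 * (np.trace(S2i @ S1) + d @ S2i @ d - len(d)
                  + np.log(np.linalg.det(S2) / np.linalg.det(S1)))
    kl21 = 0.5 * (np.trace(S1i @ S2) + d @ S1i @ d - len(d)
                  + np.log(np.linalg.det(S1) / np.linalg.det(S2)))
    return kl12 + kl21

def merge(w1, mu1, S1, w2, mu2, S2):
    """Moment-matched merge of two weighted Gaussian components:
    the result preserves the total weight, mean, and covariance."""
    w = w1 + w2
    mu = (w1 * mu1 + w2 * mu2) / w
    S = (w1 * (S1 + np.outer(mu1 - mu, mu1 - mu))
         + w2 * (S2 + np.outer(mu2 - mu, mu2 - mu))) / w
    return w, mu, S
```

Merging the pair with the smallest summed distance, then repeating, reduces the mixture one component at a time while matching the marginal moments of each factor.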

Distributed Implementation
A distributed implementation of the PMF-based blind equalizer requires that the computations performed in the filtering step (Section 4) and in the mixture reduction step (Section 4.1) be spread across the nodes.
To implement the filtering step, each node r runs a bank of M · 2^{L+1} Kalman filters that process the local observation y_{r,t}, update the statistics of h_r, and evaluate λ_{r,t}^(i,j,m) via (16). These quantities are then broadcast to the remaining nodes (M · 2^{L+1} real numbers), which evaluate w_t^(i,j,m) via (15) and the symbols' posterior probabilities via (14).
As for the mixture reduction step, the r-th node evaluates the r-th term of the sum in (19) and broadcasts this quantity to the remaining nodes. All nodes are then able to evaluate (19) and determine the indexes of the components to be merged. The moment matching operations (21)-(22) referring to the r-th component are performed with local information. To reduce a mixture from 2M to M components, a total of (3M² − M)/2 distances must be evaluated. As this procedure must be repeated for each possible state x_t^(i), a total of (3M² − M) · 2^{L−1} real numbers must be broadcast.
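The two per-symbol broadcast counts stated above can be checked with a small helper (an illustrative function, not from the paper):

```python
def broadcast_counts(M, L):
    """Per-symbol broadcast counts (real numbers per node) implied by the
    text: the filtering step broadcasts M * 2**(L+1) likelihood terms, and
    the mixture reduction step broadcasts (3*M**2 - M)/2 distances per
    state over 2**L states, i.e. (3*M**2 - M) * 2**(L-1) in total."""
    filtering = M * 2 ** (L + 1)
    reduction = (3 * M**2 - M) * 2 ** (L - 1)
    return filtering, reduction

counts = broadcast_counts(2, 3)  # e.g. M = 2 components, channel order L = 3
```

For M = 2 and L = 3, this gives 32 filtering values and 40 reduction values per node per symbol, which illustrates that the reduction step dominates the communication load as M grows.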

THE DISTRIBUTED PMF (DPMF) ALGORITHM
To reduce the communication burden, we propose an alternative algorithm in which a generic r-th node employs a single-mode approximation to the remote marginal posteriors p(h_s | x_{t−1}^(j), Y_{t−1}), s ≠ r. Under this new approach, the r-th node approximates (7) by retaining the full M-component mixture only for its local parameter h_r, while each remote factor in h_s, s ≠ r, is summarized by a single Gaussian component.

Particle-filter-based equalizers demand a number of particles P ≫ 2^L for proper operation. The computational costs of the PMF and DPMF are dominated by the cost of the mixture reduction algorithm, and are comparable to that of particle filters for the considered parameters.
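The single-mode summary a node would present of a Gaussian mixture can be sketched by moment matching; this is an illustrative helper under that interpretation, since the paper does not spell out the exact routine here:

```python
import numpy as np

def collapse(ws, mus, Sigmas):
    """Collapse a weighted Gaussian mixture to a single moment-matched
    Gaussian, as a node might summarize its local posterior on h_r for
    the remote nodes in the DPMF."""
    ws = np.asarray(ws, dtype=float)
    ws = ws / ws.sum()
    mu = sum(w * m for w, m in zip(ws, mus))
    S = sum(w * (Sg + np.outer(m - mu, m - mu))
            for w, m, Sg in zip(ws, mus, Sigmas))
    return mu, S
```

With this summary, each node broadcasts only one mean and covariance per state instead of a full mixture, which is where the communication savings come from.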

CONCLUSIONS
This paper described two distributed Bayesian algorithms for cooperative blind equalization based on Rao-Blackwellized point mass filters. The PMF algorithm approximates the posterior densities of the channel parameters by Gaussian mixtures of fixed order. The required mixture reductions are implemented via a distributed version of West's algorithm [13].
To reduce communication among nodes, the DPMF algorithm performs mixture reductions only locally, presenting to the remaining nodes a single-term Gaussian approximation. Via numerical simulations, we verify that the proposed algorithms outperform certain particle-filtering-based algorithms that incur comparable communication and computational complexities.
Counterintuitively, for M > 1 the PMF performed worse than the DPMF at low noise levels. This can possibly be explained by the fact that the PMF deals with Gaussian mixtures whose components are more distant (in the KL measure) than those of the DPMF (since all terms in the product but one are equal), which exacerbates the distortions caused by the mixture reduction process. It remains to be verified, as future work, whether the use of more complex mixture reduction algorithms [9] can avoid this effect.

Table 1. Comparison of Communication and Computational Costs (per node per symbol).