Abstract
A general formulation of adaptive equalization by distribution learning is proposed. A least relative entropy (LRE) algorithm for binary data communications is developed and analyzed with respect to its statistical and dynamical properties. It is shown that LRE learning is consistent and asymptotically normal, and that the algorithm can always recover from convergence at the wrong extreme, unlike MSE-based MLPs. Finally, these properties are demonstrated with simulation examples.
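The recovery claim can be illustrated with a minimal, hypothetical sketch; this is not the paper's algorithm, but it shows the standard gradient argument for why a relative-entropy (cross-entropy) cost can escape a wrong extreme while an MSE cost with a sigmoid output cannot: the MSE gradient carries a sigmoid-derivative factor that vanishes when the output saturates, whereas the relative-entropy gradient reduces to the raw error.

```python
import math

def sigmoid(v):
    """Logistic output nonlinearity of the (assumed) equalizer."""
    return 1.0 / (1.0 + math.exp(-v))

def mse_grad(v, t):
    """d/dv of 0.5*(sigmoid(v) - t)**2: carries the factor y*(1-y),
    which vanishes as the output saturates at either extreme."""
    y = sigmoid(v)
    return (y - t) * y * (1.0 - y)

def relative_entropy_grad(v, t):
    """d/dv of the cross-entropy -t*log(y) - (1-t)*log(1-y):
    the y*(1-y) factor cancels, leaving the raw error y - t."""
    y = sigmoid(v)
    return y - t

# Wrong extreme: target t = 1 but the output is saturated near 0.
v, t = -10.0, 1.0
print(mse_grad(v, t))                # tiny: MSE learning stalls here
print(relative_entropy_grad(v, t))   # close to -1: strong corrective gradient
```

With the output stuck at the wrong extreme, the MSE gradient is on the order of the saturated sigmoid value itself, while the relative-entropy gradient stays near the full error magnitude, which is the mechanism behind the recovery property stated in the abstract.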
Original language | English (US) |
---|---|
Pages (from-to) | 929-932 |
Number of pages | 4 |
Journal | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings |
Volume | 2 |
State | Published - 1995 |
Externally published | Yes |
Event | Proceedings of the 1995 20th International Conference on Acoustics, Speech, and Signal Processing. Part 2 (of 5) - Detroit, MI, USA Duration: May 9 1995 → May 12 1995 |
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering