Correction of Misclassifications Using a Proximity-Based Estimation Method

Abstract. An estimation method for correcting misclassifications in signal and image processing is presented. The method is based on the use of context-based (temporal or spatial) information in a sliding-window fashion. The classes can be purely nominal, that is, an ordering of the classes is not required. The method employs nonlinear operations based on class proximities defined by a proximity matrix. Two case studies are presented. In the first, the proposed method is applied to one-dimensional signals for processing data that are obtained by a musical key-finding algorithm. In the second, the estimation method is applied to two-dimensional signals for correction of misclassifications in images. In the first case study, the proximity matrix employed by the estimation method follows directly from music perception studies, whereas in the second case study, the optimal proximity matrix is obtained with genetic algorithms as the learning rule in a training-based optimization framework. Simulation results are presented in both case studies, and the degree of improvement in classification accuracy that is obtained by the proposed method is assessed statistically using Kappa analysis.
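The sliding-window correction described in the abstract can be sketched as follows. This is a hypothetical minimal instantiation, not the authors' exact method: for each window position, the output class is the one that maximizes the summed proximity, given by a proximity matrix P, to the class labels observed in the window. With an identity proximity matrix this reduces to a mode (majority) filter, which illustrates why no ordering of the nominal classes is required.

```python
import numpy as np

def proximity_filter(labels, P, w=5):
    """Sliding-window correction of a 1-D sequence of class labels.

    labels : sequence of integer class indices
    P      : (K, K) proximity matrix; P[i, j] is the proximity of class i to class j
    w      : odd window length
    """
    labels = np.asarray(labels)
    half = w // 2
    out = labels.copy()
    K = P.shape[0]
    for n in range(len(labels)):
        window = labels[max(0, n - half):n + half + 1]
        # Choose the class with the largest total proximity to the window's labels.
        scores = [P[c, window].sum() for c in range(K)]
        out[n] = int(np.argmax(scores))
    return out

# Example with 3 nominal classes; an identity proximity matrix makes this a mode filter,
# so the isolated misclassifications (the lone 2 and the lone 0) are corrected.
P = np.eye(3)
noisy = [0, 2, 0, 0, 1, 1, 0, 1, 1]
print(proximity_filter(noisy, P, w=3))  # → [0 0 0 0 1 1 1 1 1]
```

A non-trivial proximity matrix (e.g., one derived from music perception studies, as in the first case study) would instead favor perceptually close classes when resolving conflicts within a window.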
1 Institute of Signal Processing, Tampere University of Technology, P.O. Box 553, Tampere 33101, Finland
2 Department of Pathology, The University of Texas M.D. Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, TX 77030, USA
3 Department 504, National Aerospace University, 17 Chkalova Street, Kharkov 61070, Ukraine
4 School of Electronics and Computer Science, University of Southampton, Southampton SO17 1BJ, UK
EURASIP Journal on Advances in Signal Processing 2004, 2004:508513 doi:10.1155/S1110865704402145
The electronic version of this article is the complete one and can be found online at: http://asp.eurasipjournals.com/content/2004/8/508513
Received: 14 October 2003
Revisions received: 17 December 2003
Published: 8 July 2004
© 2004 Niemistö et al.