Methods Inf Med 2004; 43(02): 192-201
DOI: 10.1055/s-0038-1633858
Original Article
Schattauer GmbH

Abductive Network Committees for Improved Classification of Medical Data

R. E. Abdel-Aal
Center for Applied Physical Sciences, Research Institute, King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia

Publication History

Publication Date:
05 February 2018 (online)

Summary

Objectives: To introduce abductive network classifier committees as an ensemble method for improving classification accuracy in medical diagnosis. Neural networks offer many ways of introducing enough diversity among member models to improve performance when forming a committee, but the self-organizing nature, automatic stopping criteria, and learning approach of abductive networks are less conducive to this purpose. We explore ways of overcoming this limitation and demonstrate improved classification on three standard medical datasets.

Methods: Two standard 2-class medical datasets (Pima Indians Diabetes and Heart Disease) and a 6-class dataset (Dermatology) were used to investigate ways of training abductive network members with adequate independence, as well as methods of combining their outputs to form a committee that improves performance beyond that of single models.
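The sketch below illustrates the committee-forming procedure described here, under stated assumptions: scikit-learn decision trees stand in for the abductive (GMDH-type) network members used in the study, and a bundled breast-cancer dataset stands in for the UCI datasets, since neither the AIM abductive network tool nor the original data files are assumed available. The training data are partitioned into disjoint subsets, one member is trained per subset, the member outputs are combined by simple probability averaging, and the result is compared against a single model trained on the full set.

```python
# Illustrative sketch only: decision trees and the breast-cancer dataset are
# stand-ins for the paper's abductive networks and UCI datasets; split sizes
# and committee size are assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Diversity through data partitioning: each member sees a disjoint subset
# of the training examples rather than a differently structured model.
n_members = 3
rng = np.random.default_rng(0)
subsets = np.array_split(rng.permutation(len(X_train)), n_members)

members = []
for idx in subsets:
    m = DecisionTreeClassifier(max_depth=4, random_state=0)
    m.fit(X_train[idx], y_train[idx])
    members.append(m)

# Simple output combination: average the members' class probabilities.
avg_proba = np.mean([m.predict_proba(X_test) for m in members], axis=0)
committee_pred = avg_proba.argmax(axis=1)

# Reference point: a single model trained on the full training set.
single = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)

print("single model accuracy:     ", (single.predict(X_test) == y_test).mean())
print("3-member committee accuracy:", (committee_pred == y_test).mean())
```

Majority voting over the members' hard class labels is an equally simple combination rule and coincides with probability averaging in many 2-class cases.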

Results: Two- or three-member committees of models trained on completely or partially different subsets of the training data, combined using simple output combination methods, achieve improvements of 2 to 5 percentage points in classification accuracy over the best single model developed on the full training set.

Conclusions: Varying model complexity alone yields abductive network models that are too correlated to provide the diversity needed for a useful committee. Diversity achieved by training member networks on independent subsets of the training data outweighs the limitation of the smaller training set available to each member, resulting in a net gain in committee performance. Because such models train faster and can be trained in parallel, the approach can also speed up classifier development.
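As a minimal illustration of the parallel-training point, the snippet below fits the committee members concurrently; it assumes the subsets and base learner from the earlier sketch and uses joblib purely as a convenience, which is not a tool named in the paper.

```python
# Sketch only: members trained on their own (smaller) subsets are independent
# of one another, so they can be fitted in parallel worker processes.
from joblib import Parallel, delayed
from sklearn.tree import DecisionTreeClassifier

def fit_member(X_sub, y_sub):
    return DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_sub, y_sub)

# 'subsets', 'X_train', and 'y_train' are as defined in the previous sketch.
members = Parallel(n_jobs=-1)(
    delayed(fit_member)(X_train[idx], y_train[idx]) for idx in subsets)
```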

 