
Evolving spiking neural networks for online learning over drifting data streams. (English) Zbl 1434.68510

Summary: Nowadays, huge volumes of data are produced in the form of fast streams, which are further affected by non-stationary phenomena. The resulting lack of stationarity in the distribution of the produced data calls for efficient and scalable algorithms for online analysis that can adapt to such changes (concept drift). The online learning field has lately turned its focus to this challenging scenario, designing incremental learning algorithms that avoid becoming obsolete after a concept drift occurs. Despite the notable activity in the literature, the need for new efficient and scalable algorithms that adapt to drift still prevails as a research topic deserving further effort. Surprisingly, Spiking Neural Networks, one of the major exponents of the third generation of artificial neural networks, have not been thoroughly studied as an online learning approach, even though they are naturally suited to adapting easily and quickly to changing environments. This work fills this research gap by adapting Spiking Neural Networks to meet the processing requirements that online learning scenarios impose. In particular, the work focuses on limiting the size of the neuron repository and making the most of this limited size by resorting to data reduction techniques. Experiments with synthetic and real data sets are discussed, leading to the empirically validated assertion that, by virtue of a tailored exploitation of the neuron repository, Spiking Neural Networks adapt better to drifts and obtain higher accuracy scores than naive versions of Spiking Neural Networks for online learning environments.
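The repository-limiting strategy described in the summary can be illustrated with a minimal sketch. Everything below is an assumption chosen for illustration, not the paper's actual algorithm: the class `BoundedNeuronRepository`, its similarity threshold, and the merge-closest-pair reduction policy merely stand in for whatever data reduction technique the authors employ over the eSNN neuron repository.

```python
import math


def _dist(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


class BoundedNeuronRepository:
    """Hypothetical eSNN-style neuron repository with a hard size cap.

    A new sample either merges into a sufficiently similar neuron of the
    same class or creates a new neuron; when the cap is exceeded, the two
    closest same-class neurons are merged (one simple data reduction
    policy), so stale concepts fade and the model can follow a drift."""

    def __init__(self, capacity, sim_threshold):
        self.capacity = capacity
        self.sim_threshold = sim_threshold
        self.weights = []  # one weight vector per stored neuron
        self.labels = []   # class label of each neuron
        self.counts = []   # samples absorbed by each neuron

    def learn(self, x, label):
        # find the closest stored neuron carrying the same label
        same = [(i, _dist(w, x)) for i, (w, l)
                in enumerate(zip(self.weights, self.labels)) if l == label]
        if same:
            i, d = min(same, key=lambda p: p[1])
            if d <= self.sim_threshold:
                # merge: keep a running mean of all absorbed samples
                self.counts[i] += 1
                self.weights[i] = [w + (v - w) / self.counts[i]
                                   for w, v in zip(self.weights[i], x)]
                return
        self.weights.append(list(x))
        self.labels.append(label)
        self.counts.append(1)
        if len(self.weights) > self.capacity:
            self._reduce()

    def _reduce(self):
        # merge the two closest same-class neurons into one
        pairs = [(i, j) for i in range(len(self.weights))
                 for j in range(i + 1, len(self.weights))
                 if self.labels[i] == self.labels[j]]
        if not pairs:
            # no mergeable pair: drop the least-used neuron instead
            k = self.counts.index(min(self.counts))
        else:
            i, j = min(pairs, key=lambda p: _dist(self.weights[p[0]],
                                                  self.weights[p[1]]))
            ci, cj = self.counts[i], self.counts[j]
            self.weights[i] = [(ci * a + cj * b) / (ci + cj)
                               for a, b in zip(self.weights[i],
                                               self.weights[j])]
            self.counts[i] = ci + cj
            k = j
        del self.weights[k], self.labels[k], self.counts[k]

    def predict(self, x):
        # label of the nearest stored neuron overall
        dists = [_dist(w, x) for w in self.weights]
        return self.labels[dists.index(min(dists))]
```

Under a drift, samples far from every stored neuron keep spawning fresh neurons while the merging step progressively dilutes stale ones, which is the kind of tailored exploitation of a size-limited repository the summary refers to.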

MSC:

68T07 Artificial neural networks and deep learning
68W27 Online algorithms; streaming algorithms

References:

[1] Aggarwal, C. C., On biased reservoir sampling in the presence of stream evolution, (Proceedings of the 32nd international conference on very large data bases (2006), VLDB Endowment), 607-618
[2] Alippi, C., Intelligence for embedded systems (2014), Springer
[3] Alippi, C.; Roveri, M., Just-in-time adaptive classifiers. Part II: Designing the classifier, IEEE Transactions on Neural Networks, 19, 12, 2053-2064 (2008)
[4] Alnajjar, F.; Zin, I. B. M.; Murase, K., A spiking neural network with dynamic memory for a real autonomous mobile robot in dynamic environment, (2008 IEEE international joint conference on neural networks (IJCNN 2008, IEEE world congress on computational intelligence) (2008), IEEE), 2207-2213
[5] Baena-García, M., del Campo-Ávila, J., Fidalgo, R., Bifet, A., Gavaldà, R., & Morales-Bueno, R. (2006). Early drift detection method. In Proc. of the 4th ECML PKDD international workshop on knowledge discovery from data streams
[6] Barddal, J. P.; Gomes, H. M.; Enembreck, F.; Pfahringer, B., A survey on feature drift adaptation: definition, benchmark, challenges and future directions, Journal of Systems and Software, 127, 278-294 (2017)
[7] Belatreche, A.; Maguire, L. P.; McGinnity, M., Advances in design and application of spiking neural networks, Soft Computing, 11, 3, 239-248 (2007)
[8] Bifet, A.; Gavalda, R., Kalman filters and adaptive windows for learning in data streams, (International conference on discovery science (2006), Springer), 29-40
[9] Bifet, A.; Gavalda, R., Learning from time-changing data with adaptive windowing, (Proceedings of the 2007 SIAM international conference on data mining (2007), SIAM), 443-448
[10] Bifet, A.; Holmes, G.; Pfahringer, B.; Gavalda, R., Improving adaptive bagging methods for evolving data streams, (Asian conference on machine learning (2009), Springer), 23-37
[11] Bohte, S. M.; Kok, J. N.; La Poutre, H., Error-backpropagation in temporally encoded networks of spiking neurons, Neurocomputing, 48, 1-4, 17-37 (2002) · Zbl 1006.68760
[12] Bohte, S. M.; Kok, J. N.; La Poutré, J. A., Spikeprop: backpropagation for networks of spiking neurons, (ESANN (2000)), 419-424
[13] Breiman, L., Random forests, Machine Learning, 45, 1, 5-32 (2001) · Zbl 1007.68152
[14] Chang, R.; Pei, Z.; Zhang, C., A modified editing k-nearest neighbor rule, Journal of Computers, 6, 7, 1493-1500 (2011)
[15] Cohen, E.; Strauss, M., Maintaining time-decaying stream aggregates, (Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART symposium on principles of database systems (2003), ACM), 223-233
[16] Dawid, A. P.; Vovk, V. G., Prequential probability: principles and properties, Bernoulli, 5, 1, 125-162 (1999) · Zbl 0929.60001
[17] Derrac, J.; García, S.; Herrera, F., Stratified prototype selection based on a steady-state memetic algorithm: a study of scalability, Memetic Computing, 2, 3, 183-199 (2010)
[18] Ditzler, G.; Roveri, M.; Alippi, C.; Polikar, R., Learning in nonstationary environments: a survey, IEEE Computational Intelligence Magazine, 10, 4, 12-25 (2015)
[19] Domingos, P.; Hulten, G., Mining high-speed data streams, (Proceedings of the sixth ACM SIGKDD international conference on knowledge discovery and data mining (2000), ACM), 71-80
[20] Domingos, P.; Hulten, G., A general framework for mining massive data streams, Journal of Computational and Graphical Statistics, 12, 4, 945-949 (2003)
[21] Elwell, R.; Polikar, R., Incremental learning of concept drift in nonstationary environments, IEEE Transactions on Neural Networks, 22, 10, 1517-1531 (2011)
[22] Escalante, H. J.; Graff, M.; Morales-Reyes, A., PGGP: prototype generation via genetic programming, Applied Soft Computing, 40, 569-580 (2016)
[23] Fayed, H. A.; Hashem, S. R.; Atiya, A. F., Self-generating prototypes for pattern classification, Pattern Recognition, 40, 5, 1498-1509 (2007) · Zbl 1113.68084
[24] Gama, J.; Medas, P.; Castillo, G.; Rodrigues, P., Learning with drift detection, (Brazilian symposium on artificial intelligence (2004), Springer), 286-295 · Zbl 1105.68376
[25] Gama, J.; Žliobaitė, I.; Bifet, A.; Pechenizkiy, M.; Bouchachia, A., A survey on concept drift adaptation, ACM Computing Surveys (CSUR), 46, 4, 44 (2014) · Zbl 1305.68141
[26] Garcia, S.; Derrac, J.; Cano, J.; Herrera, F., Prototype selection for nearest neighbor classification: taxonomy and empirical study, IEEE Transactions on Pattern Analysis and Machine Intelligence, 34, 3, 417-435 (2012)
[27] Gerstner, W.; Kistler, W. M., Spiking neuron models: single neurons, populations, plasticity (2002), Cambridge university press · Zbl 1100.92501
[28] Gomes, H. M.; Barddal, J. P.; Enembreck, F.; Bifet, A., A survey on ensemble learning for data stream classification, ACM Computing Surveys, 50, 2, 23 (2017)
[29] Gomes, H. M.; Bifet, A.; Read, J.; Barddal, J. P.; Enembreck, F.; Pfahringer, B., Adaptive random forests for evolving data stream classification, Machine Learning, 106, 9-10, 1469-1495 (2017)
[30] Gonçalves, P. M.; De Barros, R. S. M., RCD: a recurring concept drift framework, Pattern Recognition Letters, 34, 9, 1018-1025 (2013)
[31] Grossberg, S., Nonlinear neural networks: principles, mechanisms, and architectures, Neural Networks, 1, 1, 17-61 (1988)
[32] Harries, M., SPLICE-2 comparative evaluation: electricity pricing, Technical Report (1999), The University of New South Wales
[33] Hart, P., The condensed nearest neighbor rule (corresp.), IEEE Transactions on Information Theory, 14, 3, 515-516 (1968)
[34] Hu, W.; Tan, Y., Prototype generation using multiobjective particle swarm optimization for nearest neighbor classification, IEEE Transactions on Cybernetics, 46, 12, 2719-2731 (2016)
[35] Kasabov, N.; Dhoble, K.; Nuntalid, N.; Indiveri, G., Dynamic evolving spiking neural networks for on-line spatio-and spectro-temporal pattern recognition, Neural Networks, 41, 188-201 (2013)
[36] Kasabov, N.; Scott, N.; Tu, E.; Marks, S.; Sengupta, N.; Capecci, E., Design methodology and selected applications of evolving spatio-temporal data machines in the NeuCube neuromorphic framework, Neural Networks, 78, 1-14 (2016) · Zbl 1414.68063
[37] Kasabov, N. K., Evolving connectionist systems: the knowledge engineering approach (2007), Springer Science & Business Media · Zbl 1140.68434
[38] Kasabov, N. K., NeuCube: a spiking neural network architecture for mapping, learning and understanding of spatio-temporal brain data, Neural Networks, 52, 62-76 (2014)
[39] Khamassi, I., & Sayed-Mouchaweh, M. (2017). Self-adaptive ensemble classifier for handling complex concept drift. In CEUR workshop proceedings, vol. 1958
[40] Khamassi, I.; Sayed-Mouchaweh, M.; Hammami, M.; Ghédira, K., Self-adaptive windowing approach for handling complex concept drift, Cognitive Computation, 7, 6, 772-790 (2015)
[41] Khamassi, I.; Sayed-Mouchaweh, M.; Hammami, M.; Ghédira, K., Discussion and review on evolving data streams and concept drift adapting, Evolving Systems, 9, 1, 1-23 (2018)
[42] Klinkenberg, R., Learning drifting concepts: example selection vs. example weighting, Intelligent Data Analysis, 8, 3, 281-300 (2004)
[43] Kononenko, I.; Kukar, M., Machine learning and data mining: introduction to principles and algorithms (2007), Horwood Publishing
[44] Krawczyk, B.; Minku, L. L.; Gama, J.; Stefanowski, J.; Woźniak, M., Ensemble learning for data stream analysis: a survey, Information Fusion, 37, 132-156 (2017)
[45] Li, J.; Wang, Y., Prototype selection based on multi-objective optimisation and partition strategy, International Journal of Sensor Networks, 17, 3, 163-176 (2015)
[46] Lobo, J. L.; Del Ser, J.; Bilbao, M. N.; Perfecto, C.; Salcedo-Sanz, S., DRED: An evolutionary diversity generation method for concept drift adaptation in online learning environments, Applied Soft Computing (2017)
[47] Meena, L.; Devi, V. S., Prototype selection on large and streaming data, (International conference on neural information processing (2015), Springer), 671-679
[48] Minku, L. L.; White, A. P.; Yao, X., The impact of diversity on online ensemble learning in the presence of concept drift, IEEE Transactions on Knowledge and Data Engineering, 22, 5, 730-742 (2010)
[49] Minku, L. L.; Yao, X., DDD: a new ensemble approach for dealing with concept drift, IEEE Transactions on Knowledge and Data Engineering, 24, 4, 619-633 (2012)
[50] Ng, W.; Dash, M., A test paradigm for detecting changes in transactional data streams, (International conference on database systems for advanced applications (2008), Springer), 204-219
[51] Oliveira, D. V.; Magalhaes, G. R.; Cavalcanti, G. D.; Ren, T. I., Improved self-generating prototypes algorithm for imbalanced datasets, (2012 IEEE 24th international conference on tools with artificial intelligence, vol. 1 (2012), IEEE), 904-909
[52] Ponulak, F., ReSuMe: new supervised learning method for spiking neural networks, vol. 42 (2005), Institute of Control and Information Engineering, Poznan University of Technology
[53] Ponulak, F., Analysis of the ReSuMe learning process for spiking neural networks, International Journal of Applied Mathematics and Computer Science, 18, 2, 117-127 (2008)
[54] Ponulak, F.; Kasiński, A., Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting, Neural Computation, 22, 2, 467-510 (2010) · Zbl 1183.92018
[55] Schliebs, S.; Kasabov, N., Evolving spiking neural network-a survey, Evolving Systems, 4, 2, 87-98 (2013)
[56] Soltic, S.; Kasabov, N., Knowledge extraction from evolving spiking neural networks with rank order population coding, International Journal of Neural Systems, 20, 06, 437-445 (2010)
[57] Soltic, S.; Wysoski, S. G.; Kasabov, N. K., Evolving spiking neural networks for taste recognition, (2008 IEEE international joint conference on neural networks (IJCNN 2008, IEEE world congress on computational intelligence) (2008), IEEE), 2091-2097
[58] Thorpe, S.; Gautrais, J., Rank order coding, (Computational neuroscience (1998), Springer), 113-118
[59] Thorpe, S. J.; Gautrais, J., Rapid visual processing using spike asynchrony, (Advances in neural information processing systems (1997)), 901-907
[60] Tomek, I., An experiment with the edited nearest-neighbor rule, IEEE Transactions on Systems, Man, and Cybernetics, 6, 448-452 (1976) · Zbl 0332.68081
[61] Tomek, I., Two modifications of CNN, IEEE Transactions on Systems, Man, and Cybernetics, 6, 769-772 (1976) · Zbl 0341.68066
[62] Triguero, I.; Derrac, J.; Garcia, S.; Herrera, F., A taxonomy and experimental study on prototype generation for nearest neighbor classification, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 42, 1, 86-100 (2012)
[63] Triguero, I.; García, S.; Herrera, F., IPADE: Iterative prototype adjustment for nearest neighbor classification, IEEE Transactions on Neural Networks, 21, 12, 1984-1990 (2010)
[64] Vitter, J. S., Random sampling with a reservoir, ACM Transactions on Mathematical Software (TOMS), 11, 1, 37-57 (1985) · Zbl 0562.68028
[65] Wang, J.; Belatreche, A.; Maguire, L.; Mcginnity, T. M., An online supervised learning method for spiking neural networks with adaptive structure, Neurocomputing, 144, 526-536 (2014)
[66] Wang, J.; Belatreche, A.; Maguire, L. P.; McGinnity, T. M., SpikeTemp: an enhanced rank-order-based learning approach for spiking neural networks with adaptive structure, IEEE Transactions on Neural Networks and Learning Systems, 28, 1, 30-43 (2017)
[67] Wang, S.; Minku, L. L.; Yao, X., A systematic study of online class imbalance learning with concept drift, IEEE Transactions on Neural Networks and Learning Systems (2018)
[68] Webb, G. I.; Hyde, R.; Cao, H.; Nguyen, H. L.; Petitjean, F., Characterizing concept drift, Data Mining and Knowledge Discovery, 30, 4, 964-994 (2016) · Zbl 1411.68127
[69] Wilson, D. L., Asymptotic properties of nearest neighbor rules using edited data, IEEE Transactions on Systems, Man and Cybernetics, 3, 408-421 (1972) · Zbl 0276.62060
[70] Wysoski, S. G.; Benuskova, L.; Kasabov, N., Adaptive learning procedure for a network of spiking neurons and visual pattern recognition, (International conference on advanced concepts for intelligent vision systems (2006), Springer), 1133-1142
[71] Wysoski, S. G.; Benuskova, L.; Kasabov, N., Evolving spiking neural networks for audiovisual information processing, Neural Networks, 23, 7, 819-835 (2010)
[72] Zhou, Z. H.; Chawla, N. V.; Jin, Y.; Williams, G. J., Big data opportunities and challenges: discussions from data analytics perspectives [discussion forum], IEEE Computational Intelligence Magazine, 9, 4, 62-74 (2014)
[73] Žliobaitė, I. (2010). Learning under concept drift: an overview. arXiv preprint, arXiv:1010.4784
[74] Žliobaitė, I.; Pechenizkiy, M.; Gama, J., An overview of concept drift applications, (Big data analysis: new algorithms for a new society (2016), Springer), 91-114
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.