Research Article

Incremental Feature Transformation for Temporal Space

by Preeti Mahadev, P. Nagabhushan
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 145 - Number 8
Year of Publication: 2016
Authors: Preeti Mahadev, P. Nagabhushan

Preeti Mahadev and P. Nagabhushan. Incremental Feature Transformation for Temporal Space. International Journal of Computer Applications 145, 8 (Jul 2016), 28-38. DOI=10.5120/ijca2016910737

@article{10.5120/ijca2016910737,
  author    = {Preeti Mahadev and P. Nagabhushan},
  title     = {Incremental Feature Transformation for Temporal Space},
  journal   = {International Journal of Computer Applications},
  issue_date = {Jul 2016},
  volume    = {145},
  number    = {8},
  month     = {Jul},
  year      = {2016},
  issn      = {0975-8887},
  pages     = {28-38},
  numpages  = {9},
  url       = {},
  doi       = {10.5120/ijca2016910737},
  publisher = {Foundation of Computer Science (FCS), NY, USA},
  address   = {New York, USA}
}

%0 Journal Article
%1 2024-02-06T23:48:16.978700+05:30
%A Preeti Mahadev
%A P. Nagabhushan
%T Incremental Feature Transformation for Temporal Space
%J International Journal of Computer Applications
%@ 0975-8887
%V 145
%N 8
%P 28-38
%D 2016
%I Foundation of Computer Science (FCS), NY, USA

A temporal feature space generates features sequentially over consecutive time frames, cumulatively producing a very large dimensional feature space, in contrast to a space that generates samples over time. Pattern recognition applications over such a temporal feature space must therefore cope both with waiting for new features to arrive over time and with extracting the knowledge hidden in large dimensions. Although the problem of deriving this knowledge can be addressed by dimensionality reduction techniques such as feature subsetting or feature transformation, the complexity due to the large dimensionality still prevails. Even though the arrival of features is temporally incremental in nature, pattern analysis is generally not carried out over time frames, which would allow knowledge to be produced in an incremental model for more effective management over time. However, temporal data in real-time applications demand that decisions be taken in the interim, at every temporal point, even before all the features have arrived. This problem can be overcome by accumulating and building the knowledge for pattern analysis at the end of each temporal phase in an incremental mode. The temporal arrival of features provides an environment in which knowledge can be accumulated in the transformed feature space at the end of every phase, thereby keeping the dimensionality small. Since the cumulative knowledge is built upon and passed on from one phase of the temporal space to the next without looking back at the previous data, the feature space required for computation at any given instant remains fairly constant and comparatively small. As fewer transformed features remain in scope for processing at each phase for further reduction and knowledge extraction, computation is minimized and memory is efficiently utilized.
In the proposed research, the pattern analysis and recognition of the temporal space occur at every temporal point instead of at the end, when all features are available. At each temporal point, the proposed model not only withstands the mandatory wait time but also generates the most up-to-date and best available transformed feature space thus far by means of continuous knowledge extraction.
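The phase-wise accumulation described in the abstract can be illustrated with a minimal PCA-based sketch. This is not the authors' exact algorithm, only an assumed workflow: at each temporal phase the newly arrived feature block is fused with the compact transformed representation carried forward from earlier phases, and the fused (small) matrix is re-reduced, so raw features from previous phases are never revisited. The helper names `pca_top_k` and `incremental_transform` are hypothetical.

```python
import numpy as np

def pca_top_k(X, k):
    """Project the rows of X onto its top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def incremental_transform(phases, k=3):
    """At each phase, fuse the carried-forward transformed space with the
    newly arrived feature block and reduce back to k dimensions, so the
    working feature space stays small and roughly constant in size."""
    carried = None  # compact transformed representation so far
    for block in phases:  # block has shape (n_samples, n_new_features)
        fused = block if carried is None else np.hstack([carried, block])
        carried = pca_top_k(fused, min(k, fused.shape[1]))
    return carried

rng = np.random.default_rng(0)
# 100 samples; feature blocks of width 5, 8, and 4 arrive over three phases
phases = [rng.normal(size=(100, w)) for w in (5, 8, 4)]
Z = incremental_transform(phases, k=3)
print(Z.shape)  # (100, 3)
```

Note that at every phase the matrix actually decomposed has at most `k + n_new_features` columns, which is the point of the incremental scheme: the cost per phase depends on the width of the new block, not on the cumulative dimensionality.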

Index Terms

Computer Science
Information Sciences


Big Feature Space, Incremental Dimensionality Reduction, Cumulative Variance, Optimal Feature Subset, Incremental Dimensionality Index (IDI)