Show simple item record

dc.contributor.author: Aguilar, Jose
dc.contributor.author: Puerto, E.
dc.contributor.author: Vargas, R.
dc.contributor.author: Reyes, J.
dc.date.accessioned: 2021-12-02T13:45:30Z
dc.date.available: 2021-12-02T13:45:30Z
dc.date.issued: 2019-07-13
dc.identifier.uri: http://repositorio.ufps.edu.co/handle/ufps/1642
dc.description.abstract: In the context of pattern recognition with machine learning algorithms, whether through supervised, semi-supervised, or unsupervised methods, one of the most important elements to consider is the set of features used to represent the phenomenon under study. In this sense, this paper proposes a deep learning architecture for Ar2p, based on supervised and unsupervised mechanisms for the discovery and selection of features for classification problems (called Ar2p-DL). Ar2p is a pattern-recognition algorithm based on the systematic functioning of the human brain. Ar2p-DL is composed of three phases: the first phase, called feature analysis, is supported by two feature-engineering approaches to discover or select atomic features/descriptors. The feature-engineering approach used for discovery is based on a classical clustering technique, K-means; the approach used for selection is based on a classification technique, Random Forest. The second phase, called aggregation, creates a feature hierarchy (a merge of descriptors) from the atomic features/descriptors, using the DBSCAN algorithm as its aggregation strategy. Finally, the classification phase classifies the inputs based on the feature hierarchy, using the classical Ar2p algorithm. The last phase of Ar2p-DL uses a supervised learning approach, while the first phases combine supervised and unsupervised learning approaches. To analyze the performance of Ar2p-DL, several tests were made using different benchmarks (datasets) from the UCI Machine Learning Repository, in order to compare Ar2p-DL with other classification methods.
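The three-phase pipeline the abstract describes can be illustrated with a minimal, hypothetical sketch using off-the-shelf scikit-learn components. This is not the authors' implementation: the Ar2p classifier itself is not available here, so a plain random forest stands in for phase 3, and all parameter values (number of clusters, DBSCAN `eps`, tree counts) are illustrative assumptions.

```python
# Hypothetical sketch of the Ar2p-DL phases (feature analysis, aggregation,
# classification) with scikit-learn stand-ins. Ar2p itself is replaced by a
# random forest in phase 3; all hyperparameters are illustrative only.
import numpy as np
from sklearn.cluster import DBSCAN, KMeans
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# A UCI-style benchmark dataset, as the paper evaluates on UCI repositories.
X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Phase 1a (discovery): K-means over the transposed data groups the raw
# columns into candidate atomic descriptors.
k = 4
col_clusters = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X_train.T)

# Phase 1b (selection): Random Forest importances keep the strongest
# column from each discovered group.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
selected = [max(np.where(col_clusters == c)[0],
                key=lambda j: rf.feature_importances_[j])
            for c in range(k)]

# Phase 2 (aggregation): DBSCAN over the selected descriptors (each column
# treated as a point) sketches the feature hierarchy by merging descriptors
# that fall in the same density cluster (-1 marks unmerged descriptors).
merged = DBSCAN(eps=3.0, min_samples=2).fit_predict(X_train[:, selected].T)

# Phase 3 (classification): supervised learning on the selected descriptors.
# The paper uses the Ar2p algorithm here; this sketch substitutes a forest.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train[:, selected], y_train)
acc = clf.score(X_test[:, selected], y_test)
print(f"{len(selected)} descriptors selected, test accuracy {acc:.2f}")
```

The sketch keeps the structure of the architecture (unsupervised discovery and aggregation around a supervised selection and classification core) while making no claim about the actual Ar2p hierarchy construction.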
dc.format.extent: 21 pages
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Neural Processing Letters
dc.relation.ispartof: Neural Process Lett
dc.rights: © 2021 Springer Nature Switzerland AG. Part of Springer Nature.
dc.source: https://link.springer.com/article/10.1007/s11063-019-10062-4
dc.title: An Ar2p Deep Learning Architecture for the Discovery and the Selection of Features
dc.type: Journal article
dcterms.references: Guyon I, Gunn S, Nikravesh M, Zadeh LA (2006) Feature extraction: foundations and applications. Springer, Berlin
dcterms.references: LeCun Y, Bengio Y, Hinton G (2015) Deep learning. Nature 521:436–444
dcterms.references: Schmidhuber J (2015) Deep learning in neural networks: an overview. Neural Netw 61:85–117
dcterms.references: LeCun Y, Bengio Y (1998) Convolutional networks for images, speech, and time series. In: Arbib MA (ed) The handbook of brain theory and neural networks. MIT Press, Cambridge, MA, pp 255–258
dcterms.references: Salakhutdinov R, Hinton G (2010) Efficient learning of deep Boltzmann machines. In: Proc. intl conference on artificial intelligence and statistics, pp 693–700
dcterms.references: Hua Y, Guo J, Hua Z (2015) Deep belief networks and deep learning. In: Proc. int. conf. intel. comput. internet things, pp 1–4
dcterms.references: Ester M, Kriegel H, Sander J, Xu X (1996) A density-based algorithm for discovering clusters in large spatial databases with noise. In: Proc. 2nd intl conf on knowledge discovery and data mining, pp 226–231
dcterms.references: Aguilar J (2001) Learning algorithm and retrieval process for the multiple classes random neural network model. Neural Process Lett 13(1):81–91
dcterms.references: Aguilar J (1998) Definition of an energy function for the random neural networks to solve optimization problems. Neural Netw 11(4):731–738
dcterms.references: Gelenbe E, Yin Y (2016) Deep learning with random neural networks. In: Proc. international joint conference on neural networks (IJCNN), pp 1633–1638
dcterms.references: Aguilar J (2004) A color pattern recognition problem based on the multiple classes random neural network model. Neurocomputing 61:71–83
dcterms.references: Zhang J, Yu J, Tao D (2018) Local deep-feature alignment for unsupervised dimension reduction. IEEE Trans Image Process 27(5):2420–2432
dcterms.references: Yu H, Sun D, Xi X, Yang X, Zheng S, Wang Q (2018) Fuzzy one-class extreme auto-encoder. Neural Process Lett. https://doi.org/10.1007/s11063-018-9952-z
dcterms.references: Hong J, Yu J, Zhang X, Jin K, Lee K (2018) Multi-modal face pose estimation with multi-task manifold deep learning. https://arxiv.org/pdf/1712.06467.pdf. Accessed 1 July 2019
dcterms.references: Kumar G, Bhatia PK (2014) A detailed review of feature extraction in image processing systems. In: Proc. fourth international conference on advanced computing & communication technologies. IEEE, pp 5–12
dcterms.references: Pacheco F, Exposito E, Gineste M, Budoin C, Aguilar J (2019) Towards the deployment of machine learning solutions in traffic network classification: a systematic survey. IEEE Commun Surv Tutor 21(2):1988–2014
dcterms.references: Chang M, Buš P, Schmitt G (2017) Feature extraction and K-means clustering approach to explore important features of urban identity. In: 16th IEEE international conference on machine learning and applications (ICMLA), pp 1139–1144
dcterms.references: Puerto E, Aguilar J (2017) Un algoritmo recursivo de reconocimiento de patrones. Revista Técnica de la Facultad de Ingeniería Universidad del Zulia 40(2):95–104
dcterms.references: Puerto E, Aguilar J, Chavez D (2017) A new recursive patterns matching model inspired in systematic theory of human mind. Int J Adv Comput Technol (IJACT) 9(1):28–39
dcterms.references: Puerto E, Aguilar J (2016) Formal description of a pattern for a recursive process of recognition. In: Proc IEEE Latin American conference on computational intelligence, pp 1–2
dcterms.references: Puerto E, Aguilar J (2016) Learning algorithm for the recursive pattern recognition model. Appl Artif Intell 30(7):662–678
dcterms.references: Kurzweil R (2013) How to make a mind. The Futurist 47(2):14–17
dcterms.references: Puerto E, Aguilar J, Chávez D (2018) A recursive patterns matching model for the dynamic pattern recognition problem. Appl Artif Intell 32(4):419–432
dcterms.references: Liu H, Motoda H (1998) Feature extraction, construction and selection: a data mining perspective, vol 453. Springer, Berlin
dcterms.references: Khalid S, Khalil T, Nasreen S (2014) A survey of feature selection and feature extraction techniques in machine learning. In: Proc. IEEE science and information conference (SAI), pp 372–378
dcterms.references: Motoda H, Liu H (2002) Feature selection, extraction and construction. Commun IICM 5:67–72
dcterms.references: Yu J, Kuang Z, Zhang B, Zhang W, Lin D, Fan J (2018) Leveraging content sensitiveness and user trustworthiness to recommend fine-grained privacy settings for social image sharing. IEEE Trans Inf Forensics Secur 13(5):1317–1332
dcterms.references: Yu J, Liu D, Tao D, Seah H (2012) On combining multiple features for cartoon character retrieval and clip synthesis. IEEE Trans Syst Man Cybern 42(5):1413–1427
dcterms.references: Lausser L, Szekely R, Schirra L, Kestler H (2018) The influence of multi-class feature selection on the prediction of diagnostic phenotypes. Neural Process Lett 48(2):863–880
dcterms.references: Singaravel S, Suykens J, Geyer P (2018) Deep-learning neural-network architectures and methods: using component based models in building-design energy prediction. Adv Eng Inform 38:81–90
dcterms.references: Pham D, Dimov S, Nguyen C (2005) Selection of K in K-means clustering. In: Proc. inst. mech. eng., pp 103–119
dcterms.references: Pham D, Dimov S, Nguyen C (2004) An incremental K-means algorithm. J Mech Eng Sci 218(7):783–795
dcterms.references: Pelleg D, Moore A (2000) X-means: extending K-means with efficient estimation of the number of clusters. In: Proc. of the 17th international conf. on machine learning, pp 727–734
dcterms.references: Kass R, Wasserman L (1995) A reference Bayesian test for nested hypotheses and its relationship to the Schwarz criterion. J Am Stat Assoc 90(431):928–934
dcterms.references: Breiman L (2001) Random forests. Mach Learn 45(1):5–32
dcterms.references: Balakrishnama S, Ganapathiraju A (1998) Linear discriminant analysis—a brief tutorial. Inst Signal Inf Process 18:1–8
dcterms.references: Ester M, Kriegel H-P, Sander J, Xu X (1996) A density-based algorithm for discovering clusters in large spatial databases with noise. In: Proc. 2nd int conf on knowledge discovery and data mining, pp 226–231
dcterms.references: Wagner P, Peres S, Lima C, Freitas F, Barros R (2014) Gesture unit segmentation using spatial-temporal information and machine learning. In: Proc. twenty-seventh international Florida artificial intelligence research society conference, pp 101–106
dcterms.references: Lichman M (2013) UCI machine learning repository. University of California, Irvine
dcterms.references: El Kessab B, Daoui C, Boukhalene B, Salouan R (2014) A comparative study between the K-nearest neighbors and the multi-layer perceptron for cursive handwritten arabic numerals recognition. Int J Comput Appl 107(21):25–30
dcterms.references: Keith MJ et al (2010) The high time resolution universe pulsar survey—I. System configuration and initial discoveries. Mon Not R Astron Soc 409(2):619–627
dcterms.references: Lyon RJ, Stappers BW, Cooper S, Brooke JM, Knowles JD (2016) Fifty years of pulsar candidate selection: from simple filters to a new principled real-time classification approach. Mon Not R Astron Soc 459(1):1104–1123
dcterms.references: Charytanowicz M et al (2010) A complete gradient clustering algorithm for features analysis of X-ray images. In: Proc. information technologies in biomedicine. Springer, pp 15–24
dcterms.references: Altay H, Acar B, Demiroz G, Cekin A (1997) A supervised machine learning algorithm for arrhythmia analysis. In: Proceedings of the computers in cardiology conference
dcterms.references: Bareiss E, Ray E, Porter B (1987) Protos: an exemplar-based learning apprentice. In: Proceedings 4th international workshop on machine learning, pp 12–23
dcterms.references: Puerto E, Aguilar J, Reyes J, Sarkar D (2018) Deep learning architecture for the recursive patterns recognition model. J Phys Conf Ser 1126:012035
dcterms.references: Van M, Van L (2011) Combining RGB and ToF cameras for real-time 3D hand gesture interaction. In: Proc. IEEE workshop on applications of computer vision, pp 66–72
dcterms.references: Nguyen A, Yosinski J, Clune J (2015) Deep neural networks are easily fooled: high confidence predictions for unrecognizable images. In: Proc. IEEE conference on computer vision and pattern recognition, pp 427–436
dcterms.references: Niu X, Suen C (2012) A novel hybrid CNN–SVM classifier for recognizing handwritten digits. Pattern Recogn 45(4):1318–1325
dcterms.references: Liu L, Wu Y, Wei W, Cao W, Sahin S, Zhang Q (2018) Benchmarking deep learning frameworks: design considerations, metrics and beyond. In: IEEE 38th international conference on distributed computing systems (ICDCS), pp 1258–1269
dcterms.references: Wu D, Pigou L, Kindermans P, Do-Hoang N, Shao L, Dambre J, Odobez J (2016) Deep dynamic neural networks for multimodal gesture segmentation and recognition. IEEE Trans Pattern Anal Mach Intell 38(8):1583–1597
dcterms.references: Wang P, Li W, Liu S, Zhang Y, Gao Z, Ogunbona P (2016) Large-scale continuous gesture recognition using convolutional neural networks. In: 23rd International conference on pattern recognition, pp 13–18
dcterms.references: Chen F, Chen N, Mao H, Hu H (2018) Assessing four neural networks on handwritten digit recognition dataset (MNIST). Chuangxinban J Comput. arXiv:1811.08278v1 [cs.CV]
dcterms.references: Köpüklü O, Gunduz A, Kose N, Rigoll G (2019) Real-time hand gesture detection and classification using convolutional neural networks. arXiv preprint arXiv:1901.10323
dcterms.references: Strezoski G, Stojanovski D, Dimitrovsk I, Madjarov G (2016) Hand gesture recognition using deep convolutional neural networks. In: International conference on ICT innovations, pp 49–58
dcterms.references: Hossein A (2018) Implementing VGG13 for MNIST dataset in TensorFlow. https://medium.com/@amir_hf8/implementing-vgg13-for-mnist-dataset-in-tensorflow-abc1460e2b93. Accessed 1 July 2019
dc.identifier.doi: https://doi.org/10.1007/s11063-019-10062-4
dc.publisher.place: Netherlands
dc.relation.citationedition: Vol. 50 No. 1 (2019)
dc.relation.citationendpage: 643
dc.relation.citationissue: 1 (2019)
dc.relation.citationstartpage: 623
dc.relation.citationvolume: 50
dc.relation.cites: Puerto, E., Aguilar, J., Vargas, R. et al. An Ar2p Deep Learning Architecture for the Discovery and the Selection of Features. Neural Process Lett 50, 623–643 (2019). https://doi.org/10.1007/s11063-019-10062-4
dc.relation.ispartofjournal: Neural Processing Letters
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.rights.creativecommons: Attribution 4.0 International (CC BY 4.0)
dc.subject.proposal: Deep learning
dc.subject.proposal: Pattern recognition processes
dc.subject.proposal: Feature engineering
dc.subject.proposal: Pattern Recognition Theory of Mind
dc.type.coar: http://purl.org/coar/resource_type/c_6501
dc.type.content: Text
dc.type.driver: info:eu-repo/semantics/article
dc.type.redcol: http://purl.org/redcol/resource_type/ART
oaire.accessrights: http://purl.org/coar/access_right/c_abf2
oaire.version: http://purl.org/coar/version/c_970fb48d4fbd8a85
dc.type.version: info:eu-repo/semantics/publishedVersion

