International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 179 - Number 47
Year of Publication: 2018
Authors: Ansam Nizar Younis, Fawzia Mahmoud Remo
DOI: 10.5120/ijca2018917236
Ansam Nizar Younis, Fawzia Mahmoud Remo. Distinguish Musical Symbol Printed using the Linear Discriminant Analysis LDA and Similarity Scale. International Journal of Computer Applications 179(47), 20-24, June 2018. DOI=10.5120/ijca2018917236
Music pervades our daily lives: hardly a moment passes without hearing a musical tone that expresses a specific event, or some other sound such as animal calls or the rhythms of sadness and joy produced by the human vocal cords and throat. Musical notation, written with signs, symbols, and lines, is one of the most important means of preserving tunes and melodies so that a melody can be read and retrieved when needed. This research proposes a new method for recognizing printed musical symbols. A computer system was built that reads images of various musical symbols and applies a sequence of preprocessing steps to each image. The Linear Discriminant Analysis (LDA) algorithm is then applied to extract the features most relevant to discriminating between images of the different symbol types; by reducing the dimensionality of the input data, it also saves processing time and memory during symbol recognition. The Structural Similarity Index (SSIM) is then used to measure the similarity between the input image and the training images: the quality of the input signal is evaluated against a second signal that is taken to be of optimal quality, and this metric is used to identify the different musical symbols. Combining the linear discriminant analysis algorithm with the structural similarity measure achieved very good performance and low execution time: a classification accuracy of 89.5% was obtained, and the search for any symbol took about 0.784990 seconds.
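To make the described pipeline concrete, the following is a minimal Python sketch, not the authors' code: it assumes fixed-size grayscale symbol images in NumPy arrays, and it uses scikit-learn's LinearDiscriminantAnalysis for the dimensionality-reduction step and scikit-image's structural_similarity for the matching step. The helper names (fit_lda, classify_symbol) and the candidate-shortlisting strategy are illustrative assumptions, since the abstract does not specify how the two stages are combined.

    # Hypothetical sketch of an LDA + SSIM symbol recognizer.
    # Assumptions: train_images has shape (n, H, W), train_labels is a
    # NumPy array of class labels, and the query image has shape (H, W).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from skimage.metrics import structural_similarity as ssim

    def fit_lda(train_images, train_labels, n_components=None):
        """Fit LDA on flattened symbol images to obtain a low-dimensional,
        class-discriminative feature space (the data-reduction step)."""
        X = train_images.reshape(len(train_images), -1).astype(float)
        lda = LinearDiscriminantAnalysis(n_components=n_components)
        lda.fit(X, train_labels)
        return lda

    def classify_symbol(query, train_images, train_labels, lda, n_candidates=5):
        """Shortlist training images by distance in LDA space, then let
        SSIM on the raw images decide among the closest candidates."""
        X = train_images.reshape(len(train_images), -1).astype(float)
        feats = lda.transform(X)
        q = lda.transform(query.reshape(1, -1).astype(float))
        # Rank by Euclidean distance in the reduced feature space.
        order = np.argsort(np.linalg.norm(feats - q, axis=1))[:n_candidates]
        # Final decision: highest structural similarity with the query image.
        scores = [ssim(query, train_images[i],
                       data_range=query.max() - query.min())
                  for i in order]
        return train_labels[order[int(np.argmax(scores))]]

Shortlisting in LDA space before computing SSIM keeps the per-query cost low, which is consistent with the reported sub-second search time; whether the paper screens candidates this way or scores every training image with SSIM is not stated in the abstract.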