Handwritten Numbers and Characters Recognition Using Machine Learning Algorithms
Abstract
Handwritten digit recognition is one of the most well-known challenges in machine learning and computer vision. A variety of machine learning algorithms have been applied to the problem of recognizing handwritten digits and letters. The current research focuses on neural networks, of which deep neural networks, deep belief networks, and convolutional neural networks are the three most widely used. These three algorithms are compared and assessed not only in terms of recognition accuracy but also in terms of other relevant metrics such as execution time. The experiments are conducted on a random and standard dataset of handwritten digits. The results demonstrate that the deep belief network is the most accurate of the three techniques, with a 98.08 percent accuracy rate, while its execution time is comparable to that of the other two methods. Each method nevertheless exhibits a 1-2 percent error rate caused by similarities in digit shapes, particularly between the pairs (1,7), (3,5), (3,8), (8,5), (9,5), and (6,9).
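As a rough illustration of the evaluation protocol described above (measuring both recognition accuracy and execution time for a neural-network digit classifier), the following minimal sketch uses scikit-learn's MLPClassifier and the built-in 8x8 digits dataset as stand-ins; it is not the paper's DNN, DBN, or CNN implementation, and the dataset and hyperparameters are assumptions for illustration only.

```python
# Minimal sketch: evaluate a simple neural-network digit classifier
# on accuracy and execution time. The model and dataset are stand-ins
# for the architectures and handwritten-digit data used in the paper.
import time

from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Load and split the built-in handwritten-digit data.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# A small fully connected network; each compared algorithm would be
# assessed with the same accuracy/execution-time protocol.
model = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=300, random_state=0)

start = time.perf_counter()
model.fit(X_train, y_train)
train_time = time.perf_counter() - start

start = time.perf_counter()
accuracy = accuracy_score(y_test, model.predict(X_test))
test_time = time.perf_counter() - start

print(f"accuracy: {accuracy:.4f}")
print(f"training time: {train_time:.2f}s, inference time: {test_time:.2f}s")
```

Confusions between similarly shaped digit pairs such as (3,5) or (8,5) could be inspected further with a confusion matrix over the test predictions.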
Authors
Kumar K., Singh H., Khan S., Manu Sharma