This is a class final project for 05-499, Human-AI Interaction taught by Professor Jeffrey Bigham.
In countries such as the United States, all banknotes are the same size. This makes it very difficult for blind people to distinguish one banknote from another. Usually, blind people have to fold bills of different denominations in different ways, or keep them in different compartments of their wallets. This is very inconvenient, and they can never be sure whether the banknotes they receive from an ATM or a cashier are correct.
What it does
An assistive mobile application that scans, classifies, and reads out the value of banknotes for blind users.
How I built it
An Android app captures images through the phone's camera and uses a machine learning model to classify them.
Used transfer learning by modifying a pretrained ResNet. To train the model, I collected image data by scraping images from web pages (Google results for different banknote denominations).
The app reads the classification result aloud.