AI Learning Tools

To hear audio from these tools and learn more about them, see the App Demo page.

Transcription

The Transcription Service helps hearing-impaired students communicate with those who might not know sign language by transcribing speech into text in real time. It uses speech recognition, with audio captured through AVFoundation.
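
A minimal sketch of how that pairing typically looks is below: AVFoundation's AVAudioEngine streams microphone audio into a speech recognizer. The recognizer shown (SFSpeechRecognizer, from Apple's Speech framework) and the class name TranscriptionService are illustrative assumptions, not necessarily the app's exact implementation.

```swift
import AVFoundation
import Speech

// Sketch of live transcription: AVAudioEngine captures audio,
// SFSpeechRecognizer turns it into text. Names outside Apple's APIs are illustrative.
final class TranscriptionService {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    // Streams microphone audio into the recognizer and reports partial transcripts.
    func start(onUpdate: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.requiresOnDeviceRecognition = true   // keep processing offline
        self.request = request

        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }

        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, error in
            if let result = result {
                onUpdate(result.bestTranscription.formattedString)
            }
            if error != nil || (result?.isFinal ?? false) {
                self.stop()
            }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
        task?.cancel()
    }
}
```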

Document Reader

The Document Reader helps visually impaired students gain knowledge from external content by reading it aloud, combining Vision's text recognition with speech synthesis. The Object Recognizer, described below, uses a Vision-based ML model (SqueezeNet) to help visually impaired students identify objects in their classroom.
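
A minimal sketch of the read-aloud pipeline is below, assuming a captured page is available as a CGImage: Vision's VNRecognizeTextRequest extracts the text and AVSpeechSynthesizer speaks it. The helper name readAloud is illustrative, not the app's actual API.

```swift
import Vision
import AVFoundation

// Sketch of the read-aloud pipeline: Vision extracts text from a page image,
// then AVSpeechSynthesizer speaks it.
let synthesizer = AVSpeechSynthesizer()

func readAloud(_ image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Join the top text candidate from each detected line.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")
        let utterance = AVSpeechUtterance(string: text)
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed for documents

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```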

Object Recognition

The Object Recognition tool helps visually impaired students identify objects in their environment using the camera. It runs the SqueezeNet convolutional neural network, an image classification model, entirely on-device.
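
The sketch below shows roughly how such on-device classification can be wired up with Vision and Core ML. It assumes the SqueezeNet .mlmodel file is bundled in the project so Xcode generates a SqueezeNet class; the function name identifyObject is illustrative.

```swift
import Vision
import CoreML

// Sketch of on-device classification with the SqueezeNet Core ML model.
// Assumes SqueezeNet.mlmodel is bundled, so Xcode generates the SqueezeNet class.
func identifyObject(in image: CGImage, completion: @escaping (String) -> Void) {
    guard let model = try? VNCoreMLModel(for: SqueezeNet(configuration: MLModelConfiguration()).model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // The top classification is the label announced to the student.
        if let best = (request.results as? [VNClassificationObservation])?.first {
            completion("\(best.identifier) (\(Int(best.confidence * 100))%)")
        }
    }
    request.imageCropAndScaleOption = .centerCrop

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```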

Book Recommender

The Book Recommender tool uses a custom ML model that I trained to recommend books to students based on their answers to personalized questions, such as their favorite subject. I trained this model with CreateML on data gathered from actual students and deployed it with CoreML.
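
The training step might look roughly like the sketch below, which runs on a Mac (not inside the app) and exports a Core ML model. The file name student_survey.csv, the target column recommendedBook, and the output file name are placeholders; the real survey data and model details are not shown here.

```swift
// Training-time sketch (macOS command-line tool or playground, not the iOS app).
import CreateML
import Foundation

// Placeholder data set; column names are illustrative.
let data = try MLDataTable(contentsOf: URL(fileURLWithPath: "student_survey.csv"))
let (training, testing) = data.randomSplit(by: 0.8, seed: 42)

// Train a classifier that maps a student's answers to a recommended book.
let recommender = try MLClassifier(trainingData: training, targetColumn: "recommendedBook")
let evaluation = recommender.evaluation(on: testing)
print("Evaluation accuracy: \(1.0 - evaluation.classificationError)")

// Export for Core ML so the app can run predictions on-device.
try recommender.write(to: URL(fileURLWithPath: "BookRecommender.mlmodel"))
```

The exported .mlmodel file is then added to the Xcode project, and the app queries it on-device through the class Xcode generates for it.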

All of these tools use on-device processing so that Firefly can serve students living in the remotest of locations without an internet connection.