Jun 2020
Samantha’s Master’s Thesis

(Logo for the DeafBlind Contact Center in Allston, MA)

In June 2020, at the height of the COVID-19 pandemic, Samantha began her Master’s thesis, which aimed to develop a robotic hand to fingerspell tactile ASL. Working with the DeafBlind Contact Center in Allston, MA, she evaluated the first prototypes with DeafBlind individuals.

Aug 2021
Collaboration with Engineering Groups

(Logos for the University of Auckland’s New Dexterity group, Northeastern University’s Institute for Experiential Robotics, and the FreeMoCap project)

To make the device as effective as possible, the team established collaborations with the University of Auckland’s New Dexterity group, Northeastern University’s Institute for Experiential Robotics, and the FreeMoCap project - each group passionate about the goal of developing assistive technology for the DeafBlind community.

Aug 2021
Collaboration with CNIB and CNIB DBCS

(Logo for the Canadian National Institute for the Blind)

After Samantha completed her thesis project, the Canadian National Institute for the Blind (CNIB) reached out to voice its support and offer a grant to continue R&D. This inspired the founding of the company, Tatum Robotics!

Oct 2021
New site at MassRobotics

(Logo for MassRobotics)

Through the ongoing collaboration with the Institute for Experiential Robotics, the team moved into the innovation workspace MassRobotics. MassRobotics is home to dozens of robotics and related technology companies, with access to shared equipment and collaborative robots!

Nov 2021
Collaboration with Perkins School for the Blind

(Logo for Perkins School for the Blind)

In November 2021, we began collaborating with teachers at Perkins School for the Blind, which runs a dedicated program for DeafBlind children. Because each child has different language needs depending on their age, language exposure, and cognitive ability, the teachers at Perkins are used to prioritizing language access in whatever form it must take. They are contributing their unique insights and experiences to Tatum's team.

Dec 2021
Collaboration with Sense International - India

(Logo for Sense International - India)

To learn more about how tactile sign languages are used worldwide, the team is excited to announce a new collaboration with Sense International - India, a group that helps over 78,000 DeafBlind people in India find access to resources and community. We look forward to learning more about the needs of their community.

April 2022
Collaboration with Rochester Institute of Technology National Technical Institute for the Deaf's Center on Access Technology


(Logo for RIT NTID's Center on Access Technology) 

In April 2022, Tatum Robotics connected with the Center on Access Technology (CAT) at the National Technical Institute for the Deaf (NTID), one of America's leading schools for the Deaf and hard-of-hearing. The CAT develops innovative assistive technologies for people in the signing community. Going forward, Deaf engineers and faculty members there will be helping us with R&D and data collection.

In Development
Robotic Arm and Hand

(Tatum prototype demonstrating the signs for numbers 3, 4, 5, 7, 8, 9)

Hardware: The team is currently working on two projects. The first is a low-cost, anthropomorphic robotic hand that will fingerspell tactile sign language. We hope to validate this device in real-world settings with DeafBlind individuals soon to confirm recent design changes and evaluate ease of use. In parallel, we are developing a safe, compliant robotic arm so that the system can sign more complex words and phrases. Together, these systems will form a humanoid device that can sign tactile sign languages.
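
At its core, a fingerspelling hand works by stepping through one stored hand pose per letter. The sketch below illustrates that idea only; the letters, joint layout, and angle values are invented for this example and are not Tatum Robotics' actual design.

```python
# Illustrative sketch only: the joint layout and angle values below are
# invented and do not come from Tatum Robotics' hardware. It shows the
# general idea of fingerspelling: command one stored pose per letter.

# Hypothetical finger-curl poses in degrees: (thumb, index, middle, ring, pinky).
LETTER_POSES = {
    "a": (10, 170, 170, 170, 170),  # fist with thumb alongside
    "b": (90, 0, 0, 0, 0),          # thumb across palm, fingers straight
    "c": (45, 60, 60, 60, 60),      # all digits partially curled
}

def fingerspell(word, send_pose):
    """Command one pose per letter; on real hardware, send_pose would
    drive the servos, holding each pose briefly for tactile reading."""
    for letter in word.lower():
        pose = LETTER_POSES.get(letter)
        if pose is not None:        # this sketch only knows a few letters
            send_pose(pose)

sent = []
fingerspell("cab", sent.append)
# sent now holds the poses for "c", then "a", then "b"
```

On a physical device, `send_pose` would also enforce speed and force limits so the hand stays safe and comfortable for a DeafBlind reader's hand resting on it.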

Linguistics: In an effort to sign accurately and repeatably, the team is looking to logically parse through tactile American Sign Language (ASL), Pidgin Signed English (PSE), and Signed Exact English (SEE). Although research has been conducted in this field, we aim to be the first to develop an algorithm to understand the complexities and fluidity of t-ASL without the need for user confirmation of translations or pre-programmed responses.