2TACS-DP: Assistive Tongue-Based Technology for Persons with Disabilities

A Groundbreaking USP Innovation for Inclusive Communication

The University of the South Pacific is excited to announce a pioneering research project titled “Two-way Tongue-based Assistive Communication System for Disabled Persons (2TACS-DP).” This project is a collaborative effort between USP and the National University of Samoa (NUS) aimed at improving the quality of life for individuals with severe physical disabilities across the Pacific region.

🌍 Why This Matters

Millions of people worldwide, including many in the South Pacific, live with paralysis caused by spinal cord injuries or degenerative diseases. Traditional assistive devices like speech recognition tools, head pointers, and eye trackers are often limited by noise, fatigue, or discomfort. There is an urgent need for non-invasive, intelligent, and user-friendly technologies that promote independence and dignity.

💡 The Innovation

2TACS-DP is a wearable, wireless assistive system that enables individuals with limited mobility to:

  • Control digital devices (TV, lights, computer) using tongue gestures
  • Communicate wirelessly between two users with disabilities
  • Translate tongue movements into natural language using an AI-generated extended alphabet
  • Interpret incomplete gestures through adaptive AI models

This technology uses magnetic sensors and a tiny tongue-mounted tracer to capture motion data, which is processed by a microcontroller system and transmitted to control external devices.

✨ The Vision

This research aims to make a lasting impact on social inclusion and digital accessibility for people with disabilities in the Pacific. It lays the foundation for future innovation in wearable AI, rehabilitation engineering, and non-verbal communication systems.
🔧 How It Works

  1. A small magnetic tracer is placed on the tongue.
  2. Magnetic field variations are detected by head-mounted sensors.
  3. Data is transmitted at 300 Mbps to a processor that converts the movement into commands.
  4. Commands are used to operate home devices or are exchanged between users in real time.
  5. AI techniques ensure accuracy and adapt to user-specific variations in gestures.

This system is low-power, non-invasive, and optimized for real-world use through intelligent sensor activation and battery conservation.
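The processing pipeline described above can be sketched in code. The example below is purely illustrative: the sensor layout, gesture labels, thresholds, and command names are assumptions made for this sketch, and a simple dominant-axis rule stands in for the project's adaptive AI models.

```python
# Illustrative sketch only: gesture names, thresholds, and command names
# are hypothetical, not the actual 2TACS-DP firmware or model.
from dataclasses import dataclass
from typing import List


@dataclass
class MagSample:
    """One reading from a head-mounted 3-axis magnetometer (arbitrary units)."""
    x: float
    y: float
    z: float


# Hypothetical mapping from a coarse gesture label to a device command.
COMMAND_MAP = {
    "tongue_left": "TV_CHANNEL_DOWN",
    "tongue_right": "TV_CHANNEL_UP",
    "tongue_up": "LIGHTS_ON",
    "tongue_down": "LIGHTS_OFF",
}


def classify_gesture(samples: List[MagSample]) -> str:
    """Classify a window of magnetometer samples into a coarse gesture.

    A real system would use a trained model that adapts to user-specific
    variation in tongue movement; a dominant-axis rule stands in here.
    """
    mean_x = sum(s.x for s in samples) / len(samples)
    mean_y = sum(s.y for s in samples) / len(samples)
    if abs(mean_x) >= abs(mean_y):
        return "tongue_right" if mean_x > 0 else "tongue_left"
    return "tongue_up" if mean_y > 0 else "tongue_down"


def to_command(samples: List[MagSample]) -> str:
    """Convert a window of sensor readings into a device command."""
    return COMMAND_MAP[classify_gesture(samples)]


# Example: a window where the magnetic field shifts strongly along +x.
window = [MagSample(0.8, 0.1, 0.0), MagSample(0.9, -0.05, 0.0)]
print(to_command(window))  # TV_CHANNEL_UP
```

In the actual system, the classifier would be trained per user (step 5 above), so that incomplete or idiosyncratic gestures still map to the intended command.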

🧪 Research Approach

  • Multi-disciplinary methodology: combines engineering, AI, linguistics, and the health sciences.
  • User testing: male and female participants aged 18–66 with varying levels of technical experience.
  • AI training: algorithms refine gesture interpretation and support multi-language capability (English and Samoan).
  • Feedback tools: semi-structured interviews and validated questionnaires guide the system design.

🎯 Objectives & Expected Outcomes

  • Develop and test a functional prototype
  • Expand the tongue-gesture alphabet for natural communication
  • Publish two Q1-ranked journal articles and two assistive-technology conference papers
  • Partner with NGOs to design a comprehensive command dictionary for real-world deployment
  • Generate policy recommendations for inclusive digital accessibility

📍 Project Details

  • Locations: USP Laucala Campus, Fiji; NUS Campus, Samoa
  • Duration: 21 months
  • PIURN-funded budget: FJD $29,141.52, covering travel, equipment, student assistant support, simulation tools, 3D prototyping, and stakeholder engagement

👥 Research Team

  • Principal Investigator: Mansour Assaf (USP)
  • Co-Investigator: Ioana Chan Mow (NUS)
  • Collaborators: Rahul Kumar and Bibhya Sharma (USP); Dr. Lorina Chandra (Sports City Medical Center); and NGOs for Persons with Disabilities (PWDs)

For collaboration or more information, contact:
📧 mansour.assaf@usp.ac.fj