Tech-taal: Regenerating the Embodied Heritage of Bharatanatyam through Multisensory Restoration and Machine Learning
Description
Abstract: This ongoing research frames Machine Learning as an embodied process by reimagining the Indian classical dance form ‘Bharatanatyam’ as a regenerative system of embodied knowledge, one that can be restored, reinterpreted, and re-experienced through multisensory translation using electronic textiles (e-textiles), motion design, computer vision, and generative sound synthesis. Traditionally performed with Indian Carnatic music, ‘Bharatanatyam’ is an intricate composition of rhythmic steps (‘Nrit’), expressive gestures (‘Nritya’), and dramatic storytelling (‘Natya’), built on foundational movement units known as ‘Adavus’. ‘Tech-taal’ seeks to digitally regenerate these core performance movements as new media vocabularies for a learning model. The project employs sound and motion sensing to capture the embodied rhythm of ‘Adavus’. Motion tracking sensors and microphones are integrated as e-textile embeds into traditional performance accessories and ornaments such as ‘Ghungroo’ (ankle bells), ‘Bajubandh’ (armlet), and ‘Thaalam’ (percussive hand cymbals) to record both kinetic and acoustic data. These signals form the generative inputs for visual and sonic compositions, producing a dynamic archive that restores and reanimates the sensorial essence of the dance in multisensory form. By embedding sensors into culturally significant artifacts, ‘Tech-taal’ functions as a living interface between tradition and technology. The project does not seek to resolve the layered complexities of ‘Bharatanatyam’, to create a digital twin, or to build a teaching model, but to sustain and extend performance vocabularies through iterative processes of digital embodiment, regeneration, and reinterpretation.
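To give a concrete sense of how kinetic data from an embedded ornament sensor could feed a generative system, the following is a minimal illustrative sketch. It assumes an accelerometer in a ‘Ghungroo’ ankle bell streaming acceleration magnitudes; simple onset detection extracts the rhythm of an ‘Adavu’, and the implied tempo becomes a parameter for visual or sonic generation. All function names, thresholds, and the simulated stream are hypothetical, not the project's actual pipeline.

```python
# Hypothetical sketch: deriving rhythm features from a simulated
# 'Ghungroo' (ankle bell) accelerometer stream. Thresholds and
# signal shapes are illustrative assumptions only.

def detect_onsets(samples, rate_hz, threshold=1.5):
    """Return onset times (seconds) where acceleration magnitude
    crosses the threshold on a rising edge."""
    onsets = []
    above = False
    for i, s in enumerate(samples):
        if s >= threshold and not above:
            onsets.append(i / rate_hz)
            above = True
        elif s < threshold:
            above = False
    return onsets

def rhythm_features(onsets):
    """Mean inter-onset interval and implied tempo (BPM),
    usable as parameters for a generative visual/sonic layer."""
    if len(onsets) < 2:
        return None
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    mean_ioi = sum(intervals) / len(intervals)
    return {"mean_ioi_s": mean_ioi, "tempo_bpm": 60.0 / mean_ioi}

# Simulated ankle-bell stream at 100 Hz: a footfall spike every 0.5 s.
rate = 100
stream = [2.0 if i % 50 == 0 else 0.2 for i in range(400)]
feats = rhythm_features(detect_onsets(stream, rate))
print(feats)  # implied tempo: 120 BPM
```

In a fuller system, the acoustic channel (microphone in the ‘Thaalam’) would be fused with such kinetic features before driving the generative compositions; this sketch covers only the kinetic half.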
Artists