Voulodimos, Athanasios; Doulamis, Nikolaos; Doulamis, Anastasios; Rallis, Ioannis. "Kinematics-based Extraction of Salient 3D Human Motion Data for Summarization of Choreographic Sequences." IEEE, Institute of Electrical and Electronics Engineers Inc., 2018, vol. 2018-August, pp. 3013-3018.

Keywords: Digital storage; dance; Pattern recognition; Extraction; Data mining; Kinematics; choreographic sequences; Key-frame extraction; Summarization

URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85059737928&doi=10.1109%2fICPR.2018.8545078&partnerID=40&md5=74245ca672cf41fce02ee6c0c1220c8d

Abstract: Capturing, documenting, and storing Intangible Cultural Heritage content has recently been enabled at unprecedented volume and quality levels through a variety of sensors and devices. When it comes to the performing arts, and mainly dance and kinesiology, the massive amounts of RGB-D and 3D skeleton data produced by video and motion capture devices, together with the huge number of different types of existing dances and variations thereof, dictate the need for organizing, indexing, archiving, retrieving, and analyzing dance-related cultural content in a tractable fashion and with lower computational and storage resource requirements. In this context, we present a novel framework based on kinematics modeling for the extraction of salient 3D human motion data from real-world choreographic sequences. Two approaches are proposed: a clustering-based method for the selection of the basic primitives of a choreography, and a kinematics-based method that generates meaningful summaries at hierarchical levels of granularity. The dance summarization framework has been successfully validated and evaluated with two real-world datasets and with the participation of dance professionals and domain experts.

ISSN: 10514651; ISBN: 9781538637883
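The abstract's first approach (selecting a choreography's basic primitives by clustering) can be illustrated with a minimal sketch. This is not the paper's actual algorithm: it assumes poses arrive as an array of 3D joint positions, flattens each frame, runs a plain k-means, and returns the frame nearest each centroid as a key frame. The function name, cluster count, and distance choice are all illustrative assumptions.

```python
import numpy as np

def extract_keyframes(poses, n_clusters=8, n_iter=50, seed=0):
    """Illustrative key-frame selection (not the paper's method):
    cluster flattened 3D joint positions with k-means and return
    the index of the frame nearest each cluster centroid."""
    rng = np.random.default_rng(seed)
    X = poses.reshape(len(poses), -1)  # (frames, joints * 3)
    # initialize centroids from randomly chosen distinct frames
    centroids = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # assign every frame to its nearest centroid
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned frames
        for k in range(n_clusters):
            if (labels == k).any():
                centroids[k] = X[labels == k].mean(axis=0)
    # key frame for each cluster = member closest to the centroid
    d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
    return sorted({int(d[:, k].argmin()) for k in range(n_clusters)})
```

For example, a sequence of 120 frames with 25 joints (`poses.shape == (120, 25, 3)`) would be summarized by at most `n_clusters` representative frame indices, in temporal order.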