Journal article Open Access

Deep Motifs and Motion Signatures

Aristidou Andreas; Cohen-Or Daniel; Hodgins Jessica K; Chrysanthou Yiorgos; Shamir Ariel


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam##2200000uu#4500</leader>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Motion capture</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Motion processing</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Animation</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Motion Word</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Motif</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Motion Signature</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Convolutional Network,</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Triplet Loss</subfield>
  </datafield>
  <controlfield tag="005">20191111070842.0</controlfield>
  <datafield tag="500" ind1=" " ind2=" ">
    <subfield code="a">This work has been partly supported by the project that has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 739578 (RISE – Call: H2020-WIDESPREAD-01-2016-2017-TeamingPhase2)  and the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development. 

© 2018 ACM. Licensed under CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Citation: Andreas Aristidou, Daniel Cohen-Or, Jessica K. Hodgins, Yiorgos Chrysanthou, and Ariel Shamir. 2018. Deep motifs and motion signatures. ACM Trans. Graph. 37, 6, Article 187 (December 2018), 13 pages. DOI: https://doi.org/10.1145/3272127.3275038</subfield>
  </datafield>
  <controlfield tag="001">2658798</controlfield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Tel-Aviv University</subfield>
    <subfield code="a">Cohen-Or Daniel</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Carnegie Mellon University</subfield>
    <subfield code="a">Hodgins Jessica K</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Research Centre on Interactive Media Smart Systems and Emerging Technologies</subfield>
    <subfield code="a">Chrysanthou Yiorgos</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">The Interdisciplinary Centre</subfield>
    <subfield code="a">Shamir Ariel</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">25304177</subfield>
    <subfield code="z">md5:e4716169f486e6ad90f0633c6d8564df</subfield>
    <subfield code="u">https://zenodo.org/record/2658798/files/DeepSignatures.pdf</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2018-11-01</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">user-rise-teaming-cyprus</subfield>
    <subfield code="o">oai:zenodo.org:2658798</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">The Interdisciplinary Centre</subfield>
    <subfield code="a">Aristidou Andreas</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Deep Motifs and Motion Signatures</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">user-rise-teaming-cyprus</subfield>
  </datafield>
  <datafield tag="536" ind1=" " ind2=" ">
    <subfield code="c">739578</subfield>
    <subfield code="a">Research Center on Interactive Media, Smart System and Emerging Technologies</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">http://creativecommons.org/licenses/by-nc-nd/4.0/legalcode</subfield>
    <subfield code="a">Creative Commons Attribution Non Commercial No Derivatives 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;Many analysis tasks for human motion rely on high-level similarity between sequences of motions, that are not an exact matches in joint angles, timing, or ordering of actions. Even the same movements performed by the same person can vary in duration and speed. Similar motions are characterized by similar sets of actions that appear frequently. In this paper we introduce motion motifs and motion signatures that are a succinct but descriptive representation of motion sequences. We first break the motion sequences to short-term movements called motion words, and then cluster the words in a high-dimensional feature space to find motifs. Hence, motifs are words that are both common and descriptive, and their distribution represents the motion sequence. To cluster words and find motifs, the challenge is to define an effective feature space, where the distances among motion words are semantically meaningful, and where variations in speed and duration are handled. To this end, we use a deep neural network to embed the motion&amp;nbsp;words into feature space using a triplet loss function. To define a signature, we choose a finite set of motion-motifs, creating a bag-of-motifs representation for the sequence. Motion signatures are agnostic to movement order, speed or duration variations, and can distinguish fine-grained differences between motions of the same class. We illustrate examples of characterizing motion sequences by motifs, and for the use of motion signatures in anumber of applications.&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.1145/3272127.3275038</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">publication</subfield>
    <subfield code="b">article</subfield>
  </datafield>
</record>
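
The export above can be read with any generic XML parser; no MARC21-specific tooling is needed. Below is a minimal Python sketch, assuming the record has been saved locally as record.xml (the file name is an assumption; the field tags and subfield codes are the ones used in the record):

    import xml.etree.ElementTree as ET

    NS = {"marc": "http://www.loc.gov/MARC21/slim"}

    def subfields(root, tag, code):
        """Return all subfield values for a given MARC datafield tag and subfield code."""
        path = f"marc:datafield[@tag='{tag}']/marc:subfield[@code='{code}']"
        return [el.text for el in root.findall(path, NS)]

    root = ET.parse("record.xml").getroot()   # root is the <record> element

    title    = subfields(root, "245", "a")[0]                             # tag 245: title
    authors  = subfields(root, "100", "a") + subfields(root, "700", "a")  # tags 100/700: creators
    keywords = subfields(root, "653", "a")                                # tag 653: keywords
    doi      = subfields(root, "024", "a")[0]                             # tag 024: DOI

    print(title)     # Deep Motifs and Motion Signatures
    print(doi)       # 10.1145/3272127.3275038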
Views 70
Downloads 40
Data volume 1.0 GB
Unique views 56
Unique downloads 39
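
The abstract describes a concrete pipeline: split each sequence into short motion words, embed the words with a triplet loss so that distances are semantically meaningful, cluster the embeddings to find motifs, and represent a sequence as a bag-of-motifs histogram. Below is a minimal, hypothetical sketch of the embedding and signature steps using PyTorch and scikit-learn; the encoder architecture, the dimensions (63 joint features, 30-frame words, 128-D embeddings), the margin, and the motif count are illustrative assumptions, not the paper's implementation.

    import numpy as np
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from sklearn.cluster import KMeans

    # Hypothetical 1D-convolutional encoder: embeds a motion word
    # (joint_dim features x T frames) into a unit-norm feature vector.
    class MotionWordEncoder(nn.Module):
        def __init__(self, joint_dim=63, embed_dim=128):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv1d(joint_dim, 256, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv1d(256, 256, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),  # average over time: tolerant to speed/duration changes
            )
            self.fc = nn.Linear(256, embed_dim)

        def forward(self, x):             # x: (batch, joint_dim, T)
            z = self.fc(self.conv(x).squeeze(-1))
            return F.normalize(z, dim=-1)

    encoder = MotionWordEncoder()
    triplet = nn.TripletMarginLoss(margin=0.2)

    # Anchor/positive are similar motion words, negative is dissimilar (dummy tensors here).
    a, p, n = (torch.randn(32, 63, 30) for _ in range(3))
    loss = triplet(encoder(a), encoder(p), encoder(n))
    loss.backward()

    # After training: cluster word embeddings into K motifs, then build a signature
    # as the normalized histogram of motif assignments (bag-of-motifs).
    K = 20
    emb = encoder(torch.randn(500, 63, 30)).detach().numpy()
    motifs = KMeans(n_clusters=K, n_init=10).fit(emb)
    signature = np.bincount(motifs.labels_, minlength=K) / len(emb)

Because the signature is a histogram over motifs, it is by construction indifferent to the order in which the motion words occur, matching the order-agnostic property claimed in the abstract.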
