Conference paper (Restricted Access)

Interleaving Hierarchical Task Planning and Motion Constraint Testing for Dual-Arm Manipulation

Suárez Hernández, Alejandro; Alenyà, Guillem; Torras, Carme


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="URL">https://zenodo.org/record/3463479</identifier>
  <creators>
    <creator>
      <creatorName>Suárez Hernández, Alejandro</creatorName>
      <givenName>Alejandro</givenName>
      <familyName>Suárez Hernández</familyName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0003-1611-614X</nameIdentifier>
      <affiliation>IRI, CSIC-UPC</affiliation>
    </creator>
    <creator>
      <creatorName>Alenyà, Guillem</creatorName>
      <givenName>Guillem</givenName>
      <familyName>Alenyà</familyName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0002-6018-154X</nameIdentifier>
      <affiliation>IRI, CSIC-UPC</affiliation>
    </creator>
    <creator>
      <creatorName>Torras, Carme</creatorName>
      <givenName>Carme</givenName>
      <familyName>Torras</familyName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0002-2933-398X</nameIdentifier>
      <affiliation>IRI, CSIC-UPC</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Interleaving Hierarchical Task Planning and Motion Constraint Testing for Dual-Arm Manipulation</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2018</publicationYear>
  <subjects>
    <subject>Task analysis, Planning, Uncertainty, Shape, Manipulators, Cameras</subject>
  </subjects>
  <dates>
    <date dateType="Issued">2018-10-01</date>
  </dates>
  <resourceType resourceTypeGeneral="ConferencePaper"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/3463479</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsIdenticalTo">10.1109/IROS.2018.8593847</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="info:eu-repo/semantics/restrictedAccess">Restricted Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;In recent years the topic of combining motion and symbolic planning to perform complex tasks in the field of robotics has received a lot of attention. The underlying idea is to have access at once to the reasoning capabilities of a task planner and to the ability of the motion planner to verify that the plan is feasible from a physical and geometrical point of view. The present work describes a framework to perform manipulation tasks that require the use of two robotic manipulators. To do so we employ a Hierarchical Task Network (HTN) planner interleaved with geometric constraint verification. In this framework we also consider observation actions and handle noisy perceptions from a probabilistic perspective. These ideas are put into practice by means of an experimental set-up in which two Barrett WAM robots have to cooperatively solve a geometric puzzle. Our findings provide further evidence that considering explicitly physical constraints during task planning, rather than deferring their validation to the moment of execution, is advantageous in terms of execution time and breadth of situations that can be handled.&lt;/p&gt;</description>
  </descriptions>
  <fundingReferences>
    <fundingReference>
      <funderName>European Commission</funderName>
      <funderIdentifier funderIdentifierType="Crossref Funder ID">10.13039/100010661</funderIdentifier>
      <awardNumber awardURI="info:eu-repo/grantAgreement/EC/H2020/731761/">731761</awardNumber>
      <awardTitle>Robots Understanding Their Actions by Imagining Their Effects</awardTitle>
    </fundingReference>
  </fundingReferences>
</resource>
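
The abstract above describes interleaving HTN task planning with geometric constraint testing. As a purely illustrative sketch, and not the authors' implementation, the following Python fragment shows one way such interleaving can work: a task is refined through alternative decompositions, and a primitive action is accepted only if a hypothetical motion_feasible check (for instance an inverse-kinematics or collision query) succeeds, so a geometric failure triggers backtracking to a different decomposition instead of surfacing at execution time. All names (Task, plan, motion_feasible) are assumptions chosen for the example.

from dataclasses import dataclass
from typing import Callable, Dict, List, Optional


@dataclass
class Task:
    name: str
    # Alternative decompositions (HTN methods); an empty list marks a primitive action.
    methods: List[List["Task"]]


def plan(task: Task,
         motion_feasible: Callable[[Task, Dict], bool],
         state: Dict) -> Optional[List[str]]:
    """Depth-first HTN-style refinement with geometric backtracking."""
    if not task.methods:
        # Primitive action: commit to it only if the geometric test passes.
        return [task.name] if motion_feasible(task, state) else None
    for method in task.methods:
        result: Optional[List[str]] = []
        for sub in method:
            sub_plan = plan(sub, motion_feasible, state)
            if sub_plan is None:
                result = None          # geometric dead end: try the next decomposition
                break
            result.extend(sub_plan)
        if result is not None:
            return result              # first decomposition whose motions all check out
    return None                        # every decomposition failed geometrically


# Toy usage: a handover refinable with either arm; the stand-in feasibility test
# rejects the right arm, so the planner backtracks to the left-arm method.
pick_left = Task("pick(left_arm, piece)", [])
pick_right = Task("pick(right_arm, piece)", [])
handover = Task("handover(piece)", [[pick_right], [pick_left]])
print(plan(handover, lambda t, s: "left_arm" in t.name, {}))
# -> ['pick(left_arm, piece)']

The sketch deliberately omits the probabilistic handling of noisy observations mentioned in the abstract; its only purpose is to show how symbolic refinement and motion-level feasibility checks can be interleaved in a single search.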
Views: 70
Downloads: 9
Data volume: 49.7 MB
Unique views: 51
Unique downloads: 2
