Software Open Access

Building Location Embeddings from Physical Trajectories and Textual Representations

Biester, Laura; Banea, Carmen; Mihalcea, Rada


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="DOI">10.5281/zenodo.4479440</identifier>
  <creators>
    <creator>
      <creatorName>Biester, Laura</creatorName>
      <givenName>Laura</givenName>
      <familyName>Biester</familyName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0003-3901-2968</nameIdentifier>
      <affiliation>University of Michigan</affiliation>
    </creator>
    <creator>
      <creatorName>Banea, Carmen</creatorName>
      <givenName>Carmen</givenName>
      <familyName>Banea</familyName>
      <affiliation>University of Michigan</affiliation>
    </creator>
    <creator>
      <creatorName>Mihalcea, Rada</creatorName>
      <givenName>Rada</givenName>
      <familyName>Mihalcea</familyName>
      <affiliation>University of Michigan</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Building Location Embeddings from Physical Trajectories and Textual Representations</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2020</publicationYear>
  <dates>
    <date dateType="Issued">2020-12-04</date>
  </dates>
  <language>en</language>
  <resourceType resourceTypeGeneral="Software"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/4479440</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsSupplementTo" resourceTypeGeneral="Text">https://www.aclweb.org/anthology/2020.aacl-main.44/</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.5281/zenodo.4479439</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="https://creativecommons.org/licenses/by/4.0/legalcode">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;Code for building and evaluating location embeddings for the 2020 AACL-IJCNLP&amp;nbsp;paper &amp;quot;Building Location Embeddings from Physical Trajectories and Textual Representations.&amp;quot;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Abstract:&amp;nbsp;&lt;/strong&gt;Word embedding methods have become the de facto way to represent words, having been successfully applied to a wide array of natural language processing tasks. In this paper, we explore the hypothesis that embedding methods can also be effectively used to represent spatial locations. Using a new dataset consisting of the location trajectories of 729 students over a seven-month period and text data related to those locations, we implement several strategies to create location embeddings, which we then use to create embeddings of the sequences of locations a student has visited. To identify the surface-level properties captured in the representations, we propose a number of probing tasks such as the presence of a specific location in a sequence or the type of activities that take place at a location. We then leverage the representations we generated and employ them in more complex downstream tasks ranging from predicting a student&amp;#39;s area of study to a student&amp;#39;s depression level, showing the effectiveness of these location embeddings.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Contact:&lt;/strong&gt;&amp;nbsp;Please contact Laura Biester (lbiester@umich.edu) with questions.&lt;/p&gt;</description>
  </descriptions>
</resource>
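
The abstract above describes building embeddings for locations and then for the sequences of locations a student has visited. As one hedged illustration (not the released code), location trajectories can be treated as token sequences and fed to a standard word-embedding model; the location names, hyperparameters, and the mean-pooling step below are assumptions made purely for this sketch, and the paper itself explores several embedding strategies.

    # Minimal sketch, assuming gensim >= 4.0; identifiers and settings are illustrative.
    import numpy as np
    from gensim.models import Word2Vec

    # Example trajectories: each list is one student's sequence of visited location IDs.
    trajectories = [
        ["dorm_a", "dining_hall", "library", "gym", "dorm_a"],
        ["dorm_b", "lecture_hall_1", "library", "cafe", "dorm_b"],
    ]

    # Train skip-gram (sg=1) location embeddings over the trajectory "sentences".
    model = Word2Vec(trajectories, vector_size=64, window=3, min_count=1, sg=1, epochs=50)

    def embed_sequence(sequence, wv):
        """Embed a visited-location sequence as the mean of its location vectors."""
        vectors = [wv[loc] for loc in sequence if loc in wv]
        return np.mean(vectors, axis=0) if vectors else np.zeros(wv.vector_size)

    student_vector = embed_sequence(["dorm_a", "library", "gym"], model.wv)
    print(student_vector.shape)  # (64,)

Mean pooling is only the simplest way to turn location vectors into a sequence representation; for the actual models, evaluation code, and probing tasks, refer to the released repository.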