<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:community="http://www.bibsonomy.org/ontologies/2008/05/community#" xmlns:foaf="http://xmlns.com/foaf/0.1/" xmlns:owl="http://www.w3.org/2002/07/owl#" xmlns:admin="http://webns.net/mvcb/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" xmlns:cc="http://web.resource.org/cc/" xmlns:xsd="http://www.w3.org/2001/XMLSchema#" xmlns:swrc="http://swrc.ontoware.org/ontology#" xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#" xmlns="http://purl.org/rss/1.0/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xml:base="https://puma.ub.uni-stuttgart.de/tag/Computing%20processing,"><owl:Ontology rdf:about=""><rdfs:comment>PUMA publications for /tag/Computing%20processing,</rdfs:comment><owl:imports rdf:resource="http://swrc.ontoware.org/ontology/portal"/></owl:Ontology><rdf:Description rdf:about="https://puma.ub.uni-stuttgart.de/bibtex/2381251de13a0c13447b9f359be320f12/hcics"><owl:sameAs rdf:resource="/uri/bibtex/2381251de13a0c13447b9f359be320f12/hcics"/><rdf:type rdf:resource="http://swrc.ontoware.org/ontology#Article"/><swrc:date>Thu Jul 11 10:05:52 CEST 2024</swrc:date><swrc:journal>IEEE Pervasive Computing</swrc:journal><swrc:number>2</swrc:number><swrc:pages>48-57</swrc:pages><swrc:title>What&#039;s in the Eyes for Context-Awareness?</swrc:title><swrc:volume>10</swrc:volume><swrc:year>2011</swrc:year><swrc:keywords>Computing Machine Pervasive Wearable computing, hcics learning, processing, signal vis </swrc:keywords><swrc:abstract>Eye movements are a rich source of information about a person&#039;s context. 
Analyzing the link between eye movements and cognition might even allow us to develop cognition-aware pervasive computing systems that assess a person&#039;s cognitive context.</swrc:abstract><swrc:hasExtraField><swrc:Field swrc:value="10.1109/MPRV.2010.49" swrc:key="doi"/></swrc:hasExtraField><swrc:author><rdf:Seq><rdf:_1><swrc:Person swrc:name="Andreas Bulling"/></rdf:_1><rdf:_2><swrc:Person swrc:name="Daniel Roggen"/></rdf:_2><rdf:_3><swrc:Person swrc:name="Gerhard Tröster"/></rdf:_3></rdf:Seq></swrc:author></rdf:Description><rdf:Description rdf:about="https://puma.ub.uni-stuttgart.de/bibtex/28488a975de92a7205c78b3fc95ff2326/hcics"><owl:sameAs rdf:resource="/uri/bibtex/28488a975de92a7205c78b3fc95ff2326/hcics"/><rdf:type rdf:resource="http://swrc.ontoware.org/ontology#Article"/><swrc:date>Thu Jul 11 10:05:52 CEST 2024</swrc:date><swrc:journal>IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)</swrc:journal><swrc:note>spotlight</swrc:note><swrc:number>4</swrc:number><swrc:pages>741-753</swrc:pages><swrc:title>Eye Movement Analysis for Activity Recognition Using Electrooculography</swrc:title><swrc:volume>33</swrc:volume><swrc:year>2011</swrc:year><swrc:keywords>Feature Ubiquitous and computing evaluation hcics processing, selection, signal vis </swrc:keywords><swrc:abstract>In this work we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data was recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals - saccades, fixations, and blinks - and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance feature selection (mRMR). 
We validate the method using an eight participant study in an office environment using an example set of five activity classes: copying a text, reading a printed paper, taking hand-written notes, watching a video, and browsing the web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and a person-independent (leave-one-out) training scheme, we obtain an average precision of 76.1% and recall of 70.5% over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.</swrc:abstract><swrc:hasExtraField><swrc:Field swrc:value="10.1109/TPAMI.2010.86" swrc:key="doi"/></swrc:hasExtraField><swrc:author><rdf:Seq><rdf:_1><swrc:Person swrc:name="Andreas Bulling"/></rdf:_1><rdf:_2><swrc:Person swrc:name="Jamie A. Ward"/></rdf:_2><rdf:_3><swrc:Person swrc:name="Hans Gellersen"/></rdf:_3><rdf:_4><swrc:Person swrc:name="Gerhard Tröster"/></rdf:_4></rdf:Seq></swrc:author></rdf:Description><foaf:Group rdf:about="https://puma.ub.uni-stuttgart.de/tag/Computing%20processing,"><foaf:name>Computing processing,</foaf:name><description>Community for tag(s) Computing processing,</description></foaf:Group></rdf:RDF>