Service-oriented context reasoning incorporating patterns and knowledge for understanding human-augmented situations

Gi Hyun Lim, Kun Woo Kim, Byoungjun Chung, Il Hong Suh, Hyowon Suh, Munsang Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Understanding situations has been regarded as a highly difficult task due to its complexity, especially in the case of human-augmented situations, where context is largely shaped by human activity. Although this complexity can be addressed by modeling the real environment, it is difficult to build a model that effectively handles the uncertainties of the real world. To address this, this paper proposes a fusion technology for service-oriented context reasoning that combines low-level sensory patterns and high-level semantic knowledge using a Hidden Markov Model (HMM) and an ontology. Integrated temporal reasoning with this approach enables a service robot to understand human-augmented situations immediately whenever a critical situation occurs. Experimental results show that the proposed method can successfully extract situations from continuous sensory signals with 80% reliability.
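The paper itself does not include code; as a rough illustration of the general idea of pairing an HMM over low-level sensor patterns with knowledge-level situation rules, here is a minimal Python sketch. All state names, sensor symbols, probabilities, and the `situation_rules` mapping are hypothetical placeholders, not taken from the paper; the paper's ontology-based reasoning is stood in for here by a simple rule lookup.

```python
import numpy as np

# Toy HMM over discrete sensor observations (hypothetical, not the paper's model):
# hidden human activities are inferred from a sequence of sensor events.
states = ["idle", "cooking", "eating"]                     # hypothetical activities
observations = ["no_motion", "stove_on", "utensil_use"]    # hypothetical sensor symbols

# Hypothetical HMM parameters: start, transition, and emission probabilities.
start_p = np.array([0.6, 0.3, 0.1])
trans_p = np.array([[0.7, 0.2, 0.1],
                    [0.1, 0.7, 0.2],
                    [0.2, 0.1, 0.7]])
emit_p = np.array([[0.8, 0.1, 0.1],
                   [0.1, 0.7, 0.2],
                   [0.2, 0.1, 0.7]])

def viterbi(obs_seq):
    """Return the most likely hidden-activity sequence for a list of sensor symbols."""
    obs_idx = [observations.index(o) for o in obs_seq]
    T, N = len(obs_idx), len(states)
    delta = np.zeros((T, N))          # best path probability ending in each state
    psi = np.zeros((T, N), dtype=int) # backpointers
    delta[0] = start_p * emit_p[:, obs_idx[0]]
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] * trans_p[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] * emit_p[j, obs_idx[t]]
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):     # backtrack through the stored pointers
        path.insert(0, psi[t, path[0]])
    return [states[i] for i in path]

# Hypothetical ontology-style rule: a recognized activity plus a context fact
# entails a service-relevant situation the robot should act on.
situation_rules = {("cooking", "smoke_detected"): "fire_risk: notify user"}

activity_seq = viterbi(["no_motion", "stove_on", "stove_on", "utensil_use"])
current = (activity_seq[-1], "smoke_detected")
print(activity_seq, "->", situation_rules.get(current, "no critical situation"))
```

In this sketch the HMM plays the role of the low-level pattern layer and the rule table stands in for the high-level semantic knowledge; the paper fuses these two layers through ontological reasoning rather than a fixed lookup.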
Original language: English
Title of host publication: 19th International Symposium in Robot and Human Interactive Communication
Publisher: IEEE
Pages: 144-150
Number of pages: 7
ISBN (Electronic): 9781424479900
ISBN (Print): 9781424479917
DOIs
Publication status: Published - 11 Oct 2010
Event: 19th IEEE International Symposium on Robot and Human Interactive Communication - Principe di Piemonte, Viareggio, Italy
Duration: 12 Sept 2010 - 15 Sept 2010
Conference number: 82469

Conference

Conference: 19th IEEE International Symposium on Robot and Human Interactive Communication
Abbreviated title: ROMAN 2010
Country/Territory: Italy
City: Viareggio
Period: 12/09/10 - 15/09/10
