Posts Tagged ‘2014’

PhD thesis: Context-Aware Assistive Systems for Augmented Work. A Framework Using Gamification and Projection

On May 21st I finished my PhD in Computer Science at the Institute for Visualization and Interactive Systems (VIS) at the University of Stuttgart. The advisors were Prof. Dr. Albrecht Schmidt from the VIS and Prof. Dr. Fillia Makedon from the University of Texas at Arlington (UTA). The work is based on the project ASLM, acquired by Prof. Dr. Thomas Hörz from the University of Applied Sciences Esslingen, and was continued in the project motionEAP.

The PhD is situated in the University of Stuttgart’s SimTech Cluster and was supported by the German Federal Ministry for Economic Affairs and Energy. It is published as “Open Source” – this means you can download and distribute this work freely as long as you indicate the source:

Context-Aware Assistive Systems for Augmented Work. A Framework Using Gamification and Projection (PDF, 7.7 MB)

If you prefer a printed version you can order it at Lulu Press.

Keywords:

assistive systems, assistive technologies, gamification, projection, motion recognition, context-aware, game design, human computer interaction, HCI, elderly, impaired, ethics, digital factory, cyber-physical systems, CPS

Abstract:

While context-aware assistive systems (CAAS) have become ubiquitous in cars or smartphones, most workers in production environments still rely on their skills and expertise to make the right choices and movements. (more…)

PETRA ’14

The 7th International Conference on PErvasive Technologies
Related to Assistive Environments

(organized by the University of Texas at Arlington, USA)


http://www.petrae.org/

May 27-30, Island of Rhodes, Greece

Presentation of Paper: Context-aware Assistive Systems at the Workplace. Analyzing the Effects of Projection and Gamification

Context-aware Assistive Systems at the Workplace. Analyzing the Effects of Projection and Gamification

Korn, Oliver; Funk, Markus; Abele, Stephan; Schmidt, Albrecht; Hörz, Thomas:
Context-aware Assistive Systems at the Workplace. Analyzing the Effects of Projection and Gamification

In: PETRA ’14 Proceedings of the 7th International Conference on PErvasive Technologies Related to Assistive Environments, ACM, New York, NY, USA, 2014
DOI: 10.1145/2674396.2674406

Abstract

Context-aware assistive systems (CAAS) have become ubiquitous in cars or smartphones but not in industrial work contexts: while there are systems controlling work results, context-specific assistance during the processes is hardly offered. As a result, production workers still have to rely on their skills and expertise. While unimpaired workers may cope well with this situation, elderly or impaired persons in production environments need context-sensitive assistance.

The contribution of the research presented here is three-fold: (1) We provide a framework for context-aware assistive systems in production environments. These systems are based on motion recognition and use projection and elements from game design (gamification) to augment work. (2) Based on this framework, we describe a prototype with respect to both the physical and the software implementation. (3) We present the results of a study with impaired workers, quantifying the effects of the augmentations on work speed and quality.

An Augmented Workplace for Enabling User-Defined Tangibles

Funk, Markus; Korn, Oliver; Schmidt, Albrecht:

An Augmented Workplace for Enabling User-Defined Tangibles

In: Extended Abstracts of the ACM SIGCHI Conference on Human Factors in Computing Systems, ACM, New York, NY, USA, 2014. DOI: 10.1145/2559206.2581142

Abstract

In this work, we introduce a novel setup for an augmented workplace, which allows for defining and interacting with user-defined tangibles. State-of-the-art tangible user interface systems equip both the underlying surface and the tangible control with sensors or markers. At the workplace, having unique tangibles for each available action results in confusion. Furthermore, tangible controls mix with regular objects and induce a messy desk. Therefore, we introduce the concept of user-defined tangibles, which enable a spontaneous binding between physical objects and digital functions. With user-defined tangibles, the need for specially designed tangible controls disappears and each physical object on the augmented workplace can be turned into a tangible control. We introduce our prototypical system and outline our proposed interaction concept.
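
To illustrate the core idea of binding arbitrary physical objects to digital functions, here is a minimal sketch in Python. It is my own illustration under stated assumptions, not the system described in the paper; the object IDs, the callback, and the tracking hook are hypothetical placeholders.

```python
# Minimal sketch (not the authors' implementation): binding arbitrary tracked
# objects to digital functions, as in the user-defined tangibles concept.
# Object IDs and the tracking source are hypothetical placeholders.

from typing import Callable, Dict


class TangibleRegistry:
    """Maps tracked physical objects to digital functions at runtime."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], None]] = {}

    def bind(self, object_id: str, action: Callable[[], None]) -> None:
        # Spontaneous binding: any recognized object can become a control.
        self._bindings[object_id] = action

    def unbind(self, object_id: str) -> None:
        self._bindings.pop(object_id, None)

    def on_object_moved(self, object_id: str) -> None:
        # Called by the tracking layer whenever a bound object is manipulated.
        action = self._bindings.get(object_id)
        if action:
            action()


# Usage: a coffee mug on the workbench becomes a "next step" control.
registry = TangibleRegistry()
registry.bind("mug_01", lambda: print("advance to next assembly step"))
registry.on_object_moved("mug_01")
```

The point of the sketch is the indirection: the registry knows nothing about what the object is, only that the tracking layer can identify it, so any object on the augmented workplace can take over a control role.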

Assistive Augmentation at the Manual Workplace using In-Situ Projection

Funk, Markus; Korn, Oliver; Schmidt, Albrecht:
Assistive Augmentation at the Manual Assembly Workplace using In-Situ Projection

In: CHI ’14 Workshop on Assistive Augmentation. April 27th 2014.

Abstract [CHI Workshop Paper]

In this paper, we argue for using in-situ projection to augment a user’s working experience. By recognizing objects on a workplace, the system is able to detect the current step within a workflow. Based on the information about position and orientation of the work-piece, specific feedback can be given – even as a projection on top of the workpiece. So far, our work indicates that this technology is accepted by the industry. Currently, we are investigating the effect of gamification elements on the error rate. Additionally, we introduce a model for the conception of context-aware assistive systems (CAAS). With our workshop participation, we want to discuss the potentials of in-situ projection at the manual workplace with the participants.
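
To make the workflow-step detection more concrete, the following is a minimal sketch of how recognized objects could advance an assembly workflow and trigger projected feedback anchored to the work piece. This is an illustration under my own assumptions, not the paper’s implementation; the step list, class names, and projection call are invented for the example.

```python
# Minimal sketch (assumptions, not the paper's code): mapping recognized
# objects at the workplace to steps of an assembly workflow and emitting
# projection feedback anchored to the work piece.

from dataclasses import dataclass
from typing import List


@dataclass
class DetectedObject:
    label: str    # e.g. "housing", "screw" (hypothetical labels)
    x: float      # position on the work surface (metres)
    y: float
    angle: float  # orientation in degrees


# Each workflow step is considered completed once a specific object appears.
WORKFLOW = ["housing", "gearbox", "screw", "cover"]


class AssemblyMonitor:
    def __init__(self) -> None:
        self.current_step = 0

    def update(self, detections: List[DetectedObject]) -> None:
        if self.current_step >= len(WORKFLOW):
            return
        expected = WORKFLOW[self.current_step]
        for obj in detections:
            if obj.label == expected:
                # Feedback is projected on top of the work piece, using its
                # detected position and orientation as the anchor.
                self.project_feedback(obj)
                self.current_step += 1
                break

    def project_feedback(self, obj: DetectedObject) -> None:
        print(f"project 'OK' at ({obj.x:.2f}, {obj.y:.2f}), rotated {obj.angle:.0f} deg")


monitor = AssemblyMonitor()
monitor.update([DetectedObject("housing", 0.42, 0.31, 15.0)])
```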

Tangible and Intuitive Interaction – Video of a Prototype

At the Institute for Visualization and Interactive Systems (VIS) of the University of Stuttgart and at Korion GmbH, I am part of a team developing the prototype of a new kind of assistive system in the project motionEAP. The prototype combines the 3D spaces of the depth sensors Kinect and Leap Motion.

It detects individual fingers of both hands and allows users to direct processes with simple gestures. Both gestures in space and touch events on the surface of the workspace are detected. At the same time, the system can project videos or interactive 3D spaces onto any kind of surface.

For example, this makes it possible to zoom or rotate a work part with simple gestures; a rough sketch of such gesture handling follows below. In future development iterations we will integrate object detection, allowing context- or product-specific feedback on processes, e.g. in manual assembly. This context-aware feedback is a prerequisite for the later implementation of gamification components. These will make it possible to integrate feedback smoothly and with minimal disruption while motivating the assistive system’s users.
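
As a rough illustration of the gesture handling described above, the sketch below fuses fingertip positions into a shared workspace frame and derives a simple two-finger zoom factor. It is my own simplified illustration, not the motionEAP code; the coordinate data, calibration, and sensor-reading step are hypothetical stand-ins for the Kinect and Leap Motion SDKs.

```python
# Minimal sketch (my assumptions, not the project code): fusing fingertip
# positions from two depth sensors into one coordinate space and deriving a
# simple two-finger zoom gesture.

import math
from typing import List, Tuple

Point3D = Tuple[float, float, float]


def to_workspace(points: List[Point3D], offset: Point3D) -> List[Point3D]:
    """Transform sensor-local points into the shared workspace frame
    (simplified here to a pure translation; a real setup would calibrate
    a full rigid transform per sensor)."""
    return [(x + offset[0], y + offset[1], z + offset[2]) for x, y, z in points]


def pinch_distance(fingertips: List[Point3D]) -> float:
    """Distance between the two tracked fingertips of a pinch gesture."""
    p1, p2 = fingertips[:2]
    return math.dist(p1, p2)


def zoom_factor(previous: float, current: float) -> float:
    """Spreading the fingers zooms in, closing them zooms out."""
    return current / previous if previous > 0 else 1.0


# Usage with made-up fingertip data from one of the sensors:
tips = to_workspace([(0.02, 0.10, 0.30), (0.06, 0.10, 0.30)], offset=(0.0, 0.0, 0.0))
print(f"zoom: {zoom_factor(0.03, pinch_distance(tips)):.2f}x")
```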

The following video illustrates the prototype’s current features: