Eye Gaze Interaction for a Mobile Text Application. A New Interface Concept


Bachelor's Thesis, 2019, 40 Pages, Grade: 1.0

Author: Thomas Mayer

Computer Science - Applied Computer Science

Abstract

A common element of a user interface for initiating an action is the button, which is traditionally touched or clicked. This work introduces a new concept, the GazeButton, which extends the standard functionality of a button with advanced gaze-based functions.

It acts as a universal interface for gaze-based UI interactions during classic touch interaction. It integrates easily with existing UIs and remains unobtrusively complementary, since it preserves the user's freedom to choose between classic and gaze-based interaction. It is also uncomplicated, because all new functions are bound locally to it, and, although it is only a small UI element, its gaze component makes it usable for interactions across the entire display and even beyond it. The GazeButton is demonstrated in a text editing application on a multi-touch tablet computer. For example, a word can be selected by looking at it and tapping the GazeButton, avoiding strenuous physical movement.

For such examples, concrete systematic designs that combine visual and manual user input are presented; the gaze-based text selection is then compared with the classic touch-based one in a user study.

Reading Sample


Table of Contents

1 Introduction

2 Related Work

2.1 UI for Handheld Devices

2.2 General Gaze Interaction

2.3 Gaze Interaction on Handheld Devices

2.4 Eye Typing

2.4.1 Dwell Time Based Eye Typing

2.4.2 Dwell Time Free Eye Typing

2.4.3 Eye Typing in Security and VR

3 Gaze-enhanced Text Editing

3.1 Explanation of Important Components

3.2 Design Considerations

3.3 Input Interpretation

3.4 Complementary Interaction Modalities

3.4.1 Typing

3.4.2 Text Selection

3.5 Implementation

3.6 Feedback

3.7 Interaction Technique Examples

3.7.1 Touch typing with ’gaze shift’

3.7.2 Gaze based cursor positioning

3.7.3 Gaze based word selection

3.7.4 Free text area selection with gaze

3.7.5 Eye typing a special character

3.7.6 Selecting extension keys with gaze

4 User Study: Word Selection

4.1 Study Design

4.2 Participants

4.3 Study Procedure

5 Results

5.1 Performance

5.2 Questionnaires and Feedback

6 Discussion

7 Conclusion

Research Objectives and Core Themes

This thesis investigates how to enhance text entry on wide touch screens by combining traditional manual touch interaction with eye tracking technology. The primary goal is to improve the usability and ergonomics of text editing applications on mobile devices through the introduction of the 'GazeButton' concept, which allows for multimodal interaction without obscuring screen content or requiring strenuous arm movements.

  • Integration of eye gaze as an auxiliary input modality in standard 2D GUIs.
  • Development of the 'GazeButton' as a universal interface for combining touch and gaze.
  • Ergonomic assessment of gaze-based text selection versus conventional touch methods.
  • Investigation of multimodal interaction techniques (typing, cursor positioning, selection).

Excerpt from the Thesis

3.2 Design Considerations

Unlike works that aimed to design interfaces for two-thumb text entry on tablet computers [34] [2], this thesis aims to provide more functionality in text-based applications by using gaze recognition while keeping the UI straightforward. This simplification reduces the mental and physical strain of tablet usage, in particular by replacing large arm movements with gaze. The newly presented GazeButton can be used in several ways to improve the user experience. To retain simplicity while devising intuitive interaction techniques, the following points can be taken into consideration:

Where the user looks

The user's gaze is sensed by the eye tracker and translated into 2D coordinates to detect where the user looks, on or beside the screen, since gaze is not limited by the screen dimensions. Because the user usually looks at the screen, the gaze coordinates are specified in a coordinate system whose origin is at a corner of the screen; each gaze point then corresponds either to a pixel on the screen or to an imaginary pixel next to it with the same point density. In the text editing application of this thesis, gaze points outside the screen are used to expand selection ranges of objects near the screen borders when they are selected by gaze. Different features can be activated based on the user's gaze, so a designer can implement different application reactions depending on whether the user looks at the GazeButton, at another specific area of the application, or somewhere else.
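The region test described above can be sketched in a few lines. This is a minimal illustration, not the thesis implementation: the class and function names (`Rect`, `classify_gaze`) are assumptions, and it only shows how a gaze point in screen pixel coordinates, possibly outside the screen bounds, could be classified into the three regions a designer might react to.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned rectangle in screen pixel coordinates."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

def classify_gaze(px: float, py: float, screen: Rect, gaze_button: Rect) -> str:
    """Classify a gaze point: on the GazeButton, elsewhere on screen, or beside it.

    Gaze coordinates share the screen's pixel coordinate system (origin at a
    screen corner) but are not clipped to it, so off-screen points remain
    meaningful, e.g. for extending selections past the screen border.
    """
    if gaze_button.contains(px, py):      # check the button first: it lies on the screen
        return "gaze_button"
    if screen.contains(px, py):
        return "screen"
    return "off_screen"
```

An application loop would call `classify_gaze` on each smoothed gaze sample and dispatch to the matching reaction.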

Summary of Chapters

1 Introduction: Introduces the limitations of current direct manipulation interfaces on mobile devices and proposes the 'GazeButton' as an innovative solution to enhance interaction.

2 Related Work: Discusses existing research on handheld UI challenges, general gaze interaction techniques, and historical developments in eye-typing and mobile eye-tracking.

3 Gaze-enhanced Text Editing: Details the prototype application architecture, input interpretation, design considerations, and various interaction techniques enabled by the GazeButton.

4 User Study: Word Selection: Describes the design, participants, and procedure of the empirical study conducted to compare the performance of gaze-based versus touch-only word selection.

5 Results: Presents the quantitative performance data and qualitative questionnaire feedback collected from the user study.

6 Discussion: Interprets the findings, noting that users prefer gaze-based interaction for physical comfort, despite challenges with precision for small targets.

7 Conclusion: Summarizes the thesis findings, confirming the feasibility of gaze-enhanced touch interaction and suggesting directions for future research.

Keywords

Gaze interaction, Eye tracking, GazeButton, Multimodal interaction, Touch screen, Mobile devices, Text editing, Ergonomics, Human-computer interaction, User study, Gaze selection, Text entry, Tablet computing, Interface design, Input modalities.

Frequently Asked Questions

What is the core focus of this research?

The research focuses on enhancing text entry on wide touch screens by combining touch interaction with eye gaze recognition to improve usability and reduce physical effort.

What are the primary thematic fields covered?

The thesis covers handheld UI design, multimodal interaction, eye tracking technology, and human-computer interaction (HCI) performance evaluation.

What is the primary research goal?

The goal is to develop and evaluate the 'GazeButton', a concept that extends traditional buttons with gaze-aware functionality to simplify complex text editing tasks.

What scientific methods were employed?

The research utilized a prototype-based development approach, followed by a controlled lab user study to empirically compare interaction techniques, using ANOVA for performance analysis.

What topics are discussed in the main body?

The main body details the system architecture, input event interpretation, design for gaze-touch combinations, and specific interaction examples like 'gaze shift' and free text selection.

Which keywords characterize this work?

Key terms include Gaze interaction, Eye tracking, GazeButton, Multimodal interaction, and Mobile text entry.

How does the GazeButton handle the 'Midas touch' problem?

The system avoids the 'Midas touch' problem by requiring an intentional touch interaction on the GazeButton to trigger gaze-based actions, rather than relying solely on dwelling.
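The touch-gated design can be illustrated with a small event-handling sketch. It is an assumption-laden simplification (the class and method names are invented for illustration): gaze is tracked passively and never triggers anything by itself; only an explicit tap on the GazeButton consumes the current gaze point.

```python
class GazeButtonController:
    """Sketch of touch-gated gaze actions that sidestep the Midas-touch problem."""

    def __init__(self):
        self.last_gaze = None  # most recent (x, y) gaze point, or None

    def on_gaze(self, x: float, y: float) -> None:
        # Passive path: merely remember where the user looks.
        # Looking alone never fires an action, so no accidental triggers.
        self.last_gaze = (x, y)

    def on_button_tap(self):
        # Active path: only an intentional tap uses the gaze point,
        # e.g. to select the word the user is currently looking at.
        if self.last_gaze is None:
            return None
        return ("select_word_at", self.last_gaze)
```

The key design property is that the gaze handler has no side effects beyond updating state; all effects are initiated by the manual channel.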

Did participants prefer gaze-based or touch-based selection?

Participants significantly preferred gaze-based selection due to increased physical comfort, particularly when dealing with larger font sizes, although they noted some concerns regarding precision for smaller objects.

How is gaze jitter managed in this application?

The system uses an intermediate step that averages the four most recent gaze coordinates stored in a fixed-size array to smooth the input signal before further calculation.
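The averaging step lends itself to a short sketch. This is not the thesis code; it assumes a simple moving average over a fixed-size buffer of the four latest samples, as the description above suggests, with `GazeSmoother` as an invented name.

```python
from collections import deque

class GazeSmoother:
    """Moving-average filter over the most recent gaze samples.

    Keeps the latest (x, y) gaze coordinates in a fixed-size buffer
    (default: four samples) and returns their mean, smoothing tracker
    jitter before further calculation.
    """

    def __init__(self, window: int = 4):
        # deque with maxlen drops the oldest sample automatically
        self._buffer = deque(maxlen=window)

    def add_sample(self, x: float, y: float) -> tuple:
        """Store a raw gaze sample and return the smoothed coordinate."""
        self._buffer.append((x, y))
        n = len(self._buffer)
        avg_x = sum(p[0] for p in self._buffer) / n
        avg_y = sum(p[1] for p in self._buffer) / n
        return (avg_x, avg_y)
```

Until the buffer is full, the average is taken over however many samples have arrived, so the filter produces output from the first sample on.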


Details

Title
Eye Gaze Interaction for a Mobile Text Application. A New Interface Concept
University
Ludwig-Maximilians-Universität München
Grade
1.0
Author
Thomas Mayer
Year of Publication
2019
Pages
40
Catalog Number
V497717
ISBN (eBook)
9783346038814
ISBN (Book)
9783346038821
Language
English
Keywords
gaze interaction mobile text application interface concept
Product Safety
GRIN Publishing GmbH
Cite this work
Thomas Mayer (Author), 2019, Eye Gaze Interaction for a Mobile Text Application. A New Interface Concept, München, GRIN Verlag, https://www.grin.com/document/497717