3D positioning system based on one-handed thumb interactions for 3D annotation placement

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper presents a 3D positioning system based on one-handed thumb interactions for simple 3D annotation placement with a smartphone. To place an annotation at a target point in the real environment, the 3D coordinates of the point are computed from corresponding points that the user interactively selects in multiple views while SLAM is running. In general, it is difficult for users to select an intended pixel precisely on the touchscreen. Therefore, we propose computing the 3D coordinates from multiple observations with a robust estimator so that the estimate is tolerant of inaccurate user input. In addition, we developed three pixel selection methods based on one-handed thumb interactions: a pixel is selected at the thumb position in a live view in FingAR, at the position of a reticle marker in a live view in SnipAR, or at the position of a movable reticle marker in a frozen view in FreezAR. In a preliminary evaluation, we investigated the 3D positioning accuracy of each method.
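
The abstract does not spell out the robust estimator, so the following is a minimal sketch of how several imprecise pixel selections might be fused into one 3D point. It assumes pinhole intrinsics K and per-view camera poses (R, t) supplied by SLAM; the RANSAC-over-ray-midpoints scheme and all names (backproject, midpoint_of_rays, robust_point_from_clicks) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def backproject(pixel, K, R, t):
    """World-space ray (origin, unit direction) for a pixel seen by a camera
    with intrinsics K and pose (R, t), where x_cam = R @ x_world + t."""
    d_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    origin = -R.T @ t                      # camera centre in world coordinates
    direction = R.T @ d_cam
    return origin, direction / np.linalg.norm(direction)

def midpoint_of_rays(rays):
    """Least-squares 3D point closest to a set of rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for origin, direction in rays:
        P = np.eye(3) - np.outer(direction, direction)  # projector onto plane normal to the ray
        A += P
        b += P @ origin
    return np.linalg.solve(A, b)

def reprojection_error(X, pixel, K, R, t):
    """Pixel distance between the projection of X and the user's selection."""
    proj = K @ (R @ X + t)
    proj = proj[:2] / proj[2]
    return np.linalg.norm(proj - np.asarray(pixel, dtype=float))

def robust_point_from_clicks(observations, n_iters=100, thresh_px=15.0, seed=0):
    """RANSAC-style estimate of a 3D point from noisy pixel selections.

    observations: list of (pixel, K, R, t), one tuple per view in which the
    user selected the target. Returns the point supported by the most views.
    """
    rng = np.random.default_rng(seed)
    rays = [backproject(px, K, R, t) for px, K, R, t in observations]
    best_point, best_inliers = None, []
    for _ in range(n_iters):
        i, j = rng.choice(len(rays), size=2, replace=False)
        candidate = midpoint_of_rays([rays[i], rays[j]])
        inliers = [k for k, (px, K, R, t) in enumerate(observations)
                   if reprojection_error(candidate, px, K, R, t) < thresh_px]
        if len(inliers) > len(best_inliers):
            best_point, best_inliers = candidate, inliers
    if best_inliers:                       # refine on the consensus set
        best_point = midpoint_of_rays([rays[k] for k in best_inliers])
    return best_point
```

Under this reading, selections that the thumb placed far from the target simply fail to gather support across views and are discarded, which is what would give the system its tolerance to inaccurate input.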

Original language: English
Title of host publication: 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1181-1182
Number of pages: 2
ISBN (Electronic): 9781728113777
DOIs: 10.1109/VR.2019.8797979
Publication status: Published - Mar 1 2019
Event: 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Osaka, Japan
Duration: Mar 23 2019 - Mar 27 2019

Publication series

Name: 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Proceedings

Conference

Conference: 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019
Country: Japan
City: Osaka
Period: 3/23/19 - 3/27/19

Fingerprint

  • Pixels
  • Touch screens

All Science Journal Classification (ASJC) codes

  • Human-Computer Interaction
  • Media Technology

Cite this

Tashiro, S., Uchiyama, H., Thomas, D. G. F., & Taniguchi, R-I. (2019). 3D positioning system based on one-handed thumb interactions for 3D annotation placement. In 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Proceedings (pp. 1181-1182). [8797979] (26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Proceedings). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/VR.2019.8797979
