Hello, I'm

Dr. Oliver Beren Kaul

Mobile Developer, Researcher & AI Enthusiast

Lucerne, Switzerland

Intro

About me

Portrait of Beren

Hi there! My name is Beren. I currently work as the Technical Lead Engineer at Arbrea Labs AG in Zurich, Switzerland, leading a multicultural team of five Software Engineers who build several iOS apps using Augmented Reality and Artificial Intelligence. Arbrea Labs is an ETH spin-off focused on visualizing the results of potential plastic surgery for patients and surgeons, helping them make informed decisions.

Before migrating to Switzerland in June 2021, I finished a Ph.D. at the Human-Computer Interaction Group led by Michael Rohs at the University of Hannover in Germany. During my Ph.D., I focused on tactile feedback in Human-Computer Interaction. My main Ph.D. project, HapticHead, is a display of 24 tactile actuators distributed around the head. This interface can be used in Augmented and Virtual Reality for various applications, such as guidance and navigation for people with visual impairments, or increasing immersion by playing back tactile features of the virtual world. Imagine feeling a raindrop hitting your head exactly where you'd expect it to, or feeling the impact of an explosive shockwave.

Apart from my main Ph.D. topic, I had the freedom to work on several side projects, mostly in Artificial Intelligence, as I find AI applications a fascinating research direction. Notably, I developed the AhemPreventor, a system that detects filler words in oral presentations and discreetly gives presenters concealed tactile feedback whenever they use one. Over time, this feedback may subconsciously train presenters to use fewer filler words. Another exciting project, a collaboration with a colleague from material sciences, revolved around grading the damage level of concrete from ultrasonic signals using Deep Learning.
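
To give a flavor of how such a system can work, here's a minimal sketch of a filler-word detector: classify overlapping audio windows by their MFCC features and fire a discreet tactile pulse whenever the classifier is confident. The features, the classifier, and the send_tactile_pulse() stub are illustrative assumptions, not the actual AhemPreventor implementation.

```python
# Minimal sketch of a filler-word detector in the spirit of the AhemPreventor.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

SR = 16_000    # sample rate in Hz
WIN = SR // 2  # 0.5 s analysis window
HOP = SR // 4  # windows overlap by 50 %

def window_features(audio: np.ndarray) -> np.ndarray:
    """MFCC mean/std feature vectors for overlapping windows of mono audio."""
    feats = []
    for start in range(0, len(audio) - WIN, HOP):
        mfcc = librosa.feature.mfcc(y=audio[start:start + WIN], sr=SR, n_mfcc=13)
        feats.append(np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)]))
    return np.array(feats)

def send_tactile_pulse() -> None:
    print("bzzt")  # stand-in for the wearable's actuator interface

# In practice, X/y would be features of annotated windows from presentation
# recordings (1 = filler word such as "uhm", 0 = regular speech). Random
# placeholders keep this sketch self-contained:
rng = np.random.default_rng(0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300)
clf.fit(rng.normal(size=(200, 26)), rng.integers(0, 2, size=200))

def monitor(live_audio: np.ndarray) -> None:
    """Fire a discreet pulse for each window confidently classified as filler."""
    for p in clf.predict_proba(window_features(live_audio))[:, 1]:
        if p > 0.9:  # high threshold keeps false alarms rare during a talk
            send_tactile_pulse()
```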

As a side hustle during my Ph.D., I developed and supported three iOS and several Android apps for miovent.de, a small agile event company and subsidiary of eventit.ag and Deutsche Messe AG.

Skills

Leadership skills
I currently lead a team of five Software Engineers at Arbrea Labs AG, previously gave lectures on various topics to up to 600 CS students, and supervised dozens of Bachelor's and Master's theses during my Ph.D. and for Arbrea Labs AG in collaboration with ETH.

Mobile App Development
My side hustle during my studies (mostly Android) and current job (iOS). I like polishing the UX of apps through gamification and animations so that they feel fun and engaging to use.

Programming languages
Preference for newer languages: Swift, Kotlin & Python while still knowing how basic C and pointers work. I'm not lost without ChatGPT 😃

Artificial Intelligence
I trained neural networks and deployed several mobile apps with AI.

Data analysis and visualization
I conducted dozens of user studies during my Ph.D. and analyzed the data with statistical methods.

Social skills
Empathy, pragmatism, traveling, and cultural curiosity. Others value my smiling, positive attitude and my can-do character.

Experience

Work and education

Work experience

  • July 2022 - present

    Technical Lead Engineer


    Arbrea Labs AG
    Zurich, Switzerland

Technical Lead Engineer for Arbrea's iOS apps and web services. Starting in July 2022, we expanded the technical team with five new Software Engineers, whom I advise in this role. Furthermore, thanks to a close collaboration with ETH, multiple Bachelor's and Master's students are writing their theses under my advice, investigating possible future technical directions for Arbrea.


  • June 2021 - June 2022

    Senior Software Engineer


    Arbrea Labs AG
    Zurich, Switzerland

Senior Software Engineer for iOS. I gradually improved the UX and functionality of several iOS apps that revolve around Augmented Reality, Computer Vision, and Deep Learning. I was also responsible for improving user management on the frontend and backend, database connectivity (Firebase), app analytics, and statistics.


  • December 2013 - February 2020

    Mobile Developer


    miovent.de, subsidiary of eventit AG

My role included the development and continual improvement of several Android and three iOS apps for event management.


  • January 2012 - December 2014

    Research Assistant


    Systems Research and Architecture Group
    University of Hannover

Development and implementation of Android apps and organic computing algorithms for robots (Khepera III).
Administration and presentation of lectures within the institute.

Education

  • August 2015 - May 2021

    Ph.D. Computer Science


    HCI Group, University of Hannover

My role included research, interaction design, prototype implementation, and user evaluation on tactile feedback in Augmented and Virtual Reality, as well as Artificial Intelligence for sensor data.


  • May 2015

      Master Computer Science


    University of Hannover

    Software Development and Architecture, Mobile Development, UI Design, Requirements Engineering.

    Minor: Geoinformatics


  • October 2011

      Bachelor Computer Science


    University of Hannover

    Minor: Economics

Projects

Prototypes and fun stuff

Here's an overview of my research projects. Tap on an image to see more information about the project and a link to the accompanying research paper.

HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality

Spatial Tactile Display

Current virtual and augmented reality head-mounted displays usually include no or only a single vibration motor for haptic feedback and do not use it for guidance. We present HapticHead, a system utilizing multiple vibrotactile actuators distributed in three concentric ellipses around the head for intuitive haptic guidance through moving tactile cues. We conducted three experiments, which indicate that HapticHead vibrotactile feedback is both faster (2.6 s vs. 6.9 s) and more precise (96.4 % vs. 54.2 % success rate) than spatial audio (generic head-related transfer function) for finding visible virtual objects in 3D space around the user. The baseline of visual feedback is – as expected – more precise (99.7 % success rate) and faster (1.3 s) in comparison, but there are many applications in which visual feedback is not desirable or available due to lighting conditions, visual overload, or visual impairments. Mean final precision with HapticHead feedback on invisible targets is 2.3° compared to 0.8° with visual feedback. We successfully navigated blindfolded users to real household items at different heights using HapticHead vibrotactile feedback independently of a head-mounted display.
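
To make the guidance idea concrete, here's a small sketch of one way per-actuator intensities could be derived from a target direction: actuators angularly close to the target vibrate, with intensity falling off linearly. The actuator layout and the falloff function are illustrative assumptions; the paper describes the actual interpolation used by HapticHead.

```python
import numpy as np

def actuator_intensities(target_dir: np.ndarray,
                         actuator_dirs: np.ndarray,
                         spread_deg: float = 30.0) -> np.ndarray:
    """Per-actuator intensity for guiding toward target_dir. All directions
    are unit vectors from the head center; actuators within spread_deg of
    the target vibrate, with intensity falling off with angular distance."""
    target = target_dir / np.linalg.norm(target_dir)
    cos_angles = actuator_dirs @ target
    angles = np.degrees(np.arccos(np.clip(cos_angles, -1.0, 1.0)))
    return np.clip(1.0 - angles / spread_deg, 0.0, 1.0)

# Three actuators (front, left, right of the head), target slightly left of front:
dirs = np.array([[0.0, 0.0, 1.0], [-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(actuator_intensities(np.array([-0.3, 0.0, 1.0]), dirs))  # front strongest
```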

Project leader:
Oliver Beren Kaul

Other author:
Michael Rohs

Conference:
CHI 2017

Library paper link:
ACM library

Paper download:
Author version

Publication date:
May 2017

Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head

Virtual Reality Presence and Immersion

A high level of presence is an essential aspect of immersive virtual reality applications. However, presence is difficult to achieve, as it depends on the user, the system's immersion capabilities (visual, auditory, and tactile), and the concrete application. We use a vibrotactile grid around the head to further increase the level of presence users feel in virtual reality scenes. In a between-groups comparison study, the vibrotactile group scored significantly higher on a standardized presence questionnaire than the baseline without tactile feedback. Our study suggests the proposed prototype as an additional tool to increase the level of presence users feel in virtual reality scenes.

Project leader:
Oliver Beren Kaul

Other authors:
Kevin Meier,
Michael Rohs

Conference:
INTERACT 2017

Library paper link:
SpringerLink library

Paper download:
Author version

Publication date:
September 2017

VRTactileDraw: A Virtual Reality Tactile Pattern Designer for Complex Spatial Arrangements of Actuators

Virtual Reality Tactile Pattern Design

Creating tactile patterns on the body via a spatial arrangement of many tactile actuators offers many opportunities and presents a challenge, as the design space is enormous. This paper presents a VR interface that enables designers to rapidly prototype complex tactile interfaces. It allows for painting strokes on a modeled body part and translates these strokes into continuous tactile patterns using an interpolation algorithm. The presented VR approach avoids several problems of traditional 2D editors. It realizes spatial 3D input using VR controllers with natural mapping and intuitive spatial movements. To evaluate this approach in detail, we conducted a user study and iteratively improved the system. The study participants gave predominantly positive feedback on the presented VR interface (SUS score 79.7, AttrakDiff "desirable"). The final system is released alongside this paper as an open-source Unity project for various tactile hardware.
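
As a rough illustration of the stroke-to-pattern step, the sketch below uses inverse-distance weighting, one plausible way to turn a sampled stroke point into per-actuator intensities. The radius, power, and normalization are assumptions for illustration; the released Unity project contains the actual interpolation algorithm.

```python
import numpy as np

def stroke_sample_to_intensities(point: np.ndarray,
                                 actuator_pos: np.ndarray,
                                 power: float = 2.0,
                                 radius: float = 0.08) -> np.ndarray:
    """Map one sampled point of a painted stroke (3D, meters) to per-actuator
    intensities via inverse-distance weighting within `radius`."""
    d = np.linalg.norm(actuator_pos - point, axis=1)
    w = np.where(d < radius, 1.0 / np.maximum(d, 1e-6) ** power, 0.0)
    return w / w.max() if w.max() > 0 else w  # nearest actuator at full strength

# Playing back a stroke means evaluating this mapping for each sampled point in
# sequence, producing a cue that travels smoothly across the actuators:
actuators = np.array([[0.00, 0.0, 0.0], [0.05, 0.0, 0.0], [0.10, 0.0, 0.0]])
print(stroke_sample_to_intensities(np.array([0.04, 0.0, 0.0]), actuators))
```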

Project leader:
Oliver Beren Kaul

Other authors:
Andreas Domin,
Michael Rohs,
Benjamin Simon,
Maximilian Schrapel

Conference:
INTERACT 2021

Library paper link:
SpringerLink library

Paper download:
Author version

Publication date:
August 2021

Around-the-Head Tactile System for Supporting Micro Navigation of People with Visual Impairments

Augmented Reality Accessibility

Tactile patterns are a means to convey navigation instructions to pedestrians and are especially helpful for people with visual impairments. This article presents a concept to provide precise micro-navigation instructions through a tactile around-the-head display. Our system presents four tactile patterns for fundamental navigation instructions in conjunction with continuous directional guidance. We followed an iterative, user-centric approach to design the patterns for the fundamental navigation instructions, combined them with a continuous directional guidance stimulus, and tested our system with 13 sighted (blindfolded) and 2 blind participants in an obstacle course, including stairs. We optimized the patterns and validated the final prototype with another five blind participants in a follow-up study. The system steered our participants successfully with a 5.7 cm average absolute deviation from the optimal path. Our guidance is only a little less precise than the usual shoulder wobbling during normal walking and an order of magnitude more precise than previous tactile navigation systems. Our system allows various new use cases of micro-navigation for people with visual impairments, e.g., preventing collisions on a sidewalk or as an anti-veering tool. It also has applications in other areas, such as personnel working in low-vision environments (e.g., firefighters).
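
As a toy illustration of the continuous guidance component, the sketch below maps lateral deviation from the planned path to a corrective cue on one side of the head. The "push" mapping and the thresholds are assumptions for illustration; the actual patterns came out of the iterative, user-centric design process described above.

```python
def veer_correction(deviation_m: float, max_dev: float = 0.3) -> dict[str, float]:
    """Anti-veering cue: the further the user drifts from the planned path,
    the stronger the correction. A 'push' mapping is assumed: drifting right
    (positive deviation) activates the right side to push the user back left."""
    level = min(abs(deviation_m) / max_dev, 1.0)
    if deviation_m > 0:
        return {"left": 0.0, "right": level}
    return {"left": level, "right": 0.0}

print(veer_correction(0.057))  # the study's 5.7 cm mean deviation -> gentle cue
```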

Project leader:
Oliver Beren Kaul

Other authors:
Michael Rohs,
Marc Mogalle,
Benjamin Simon

Journal:
ACM Transactions on Computer-Human Interaction (TOCHI)

Journal paper link:
ACM library

Paper download:
Author version

Publication date:
July 2021

Mobile Recognition and Tracking of Objects in the Environment through Augmented Reality and 3D Audio Cues for People with Visual Impairments

Mobile Guidance, Augmented Reality, Artificial Intelligence, Accessibility

People with visual impairments face challenges in scene and object recognition, especially in unknown environments. We combined the mobile augmented reality framework Apple ARKit with MobileNet-v2 and 3D spatial audio to provide an auditory scene description to people with visual impairments. The combination of ARKit and MobileNet allows the system to keep recognized objects in the scene even when the user turns away from them. An object can thus serve as an auditory landmark. With a search function, the system can even guide the user to a particular item. The system also provides spatial audio warnings for nearby objects and walls to avoid collisions. We evaluated the implemented app in a preliminary user study. The results show that users can find items without visual feedback using the proposed application. The study also revealed that the range of local object detection through MobileNet-v2 was insufficient, which we aim to overcome in future work using more accurate object detection frameworks (e.g., YOLOv5x).
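
The landmark mechanism can be sketched abstractly: each detection is stored at its world-space position (which an ARKit anchor provides), so the app can later compute a bearing and distance for a 3D audio cue no matter where the user is currently facing. The coordinate conventions and names below are illustrative assumptions, not the app's actual code.

```python
import numpy as np

landmarks: dict[str, np.ndarray] = {}  # detected object label -> world position

def add_detection(label: str, world_pos: list[float]) -> None:
    """Store a recognized object as a persistent auditory landmark."""
    landmarks[label] = np.asarray(world_pos, dtype=float)

def audio_cue(label: str, device_pos: np.ndarray,
              device_forward: np.ndarray) -> tuple[float, float]:
    """Azimuth (degrees, positive = right) and distance for a 3D audio cue,
    assuming a y-up world coordinate system as in ARKit."""
    to_obj = landmarks[label] - device_pos
    dist = float(np.linalg.norm(to_obj))
    fwd = device_forward / np.linalg.norm(device_forward)
    right = np.cross([0.0, 1.0, 0.0], fwd)  # horizontal axis to the user's right
    azimuth = float(np.degrees(np.arctan2(to_obj @ right, to_obj @ fwd)))
    return azimuth, dist

add_detection("cup", [1.0, 0.0, 2.0])
print(audio_cue("cup", np.zeros(3), np.array([0.0, 0.0, 1.0])))  # ~26.6° right, 2.2 m
```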

Project leader:
Oliver Beren Kaul

Other authors:
Kersten Behrens,
Michael Rohs

Conference:
CHI 2021

Library paper link:
ACM library

Paper download:
Author version

Publication date:
May 2021

Vibrotactile Funneling Illusion and Localization Performance on the Head

Tactile Illusions and Localization Performance

The vibrotactile funneling illusion is the sensation of a single (non-existent) stimulus somewhere in between the actual stimulus locations. Its occurrence depends upon body location, the distance between the actuators, signal synchronization, and intensity. Related work has shown that the funneling illusion may occur on the forehead. We were able to reproduce these findings and explored five other regions to get a complete picture of the funneling illusion's occurrence on the head. Our study results (24 participants) show that the actuator distance for which the funneling illusion occurs strongly depends upon the head region. Moreover, we evaluated the centralizing bias (smaller perceived than actual actuator distances) for different head regions, which also showed widely varying characteristics. We computed a detailed heat map of vibrotactile localization accuracies on the head. The results inform the design of future tactile head-mounted displays that aim to support the funneling illusion.
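
For intuition: phantom-sensation models from the psychophysics literature place the perceived stimulus between two actuators according to their amplitude ratio. The sketch below shows a simple linear version of such a model; it is a textbook approximation for illustration, not the measurement model of this paper.

```python
import numpy as np

def phantom_position(p1, p2, a1: float, a2: float) -> np.ndarray:
    """Perceived location of a funneled (phantom) stimulus between actuators
    at positions p1 and p2 driven with amplitudes a1 and a2, using a simple
    linear amplitude-ratio model."""
    t = a2 / (a1 + a2)  # 0 -> entirely at p1, 1 -> entirely at p2
    return (1 - t) * np.asarray(p1, float) + t * np.asarray(p2, float)

# Driving the second actuator three times as hard pulls the phantom 75 % of
# the way toward it:
print(phantom_position([0.0, 0.0], [4.0, 0.0], a1=1.0, a2=3.0))  # -> [3. 0.]
```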

Project leader:
Oliver Beren Kaul

Other authors:
Michael Rohs,
Benjamin Simon,
Kerem Can Demir,
Kamillo Ferry

Conference:
CHI 2020

Library paper link:
ACM library

Paper download:
Author version

Publication date:
April 2020

Publications

Conference papers and notes

Please refer to my Google Scholar profile.

Here's a download link to my dissertation "HapticHead - Augmenting Reality via Tactile Cues" (PDF, 55 MB) - officially published here.

In case you cannot access a particular paper, feel free to drop me an email and I'll send the PDF your way! 😃

If you would like my current CV as a PDF, please contact me. I cannot publish it online, as it contains private information about my references.

Contact

Let's talk

Email
Social Media