Hello, I'm

Dr. Oliver Beren Kaul

Lead Software Engineer, Researcher & AI Enthusiast

Lucerne, Switzerland

Intro

About me

Portrait of Beren

🚀 Hi there, I’m Beren!

I’m a Lead Software Engineer passionate about creating intuitive, AI-powered experiences. Currently, I lead a multicultural team of five engineers at Arbrea Labs AG in Zurich, Switzerland, where we build cutting-edge iOS apps leveraging Augmented Reality (AR) and Artificial Intelligence (AI). Our mission? To revolutionize the field of aesthetic medicine by helping surgeons and patients visualize potential plastic surgery outcomes.

🎓 From Ph.D. Research to Real-World Impact

Before moving to Switzerland in 2021, I completed my Ph.D. in Human-Computer Interaction at the University of Hannover, where I explored tactile feedback in AR/VR. My main research project, HapticHead, introduced a 24-actuator vibrotactile display that enhances immersion in virtual environments—whether it’s guiding visually impaired users or simulating the sensation of raindrops in VR.

Beyond my core research, I delved into AI applications, including:

  • AhemPreventor – An AI-powered tool that detects filler words in speech and provides real-time, discreet tactile feedback to improve public speaking.
  • Concrete Damage Detection – Using Deep Learning to assess structural damage via ultrasonic signals in collaboration with material scientists.

🎤 Tackling Stage Fright with AI

One of my passion projects is Stage Fright Buddy, an AI-driven personalized training app for public speaking. Built with Swift 6 and SwiftUI, it helps users overcome performance anxiety through voice-based chat, curated challenges, and adaptive feedback, continuously optimized with A/B testing and behavioral analytics.

📱 Building Exceptional Mobile Experiences

With over 10 years of experience in iOS and Android development, I’ve helped companies scale their mobile products. Before Arbrea, I developed and maintained several event management apps for miovent.de, a subsidiary of Deutsche Messe AG.

🔍 What’s Next?

I thrive in innovative, fast-paced environments where I can combine AI, HCI, and mobile development to build next-gen applications. If you’re looking for a pragmatic Senior Software Engineer or Tech Lead with a passion for creating intelligent, user-friendly software, let’s connect!

Connect on LinkedIn

Skills

Leadership skills
I currently lead a team of five Software Engineers at Arbrea Labs AG, previously gave lectures to up to 600 CS students on various topics, and supervised dozens of Bachelor and Master theses during my Ph.D. and for Arbrea Labs AG in collaboration with ETH.

Mobile App Development
My side hustle during my studies (mostly Android) and current job (iOS). I like polishing the UX of apps through gamification and animations so that they feel fun and engaging to use.

Programming languages
Preference for newer languages: Swift, Kotlin & Python, while still knowing how basic C and pointers work. I'm not lost without ChatGPT 😃

Artificial Intelligence
I have trained neural networks and shipped several AI-powered mobile apps.

Data analysis and visualization
I conducted dozens of user studies during my Ph.D. and analyzed the resulting data with statistical methods.

Social skills
Empathy, pragmatism, traveling, and cultural appreciation. Others appreciate my smile, positive attitude, and pragmatic can-do character.

Experience

Work and education

Work experience

  • July 2022 - present

    Lead Software Engineer


    Arbrea Labs AG
    Zurich, Switzerland

Lead Software Engineer for Arbrea's iOS apps and web services. Since July 2022, we have grown the technical team by five Software Engineers, whom I advise in my current role. Furthermore, thanks to a close collaboration with ETH, multiple Bachelor and Master students write their theses under my advisement, investigating possible future technical directions for Arbrea.


  • June 2021 - June 2022

    Senior Software Engineer


    Arbrea Labs AG
    Zurich, Switzerland

I gradually improved the UX and functionality of several iOS apps revolving around Augmented Reality, Computer Vision, and Deep Learning. I was also responsible for improving user management on the frontend and backend, database connectivity (Firebase), app analytics, and usage statistics.


  • December 2013 - February 2020

    Software Engineer iOS & Android


    miovent.de, subsidiary of eventit AG

My role included the development and continued improvement of several Android apps and three iOS apps for event management.


  • January 2012 - December 2014

    Research Assistant


    Systems Research and Architecture Group
    University of Hannover

Development and implementation of Android apps and organic computing algorithms for robots (Khepera III).
    Administration and presentation of lectures within the institute.

Education

  • August 2015 - May 2021

    Ph.D. Computer Science


    HCI Group, University of Hannover

    My role included research, interaction design, prototype implementation, and user evaluation on Tactile Feedback in Augmented and Virtual Reality and Artificial Intelligence for sensor data.


  • May 2015

    Master Computer Science


    University of Hannover

    Software Development and Architecture, Mobile Development, UI Design, Requirements Engineering.

    Minor: Geoinformatics


  • October 2011

    Bachelor Computer Science


    University of Hannover

    Minor: Economics

Projects

Prototypes and fun stuff

Here's an overview of my research projects. Tap on an image to see more information about the project and a link to the accompanying research paper.

HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality

Spatial Tactile Display

Current virtual and augmented reality head-mounted displays usually include no or only a single vibration motor for haptic feedback and do not use it for guidance. We present HapticHead, a system utilizing multiple vibrotactile actuators distributed in three concentric ellipses around the head for intuitive haptic guidance through moving tactile cues. We conducted three experiments, which indicate that HapticHead vibrotactile feedback is both faster (2.6 s vs. 6.9 s) and more precise (96.4 % vs. 54.2 % success rate) than spatial audio (generic head-related transfer function) for finding visible virtual objects in 3D space around the user. The baseline of visual feedback is – as expected – more precise (99.7 % success rate) and faster (1.3 s) in comparison, but there are many applications in which visual feedback is not desirable or available due to lighting conditions, visual overload, or visual impairments. Mean final precision with HapticHead feedback on invisible targets is 2.3° compared to 0.8° with visual feedback. We successfully navigated blindfolded users to real household items at different heights using HapticHead vibrotactile feedback independently of a head-mounted display.

Project leader:
Oliver Beren Kaul

Other author:
Michael Rohs

Conference:
CHI 2017

Library paper link:
ACM library

Paper download:
Author version

Publication date:
May 2017

Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head

Virtual Reality Presence and Immersion

A high level of presence is an essential aspect of immersive virtual reality applications. However, presence is difficult to achieve as it depends on the user, the system's immersion capabilities (visual, auditory, and tactile), and the concrete application. We use a vibrotactile grid around the head to further increase the level of presence users feel in virtual reality scenes. In a between-groups comparison study, the vibrotactile group scored significantly higher in a standardized presence questionnaire than the baseline of no tactile feedback. Our study suggests the proposed prototype as an additional tool to increase the level of presence users feel in virtual reality scenes.

Project leader:
Oliver Beren Kaul

Other authors:
Kevin Meier,
Michael Rohs

Conference:
INTERACT 2017

Library paper link:
SpringerLink library

Paper download:
Author version

Publication date:
September 2017

VRTactileDraw: A Virtual Reality Tactile Pattern Designer for Complex Spatial Arrangements of Actuators

Virtual Reality Tactile Pattern Design

Creating tactile patterns on the body via a spatial arrangement of many tactile actuators offers many opportunities and presents a challenge, as the design space is enormous. This paper presents a VR interface that enables designers to rapidly prototype complex tactile interfaces. It allows for painting strokes on a modeled body part and translates these strokes into continuous tactile patterns using an interpolation algorithm. The presented VR approach avoids several problems of traditional 2D editors. It realizes spatial 3D input using VR controllers with natural mapping and intuitive spatial movements. To evaluate this approach in detail, we conducted a user study and iteratively improved the system. The study participants gave predominantly positive feedback on the presented VR interface (SUS score 79.7, AttrakDiff "desirable"). The final system is released alongside this paper as an open-source Unity project for various tactile hardware.

Project leader:
Oliver Beren Kaul

Other authors:
Andreas Domin,
Michael Rohs,
Benjamin Simon,
Maximilian Schrapel

Conference:
INTERACT 2021

Library paper link:
SpringerLink library

Paper download:
Author version

Publication date:
August 2021

Around-the-Head Tactile System for Supporting Micro Navigation of People with Visual Impairments

Augmented Reality Accessibility

Tactile patterns are a means to convey navigation instructions to pedestrians and are especially helpful for people with visual impairments. This article presents a concept to provide precise micro-navigation instructions through a tactile around-the-head display. Our system presents four tactile patterns for fundamental navigation instructions in conjunction with continuous directional guidance. We followed an iterative, user-centric approach to design the patterns for the fundamental navigation instructions, combined them with a continuous directional guidance stimulus, and tested our system with 13 sighted (blindfolded) and 2 blind participants in an obstacle course, including stairs. We optimized the patterns and validated the final prototype with another five blind participants in a follow-up study. The system steered our participants successfully with a 5.7 cm average absolute deviation from the optimal path. Our guidance is only a little less precise than the usual shoulder wobbling during normal walking and an order of magnitude more precise than previous tactile navigation systems. Our system allows various new use cases of micro-navigation for people with visual impairments, e.g., preventing collisions on a sidewalk or as an anti-veering tool. It also has applications in other areas, such as personnel working in low-vision environments (e.g., firefighters).

Project leader:
Oliver Beren Kaul

Other authors:
Michael Rohs,
Marc Mogalle,
Benjamin Simon

Journal:
TOCHI - ACM Transactions on Computer-Human Interaction

Journal paper link:
ACM library

Paper download:
Author version

Publication date:
July 2021

Mobile Recognition and Tracking of Objects in the Environment through Augmented Reality and 3D Audio Cues for People with Visual Impairments

Mobile Guidance, Augmented Reality, Artificial Intelligence, Accessibility

People with visual impairments face challenges in scene and object recognition, especially in unknown environments. We combined the mobile scene detection framework Apple ARKit with MobileNet-v2 and 3D spatial audio to provide an auditory scene description to people with visual impairments. The combination of ARKit and MobileNet allows keeping recognized objects in the scene even if the user turns away from the object. An object can thus serve as an auditory landmark. With a search function, the system can even guide the user to a particular item. The system also provides spatial audio warnings for nearby objects and walls to avoid collisions. We evaluated the implemented app in a preliminary user study. The results show that users can find items without visual feedback using the proposed application. The study also reveals that the range of local object detection through MobileNet-v2 was insufficient, which we aim to overcome using more accurate object detection frameworks in future work (YOLOv5x).

Project leader:
Oliver Beren Kaul

Other authors:
Kersten Behrens,
Michael Rohs

Conference:
CHI 2021

Library paper link:
ACM library

Paper download:
Author version

Publication date:
May 2021

Vibrotactile Funneling Illusion and Localization Performance on the Head

Tactile Illusions and Localization Performance

The vibrotactile funneling illusion is the sensation of a single (non-existing) stimulus somewhere in-between the actual stimulus locations. Its occurrence depends upon body location, the distance between the actuators, signal synchronization, and intensity. Related work has shown that the funneling illusion may occur on the forehead. We were able to reproduce these findings and explored five other regions to get a complete picture of the funneling illusion's occurrence on the head. Our study results (24 participants) show that the actuator distance, for which the funneling illusion occurs, strongly depends upon the head region. Moreover, we evaluated the centralizing bias (smaller perceived than actual actuator distances) for different head regions, which also showed widely varying characteristics. We computed a detailed heat map of vibrotactile localization accuracies on the head. The results inform the design of future tactile head-mounted displays that aim to support the funneling illusion.

Project leader:
Oliver Beren Kaul

Other authors:
Michael Rohs,
Benjamin Simon,
Kerem Can Demir,
Kamillo Ferry

Conference:
CHI 2020

Library paper link:
ACM library

Paper download:
Author version

Publication date:
April 2020

Publications

Conference papers and notes

Please refer to my Google Scholar profile.

Here's a download link to my dissertation "HapticHead - Augmenting Reality via Tactile Cues" (PDF, 55 MB) - officially published here.

In case you cannot access a particular paper, feel free to drop me an email and I'll send the PDF your way! 😃

If you would like my current CV as a PDF, please contact me. I cannot publish it online as it contains private information about my references.

Contact

Let's talk

Email
Social Media

Imprint

This website is my personal page, so an imprint is not obligatory.
Nevertheless, here's my contact info:

Address:
Oliver Beren Kaul
Arbrea Labs AG
Technoparkstrasse 1
8005 Zürich
Switzerland

Email:

Disclaimer

All information provided on this website is for informational purposes only. Although every reasonable effort is made to present current and accurate information, this website makes no guarantees of any kind.

Use of Links

This website contains a number of links to other websites. In providing these links, I do not in any way endorse the contents of these other websites. I have not developed or reviewed the contents of those websites and do not accept any responsibility or liability for the contents of these other websites.

© Oliver Beren Kaul
