
Developing Comprehensive Feedback Mechanisms for a Medical iOS App


In collaboration with the Intelligent Systems Research Group (ISRG) at the University of Applied Sciences Karlsruhe and the physiotherapy company Varilag, inovex participated in the InferMod3D research project over the past two years. inovex supported the development of an iOS app aimed at diagnosing and monitoring infant head deformation, specifically plagiocephaly.

Background Information: Head Deformation

Plagiocephaly is a condition characterised by an asymmetric deformation of an infant’s skull, typically caused by a preferred sleeping position. As the skull is still flexible at this stage of development, deformations can occur and must be treated before the skull solidifies. If left untreated, the condition can lead to impairments in balance and affect parts of the brain. The most common diagnostic method is to measure the head with a caliper and use these measurements to calculate the Cranial Vault Asymmetry Index (CVAI), which indicates the severity of the plagiocephaly.
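To make the index concrete, here is a small Swift sketch of the CVAI calculation, assuming the commonly published formula (the difference between the two cranial diagonals divided by the longer diagonal, expressed as a percentage); the exact calculation used in the project may differ:

```swift
/// Cranial Vault Asymmetry Index (CVAI), as commonly defined:
/// the absolute difference between the two cranial diagonals,
/// divided by the longer diagonal, expressed as a percentage.
func cvai(diagonalA: Double, diagonalB: Double) -> Double {
    let longer = max(diagonalA, diagonalB)
    guard longer > 0 else { return 0 }
    return abs(diagonalA - diagonalB) / longer * 100
}

// Example: diagonals of 13.0 cm and 12.2 cm yield a CVAI of about 6.15 %.
let index = cvai(diagonalA: 13.0, diagonalB: 12.2)
```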

An illustration of infant head deformation or plagiocephaly.

How the app works

The app, developed by the ISRG and inovex, utilises the TrueDepth camera system of the iPhone, which is typically used for FaceID, to create scans of infants’ heads by recording RGBD data. The forward-facing position of the sensor poses a challenge for the operator, as the screen is difficult to see during the scanning process. To overcome this, the app offers unique feedback to assist the operator during the scanning process. It provides haptic, audible, and visual feedback to allow for scanning without the need to constantly look at the display.

The tech stack

The application uses:

  • the SwiftUI framework for building the user interface,
  • the Core Haptics framework to create haptic feedback,
  • the AVFoundation framework to create audible feedback,
  • the computer vision library OpenCV for detecting the markers on the cap,
  • the languages Swift, C++, Objective-C and
  • the hardware-accelerated graphics API Metal for rendering the 3D point cloud.

A screenshot of the home screen of the app. An animated screenshot while performing a scanning procedure.

Requirements for a scan

The app requires an iPhone with FaceID (iPhone X or newer) for the scanning process. To create a coherent 3D model of the head, infants must wear a special cap with (ArUco-) markers during the scanning procedure. As infants at this age are not yet able to sit still on their own, they must be held by a parent during the scan. A physiotherapist then moves the device around the head, adjusting the position based on the feedback provided by the app. The scanning process typically takes between one and two minutes.

A doll wearing a cap with ArUco markers that was used for testing.

Feedback concept

The feedback mechanism includes two types of feedback: positioning and progress feedback. The positioning feedback helps the user to correctly position the iPhone, while the progress feedback keeps the user informed of the overall progress of the scanning procedure.

A chart of how the feedback concept of the app works.
Positioning feedback

Primary feedback: The marker overlap status between frames serves as the primary criterion for whether the positioning of the iPhone is satisfactory or needs to be improved. To signal to the user that the positioning needs to be corrected, the iPhone vibrates. This vibration is continuous and only stops once the overlap is sufficient again. However, the vibration is only started or stopped when the marker overlap status has remained the same for four frames in a row. This threshold is important because the app processes 30 frames per second, so the number of times feedback is triggered needs to be reduced.
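The four-frame debouncing described above can be sketched in Swift like this (a minimal sketch; the type and names are illustrative, not taken from the app’s source):

```swift
/// Debounces the marker-overlap status: the vibration state only
/// changes after the same status has been observed for four
/// consecutive frames (the app processes 30 frames per second).
struct OverlapDebouncer {
    private var lastStatus: Bool?
    private var streak = 0
    private(set) var vibrating = false
    let requiredFrames = 4

    init() {}

    /// Feed the overlap status of one frame; returns true when
    /// the vibration should be started or stopped.
    mutating func process(overlapSufficient: Bool) -> Bool {
        if overlapSufficient == lastStatus {
            streak += 1
        } else {
            lastStatus = overlapSufficient
            streak = 1
        }
        // Start vibrating on insufficient overlap, stop on sufficient
        // overlap, but only after four identical frames in a row.
        if streak >= requiredFrames && vibrating == overlapSufficient {
            vibrating = !overlapSufficient
            return true
        }
        return false
    }
}
```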

Secondary feedback: The positioning feedback regarding the distance or the 2D positioning is only triggered if the primary feedback has been triggered, and it provides further information to the user on how to correct the positioning. The app outputs “too close” if more than 30 percent of all markers have invalid depth information (depth value = not-a-number); this threshold was derived heuristically from initial testing. If less than 30 percent of all markers have invalid depth information, the app gives audible positioning instructions (“left-up”, “right”, …).
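The 30 percent threshold logic can be sketched as follows (the function name and the precomputed direction string are illustrative assumptions, not the app’s actual code):

```swift
/// Chooses the secondary audio instruction from per-marker depth values.
/// The 30 % threshold was derived heuristically from initial testing.
/// `direction` is assumed to be the precomputed 2D instruction
/// such as "left-up" or "right".
func secondaryInstruction(markerDepths: [Float], direction: String) -> String {
    let invalid = markerDepths.filter { $0.isNaN }.count
    let invalidRatio = Double(invalid) / Double(max(markerDepths.count, 1))
    // More than 30 % not-a-number depth values: the device is too close.
    return invalidRatio > 0.3 ? "too close" : direction
}
```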

Progress feedback

The progress of marker detection is indicated by the number of detected marker IDs. In the app’s settings, the user can choose to receive audio updates at every 25 %, 50 %, or 100 % of marker detection. The progress is also displayed visually as a progress bar on the screen, using traffic light colours for intuitive understanding. This feature provides an alternative means of monitoring progress in case the audio output is not audible, for example when the operator is distracted by a crying infant.
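A minimal sketch of how such step-wise progress announcements could be tracked (the type, names, and step logic are assumptions, not the app’s actual implementation):

```swift
/// Tracks scan progress as the fraction of unique marker IDs seen so far
/// and reports when a configurable announcement step (25, 50 or 100 %,
/// from the settings) is crossed.
struct ProgressTracker {
    let totalMarkers: Int
    let stepPercent: Int
    private var seen = Set<Int>()
    private var lastAnnounced = 0

    init(totalMarkers: Int, stepPercent: Int) {
        self.totalMarkers = totalMarkers
        self.stepPercent = stepPercent
    }

    /// Record the marker IDs detected in one frame; returns the
    /// percentage to announce, or nil if no step boundary was crossed.
    mutating func add(markerIDs: [Int]) -> Int? {
        seen.formUnion(markerIDs)
        let percent = seen.count * 100 / totalMarkers
        if percent >= lastAnnounced + stepPercent {
            lastAnnounced = (percent / stepPercent) * stepPercent
            return lastAnnounced
        }
        return nil
    }
}
```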

Insights into the feedback implementation

Thermal monitoring

The app uses the TrueDepth camera system, which relies on an infrared projector and can overheat. The app therefore listens for the ProcessInfo.thermalStateDidChangeNotification and shows an alert when the sensor gets too hot.
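A minimal sketch of such a thermal observer, assuming the .serious and .critical states are treated as “too hot” (the app’s exact threshold is not specified in the article):

```swift
import UIKit

/// Observes thermal state changes and warns when the device gets too
/// hot for the infrared-based TrueDepth sensor.
final class ThermalMonitor {
    var onOverheat: (() -> Void)?

    init() {
        NotificationCenter.default.addObserver(
            forName: ProcessInfo.thermalStateDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            // Assumption: .serious and .critical count as overheating.
            let state = ProcessInfo.processInfo.thermalState
            if state == .serious || state == .critical {
                self?.onOverheat?()   // e.g. present an alert to the user
            }
        }
    }
}
```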

Text-to-speech

The audible feedback uses text-to-speech. The class AVSpeechUtterance is used to configure the text that should be spoken, and in combination with the class AVSpeechSynthesisVoice, the language of the voice, in our case German, can be set. The class AVSpeechSynthesizer synthesizes speech from the provided text and provides the method speak to output the audio.
Another key feature is that the audio remains audible even when the device is in silent mode. This is achieved by setting the playback category of the audio session: AVAudioSession.sharedInstance().setCategory(.playback).
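Putting these classes together, the speech output could look roughly like this (a sketch; the helper function and the instruction string are illustrative):

```swift
import AVFoundation

/// Speaks a short instruction in German, audible even in silent mode.
func speak(_ text: String, synthesizer: AVSpeechSynthesizer) {
    // Route audio through the playback category so the instructions
    // are not muted by the ring/silent switch.
    try? AVAudioSession.sharedInstance().setCategory(.playback)
    try? AVAudioSession.sharedInstance().setActive(true)

    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "de-DE")
    synthesizer.speak(utterance)
}

// The synthesizer must outlive the utterance, so keep a reference:
let synthesizer = AVSpeechSynthesizer()
// speak("zu nah", synthesizer: synthesizer)
```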

Vibrations/Haptic Feedback

To create custom vibration patterns, the app uses the CHHapticEngine class to connect to the iPhone hardware, and the CHHapticPattern class to specify the intensity, sharpness, and duration of the vibration. Finally, the CHHapticAdvancedPatternPlayer class plays the vibration pattern.
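A sketch of a continuous vibration player built from these classes (intensity, sharpness, and duration values are illustrative, not the app’s actual parameters):

```swift
import CoreHaptics

/// Plays a continuous vibration until stopped, as used for the
/// primary positioning feedback.
final class VibrationPlayer {
    private let engine: CHHapticEngine
    private var player: CHHapticAdvancedPatternPlayer?

    init() throws {
        engine = try CHHapticEngine()
        try engine.start()

        let event = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5),
            ],
            relativeTime: 0,
            duration: 100   // long duration; the player is stopped explicitly
        )
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        player = try engine.makeAdvancedPlayer(with: pattern)
    }

    func start() { try? player?.start(atTime: CHHapticTimeImmediate) }
    func stop()  { try? player?.stop(atTime: CHHapticTimeImmediate) }
}
```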

Visual feedback by displaying a 3D point cloud with SwiftUI

The camera and sensor capture RGB data as a CVPixelBuffer and depth data as AVDepthData. This data is used to construct a point cloud with Metal. By including the RGB data, the point cloud is coloured and therefore looks like the actual camera feed at first sight. A point cloud offers the advantage of constructing a new image consisting only of the relevant features while eliminating unnecessary ones, and it allows, for example, highlighting the detected markers in different colours. The ArUco marker extraction is done with OpenCV. To display a view rendered by Metal, an MTKView first has to be created with a reference to the device obtained from MTLCreateSystemDefaultDevice. Since there is no standard SwiftUI Metal view component, the MTKView is wrapped in a UIViewRepresentable, and the view is updated with the latest RGBD data through the updateUIView function.
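A minimal sketch of such a wrapper (PointCloudRenderer is a hypothetical stand-in for the app’s actual renderer):

```swift
import SwiftUI
import MetalKit

// Hypothetical renderer stub; the real one encodes the point-cloud
// draw calls from the latest RGBD frame.
final class PointCloudRenderer: NSObject, MTKViewDelegate {
    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
    func draw(in view: MTKView) { /* encode point-cloud draw calls */ }
}

/// Wraps an MTKView so the Metal-rendered point cloud can be shown
/// in a SwiftUI view hierarchy.
struct PointCloudView: UIViewRepresentable {
    let renderer: PointCloudRenderer

    func makeUIView(context: Context) -> MTKView {
        let view = MTKView()
        view.device = MTLCreateSystemDefaultDevice()
        view.delegate = renderer
        // Draw on demand instead of on the internal display timer.
        view.enableSetNeedsDisplay = true
        return view
    }

    func updateUIView(_ uiView: MTKView, context: Context) {
        // Called by SwiftUI when new RGBD data arrives; trigger a redraw.
        uiView.setNeedsDisplay()
    }
}
```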

Testing and evaluation

The features and user experience of the app were evaluated in a small test setting by a team of physical therapists over a four-week period. Direct feedback on the scanning process was collected within the app after each scan, using an evaluation form that included sliders and toggle switches. The form was designed to gather immediate feedback on the scanning procedure. In addition, in-person interviews were conducted with the physical therapists who used the app during the evaluation period, to gain a deeper understanding of the strengths and limitations of the app.

Results

Data collected from the in-app evaluation already provided some insight into possible reasons for the positive or negative ratings of the performed scans. It suggests that bad scans (ratings of 1 or 2) could stem from the infant’s behaviour and the fit of the cap. Good scans (ratings of 4 or 5), however, do not seem to correlate with the cap fit or the infant’s behaviour, which suggests that these scans were good because the app fulfilled its requirements. Only one scan suggests that there might be an issue with the online 3D reconstruction pipeline. The average scan rating is 3.56; in total, there were four very good scans (ratings of 4 or 5) and two bad scans (ratings of 1 or 2).

Feedback from physical therapists indicated that the app’s audio feedback on the distance to the head was helpful, but opinions differed on the vibrations indicating positioning issues.
Positioning instructions were found to be unintuitive and hard to interpret. Progress feedback was seen as motivating, but some wished for smaller increments. The cap used during the scans was seen as a problem as it did not fit well for all infants and some markers fell off. Suggested improvements include adding a feature to remind the operator to scan the nose and ears, and a feature to detect scanned parts of the head and guide further scanning.

Future work

Future work should address the prototypical nature of the cap by designing it so that the markers no longer fall off and the cap is flexible enough for every infant to wear comfortably. This appears to be a challenging trade-off between printability and wearing comfort in the selection of the cap material. Furthermore, the evaluation of the app could be improved by including more infants and scanning operators. To make users more aware of all the features of the app, future work could also include the creation of an onboarding screen.

Additional information

This project was part of a Bachelor’s thesis and resulted in a paper entitled “Feedback Mechanisms of an iOS App to Record RGBD Data for AI-Based Diagnosis and Monitoring of Infant Head Deformation”. The paper was presented onsite at the 3DBody.Tech Conference & Expo in Lugano, Switzerland, in 2022.
