Publications

B. Zhou

Planar force or pressure is a fundamental physical aspect of people-to-people and people-to-environment activities and interactions. It is as significant as the more established linear and angular acceleration modalities (usually acquired by inertial measurement units). Several studies have involved planar pressure in the discipline of activity recognition, as reviewed in the first chapter, and they have shown that planar pressure is a promising sensing modality. However, such work still occupies a niche in the discipline, relying on ad hoc systems and data analysis methods, and has mostly not been followed by further elaborative work. The situation calls for a general framework that can help push planar pressure sensing into the mainstream. This dissertation systematically investigates planar pressure distribution sensing technology for ubiquitous and wearable activity recognition.

H. Bello, B. Zhou, S. Suh, P. Lukowicz

We present a wearable system to detect body postures and gestures that does not require sensors to be firmly fixed to the body or integrated into a tight-fitting garment. The sensing system can be used in a loose piece of clothing such as a coat or blazer. It is based on the well-known theremin musical instrument, which we have unobtrusively integrated into a standard men's blazer using conductive textile antennas and OpenTheremin hardware as a prototype, the "MoCaBlazer." Fourteen participants with diverse body sizes and balanced gender distribution mimicked 20 arm/torso movements wearing the unbuttoned, single-sized blazer. State-of-the-art deep learning approaches achieved average recognition accuracies of 97.18% for leave-one-recording-out and 86.25% for user-independent recognition.

J. Vargas, B. Zhou, H. Bello, P. Lukowicz

We aim to facilitate broad use of EEG sensing in multi-modal smart garments by developing an open-source EEG sensing module with a state-of-the-art analog front end that is pin/protocol-compatible with popular ecosystems in the wearable and DIY community. The EEG functionality is validated with the neuroscience-standard n-back memory load task. We also demonstrate the seamless integration of EEG electrodes with low-frequency Force Sensitive Resistors (FSR) and high-frequency piezoelectric sensors within a single probe. Finally, we show the embedding of the entire setup in a textile baseball cap. We also present how signals from the different modalities complement each other in situations such as motion artifacts and different activities, all from an unobtrusive head-worn garment. The system is available to the community through a public GitHub repository.

H. Fischer, D. Wittmann, A.B. Costa, B. Zhou, G. Joost, P. Lukowicz

We present a technology-garment co-designed smart mask concept, Masquare (Masque2), which adds cardio-respiratory health monitoring functions to everyday face garments, building on scientifically established results. Masquare can be used in various scenarios where both protection against harmful elements and continuous health status monitoring benefit the user. Our design approach thoroughly considers textile and garment integration, combining smart sensing technologies inside a face mask while keeping its traditional air filtering function and not deviating from the generally accepted mask appearance.

A. Smerdov, A. Somov, E. Burnaev, B. Zhou, P. Lukowicz

Current research in eSports lacks tools for proper game practice and performance analytics. The majority of prior work relied only on in-game data for advising players on how to perform better. However, in-game mechanics and trends are frequently changed by new patches, limiting the lifespan of models trained exclusively on in-game logs. In this article, we propose methods based on sensor data analysis for predicting whether a player will win a future encounter. The sensor data were collected from 10 participants over 22 matches of the League of Legends video game. We trained machine learning models, including a Transformer and a Gated Recurrent Unit, to predict whether the player wins an encounter taking place after some fixed time in the future. For a 10-second forecasting horizon, the Transformer neural network architecture achieves a ROC AUC score of 0.706. This model is further developed into a detector that identifies 88.3% of the encounters a player will lose 10 seconds in advance, with 73.5% accuracy. This might be used as a players' burnout or fatigue detector, advising players to retreat. We also investigated which physiological features affect the chance to win or lose the next in-game encounter.
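The following is a minimal PyTorch sketch of a GRU-based binary classifier over windows of physiological sensor features, in the spirit of the encounter-outcome prediction described above; the dimensions, hyperparameters, and class names are illustrative assumptions, not those of the original study.

```python
# Hypothetical sketch, not the paper's architecture: a GRU that maps a window of
# sensor features to a win/lose probability for the next encounter.
import torch
import torch.nn as nn

class EncounterGRU(nn.Module):
    def __init__(self, n_features=12, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # single logit: win vs. lose

    def forward(self, x):                  # x: (batch, time, n_features)
        _, h = self.gru(x)                 # h: (1, batch, hidden)
        return self.head(h[-1])            # (batch, 1)

model = EncounterGRU()
window = torch.randn(8, 100, 12)           # 8 windows, 100 time steps, 12 features
prob_win = torch.sigmoid(model(window))    # probability of winning the next encounter
print(prob_win.shape)                      # torch.Size([8, 1])
```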

S. Bian, B. Zhou, H. Bello, P. Lukowicz

We present a wearable, oscillating magnetic field-based proximity sensing system to monitor the social distancing (between 1.5 and 2.0 m apart) suggested to prevent the spread of COVID-19. We evaluate the system both in controlled lab experiments and in a real-life setting in a large hardware store. We demonstrate that, due to the physical properties of the magnetic field, the system is much more robust than current Bluetooth-based sensing, in particular being nearly 100% correct when it comes to distinguishing between distances above and below the 2.0 m threshold.

B. Zhou, A.B. Costa, P. Lukowicz

Cardiorespiratory (CR) signals are crucial vital signs for fitness condition tracking, medical diagnosis, and athlete performance evaluation. Monitoring such signals in real-life settings is among the most widespread applications of wearable computing. We investigate how miniaturized barometers can be used to perform accurate spirometry in a wearable system built on off-the-shelf training masks often used by athletes as a training aid. We perform an evaluation in which differential barometric pressure sensors are compared concurrently with a digital spirometer, following clinical forced vital capacity (FVC) test procedures with 20 participants. The relationship between the two instruments is derived first by mathematical modeling, then by various regression methods on the experimental data. The results show that the error of FVC vital values between the two instruments can be as low as 2∼3%. Beyond clinical tests, the method can also measure continuous tidal breathing air volumes with a 1∼3% error margin. Overall, we conclude that barometers with millimeter footprints embedded in face mask apparel can perform similarly to a digital spirometer to monitor breathing airflow and volume in pulmonary function tests.
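As a rough illustration of the calibration step described above, the sketch below fits differential pressure against a reference flow assuming the common orifice-flow relation (flow roughly proportional to the square root of the pressure drop), then integrates flow to volume. The model form, coefficients, and data are assumptions for demonstration, not the study's actual derivation or results.

```python
# Hypothetical calibration sketch: fit spirometer flow as a function of sqrt(ΔP)
# from a differential barometer, then integrate flow to estimate exhaled volume.
import numpy as np

rng = np.random.default_rng(0)
delta_p = np.linspace(5, 500, 200)                                    # Pa
flow_ref = 0.02 * np.sqrt(delta_p) + rng.normal(0, 0.005, delta_p.size)  # L/s, synthetic

# Linear regression of flow on sqrt(ΔP); higher-order terms could be added.
coeffs = np.polyfit(np.sqrt(delta_p), flow_ref, deg=1)
flow_est = np.polyval(coeffs, np.sqrt(delta_p))

dt = 0.01                                                             # s, sample interval
volume = np.cumsum(flow_est) * dt                                     # running volume estimate
print(f"fit coefficients: {coeffs}, estimated volume: {volume[-1]:.2f} L")
```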

B. Zhou, P. Lukowicz

This paper investigates the possibility of using soft smart textiles over the hairy regions of the head to detect chewing during snacking episodes in a simulated everyday-activity scenario. Planar pressure textile sensors, in the form of a cap, perform mechanomyography of the temporalis muscles. 10 participants contributed 30 recording sessions lasting between 30 and 60 minutes each. A frequency analysis method is developed to detect snacking events with continuous sliding windows at 1-second time granularity. Our approach yields a baseline accuracy of 80%, over 85% after outlier removal, and above 90% for some of the participants.
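The sketch below illustrates the general kind of sliding-window frequency analysis mentioned above: band power around a typical chewing rate is computed per 1-second window of a pressure channel and thresholded. The sampling rate, frequency band, and threshold are illustrative placeholders, not the paper's parameters.

```python
# Illustrative sliding-window spectral detector for chewing-like activity.
import numpy as np

def chewing_band_power(signal, fs=50, band=(1.0, 2.0)):
    """Fraction of spectral power in the assumed chewing band for one window."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal)))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].sum() / spectrum.sum()

def detect_snacking(signal, fs=50, win_s=1.0, threshold=0.3):
    """Slide a 1-second window over one channel and flag chewing-like windows."""
    win = int(fs * win_s)
    flags = [chewing_band_power(signal[s:s + win], fs) > threshold
             for s in range(0, len(signal) - win + 1, win)]
    return np.array(flags)

fs = 50
t = np.arange(0, 10, 1 / fs)
demo = np.sin(2 * np.pi * 1.5 * t) + 0.2 * np.random.randn(t.size)  # synthetic "chewing"
print(detect_snacking(demo, fs).mean())  # fraction of windows flagged
```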

A. Alieyv, B. Zhou, P. Hevesi, M. Hirsch, P. Lukowicz

This work demonstrates a connected smart helmet platform, HeadgearX, aimed at improving personnel safety and real-time monitoring of construction sites. The smart helmet hardware design is driven by flexible and expandable sensing and actuating capabilities to adapt to various workplace requirements and functionalities. In our demonstrator, the system consists of ten different sensors, visual and haptic feedback mechanisms, and Bluetooth connectivity. A companion Android application adds further functionalities, including ones configurable over the air. Construction project supervisors can monitor the real-time status of all on-site personnel from a central web server that communicates with individual HeadgearX helmets via the companion app. Several use case scenarios are demonstrated as examples; further specific functionalities can be added to HeadgearX either by software reconfiguration of the existing system or by hardware modification.

H. Bello, B. Zhou, P. Lukowicz

Many human activities and states are related to the actions of the facial muscles: from the expression of emotions, stress, and non-verbal communication, through health-related actions such as coughing and sneezing, to nutrition and drinking. In this work, we describe in detail the design and evaluation of a wearable system for facial muscle activity monitoring based on a reconfigurable differential array of stethoscope microphones. In our system, six stethoscopes are placed at locations that could easily be integrated into the frame of smart glasses. The paper describes the detailed hardware design and the selection and adaptation of appropriate signal processing and machine learning methods. For the evaluation, we asked eight participants to imitate a set of facial actions, such as expressions of happiness, anger, surprise, sadness, upset, and disgust, and gestures like kissing, winking, sticking the tongue out, and taking a pill. We evaluated a complete data set of 2640 events with a 66%/33% training/testing split. Although we encountered high variability in the volunteers' expressions, our approach achieves a recall of 55%, precision of 56%, and F1-score of 54% in the user-independent scenario (9% chance level). On a user-dependent basis, our worst result has an F1-score of 60% and the best an F1-score of 89%, with a recall of at least 60% for expressions like happiness, anger, kissing, sticking the tongue out, and neutral (null class).

S.Z. Bian, B. Zhou, P. Lukowicz

Social distancing and contact/exposure tracing are accepted as critical strategies in the fight against the COVID-19 epidemic. Both depend on the ability to reliably establish the degree of proximity between people in real-world environments. We proposed, implemented, and evaluated a wearable proximity sensing system based on an oscillating magnetic field that overcomes many of the weaknesses of current state-of-the-art Bluetooth-based proximity detection. In this paper, we first describe the underlying physical principle and propose a protocol for the identification and coordination of the transmitter (compatible with current smartphone-based exposure tracing protocols). Subsequently, we describe the system architecture and implementation, and finally perform an elaborate characterization and evaluation of the performance, both in systematic lab experiments and in real-world settings. Our work demonstrates that the proposed system is much more reliable than the widely used Bluetooth-based approach, particularly when it comes to distinguishing between distances above and below the 2.0 m threshold, due to the magnetic field's physical properties.

B. Zhou, T. Ghose, P. Lukowicz

We investigate how pressure-sensitive smart textiles, in the form of a headband, can detect changes in facial expressions that are indicative of emotions and cognitive activities. Specifically, we present the Expressure system that performs surface pressure mechanomyography on the forehead using an array of textile pressure sensors that is not dependent on specific placement or attachment to the skin. Our approach is evaluated in systematic psychological experiments. First, through a mimicking expression experiment with 20 participants, we demonstrate the system’s ability to detect well-defined facial expressions. We achieved accuracies of 0.824 to classify among three eyebrow movements (0.333 chance-level) and 0.381 among seven full-face expressions (0.143 chance-level). A second experiment was conducted with 20 participants to induce cognitive loads with N-back tasks. Statistical analysis has shown significant correlations between the Expressure features on a fine time granularity and the cognitive activity. The results have also shown significant correlations between the Expressure features and the N-back score. From the 10 most facially expressive participants, our approach can predict whether the N-back score is above or below the average with 0.767 accuracy.

B. Zhou, A.B. Costa, P. Lukowicz

We present the system CoRSA, which incorporates integrated sensors in millimeter-scale packages for continuous cardiorespiratory (CR) evaluation in sports activities. CoRSA retrofits trending sports apparel to add CR sensing capability. The system uses an air pressure sensor inside a vented mask to approximate a spirometer, and an earlobe pulse-oximeter (PO) to monitor heart rate (HR) and oxygen saturation (SpO2). CoRSA also includes an inertial measurement unit for tracking activity and for future study of motion artifact correction on the CR signals in active sports. An aerobic exercise evaluation is also performed, showing CR signal characteristics similar to those reported in sports studies using bulkier conventional medical equipment.

B. Zhou, P. Lukowicz

There have been many studies in recent years using Textile planar Pressure Mapping (TPM) technology for human-computer interaction and ubiquitous activity recognition. A TPM sensing system generates a time sequence of spatial pressure imagery. We propose a novel, comprehensive and unified feature set to evaluate TPM data in the space and time domains. The initial version of the TPM feature set presented in this paper includes 663 temporal features and 80 spatial features. We evaluate the feature set on 3 datasets from past studies covering ambient, smart object and wearable sensing. The TPM feature set shows superior recognition accuracy compared with the ad hoc algorithms from the corresponding studies. Furthermore, we demonstrate a general approach to further reduce and optimise the feature calculation process for specific applications with neighbourhood component analysis.
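The sketch below illustrates the space/time idea behind such a feature set: each TPM frame is first reduced to a few spatial descriptors, and temporal statistics are then computed over a window of those descriptors. The descriptors and statistics shown are a small illustrative subset chosen here, not the paper's full 663+80 features.

```python
# Minimal illustrative TPM feature extraction: spatial descriptors per frame,
# temporal statistics per window. A toy subset, not the published feature set.
import numpy as np

def spatial_descriptors(frame):
    """frame: 2D array of pressure values from one TPM sample."""
    total = frame.sum() + 1e-9
    ys, xs = np.indices(frame.shape)
    cop_x = (xs * frame).sum() / total           # center of pressure (x)
    cop_y = (ys * frame).sum() / total           # center of pressure (y)
    active = (frame > frame.max() * 0.1).sum()   # pressed (active) area
    return np.array([total, cop_x, cop_y, active, frame.max()])

def temporal_features(frames):
    """frames: (time, rows, cols) window of TPM data."""
    desc = np.stack([spatial_descriptors(f) for f in frames])   # (time, 5)
    return np.concatenate([desc.mean(0), desc.std(0),
                           desc.max(0) - desc.min(0)])           # 15 features

window = np.random.rand(50, 32, 32)  # 1 s of data at 50 fps on a 32x32 matrix
print(temporal_features(window).shape)  # (15,)
```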

J. Auda, M. Hoppe, O. Amiraslanov, B. Zhou, P. Knierim, S. Schneegass, A. Schmidt, P. Lukowicz

We present LYRA, a modular in-flight system that enhances service and assists flight attendants during their work. LYRA enables passengers to browse and order services from their smartphones. Smart glasses and a smart shoe-clip with an RFID reader module provide flight attendants with situated information. We gained first insights into how flight attendants and passengers use the system during a long-distance flight from Frankfurt to Houston.

A.B. Costa, B. Zhou, O. Amiraslanov, P. Lukowicz

In this work, we present and evaluate a concept for using an integrated environment sensor as a wearable spirometer. Unlike a standard spirometer, which by design is fairly bulky, our device can be unobtrusively integrated into various configurations suitable for long-term use in everyday settings (open headset, regular face mask, and professional sports mask). The sensor measures the transient change in air pressure, humidity and temperature in front of the wearer's mouth and nostrils. We present our hardware design and the signal analysis methods needed to extract breathing rate information, and compare the results with a standard spirometer. Moreover, a calibration between the BME280 sensor and the spirometer is performed with both operating in parallel. We show that our approach is able to distinguish between normal breaths and deep breaths, as well as to capture the period and magnitude of the breath cycles, with a wearable device that can be used in everyday scenarios as well as sport activities. The classification accuracy is 96% in the face mask settings and 82% in the open headset setting. We also show that the sensor is able to approximate air volume by comparing the sensor's pressure channel to the spirometer's flow rate results.
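A rough sketch of the kind of breath-cycle extraction described above: smooth a single pressure channel and detect one peak per breath, from which rate and cycle magnitude follow. The sampling rate, smoothing window, and peak spacing are assumptions for illustration, not the paper's processing chain.

```python
# Illustrative breath-cycle extraction via smoothing and peak detection.
import numpy as np
from scipy.signal import find_peaks

def breathing_cycles(pressure, fs=25):
    smoothed = np.convolve(pressure, np.ones(fs) / fs, mode="same")  # ~1 s smoothing
    peaks, props = find_peaks(smoothed, distance=int(1.5 * fs),      # peaks >= 1.5 s apart
                              prominence=0.1 * smoothed.std())
    rate_bpm = 60 * fs / np.diff(peaks).mean() if len(peaks) > 1 else 0.0
    return peaks, props["prominences"], rate_bpm

fs = 25
t = np.arange(0, 60, 1 / fs)
demo = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)  # ~15 breaths/min
_, magnitudes, rate = breathing_cycles(demo, fs)
print(f"estimated rate: {rate:.1f} breaths/min")
```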

B. Zhou, B.A.V. Altamirano, H.C. Zurian, S.R. Atefi, E. Billing, F. Seoane, P. Lukowicz

In this paper, we developed a fully textile sensing fabric for tactile touch sensing as a robot skin to detect human-robot interactions. The sensor covers a 20-by-20 cm2 area with 400 sensitive points sampled at 50 Hz per point. We defined seven gestures inspired by the social and emotional interactions of typical person-to-person or person-to-pet scenarios. We conducted two groups of mutually blinded experiments involving 29 participants in total. The data processing algorithm first reduces the spatial complexity to frame descriptors, and temporal features are then calculated through basic statistical representations and wavelet analysis. Various classifiers are evaluated, and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best performing feature-classifier combination recognizes the gestures with 93.3% accuracy for a known group of participants, and 89.1% for strangers.

B. Zhou, M.S. Singh, M. Yildirim, I. Prifti, H.C. Zurian, Y.M. Yuncosa, P. Lukowicz

We present a pervasive sensing system in the form of a wireless smart blanket that unobtrusively monitors people's posture on ergonomic chairs by covering the back piece of the chair and measuring the pressure profile of the user's back. The sensor system has 1024 sensitive points covering a 48-by-48 cm2 area. With a simple and efficient classification algorithm, we reach around 80% accuracy among 10 postures, including lordotic and kyphotic lumbar spine at different degrees, and leaning to the sides at different degrees. The web-browser-based user interface offers timely and reprogrammable interventions based on the user's posture history.

B. Zhou, M. Sundholm, J. Cheng, H.C. Zurian, P. Lukowicz

We present a wearable textile sensor system for monitoring muscle activity, leveraging surface pressure changes between the skin and a compression garment. Our first prototype is based on an 8×16 element fabric resistive pressure mapping matrix with 1 cm spatial resolution and a 50 fps refresh rate. We evaluate the prototype by monitoring leg muscles during realistic leg workouts in a gym outside the lab. The sensor covers the lower part of the user's quadriceps. The shape and movement of the two major muscles (vastus lateralis and medialis) are visible in the data visualization during various exercises. With features extracted from the spatial and temporal domains of the pressure mapping information, for 4 different leg exercises plus non-workout activities (relaxing, adjusting machines and walking), we reach 81.7% average recognition accuracy on a fine-grained sliding window basis, 93.3% on an event basis, and an 85.6% spotting F1-score. We further investigate the relationships between people's perception of the exercise quality/difficulty and the variation and consistency of the force pattern. A second, tether-free prototype is also developed to explore various placements including upper arm, chest and lower back; a brief comparison with arm-worn EMG shows our approach is comparable at the signal quality level.

M.S. Singh, V. Pondenkandath, B. Zhou, P. Lukowicz, M. Liwicki

Convolutional Neural Networks (CNNs) have become the state of the art in various computer vision tasks, but their application to most sensor data, especially in pervasive and wearable computing, is still immature. A major reason for this is the limited amount of annotated training data. In this paper, we propose leveraging the discriminative power of pre-trained deep CNNs on 2-dimensional sensor data by transforming the sensor modality into the visual domain. Using three proposed strategies, 2D sensor output is converted into pressure distribution imagery. We then use a pre-trained CNN for transfer learning on the converted imagery data. We evaluate our method on a gait dataset of floor surface pressure mapping and obtain a classification accuracy of 87.66%, outperforming conventional machine learning methods by over 10%.
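The sketch below illustrates the transfer-learning idea: render a 2D pressure frame as an RGB image and feed it to an ImageNet-pretrained CNN used as a fixed feature extractor. The colormap, input size, backbone, and omission of ImageNet normalization are choices made here for brevity, not necessarily those of the paper.

```python
# Illustrative sketch: pressure frame -> colormapped image -> pretrained CNN features.
import numpy as np
import torch
import torchvision.models as models
from matplotlib import cm

def frame_to_image(frame, size=224):
    """Map a 2D pressure frame to a (3, size, size) tensor via a colormap."""
    norm = (frame - frame.min()) / (frame.max() - frame.min() + 1e-9)
    rgb = cm.viridis(norm)[..., :3]                       # (H, W, 3) in [0, 1]
    img = torch.tensor(rgb, dtype=torch.float32).permute(2, 0, 1)
    return torch.nn.functional.interpolate(img[None], size=(size, size),
                                           mode="bilinear")[0]

backbone = models.resnet18(weights="IMAGENET1K_V1")       # downloads pretrained weights
backbone.fc = torch.nn.Identity()                         # drop the classifier head
backbone.eval()

frame = np.random.rand(48, 48)                            # one synthetic pressure frame
with torch.no_grad():
    features = backbone(frame_to_image(frame)[None])      # (1, 512) embedding
print(features.shape)
```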

B. Zhou, G. Bahle, L. Fürg, M.S. Singh, H.Z. Cruz, P. Lukowicz

In this demonstrator, we present Trainwear, a wearable garment that utilizes fabric pressure sensing for sport exercise activity recognition and feedback. The shirt emphasizes design for public users using our developed sensing technology. A video of the demo is linked at the end of this technical paper.

B. Zhou, M.S. Singh, S. Doda, M. Yildirim, J. Cheng, P. Lukowicz

In this paper, we present an approach for person identification using the morphing of footsteps measured by a fabric-based pressure mapping sensor system. The flexible fabric sensor is 0.5 mm thin and operates under a 5 mm thick normal carpet; therefore, it can easily be integrated into modern smart living spaces. We extract single-step features describing the shift of the center of gravity, the maximum pressure point and the overall pressed area, which are independent of shape details and inter-step relationships of the walking sequences. The system is evaluated with 13 participants wearing shoes and walking normally across the carpet. Overall, 529 footsteps are recorded, and the resulting average identification accuracy is 76.9%. Our approach can also be used for further activity recognition with the same physical carpet sensors.
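The sketch below computes step-level features of the kind named above: the trajectory of the center of pressure, the maximum-pressure point, and the pressed area over the frames belonging to a single footstep. Matrix size, thresholds, and the exact feature vector are illustrative assumptions.

```python
# Illustrative per-footstep feature extraction from carpet pressure frames.
import numpy as np

def step_features(frames, active_thresh=0.1):
    """frames: (time, rows, cols) pressure frames covering one footstep."""
    cops, max_pts, areas = [], [], []
    for f in frames:
        total = f.sum() + 1e-9
        ys, xs = np.indices(f.shape)
        cops.append(((ys * f).sum() / total, (xs * f).sum() / total))  # center of pressure
        max_pts.append(np.unravel_index(f.argmax(), f.shape))          # max-pressure point
        areas.append((f > active_thresh * f.max()).sum())              # pressed area
    cops = np.array(cops, dtype=float)
    cop_shift = np.linalg.norm(cops[-1] - cops[0])                     # heel-to-toe CoP shift
    return np.array([cop_shift, np.mean(areas), np.max(areas),
                     np.ptp(np.array(max_pts)[:, 0]),                  # max-pressure row travel
                     frames.max()])

step = np.random.rand(40, 64, 64)  # one synthetic footstep, e.g. 40 frames
print(step_features(step))
```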

S. Jiong, B. Erik, S. Fernando, B. Zhou

Social touch plays an important role not only in human communication but also in human-robot interaction. We report results from an ongoing study on affective human-robot interaction. In our previous research, touch type was shown to be informative of the communicated emotion. Here, a soft matrix array sensor is used to capture the tactile interaction between human and robot, and a method based on PCA and kNN is applied to classify different touch types, constituting a pre-stage to recognizing emotional tactile interaction. Results show an average recognition rate of 71% across touch types, with large variability between the different types of touch. The results are discussed in relation to affective HRI and social robotics.
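A minimal scikit-learn sketch of the PCA + kNN pipeline named above, run on synthetic flattened pressure frames; the matrix size, number of components, neighbours, and labels are placeholders, not the study's settings.

```python
# Illustrative PCA + kNN touch-type classifier on synthetic pressure data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((210, 400))            # 210 touch samples, 20x20 matrix flattened
y = rng.integers(0, 7, 210)           # 7 touch-type labels (placeholder)

clf = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```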

B. Zhou, H.C. Zurian, S.R. Atefi, E. Billing, F. Seoane, P. Lukowicz

In this paper, we developed a fully textile sensing fabric for tactile touch sensing as a robot skin to detect human-robot interactions. We defined seven gestures inspired by the interactions of typical person-to-person and person-to-pet scenarios. In an evaluation experiment with 5 participants, the machine learning algorithm recognizes the gestures with 98% accuracy if the algorithm has encountered the person before, and 86.8% accuracy if the person is excluded from the training data.

Textile pressure force mapping, Smart Textiles, Springer, 2017

B. Zhou, P. Lukowicz

While much effort in smart textile technology development has focused on acquiring biomedical signals such as ECG/EMG or tissue bioimpedance, an important alternative is mapping the pressure applied to the textile substrate itself. In recent years, this modality has inspired researchers to instrument a wide variety of daily items and wearable garments for interactive control and activity monitoring. To offer a guideline for implementing such systems, this chapter introduces textile-based pressure force mapping sensing technology, from comparisons with other smart textile technologies to sensing principles, driving circuitry and, finally, several application examples.

B. Zhou, J. Cheng, A. Mawandia, Y. He, Z. Huang, M. Sundholm, M. Yildrim, H.C. Zurian, P. Lukowicz

Based on a series of projects using textile pressure mapping matrices (TPM) for ubiquitous and wearable activity recognition in various scenarios, we have accumulated the knowledge and experience to develop an open-access hardware and software framework that enables broader education and allows the scientific community to build their own TPM applications. The hardware framework includes all the resources needed to manufacture the sensing equipment and instructions to build the fabric sensors for a TPM of up to 32×32 elements. The software framework 'Textile-Sandbox' contains ready-to-use tools and modules that support both running experiments and data mining. The framework is evaluated with 10 master students working in 4 groups: 4 applications are developed from scratch and validated within only 40 hours. We present this framework and the evaluated applications in this paper.

B. Zhou, H. Koerger, M. Wirth, C. Zwick, C. Martindale, B. Eskofier, H.C. Zurian, P. Lukowicz

In this paper we present a smart soccer shoe that uses textile pressure sensing matrices to detect and analyze the interaction between a player's foot and the ball. We describe the sensor system, which consists of two 3×4 and one 3×3 matrices sampled at over 500 Hz with low-power electronics that allow continuous operation (incl. wireless transmission) for 8 hours using a small 800 mAh Li-Po battery. We show how relevant parameters for shot analysis, such as contact speed and contact angles, can be reliably derived from the sensor signals. To ensure reliable ground truth, we evaluated the system with a kick robot in the adidas testing facility, which is the standard approach used by adidas to systematically and quantitatively test new shoes and balls. The test encompasses 17 different types of shots and achieves near 100% classification accuracy/F-score. The system endured extreme levels of impact, resulting in ball speeds of over 100 km/h.

R. Zhang, M. Freund, O. Amft, J. Cheng, B. Zhou, P. Lukowicz, F. Seoane, P. Chabrecek

We investigate a generic fabric material as basis for resistive pressure and bio-impedance sensors and apply the fabric in a shirt collar for swallowing spotting. A pilot study confirmed the signal performance of both sensor types.

J. Cheng, M. Sundholm, B. Zhou, M. Hirsch, P. Lukowicz

In this paper we present textile-based surface pressure mapping as a novel, unobtrusive information source for activity recognition. The concept is motivated by the observation that the vast majority of human activities are associated with certain types of surface contact (walking, running, etc. on the floor; sitting on a chair/sofa; eating, writing, etc. at a table; exercising on a fitness mat, and many others). A key hypothesis that we validate in this paper is: by analysing subtle features of such interaction, various complex activities, often ones that are difficult to distinguish using other unobtrusive sensors, can be well recognised. A core contribution of our work is a sensing and recognition system based on cheap, easy-to-produce textile components. These components can be integrated into matrices with tens of thousands of elements, a spatial pitch as fine as 1 cm2, temporal granularity of up to 40 Hz and a pressure dynamic range from 0.25×10^5 to 5×10^5 Pa. We present the evaluation of the concept and the technology in five scenarios, ranging from a matrix monitoring driver motions on a car seat (32×32 sensors on 32×32 cm2), through a Smart-YogaMat (80×80 sensors on 80×80 cm2) detecting and counting exercises, to a Smart-Tablecloth (30×42 sensors on 30×42 cm2) recognising various types of food being eaten.

B. Zhou, M. Sundholm, J. Cheng, H.C. Zurian, P. Lukowicz

We present a wearable textile sensor system for monitoring muscle activity, leveraging surface pressure changes between the skin and an elastic sport support band. The sensor is based on an 8×16 element fabric resistive pressure sensing matrix with 1 cm spatial resolution, which can be read out at a 50 fps refresh rate. We evaluate the system by monitoring leg muscles during leg workouts in a gym outside the lab. The sensor covers the lower part of the user's quadriceps. The shape and movement of the two major muscles (vastus lateralis and medialis) are visible in the data during various exercises. The system registers the activity of the user every second, including which machine he/she is using, walking, relaxing and adjusting the machines; it also counts the repetitions of each set and evaluates the force consistency, which is related to workout quality. 6 people participated in the experiment, covering 24 leg workout sessions in total. Each session includes cross-trainer warm-up and cool-down and 3 different leg machines, with 4 sets on each machine. Including relaxing, adjusting machines, and walking, we perform activity recognition and quality evaluation through the 2-dimensional mapping and the time sequence of the average force. We reach 81.7% average recognition accuracy on a 2 s sliding window basis, 93.3% on an event basis, and an 85.6% spotting F1-score. We further demonstrate how to evaluate workout quality through counting, force pattern variation and consistency.

B. Zhou, J. Cheng, P. Lukowicz, A. Reiss, O. Amft

In this article, the authors investigate a new source of information for dietary monitoring: pressure distribution on the surface underneath dining plates. Pressure sensing has previously been used to determine the weight of the eaten food. The core idea behind their work is that dynamic pressure information can also be used to distinguish between various cutlery-related activities, such as cutting, poking, stirring, or scooping. The authors show how to spot such individual actions in continuous data streams, assign them to specific containers (main plate, salad bowl), count them (how many bites taken), and relate them to different abstract food categories. They consider two sensing modalities: (1) textile pressure-sensor matrix technology facilitating a "smart tablecloth" that looks and feels like a standard tablecloth but provides detailed information on the spatial and temporal pressure distribution; and (2) standard force sensitive resistor (FSR) sensors placed underneath a rigid tray. They also present the results of a new, comprehensive study with 10 subjects, each of whom consumed a total of eight meals chosen from 17 possible main dishes with six possible side dishes; the results show an average accuracy of up to 94 percent. This article is part of a special issue on pervasive food.

B. Zhou, J. Cheng, M. Sundholm, A. Reiss, W. Huang, O. Amft, P. Lukowicz

We present a novel sensor system for the support of nutrition monitoring. The system is based on a smart tablecloth equipped with a fine-grained textile pressure matrix and a weight-sensitive tablet. Unlike many other nutrition monitoring approaches, our system is unobtrusive, not privacy invasive, and easily deployable in everyday life. It allows the spotting and recognition of food intake related actions such as cutting, scooping, and stirring, the identification of the plate/container on which the action is executed, and the tracking of the weight change in the containers. In other words, we can determine how many pieces are cut on the main dish plate, how many are taken from the side dish, how many sips are taken from the drink, how fast the food is being consumed and how much weight is taken overall. In addition, the distinction between different eating actions, such as cutting, scooping, and poking, provides clues to the type of food taken and the way the meal is consumed. We have evaluated our system on 40 meals (5 subjects) in a real-life living environment for seven eating related actions (cutting, scooping, stirring, etc.), resulting in an above 90% average recognition rate for person-dependent cases, and spotting of each action out of continuous data streams (average F1 score 87%).

J. Cheng, M. Sundholm, M. Hirsch, B. Zhou, S. Palacio, P. Lukowicz

We show how a ubiquitous pressure sensor matrix can be used as an information source for service robots in two different applications. The textile pressure sensor, which exploits the ubiquity of gravity, can be put on most surfaces in our environment to trace forces. As safety and human-robot interaction are key factors for daily-life service robots, we evaluated the pressure matrix in two scenarios: on the ground with toy furniture, demonstrating its capability for indoor localization and obstacle mapping, and on a sofa as a ubiquitous input device for giving commands to the robot in a natural way.

B. Wang, J. Cheng, B. Zhou, O. Amiraslanov, P. Lukowicz, M. Zhang

In this paper we use ubiquitous smart chairs to evaluate live presentations. We validate the hypothesis that the audience's activities can be recognized with pressure sensors under the chairs' legs (74.6% accuracy for 8 typical activities in 8 live presentations, each with 6 chairs seated), and that certain activities (e.g. talking, taking notes) are linked to the audience's subjective attitudes and evaluation of the presentation. Moreover, combinations of interactive activities like talk-take note, nod-laugh and take note-laugh are well linked to a positive rating of the presentation.

M. Sundholm, J. Cheng, B. Zhou, A. Sethi, P. Lukowicz

There is a large class of routine physical exercises that are performed on the ground, often on dedicated "mats" (e.g. push-ups, crunches, bridge). Such exercises involve coordinated motions of different body parts and are difficult to recognize with a single body-worn motion sensor (like a step counter). Instead, a network of sensors on different body parts would be needed, which is not always practicable. As an alternative, we describe a cheap, simple textile pressure sensor matrix that can be unobtrusively integrated into exercise mats to recognize and count such exercises. We evaluate the system on a set of 10 standard exercises. In an experiment with 7 subjects, each repeating each exercise 20 times, we achieve a user-independent recognition rate of 82.5% and a user-independent counting accuracy of 89.9%. The paper describes the sensor system, the recognition methods and the experimental results.
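One simple way to illustrate the counting step mentioned above: reduce each mat frame to total pressure, smooth, and count peaks, one per repetition. The real system uses richer spatial information; sampling rate, smoothing, and thresholds below are assumptions.

```python
# Illustrative repetition counting from mat pressure frames via peak detection.
import numpy as np
from scipy.signal import find_peaks

def count_reps(frames, fs=40, min_period_s=1.0):
    """frames: (time, rows, cols) pressure frames of one exercise set."""
    total = frames.reshape(len(frames), -1).sum(axis=1)
    smoothed = np.convolve(total, np.ones(fs) / fs, mode="same")
    peaks, _ = find_peaks(smoothed, distance=int(min_period_s * fs),
                          prominence=0.2 * smoothed.std())
    return len(peaks)

fs = 40
t = np.arange(0, 30, 1 / fs)
frames = (1 + np.sin(2 * np.pi * 0.5 * t))[:, None, None] * np.ones((1, 8, 8))
print(count_reps(frames, fs))  # ~15 repetitions at 0.5 Hz over 30 s
```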

J. Cheng, M. Sundholm, B. Zhou, M. Kreil, P. Lukowicz

We demonstrate, through a pressure sensor matrix, that the weight distribution on the feet is influenced by body posture. A small, cheap carpet equipped with a low-precision pressure sensor matrix is already sufficient to detect subtle activities and the identity of the person on the carpet. With a 0.4 m2 matrix of 32 × 32, 12-bit pressure sensors, we achieve 78.7% accuracy for 11 test subjects performing 7 subtle activities (opening 7 different drawers or cabinet doors) and 88.6% accuracy in recognizing who performed the activities. We thus see the potential of using a single carpet as a unified approach in houses to detect how inhabitants interact with the furniture, without attaching separate sensors to each individual piece of furniture.

B. Zhou, J. Cheng, M. Sundholm, P. Lukowicz

We describe the design and implementation of an unobtrusive, cheap, large-scale pressure sensor matrix that can be used for a variety of applications ranging from smart clothing, through smart furniture, to an intelligent tablecloth or carpet. The specific functionality, and with it most of the complexity, lies in the electronics and the processing software. We propose a scalable, modular architecture for such electronics, describe a prototype implementation, and present the results of its application to three different scenarios.
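As a conceptual illustration of what such matrix readout electronics typically do, the sketch below scans the matrix row by row, driving one row at a time and sampling every column ADC to produce one pressure frame per scan. The hardware calls (drive_row, read_column_adc) are hypothetical placeholders, not part of the published architecture.

```python
# Conceptual row/column scanning sketch for a resistive pressure matrix readout.
import numpy as np

def scan_frame(n_rows, n_cols, drive_row, read_column_adc):
    """Return one (n_rows, n_cols) frame by scanning the matrix row by row."""
    frame = np.zeros((n_rows, n_cols))
    for r in range(n_rows):
        drive_row(r)                          # energize row r, leave others high-impedance
        for c in range(n_cols):
            frame[r, c] = read_column_adc(c)  # sample the column's ADC value
    return frame

# Stand-in hardware functions for demonstration only.
state = {"row": 0}
fake_matrix = np.random.rand(32, 32)
frame = scan_frame(32, 32,
                   drive_row=lambda r: state.update(row=r),
                   read_column_adc=lambda c: fake_matrix[state["row"], c])
print(frame.shape)
```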

W. Huang, M. Kreil, J. Cheng, B. Wang, B. Zhou, P. Lukowicz

In this paper we present a compact and versatile acquisition platform with a modular design, which is useful and flexible for various sensing applications (such as mechanical detection and ubiquitous computing) thanks to the combination of a high-resolution analog-to-digital converter (ADC), low power consumption and exchangeable wireless transmission. The 31×45 mm2 module supports simultaneous acquisition on up to 8 channels with an effective number of bits (ENOB) of up to 20.9 and a sampling rate of up to 32 kSPS each. We also implement two popular modes of wireless transmission featuring high data rate and low power, namely a transmission rate of 21.3 Kbps using Bluetooth Low Energy (BLE, power consumption 30 mW) and 263 Kbps using WiFi (723 mW). Additionally, two typical wireless sensing applications, floor vibration detection and electrocardiograph (ECG) signal acquisition, are tested to demonstrate the performance of the platform. The results show that the platform satisfies the requirements of these applications in terms of high resolution, low power and efficient wireless data transmission, which is greatly helpful for exploratory experiments in wireless sensing applications.

J. Cheng, B. Zhou, M. Sundholm, P. Lukowicz

In this paper, we investigate how much information about user activity can be extracted from simple pressure sensors mounted under the legs of a chair. We show that it is possible to detect not only different postures (0.826 accuracy for 5 subjects and 7 classes) but also subtle hand and head related actions like typing and nodding (0.880 accuracy for 5 subjects and 5 classes). Combining features related to postures and such simple actions, we can detect high-level activities such as working on a PC, watching a movie or eating in a continuous, real-life data stream. In a dataset of 105.6 hours recorded from 4 subjects, we achieved 0.783 accuracy for 7 classes.

J. Cheng, B. Zhou, K. Kunze, C.C. Rheinländer, S. Wille, N. Wehn, J. Weppner, P. Lukowicz

We build on previous work [5] that demonstrated, in simple isolated experiments, how head and neck related events (e.g. swallowing, head motion) can be detected using an unobtrusive, textile capacitive sensor integrated into a collar-like neckband. We have now developed a 2nd generation that allows long-term recording in real-life environments in conjunction with a low-power Bluetooth-enabled smartphone. It allows the system to move from the detection of individual swallows, which is too unreliable for practical applications, to an analysis of the statistical distribution of swallow frequency. Such an analysis allows the detection of "nutrition events" such as having lunch or breakfast. It also allows us to see the general level of activity and to distinguish between just being absolutely quiet (no motion) and sleeping. The neckband can be useful in a variety of applications such as cognitive disease monitoring and elderly care.