DESIGN AND CONSTRUCTION OF AN INDOOR PEOPLE COUNTING SYSTEM USING SENSFLOOR® AND COMPUTATIONAL NEURAL
NETWORK
By
William Budiatmadjaja 11601018
BACHELOR’S DEGREE in
Mechanical Engineering - Mechatronics Concentration
Faculty of Engineering and Information Technology
SWISS GERMAN UNIVERSITY
The Prominence Tower
Jalan Jalur Sutera Barat No. 15, Alam Sutera Tangerang, Banten 15143 - Indonesia
July 2020
Revision after Thesis Defense on July 17, 2020
STATEMENT BY THE AUTHOR
I hereby declare that this submission is my own work and to the best of my knowledge, it contains no material previously published or written by another person, nor material which to a substantial extent has been accepted for the award of any other degree or diploma at any educational institution, except where due acknowledgement is made in the thesis.
William Budiatmadjaja
_____________________________________________
Student Date
Approved by:
Dr. Rusman Rusyadi, M. Sc.
_____________________________________________
Thesis Advisor Date
Dr. Axel Steinhage
_____________________________________________
Thesis Co-Advisor Date
Dr. Maulahikmah Galinium, S. Kom., M. Sc.
_____________________________________________
Dean Date
ABSTRACT
DESIGN AND CONSTRUCTION OF AN INDOOR PEOPLE COUNTING SYSTEM USING SENSFLOOR® AND COMPUTATIONAL NEURAL
NETWORK
By
William Budiatmadjaja
Dr. Rusman Rusyadi, M. Sc., Advisor
Dr. Axel Steinhage, Co-Advisor
SWISS GERMAN UNIVERSITY
Due to the high need for ambient assisted living, this thesis presents methods for human sensing and counting using the SensFloor system. Neural network architectures, namely feed-forward, recurrent, and convolutional neural networks, are compared to determine which performs best. Datasets taken from data logs and virtually generated data are used to train the models with the TensorFlow library. Since recurrent and convolutional neural networks perform similarly, both are investigated further to sense and count people through coordinate grid segmentation. A suitable neural network algorithm increases the performance of the SensFloor system, as shown on the single object localisation problem.
Keywords: SensFloor, Ambient Assisted Living, Machine Learning, Recurrent Neural Networks, Convolution Neural Networks, Object Detection, TensorFlow.
© Copyright 2020 by William Budiatmadjaja
All rights reserved
DEDICATION
I dedicate this work to the development of ambient assisted living.
ACKNOWLEDGEMENTS
I would like to express my gratitude to the members of my committee for all their support and patience, particularly to Dr. Axel Steinhage and Raoul Hoffmann for their direct guidance throughout my thesis work. Their guidance helped me not only with the technical work of my thesis but also in getting through these tough pandemic times. I would like to thank Dr. Rusman Rusyadi for his long-distance support and our weekly web meetings; his support meant a lot in the decisions I made on my thesis work. I would also like to thank Dr. Yunita Umniyati, my academic counsellor, for believing in me and supporting me from my first year at university until my graduation.
I am also grateful for all the support my parent, Santoso, gave me. Many thanks to my significant other, Sora Azalia, who has given me emotional support as well as some technical support on my thesis work, and to her family, Martinus, Rhatna, Nadia Ariela, and Angelica Aurea, who have given me their support and trust. I would also like to recognize the assistance and support that I received from all my friends during my time at Swiss German University.
TABLE OF CONTENTS
Page
STATEMENT BY THE AUTHOR ... 2
ABSTRACT ... 3
DEDICATION... 5
ACKNOWLEDGEMENTS ... 6
TABLE OF CONTENTS ... 7
LIST OF FIGURES ... 10
LIST OF TABLES ... 12
1. INTRODUCTION... 13
1.1 Background ... 13
1.2 Research Problems ... 14
1.3 Research Objectives ... 14
1.4 Significance of Study ... 15
1.5 Research Questions ... 15
1.6 Hypothesis... 15
2. LITERATURE REVIEW ... 16
2.1 Sensors ... 16
2.1.1 Video Camera ... 16
2.1.2 Visible Light Communication... 17
2.1.3 Pressure Sensitive Floor... 18
2.1.4 SensFloor ... 19
2.2 Jupyter Notebook and Python ... 20
2.3 Neural Network Frameworks... 21
2.3.1 Pytorch ... 21
2.3.2 TensorFlow ... 22
2.3.3 Keras API ... 23
2.4 Neural Network Architectures ... 24
2.4.1 Machine Learning Techniques... 24
2.4.2 Feed-Forward Neural Networks... 25
2.4.3 Neural Network Hyperparameters ... 25
2.4.3.1 Activation Function ... 28
2.4.3.1.1 Softmax ... 28
2.4.3.1.2 Sigmoid ... 28
2.4.3.1.4 Rectified Linear Units (ReLU)... 29
2.4.3.2 Loss Function... 29
2.4.3.2.1 Mean Squared Error ... 30
2.4.3.2.2 Mean Squared Logarithmic Error ... 30
2.4.3.2.3 Mean Absolute Error ... 30
2.4.3.2.4 Binary Cross-Entropy... 30
2.4.3.2.5 Categorical Cross-Entropy ... 31
2.4.3.3 Optimizer ... 31
2.4.3.3.1 Stochastic Gradient Descent (SGD) ... 31
2.4.3.3.2 Adaptive Moment Estimation (Adam) ... 31
2.4.4 Convolution Neural Networks ... 32
2.4.5 Recurrent Neural Network ... 33
2.4.5.1 Long-Short Term Memory (LSTM) ... 34
2.4.5.2 Gated Recurrent Unit (GRU) ... 35
3. RESEARCH METHODS ... 37
3.1 Research Framework... 37
3.2 SensFloor Data ... 38
3.2.1 SE3-P Connection ... 38
3.2.2 Message Format ... 38
3.2.3 SensFloor API Analysis ... 39
3.2.4 VirtualSftxServer ... 41
3.3 Neural Network Implementation ... 41
3.3.1 Image Processing Method ... 42
3.3.2 Raw Data Processing Method ... 42
3.4 Tests Evaluation and Final Model Analysis ... 43
3.5 Research Timeline... 43
4. RESULTS AND DISCUSSION ... 44
4.1 Environment Preparation ... 44
4.2 First Stage ... 46
4.2.1 Data Preparation... 47
4.2.2 Neural Network Model ... 49
4.2.2.1 Architecture ... 49
4.2.2.2 Training Process ... 51
4.2.2.3 Model Result... 52
4.3 Second Stage ... 53
4.3.1 Data Preparation... 54
4.3.2 Neural Network Model ... 57
4.3.2.1 Architecture ... 58
4.3.2.2 Training Process ... 61
4.3.2.3 Model Result... 63
4.4 Third Stage ... 65
4.4.1 Data Preparation... 65
4.4.2 Neural Network Model ... 68
4.4.2.1 Architecture ... 68
4.4.2.2 Training Process ... 71
4.4.2.2.1 First Approach ... 71
4.4.2.2.2 Second approach ... 72
4.4.2.3 Model Result... 73
4.5 Final Model ... 79
4.5.1 Input Mapping... 79
4.5.2 State Filtering ... 81
5. CONCLUSIONS AND RECOMMENDATIONS ... 85
5.1 Conclusions ... 85
5.2 Recommendations ... 86
GLOSSARY... 87
REFERENCES... 88
APPENDIXES ... 93
CURRICULUM VITAE... 114
LIST OF FIGURES
Page
Figure 2.1.1 SensFloor Schematic ... 19
Figure 2.1.2 Schematic of an Open Capacitor (Zimmerman et al., 1995) ... 20
Figure 2.3.1 Schematic TensorFlow dataflow graph (Abadi et al., 2016b) ... 22
Figure 2.4.1 Batch Size and Number of Epoch Effects on Accuracy ... 26
Figure 2.4.2 Dropout Rate Effects on Accuracy ... 27
Figure 2.4.3 A General CNN Architecture ... 32
Figure 2.4.4 A General RNN Layer ... 33
Figure 2.4.5 A General LSTM Architecture ... 34
Figure 2.4.6 A General GRU Architecture ... 35
Figure 3.1.1 Research Framework ... 37
Figure 3.2.1 Splitting Object... 40
Figure 3.2.2 VirtualSftxServer Flow ... 41
Figure 4.1.1 SensFloor Installed Corridor ... 45
Figure 4.2.1 First Stage Data Parsing ... 47
Figure 4.2.2 Input Data and Output Data Set-Up Function ... 48
Figure 4.2.3 First Stage Data Intake ... 49
Figure 4.2.4 First Stage Model Architecture ... 50
Figure 4.2.5 First Stage Model Accuracy Records ... 51
Figure 4.2.6 First Stage Model Loss Function Records... 51
Figure 4.2.7 One Hidden Layer Model Intermediate Activation Visualisation ... 52
Figure 4.3.1 Second Stage Data Parsing ... 54
Figure 4.3.2 Data Integration ... 55
Figure 4.3.3 Data Distribution ... 57
Figure 4.3.4 Second Stage First Approach Model Architecture ... 58
Figure 4.3.5 Activation Grid Search ... 58
Figure 4.3.6 Second Stage Second Approach Model Architectures ... 59
Figure 4.3.7 Second Stage Model Accuracy Records... 61
Figure 4.3.8 Second Stage Model Loss Function Records ... 62
Figure 4.3.9 Model Result True vs Prediction ... 63
Figure 4.3.10 Extrapolation Test Result ... 64
Figure 4.4.1 Object Count from Available Data Log ... 66
Figure 4.4.2 Visualisation of SensFloor Messages ... 66
Figure 4.4.3 Second Stage Simulated Data Trained LSTM Model Result ... 67
Figure 4.4.4 Third Stage Convolution Model Architecture ... 69
Figure 4.4.5 Third Stage First Approach Accuracy Records ... 71
Figure 4.4.6 Third Stage First Approach Loss Function Records ... 72
Figure 4.4.7 Third Stage Second Approach Accuracy Records ... 72
Figure 4.4.8 Third Stage Second Approach Loss Function Records ... 72
Figure 4.4.9 Third Stage First Approach Coordinate Prediction Result ... 73
Figure 4.4.10 Third Stage First Approach Number of Object Prediction Result ... 74
Figure 4.4.11 Training Data Number Effect on Third Stage First Approach Models ... 75
Figure 4.4.12 Different Maximum Number of Object Effects on Model Performance ... 76
Figure 4.4.13 Third Stage First Approach Number of Object Prediction Result ... 77
Figure 4.4.14 Different Maximum Number of Object Effects on Model Performance ... 78
Figure 4.4.15 Training Data Number Effect on Third Stage Second Approach Models ... 78
Figure 4.5.1 Final Model General Architecture ... 79
Figure 4.5.2 Input Mapping Pretraining ... 80
Figure 4.5.3 Structure of ConvLSTM (Xavier, 2019) ... 81
Figure 4.5.4 State Filtering Training... 82
Figure 4.5.5 Final Model Prediction Sample ... 83
Figure 4.5.6 Final Model Tweaked Standard Convolution Prediction ... 83
LIST OF TABLES
Page
Table 3.2.1 SensFloor Message Format ... 38
Table 3.2.2 SensFloor Message Meaning ... 39
Table 3.2.3 SensFloor Tracking API Output Format ... 40
Table 3.5.1 Research Timeline ... 43
Table 4.1.1 System Configuration and Library-Version ... 46