Implementation of Covid-19 Social Distance Detection and Suspicious Human Behavior Recognition using Machine Learning

  • Ravikiran H. N., PES College of Engineering, Mandya
  • S. Jyothi, PES College of Engineering, Mandya
Keywords: Abandoned luggage, behavior recognition, blob matching, fainting, fighting, interobject motion, loitering, meeting, object tracking, occlusion, real time, semantics based, surveillance


Detection of suspicious activities in public transport areas using video surveillance has attracted an increasing level of attention. In general, automated offline video processing systems have been used for post-event analysis, such as forensics and riot investigations. However, very little has been achieved regarding real-time event recognition. In this paper, we introduce a framework that processes raw video data received from a fixed color camera installed at a particular location and makes real-time inferences about the observed activities. Supervised machine learning techniques are used to detect and track Covid-19 social-distancing compliance between people moving in public places, with observations drawn from CCTV video. First, the proposed framework obtains 3-D object-level information by detecting and tracking people and luggage in the scene using a real-time blob-matching technique. Based on the temporal properties of these blobs, behaviors and events are semantically recognized by employing object and interobject motion features. A number of types of behavior that are relevant to security in public transport areas have been selected to demonstrate the capabilities of this approach; examples include abandoned and stolen objects, fighting, fainting, and loitering. Using standard public data sets, the experimental results presented here demonstrate the outstanding performance and low computational complexity of this approach.
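The two core steps described above, associating detected blobs across frames and flagging people who stand too close together, can be sketched in a few lines. This is a simplified illustration, not the authors' exact pipeline: the greedy nearest-centroid matcher, the pixel-space distance threshold, and all function names (`match_blobs`, `distance_violations`) are assumptions, and a real system would calibrate distances to the ground plane rather than compare raw pixel coordinates.

```python
import math
from itertools import combinations

def match_blobs(prev_tracks, detections, max_dist=50.0):
    """Greedy nearest-centroid association of new detections to existing tracks.

    prev_tracks: {track_id: (x, y)} from the previous frame.
    detections:  [(x, y), ...] blob centroids in the current frame.
    """
    tracks = dict(prev_tracks)
    unmatched = list(detections)
    for tid, (px, py) in prev_tracks.items():
        if not unmatched:
            break
        # Pick the closest unclaimed detection for this track.
        best = min(unmatched, key=lambda d: math.hypot(d[0] - px, d[1] - py))
        if math.hypot(best[0] - px, best[1] - py) <= max_dist:
            tracks[tid] = best
            unmatched.remove(best)
    # Any leftover detections start new tracks.
    next_id = max(tracks, default=-1) + 1
    for det in unmatched:
        tracks[next_id] = det
        next_id += 1
    return tracks

def distance_violations(tracks, min_gap=100.0):
    """Return pairs of track ids whose centroids are closer than min_gap pixels."""
    return [(a, b)
            for (a, pa), (b, pb) in combinations(sorted(tracks.items()), 2)
            if math.hypot(pa[0] - pb[0], pa[1] - pb[1]) < min_gap]

tracks = match_blobs({0: (10, 10), 1: (200, 200)},
                     [(14, 12), (205, 198), (60, 60)])
print(tracks)                       # {0: (14, 12), 1: (205, 198), 2: (60, 60)}
print(distance_violations(tracks))  # [(0, 2)]
```

In practice the greedy matcher would be replaced by an optimal assignment (e.g. Hungarian matching) to handle occlusions and crossings, but the structure, per-frame association followed by pairwise distance checks, is the same.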

Author Biography

S. Jyothi, PES College of Engineering, Mandya

Associate Professor

