In this presentation, we will address the problem of automatically discovering and analyzing activity patterns from long-term sensor logs, with special focus on video data from public scenes such as a busy traffic junction or a crowded metro station. Analyzing this passively recorded video data can be useful in many applications, such as event-based video retrieval, automatic stream selection, unusual event detection, and event prediction. The adopted approach relies on: (i) simple visual features like optical flow and foreground information, avoiding the need for sophisticated object tracking; (ii) unsupervised methods, so that no manual annotation is needed; and (iii) probabilistic topic models, owing to their hierarchical extensions and powerful inference methods. We will look at a variety of approaches such as Probabilistic Latent Semantic Analysis (PLSA), Probabilistic Latent Sequential Motifs (PLSM), Hierarchical Dirichlet Latent Sequential Motifs (HDLSM) and the Mixed Event Relationship (MER) model that use simple low-level features to infer higher-level scene semantics, including dominant activities, when and how they occur, abnormal events, event relations, and scene contexts.
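To give a flavor of the topic-model machinery underlying these approaches, the following is a minimal sketch of PLSA fitted by EM on a document-word count matrix. In the video setting one would first quantize low-level features (e.g., optical-flow directions at spatial cells) into discrete "visual words" and treat short clips as "documents"; the toy matrix here stands in for those counts, and all variable names are illustrative assumptions rather than the speaker's actual implementation.

```python
import numpy as np

def plsa(counts, n_topics, n_iter=100, seed=0):
    """Fit PLSA by EM: P(w|d) = sum_z P(z|d) P(w|z).

    counts: (n_docs, n_words) matrix of word counts per document.
    Returns P(z|d) of shape (n_docs, n_topics) and
            P(w|z) of shape (n_topics, n_words).
    """
    rng = np.random.default_rng(seed)
    n_docs, n_words = counts.shape
    # Random initialization of the two conditional distributions
    p_z_d = rng.random((n_docs, n_topics))
    p_z_d /= p_z_d.sum(axis=1, keepdims=True)
    p_w_z = rng.random((n_topics, n_words))
    p_w_z /= p_w_z.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # E-step: responsibilities P(z|d,w), shape (docs, words, topics)
        joint = p_z_d[:, None, :] * p_w_z.T[None, :, :]
        joint /= joint.sum(axis=2, keepdims=True) + 1e-12
        # Weight responsibilities by the observed counts
        weighted = counts[:, :, None] * joint
        # M-step: re-estimate P(z|d) and P(w|z)
        p_z_d = weighted.sum(axis=1)
        p_z_d /= p_z_d.sum(axis=1, keepdims=True) + 1e-12
        p_w_z = weighted.sum(axis=0).T
        p_w_z /= p_w_z.sum(axis=1, keepdims=True) + 1e-12
    return p_z_d, p_w_z

# Toy example: two groups of "clips" using disjoint visual vocabularies
counts = np.array([[10, 8, 0, 0],
                   [9, 11, 0, 0],
                   [0, 0, 10, 9],
                   [0, 0, 8, 12]], dtype=float)
p_z_d, p_w_z = plsa(counts, n_topics=2)
```

On such block-structured data the two recovered topics typically align with the two groups of clips, i.e., each group's documents share a dominant topic. PLSM and HDLSM, discussed in the talk, extend this flat model with temporal ordering within motifs and nonparametric (Dirichlet-process) priors over the number of motifs.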
Jagan Varadarajan is a Researcher at the Advanced Digital Sciences Center (ADSC) in Singapore. He earned his B.Sc. (Mathematics), M.Sc. (Mathematics), and M.Tech. in Computer Science, all from Sri Sathya Sai University, Prasanthi Nilayam, AP, India. Subsequently, he worked as a Scientist at HP Labs, Bangalore and GE Global Research, Bangalore from 2005 to 2008 on projects dealing with document image processing, online handwriting recognition, text and video information retrieval, and video analysis, before joining ADSC in 2012.
ADSC is a research center in Singapore, led by Illinois faculty, dedicated to transforming the way people and organizations use and interact with information technologies, and to advancing the knowledge-based economy in Singapore and beyond.