Thursday, May 9, 2013

Camera Motion Detection + UI

This was supposed to be done a long time back.

Some ideas that I worked on:

i) Face detection: not added due to unreliable results.
ii) Direct comparison of two images: not added due to unreliable results and a lot of false positives.
iii) RGB-composition-based techniques: work correctly in experiments.
iv) Aggregated-LUMA-based techniques: work correctly and added to the system.
v) State-based techniques: work correctly and also added to the system.
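
As a rough illustration of the aggregated-LUMA idea (a sketch under my own assumptions, not the app's actual code; the frame format, threshold, and toy data are made up for the example): collapse each frame to a single mean-luma value and flag motion when consecutive frames differ by more than a threshold.

```python
def mean_luma(frame):
    """Average luma of a frame given as rows of (r, g, b) pixels,
    using the Rec. 601 weights."""
    total = 0.0
    count = 0
    for row in frame:
        for r, g, b in row:
            total += 0.299 * r + 0.587 * g + 0.114 * b
            count += 1
    return total / count

def motion_detected(prev_frame, curr_frame, threshold=10.0):
    """Report motion when the aggregated luma shifts past the threshold."""
    return abs(mean_luma(curr_frame) - mean_luma(prev_frame)) > threshold

# Tiny example: a dark frame vs. a brightened frame.
dark  = [[(10, 10, 10)] * 4] * 4
light = [[(80, 80, 80)] * 4] * 4
print(motion_detected(dark, dark))   # same frame: no motion
print(motion_detected(dark, light))  # large luma jump: motion
```

In the real app the frames would come from the camera preview; the 4x4 frames here are just toy data to show the aggregation.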

New work:

Created a UI where the user can choose which algorithm to use for detecting intrusion with the cameras.
Also added a UI where the user can decide which images to store on each phone upon detection, i.e., the previous frame, the original frame, and the changed frame.
Also added a UI to enter a phone number to which a text message will be sent on every detected intrusion.

Wednesday, April 24, 2013

Mechanical Turks

Finally, I have figured out how to create a HIT on Mechanical Turk using the Java API for MTurk. I have successfully created a HIT and received a response from it (I used the worker and requester sandboxes).
The documentation for this is very limited. :(

Also done: video creation (each video is 5 seconds long) and simultaneous upload of the videos.
The next thing remaining is to integrate the Turk Java SDK into the android application.

Thursday, March 28, 2013

SensorFusion

Sensor fusion is the effective idea of using two or more sensors together to get faster and more accurate results. The following talk is a good watch; it helped me get perspective on the accelerometer and gyroscope. Though I am still trying to avoid using the gyroscope in my project, who knows: for perfect motion detection I might end up using it along with the other sensors.


Tuesday, March 26, 2013

Using Accelerometer : Ups and Downs

The accelerometer is a popular sensor for detecting motion, and closer analysis can reveal the type of motion as well. In simple terms, this sensor gives you the acceleration along each axis. Researchers and app developers have used it extensively for detecting walking, running, postures, etc. A generic accelerometer can have 3 or even 6 axes. For this project, I want to use the accelerometer embedded in an Android device to detect the opening and closing of doors and windows. Comparatively, these types of motion are not as complex as changes in posture, walking vs. running, and others.

Again, there are a few unanticipated challenges I faced while trying to detect this predictable variation in acceleration. As you may know, accelerometer readings carry a lot of noise. Suppose the device is lying on your table: if you watch the consecutive readings, the values keep changing even for the x and z axes, and the y axis will show errors due to the gravity factor. This noise becomes a challenge because, unlike walking or running, the change in acceleration from opening a door is not high enough to show a distinct difference unless you open or close the door very fast. Moreover, opening or closing a door happens in a matter of a few seconds (at most maybe 2 seconds), so detection needs to be fast and you do not have the option of sampling over larger intervals. Ideally, false positives and false negatives should both be zero. I can think of three ways to approach this problem for the time being: 1) do quick sampling (200-400 ms) and look for a distinct change in acceleration, or maintain a threshold; 2) use gyroscope readings as well, both to look for variation in angle and to filter out noise; 3) use a better classification algorithm to categorize the changes.

I have been trying a variation of the first approach, since it is simple and fast, and have been experimenting with different threshold values. Also, instead of depending on a single instance of change, I classify the door as in motion only if there has been a continuous increase or decrease in acceleration over a few intervals. For the particular case of opening and closing a door, I consider acceleration along the x and z axes only, for obvious reasons, to cut down on the noise even more.
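
A minimal sketch of this rule (my own illustration, not the app's actual code; the run length, noise margin, and sample values are assumptions): flag the door as moving only when the x-z acceleration magnitude rises or falls monotonically, by more than a noise margin, over several consecutive samples.

```python
import math

def xz_magnitude(sample):
    """Magnitude of acceleration in the x-z plane; sample is (ax, ay, az)."""
    ax, _, az = sample
    return math.hypot(ax, az)

def door_in_motion(samples, run_length=3, noise_margin=0.05):
    """True if any run_length consecutive samples change monotonically
    by more than noise_margin per step."""
    mags = [xz_magnitude(s) for s in samples]
    for i in range(len(mags) - run_length + 1):
        window = mags[i:i + run_length]
        diffs = [b - a for a, b in zip(window, window[1:])]
        # A sustained rise or fall, each step clearing the noise margin.
        if all(d > noise_margin for d in diffs) or all(d < -noise_margin for d in diffs):
            return True
    return False

# Noisy-but-flat readings (device at rest) vs. a steady push on the door.
at_rest = [(0.02, 9.8, 0.01), (0.03, 9.8, -0.01), (0.01, 9.8, 0.02), (0.02, 9.8, 0.0)]
pushed  = [(0.02, 9.8, 0.01), (0.15, 9.8, 0.10), (0.40, 9.8, 0.25), (0.70, 9.8, 0.40)]
print(door_in_motion(at_rest))  # noise only, no sustained trend
print(door_in_motion(pushed))   # sustained increase: in motion
```

Requiring a run of consistent changes, rather than a single spike, is what filters out the per-reading jitter described above.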

I will add snapshots of graphs showing the different cases later today.
In the graph below, the green dots are the points where our algorithm detects the door to be in motion. If you study the graph, the algorithm fails when the door is opened very slowly. For this reason, we will add a second layer to this detection based on sound level/amplitude (later).



Sunday, March 24, 2013

Literature survey (meant to post this a while back)



1) Hao Zheng et al., Design of Mobile Video Surveillance Based on Android, IEEE CSSS, 2012


This paper introduces a system architecture for mobile video surveillance using a smartphone. The paper goes into detail on streaming media transmission and on H.264 and FFmpeg decoding. The performance metrics used in the paper are mainly image quality, delay, and fluency. Some of its shortcomings in terms of smartphone-based surveillance are: i) it considers only video as an input and no other sensors; ii) it has no performance metric to validate improvements of the mobile surveillance approach over regular traditional approaches. But overall, the paper showed that mobile video over a 3G network can be used for surveillance.

2) Iria Estevez-Ayres et al., Using Android Smartphones in a Service-Oriented Video Surveillance System, IEEE ICCE, 2011

3) Won-Ho Chung, A smartphone watch for mobile surveillance service, J. Personal and Ubiquitous Computing, 2012

These papers also introduce systems using IP cameras along with smartphones and describe video-based surveillance systems. They give an overall architecture that provides more flexibility by using smartphones as user terminals to watch and control various areas of interest. Some of their shortcomings in terms of smartphone-based surveillance are: i) no use of other sensors to improve surveillance; ii) no performance metric to validate improvements. But overall, the papers make a contribution by identifying that a smartphone can serve as a user terminal and give more flexibility in control than traditional surveillance approaches.

4) Sangseok Yoon et al., Virtual Lock: A Smartphone Application for Personal Surveillance Using Camera Sensor Networks, 17th IEEE RTCSA, 2011

This paper introduces a sensor network application for home surveillance using a smartphone and wireless camera sensor motes. If an event of interest is detected, a notification is sent to the user via the smartphone. It demonstrates the potential of camera sensor networks in surveillance. A system architecture is defined in which motes connect to a server that can talk to both the motes and the smartphone, and lightweight computer vision algorithms are proposed for detection. The main shortcoming: i) relying only on the camera motes and not using other sensors.

Looking at the current literature and the most relevant papers, we can conclude that multi-modal sensing using smartphones for surveillance is a new and novel concept. We will propose a new system architecture, and a potential scheduling problem can arise, which I will discuss later in detail. Once we gather all related sensor data from our smartphones for surveillance inside a building, we want to upload that data to a server for detection; the main constraint is Wi-Fi capacity, and a scheduling problem may arise about how to upload data based on its importance and the bandwidth limits.

Tuesday, March 5, 2013

Handling Sensors in Android

Hi everyone!

Today I was looking at the APIs available for handling the sensors, and I came across a few classes in the Android SDK that might be useful for the project. I am listing the classes and their brief functions.

i) android.hardware.Camera
                A class that enables an application to interact with the camera to snap a photo, acquire images for a preview screen, and modify the parameters that govern how the camera operates.

ii) android.hardware.SensorManager
           A class that permits access to the sensors available within the Android platform. Not every Android-equipped device will support all of the sensors in the SensorManager, though it's exciting to think about the possibilities.

iii) android.hardware.SensorListener
           An interface implemented by a class that wants to receive updates to sensor values as they change in real time. An application implements this interface to monitor one or more sensors available in the hardware. For example, a class implementing this interface can monitor the orientation of the device via the built-in accelerometer.

iv) android.media.MediaRecorder
           A class used to record media samples, which can be useful for recording audio activity within a specific location (such as a baby nursery). Audio clips can also be analyzed for identification purposes in an access-control or security application.

v) android.media.FaceDetector
           A class that permits basic detection of a person's face as contained in a bitmap. You cannot get much more personal than your face. Using this as a device lock means no more passwords to remember: biometrics capability on a cell phone.

vi) android.os.*
                   A package containing several useful classes for interacting with the operating environment, including power management, file watcher, handler, and message classes. Like many portable devices, Android-powered phones can consume a tremendous amount of power. Keeping a device "awake" at the right time to be in position to monitor an event of interest is a design aspect that deserves attention up front.