To use the flows on this page, you must first have followed the node-opencv with Node-Red setup.

BackgroundSubtractionFlow

Some advanced motion detection using BackgroundSubtractor in NR:

To use this flow, you will need to have pulled the btsimonh-dev branch from https://github.com/btsimonh/node-opencv-nr and built it on your Pi; this branch enables background subtraction for opencv2.

You will also need node-red-contrib-multipart-stream-encoder by Bart if you want to see video output in a browser.

Load the flow from here (sorry, flows pasted directly on this WordPress site get corrupted).

Then inject from ‘Start’.

To view the output of the BackgroundSubtraction process, point a browser at http://raspberrypi:1880/test2, and you should see an mjpeg stream.  Move something in front of the camera, and you should see the results of MOG.  You can also use MOG2 and GMG by changing the code in NR (createMOG -> createMOG2/createGMG).
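For reference, the per-frame subtractor code in a Node-RED function node might look roughly like the sketch below. This is an illustration only, not the code from the flow itself: it assumes node-opencv has been exposed to function nodes as 'cv' via functionGlobalContext, and the applyMOG method name is taken from node-opencv's test suite; check the btsimonh-dev branch for the exact API.

// Sketch of the subtractor step in a Node-RED function node.
// Assumes node-opencv is exposed via functionGlobalContext as 'cv'
// (an assumption, not part of the original flow).
var cv = global.get('cv');

// Create the subtractor once and keep it in node context so its
// background model persists across frames. Swap createMOG for
// createMOG2 or createGMG to try the other algorithms.
var bgsub = context.get('bgsub');
if (!bgsub) {
    bgsub = cv.BackgroundSubtractor.createMOG();
    context.set('bgsub', bgsub);
}

// msg.payload is assumed to be a decoded opencv Matrix holding the
// current frame; the method name applyMOG comes from node-opencv's
// tests and may differ in the btsimonh-dev branch.
bgsub.applyMOG(msg.payload, function (err, fgmask) {
    if (err) { node.error(err, msg); return; }
    msg.payload = fgmask;  // foreground mask of moving pixels
    node.send(msg);
});

return null;  // the message is sent asynchronously from the callback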

Any of these is better than simple frame averaging and differencing…

Note that the MOG/MOG2/GMG processing is currently synchronous; I hope to add asynchronous code as time allows. On my RPi3, the above flow runs at ~80% of one core using ~200 MB, including the encoding to mjpeg (which is asynchronous). I've not measured the FPS or the processing duration yet, but am assuming 30 fps, as we're not restricting the input rate, and normally we'd start to eat memory or cause other problems if we were not able to process the frames fast enough.

Update: asynchronous processing is now available!

Note that I have fitted heatsinks and a fan to my RPi3; if it does hit 85 °C, the CPU speed is throttled and it may no longer be able to keep up with the framerate.

Example output:

[Image: motiondogmog – example output of the MOG background subtraction]


Using the output of the BackgroundSubtractor

This flow:

[Image: motiondogflow – the flow]

highlights detected motion by tracing the contours of the BackgroundSubtractor output and then, for any contour with an area larger than 500 pixels, drawing an enclosing box on the original video frame.
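A rough sketch of how that contour-and-box step could be written in a function node follows. The msg.mask and msg.payload property names are assumptions for illustration only; findContours, size, area, boundingRect and rectangle are the methods exposed by node-opencv's Matrix and Contours wrappers, but do check them against the fork you have built.

// Sketch of the contour-and-box step in a Node-RED function node.
// msg.mask is assumed to hold the BackgroundSubtractor output and
// msg.payload the original frame (both opencv Matrix objects); these
// property names are illustrative, not taken from the flow.
var contours = msg.mask.findContours();

for (var i = 0; i < contours.size(); i++) {
    // Skip small blobs (noise); 500 px is the threshold used above.
    if (contours.area(i) < 500) { continue; }

    var box = contours.boundingRect(i);  // {x, y, width, height}
    msg.payload.rectangle(
        [box.x, box.y],
        [box.width, box.height],
        [0, 255, 0],  // green box
        2             // line thickness
    );
}

return msg;  // original frame with motion boxes drawn on it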

Example output:

[Image: motiondog – example output with motion boxes drawn]


The flows above are difficult to copy, so I've placed a flow in the NR flows library, which is easier to import from, here:

5FPS motion detection flow


Next installment: capturing video sequences of detected motion.
