DIY outdoor CCTV system: a short how-to.
----------------------------------------

Requirements:
-------------

- keep video streams on disk for 30+ days;
- online access to the streams from a desktop browser and from iOS
  devices (no Flash);
- tolerance to an unstable IP uplink;
- night vision;
- no cloud services (thanks);
- minimal wiring (i.e. PoE) and reasonable CPU requirements;
- moderate costs;
- learn nginx-rtmp-module;
- have some fun.

At this point these are not required:

- motion detection;
- face recognition;
...

Hardware:
---------

1. Cameras: 2x HiWatch DS-N201, 1x DS-I110.  All are IP66, RTSP/H.264,
   PoE.  With the default settings (1024x768, up to 2 Mbps stream,
   medium quality, 25 fps) each camera produces ~40MB every 5 minutes.
   That is 24 hours * 12 five-minute slots * 40MB * 3 cameras = about
   35GB/day; call it 40GB/day, or roughly 1.2TB/month.  This is not that
   big these days, and you can reduce it greatly by lowering the frame
   rate, quality or resolution.

2. It makes sense to use a simple PoE switch.  Mine is a TP-Link
   TL-SF1008P.

3. A server box with moderate CPU power and big enough storage.  Mine
   is an HP MicroServer Gen8 with 2x3TB 3.5" SATA HDDs.

Software:
---------

1. A Unix-like system: I'm happy with FreeBSD 10.3, your mileage may
   vary (tm).

2. nginx + nginx-rtmp-module: a super-powerful combo.

3. ffmpeg: even more powerful -- I don't know anybody who has managed
   to read the ffmpeg man page from beginning to end.

Configuration:
--------------

Some essential parts of nginx.conf:

    # nginx-rtmp-module wiki: https://github.com/arut/nginx-rtmp-module/wiki
    # nginx-rtmp-module on habrahabr (in Russian): https://habrahabr.ru/post/162237/

    http {
        <...>

        # HLS location for iOS devices.
        location /hls {
            root /var/tmp/nginx;
        }
    }

    rtmp {
        server {
            listen 1935;

            application cctv {
                allow publish 127.0.0.0/24;
                allow publish 10.82.64.3;
                deny publish all;
                allow play all;

                live on;

                # HLS fragment generation.
                hls on;
                hls_path /var/tmp/nginx/hls;
                hls_fragment 5s;

                # nginx records all camera streams in 5-minute fragments.
                record video;
                record_path /cctv/spool;
                record_interval 5m;
                record_suffix -%Y-%m-%d-%H-%M.flv;
            }
        }
    }

/var/tmp/nginx/hls/index.html, a part of the html page for playing the
flash/rtmp streams with jwplayer:

    Loading the player ...
    <...>

The jwplayer stuff was copied from arut's github repo.
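
For the iOS/HLS side you don't need a player page just to check that
things work: nginx-rtmp writes one playlist per stream under
/var/tmp/nginx/hls, and it can be fetched or played directly.  A quick
check -- the host name cctv.example.com is a placeholder, cam00 is the
stream name published below:

    # The playlist appears once the first HLS fragment has been written.
    $ curl http://cctv.example.com/hls/cam00.m3u8

    # ffplay, part of the ffmpeg suite, can play the same URL.
    $ ffplay http://cctv.example.com/hls/cam00.m3u8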

Now just copy and publish the rtmp stream from each camera:

    $ ffmpeg -re -rtsp_transport tcp -i rtsp://$cam-ip \
        -c copy -f flv rtmp://127.0.0.1/cctv/cam00

You can extend the ffmpeg command with various video processing
options, e.g. combine all three pictures into one, play with the frame
rate, etc.  With all of the above, the load on two Celeron 2.30GHz
cores is barely measurable.

You will also need to write some simple startup and housekeeping
scripts to remove old video files from the spool, rotate logs and do
offline video processing (a minimal cleanup sketch is appended at the
very end of this note).

Some useful ffmpeg links for those who can't sleep:

https://trac.ffmpeg.org/wiki/How%20to%20speed%20up%20/%20slow%20down%20a%20video
http://ffmpeg.org/ffmpeg-filters.html#overlay-1
https://ffmpeg.org/pipermail/ffmpeg-user/2013-June/015662.html
http://stackoverflow.com/questions/11552565/vertically-stack-several-videos-using-ffmpeg

Typical CPU usage, top(1) summary:

    last pid: 83326;  load averages: 0.15, 0.11, 0.09  up 13+22:39:34  13:26:19
    35 processes:  1 running, 34 sleeping
    CPU:  0.4% user,  0.0% nice,  0.6% system,  0.0% interrupt, 99.0% idle
    Mem: 22M Active, 3180M Inact, 673M Wired, 32M Cache, 426M Buf, 20M Free
    Swap: 8192M Total, 69M Used, 8123M Free

A collection of small ffmpeg recipes:

    # Cut video from the "-ss" time stamp with the "-t" duration.
    ffmpeg -i cam02-2016-11-15-08-23.flv -ss 00:00:35 -t 00:00:36 -async 1 cut.mp4

    # Motion detection: keep only frames that differ noticeably from
    # the previous one.
    ffmpeg -i temp.flv -vf "select=gt(scene\,0.050),setpts=N/(25*TB)" out.flv

    # Concatenate the files listed in "list".
    $ cat list
    file 1.mp4
    file 2.mp4
    file 3.mp4
    $ ffmpeg -f concat -i list -c copy ivan.mp4

    # Convert mov to mp4.
    ffmpeg -i input.mov -qscale 0 output.mp4

--
maxim.konovalov@gmail.com

$Id: cctv.txt,v 1.9 2016/12/14 10:34:06 maxim Exp $
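
The cleanup part of the housekeeping mentioned above can be as trivial
as a periodic find(1) run from cron.  A minimal sketch, assuming the
recordings land in /cctv/spool as configured above and the 30+ days
retention requirement; the script name and cron schedule are made up:

    #!/bin/sh
    # cctv-cleanup.sh (hypothetical name): drop recorded fragments
    # older than 30 days.  Run it daily from cron, e.g.:
    #   0 4 * * * /usr/local/sbin/cctv-cleanup.sh
    find /cctv/spool -type f -name '*.flv' -mtime +30 -delete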