11-18-2017, 12:53 AM | #1
Major
1,403 Rep | 1,466 Posts
Dashcam file auto download script
I posted in the 'What did you do to your M2 today' thread about the fact that I'd used a script to automatically download the files from my dashcam whenever I'm home. I got a request from dcmac to post more details about this, so here it is... in a more appropriate sub-section.
I should point out from the outset that this script is aimed at BlackVue cameras. It could potentially be modified for other brands, but of course, the device would need to be on your network, and there would need to be some way of getting a list of files and then pulling them off. On the BlackVue cameras this all happens over HTTP, so it's pretty straightforward. So far this script is working perfectly, but I am finding that it takes a very long time to download the files, simply because there are so many of them and they're large. One day of data, without driving particularly far, is in excess of 40 GB! I could probably reduce this by turning down the sensitivity in parking mode, but I would rather know if something has happened to my car, and the low cost of memory makes this a bit of a non-issue anyway. So the script itself is as follows:
Code:
#!/bin/bash
echo

###### Set location where you want to store dashcam data. Sub-directories will be created for each day ######
rootpath=[Your data storage location]

###### If the IP address can be set static in the dashcam or DHCP server, set the IP address here, otherwise comment out this line ######
ipaddress=[Dashcam IP Address]

###### Use to locate IP address if you cannot set it to be static in the dashcam or DHCP, otherwise comment out whole section ######
# macaddress="00:25:42:XX:XX:XX"
# ipaddress=$(arp -n | grep -i $macaddress | awk '{print $1}')
# if [[ -z $ipaddress ]]; then
#     # Ping all addresses between 192.168.1.100 and 192.168.1.200 to make sure we have a complete arp table
#     for i in {100..200}; do (ping 192.168.1.$i -c 1 -w 5 &> /dev/null &); done
#     # Allow some time to get the ping responses
#     sleep 5s
#     # Try again to locate IP address in arp table
#     ipaddress=$(arp -n | grep -i $macaddress | awk '{print $1}')
# fi

if [[ ! -z $ipaddress ]]; then
    count=$(ping -c4 $ipaddress | grep 'received' | awk -F',' '{ print $2}' | awk '{ print $1}')
    if [ $count -eq 4 ]; then
        echo "BlackVue is up at $(date) ($ipaddress)"
        echo "Getting file list"
        if curl -# "http://$ipaddress/blackvue_vod.cgi" -o ${rootpath}list.txt --no-verbose; then
            # sort the file, get the oldest files first
            sort ${rootpath}list.txt -o ${rootpath}sortlist.txt
            while read line
            do
                # check if valid line
                if [[ $line = *Record* ]]; then
                    # extract the different file parts from the line
                    path=$(echo $line | cut -d':' -f 2)
                    path=$(echo $path | cut -d',' -f 1)
                    file=$(echo $path | cut -d'/' -f 3)
                    sdir=$(echo $file | cut -d'_' -f 1)
                    # echo "$sdir" - "$file" - "$path"
                    # check if directory exists. If not, create it.
                    if [ ! -d "$rootpath$sdir" ]; then
                        mkdir "$rootpath$sdir"
                    fi
                    # check if file exists. If not, copy from dashcam.
                    if [ ! -f "$rootpath$sdir/$file" ]; then
                        # If front camera file, try to download gps and 3gf files first
                        if [[ $path = *F.mp4* ]]; then
                            echo "Downloading ${path/F.mp4/.gps}"
                            (cd $rootpath$sdir; curl -# "http://$ipaddress${path/F.mp4/.gps}" -O)
                            echo "Downloading ${path/F.mp4/.3gf}"
                            (cd $rootpath$sdir; curl -# "http://$ipaddress${path/F.mp4/.3gf}" -O)
                        fi
                        if ! (echo "Downloading $path"; cd $rootpath$sdir; curl -# "http://$ipaddress$path" -O); then
                            echo "Transfer of $file failed..."
                            if [ -f "$rootpath$sdir/$file" ]; then
                                # remove bogus (partial) file
                                rm -f "$rootpath$sdir/$file"
                            fi
                        fi
                    fi
                fi
            done < ${rootpath}sortlist.txt
            echo "Completed at $(date)"
        fi
    else
        echo "BlackVue is down at $(date) ($ipaddress not responding)"
    fi
fi
I have a launch daemon running this once every hour. The script only copies files that aren't already copied, so once the initial transfer is finished, the only thing happening every hour is a check for new files. I hope this is useful to some of you.
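As an aside for non-macOS users (this is not from the original post): the same hourly schedule can be expressed as a cron entry on a Linux box or NAS. The script and log paths below are placeholders.

```
# m h dom mon dow  command   (illustrative paths)
0 * * * * /usr/local/bin/blackvue_sync.sh >> /var/log/blackvue_sync.log 2>&1
```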
__________________
2018 ///M2 LCI, LBB, 6MT...
Current Performance Mods: CSF FMIC, ER CP, Fabspeed Cat, Aquamist WMI, GFB DV+, NGK 97506, BM3 (Stage 2 93 OTS), CDV delete, UCP, M2C/M3/M4 Strut Brace, M3/M4 Reinforcement Rings
11-18-2017, 08:47 AM | #2
Lieutenant
439 Rep | 468 Posts
11-20-2017, 12:07 AM | #3
I INDIC8
612 Rep | 1,286 Posts
Drives: 2018 LBB M2, 6MT
Join Date: Dec 2016
Location: Cleveland, OH
This is awesome. I have a BlackVue on my Xmas list, and I will probably set up my NAS to run this at night to clear out the files.
__________________
2004 Matrix XRS 6MT => 2008 VW R32 DSG => 2012 Audi TTRS 6MT => 2018 BMW ///M2 6MT
I'm an IT guy by trade and tech nerd by choice. I like HPDEs, parts of US Hwy 129, NC 28, and the Cherohala Skyway. I'm also a fan of aural pleasure.
11-20-2017, 01:14 AM | #4
Major
1,403 Rep | 1,466 Posts
I think the script will work fine on Linux, but you might have to modify it back to using wget instead of curl if curl isn't already installed in your distribution.
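A small sketch of how that substitution could be made portable (this is my addition, not from the thread): pick whichever downloader is available at runtime. The fetch() bodies mirror the flags used in the script above.

```shell
# Prefer curl, fall back to wget; both save under the remote file name.
if command -v curl >/dev/null 2>&1; then
    fetch() { curl -# "$1" -O; }          # -O: keep the remote file name
    echo "using curl"
elif command -v wget >/dev/null 2>&1; then
    fetch() { wget --no-verbose "$1"; }   # wget keeps the remote name by default
    echo "using wget"
else
    echo "neither curl nor wget found" >&2
    exit 1
fi
```

The rest of the script then calls fetch "$url" regardless of which tool is present.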
11-20-2017, 03:21 PM | #5
I INDIC8
612 Rep | 1,286 Posts
Drives: 2018 LBB M2, 6MT
Join Date: Dec 2016
Location: Cleveland, OH
12-09-2017, 02:18 PM | #6
Major
1,403 Rep | 1,466 Posts
OK, a little update to the script...
I quickly found that I was filling up the disk I was saving the files to, and was having to go and manually delete the older files, which is of course annoying. I have enough disk space for just over two weeks of regular driving. I'm sure a long trip would result in more data, but I do have some wiggle room on the disk, so I decided on 14 days of backups. I inserted the following lines just after the section that sorts the list of files, before the while loop. That way it won't delete files older than 14 days when the dashcam isn't connected at all. Code:
# delete dashcam files over 14 days old to prevent over-filling the disk
find "$rootpath" -mtime +14 -delete
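If you want to sanity-check the retention rule before pointing it at real recordings, a throwaway test like this works (my sketch; the paths and the "20 days ago" timestamp are illustrative, and touch -d needs GNU coreutils). Adding -type f restricts the sweep to files, so the dated sub-directories themselves are left alone:

```shell
# Dry run of the 14-day retention rule against a scratch directory.
tmpdir=$(mktemp -d)
touch "$tmpdir/recent.mp4"                 # brand new file: should survive
touch -d "20 days ago" "$tmpdir/old.mp4"   # 20-day-old file: should be deleted
find "$tmpdir" -type f -mtime +14 -delete
ls "$tmpdir"                               # only recent.mp4 should remain
```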
05-09-2018, 12:27 AM | #7
I INDIC8
612 Rep | 1,286 Posts
Drives: 2018 LBB M2, 6MT
Join Date: Dec 2016
Location: Cleveland, OH
I found what basically amounts to a "one-liner" of your script over here...
https://www.bjornsblog.nl/tips-and-t...synology-wifi/
wget has some niceness: it won't overwrite or try to re-download an existing file, but it is smart enough to complete an unfinished download. The script itself therefore doesn't need any logic to check for existing files, and it's much more robust against the camera disconnecting in the middle of a download. I just set it up tonight and I'm going to let it run for a few days and see what I wind up with. I have ~4.4 TB free on my NAS, so space isn't really a concern right now...
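For anyone curious what that one-liner style looks like, here is a hedged dry-run sketch (my addition: the IP, destination, and sample listing are made up, but the n:/Record/...,s:... listing format matches what the script above parses with cut). Dropping the leading echo would perform the downloads:

```shell
ip=192.168.1.50            # illustrative dashcam IP
dest=/volume1/dashcam      # illustrative NAS destination
# A real listing would come from: wget -qO- "http://$ip/blackvue_vod.cgi"
listing='n:/Record/20180509_210000_NF.mp4,s:1000000
n:/Record/20180509_210000_NR.mp4,s:1000000'
# Strip the n: prefix and ,s:<size> suffix, then fetch each path.
printf '%s\n' "$listing" | sed -n 's/^n:\(.*\),s:.*/\1/p' | while read -r p; do
    echo wget -c -P "$dest" "http://$ip$p"   # -c: resume partial, skip complete files
done
```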
01-07-2020, 05:20 AM | #8
Registered
0 Rep | 2 Posts
Refined script. Thank you.
I joined this forum just to say thanks to Nezil for initiating this post. I have since built on the intellectual property of many others and created a more refined script:
1. Worked and tested on a Netgear RN426 (running ReadyNAS 6.10.2) and a BlackVue DR900S-2CH.
2. Incorporated removal of old downloads at a set time every day.
3. Added features:
a. ffmpeg to check the integrity of downloaded mp4 files (sometimes they get corrupted during download, e.g. the car left the garage)
b. ability to run under crontab every single minute, because it will not start another instance if one is already running (I set mine to every 5 minutes)
c. ability to exclude one file type (I excluded normal files (=N*), so I download only event files (=E*), park files (=P*) and manual files (=M*))
d. ability to avoid duplicate downloads of gps and 3gf files
e. all the adjustable parameters are set at the start of the script to allow easy modification to suit your particular environment
f. ability to choose how many times the script will try to download a file
g. most excitingly, speed-up code was added. Previously downloads were slow because, after a wifi disconnection, the script restarts from the very beginning. Although it would skip already-downloaded files, it wasted a lot of time going file by file only to find each one was already there.
Notes:
1. There is the part from Nezil's script that allows a variable IP address (obtained from the router), but I always have the router set to give my BlackVue a fixed IP address.
2. I am not a programmer but learnt everything from here and other forums.
3. The BlackVue DR900S is really good. (Thank you, BlackVue.) I have Unifi wifi access points that allow devices to connect to the same SSID but automatically connect at 2.4 GHz or 5 GHz, whichever gives the best connection. I can see the BlackVue sometimes connects at 5 GHz and sometimes at 2.4 GHz. Of course, when the signal is good, 5 GHz is a lot faster. (This happens despite choosing only ONE frequency in the BlackVue's own settings.)
4. On the BlackVue, set the wifi cloud to use your garage wifi as "garage wifi". Although your car has been connecting via its mobile 4G, it will automatically reconnect to the garage wifi when you get home because of this setting.
5. Installing ffmpeg on my ReadyNAS is easy: just sudo apt-get install ffmpeg and it will install itself, as the repository is already set up on the Netgear remote server. ffmpeg is very fast at checking video integrity.
6. cronjobs: Google please. On ReadyNAS it is crontab -l for listing and crontab -e for editing.
7. You need to know how to SSH into the ReadyNAS (I suggest the putty.exe freeware).
8. Thanks to Nezil for the share on hardware installation: https://f87.bimmerpost.com/forums/sh....php?t=1457010 I did exactly that: Cellink Neo with a relay switch on the car battery source, and also a Power Magic Pro between the car battery source and the relay switch (Power Magic Pro NOT between the BlackVue and the final electricity supply).
9. Sorry, I can't be a quick responder. I can only do these forum things during holiday.
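Feature (b) hinges on mkdir being atomic: only one process can create the lock directory, so a second invocation bails out immediately. A stripped-down sketch of that guard (my illustration; the lock path is made up, and the full version with PID and stale-lock handling is in the script in the next post):

```shell
# Minimal single-instance guard using mkdir's atomicity.
LOCKDIR=/tmp/dashcam-demo-lock
if mkdir "$LOCKDIR" 2>/dev/null; then
    trap 'rm -rf "$LOCKDIR"' EXIT   # release the lock on any exit
    echo "lock acquired"
    # ... download work goes here ...
else
    echo "another instance is running; exiting"
fi
```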
01-07-2020, 05:40 AM | #9
Registered
0 Rep | 2 Posts
Here it is
Code:
#!/bin/bash

# Script path
scriptpath=/data/Video/Blackvue/Scripts/
# Root path
rootpath=/data/Video/Blackvue/X5/
# Car name
car="X5"
# Exclude recording type. X=no exclusion, N=exclude normal recordings
ExcType="N"
# Number of attempts to download each mp4 file (minimum is 1)
times="2"
# Dashcam IP address
ipaddress=192.168.100.17
# Remove files older than this many days (and the empty folders left behind)
certain=60
# Time to clean up old files and empty folders, e.g. 03-18 or 11-00 or 23-30
cleanuptime=11-00

###### Use to locate IP address if you cannot set it to be static in the dashcam or DHCP, otherwise comment out whole section ######
# macaddress="00:25:42:XX:XX:XX"
# ipaddress=$(arp -n | grep -i $macaddress | awk '{print $1}')
##### in case ipaddress is still empty
# if [[ -z $ipaddress ]]; then
#     # Ping all addresses between 192.168.1.100 and 192.168.1.200 to make sure we have a complete arp table
#     for i in {100..200}; do (ping 192.168.1.$i -c 1 -w 5 &> /dev/null &); done
#     # Allow some time to get the ping responses
#     sleep 5s
#     # Try again to locate IP address in arp table
#     ipaddress=$(arp -n | grep -i $macaddress | awk '{print $1}')
# fi

# lock dirs/files
LOCKDIR="/tmp/"$car"script-lock"
PIDFILE="${LOCKDIR}/PID"

# exit codes and text
ENO_SUCCESS=0;  ETXT[0]="ENO_SUCCESS"
ENO_GENERAL=1;  ETXT[1]="ENO_GENERAL"
ENO_LOCKFAIL=2; ETXT[2]="ENO_LOCKFAIL"
ENO_RECVSIG=3;  ETXT[3]="ENO_RECVSIG"

###
### start locking attempt
###
trap 'ECODE=$?; echo "["$car"script] Exit: ${ETXT[ECODE]}($ECODE)" >&2' 0
echo -n "["$car"script] Locking: " >&2

if mkdir "${LOCKDIR}" &>/dev/null; then
    # lock succeeded, install signal handlers before storing the PID just in case
    # storing the PID fails
    trap 'ECODE=$?;
          echo "["$car"script] Removing lock. Exit: ${ETXT[ECODE]}($ECODE)" >&2
          rm -rf "${LOCKDIR}"' 0
    echo "$$" >"${PIDFILE}"
    # the following handler will exit the script upon receiving these signals
    # the trap on "0" (EXIT) from above will be triggered by this trap's "exit" command!
    trap 'echo "["$car"script] Killed by a signal." >&2
          exit ${ENO_RECVSIG}' 1 2 3 15
    echo "success, installed signal handlers"
else
    # lock failed, check if the other PID is alive
    OTHERPID="$(cat "${PIDFILE}")"
    # if cat isn't able to read the file, another instance is probably
    # about to remove the lock -- exit, we're *still* locked
    # Thanks to Grzegorz Wierzowiecki for pointing out this race condition on
    # http://wiki.grzegorz.wierzowiecki.pl/code:mutex-in-bash
    if [ $? != 0 ]; then
        echo "lock failed, PID ${OTHERPID} is active" >&2
        exit ${ENO_LOCKFAIL}
    fi
    if ! kill -0 $OTHERPID &>/dev/null; then
        # lock is stale, remove it and restart
        echo "removing stale lock of nonexistent PID ${OTHERPID}" >&2
        rm -rf "${LOCKDIR}"
        echo "["$car"script] restarting myself" >&2
        exec "${scriptpath}$0" "$@"
    else
        # lock is valid and OTHERPID is active - exit, we're locked!
        echo "lock failed, PID ${OTHERPID} is active" >&2
        exit ${ENO_LOCKFAIL}
    fi
fi

# at cleanuptime every day, when the script is not busy downloading, delete dashcam files
# over a certain number of days old, and empty directories, to prevent overfilling the disk
if [[ "$(date +"%H-%M")" = "${cleanuptime}" ]]; then
    echo $(date) Started removing files older than $certain days and empty folders under $car.....>>${scriptpath}oldfilecleanup.log
    logfile=$(find "$rootpath" -mtime +${certain} -delete -print)
    echo $(date) Removed old files:>>${scriptpath}oldfilecleanup.log
    echo ${logfile} | tr ' ' '\n' >>${scriptpath}oldfilecleanup.log
    logdir=$(find "$rootpath" -type d -empty -delete -print)
    echo $(date) Removed empty directories:>>${scriptpath}oldfilecleanup.log
    echo $logdir | tr ' ' '\n' >>${scriptpath}oldfilecleanup.log
    echo $(date) Finished removing old files and empty folders under $car.....>>${scriptpath}oldfilecleanup.log
fi

if [[ ! -z $ipaddress ]]; then
    count=$(ping -c4 $ipaddress | grep 'received' | awk -F',' '{ print $2}' | awk '{ print $1}')
    if [ $count -eq 4 ]; then
        echo "BlackVue-"$car" is up at $(date) ($ipaddress)"
        if wget "http://$ipaddress/blackvue_vod.cgi" --output-document=${rootpath}list.txt --no-use-server-timestamps --no-verbose --timeout 60 --tries 1; then
            # sort the file, get the oldest files first
            sort ${rootpath}list.txt -o ${rootpath}sortlist.txt
            # speedup : remove outdated filenames from "Downloaded mp4 file list.txt"
            touch ${rootpath}"Downloaded mp4 file list.txt"
            sort ${rootpath}"Downloaded mp4 file list.txt" -o ${rootpath}"Downloaded mp4 file list.txt"
            line=$(head -n 1 ${rootpath}sortlist.txt)
            path=$(echo $line | cut -d':' -f 2)
            path=$(echo $path | cut -d',' -f 1)
            file=$(echo $path | cut -d'/' -f 3)
            if grep -Fq $file ${rootpath}"Downloaded mp4 file list.txt"; then
                sed -i -n -E -e "/${file}/,$ p" ${rootpath}"Downloaded mp4 file list.txt"
            fi
            # speedup : cleanse sortlist.txt by removing filenames of already downloaded or excluded mp4 files
            while read line
            do
                if [[ $line = *Record* ]]; then
                    # extract the different file parts from the line
                    path=$(echo $line | cut -d':' -f 2)
                    path=$(echo $path | cut -d',' -f 1)
                    file=$(echo $path | cut -d'/' -f 3)
                    if grep -Fq $file ${rootpath}"Downloaded mp4 file list.txt"; then
                        grep -v "${file}" ${rootpath}sortlist.txt > ${rootpath}"1.txt"
                        mv ${rootpath}"1.txt" ${rootpath}sortlist.txt
                    fi
                fi
            done < ${rootpath}sortlist.txt
            while read line
            do
                # check that the dashcam is still online
                countA=$(ping -c2 $ipaddress | grep 'received' | awk -F',' '{ print $2}' | awk '{ print $1}')
                if [ $countA -ne 2 ]; then
                    echo "BlackVue-"$car" is no longer online"
                    exit
                fi
                # check if valid line
                if [[ $line = *Record* ]]; then
                    # extract the different file parts from the line
                    path=$(echo $line | cut -d':' -f 2)
                    path=$(echo $path | cut -d',' -f 1)
                    file=$(echo $path | cut -d'/' -f 3)
                    sdir=$(echo $file | cut -d'_' -f 1)
                    # echo "$sdir" - "$file" - "$path"
                    # check if directory exists. If not, create it.
                    if [ ! -d "$rootpath$sdir" ]; then
                        mkdir "$rootpath$sdir"
                    fi
                    # check if file exists. If not, copy from dashcam.
                    if [ ! -f "$rootpath$sdir/$file" ]; then
                        # If front camera file, try to download gps and 3gf files first
                        if [[ $path = *F.mp4* ]]; then
                            if [ ! -f "$rootpath$sdir/${file/F.mp4/.gps}" ]; then
                                if ! wget "http://$ipaddress${path/F.mp4/.gps}" --directory-prefix=$rootpath$sdir --no-use-server-timestamps --no-verbose --timeout 60 --tries 1; then
                                    echo Transfer of "${file/F.mp4/.gps}" failed...
                                    if [ -f "$rootpath$sdir/${file/F.mp4/.gps}" ]; then
                                        # remove bogus file
                                        rm -f "$rootpath$sdir/${file/F.mp4/.gps}"
                                    fi
                                else
                                    # add to the route file
                                    cat "$rootpath$sdir/${file/F.mp4/.gps}" | awk -F ']' '{ print $2 }' | egrep -v '^$' >> "$rootpath$sdir/route.log"
                                fi
                            fi
                            if [ ! -f "$rootpath$sdir/${file/F.mp4/.3gf}" ]; then
                                if ! wget "http://$ipaddress${path/F.mp4/.3gf}" --directory-prefix=$rootpath$sdir --no-use-server-timestamps --no-verbose --timeout 60 --tries 1; then
                                    echo Transfer of "${file/F.mp4/.3gf}" failed...
                                    if [ -f "$rootpath$sdir/${file/F.mp4/.3gf}" ]; then
                                        # remove bogus file
                                        rm -f "$rootpath$sdir/${file/F.mp4/.3gf}"
                                    fi
                                fi
                            fi
                        fi
                        if [[ ! $path = *${ExcType}*.mp4* ]]; then
                            k=1
                            while [ $k -le $times ]
                            do
                                ((k++))
                                if ! wget "http://$ipaddress$path" --directory-prefix=$rootpath$sdir --no-use-server-timestamps --no-verbose --timeout 60 --tries 1; then
                                    echo $(date) Transfer of "$file" failed.....>>"$rootpath$sdir/error.log"
                                    i=$(($times-$k+1))
                                    echo Transfer of "$file" failed. Trying download again $i more time"("s")".
                                    if [ -f "$rootpath$sdir/$file" ]; then
                                        # remove bogus file
                                        rm -f "$rootpath$sdir/$file"
                                    fi
                                else
                                    # check integrity of downloaded mp4
                                    errors=$(ffmpeg -v error -i "/$rootpath$sdir/$file" null $1 2>&1)
                                    if [[ ! $errors = *null* ]] || [[ $errors = *channel* ]]; then
                                        echo $(date) $errors .....>>"$rootpath$sdir/error.log"
                                        i=$(($times-$k+1))
                                        echo "$rootpath$sdir/$file" corrupted. Trying download again $i more time"("s")".
                                        rm -f "$rootpath$sdir/$file"
                                    else
                                        echo Download of "$rootpath$sdir/$file" successful!
                                        echo $file >>${rootpath}"Downloaded mp4 file list.txt"
                                        break
                                    fi
                                fi
                            done
                        else
                            # speedup : these files are deliberately excluded, so they go straight to the downloaded list
                            echo $file >>${rootpath}"Downloaded mp4 file list.txt"
                        fi
                    else
                        # speedup : mainly useful at initiation; after that, an existing file not yet in the list is most likely corrupt
                        errors=$(ffmpeg -v error -i "/$rootpath$sdir/$file" null $1 2>&1)
                        if [[ ! $errors = *null* ]] || [[ $errors = *channel* ]]; then
                            echo $(date) DELETED because $errors .....>>"$rootpath$sdir/error.log"
                            echo "$rootpath$sdir/$file" corrupted and hence deleted
                            rm -f "$rootpath$sdir/$file"
                        else
                            echo $file >>${rootpath}"Downloaded mp4 file list.txt"
                        fi
                    fi
                fi
            done < ${rootpath}sortlist.txt
            echo "Completed at $(date)"
        fi
    else
        echo "BlackVue-"$car" is down at $(date) ($ipaddress not responding)"
    fi
else
    echo "BlackVue-"$car" is down at $(date) (not found in arp table)"
fi
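One editorial note on the integrity check above: the script passes null $1 as ffmpeg's output, which only works by accident of the string matching. A more conventional form of the same probe (my sketch, not from the post) decodes to the null muxer and treats any error output as corruption:

```shell
# Hedged sketch of an mp4 integrity check: decode everything, write nowhere,
# and treat any ffmpeg error output as a sign of corruption.
check_mp4() {
    errs=$(ffmpeg -v error -i "$1" -f null - 2>&1)
    [ -z "$errs" ]   # success only when ffmpeg reported no decode errors
}
```

Something like check_mp4 "$rootpath$sdir/$file" || rm -f "$rootpath$sdir/$file" would then slot into the retry loop.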