I posted in the 'What did you do to your M2 today' thread about the fact that I use a script to automatically download the files from my dashcam whenever I'm home. I got a request from dcmac to post more details, so here it is... in a more appropriate sub-section.
I should point out from the outset that this script is aimed at BlackVue cameras. It could potentially be modified for other brands, but of course, the device would need to be on your network, and there would need to be some way of getting a list of files, and then pulling them off. In the BlackVue cameras this all happens over HTTP, so it's pretty straightforward.
So far this script is working perfectly, but I am finding that it takes a very long time to download the files, simply because there are so many of them and they're large. A single day of data, without driving particularly far, is in excess of 40GB! I could probably reduce this by turning down the sensitivity in parking mode, but I would rather know if something has happened to my car, and the low cost of memory makes this a bit of a non-issue anyway.
So the script itself is as follows:
Code:
#!/bin/bash
echo
###### Set location where you want to store dashcam data, including a trailing slash. Sub-directories will be created for each day ######
rootpath=[Your data storage location]
###### If the IP address can be set static in the dashcam or DHCP server, set the IP address here, otherwise comment out this line ######
ipaddress=[Dashcam IP Address]
###### Use to locate the IP address if you cannot set it to be static in the dashcam or DHCP, otherwise comment out the whole section ######
# macaddress="00:25:42:XX:XX:XX"
# ipaddress=$(arp -n | grep -i $macaddress | awk '{print $1}')
# if [[ -z $ipaddress ]]; then
#     # Ping all addresses between 192.168.1.100 and 192.168.1.200 to make sure we have a complete arp table
#     for i in {100..200}; do (ping 192.168.1.$i -c 1 -w 5 &> /dev/null &); done
#     # Allow some time to get the ping responses
#     sleep 5
#     # Try again to locate the IP address in the arp table
#     ipaddress=$(arp -n | grep -i $macaddress | awk '{print $1}')
# fi
if [[ -n $ipaddress ]]; then
    count=$(ping -c4 "$ipaddress" | grep 'received' | awk -F',' '{ print $2}' | awk '{ print $1}')
    if [ "${count:-0}" -eq 4 ]; then
        echo "BlackVue is up at $(date) ($ipaddress)"
        echo "Getting file list"
        if curl -# "http://$ipaddress/blackvue_vod.cgi" -o "${rootpath}list.txt" --no-verbose; then
            # Sort the file list so the oldest files are downloaded first
            sort "${rootpath}list.txt" -o "${rootpath}sortlist.txt"
            while read -r line; do
                # Check this is a valid recording line
                if [[ $line = *Record* ]]; then
                    # Extract the path, filename and date directory from the line read
                    path=$(echo "$line" | cut -d':' -f 2)
                    path=$(echo "$path" | cut -d',' -f 1)
                    file=$(echo "$path" | cut -d'/' -f 3)
                    sdir=$(echo "$file" | cut -d'_' -f 1)
                    # echo "$sdir" - "$file" - "$path"
                    # Check if the directory exists. If not, create it.
                    if [ ! -d "$rootpath$sdir" ]; then
                        mkdir "$rootpath$sdir"
                    fi
                    # Check if the file exists. If not, copy it from the dashcam.
                    if [ ! -f "$rootpath$sdir/$file" ]; then
                        # If it's a front camera file, try to download the gps and 3gf files first
                        if [[ $path = *F.mp4* ]]; then
                            echo "Downloading ${path/F.mp4/.gps}"
                            (cd "$rootpath$sdir"; curl -# "http://$ipaddress${path/F.mp4/.gps}" -O)
                            echo "Downloading ${path/F.mp4/.3gf}"
                            (cd "$rootpath$sdir"; curl -# "http://$ipaddress${path/F.mp4/.3gf}" -O)
                        fi
                        if ! (echo "Downloading $path"; cd "$rootpath$sdir"; curl -# "http://$ipaddress$path" -O); then
                            echo "Transfer of $file failed..."
                            if [ -f "$rootpath$sdir/$file" ]; then
                                # Remove the partial file so it's retried next run
                                rm -f "$rootpath$sdir/$file"
                            fi
                        fi
                    fi
                fi
            done < "${rootpath}sortlist.txt"
            echo "Completed at $(date)"
        fi
    else
        echo "BlackVue is down at $(date) ($ipaddress not responding)"
    fi
fi
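To give an idea of what the parsing in the loop is doing: each line of the camera's file list contains a recording path, and the script slices it up with cut to get the download path, the filename, and the date (which becomes the sub-directory name). A quick sketch, using a made-up example line in the format the script expects (the filename and size are hypothetical):

```shell
# Hypothetical line as returned by blackvue_vod.cgi
line="n:/Record/20240101_120000_NF.mp4,s:1000000"

# Same cut chain as in the script
path=$(echo "$line" | cut -d':' -f 2 | cut -d',' -f 1)
file=$(echo "$path" | cut -d'/' -f 3)
sdir=$(echo "$file" | cut -d'_' -f 1)

echo "$path"   # → /Record/20240101_120000_NF.mp4
echo "$file"   # → 20240101_120000_NF.mp4
echo "$sdir"   # → 20240101
```

So a file recorded on 1 January 2024 ends up in a `20240101` sub-directory under the storage location.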
I actually found this script, I think on a Tesla forum or maybe a dashcam forum. I've tried to track down where I found it so I could credit the source, but I haven't been able to. Having said that, I did modify it quite heavily to work on a Mac: Macs don't have wget natively, and rather than install wget, I modified the script to work with curl.
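For anyone adapting it back the other way, the wget-to-curl substitution is largely mechanical: curl's `-O` saves the file under its remote name like a bare wget does, and `-#` gives a progress bar. A small illustration (the URL is a made-up example):

```shell
# Hypothetical download URL
url="http://192.168.1.50/Record/20240101_120000_NF.mp4"

# Original style:       wget "$url"
# curl equivalent used: curl -# -O "$url"
# Both save to the remote filename, i.e. the last path component:
outfile=$(basename "$url")
echo "$outfile"   # → 20240101_120000_NF.mp4
```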
I have a launch daemon running this once every hour. The script only copies files that haven't already been copied, so once the initial transfer is finished, the only thing happening every hour is a check for new files.
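The launchd side is just a small plist with a StartInterval. A minimal sketch of what mine looks like (the label and script path here are placeholders, so adjust them to your setup); save it under ~/Library/LaunchAgents and load it with `launchctl load`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.blackvue-sync</string>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>/Users/you/Scripts/blackvue-sync.sh</string>
    </array>
    <!-- Run once every hour (3600 seconds) -->
    <key>StartInterval</key>
    <integer>3600</integer>
</dict>
</plist>
```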
I hope this is useful to some of you.