
    Backing Up Video to the Cloud for Motioneye

    Background

    One of the first projects I tackled when I got my first Raspberry Pi was building a security camera for my house. After my Raspberry Pi Zero W Camera Pack arrived from Adafruit, I hooked up the Raspberry Pi Zero W with the included camera module and installed motioneyeos on the device. After logging on to the motioneye web interface in my browser, my new security camera was ready to record videos and photos based on motion detection. One of the major benefits of using a Raspberry Pi with motioneye installed is that I had a fully automated WiFi security camera that could upload captured media to major cloud storage providers (e.g., Google Drive) without any monthly fees!

    Although I liked using motioneye, I did not particularly like motioneyeos, because the distribution did not allow installing third-party programs via the apt-get command. Motioneyeos was designed as a single-purpose distribution focused on video surveillance and on self-updating with minimal user intervention. It satisfies many people’s needs, including those who may not be familiar with the Linux operating system. But as I became more accustomed to the Linux command line, I wanted to look beyond what motioneyeos has in store.

    Motioneye under Raspbian

    I took the SD card out, downloaded and installed Raspbian (now known as Raspberry Pi OS) on my Raspberry Pi 3 B+, and hooked the camera module up to the more powerful device. I chose the Raspberry Pi 3 B+ over my Raspberry Pi Zero because I found the Pi Zero too slow to capture video at high resolutions, and I chose Raspbian over motioneyeos because I wanted to use Rclone to back up my video and photo files to my Google Drive.

    Long term storage on Google Drive with a small SD Card

    Even with a relatively large SD card (128 GB), I can easily accumulate enough motion-activated video footage to fill the card within three to four days. My Google Drive account, on the other hand, has enough storage for far more video than my SD card can hold. How could I set up an automated system where motioneye keeps videos locally for only a few days, while the footage is backed up for longer-term storage, say up to three weeks?

    I used a combination of motioneye, rclone, and a cron job to automate the whole process.

    Motioneye

    After installing Raspbian, I installed motioneye by following these steps. After logging into motioneye, I opened the settings tab and toggled open both the “Still Images” and “Movies” sections. Under “Preserve Pictures” and “Preserve Movies”, I selected “For One Day.” Motioneye will now keep only one day’s worth of media; we will use rclone to build the long-term storage on our Google Drive.

    rclone for interfacing with Google Drive

    Rclone is “a command line program to manage files on cloud storage” (rclone.org). It lets users interact with various cloud storage providers almost as if they were attached storage drives. Since rclone supports Google Drive, I followed the instructions on rclone’s Google Drive page to set up an rclone remote interfaced with my Google Drive, which I named GoogleDrive:.
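    For reference, after finishing the interactive rclone config walkthrough, the resulting configuration file (typically ~/.config/rclone/rclone.conf) gains a section for the new remote along these lines; the token is redacted here, and the exact keys depend on the choices made during setup:

```ini
[GoogleDrive]
type = drive
scope = drive
token = {"access_token":"<redacted>","token_type":"Bearer","refresh_token":"<redacted>","expiry":"<redacted>"}
```

    Running rclone listremotes at this point should print GoogleDrive:, confirming the remote is registered.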

    Now, I want to back up all the media files captured by motioneye into a folder within my Google Drive. So, I used the following command to create a folder named “motioneye” at the root of my Google Drive.

    rclone mkdir GoogleDrive:motioneye

    Granted, I could have created the same folder in Google Drive’s web interface, but I wanted to familiarize myself with the features rclone offers for managing my cloud storage. To check whether the folder was created, I used rclone’s lsd command, which lists directories,

    rclone lsd GoogleDrive:

    where, among other folders, rclone will list our newly created folder “motioneye.”

    Placing all the commands together on a bash script

    Using my favorite Linux text editor (GNU nano), I wrote the following commands, in order, into a bash script named “rcloneBackup.sh”.

    killall rclone;

    This command kills any previously running rclone instance. Sometimes, due to network issues, rclone may run for a long time, and we may inadvertently launch another rclone instance, which would slow the backup process even further. This way, we make sure only one rclone instance will be running moving forward.
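    As an aside, killall is a blunt instrument: it kills every rclone process, including any unrelated transfer. If you would rather prevent overlapping runs than kill them, flock (installed by default on Raspbian) can guard the script with a lock file. A minimal sketch, with a lock file path of my own choosing:

```shell
#!/bin/bash
# Take an exclusive, non-blocking lock on file descriptor 9; if another
# run of the script already holds it, exit instead of piling up.
LOCKFILE=/tmp/rcloneBackup.lock
(
  flock -n 9 || { echo "another backup is still running"; exit 1; }
  echo "lock acquired"
  # The rclone commands from the script would go here.
) 9>"$LOCKFILE"
```

    With this guard at the top of the script, a second invocation exits immediately instead of killing the first.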

    rclone delete GoogleDrive:motioneye --min-age 21d;

    Via rclone, we can selectively delete files in our remote drive’s motioneye folder based on each file’s age. The --min-age option specifies the minimum age a file must reach before rclone deletes it. I set it to 21 days as a personal preference; you can adjust this parameter to your own liking.
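    If you want to see how an age-based purge behaves before pointing it at real backups, the same retention rule can be sketched locally with find, whose -mtime test plays the role of rclone’s --min-age filter here (the folder and file names are made up for illustration):

```shell
# Build a throwaway folder with one "old" and one "new" file.
DIR=$(mktemp -d)
touch -d "30 days ago" "$DIR/old.mp4"   # pretend this was recorded a month ago
touch "$DIR/new.mp4"                    # recorded just now

# Delete anything older than 21 days, analogous to
# `rclone delete GoogleDrive:motioneye --min-age 21d`.
find "$DIR" -type f -mtime +21 -delete
ls "$DIR"
```

    Only new.mp4 should survive the purge.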

    rclone rmdirs GoogleDrive:motioneye --min-age 22d --leave-root;

    Where the previous delete command removes files, the rmdirs command removes empty folders. If the file deletion ran successfully, we would be left with empty folders starting after 21 days, so I set the --min-age parameter one day later than in the previous command. The --leave-root option prevents rclone from attempting to delete the root folder “motioneye” itself.

    rclone cleanup GoogleDrive:;

    When you delete files on Google Drive via rclone, the deleted files end up in Google Drive’s Trash. To keep the Trash from accumulating deleted files (and eating into the account’s storage quota), I use this command to tell rclone to empty it.

    (sleep 230m && killall -9 rclone) & (rclone copy /var/lib/motioneye/ GoogleDrive:motioneye --exclude "*.thumb" --transfers=1 -vv);

    There are two sets of commands running concurrently. The first half of the line, (sleep 230m && killall -9 rclone), acts as a timer that waits 230 minutes and then terminates our rclone instance. We add this timer to make sure that we do not have rclone instances running on top of each other when this series of commands is launched again at the next scheduled run.

    The second half of the line, (rclone copy /var/lib/motioneye/ GoogleDrive:motioneye --exclude "*.thumb" --transfers=1 -vv);, copies all the recorded media from motioneye’s default folder path of /var/lib/motioneye/ to our remote drive’s folder. I used the --exclude parameter to skip *.thumb files, because those are merely thumbnails of the media files and are not needed in cloud storage. I used the --transfers parameter to limit transfers to one file at a time, because I found that concurrent transfers of exceedingly large files tended not to finish within our 230-minute window. Finally, I added the -vv parameter to make rclone log its progress verbosely to the console.
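    Incidentally, the same watchdog effect is available from the coreutils timeout command, which ships with Raspbian: it kills the command after a given duration and exits with status 124 when it has to do so. A quick demonstration, with a short sleep standing in for the long rclone transfer:

```shell
# timeout kills the ten-second "transfer" after two seconds.
timeout 2 sleep 10
echo "exit status: $?"   # 124 means timeout had to kill the command
```

    I kept the killall form in the script, but note its side effect: it stops every rclone process on the machine, so be careful if you use rclone for other jobs.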

    That’s it! Placing all of the commands in sequence together, we have the following series of commands for our bash script “rcloneBackup.sh”.

    killall rclone;
    rclone delete GoogleDrive:motioneye --min-age 21d;
    rclone rmdirs GoogleDrive:motioneye --min-age 22d --leave-root;
    rclone cleanup GoogleDrive:;
    (sleep 230m && killall -9 rclone) & (rclone copy /var/lib/motioneye/ GoogleDrive:motioneye --exclude "*.thumb" --transfers=1 -vv);
    

    I saved the bash script to my home directory, which is usually located at /home/[username]/. Afterward, I tested the script to make sure that all the commands were working.

    bash rcloneBackup.sh
    

    Crontab to regularly run the backup script

    After verifying that our bash script works, we now set up crontab to run it at regular intervals. If you are not familiar with cron, please check out the short guide by the Raspberry Pi Foundation. We can start by opening crontab.

    crontab -e

    If this is your first time running crontab, the system will ask you to pick a text editor for modifying your crontab (also known as the cron table). I usually pick nano, because that is the editor I am most familiar with for editing text files in Linux.

    Next, I navigate to the end of the crontab file to enter our new entry. If you recall, we set a timer of 230 minutes before terminating the rclone instance. The reason for picking 230 minutes was to ensure the upload stops before we run our bash script again every 240 minutes, or 4 hours. (Feel free to modify those time values.) Since I wanted to schedule our bash script to run every 4 hours, I added the following line to the crontab.

    0 */4 * * * bash "/home/[username]/rcloneBackup.sh"

    The first column of a crontab entry represents “minute”; placing 0 there tells cron to run the script when the minute is 0 (e.g., 1:00 AM, 2:00 PM, or 3:00 AM, but not 1:01 AM, 2:02 PM, or 3:03 AM). The second column represents “hour”; writing */4 tells cron to run the script every four hours starting from 12:00 AM. The third column represents the day of the month, the fourth the month, and the fifth the day of the week. I placed a * in each of those three columns because I want the script to run regardless of the day, month, or day of the week.
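    For quick reference, here is the same entry annotated column by column (crontab treats lines beginning with # as comments, so the annotation can live in the file itself):

```
# ┌─────────── minute (0-59)
# │ ┌───────── hour (0-23)
# │ │ ┌─────── day of month (1-31)
# │ │ │ ┌───── month (1-12)
# │ │ │ │ ┌─── day of week (0-7; Sunday is 0 or 7)
# │ │ │ │ │
  0 */4 * * * bash "/home/[username]/rcloneBackup.sh"
```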

    In the second part of the entry, bash "/home/[username]/rcloneBackup.sh", I wrote the bash command for cron to run every four hours. This command runs the backup script we wrote earlier.

    After making sure that there aren’t any errors in our crontab entry, I saved the modified crontab and exited the text editor.

    To make sure that our new crontab entry is loaded by cron, I restarted the cron service.

    sudo service cron restart

    Await results

    That’s it! Every four hours, our Raspberry Pi will upload all of our captured media to our Google Drive and keep it there for three weeks. Meanwhile, motioneye deletes the locally captured media files after one day, so our SD card will not fill up to capacity.

    You can now navigate to your Google Drive motioneye folder to view your captured media using your web browser from anywhere you have an internet connection. I hope this write-up helps you manage your limited SD card storage on your Raspberry Pi while having a much larger archive of captured media files in the cloud!