Month: January 2025

  • Continuously monitor your Servers and Services

    To monitor your servers and services, there is nothing better than phpServerMonitor running in a container on your Synology.

    The phpServerMonitor container from scavin will do the trick (it's possibly not the most up to date, but it works fine). You will only need a MySQL database hosted on your Synology.

    phpServerMonitor looks like this:

    First, install “MariaDB 10” (the default alternative to MySQL commonly used on Synology)

    Then, open MariaDB 10, enable a port (e.g. the default 3306) and apply this change
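
    If you want to double-check that the port is really open before going further, a quick TCP test from another machine on your network will do. Here is a minimal Python sketch; the IP address and port are placeholders for your own values:

    ```python
    import socket

    # Placeholders: replace with your Synology's IP and the port enabled in MariaDB 10.
    NAS_IP = "192.168.1.10"
    MARIADB_PORT = 3306

    # Try to open a TCP connection to verify the port is reachable from the LAN.
    try:
        with socket.create_connection((NAS_IP, MARIADB_PORT), timeout=5):
            print(f"MariaDB is reachable on {NAS_IP}:{MARIADB_PORT}")
    except OSError as exc:
        print(f"Cannot reach {NAS_IP}:{MARIADB_PORT}: {exc}")
    ```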

    If it's not yet done, install phpMyAdmin (not covered here)

    Launch it with your administrator account (select MariaDB 10 as the server)

    Create a DB named, for example, “servermon”:

    On that new DB, add a new user account:

    Type a user name, for example "servermon", select a strong password (take note of it) and tick the "Check all" option under Global privileges for the moment. Once phpServerMonitor is installed and configured, you will keep only the "Data" access rights!
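
    If you prefer doing this from a script rather than through phpMyAdmin, here is a rough equivalent using Python and the PyMySQL package. It is only a sketch: the host, the root password and the new user's password are assumptions to adapt to your setup.

    ```python
    import pymysql  # pip install pymysql

    # Assumed values: adapt the host, the root password and the new user's password.
    conn = pymysql.connect(host="192.168.1.10", port=3306,
                           user="root", password="your-root-password")
    with conn.cursor() as cur:
        # Same result as the phpMyAdmin steps above: a database and a dedicated user.
        cur.execute("CREATE DATABASE IF NOT EXISTS servermon")
        cur.execute("CREATE USER IF NOT EXISTS 'servermon'@'%' IDENTIFIED BY 'a-strong-password'")
        # Temporary "Check all" equivalent: global privileges, to be reduced later on.
        cur.execute("GRANT ALL PRIVILEGES ON *.* TO 'servermon'@'%'")
        cur.execute("FLUSH PRIVILEGES")
    conn.commit()
    conn.close()
    ```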

    Launch the "Container Manager" on your Synology and, in the Registry, search for scavin/phpservermonitor and download that image.

    Once downloaded, configure and run it. The only configuration required is the following (a scripted equivalent is sketched after the list):

    • mapping a port of the Synology (e.g. 4080) onto the container's port 80 (443 won't work), and
    • removing the environment variable PHP_MD5!!!
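
    For those who prefer scripting to clicking through the Container Manager, here is roughly the same setup expressed with the Docker SDK for Python. This is only a sketch: the container name and the host port 4080 are examples, and it assumes the Docker daemon on the NAS is reachable from where the script runs.

    ```python
    import docker  # pip install docker

    client = docker.from_env()

    # Run scavin/phpservermonitor, mapping host port 4080 onto the container's port 80.
    # No PHP_MD5 environment variable is passed, which matches the second bullet above.
    container = client.containers.run(
        "scavin/phpservermonitor",
        name="phpservermonitor",
        detach=True,
        ports={"80/tcp": 4080},
        restart_policy={"Name": "unless-stopped"},
    )
    print(container.name, container.status)
    ```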

    Now, navigate to http://<Your Synology Address>:<Port mapped> and click “Let’s GO”

    Complete the required info in the configuration screen:

    • the Application base URL should be filled in automatically with http://<Your Synology IP>:<Port mapped>
    • type the IP of your Synology as the "database host" and the port enabled in MariaDB 10 as the "database port"
    • enter the name of the database created previously, as well as the name and the password of the user added to that DB.

    Click on “Save Configuration” and if “MariaDB” was accessible, you should see this:

    Click once more on “Save Configuration” and here you are:

    Click "Go to your monitor" to add servers and services. More info on the official website of phpServerMonitor 😉

    Haaa, don't forget to remove the "Structure" and "Administration" rights for the user "servermon" on the DB "servermon", as well as all access rights at the global level (if any):
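
    For the record, the SQL behind that cleanup looks roughly like this, again sketched with PyMySQL and assumed credentials: drop the global privileges and keep only the data-level rights on the "servermon" database.

    ```python
    import pymysql  # pip install pymysql

    # Assumed values: same host, port and root password as in the earlier sketch.
    conn = pymysql.connect(host="192.168.1.10", port=3306,
                           user="root", password="your-root-password")
    with conn.cursor() as cur:
        # Drop the temporary global privileges...
        cur.execute("REVOKE ALL PRIVILEGES, GRANT OPTION FROM 'servermon'@'%'")
        # ...and keep only the "Data" rights on the servermon database.
        cur.execute("GRANT SELECT, INSERT, UPDATE, DELETE ON servermon.* TO 'servermon'@'%'")
        cur.execute("FLUSH PRIVILEGES")
    conn.commit()
    conn.close()
    ```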

    Et voilà.

  • Continuously track your Internet Speed

    To monitor your internet connection and confirm that it's stable, day and night, there is nothing better than a Speedtest Tracker running in a container on your Synology.

    The Speedtest Tracker of Henry Whitaker will do the trick. It's based on Ookla's Speedtest CLI, whose results it records on a recurring basis.

    It looks like this:

    It's quite easy to install and run as a container on DSM 7.x. First, install the "Container Manager":

    Then, register the image of henrywhitaker3/speedtest-tracker:

    Next, create a new container with that image and enable auto-start (here, this container won't be exposed via the Web Station)

    Finally, in the next tab:

    • Map a port of your Synology (e.g. 7080) onto port 80 of the container (and/or possibly 443).
    • Map a shared folder of your Synology (e.g. /docker/Ookla) onto the folder /config of the container.

    Scrolling down in that tab, add an environment variable “OOKLA_EULA_GDPR” with the value “true”.
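
    If you'd rather script this than click through the Container Manager, here is roughly the same configuration with the Docker SDK for Python. It's a sketch only: the host port 7080, the container name and the absolute path /volume1/docker/Ookla (the usual location of the shared folder /docker/Ookla on a default volume) are assumptions.

    ```python
    import docker  # pip install docker

    client = docker.from_env()

    # Run henrywhitaker3/speedtest-tracker with the settings described above:
    # host port 7080 -> container port 80, /docker/Ookla -> /config, plus the GDPR variable.
    container = client.containers.run(
        "henrywhitaker3/speedtest-tracker",
        name="speedtest-tracker",
        detach=True,
        ports={"80/tcp": 7080},
        volumes={"/volume1/docker/Ookla": {"bind": "/config", "mode": "rw"}},
        environment={"OOKLA_EULA_GDPR": "true"},
        restart_policy={"Name": "unless-stopped"},
    )
    print(container.name, container.status)
    ```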

    If you don't set that variable, the container will fail to run (it will stop automatically after one or two minutes) and you will find the following message in the log of the container:

    Once the configuration is done, you can run the container. Wait a few minutes before opening the Speedtest Tracker. Once the container is up, you should see this in the logs of the container:

    You can now open the Speedtest Tracker (using http(s)://<your NAS IP>:<port configured>) and configure the scheduler to run a test on a regular basis. For that purpose, go to the "Settings" menu. Below, the pattern * * * * * is used to run a test every minute (as per this site).

    The default speedtest server in use does not support such a high pace. It will soon stop working and you will see these messages in the logs:

    The following schedule can be used to run a test every 10 minutes: */10 * * * *
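
    If you want to double-check what a cron pattern actually does before saving it, the third-party croniter package can preview the next run times. A small sketch (croniter is not used by the Speedtest Tracker itself, it's only for checking the expression):

    ```python
    from datetime import datetime

    from croniter import croniter  # pip install croniter

    # Preview the next few runs of the "every 10 minutes" schedule.
    schedule = croniter("*/10 * * * *", datetime.now())
    for _ in range(3):
        print(schedule.get_next(datetime))
    ```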

    You can also look for other speedtest servers using this URL: https://www.speedtest.net/api/js/servers?engine=js&https_functional=true&limit=100&search=YOURLOCATION

    Use the ID of a server found via that URL in the Settings of the Speedtest Tracker…
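
    To quickly list nearby server IDs without opening that URL in a browser, here is a short Python sketch using the requests package. The location "Brussels" is only an example and the exact field names in the response are assumptions; adjust them if needed.

    ```python
    import requests  # pip install requests

    # Query the public Speedtest server list; "Brussels" is just an example location.
    url = "https://www.speedtest.net/api/js/servers"
    params = {"engine": "js", "https_functional": "true", "limit": 100, "search": "Brussels"}

    for server in requests.get(url, params=params, timeout=10).json():
        # Each entry is expected to contain (among other fields) an id, a sponsor and a name.
        print(server.get("id"), server.get("sponsor"), server.get("name"))
    ```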

    There are many more settings and options to configure the container, but this minimal setup runs fine… And you will soon see whether your internet connection is stable or not:

    Et voilà.

  • Download your Google Photos and Videos using JDownloader

    I have 1 TB of videos and photos stored on Google Photos. As the only export option nowadays is Takeout, as explained here, it's a real pain in the ass to download all of them…

    Fortunately, one option of Takeout is to get a copy in your Google Drive. From there, you can easily download all your exported data using the free JDownloader, as it supports "Google Drive" 🙂

    First, take note that the export from Google Photos to Google Drive will consume space. If you have 1 TB of photos and videos, you need 1 TB free on your Google Drive account!!! This is THE big limitation of this solution…

    But assuming that you have enough free space, then GO: open Takeout and create a new export that includes Google Photos (or anything else you want to export):

    Click on "Next step", where you will be able to choose Google Drive as the destination. I would also suggest selecting 50 GB as the file size.

    Now, you have to wait until the export is ready. You will receive an email

    Once notified that your export is ready, go and check your Google Drive (do this with Chrome). You should find a folder "Takeout" that contains all the zip files exported from Google Photos:

    Install the Chrome extension named "EditThisCookie" (available here on the Chrome Web Store) and export the cookie of Google Drive (it will be copied to the clipboard).

    Now, open JDownloader, go to Settings > Account Manager and add the hoster "drive.google.com", pasting the cookie exported to the clipboard (the name is not important).

    Once this new Hoster saved, you should now see this:

    To prevent this cookie from expiring, clear all cookies in Chrome (Ctrl-Shift-Del). If you access Google Drive later, it will send you a new cookie but won't expire the previous one (now deleted).

    Now, keeping JDownloader open (to grab the links when you copy them into your clipboard), go to Google Drive and copy the links one by one, using the menu (three vertical dots) > Share > Copy link.

    Each time you copy a link, it should be captured by JDownloader (it's an option enabled by default) and you can start downloading the file.

    If you get an error on the downloads, go back to the "Account Manager". The cookie has possibly expired (right-click > Refresh to be sure). You have to redo the operation and then use "Force Download Start" on the files to resume the downloads…

    It’s far from perfect, but “voilà”.
