• Synology HyperBackup Explorer & StorJ S3

    StorJ is a great solution to back up a Synology NAS with HyperBackup, and probably the cheapest (even if, for less than 1TB, I prefer the “native” Synology C2 Storage). But how do you browse your backup and recover only a few files? The solution: mount StorJ’s S3 Bucket on your PC with Mountain Duck (as a Virtual Disk) and use HyperBackup Explorer to browse that “Local Backup”!

    Synology’s HyperBackup Explorer can only browse a local backup or Synology C2 Storage. It means that, when using a third-party S3, you would have to download your full backup locally to be able to retrieve just a few files… (which costs $7/TB with StorJ).

    Indeed, the actual files (images, videos, documents, …) can’t be browsed within a HyperBackup image with a simple S3 explorer (like the free “S3 Browser” or “CyberDuck”). You only find “bucket” parts…

    Fortunately, with a third-party S3 like StorJ (pronounced “Storage”), you can mount the Bucket as a local Virtual Disk on your PC… and then use HyperBackup Explorer. This is great, as you can also easily navigate the timeline (versions) of your files…

    First, as I discovered on DrFrankenstrien’s Tech Stuff, here is how to create a StorJ S3 Bucket (you can start a StorJ account with 25GB free for 30 days):

    As I am in the EU, I go to this SignUp page:

    Confirm your subscription (Attention: the verification email arrived in Gmail’s Spam!!!)

    Now select the “PERSONAL” account

    Enter your “Name”, choose “Backup & Recovery” and click “Continue”

    Click “Start Free Trial”

    And finally “Activate Free Trial”

    Ok, now that you have an account, you have to create a “Project” with a “Bucket” where you will upload “Objects” from your Synology:

    In my case, a default “My Storj Project” has been created automatically. My first step was therefore to click on “Set a Passphrase”.

    I chose to type my own passphrase (it will be requested each time you connect to StorJ’s Dashboard).

    Once the passphrase is entered, the next step is to “Create a Bucket”, here named “demostorj4synology”, with the default retention settings applied during the upload (I will let Hyper Backup manage the versioning).

    You are now nearly ready to upload Objects…

    Now, you have to prepare an API Key and get a Secret Key, which will be used to configure the connection on your Synology.

    Type a “Name” for this access and select the access type “S3 Credential”. In my case, I granted “Full Access” (i.e. HyperBackup will be able to Read/Write/List/Delete).

    Once the “Access” is created, you will have to download, or at least copy, the “Access Key”, the “Secret Key” and the “Endpoint” to a secure place (not to be shared, obviously)!

    Voilà, you are now ready to go to your Synology and configure Hyper Backup. First step: create a new “Backup” of “Folders and Packages”.

    You have to scroll down in the next screen to select “S3 Storage”

    Now:

    • Select “Custom Server URL” as “S3 Server”
    • Copy the “endpoint” you got when creating an “Access Key” as Server address
    • Select v4 as Signature Version
    • Keep Request Style as is
    • Enter your Access Key (Here is mine as I will delete it anyway after this demo)
    • Enter your Secret Key
    • If your Endpoint, Access Key and Secret Key are correct, you should see your Bucket available as the bucket name.
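
    Before configuring HyperBackup, you can double-check the credentials by listing your buckets with the AWS CLI, if you happen to have it installed. A minimal sketch (gateway.storjshare.io is StorJ’s usual S3 gateway; use the Endpoint you received with your Access Key):

    export AWS_ACCESS_KEY_ID="YourAccessKey"
    export AWS_SECRET_ACCESS_KEY="YourSecretKey"
    # List the buckets visible with these credentials; "demostorj4synology" should appear
    aws s3 ls --endpoint-url https://gateway.storjshare.io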

    You can now select the Folders to be included in your backup, as well as Synology Packages (beware that some packages also include all their files, such as Web Station, which includes the shared folder “web”, or Synology Drive Server, which includes the shared folder “homes”).

    Configure your Backup as you wish, regarding schedule and retention:

    Now that the backup is running, it’s time to look at how to mount that backup as a Virtual Disk on your PC, to be able to browse it with HyperBackup Explorer. A free solution exists but is not for beginners: rclone.
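
    For the record, a minimal rclone setup would look roughly like this sketch (assuming rclone is installed, plus WinFsp for the mount on Windows; the remote name “storj” and the drive letter R: are arbitrary choices):

    # In rclone.conf (or create the remote interactively with "rclone config"):
    [storj]
    type = s3
    provider = Storj
    access_key_id = YourAccessKey
    secret_access_key = YourSecretKey
    endpoint = gateway.storjshare.io

    # Mount the bucket read-only as drive R: (read-only is enough for HyperBackup Explorer)
    rclone mount storj:demostorj4synology R: --read-only --vfs-cache-mode full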

    There are a few user-friendly tools available for those not comfortable with command prompts and scripts… The cheapest is “Mountain Duck“. It’s the one I am using. Here is how to configure it. To make it easier, in the “Preferences”, you can enable the “Storj DCS” profile.

    So, now, add a connection

    In the popup window:

    • Select ‘Storj DCS’ (or ‘Amazon S3’ if you didn’t enable that profile) for the connection
    • Enter a Label. Ex.: StorJ
    • In Server, type the Endpoint of StorJ, if not yet displayed (which would be the case if you selected “Amazon S3” as profile)
    • Enter your “Access Key” (uncheck the box Anonymous login)
    • Enter your “Secret Key”
    • Other parameters are just fine, but you can force the Drive Letter if you prefer…

    Just in case you used the “Amazon S3” profile, the labels are different, but the config is the very same:

    Now, you should see a Disk R: mounted on your PC, exposing the Bucket from your StorJ S3.

    Install and run HyperBackup Explorer, then “Browse Local Backup”:

    Pick the .bkpi file to open the backup image and navigate your files

    Et voilà:


  • Continuously monitor your Servers and Services

    In order to monitor your Servers and Services, nothing better than phpServerMonitor running in a container on Synology.

    The phpServerMonitor container of scavin will do the trick (it’s possibly not the most up-to-date, but it works fine). You will only need a MySQL database hosted on your Synology.

    phpServerMonitor looks like this:

    First, install “MariaDB 10” (the default alternative to MySQL commonly used on Synology)

    Then, open MariaDB10, enable a port (Ex: the default 3306) and apply this change

    If it’s not yet done, install phpMyAdmin (not explained here)

    Launch it with your administrator account (Select MariaDB10 as Server)

    Create a DB named, for example, “servermon”:

    On that new DB, add a new user account:

    Type a user name, for example “servermon”, select a strong password (take note of it) and, for the moment, tick the option “Check all” Global privileges. Once phpServerMonitor is installed and configured, you will only keep the “Data” access rights!
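
    If you prefer the command line over phpMyAdmin, here is a sketch of the equivalent (host, port and password are the examples used here; 'servermon'@'%' allows connections from the container):

    mysql -h <Your Synology IP> -P 3306 -u root -p -e "
      CREATE DATABASE servermon;
      CREATE USER 'servermon'@'%' IDENTIFIED BY 'AStrongPassword';
      GRANT ALL PRIVILEGES ON servermon.* TO 'servermon'@'%';
      FLUSH PRIVILEGES;"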

    Launch the “Container Manager” on your Synology and in the Registry, search for scavin/phpservermonitor and download that image.

    Once downloaded, configure and run it. The only config required is:

    • mapping a port (Ex.: 4080) of the Synology onto the container’s port 80 (443 won’t work) and
    • removing the environment variable PHP_MD5!!!
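
    For reference, the command-line equivalent would be something like this sketch (Container Manager lets you remove PHP_MD5 outright; from the CLI, one can only override it with an empty value, which I assume has the same effect):

    docker run -d --name phpservermonitor \
      -p 4080:80 \
      -e PHP_MD5= \
      --restart unless-stopped \
      scavin/phpservermonitor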

    Now, navigate to http://<Your Synology Address>:<Port mapped> and click “Let’s GO”

    Complete the required info in the configuration screen:

    • the Application base url should be completed automatically with: http://<Your Synology IP>:<Port mapped>
    • type the IP of your Synology as “database host” and the port enabled in MariaDB10 as “database port”
    • enter the name of the database created previously, as well as the name and the password of the user added to that DB.

    Click on “Save Configuration” and if “MariaDB” was accessible, you should see this:

    Click once more on “Save Configuration” and here you are:

    “Go to your Monitor” to add Servers and Services. More info on the official website of phpServerMonitor 😉

    Haaa, don’t forget to remove “Structure” and “Administration” rights for the user “servermon” on the DB “servermon” and all access rights at global level (if any):
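
    In SQL terms, the cleanup could look like this sketch (the “Data” rights in phpMyAdmin correspond, as far as I know, to SELECT/INSERT/UPDATE/DELETE):

    mysql -h <Your Synology IP> -P 3306 -u root -p -e "
      REVOKE ALL PRIVILEGES, GRANT OPTION FROM 'servermon'@'%';
      GRANT SELECT, INSERT, UPDATE, DELETE ON servermon.* TO 'servermon'@'%';
      FLUSH PRIVILEGES;"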

    Et voilà.


  • Continuously track your Internet Speed

    In order to monitor your internet connection and confirm that it’s stable, day and night, nothing better than a Speed Test Tracker running in a container on Synology.

    The Speedtest Tracker of Henry Whitaker will do the trick. It’s based on Ookla’s Speedtest CLI, whose results it records on a recurring basis.

    It looks like this:

    It’s quite easy to install and run in a container on DSM 7.x. First, install the “Container Manager”:

    Then, register the image of henrywhitaker3/speedtest-tracker:

    Next, create a new container with that image and enable auto-start (here, this container won’t be exposed via the Web Station).

    Finally, in the next tab:

    • Map a port of your Synology (Ex.: 7080) onto the port 80 of the container (and/or possibly 443).
    • Map a shared folder of your Synology (Ex.: /docker/Ookla) onto the folder /config of the container.

    Scrolling down in that tab, add an environment variable “OOKLA_EULA_GDPR” with the value “true”.
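
    Putting it all together, the command-line equivalent would be roughly this sketch (ports, paths and the container name are just the examples above; I assume the shared folder lives under /volume1):

    docker run -d --name speedtest-tracker \
      -p 7080:80 \
      -v /volume1/docker/Ookla:/config \
      -e OOKLA_EULA_GDPR=true \
      --restart unless-stopped \
      henrywhitaker3/speedtest-tracker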

    If you omit that variable, the container will fail to run (it will stop automatically after one or two minutes) and you will find the following message in the Log of the container:

    Once the configuration is done, you can run the container. Wait a few minutes before opening the Speedtest Tracker. Once the container is up, you should see this in its Logs:

    You can now open the Speedtest Tracker (using http(s)://<your NAS IP>:<port configured>) and configure the scheduler to run a test on a regular basis. For that purpose, go to the “Settings” menu. Here under, the pattern * * * * * is used to run every minute (as per this site).

    The default speedtest server in use does not support such a high pace… It will soon stop and you will see these messages in the Logs:

    The following schedule can be used instead to run every 10 minutes: */10 * * * *

    You can look for other speedtest servers using this URL: https://www.speedtest.net/api/js/servers?engine=js&https_functional=true&limit=100&search=YOURLOCATION

    Use the ID of a server found via that URL in the Settings of the Speedtest Tracker…
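
    From a command line, a quick way to list candidate servers could be this sketch (assuming curl and jq are available, and that the returned entries carry id, sponsor and name fields; “Brussels” is just an example location):

    curl -s "https://www.speedtest.net/api/js/servers?engine=js&https_functional=true&limit=100&search=Brussels" \
      | jq -r '.[] | "\(.id)\t\(.sponsor)\t\(.name)"'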

    There are many more settings and other options to configure the container, but this minimal setup runs fine… And you will soon see whether your internet connection is stable or not:

    Et voilà.


  • Download your Google Photos and Videos using JDownloader

    I have 1TB of videos and photos stored on Google Photos. As the only export option nowadays is Takeout, as explained here, it’s a real pain in the ass to download all of them…

    Fortunately, one option of Takeout is to get a copy in your Google Drive. And then, you can easily download all your exported data using the free JDownloader, as it supports “Google Drive” 🙂

    First, take note that the export from Google Photos to Google Drive will consume space. If you have 1TB of photos and videos, you need 1TB free on your Google Drive account!!! This is THE big limitation of this solution…

    But assuming that you have enough free space, then GO: open Takeout and create a new Export that includes Google Photos (or anything else you want to export):

    Click on Next Step, where you will be able to choose Google Drive as the destination. I would also suggest selecting 50 GB as the File size.

    Now, you have to wait until the export is ready. You will receive an email.

    Once notified that your export is ready, go and check in your Google Drive (do this with Chrome). You should find a folder “Takeout” that contains all the zip files exported from Google Photos:

    Install the Chrome Extension named “EditThisCookie” (available here on the Chrome Web Store) and Export the cookie of Google Drive (it will be copied into the clipboard).

    Now, open JDownloader, go to Settings > Account Manager and Add the Hoster “drive.google.com”, pasting the cookie exported into the clipboard (the name is not important).

    Once this new Hoster is saved, you should see this:

    To prevent this cookie from expiring, clear all cookies in Chrome (CTRL-Shift-Del). If you access Google Drive later, it will send you a new cookie but won’t expire the previous one (now deleted).

    Now, keeping JDownloader open (to grab the links when you copy them into your clipboard), go to Google Drive and copy the links one by one, using the menu (three vertical dots) > Share > Copy Link.

    Each time you copy a link, it should be captured by JDownloader (it’s an option enabled by default) and you can start downloading the file.

    In case you get an error on the downloads, go back to the “Account Manager”. The cookie may have expired (Right Click > Refresh to be sure). You then have to redo the operation and “Force Download Start” on the files to resume the downloads…

    It’s far from perfect, but “voilà”.


  • Green screen within Synology Surveillance’s records

    If you have a green screen when watching records of Synology Surveillance Station on your mobile phone, or tablet, and are using H.264 compression, then simply disable the “H.264 hardware decoding”.

    Do you see something like this on your mobile?

    Open the “DS Cam” App, click on the “three horizontal lines” top-left icon, and then on the gear icon:

    There, disable the “Hardware decoding”:

    Now, the image should be fine:

    Et voilà.


  • Cannot pair Aqara or other Zigbee devices with Homey Pro?

    Pairing an Aqara device (Zigbee) with Homey can sometimes be frustrating. And it’s actually the same with other hubs… Here is my trick: keep the device awake by pressing the button briefly every 5 seconds.


    Usually, to pair an Aqara device (Zigbee) with your hub, you need to press the reset button until it starts flashing (usually 5 or 8 seconds depending on the model). However, often, nothing happens, and your hub will report that the pairing failed, even though you were close to the hub and made sure there were no Bluetooth devices communicating in the nearby area.

    In such cases, after the reset (explained above), press and immediately release the reset button once every 5 seconds. This will keep the device awake and maximize the chances of a successful pairing.

    Notice: Aqara devices are built by Lumi United Technology. Lumi also produces Xiaomi’s devices. So, this trick is valid for most of them (and for most battery-powered Zigbee devices).


  • Use Gmail antispam with an OVH Mail Pro mailbox

    This is probably a dirty trick, but it works. All my emails pass through an intermediary gmail mailbox, where Spams are deleted and the remaining emails are forwarded to my OVH mailbox.


    As partially explained here, I used to have a mailbox “MyUser@MyDomain” in an MX Plan of OVH to store all my emails, and an email address “MyAddress@MyDomain” which was a redirection to a gmail mailbox “MyUser@gmail.com” (people send me emails only on “MyAddress@MyDomain”).

    All emails passed to “MyUser@gmail.com” are filtered (antispam), forwarded into my mailbox “MyUser@MyDomain” and archived at gmail (so a “backup” is kept there).

    My Outlook used to be configured with IMAP to fetch emails from “MyUser@MyDomain” and SMTP to send emails “From: MyAddress@MyDomain”.

    Unfortunately, mailboxes of MX Plan at OVH are limited to 5GB. So, I decided to migrate my mailbox “MyUser@MyDomain” to a “Mail Pro” account at OVH.

    First, while doing so, very important: I had to define the domain as “non-authoritative” in the “Mail Pro” account!!! Otherwise, the “Redirections” defined in the MX Plan won’t work anymore.

    Next, I configured my new Mail Pro account in Outlook (NB: this account, migrated from the MX Plan, still has the same name: “MyUser@MyDomain”). Unfortunately, once this Mail Pro account was configured in Outlook, I couldn’t send emails “From: MyAddress@MyDomain” anymore. The reason is that “Mail Pro” does not know “MyAddress@MyDomain”, which is defined in the MX Plan. Sending emails via the account “MyUser@MyDomain” but “From: MyAddress@MyDomain” is therefore considered Spoofing by Mail Pro (NB: it’s not the case within a MX Plan).

    If you don’t care about the gmail antispam filtering anymore, then you can simply:

    • Delete the Redirection “MyAddress@MyDomain” from the MX Plan.
    • Wait for 30 sec, to be sure the delete is executed.
    • Create an Alias “MyAddress@MyDomain” in your Mail Pro (it will fail if the delete is not yet executed. If this occurs, retry)

    Instead, to keep the mechanism above in place:

    • Create a new mailbox in the MX Plan, with just 50MB, to be used only to send emails via SMTP: “MyMail@MyDomain”
    • Configure Outlook:
      • To use “MyAddress@MyDomain” as “From”
      • IMAP with the account of Mail Pro (“MyUser@MyDomain”)
      • SMTP with the account of MX Plan (“MyMail@MyDomain”)

    It results in this schema:

    Disclaimer: an SME from the OVH community reports that not only does OVH sometimes block outgoing emails without notification (as those are considered Spam), but Gmail also rejects many incoming emails with explanations unclear to most standard users. As a consequence, some/many (?) emails could be “silently lost” with the mechanism above…


  • Insta360 Studio crashes immediately when exporting

     

    After upgrading to version 4.9.1, Insta360 Studio started to crash immediately after pressing the export button. A downgrade to 4.8.4 didn’t solve the issue. But using “CPU” instead of “Auto” as “Image Processing Acceleration” did the trick!


    I was expecting an issue with the Codecs, so I first unticked all the options enabled by default in the related tab of the User Preferences.

    But that didn’t help. So I next tried setting “CPU” instead of “Auto” for “Image Processing Acceleration” in the Hardware Acceleration tab. And it solved the problem.

    I didn’t find any information on that flag on the Insta360 forum…

     

    NB: the error message from Insta360 when trying to export the video was something like: Error Code 1024.


  • Renewal of LetsEncrypt certificates on Synology after a move

    After exporting all my certificates from an old NAS to a new one, I realized that they were not renewed automatically anymore. When trying to renew them manually via the DSM UI (Control Panel > Security > Certificate), a zip file with a CSR file (Certificate Signing Request) and a Key file was downloaded. I had no idea how to proceed with these, so I investigated why the automatic renewal was not working as on the old NAS. The reason was a missing “renew.json” file on the new NAS.

    Click to Read More


    Before starting, I strongly advise exporting all the certificates, one by one, using the DSM UI (Control Panel > Security > Certificate)!!!

    Connected to the NAS via SSH, I first tried to renew the certificates with the command: /usr/syno/sbin/syno-letsencrypt renew-all

    Looking into /var/log/messages, I noticed that syno-letsencrypt was complaining about a missing renew.json file:

    syno-letsencrypt[19750]: syno-letsencrypt.cpp:489 can not find renew.json. [No such file or directory][/usr/syno/etc/certificate/_archive/XXXXXX]

    NB.: To get more details, the verbose version of the renewal can be useful: /usr/syno/sbin/syno-letsencrypt renew-all -vv

    On Synology, there is one folder /usr/syno/etc/certificate/_archive/XXXXXX per certificate, where XXXXXX is the ID of the certificate. It is assumed to contain these files: cert.pem, chain.pem, fullchain.pem, privkey.pem and renew.json. And indeed, there was no renew.json file in the folder XXXXXX.

    So, on the old NAS, I looked for the folder AAAAAA containing the same certificate as in XXXXXX (once imported on another NAS, a certificate gets a new unique ID). Check the file /usr/syno/etc/certificate/_archive/INFO to identify the ID of the certificate.
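
    INFO being a JSON file, you can eyeball the mapping between certificate IDs and their descriptions with, for instance, this sketch (assuming python3 is available on your DSM; a simple cat works too):

    python3 -m json.tool /usr/syno/etc/certificate/_archive/INFO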

    Once the folder AAAAAA is identified, read the file renew.json, which looks like this:

    {
        "account" : "/usr/syno/etc/letsencrypt/account/BBBBBB/",
        "domains" : "<your domain>",
        "server" : "https://acme-v02.api.letsencrypt.org/directory",
        "version" : 2
    }

    BBBBBB is the folder containing your letsencrypt user account, stored in the file /usr/syno/etc/letsencrypt/account/BBBBBB/info.json (Notice: there can be several accounts if you used different contact emails for your various certificates).

    Look on the new NAS for the folder ZZZZZZ equivalent to BBBBBB (comparing the info.json files).
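
    To compare the accounts quickly on both NAS, a sketch like this lists each account folder with its content (I assume info.json holds, among other things, the account’s contact email):

    for f in /usr/syno/etc/letsencrypt/account/*/info.json; do echo "== $f"; cat "$f"; echo; done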

    Once AAAAAA and ZZZZZZ were determined, I created a file /usr/syno/etc/certificate/_archive/XXXXXX/renew.json on the new NAS, containing:

    {
        "account" : "/usr/syno/etc/letsencrypt/account/ZZZZZZ/",
        "domains" : "<your domain>",
        "server" : "https://acme-v02.api.letsencrypt.org/directory",
        "version" : 2
    }

    And finally, I could successfully run the renewal: /usr/syno/sbin/syno-letsencrypt renew-all -vv

    To renew only one certificate (for testing purposes, it’s safer than renew-all), use the folder name XXXXXX of the certificate: /usr/syno/sbin/syno-letsencrypt renew -c XXXXXX -vv

    To help in generating the renew.json files, a script created by ChatGPT was attached to this post (“renewCertificates”); copy it into /usr/syno/etc/certificate/_archive/ and run it there.
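
    That script is not reproduced here, but a minimal sketch of the idea could look like this (my assumptions: a single LetsEncrypt account folder, and the main domain readable from each cert.pem via openssl):

    #!/bin/bash
    # Sketch: generate a missing renew.json for each certificate folder.
    # Run it from /usr/syno/etc/certificate/_archive/ on the new NAS.
    ACCOUNT_DIR=$(ls -d /usr/syno/etc/letsencrypt/account/*/ | head -n 1)
    for DIR in */ ; do
      ID=${DIR%/}
      [ -f "$ID/renew.json" ] && continue   # already configured, skip
      [ -f "$ID/cert.pem" ] || continue     # not a certificate folder, skip
      # Extract the main domain from the certificate subject (CN)
      DOMAIN=$(openssl x509 -in "$ID/cert.pem" -noout -subject | sed -n 's/.*CN *= *//p')
      printf '{\n"account" : "%s",\n"domains" : "%s",\n"server" : "https://acme-v02.api.letsencrypt.org/directory",\n"version" : 2\n}\n' "$ACCOUNT_DIR" "$DOMAIN" > "$ID/renew.json"
      echo "Created $ID/renew.json for $DOMAIN"
    done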


  • How to identify a USB device

    You found a USB device (a dongle or whatever) but have no idea what it is? USBDeview is the tool to help you identify that device.


    Here is what you must do to get information on your device:

    1. Download and store USBDeview on your PC
    2. In the same folder, store this file with an unofficial list of USB devices
    3. Run USBDeview and in the “View” menu, “Choose Columns”
    4. Sort the Columns to have: Device Name, Description, Device Type, Drive Letter, Serial Number, Registry Time 1 and 2, VendorID, ProductID, InstanceID
    5. Plug your USB device into your PC
    6. Look for the most recently updated USB device in USBDeview (sorting on the column “Registry Time”). This is your device to be identified (you may see several lines with the same timestamp, VendorID and ProductID)

    You can also get extra information on this website, searching for the VendorID and ProductID. You can also use their software, Temple, to find the name of the vendor if it was unknown to USBDeview: run Temple and filter on the VendorID or ProductID to highlight the device (DO NOT PRESS ENTER, as it closes Temple). Ex.: for the ProductID 1701 here above, with an unknown vendor, Temple tells us that it is “Lester Electrical”:

    Does “Lester Electrical” ring a bell for you? Then you have found what this dongle is.

    Unfortunately, in many cases, you can’t make any link between the vendor of the hardware (in the USB device) and what the device is actually used for. Here above, I finally figured out that this was the wireless dongle of one of my Rii mini i8 keyboards, imported from China by RiiTek. But there was really no link between the two…

