Backup remote Linux machines to Synology

The Purpose

I use a number of Virtual Private Servers (VPS) and wanted to make a backup of the data and applications running on them.

The first step is to make a local copy of your data to a folder on the remote machine; then you can pull those files to the Synology NAS via a scheduled task. For my applications I simply used tar to back up all the directories I care about to a single file, and mysqldump to dump all the databases in the MySQL server to a single file.
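For example, a minimal sketch of the remote-side backup commands (the directories, credentials, and file names here are assumptions; adjust them to your applications). The /backups/synology/ folder matches the SOURCE used by the pull script later in this post:


#!/bin/bash
# Archive the application directories you care about into a single file
tar -czf /backups/synology/app-data.tar.gz /var/www /etc/nginx
# Dump all databases in the MySQL server to a single file
mysqldump --all-databases -u root -p'YOURPASSWORD' > /backups/synology/all-databases.sql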

Setup Authentication

Generate your keys


#ssh-keygen

  • do not configure the key with a passphrase

Verify sshd is configured to use key files


vi /etc/ssh/sshd_config


AuthorizedKeysFile      %h/.ssh/authorized_keys

Add the public key to ssh authorized_keys


cat key.pub >> ~/.ssh/authorized_keys

Copy the private key to the Synology

Use any method you like for this. I simply copied the contents of the private key into a file on my local machine, then moved that file to an existing share on the NAS.
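If SSH is already enabled on the NAS, one option is to scp the key straight to it. This is only a sketch: the key path, NAS name, and destination (which matches the SSHID path used in the backup script later in this post) are assumptions:


scp ~/.ssh/id_rsa admin@YOUR-SYNOLOGY-NAME:/volume1/backups/scripts/certificates/server.your-domain.com.privkey.pem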

Connect to the source machine from the Synology and trust the source machine


chmod 400 $AbsolutePathToPrivateKey
ssh -p 22 -i $PRIVATEKEY your-user-name@server.your-domain.com

The scheduled task

Create the scheduled task

Save the script below locally on the Synology and make it executable (example commands follow the script).

notes:

  1. You may need to enable SSH terminal access on your NAS.
  2. If you edit the script on a Windows machine with Notepad++, make sure you change the EOL (End of Line) format to Unix


#!/bin/bash
USER="your-user-name"
SERVER="server.your-domain.com"
PORT="22"
SSHID="/volume1/backups/scripts/certificates/server.your-domain.com.privkey.pem"
SOURCE="/backups/synology/"
TARGET="/volume1/backups/server.your-domain.com/"
LOG="/volume1/backups/server.your-domain.com/backup.log"
/usr/bin/rsync -avz --progress -e "ssh -p $PORT -i $SSHID" "$USER@$SERVER:$SOURCE" "$TARGET" >> "$LOG" 2>&1

Run the script and verify your data is copied
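For example, assuming the script was saved as /volume1/backups/scripts/backup.sh (the file name is an assumption):


chmod +x /volume1/backups/scripts/backup.sh
/volume1/backups/scripts/backup.sh
tail /volume1/backups/server.your-domain.com/backup.log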

References

http://raphael.kallensee.name/journal/how-to-backup-an-external-server-with-a-synology-nas-via-rsync/

Windows computers not reporting to WSUS

Verify client configuration

Local Computer Policy

Verify Resultant Policy is correct

Verify the correct GPOs are being applied

C:\>gpresult /scope computer

Update Group Policies

C:\>gpupdate /force

Verify connectivity

ping wsus-server-01.domain.com

telnet wsus-server-01.domain.com 8530

If you are using a hosts file and having trouble with resolution, check out the next post: Windows hosts file not being used for resolution

Reset the client

wuauclt.exe /resetauthorization /detectnow

Force check in

wuauclt.exe /reportnow

Check WSUS in 10-15 minutes

If you are still having issues check out the client log file:

C:\Windows\WindowsUpdate.log

Windows hosts file not being used for resolution

Windows version: Server 2003 R2 Standard x64 SP2

Verify it’s not working

ipconfig /flushdns

ipconfig /displaydns | more

Check for typos!

Start with the simple solution first

Verify hosts file location

Open Registry Editor

Verify key: My Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\DataBasePath

Copy the Value data and paste it into Explorer to verify you are editing the correct file (the default is %SystemRoot%\System32\drivers\etc)

Verify file permissions (This was my issue)

If the machine's Users group is not granted Read and Read & Execute permissions, add the account.


Synology: Remove orphaned ARW files when the JPG is deleted in Photo Station 6

Background

Hardware: Synology DS716+
Software: Synology Photo Station 6
Data Files: .jpg & .arw (raw)

The problem

When using a Synology NAS to manage photos via the Photo Station 6 application, deleting a JPG leaves the matching RAW (ARW) file behind.

The solution

Search the photo directory for orphaned .arw files (ones without a matching .jpg), then remove them. While we are at it, let's record what we delete to a log file.

Deploy an Ubuntu docker image and mount the photos directory
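For example, a minimal way to launch the container (the container name is arbitrary; /volume1/photo is the standard Photo Station share, adjust if yours differs):


docker run -it --name photo-cleanup -v /volume1/photo:/mnt/photo ubuntu /bin/bash
# the stock ubuntu image does not include Python, so install it inside the container
apt-get update && apt-get install -y python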


Use the code


#!/usr/bin/python
import os

rootdir = '/mnt/photo/Dump/2016/2016-02_Muppo-playing'
files = os.listdir(rootdir)
for file in files:
    # Extension matching is case-sensitive: this targets .ARW/.JPG pairs
    if file.endswith('.ARW'):
        # Strip the extension so we can check for a matching JPG
        filename, file_ext = os.path.splitext(rootdir + '/' + file)
        if not os.path.isfile(filename + '.JPG'):
            os.remove(rootdir + '/' + file)
            print('REMOVED:' + rootdir + '/' + file)
            # clean-up.log is written to the current working directory
            with open("clean-up.log", "a") as logfile:
                logfile.write("\n")
                logfile.write('REMOVED:' + rootdir + '/' + file)

How to add Domain Admins to sudoers

This process assumes your Linux machine has Centrify Express running on it.

Determine the group name

$adquery user rick -G

domain_admins

domain_users

jira-software-users

Add entry to sudoers file

echo "%domain_admins ALL=(ALL) NOPASSWD: ALL" | sudo tee -a /etc/sudoers

Note: a plain sudo echo "…" >> /etc/sudoers does not work, because the >> redirection is performed by your unprivileged shell rather than by sudo; piping through sudo tee -a avoids this.
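To confirm the sudoers file still parses cleanly after the change:


sudo visudo -c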


Run nginx in a Docker container on a Synology

In this walkthrough we will perform the following:

Note: The actual nginx configuration will not be covered here.

  1. Deploy the nginx Docker container (vr-ngx-01)
  2. Mount the following folders and file:
    1. /etc/nginx/conf.d/
      1. it’s assumed your site’s .conf file is in this directory
    2. /etc/nginx/certs/
      1. it’s assumed your SSL certs live here and are properly referenced in your /etc/nginx/conf.d/your.site.conf
    3. /etc/nginx/nginx.conf
      1. it’s assumed SSL is configured and includes conf.d/*.conf
  3. Link vr-ngx-01 to the Home-Assistant container (vr-hass-01)
  4. Fire up the container and verify connectivity over a secured connection
  5. Remove local port mapping for vr-hass-01

1. Deploy the container

2. Mount the local folders & file

3. Link vr-ngx-01 to vr-hass-01
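If you prefer the command line to the DSM Docker UI, steps 1-3 roughly translate to the sketch below (the host-side paths are assumptions; point them at wherever your nginx files live on the NAS):


docker run -d --name vr-ngx-01 \
  --link vr-hass-01:vr-hass-01 \
  -p 4443:443 \
  -v /volume1/docker/nginx/conf.d:/etc/nginx/conf.d:ro \
  -v /volume1/docker/nginx/certs:/etc/nginx/certs:ro \
  -v /volume1/docker/nginx/nginx.conf:/etc/nginx/nginx.conf:ro \
  nginx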

4. Verify site loads

Browse to https://YOUR-SYNOLOGY-NAME:4443

Note: to make this appear at https://www.virtualrick.com you can configure your router/firewall for port forwarding. Example: external TCP 443 forwards to internal TCP 4443.

5. Remove local port mapping for vr-hass-01

Now that the nginx container is linked to the home-assistant container, there is no need for the home-assistant service port (8123) to be available directly.

Make sure the home-assistant container is turned off, then edit the container and remove the local port configuration.

Running Home-Assistant in a Docker container on a Synology NAS

Update: For steps on deploying nginx as a proxy for the Home-Assistant container deployed here, see the post Run nginx in a Docker container on a Synology.


I recently received my Synology DS716+ and discovered it supports running Docker containers. I figured why not run Home-Assistant in a Docker container on the Synology? Doing this will free my Raspberry Pi for another project. Here is what I did to make this happen.

Mount Points:

/config

Store your configuration.yaml here

/scripts

Store any scripts called within your configuration.yaml. I have a number of scripts used to execute remote commands on various devices.

/root/.ssh

I mount this folder so I can store the keys that are trusted on remote devices.
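For reference, a rough CLI equivalent of the container configuration described above (the image tag and host-side paths are assumptions; adjust them to your NAS layout):


docker run -d --name vr-hass-01 \
  -p 8123:8123 \
  -v /volume1/docker/hass/config:/config \
  -v /volume1/docker/hass/scripts:/scripts \
  -v /volume1/docker/hass/ssh:/root/.ssh \
  homeassistant/home-assistant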

Step by step screenshots

Download the image

Create the container

Launch the application

Using PowerShell to produce a list of databases from a list of server\instances

Need to produce a report showing all the databases in your environment? Why not include the name, size, and owner while we are at it, and export it all to a CSV file. Here you go!

Note: The SQLPS module is installed alongside Microsoft SQL Server Management Studio. I have tested this with SSMS 2014.

The PowerShell Script

import-module "C:\Program Files (x86)\Microsoft SQL Server\120\Tools\PowerShell\Modules\SQLPS" -DisableNameChecking
$rootdir = "C:\Users\VirtualRick\SQL Server Audit\"
$instances = import-csv $rootdir\server-instance.csv
ForEach($row in $instances)
{
    # Build the SQL provider path for this server\instance, then export its databases
    $sqlPath = "SQLSERVER:\SQL\$($row.server)\$($row.instance)\Databases\"
    dir $sqlPath | select Name, Size, Owner | export-csv $rootdir\export.csv -Append
}


server-instance.csv file example:

Server,Instance
MyServer,default

Upload a local file to an Azure File Share via PowerShell

Want to upload a log file to your cloud storage? Here is a quick and easy way to do it via PowerShell.


$AccountName = "YOURACCOUNTNAMEHERE"
$AccountKey = "YOURACCESSKEYHERE"
# Build a storage context from the account name and access key
$Context01 = New-AzureStorageContext -StorageAccountKey $AccountKey -StorageAccountName $AccountName
# Upload C:\file.log to the "backups" share as DestinationFile.Name
Set-AzureStorageFileContent -ShareName backups -Source "C:\file.log" -Path "DestinationFile.Name" -Context $Context01