What you see on the console when you attempt to run iouyap:
What you see in GNS3 when you start a router:
How to fix it
I use a number of Virtual Private Servers (VPS) and wanted to make a backup of the data and applications running on them.
The first step is to make a local copy of your data to a folder on the remote machine; you can then pull these files to the Synology NAS via a scheduled task. For my applications I simply used tar to back up all the directories I care about to a single file, and mysqldump to dump all the databases on the MySQL server to a single file.
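As an example, the kind of commands I mean look like this; every path and directory below is a placeholder, so adjust them for your own setup:

```shell
# Archive the application directories into a single file
# (paths are examples -- list the directories you care about)
tar -czf /home/backupuser/backups/apps.tar.gz /var/www /etc/nginx

# Dump every database on the MySQL server into a single file
# (you will be prompted for the password; use a credentials file for cron)
mysqldump --all-databases -u root -p > /home/backupuser/backups/all-databases.sql
```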
generate your keys
- do not configure with a password
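A minimal key-generation sketch; the file name key is just an example, and -N "" creates the key without a passphrase so the scheduled task can run unattended:

```shell
# Generate a 4096-bit RSA key pair with no passphrase,
# written to ./key (private) and ./key.pub (public)
ssh-keygen -t rsa -b 4096 -N "" -f key
```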
Verify sshd is configured to use key files
Add the public key to ssh authorized_keys
cat key.pub >> ~/.ssh/authorized_keys
Copy the private key to the Synology
Use any method you like for this. I simply copied the contents of the private key, pasted it into a file on my local machine, and moved it to an existing share on the NAS.
Connect to source machine from Synology and trust the source machine
chmod 400 $AbsolutePathToPrivateKey
ssh -p 22 -i $PRIVATEKEY firstname.lastname@example.org
The scheduled task
Create the scheduled task
Save the script below locally on the Synology and make it executable.
- You may need to enable SSH terminal access on your NAS.
- If you edit the script locally on a Windows machine with Notepad++, make sure you change the EOL (End of Line) format to Unix
/usr/bin/rsync -avz --progress -e "ssh -p $PORT -i $SSHID" $USER@$SERVER:$SOURCE $TARGET >> $LOG 2>&1
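For reference, here is a sketch of what the full script might look like with the variables filled in; every value below is a placeholder for your own environment:

```shell
#!/bin/sh
# Placeholder values -- replace with your own
PORT=22
SSHID=/volume1/backup/keys/key            # private key copied to the NAS
USER=backupuser
SERVER=vps.example.org
SOURCE=/home/backupuser/backups/          # folder holding the tar/mysqldump files
TARGET=/volume1/backup/vps/
LOG=/volume1/backup/rsync.log

/usr/bin/rsync -avz --progress -e "ssh -p $PORT -i $SSHID" $USER@$SERVER:$SOURCE $TARGET >> $LOG 2>&1
```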
Run the script and verify your data is copied
Verify client configuration
Local Computer Policy
Verify Resultant Policy is correct
Verify the correct GPOs are being applied
C:\>gpresult /scope computer
Update Group Policies
telnet wsus-server-01.domain.com 8530
If you are using a hosts file and having trouble with resolution, check out this post
Reset the client
wuauclt.exe /resetauthorization /detectnow
Force check in
Check WSUS in 10-15 minutes
If you are still having issues check out the client log file:
windows version: Server 2003 R2 Standard x64 SP2
Verify it’s not working
ipconfig /displaydns | more
Check for typos!
Start with the simple solution first
Verify hosts file location
Open Registry Editor
Verify key: My Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\DataBasePath
Copy Value data and paste it into Explorer to verify you are editing the correct file
Verify file permissions (This was my issue)
If machine\users is not given Read and Read & Execute permissions, add the account.
Hardware: Synology DS716+
Software: Synology Photo Station 6
Data Files: .jpg & .arw (raw)
When managing photos on a Synology NAS with the Photo Station 6 application, deleting a JPG leaves the matching RAW (ARW) file behind.
Search the photo directory for orphan .arw files (ones without a matching .jpg), then remove them. While we are at it, let's record what we delete to a log file.
Deploy an Ubuntu docker image and mount the photos directory
Use the code
import os

# Directory to clean -- adjust to the album you are working on
rootdir = '/mnt/photo/Dump/2016/2016-02_Muppo-playing'

for file in os.listdir(rootdir):
    filename, file_ext = os.path.splitext(rootdir + '/' + file)
    # Only touch RAW files, and only when no matching JPG exists (either case)
    if file_ext.lower() == '.arw' and not (os.path.isfile(filename + '.JPG') or os.path.isfile(filename + '.jpg')):
        os.remove(rootdir + '/' + file)
        print('REMOVED: ' + rootdir + '/' + file)
        with open('clean-up.log', 'a') as logfile:
            logfile.write('REMOVED: ' + rootdir + '/' + file + '\n')
This process assumes your Linux machine has Centrify Express running on it.
Determine the group name
$adquery user rick -G
Add entry to sudoers file
echo "%domain_admins ALL=(ALL) NOPASSWD: ALL" | sudo tee -a /etc/sudoers

Note: sudo echo "..." >> /etc/sudoers does not work, because the >> redirection is performed by your unprivileged shell rather than by sudo. Piping through sudo tee -a (or editing with visudo) avoids this.
In this walk through we will perform the following:
Note: The actual nginx configuration will not be covered here.
- Deploy the nginx Docker container (vr-ngx-01)
- Mount the following folders and file:
- it’s assumed your site’s .conf file is in this directory
- it’s assumed your SSL certs live here and are properly referenced in your /etc/nginx/conf.d/your.site.conf
- it’s assumed SSL is configured and includes conf.d/*.conf
- Link vr-ngx-01 to the Home-Assistant container (vr-hass-01)
- Fire up the container and verify connectivity over a secured connection
- Remove local port mapping for vr-hass-01
1. Deploy the container
2. Mount the local folders & file
3. Link vr-ngx-01 to vr-hass-01
4. Verify site loads
Browse to https://YOUR-SYNOLOGY-NAME:4443
Note: to make this appear at https://www.virtualrick.com you can configure your router/firewall for port forwarding. Example: external TCP 443 forwards to internal TCP 4443.
5. Remove local port mapping for vr-hass-01
Now that the nginx container is linked to the home-assistant container, there is no need for the home-assistant service port (8123) to be available directly.
Make sure the home-assistant container is turned off, then edit the container and remove the local port configuration.
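If you prefer the command line over the Synology Docker UI, the steps above roughly correspond to a docker run like this; the container names, host paths, and the 4443 mapping are the ones assumed in this walkthrough, so adjust them to your setup:

```shell
# Run nginx linked to the Home-Assistant container, with the
# conf.d folder, certs folder, and nginx.conf mounted read-only
docker run -d --name vr-ngx-01 \
  --link vr-hass-01:vr-hass-01 \
  -p 4443:443 \
  -v /volume1/docker/nginx/conf.d:/etc/nginx/conf.d \
  -v /volume1/docker/nginx/certs:/etc/nginx/certs \
  -v /volume1/docker/nginx/nginx.conf:/etc/nginx/nginx.conf \
  nginx
```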
Update: Link to post following this one with steps for deploying nginx as a proxy for the Home-Assistant container deployed here: CLICK HERE
I recently received my Synology DS716+ and discovered it supports running Docker containers. I figured why not run Home-Assistant in a Docker container on the Synology? Doing this will free my Raspberry Pi for another project. Here is what I did to make this happen.
Store your configuration.yaml here
Store any scripts called from within your configuration.yaml. I have a number of scripts used to execute remote commands on various devices.
I mount this folder so I can store the keys that are trusted on remote devices
Step by step screenshots
Download the image
Create the container
Launch the application
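For reference, the UI steps above are roughly equivalent to the following from the command line; the image name, container name, and host paths are assumptions based on this setup:

```shell
# Pull the Home-Assistant image and run it with the
# config, scripts, and ssh-key folders mounted
docker pull homeassistant/home-assistant
docker run -d --name vr-hass-01 \
  -p 8123:8123 \
  -v /volume1/docker/home-assistant/config:/config \
  -v /volume1/docker/home-assistant/scripts:/scripts \
  -v /volume1/docker/home-assistant/ssh:/root/.ssh \
  homeassistant/home-assistant
```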
Need to produce a report showing all the databases in your environment? Why not include the name, size, and owner while we are at it, and export it to a CSV file? Here you go!
Note: The SQLPS module is installed on a machine alongside Microsoft SQL Server Management Studio. I have tested this with SSMS 2014.
The PowerShell Script
import-module "C:\Program Files (x86)\Microsoft SQL Server\120\Tools\PowerShell\Modules\SQLPS" -DisableNameChecking

$rootdir = "C:\Users\VirtualRick\SQL Server Audit\"
$instances = import-csv $rootdir\server-instance.csv

ForEach ($row in $instances) {
    $sqlPath = "SQLSERVER:\SQL\$($row.server)\$($row.instance)\Databases\"
    dir $sqlPath | select Name, Size, Owner | export-csv $rootdir\export.csv -Append
}
server-instance.csv file example:
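A minimal server-instance.csv might look like this; the server and instance names are examples, and DEFAULT is used for a default (unnamed) instance:

```csv
server,instance
SQL-01,DEFAULT
SQL-02,SQLEXPRESS
```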