I’m using Bash on Windows (WSL) every day now to take snapshots of my Unix-based web sites.
Given that most people have a Windows machine with a huge disk, WSL is a great way to mirror Unix directories using a familiar interface: rsync over ssh.
Here is a line from a backup script that runs under bash on Windows.
/usr/bin/rsync -avz --delete ec2-1-2-3-4.compute.amazonaws.com:/home/bitnami/htdocs/ ~/backups/
This asks rsync to copy a directory from an EC2 instance and mirror it locally.
rsync uses a Host entry in the ~/.ssh/config file to pick up the ssh key at run time, both to log in and to encrypt the rsync session.
That entry tells ssh which .pem file to use to log in to the EC2 instance.
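For reference, the matching ~/.ssh/config entry looks something like this (the key file name is a made-up placeholder; the host and user match the rsync line above):

```
Host ec2-1-2-3-4.compute.amazonaws.com
    User bitnami
    IdentityFile ~/.ssh/my-ec2-key.pem
```

With that in place, rsync's underlying ssh call finds the .pem file automatically, with no extra flags on the command line.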
It doesn’t run as fast as a pure linux box would but certainly fast enough for my purposes.
Loving WSL !
Now that my main email group has over 5,000 members I decided to see if I could 1) get Amazon SES to up my sending rate and 2) whether an Amazon EC2 t2.micro could keep up with 50 sends a second.
The previous send rate of 20/sec was working just fine, but as the list grows, 50 will be much more useful.
The good news is that, eyeballing the per-second status updates as a campaign went out, the t2.micro kept up really well, and I’ll keep the rate at 50/sec going forward.
So an Amazon EC2 t2.micro, sendy.co and Amazon SES make a really good, cheap option for running a decent-sized email list.
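As a back-of-the-envelope check on those numbers, the rate bump shortens a full campaign quite a bit:

```shell
# Rough time to drain a 5,000-member list at each send rate
list_size=5000
echo "at 20/sec: $((list_size / 20)) seconds"   # 250 seconds
echo "at 50/sec: $((list_size / 50)) seconds"   # 100 seconds
```

So a full send to the list drops from about four minutes to well under two.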
Still trying to get my head around what exactly bash on Windows 10 really is. I have read a few posts and watched a few videos, but it is still weird to run `top` and see Ubuntu processes in a command window on my Windows machine, running real Linux binaries.
But awesome too.
I was really impressed with bash (WSL) on the Windows 10 Creators Update. It backed up some web sites from Amazon EC2 with my rsync and ssh scripts untouched. I just copied across the ~/.ssh directory and voila! I can now rsync over ssh to pull backups from my web sites down to my Windows machine.
This suddenly makes Windows 10, with its big honking disk, a far more developer-friendly machine. Why didn’t Microsoft do this years ago?
I am becoming more and more impressed with the WSL for Windows 10 Creators Update.
I installed Apache and MySQL, and so far everything appears to be working just fine and dandy!
sudo apt-get install apache2
sudo apt-get install mysql-server
sudo apt-get install php libapache2-mod-php
sudo apt-get install php-mysql
sudo vim /etc/apache2/apache2.conf
(add this line to the file)
AcceptFilter http none
sudo /etc/init.d/apache2 restart
sudo service mysql restart
and voila, you have a web server running.
http://localhost is alive.
So it really is behaving like a real Ubuntu shell. Wowser. Windows just became more of a developer’s dream.
It looks like it might only be a minor feature update, but Open Live Writer is now available via the Windows 10 Store.
You will need the Windows 10 Anniversary Update before you can install it.
So if and when any updates get made, we can get them automatically.
Not that there have been many updates in the last year ….
It has come time to make the WordPress sites I made for my kids disappear off the public internet. They are getting old enough that their friends are finding their sites via Google searches.
Here is what I did.
- Password protect the entire WordPress site
- Allow a hole in the password protection so that robots.txt is still accessible.
If you only password protect the whole site, archived copies of it will continue to show at the Internet Wayback Machine. So you still need robots.txt to be accessible, and it should block all web archivers from indexing the site.
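The robots.txt in question would have been the standard deny-everything file (reconstructed here; a crawler-blocking robots.txt is always the same two lines):

```
User-agent: *
Disallow: /
```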
This tells all web robots to go away, and indeed once this is active, the internet archive stops showing any of the snapshots that it has collected over the years.
I used SiteGround’s cpanel tool to password protect the whole tree containing the blog.
That resulted in the following sample .htaccess contents:
Allow from all
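The “Allow from all” line above is just the tail of the file. A fuller sketch of what such an .htaccess looks like, with a placeholder password-file path and assuming cPanel’s usual Basic-auth layout, would be:

```
AuthType Basic
AuthName "Private"
AuthUserFile /home/username/.htpasswds/public_html/passwd
Require valid-user

<Files "robots.txt">
    Satisfy any
    Allow from all
</Files>
```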
I added the Files paragraph to allow robots.txt to remain visible:
user@shell: curl https://elliot.pascoe.biz/robots.txt
So far so good: the site will soon disappear out of Google and be visible only to those with whom I share the password.