
OpenVPN keeps disconnecting with an inactivity timeout

I’ve used openvpn on my Mac (via Tunnelblick) for a few years for a home->work VPN connection with no issues, and then suddenly today it started disconnecting me every two minutes.

The timing was suspiciously precise, so surely it couldn’t be ISP related? But I couldn’t find the issue anywhere. Rebooting the Mac and the router made no difference.

Now, I recently managed to get the same VPN working on a Debian-based Linux box using the command-line version of openvpn, which outputs all its logs to the terminal. I thought I’d give that a go in case it yielded any helpful information.

Anyway, the error that stuck out was this one:

Wed May 25 19:32:30 2016 ERROR: Linux route add command failed: external program exited with error status: 2
Wed May 25 19:32:30 2016 Initialization Sequence Completed
Wed May 25 19:34:30 2016 [chris] Inactivity timeout (--ping-restart), restarting
Wed May 25 19:34:30 2016 SIGUSR1[soft,ping-restart] received, process restarting

And a quick search on Google took me to this chap: http://www.drmaciver.com/2012/05/openvpn-repeatedly-losing-connections-with-inactivity-timeout/ who described a similar problem with openvpn connections dropping on an inactivity timeout.

Perhaps my Linux box had got the connection stuck open? As far as I knew it hadn’t been connected until just now, but let’s reboot the machine anyway (it never gets rebooted, as it’s a media server).

Lo and behold, that sorted out the issue, so the Debian-based machine had still been connected to the VPN, even though all the terminal windows had been closed. Naughty…
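
If you don’t fancy rebooting the whole box next time, it should be enough to find and kill the stray openvpn process instead. A minimal sketch, assuming the client is still running in the background as root:

# list any running openvpn client processes
ps aux | grep '[o]penvpn'

# kill the stale client so it drops the VPN session (the PID here is a placeholder)
sudo kill 12345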

Posting Slack message on Google Docs spreadsheet cell update

We’ve been using a Google Docs spreadsheet recently for logging stuff, and sometimes cells get missed. I wanted to test out a bit of Google Docs and Slack integration, so I wrote this short script to send a Slack message to a certain channel when a user updates a particular cell.

The column of interest in the spreadsheet is column 14, FYI, which is why the script checks for that column and returns early if the edit is anywhere else.

Because this script does other things, like making an HTTP request, it cannot run as the default onEdit() simple trigger; you need to create it as a custom function and assign it as an installable trigger from the Resources menu. Go to Resources->Current Project Triggers, create a new trigger for onEdit, point it at your function, and you’re done.

Excuse my unprofessionalism regarding HTTP response codes, but this was a simple test to see how easy it was. 1 hour. Done.

/**
 * @author Chris Tate-Davies
 * @revision 0.0.1
 *
 * 10th May 2016
 * Purpose - send a slack payload to bot-database informing users of database update requirements
**/
function ceta_db_column_edit(event){

 //get this spread sheet
 var ceta_spreadsheet = SpreadsheetApp.getActiveSpreadsheet();

 //get the sheets and range from the spreadsheet
 var ceta_sheet = event.source.getActiveSheet();
 var ceta_range = event.source.getActiveRange();

 //get the cell thingy
 var active_cell = ceta_sheet.getActiveCell();
 var active_row = active_cell.getRow();
 var active_column = active_cell.getColumn();
 
 //If header row then exit
 if (active_row < 2) return;
 
 //if not the db column get out
 if (active_column != 14) return;

 //get the revision
 var revision_range = ceta_sheet.getRange(active_row, 2);
 var revision_content = revision_range.getValue();

 //get the changes in the cell 
 var db_changes_range = ceta_sheet.getRange(active_row, 14);
 var db_changes_content = db_changes_range.getValue();
 
 //if its nothing then lets not bother (they're probably deleting stuff)
 if (db_changes_content == "") return;
 
 //the url to post to
 var slack_url = "https://hooks.slack.com/services/<ENTER YOUR TOKENS HERE>";

 //get the logged in user (we can only get the email, I think)
 var current_user = Session.getActiveUser().getEmail();
 
 //if its blank (why?)
 if (current_user == "") {
     //at least put something in
     current_user = "An unknown";
 }
 
 //generate the payload text object
 var payload = { "text" : current_user + " has just entered text into the db field for revision " + revision_content + " - Content is: ```" + db_changes_content + "```" };

 //the URL payload
 var options = {
     "method" : "post",
     "contentType" : "application/json",
     "payload" : JSON.stringify(payload),
     "muteHttpExceptions" : true
 };

 //send that bugger
 var response = UrlFetchApp.fetch(slack_url, options);

 //we could check for response, but who cares?
}

Rename a branch in SVN

If you made a typo or a similar mistake when naming your new branch and you want to rename it, it’s very simple:

svn move http://svn.domain.com/project/branches/old/ http://svn.domain.com/project/branches/new
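
Because svn move is operating on two URLs here, it commits immediately and will want a log message; you can pass one inline with -m (the message below is just an example):

svn move -m "Rename old branch to new" http://svn.domain.com/project/branches/old/ http://svn.domain.com/project/branches/new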

Remember to switch any working copies of the old branch though:

svn switch http://svn.domain.com/project/branches/new


Mac OSX – Can’t eject disk (in use)

So you get that annoying “The volume can’t be ejected because it’s currently in use.” message when trying to unmount your disk. To find out what is using the disk, you can use the lsof command (LiSt Open Files):

lsof

Run it with sudo and grep for your volume, and you should easily be able to see which application has files open (usually VLC for me):

sudo lsof | grep /Volumes/My_Drive

Obviously, substitute My_Drive with the mount name on your machine.
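
Once you’ve spotted the culprit in the lsof output, quit that application, or if it refuses to let go, kill it by the PID shown in the second column (the PID below is just a placeholder):

sudo kill 12345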

svnX – Use DiffMerge as diff tool

So, I use svnX on MacOSX and I find DiffMerge a much nicer application than the default FileMerge, but I couldn’t get it to load. I was getting some spurious error about a missing .sh file:

Can’t find tool ‘diffmerge.sh’ required to use DiffMerge

Anyway, I found the file in the following folder:

/Applications/DiffMerge.app/Contents/Resources/

And I simply copied it to a folder on my PATH:

cp /Applications/DiffMerge.app/Contents/Resources/diffmerge.sh /usr/local/bin/
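
A quick sanity check, purely my own habit rather than anything svnX asks for, is to confirm the script is now found on your PATH:

which diffmerge.sh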

Sorted. Don’t forget you actually have to change the application preferences.

Plex – External FAT32 USB drive

So, using a Linux Mint laptop and Plex as a lower-power-consumption server seemed a good idea, until I realised the disk in the laptop was only a few hundred gig. I opted to use my backup USB disk to serve the media. It’s FAT32 and already contained hundreds of videos (as it was my backup).

I wanted it to automount and always be connected, so that Plex just sits there serving away and occasionally updates the library with new files. My mount point is /media/plexmedia/.

I used fstab to mount it, and after trying many different options, I found this to be the optimum line:

UUID=DA3C-7706 /media/plexmedia/ vfat auto,users,umask=000 0 0

Explanation of the settings:

umask=000

This will set every file to rw for all users and groups.

auto

Will automatically mount the drive. Not really necessary but I like to leave it there.

users

Allows the disk to be mounted by any user. I don’t want root owning it, as then Plex might not be able to read/write nicely.

Finally, the 0 0:

The first 0 (the dump field) means I don’t want the disk backed up by the dump utility.
The second 0 tells fsck that I don’t want it to check the device for errors at boot.


Now, to get your UUID, you’ll need to run:

sudo blkid

Note: It’s better to use the drive’s UUID, as this doesn’t change; if you were to shuffle the order of the disks around (as in the physical USB ports), the /dev/sdX device names may change.
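
If the mount point doesn’t exist yet, create it first, and then you can test the new fstab entry without rebooting. A minimal sketch using the paths from above:

sudo mkdir -p /media/plexmedia
sudo mount -a
ls /media/plexmedia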

It’s important to note that you cannot change the permissions of individual files on a FAT32 disk in Linux, as the filesystem doesn’t store that sort of security metadata. The user access settings must be applied to the mount point (via the mount options). A lot of people forget this, and it’s often a source of confusion.

Mysqldump all databases and gzip

So, a quick and simple MySQL backup to run from cron.

#!/bin/bash
_hour=`date '+%H'`;
_user="root";
_pass="thepassword";
_dest="/srv/backups/daily/";
mysqldump -u $_user -p$_pass --all-databases --ignore-table=mysql.event | gzip > ${_dest}${_hour}.sql.gz

I’ve used this as an hourly cron job so that I always have 24 hours of backups.

Note: Ignoring mysql.event stops an annoying behaviour in later versions of MySQL, where mysqldump reports a notice that it has skipped this table. I don’t really need it, so I’m ignoring it.
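
For reference, the hourly cron entry looks something like this (the script path is hypothetical; point it at wherever you saved the script above):

0 * * * * /srv/backups/mysql_backup.sh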

Grep duplicate JSON keys

If you have large JSON files with application settings in them, you need to be sure that each setting only appears once. That’s not a problem until the files get very large and are being edited manually by all sorts of people.

{
"setting_1" : "some value",
"setting_2" : "another value",
"setting_1" : "different again"
}

Run this to check for duplicate key names (the sort is needed because uniq -d only spots adjacent duplicates):

grep -Po '"[a-z_0-9]+"[ ]*:' <filename> | sort | uniq -d

The above will output the duplicated setting(s), if any, to the console. Tested on Ubuntu 12.04.

Backup all SVN repos

If you need to back up your SVN repositories, you can use this bash script to do so:

#!/bin/bash
DATE=`date +"%Y-%m-%d"`
BACKUP_DIR=/home/user/svn/backup/${DATE}
mkdir -p $BACKUP_DIR
for dir in `ls /var/svn/`; do
    RES_DIR=/var/svn/$dir;
    svnadmin dump $RES_DIR | gzip > "${BACKUP_DIR}/${dir}.dump.gz";
done

This basically dumps every revision from each repository on your server, gzips them, and puts them in /home/user/svn/backup under the current date.
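
Should you ever need to restore one of these dumps, something like the following should do it; the repository name and date in the paths are just examples:

svnadmin create /var/svn/myproject
gunzip -c /home/user/svn/backup/2016-05-25/myproject.dump.gz | svnadmin load /var/svn/myproject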