Betting and “tipsters”

As a keen sports follower, I got enticed into gambling a while ago. I’ve won big and lost big, but with only occasional bets I’ve come out roughly even overall.

Then I found Twitter and the umpteen so-called tipsters on there. I followed a couple, noticed they were winning quite a lot, and so started following their tips.

Now and again I would win with them; it was mainly their accumulator bets that would miss out by the odd game.

Then I saw they do “challenges”, which consist of turning, say, £10 into £100 over several bets.

By the time the challenge is almost finished, you’re staking £80 on something so trivial that it’s a lot of risk for very little return. I’ve seen £10–£1000 challenges where they suggest you stake £800 on over 0.5 first-half goals (FHG) in a game in Korea. That’s just madness in my opinion.

Basically, a lot of these tipsters have affiliations with bookmakers: if they get people to sign up through their link, they earn X commission on losing bets. Bet365 is the main one here. So it’s really in the tipster’s interest to post losing bets, especially when they have a lot of trusting followers.

I believe that if you sign up for a Bet365 account through one of these links, your account is forever tied to that tipster.

Anyway, I’ve had my bet365 account for years so I’m clean 😉

What I’ve learned is just how enormous the number of markets is, and I’ve used this to my advantage.

What you need to decide is whether you want a slow, steady rise in account funds or a one-off big bang. I would suggest slow and steady, with smaller big bangs along the way.

When my account is looking low, I have a small strategy: find 0-0 football games in the 50th–70th minute with at least 8 shots on target between the teams, and bet about 35% of my funds on over 0.5 goals. This is usually a decent way of increasing my pot.

What I also realised quite quickly is that massive accumulators very rarely come in, so triples are the way forward here. And not silly ones: safe bets. If you’re unsure about a win, use the double chance option. The only real rule is to make sure the odds are > 1.25, else it’s hardly worth it.

Also, if you’re following an accumulator tip that includes BTTS (both teams to score) and you’re unsure, change it to over 1.5 goals. Simple.

So: betting is fun and it can make you money, but don’t put all your hard-earned money into somebody else’s hands, especially someone on the internet you’ve never met.

OpenVPN keeps disconnecting with an inactivity timeout

I’ve used OpenVPN on my Mac (via Tunnelblick) for a few years for a home-to-work VPN connection with no issues, and then suddenly today it started disconnecting me every two minutes.

The timing was suspiciously precise, so surely it couldn’t be ISP-related? But I could not find the issue anywhere. I rebooted the Mac and the router; still the same.

So, I had recently managed to get the same VPN working on a Debian-based Linux box using the command-line version of openvpn, which outputs all its logs to the terminal. I thought I’d give that a go in case it yielded any helpful information.

Anyway, the error that stuck out was this one:

Wed May 25 19:32:30 2016 ERROR: Linux route add command failed: external program exited with error status: 2
Wed May 25 19:32:30 2016 Initialization Sequence Completed
Wed May 25 19:34:30 2016 [chris] Inactivity timeout (--ping-restart), restarting
Wed May 25 19:34:30 2016 SIGUSR1[soft,ping-restart] received, process restarting

And a quick search on Google took me to this chap: http://www.drmaciver.com/2012/05/openvpn-repeatedly-losing-connections-with-inactivity-timeout/ who described the same inactivity-timeout problem.

Perhaps my Linux box had got the VPN stuck open? I hadn’t used it until just now, but let’s reboot the machine anyway (it never gets rebooted, as it’s a media server).

Lo and behold, that sorted the issue: the Debian-based machine had still been connected to the VPN, even though all the terminal windows had been closed. Naughty…
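With hindsight, a reboot was overkill; a quick check for a leftover openvpn process would have shown the culprit. A rough sketch (it assumes the process is simply named openvpn, as it is when launched from the command line):

```shell
# Check for a stray openvpn process still holding a VPN session
if pgrep -x openvpn > /dev/null; then
    echo "openvpn is still running:"
    pgrep -x openvpn
    # sudo killall openvpn    # stop it, rather than rebooting
else
    echo "no stray openvpn process"
fi
```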

Posting Slack message on Google Docs spreadsheet cell update

We’ve been using a Google Docs spreadsheet recently for logging stuff, and sometimes cells get missed. I wanted to test out a bit of Google Docs and Slack integration, so I wrote this short script to send a Slack message to a certain channel when a user updates a particular cell.

The column in question is column 14, FYI, which is why the script checks for that column and returns if it doesn’t match.

Because this script does other things, like an HTTP request, it cannot run from the simple onEdit() function; you need to create it as a separately named function and assign it as a trigger from the Resources menu. Go to Resources → Current project’s triggers and create a new one for onEdit. Just point it at your function and you’re done.

Excuse my unprofessionalism regarding HTTP response codes, but this was a simple test to see how easy it was. 1 hour. Done.

/**
 * @author Chris Tate-Davies
 * @revision 0.0.1
 *
 * 10th May 2016
 * Purpose - send a slack payload to bot-database informing users of database update requirements
**/
function ceta_db_column_edit(event){

 //get the sheet the edit happened on
 var ceta_sheet = event.source.getActiveSheet();

 //get the cell thingy
 var active_cell = ceta_sheet.getActiveCell();
 var active_row = active_cell.getRow();
 var active_column = active_cell.getColumn();
 
 //If header row then exit
 if (active_row < 2) return;
 
 //if not the db column get out
 if (active_column != 14) return;

 //get the revision
 var revision_range = ceta_sheet.getRange(active_row, 2);
 var revision_content = revision_range.getValue();

 //get the changes in the cell 
 var db_changes_range = ceta_sheet.getRange(active_row, 14);
 var db_changes_content = db_changes_range.getValue();
 
 //if its nothing then lets not bother (they're probably deleting stuff)
 if (db_changes_content == "") return;
 
 //the url to post to
 var slack_url = "https://hooks.slack.com/services/<ENTER YOUR TOKENS HERE>";

 //get the logged in user (we can only get the email, I think)
 var current_user = Session.getActiveUser().getEmail();

 //if it's blank (why?), at least put something readable in
 if (current_user == "") {
   current_user = "An unknown user";
 }
 
 //generate the payload text object
 var payload = { "text" : current_user + " has just entered text into the db field for revision " + revision_content + " - Content is: ```" + db_changes_content + "```" };

 //the URL payload
 var options = {
     "method" : "post",
     "contentType" : "application/json",
     "payload" : JSON.stringify(payload),
     "muteHttpExceptions" : true
 };

 //send that bugger
 var response = UrlFetchApp.fetch(slack_url, options);

 //we could check for response, but who cares?
}

Rename a branch in SVN

If you made a typo or a similar mistake when naming your new branch and you want to rename it, it’s very simple:

svn move -m "Rename branch" http://svn.domain.com/project/branches/old http://svn.domain.com/project/branches/new

Remember to switch any working copies of the old branch though:

svn switch http://svn.domain.com/project/branches/new


Mac OSX – Can’t eject disk (in use)

So you get that annoying “The volume can’t be ejected because it’s currently in use.” message when trying to unmount your disk. To find out what is using the disk, you can use the lsof command (LiSt Open Files):

lsof

Run it as sudo and grep for your mount point, and you should easily be able to see which application has files open (usually VLC, in my case):

sudo lsof | grep /Volumes/My_Drive

Obviously, substitute My_Drive with the mount name on your machine.
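Once lsof has pointed at the culprit, quitting the app normally is cleanest, but you can also terminate it by PID (the second column of lsof’s output). A harmless demo, using a throwaway background process in place of the real app:

```shell
# Stand-in for the app holding files open on the volume
sleep 60 &
pid=$!

# Polite SIGTERM first; 'kill -9' only as a last resort
kill "$pid"
wait "$pid" 2>/dev/null
echo "process $pid terminated"
```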

svnX – Use DiffMerge as diff tool

So, I use svnX on Mac OS X, and I find DiffMerge a much nicer application than the default FileMerge, but I couldn’t get it to load. I was getting a spurious error about a missing .sh file:

Can’t find tool ‘diffmerge.sh’ required to use DiffMerge

Anyway, I found this file in the following folder:

/Applications/DiffMerge.app/Contents/Resources/

And I simply copied it to a folder on the PATH:

cp /Applications/DiffMerge.app/Contents/Resources/diffmerge.sh /usr/local/bin/

Sorted. Don’t forget you actually have to change the application preferences.

Plex – External FAT32 USB drive

So, using a Linux Mint laptop with Plex as a low-power-consumption server seemed a good idea, until I realised the disk in the laptop was only a few hundred gig. I opted to use my backup USB disk to serve the media. It’s FAT32 and already contained hundreds of videos (as it was my backup).

I wanted it to automount and always be connected, so that Plex just sits there serving away and occasionally updates the library with new files. My mount point is /media/plexmedia/.

I used fstab to mount it, and after trying many different options, I found this to be the optimum line:

UUID=DA3C-7706 /media/plexmedia/ vfat auto,users,umask=000 0 0

Explanation of the settings:

umask=000

This clears all permission masking, so every file and directory ends up readable and writable (mode 0777) by all users and groups.

auto

Will automatically mount the drive at boot. Not strictly necessary (it’s the default), but I like to leave it there.

users

Allows the disk to be mounted by any user. I don’t want root owning it, as then Plex might not be able to read/write nicely.

Finally, the 0 0:

The first 0 tells dump that I never want the filesystem automatically backed up.
The second 0 tells fsck that I don’t want the device checked for errors at boot.
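As a sanity check on that umask value: FAT32 has no per-file permissions, so the kernel synthesises them at mount time as 0777 with the umask bits cleared. The arithmetic is easy to verify in the shell:

```shell
# Effective permissions on a vfat mount: mode = 0777 & ~umask
printf 'umask=000 -> %03o\n' "$((0777 & ~0000))"   # 777, rwx for everyone
printf 'umask=022 -> %03o\n' "$((0777 & ~0022))"   # 755, rwxr-xr-x
```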


Now, to get your UUID, you’ll need to run:

sudo blkid

Note: It’s better to use the drive’s UUID, as this doesn’t change; if you were to shuffle the disks around between physical USB ports, the /dev/sdX labels may change.

It’s important to note that you cannot change the attributes of files on a FAT32 disk in Linux, as FAT32 stores no such security metadata. The user access changes must be applied to the mount point via the mount options. A lot of people forget this, and it’s often a source of confusion.

Mysqldump all databases and gzip

So, quick and simple MySQL backup in a CRON.

#!/bin/bash
_hour=$(date '+%H')
_user="root"
_pass="thepassword"
_dest="/srv/backups/daily/"
mysqldump -u "$_user" -p"$_pass" --all-databases --ignore-table=mysql.event | gzip > "${_dest}${_hour}.sql.gz"

I’ve used this in an hourly CRON so that I always have 24 hours of backups.
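For reference, the matching crontab entry might look like this; the script path is my assumption, so save the script wherever suits you (and chmod +x it):

```shell
# crontab -e : run the backup script at the top of every hour
0 * * * * /srv/backups/mysql_backup.sh
```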

Note: ignoring mysql.event works around an annoying change in later MySQL versions, which otherwise report a notice that this table was skipped. I don’t really need it, so I ignore it.

Grep duplicate JSON keys

If you have large JSON files holding application settings, you need to be sure that each setting only appears once. Not a problem until you get to very large files, being edited manually by all sorts of people.

{
"setting_1" : "some value",
"setting_2" : "another value",
"setting_1" : "different again"
}

Run this to check for duplicate key names (uniq -d only reports adjacent duplicates, hence the sort):

grep -Po '"[a-z_0-9]+"[ ]*:' <filename> | sort | uniq -d

The above will output the duplicated setting(s), if any, to the console. Tested on Ubuntu 12.04
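A worked example, assuming a throwaway file at /tmp/settings.json (the path is just for illustration):

```shell
# Create a sample settings file with a duplicated key
cat > /tmp/settings.json <<'EOF'
{
"setting_1" : "some value",
"setting_2" : "another value",
"setting_1" : "different again"
}
EOF

# Extract the keys, sort them so duplicates sit next to each other, report them
grep -Po '"[a-z_0-9]+"[ ]*:' /tmp/settings.json | sort | uniq -d
# prints: "setting_1" :
```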