Saturday, April 02, 2011

Starting a rails project

This is just a quick reminder of how to start a standard Rails project. The assumption is that rvm is already installed on an Ubuntu machine (I'm using Natty).
  • mkdir projects
  • cd projects
  • rails new project_name -T
  • echo "rvm --create use 1.9.2@project_name" > project_name/.rvmrc
  • cd project_name
  • (agree to the use of the .rvmrc file)
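The .rvmrc step can be scripted; a minimal sketch (assuming rvm is installed, with project_name as a placeholder for your real project name):

```shell
# Sketch of the .rvmrc step above; "project_name" is a placeholder.
project=project_name
mkdir -p "$project"
# --create builds the 1.9.2@project_name gemset on first use
echo "rvm --create use 1.9.2@$project" > "$project/.rvmrc"
cat "$project/.rvmrc"
```

On the next cd into the directory, rvm prompts you to trust the new .rvmrc and switches to the gemset.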
My default Gemfile at the moment takes into account that I'll be creating a general API and will be using rspec (I tend to use rspec for integration testing too). It includes spork and autotest for a cool autotest environment.
  • (create the following Gemfile)
  • gem install bundler
  • bundle install
source 'http://rubygems.org'

gem 'rails', '3.0.1'
gem 'sqlite3-ruby', '1.2.5', :require => 'sqlite3'
gem 'gravatar_image_tag', '0.1.0'
gem 'will_paginate', '3.0.pre2'

group :development do
  gem 'rspec-rails', '2.0.1'
  gem 'annotate-models', '1.0.4'
  gem 'faker', '0.3.1'
end

group :test do
  gem 'rspec', '2.0.1'
  gem 'webrat', '0.7.1'
  gem 'spork', '0.9.0.rc4'
  gem 'libnotify', '0.3.0'
  gem 'factory_girl_rails', '1.0'
end
Add the initial state of the framework to git. Take the opportunity to create a project at GitHub and upload your public key (so that you can push up the project).
  • git init
  • git add .
  • git commit -am "Initial commit of Rails framework"
  • git remote add origin [repo-url]
  • git push origin master
Then create a project at Heroku (go to the Heroku site and sign up) - you may need to install the heroku gem (gem install heroku)
  • heroku create
  • git push heroku master
  • heroku rename project-name # you can't use _ so I tend to use - instead here
Finally, it's time to get rspec installed
  • rails generate rspec:install
Do some initial setup for the static pages:
  • remove public/index.html
  • add public/stylesheets/blueprint
  • add public/stylesheets/custom.css # see rails3tutorial
  • rails generate controller Pages home contact about
Add some autotesting
  • sudo apt-get install libnotify-bin
  • echo "require 'autotest/growl'" >> .autotest
  • spork --bootstrap
  • edit spec/spec_helper.rb to contain the contents of Listing 3.13 from the Rails3Tutorial Book
  • echo "--drb" >> .rspec
  • create /usr/local/bin/runautotest, starting with:
sleep 10s
  • sudo chmod +x /usr/local/bin/runautotest
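As a sketch, here is one way the wrapper could look and be installed (written to the current directory rather than /usr/local/bin so no sudo is needed; only the "sleep 10s" line comes from the steps above - the autotest line is an assumption):

```shell
# Install a hypothetical runautotest wrapper: pause to let the
# desktop session settle, then hand over to autotest.
cat > runautotest <<'EOF'
#!/bin/sh
sleep 10s
exec autotest
EOF
chmod +x runautotest
```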

Sunday, March 13, 2011

Git: upgrading to parent

I have a fork of a Git project. The parent had got quite a bit ahead of my local repo, so I wanted to pull in all those changes.
  • cd [child-local-dir]
  • git remote add parent [parent-url]
  • git pull parent stable
Clearly, if you want a different branch, you can specify that.

Once it's local, if you have a remote location for your repo (such as GitHub), just
  • git push
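The whole flow can be sketched end-to-end with a throwaway local repo standing in for the real [parent-url] (all names here are placeholders):

```shell
# Sketch of the fork-update flow using a local "parent" repo.
set -e
git init -q parent
(cd parent &&
 git config user.email demo@example.com &&
 git config user.name Demo &&
 echo one > file.txt && git add . && git commit -qm first)
git clone -q parent child                    # "child" plays the fork
(cd parent && echo two >> file.txt && git commit -qam second)
cd child
git remote add parent ../parent              # register the upstream remote
git config pull.ff only                      # fast-forward pulls only
branch=$(git rev-parse --abbrev-ref HEAD)
git pull -q parent "$branch"                 # bring in the upstream changes
tail -1 file.txt                             # -> two
```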

Saturday, March 12, 2011

Natty install

We've moved into Alpha3 and beyond and I've been running a distribution upgrade from Maverick for a while now. So, I took the opportunity to test a clean install this morning and it's been a huge success.

  • Downloaded the alpha3 .iso and burnt it to disc.
  • Shoved it in the drive and watched it do its stuff
  • Selected "get updates" and "install restricted drivers"
  • Before long it was installed and ready to go (am liking the new Unity interface)
Mount the various partitions. I'd previously set up my machine so that the main drive is partitioned into:
  • sda1: 50GB - Windows
  • sda2: 50GB - Linux
  • sda3: the rest - data
So, after doing a clean install onto sda2, I mounted sda3 to /home and another drive to /backup.
  • Create the folders that you want to mount the drives to
  • List your drives to make sure you know where they are: "sudo fdisk -l"
  • Test that they mount okay: "sudo mount /dev/sda3 /home"
  • Add a line to /etc/fstab: "/dev/sda3 /home ext4 defaults 0 0" (ext4 assumed - use your filesystem type)
  • Remove the previous mount: "sudo umount /home"
  • Use fstab to mount all: "sudo mount -a"
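For reference, the fstab entries for the setup above might look like this (ext4 is an assumption, and /dev/sdb1 is a hypothetical stand-in for the second drive):

```
# /etc/fstab fields: device  mountpoint  type  options  dump  pass
/dev/sda3  /home    ext4  defaults  0  0
/dev/sdb1  /backup  ext4  defaults  0  0
```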
Install some helpful programs
  • sudo apt-get install joe guake geany gdebi
Install some programs from downloaded debs: for some reason, installing them with the software centre triggers an error in alpha, so I just used gdebi to install them and everything worked fine. With ~/.chrome unaffected by the reinstall, all the previous settings were preserved.
  • chrome
  • dropbox
  • skype
Install Crashplan (ensuring that the jre from the restricted extras is installed). Adopt the old version of the machine so that a new seed of data isn't required. Copy over CrashPlanRemote (see previous blog post).

Setup apache, PHP, mysql etc.
  • sudo apt-get install php5 apache2 mysql-server phpmyadmin
Copy over /etc/hosts and the config files from sites-available and then use "sudo a2ensite blah" to enable them; follow that with a "sudo /etc/init.d/apache2 reload" and everything should be working. Import data into mysql.
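For illustration, a minimal hypothetical sites-available file (all names are placeholders - the real ones are just copied over from the old install):

```
# /etc/apache2/sites-available/example - enable with "sudo a2ensite example"
<VirtualHost *:80>
    ServerName example.local
    DocumentRoot /var/www/example
    ErrorLog ${APACHE_LOG_DIR}/example-error.log
</VirtualHost>
```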

Passwordless login is key to great workflow. Because ~ was preserved (on a partition unaffected by the reinstall), my ~/.ssh directory was unchanged. So, after /etc/hosts was reinstated, passwordless login was back up and running.

All in all - a complete reinstall in less than 2 hours!

Tuesday, February 22, 2011

Mysql backups

I have found the most wonderful tool for backing up mysql databases. It's not new... it's just that I've only found it recently!

automysqlbackup does exactly what it says on the tin - it backs up your mysql databases. There are some great how-tos, but I thought I'd post what I did on our Plesk servers so that I've got it noted down for the future!
  • sudo -s
  • mkdir /var/mysql
  • mv automysqlbackup /usr/local/bin/
  • ln -s /usr/local/bin/automysqlbackup /etc/cron.daily/automysqlbackup
  • joe /etc/crontab (edit the time that the daily cron takes place and save)
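The crontab tweak just moves the cron.daily run to a quieter hour; the relevant line might end up looking something like this (03:30 is a hypothetical choice):

```
# /etc/crontab - daily jobs are run via run-parts
30 3 * * * root cd / && run-parts --report /etc/cron.daily
```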
When setting up automysqlbackup, these are the settings I used:
### START CFG ###
# Username to access the MySQL server e.g. dbuser

# Password to access the MySQL server e.g. password

# Host name (or IP address) of MySQL server e.g localhost

# List of DBNAMES for Daily/Weekly Backup e.g. "DB1 DB2 DB3"

# Backup directory location e.g /backups

# Mail setup
# What would you like to be mailed to you?
# - log : send only log file
# - files : send log file and sql files as attachments (see docs)
# - stdout : will simply output the log to the screen if run manually.
# - quiet : Only send logs if an error occurs to the MAILADDR.

# Set the maximum allowed email size in k. (4000 = approx 5MB email [see docs])

# Email Address to send mail to?

# ============================================================
# === ADVANCED OPTIONS ( Read the docs below for details ) ===

# List of DBNAMES for Monthly Backups.

# List of DBNAMES to EXCLUDE if DBNAMES is set to all (must be in " quotes)

# Include CREATE DATABASE in backup?

# Separate backup directory and file for each DB? (yes or no)

# Which day do you want weekly backups? (1 to 7 where 1 is Monday)

# Choose Compression type. (gzip or bzip2)

# Compress communications between backup server and MySQL server?

# Additionally keep a copy of the most recent backup in a separate directory.

# The maximum size of the buffer for client/server communication. e.g. 16MB (maximum is 1GB)

# For connections to localhost. Sometimes the Unix socket file must be specified.

# Command to run before backups (uncomment to use)

# Command run after backups (uncomment to use)
### END CFG ###
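The actual values aren't reproduced above, so for illustration here is a hypothetical set of the main options (variable names as in automysqlbackup 2.5; every value is an example to substitute with your own):

```
USERNAME=dbuser
PASSWORD=password
DBHOST=localhost
DBNAMES="all"
BACKUPDIR="/var/mysql"
MAILCONTENT="quiet"
MAILADDR="admin@example.com"
```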

Plesk, Backup and Crashplan

Finally, I think I've found a good solution for Plesk and backup. For ages, I've been trying to find ways of efficiently backing up server data (www, mail, mysql) and have tried the Plesk solution, TotalBackup from 4psa, and other rsync methods. To date, they all create a massive amount of data.

TotalBackup has served us okay. It does manage a backup of the whole server and then performs daily deltas, but each week we still have a massive backup, split into 1000MB files, moving from server to server to facilitate an offsite backup solution. And, potentially, with a server going down, we're still 24 hours out of date. Add to that the restore process (which needs to knit all the parts of the backup together, apply the deltas and then unzip it, before offering the chance to actually get at your files), and we're in a position that doesn't really allow us the best backup/restore cycle.

I've been using Crashplan at home for a while now and love the product (especially after seeing what Mozy did with their pricing strategy, and Crashplan's promise not to do the same).

So, why not use the same technology with the servers... that's the plan, and that's what I've been spending the last couple of days thinking about. The plan is to backup:
  • /var/www/vhosts (where all the www data is)
  • /var/qmail/mailnames (where all the mail is)
  • /var/mysql/latest (where my latest mysql db backups are)
For more information about the mysql backup process, see the separate post.

Putty tunnels

In a previous post, I was talking about getting a headless instance of Crashplan setup on an Ubuntu server and then administrating it from a local machine (in that case, Ubuntu). However, I needed to do a similar thing with a windows machine.

Get putty: the best (maybe only) SSH client worth having on Windows.

On the support page at Crashplan, there is some discussion about how to use putty to create the remote connection via a tunnel, but either I'm daft, or it's not quite explicit enough. So, here's a little reminder about creating tunnels using putty.

Make sure that the Crashplan desktop client is not open (the tray icon can continue to run in the background).

Change the config
  • edit C:\Program Files\CrashPlan\conf\
  • uncomment the servicePort line and make sure it reads servicePort=4200
  • save the file
NB. I've just found Notepad++, which is a great little replacement for notepad.

Create the tunnel
  • open putty
  • in the first screen, enter the server details (as you would if just SSHing normally)
  • now in the left hand column, under "Connection" and then "SSH" is a tab helpfully titled "Tunnels"
  • in the source port box, type 4200
  • in the destination box, type localhost:4243
  • click Add (don't forget to do this bit!)
  • finally, click Open
  • log in with your normal user credentials
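For reference, the OpenSSH command-line equivalent of the putty tunnel above would be something like this (user@server is a placeholder):

```
# Local port 4200 forwards to port 4243 on the server's loopback
ssh -L 4200:localhost:4243 user@server
```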
Check that the port is open
  • Start -> Run
  • cmd
  • telnet localhost 4200 (you should not get a message about a failed connection)
Open Crashplan - and you should get a remote connection over an SSH tunnel.

Once you're finished, change back the config file and close the putty session which will close the tunnel.

Missing desktop notification link

When you visit Gmail, you should get a helpful hint at the top of the page that relates to the new desktop notifications (which are very cool).

"Click here to enable desktop notifications in Google".

However, while there are links to "Learn more" and "Hide", there is no actual link to set up notifications. There *should be* a link to the settings page where you can actually set them up.

in reference to: Desktop notifications in Chrome - Gmail Help (view on Google Sidewiki)