Setting up an Ubuntu development VM: Scripted

Having seen this blog post about setting up a development Linux VM in a recent Morning Brew, I had to have a shot at doing it all in a script instead, since it looked like an awful lot of hard work to do it manually.

The post I read covers downloading and installing VirtualBox (which could be scripted also, using the amazing Chocolatey) and then installing Ubuntu, logging in to the VM, and downloading and installing Chrome, Sublime Text 2, MongoDB, Robomongo, Node.js, npm, nodemon, and mocha.

Since all of this can be handled via apt-get and a few other cunning configs, here’s my attempt using Vagrant. First, vagrant init a directory, then paste the contents shown below into the generated Vagrantfile.
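If you’ve not used Vagrant before, getting to that point is just a couple of commands (a rough sketch – it assumes Vagrant and VirtualBox are already installed, and the directory name is only an example):

[bash]
mkdir ubuntu-dev-vm
cd ubuntu-dev-vm
vagrant init
[/bash]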

Vagrantfile

[bash]
Vagrant.configure(2) do |config|

config.vm.box = "precise32"
config.vm.box_url = "http://files.vagrantup.com/precise32.box"

end
[/bash]

Setup script

Now create a new file in the same dir as the Vagrantfile (since this directory is automatically configured as a shared folder, saving you ONE ENTIRE LINE OF CONFIGURATION), calling it something like set_me_up.sh. I apologise for the constant abuse of > /dev/null – I just liked having a clear screen sometimes:

[bash]#!/bin/sh

clear
echo "******************************************************************************"
echo "Don’t go anywhere – I’m going to need your input shortly.."
read -p "[Enter to continue]"

### Set up dependencies
# Configure sources & repos
echo "** Updating apt-get"
sudo apt-get update -y > /dev/null

echo "** Installing prerequisites"
sudo apt-get install libexpat1-dev libicu-dev git build-essential curl software-properties-common python-software-properties -y > /dev/null

### deal with interactive stuff first
## needs someone to hit "enter"
echo "** Adding a new repo ref – hit Enter"
sudo add-apt-repository ppa:webupd8team/sublime-text-2

echo "** Creating a new user; enter some details"
## needs someone to enter user details
sudo adduser developer

echo "******************************************************************************"
echo "OK! All done, now it’s the unattended stuff. Go make coffee. Bring me one too."
read -p "[Enter to continue]"

### Now the unattended stuff can kick off
# For mongo db – http://docs.mongodb.org/manual/tutorial/install-mongodb-on-ubuntu/
echo "** More prerequisites for mongo and chrome"
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10 > /dev/null
sudo sh -c 'echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | sudo tee /etc/apt/sources.list.d/mongodb.list' > /dev/null
# For chrome - http://ubuntuforums.org/showthread.php?t=1351541
# (this adds Google's signing key; you may also need to add the Google Chrome deb repository to your sources for the install further down to resolve)
wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | sudo apt-key add -

echo "** Updating apt-get again"
sudo apt-get update -y > /dev/null

## Go, go, gadget installations!
# chrome
echo "** Installing Chrome"
sudo apt-get install google-chrome-stable -y > /dev/null

# sublime-text
echo "** Installing sublimetext"
sudo apt-get install sublime-text -y > /dev/null

# mongo-db
echo "** Installing mongodb"
sudo apt-get install mongodb-10gen -y > /dev/null

# desktop!
echo "** Installing ubuntu-desktop"
sudo apt-get install ubuntu-desktop -y > /dev/null

# node – the right(?) way!
# http://www.joyent.com/blog/installing-node-and-npm
# https://gist.github.com/isaacs/579814

echo "** Installing node"
echo 'export "PATH=$HOME/local/bin:$PATH"' >> ~/.bashrc
. ~/.bashrc
mkdir ~/local
mkdir ~/node-latest-install
cd ~/node-latest-install
curl http://nodejs.org/dist/node-latest.tar.gz | tar xz --strip-components=1
./configure --prefix=~/local
make install

# other node goodies
sudo npm install nodemon > /dev/null
sudo npm install mocha > /dev/null

## shutdown message (need to start from VBox now we have a desktop env)
echo "******************************************************************************"
echo "**** All good – now quitting. Run *vagrant halt* then restart from VBox to go to desktop ****"
read -p "[Enter to shutdown]"
sudo shutdown 0
[/bash]

The gist is here, should you want to fork and edit it.

You can now open a prompt in that directory and run
[bash]
vagrant up && vagrant ssh
[/bash]
which will provision your VM and ssh into it. Once connected, just execute the script by running:
[bash]
. /vagrant/set_me_up.sh
[/bash]

(/vagrant is the shared directory created for you by default)
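As an aside: if the interactive steps (the PPA prompt and the adduser call) were stripped out or automated, the same script could be wired in as a Vagrant shell provisioner so it runs as part of vagrant up. A minimal sketch of the extra line, placed inside the Vagrant.configure block:

[bash]
config.vm.provision "shell", path: "set_me_up.sh"
[/bash]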

Nitty Gritty

Let’s break this up a bit. First up, I decided to group together all of the apt-get configuration so I didn’t need to keep calling apt-get update after each reconfiguration:

[bash]
# Configure sources & repos
echo "** Updating apt-get"
sudo apt-get update -y > /dev/null

echo "** Installing prerequisites"
sudo apt-get install libexpat1-dev libicu-dev git build-essential curl software-properties-common python-software-properties -y > /dev/null

### deal with interactive stuff first
## needs someone to hit "enter"
echo "** Adding a new repo ref – hit Enter"
sudo add-apt-repository ppa:webupd8team/sublime-text-2
[/bash]

Then I decided to set up a new user, since you will be left with either the vagrant user or a guest user once this script has completed; and the vagrant one doesn’t have a desktop/home nicely configured for it. So let’s create our own one right now:

[bash]
echo "** Creating a new user; enter some details"
## needs someone to enter user details
sudo adduser developer

echo "******************************************************************************"
echo "OK! All done, now it’s the unattended stuff. Go make coffee. Bring me one too."
read -p "[Enter to continue]"
[/bash]

Ok, now the interactive stuff is done, let’s get down to the installation guts:

[bash]
### Now the unattended stuff can kick off
# For mongo db – http://docs.mongodb.org/manual/tutorial/install-mongodb-on-ubuntu/
echo "** More prerequisites for mongo and chrome"
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10 > /dev/null
sudo sh -c 'echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | sudo tee /etc/apt/sources.list.d/mongodb.list' > /dev/null
# For chrome - http://ubuntuforums.org/showthread.php?t=1351541
# (this adds Google's signing key; you may also need to add the Google Chrome deb repository to your sources for the install further down to resolve)
wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | sudo apt-key add -

echo "** Updating apt-get again"
sudo apt-get update -y > /dev/null
[/bash]

Notice the URLs in there referencing where I found out the details for each section.

The only reason these config sections are not at the top with the others is that they can take a WHILE and I don’t want the user to have to wait too long before creating a user and being told they can go away. Now we’re all configured, let’s get installing!

[bash]
## Go, go, gadget installations!
# chrome
echo "** Installing Chrome"
sudo apt-get install google-chrome-stable -y > /dev/null

# sublime-text
echo "** Installing sublimetext"
sudo apt-get install sublime-text -y > /dev/null

# mongo-db
echo "** Installing mongodb"
sudo apt-get install mongodb-10gen -y > /dev/null

# desktop!
echo "** Installing ubuntu-desktop"
sudo apt-get install ubuntu-desktop -y > /dev/null
[/bash]

Pretty easy so far, right? ‘Course it is. Now let’s install node.js on Linux the – apparently – correct way; it certainly worked better for me than just apt-getting it.

[bash]
# node – the right(?) way!
# http://www.joyent.com/blog/installing-node-and-npm
# https://gist.github.com/isaacs/579814

echo "** Installing node"
echo 'export "PATH=$HOME/local/bin:$PATH"' >> ~/.bashrc
. ~/.bashrc
mkdir ~/local
mkdir ~/node-latest-install
cd ~/node-latest-install
curl http://nodejs.org/dist/node-latest.tar.gz | tar xz --strip-components=1
./configure --prefix=~/local
make install
[/bash]

Now let’s finish up with a couple of nodey lovelies:
[bash]
# other node goodies
sudo npm install nodemon > /dev/null
sudo npm install mocha > /dev/null
[/bash]
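Note that without -g those two install locally to whichever directory you ran them from, so the executables end up under that directory’s node_modules/.bin (add -g if you’d rather have nodemon and mocha on the PATH everywhere). For a local install, usage looks something like this (app.js being whatever your entry point is):

[bash]
./node_modules/.bin/nodemon app.js   # restart the app whenever a file changes
./node_modules/.bin/mocha            # run the test suite
[/bash]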

All done! Then it’s just a case of vagrant halt-ing the VM and restarting it from VirtualBox (or editing the Vagrantfile to boot to the GUI – see the sketch below); you’ll be booted into an Ubuntu desktop login. Use the newly created user to log in and BEHOLD THE AWE.
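For the Vagrantfile route, the GUI option looks roughly like this – a sketch of a VirtualBox provider block which I haven’t added to the Vagrantfile above (it goes inside the Vagrant.configure block):

[bash]
config.vm.provider "virtualbox" do |vb|
  # boot the VM with the VirtualBox GUI instead of headless
  vb.gui = true
end
[/bash]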

Enough EPICNESS, now the FAIL…

Robomongo Fail 🙁

The original post also installs Robomongo for MongoDB administration, but I just couldn’t get that running from a script. Booo! Here’s the script that should have worked; please have a crack and try to sort it out! qt5 fails to install for me (my guess is that the precise repos simply don’t carry the Qt5 packages), which then causes everything else to bomb out.

[bash]
# robomongo
INSTALL_DIR=$HOME/opt
TEMP_DIR=$HOME/tmp

# doesn’t work
sudo apt-get install -y git qt5-default qt5-qmake scons cmake

# Get the source code from Git. Perform a shallow clone to reduce download time.
mkdir -p $TEMP_DIR
cd $TEMP_DIR
sudo git clone --depth 1 https://github.com/paralect/robomongo.git

# Compile the source.
sudo mkdir -p robomongo/target
cd robomongo/target
sudo cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=$INSTALL_DIR
make
make install

# As of the time of this writing, the Robomongo makefile doesn’t actually
# install into the specified install prefix, so we have to install it manually.
mkdir -p $INSTALL_DIR
mv install $INSTALL_DIR/robomongo
mkdir -p $HOME/bin
ln -s $INSTALL_DIR/robomongo/bin/robomongo.sh $HOME/bin/robomongo

# Clean up.
rm -rf $TEMP_DIR/robomongo
[/bash]

Not only is there the gist, but the whole shebang is over on github too.

ENJOOOYYYYY!

Node.js 101: Wrap up

Year of 101s, Part 1 – Node January

Summary – What was it all about?

I set out to spend January learning some node development fundamentals.

Part #1 – Intro

I started with a basic intro to using node – a Hello World – which covered what node.js is, how to create the most basic of all programs, and mentioned some of the development environments.

Part #2 – Serving web content

Second was creating a very simple node web server, which covered using nodemon to develop your node app, the concept of exports, basic request routing, and serving various content types.

Part #3 – A basic API

Next was a simple API implementation, where I proxied calls to the Asos API, returned a remapped subset of the data, reworked the routing to create basic search functionality and a detail page, and touched on passing in command-line arguments.

Part #4 – Basic deployment and hosting with Appharbor, Azure, and Heroku

Possibly the most interesting and fun post for me to work on involved deploying the node code on to three cloud hosting solutions, where I discovered the oddities of each provider, various solutions to the problems those raise, and some debugging cleverness (nice work, Heroku!). The simplicity of a git-remote-push-deploy process is incredible, and really makes quick application development and hosting even more enjoyable!

Part #5 – Packages

Another interesting one was getting to play with node packages, the node package manager (npm), the express web framework, jade templating engine, and stylus css pre-processor, and deploying node apps with packages to cloud hosting.

Part #6 – Web-based development

The final part covered the fantastic Cloud9IDE, including a (very) basic intro to github, and how Cloud9 can still be used in developing and deploying directly to Azure, Appharbor, or Heroku.

What did I get out of it?

I really got into githubbing and OSSing, and had to try hard not to overstretch myself, as I had started forking repos to try and make a few tweaks to things whilst working on the node month.

It has been extremely inspiring and has opened up so many other random tangents for me to explore in other projects at some other time. Very motivating stuff.

I’ve now got a month of half decent blog posts – I had only intended to do a total of 4 posts but including this one I’ve done 7, since I kept adding more information as it turned up and needed to split a few posts into two.

I’ve also learned a bit about blogging; writing posts well in advance allowed me to build up the details as I discovered more whilst working on subsequent posts – for example, how Appharbor and Azure initially track master but can be configured to track different branches. Similarly, debugging with Heroku only came up whilst working with packages on Heroku.

Link list

Node tutorials and references

Setting up a node development environment on Windows
Node Beginner – a great article, and I’ve also bought the associated eBooks.
nodejs.org – the official node site, the only place to go for reference

Understanding Javascript better

Execution in The Kingdom of Nouns
Object Orientation and Inheritance in Javascript

Appharbor

Appharbor and git

Heroku

Heroku toolbelt download and reference
node on Heroku

Azure

Checkout what Azure can do!

February – coming up, Samsung Smart TV App Development!

Yeah, seriously. How random is that?.. 🙂

Node.js 101: Part #5 – Packages

Following on from my recent post about doing something this year, I’m committing to doing 12 months of “101”s; posts and projects themed at beginning something new (or reasonably new) to me.

January is all about node, and I started with a basic intro, then cracked open a basic web server with content-type manipulation and basic routing, created a basic API, before getting stuck into some great deployment and hosting solutions.

Node Packages

Up until now I’ve been working with node using only the basic code I’ve written myself. What if you want to create an application that uses websockets? Or how about a Sinatra-inspired web framework to shortcut the routing and request handling I’ve been writing? Maybe you want an easy-to-build website with a nice look, without writing raw HTML or any CSS? Like coffeescript? mocha? You gaddit.

Thanks to the node package manager you can easily import pre-built packages into your project to do alllll of these things and loads more. This command-line tool (which used to be separate but is now part of the node install itself) installs packages in a Ruby gems-esque / .NET NuGet fashion, pulling down all the dependencies automatically.

Example usage:
[code]npm install express -g[/code]

The packages (mostly plain JavaScript, though some ship native addons compiled on install, like node itself) are pulled either into your working directory (a local node_modules folder) or installed globally with the “-g” parameter, which is generally used for command-line tools. You then reference the locally installed packages in your code using require.
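As a rough sketch, reusing the express example from above and assuming it has been npm installed into a local node_modules folder:

[js]
// express is resolved from ./node_modules by require
var express = require('express');

var app = express();

app.get('/', function (req, res) {
  res.send('hello from an npm-installed package');
});

app.listen(3000);
[/js]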

Or you can install everything your project needs at once by creating a package.json e.g.:
[code]{
  "name": "basic-node-package",
  "version": "0.0.1",
  "dependencies": {
    "express": "*",
    "jade": "*",
    "stylus": "*",
    "nib": "*"
  }
}[/code]

And then call [code]npm install[/code]

A great intro to using these four packages can be found on the clock website

I’ve decided to write a wrapper for my basic node API using express, jade, stylus, and nib. All I’m doing is calling the API and displaying the results on a basic page. The HTML is written in jade and the CSS in stylus & nib; routing is handled by express.

app.js
[js]var express = require('express')
  , stylus = require('stylus')
  , nib = require('nib')
  , proxy = require('./proxy')

var app = express()

// compile .styl into css using stylus + nib
function compile(str, path) {
  return stylus(str)
    .set('filename', path)
    .use(nib())
}

app.set('views', __dirname + '/views')
app.set('view engine', 'jade')
app.use(express.logger('dev'))
app.use(stylus.middleware(
  { src: __dirname + '/public'
  , compile: compile
  }
))
app.use(express.static(__dirname + '/public'))

var host = 'rposbo-basic-node-api.azurewebsites.net';

app.get('/products/:search/:key', function (req, response) {
  console.log("Request handler 'products' was called");

  var requestPath = '/products/' + req.params.search + '?key=' + req.params.key;

  proxy.getRemoteData(host, requestPath, function(json){
    var data = JSON.parse(json);

    response.render('products',
      {
        title: 'Products for ' + data.category,
        products: data.products,
        key: req.params.key
      }
    );
  })
});

app.get('/product/:id/:key', function (req, response) {
  console.log("Request handler 'product' was called");

  var requestPath = '/product/' + req.params.id + '?key=' + req.params.key;

  proxy.getRemoteData(host, requestPath, function(json){
    var data = JSON.parse(json);

    response.render('product',
      {
        title: data.title,
        product: data
      }
    );
  })
});

app.get('/', function (req, response) {
  console.log("Request handler 'index' was called");
  response.end("Go");
});

app.listen(process.env.PORT);
[/js]

So that file sets up the express, jade, and stylus references and wires up the routes for /products/ and /product/, which then call the API using my old proxy.js; I could probably do all of this with a basic inline http.get, but I’m just reusing it for the time being.

Notice how the route “/products/:search/:key” which would actually be something like “/products/jeans/myAp1k3Y” is referenced using req.params.search and req.params.key.

Then all I’m doing is making the API call, parsing the returned JSON and passing that parsed object to the view.

The views are written in jade and have a main shared one:
layout.jade
[code]!!! 5
html
  head
    title #{title}
    link(rel='stylesheet', href='/stylesheets/style.css')
  body
    header
      h1 basic-node-packages
    .container
      .main-content
        block content
      .sidebar
        block sidebar
    footer
      p Running on node with Express, Jade and Stylus[/code]

Then the route-specific ones:

products.jade:
[code]extends layout
block content
  p
    each product in products
      li
        a(href='/product/' + product.id + '/' + key)
          img(src=product.image)
          p
            =product.title[/code]

and

product.jade:
[code]extends layout
block content
  p
    img(src=product.image)
    li= product.title
    li= product.price[/code]

The stylesheet is written in stylus & nib:

style.styl
[css]/*
 * Import nib
 */
@import 'nib'

/*
 * Grab a custom font from Google
 */
@import url('http://fonts.googleapis.com/css?family=Quicksand')

/*
 * Nib provides a CSS reset
 */
global-reset()

/*
 * Store the main color and
 * background color as variables
 */
main-color = #fa5b4d
background-color = #faf9f0

body
  font-family 'Georgia'
  background-color background-color
  color #444

header
  font-family 'Quicksand'
  padding 50px 10px
  color #fff
  font-size 25px
  text-align center

  /*
   * Note the use of the `main-color`
   * variable and the `darken` function
   */
  background-color main-color
  border-bottom 1px solid darken(main-color, 30%)
  text-shadow 0px -1px 0px darken(main-color, 30%)

.container
  margin 50px auto
  overflow hidden

.main-content
  float left

  p
    margin-bottom 20px

  li
    width 290px
    float left

    p
      line-height 1.8

footer
  margin 50px auto
  border-top 1px dotted #ccc
  padding-top 5px
  font-size 13px[/css]

And this is compiled into plain, browser-friendly CSS by the stylus middleware when the app runs.

The other files used:

proxy.js:
[js]var http = require('http');

function getRemoteData(host, requestPath, callback){

  var options = {
    host: host,
    port: 80,
    path: requestPath
  };

  var buffer = '';
  var request = http.get(options, function(result){
    result.setEncoding('utf8');

    result.on('data', function(chunk){
      buffer += chunk;
    });

    result.on('end', function(){
      callback(buffer);
    });
  });

  request.on('error', function(e){ console.log('error from proxy call: ' + e.message); });
  request.end();
};
exports.getRemoteData = getRemoteData;[/js]

package.json
[js]{
  "name": "basic-node-package",
  "version": "0.0.1",
  "dependencies": {
    "express": "*",
    "jade": "*",
    "stylus": "*",
    "nib": "*"
  }
}[/js]

web.config
[xml]<configuration>
  <system.web>
    <compilation batch="false" />
  </system.web>
  <system.webServer>
    <handlers>
      <add name="iisnode" path="app.js" verb="*" modules="iisnode" />
    </handlers>
    <iisnode loggingEnabled="false" />

    <rewrite>
      <rules>
        <rule name="myapp">
          <match url="/*" />
          <action type="Rewrite" url="app.js" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>[/xml]

All of these files are, as usual, on Github

Deployment with Packages

Something worth bearing in mind: deploying an app which includes packages and their output (e.g. minified JS, or CSS compiled from .styl) requires all of those artifacts to be added to your git repo before deploying to certain hosts such as Appharbor and Azure. Heroku, I believe, will actually run npm install as part of the deployment step and compile the .styl into .css, unlike Azure/Appharbor.
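In practice, for Azure/Appharbor that means committing things you might otherwise .gitignore before pushing – something along these lines (the remote name is just an example, and the css path matches this project’s layout):

[bash]
git add node_modules public/stylesheets/style.css
git commit -m "include packages and compiled css for Azure/Appharbor deployment"
git push azure master
[/bash]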

The files above give a very basic web interface to the /products/ and /product/ routes:
[screenshot: asos-jade-products-1]

[screenshot: asos-jade-product-1]

Coming up

Web-based node development and deployment!

Node.js @ UKWAUG: MS Cloud Day – Windows Azure Spring Release

The fourth session I attended was a highly energetic and speedy introduction to writing node.js and running it on Azure, presented by the author of Simple.Data and Simple.Web, one of the voices of the developer community with a great JFDI attitude: Mark Rendle (@markrendle).

I’ve just recently got into node.js development myself and have been very much enjoying node, npm, express, stylus, and nib; there is a fantastic community and an expanse of modules already, which can be a bit daunting.

During the session, Mark’s short code example showed just how simple it can be to get up and running with node, and also how easy it is to deploy to Azure.

A nice comment was that we are on the road to “ecmascript harmony”! And that “Javascript is a great language so long as you ignore the 90% of it which coffeescript doesn’t compile to.”

It was a very fast-paced session; hopefully my notes still make sense though..

What the various aspects of Azure do

  • compute – web, worker, vm
  • websites – .net, node, php
  • storage – blob, tables (distributed nosql, like cassandra), queues
  • sql – sql azure, reporting
  • services – servicebus, caching, acs

What are the Cloud Service types used for

  • web roles – iis, for apps
  • worker – no iis, for running anything

How to peruse the contents of blob or table

General tips for developing sites for use in Azure

  • keep static content in blob storage
  • websites commit and deploy much faster than the cloud services commit-and-deploy process
  • azure/iis needs server.js, not app.js (see the sketch below)
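That last point as a bare-bones sketch (my reading of it – iisnode hands the listening port in via process.env.PORT):

[js]
// server.js – minimal entry point for Azure/iisnode
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Azure\n');
}).listen(process.env.PORT || 3000);
[/js]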

How to run RavenDB in Azure

  • Spin up a vm and install it!! (this used to be a much trickier process, but the recent Azure update meant that the VM support is mature enough to allow the simpler solution)

Developing node.js

Use JetBrains WebStorm for debugging, or the wonderful online editor, Cloud9IDE. Sublime Text 2 is a great editor for simpler code requirements and has great plugins for JavaScript support; I also used it for taking all of these seminar notes, as it has a simple “zen” zero-distractions interface.

Next up – Hadoop and High Performance Computing