Archive for July, 2009

Scala Continuous Testing with sbt

July 27, 2009

I’ve recently had occasion to start an open source project, and the correct tool for the job appears to be Scala.

So far the project is going well, but the pain has been around the build and IDE support for rapid and convenient development in Scala. Although all three of the major IDEs I’ve worked with recently (Eclipse, IntelliJ IDEA and Netbeans) have plugins for Scala, they are all early releases, and have various degrees of pain associated with them.

I ended up using Netbeans for editing and as a subversion client, then building with Maven when I wanted to compile and/or run tests. Calling Maven from within Netbeans to build a Scala project is still a bit creaky, so I was doing it from a terminal window directly.

This is very inconvenient, for a number of reasons. First, I’m working in a Behaviour-Driven Development mode, using specs as my BDD framework. This means I first write a specification in specs, see it fail, then write the code necessary to make it pass, then write (or extend) the next specification for the next behaviour I want, and so forth.
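
For the curious, a specification in specs looks something like this (a minimal sketch only; the Greeter class and the behaviour described here are made up for illustration, not taken from my project):

import org.specs._

// Hypothetical example: the spec is written first, watched to fail,
// and then Greeter is implemented to make it pass.
object GreeterSpec extends Specification {
  "A Greeter" should {
    "greet a user by name" in {
      new Greeter().greet("world") must_== "Hello, world"
    }
  }
}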

When I wanted to run a test, I had to flip to a terminal window and issue a Maven command to build and run the specified test, something like this:


mvn -Dtest=foo test

In order to make this work I had to declare my specs as JUnit tests (with the @Test annotation), even though they don’t use anything else from JUnit. This felt like a bit of a hack, albeit a useful one. Another pain point was the startup time for Maven (although I understand there’s a “console” plugin for Maven as well that can perhaps reduce this particular pain).

As I like to tinker with new stuff, I thought I’d make a departure from Maven and give sbt a try. sbt is a build tool written in Scala that supports building Scala, Java and mixed projects in a very simple way. Unlike Ant, there’s no up-front pain in writing a build script, as sbt makes reasonable assumptions (which you can override) about where to find your classes and libraries, so you hit the ground running.
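
To give a sense of how little is needed (hedging a bit, since sbt’s conventions have changed between releases), the project definition in the sbt of this vintage is just a Scala class dropped under project/build/, and an essentially empty one is enough to pick up the default Maven-style layout of src/main/scala, src/test/scala and lib/:

// project/build/MyProject.scala -- a minimal sketch for the 0.5.x-era sbt I'm using here;
// the class name is arbitrary, and newer sbt releases configure things differently.
import sbt._

class MyProject(info: ProjectInfo) extends DefaultProject(info) {
  // DefaultProject supplies the standard layout and tasks by convention,
  // so nothing needs to be declared to get started.
}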

In literally seconds I was up and running after following the install instructions on the sbt site. After a bit of experimenting I found the “console” mode in sbt, where you launch sbt and leave it running.

Once in console mode you can either just type “test” every time you want to build and run all the tests, or be more selective and run only the tests that failed last time, or just a single specified test if you’re working on only one feature. All of these operations are fast – mostly because sbt is already loaded and running, but also because sbt does a bit less work than Maven does on every build.
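
Concretely, a console session just alternates code changes with short commands like these (SomeTest is a placeholder for one of your own spec classes, and I’m writing test-failed from memory for the “only the failures” action, so verify the exact names against sbt’s built-in help):

> test
> test-failed
> test SomeTest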

Although sbt can be configured to work in conjunction with Ivy or Maven repositories, you can also just drop your dependency libs into the “lib” directory in your project. For open source this is rather nice, as it saves users of the project the trouble of trying to find them. Even supplying a Maven pom that specifies the repositories from which to download your dependencies is no guarantee, as repositories change over time. Many is the time I’ve gone to download a dependency (or rather, Maven has gone to do it for me), only to find it’s not where it used to be, has a different name or version, or some other problem causes my build to fail. Like Ant, sbt can avoid this problem by keeping dependencies locally. Unlike Ant, it can also fetch the dependencies for you the first time from the same repositories Maven uses – perhaps giving you the best of both worlds in some situations.
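
And if you would rather have sbt fetch a dependency for you than drop the jar into lib/, a one-line declaration in the project definition does it. Something like the following (the specs coordinates and version shown are illustrative only – check the current ones before copying):

// Hypothetical managed dependency plus an extra repository, in 0.5.x-era sbt syntax.
import sbt._

class MyProject(info: ProjectInfo) extends DefaultProject(info) {
  // "at" registers an additional Maven-style repository to resolve against.
  val scalaToolsReleases = "Scala-Tools Releases" at "http://scala-tools.org/repo-releases"
  // groupId % artifactId % version % configuration, much like a Maven dependency.
  val specs = "org.scala-tools.testing" % "specs" % "1.6.0" % "test"
}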

Even more interesting was the command


~ test

which runs all the tests, then waits for any source code (test or main) to change. When it sees a change, it runs all the tests again (after compiling the changes, of course). Poor man’s continuous testing 🙂

Wait, it gets even awesomer! When you say


~ test SomeTest

sbt will wait for any changes, then run just the specified test. This is ideal when you know you’re working on a specific piece of functionality (and therefore affecting only a single test). When sbt is waiting, you can hit any key to return to interactive mode, so it’s easy to switch between the two modes.

Other commands in sbt are also very familiar and quick, such as “compile”, which does exactly what you’d expect from the name. “package” is another good one – it produces a jar artifact, just like the Maven command of the same name. I haven’t yet tried its deploy mechanisms properly, but early results look promising.

I also like the “console” command, which runs the Scala command-line console, but with your project on the classpath, along with all its dependencies. This lets you run ad-hoc statements quickly and easily, and see the results right away. When I’m not sure what’s going on with a failing spec, I’ve found this mode very helpful for experimenting. Scala is such an expressive language that I can write a quick experiment in one or two lines of code, see the result (as the Scala console also evaluates expressions by default), and go back to coding and testing, all without restarting sbt. Quite nice, and somewhat reminiscent of the similar functionality in Rails and “irb” (and JRuby’s equivalent, jirb).
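
For example, a throwaway experiment in the console might look like this (the output shown is roughly what the Scala REPL of this era prints; the details vary by version):

scala> List("a", "bb", "ccc").map(_.length)
res0: List[Int] = List(1, 2, 3)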

There are many other things I’ve found about sbt that I like so far, but those are topics for another post later on….

By: Mike Nash


The Corporate Culture of Post-it Notes

July 8, 2009

Ahh, the ubiquitous Post-it® Note. My workspace is covered with lovely multi-coloured notes, or it was until I discovered Digital Notes. The story of the unassuming Post-it has become a legend, but I wanted to share it again, as told in The Knowledge-Creating Company by Nonaka and Takeuchi.


“Art [Fry] sang in the church  choir and noticed that the slips of paper he inserted to mark selected hymns would fall out.  He decided to create a marker that would stick to the page but would peel off without damaging it.  He made use of a peel-able adhesive that Spence Silver at the Central Research Lab had developed four years previously, and made himself some prototypes of the self-attaching sheets of paper.

Sensing a market beyond just hymnal markers, Fry got permission to use a pilot plant and started working nights to develop a process for coating Silver’s adhesive on paper. When he was told that the machine he designed could take six months to make and cost a small fortune, he single-handedly built a crude version in his own basement overnight and brought it to work the next morning. The machine worked. But the marketing people did some surveys with potential customers, who said they didn’t feel the need for paper with a weak adhesive. Fry said, “Even though I felt that there would be demand for the product, I didn’t know how to explain it in words. Even if I found the words to explain, no one would understand…” Instead, Fry distributed samples within 3M and asked people to try them out. The rest was history. Post-it Notes became a sensation thanks to Art Fry’s entrepreneurial dedication and dogged persistence.

(Nonaka I, Takeuchi H. The Knowledge-Creating Company: How Japanese Companies Create the Dynamics of Innovation. 1995)

That entrepreneurial spirit has been part of 3M’s corporate culture almost since inception.  As stated in the William L. McKnight Management Principles:

“As our business grows, it becomes increasingly necessary to delegate responsibility and to encourage men and women to exercise their initiative. This requires considerable tolerance. Those men and women, to whom we delegate authority and responsibility, if they are good people, are going to want to do their jobs in their own way.

“Mistakes will be made. But if a person is essentially right, the mistakes he or she makes are not as serious in the long run as the mistakes management will make if it undertakes to tell those in authority exactly how they must do their jobs.

“Management that is destructively critical when mistakes are made kills initiative. And it’s essential that we have many people with initiative if we are to continue to grow.”

Chris touched on this when he blogged about Failing Should be Easy and even Why don’t people like my ideas?! Art Fry was given the opportunity to fail. When people didn’t like his idea, he proceeded to find a way to prove that his idea truly was great.

I’m proud that we here at Point2 allow people to fail; in fact, using Test Driven Development we ensure that everyone fails at first.

We give people time to explore and experiment, and everyone has some time for professional development.

Incidentally, Ken Schwaber’s early paper, “SCRUM Development Process”, draws heavily on the work of Takeuchi and Nonaka and their description of a rugby-style approach to organizing teams.

By: Kevin Bitinsky

Linux Certification Study Group

July 1, 2009

Why

Since the day I joined Point2, I couldn’t help noticing that we were a Microsoft solutions company in a period of migration towards more open source solutions and Linux.

One thing that was clear to me was that we would have to increase the level of Linux knowledge we had in the company.

What

I decided to start a Linux certification study group with six people from three different departments. Why such a small number of participants, you may ask? Shhh… I had a hidden agenda. My intention in putting this group together wasn’t only to help them get the Linux certification; it was also a golden opportunity to get them to collaborate, talk, and learn from each other. After more than three months of studying as a group, I can safely say that we do have a bond around Linux.

The Certification

For the study group we chose the Junior Level Linux Professional (LPIC-1) certification from the Linux Professional Institute. The LPI program is very well structured, with three levels of expertise (junior, advanced and senior), and it is about as distribution-agnostic as a certification can be.

The certification comprises 2 exams: 101 and 102, with 60 questions per exam.

An LPI-certified Level 1 Linux professional (designated LPIC-1) should have the technical capability to maintain and use a Linux system at the entry level. Such a person:

  • Has installed, maintained and configured a Linux system
  • Works at the Linux command line
  • Performs easy maintenance tasks, such as:
    • helping out users
    • managing data
    • adding, managing and deleting users
    • creating basic shell scripts
    • shutting down, rebooting and understanding the components of a Linux system
  • Installs and configures a workstation (including X) and connects it to a LAN

Cost

Per student, Point2 pays for:

Per facilitator, Point2 covers the cost for:

How

We meet once a week during the time the company offers for self-development (Fridays between 13:00 and 15:00). During each session we cover one or two objectives by going through the material as a group. As the facilitator, I prepare some exercises that we try to solve during the session. Having a well-structured set of topics helps keep the sessions focused. On several occasions we got distracted by other interesting Linux topics, but we quickly realized we were getting sidetracked and returned to the topics we would be tested on.

Usually on Wednesdays I circulate an email with 12 refresher exercises from previous weeks. That pushes us all to keep the study material fresh in our minds. We meet for 20 minutes during our lunch break every Wednesday and exchange the solutions of the previous week’s refresher exercises.

We deliberately covered the four topics in the following order:

  • Topic 103: GNU and Unix Commands
  • Topic 104: Devices, Linux Filesystems, Filesystem Hierarchy Standard
  • Topic 102: Linux Installation and Package Management
  • Topic 101: System Architecture

so that our candidates became familiar with the Unix commands quickly. I think it’s more fun to be able to run commands from day one than to start with the theory of the system architecture.

As for computers, we’ve been using all sorts: PCs with a connection to a Linux VM, a netbook running Linux (Ubuntu), laptops with Mac OS X (running a Fedora VM); we even built two cheap PCs especially for the course. Those are fun because you can do anything you like with them 🙂 Hardware should not be an impediment to starting your own course; anything will do to keep you going.

When

It has taken us four months to prepare for the first exam (101), and we are going to take four months “off” before we tackle the next exam (102), which we expect to take another four months of preparation. In other words, it takes a good year to get the certification.

We will interleave the groups: whilst the first group is relaxing, we’ll take the next group through preparation for exam 101. This will prevent us from having too many groups running at the same time.

Where

We can take the certification exams at any Pearson VUE or Prometric centre. Luckily there are two sites in Saskatoon:

Academy Of Learning
1202A Quebec Ave
Saskatoon, Saskatchewan S7K 1V2

and

Saskatoon Business College
Saskatoon, Saskatchewan S7K 2H7
Phone: 306-244-6333 Site Code: SS4

Who

Three months ago we organized a group of people – two system admins (John and Andrew), two technical support admins (Mike and Tyler), two members of our development team (Nathan and Logan), and me as the facilitator.

From these six people, we’ve chosen two volunteers (Logan and John) to run the next groups.

There will be three groups next time: two slower ones (meeting fortnightly) and one faster one (meeting once a week).

Lessons learned

  • high motivation – keep the groups small
  • consistency – allocate the same two hours every week
  • feed the hungry – don’t stop at the bare minimum asked for in each topic; for example, we asked Kevin B. to run an advanced vim workshop
  • stay focused – don’t get too sidetracked by the interesting things you will come across; most likely you will touch on them again in other parts of the certification
  • show progress
  • measure progress with an assessment test

LPIC-1 Detailed objectives

Objectives: Exam 101 (First part of the Level 1 certification exams)

  • Topic 101: System Architecture
    • 101.1 Determine and configure hardware settings
    • 101.2 Boot the system
    • 101.3 Change runlevels and shutdown or reboot system
  • Topic 102: Linux Installation and Package Management
    • 102.1 Design hard disk layout
    • 102.2 Install a boot manager
    • 102.3 Manage shared libraries
    • 102.4 Use Debian package management
    • 102.5 Use RPM and YUM package management
  • Topic 103: GNU and Unix Commands
    • 103.1 Work on the command line
    • 103.2 Process text streams using filters
    • 103.3 Perform basic file management
    • 103.4 Use streams, pipes and redirects
    • 103.5 Create, monitor and kill processes
    • 103.6 Modify process execution priorities
    • 103.7 Search text files using regular expressions
    • 103.8 Perform basic file editing operations using vi
  • Topic 104: Devices, Linux Filesystems, Filesystem Hierarchy Standard
    • 104.1 Create partitions and filesystems
    • 104.2 Maintain the integrity of filesystems
    • 104.3 Control mounting and unmounting of filesystems
    • 104.4 Manage disk quotas
    • 104.5 Manage file permissions and ownership
    • 104.6 Create and change hard and symbolic links
    • 104.7 Find system files and place files in the correct location

Objectives: Exam 102 (Second part of the Level 1 certification exams)

  • Topic 105: Shells, Scripting and Data Management
    • 105.1 Customize and use the shell environment
    • 105.2 Customize or write simple scripts
    • 105.3 SQL data management
  • Topic 106: User Interfaces and Desktops
    • 106.1 Install and configure X11
    • 106.2 Setup a display manager
    • 106.3 Accessibility
  • Topic 107: Administrative Tasks
    • 107.1 Manage user and group accounts and related system files
    • 107.2 Automate system administration tasks by scheduling jobs
    • 107.3 Localisation and internationalisation
  • Topic 108: Essential System Services
    • 108.1 Maintain system time
    • 108.2 System logging
    • 108.3 Mail Transfer Agent (MTA) basics
    • 108.4 Manage printers and printing
  • Topic 109: Networking Fundamentals
    • 109.1 Fundamentals of internet protocols
    • 109.2 Basic network configuration
    • 109.3 Basic network troubleshooting
    • 109.4 Configure client side DNS
  • Topic 110: Security
    • 110.1 Perform security administration tasks
    • 110.2 Setup host security
    • 110.3 Securing data with encryption

About the LPI:

“The Linux Professional Institute is globally supported by the IT industry, enterprise customers, community professionals, government entities and the educational community. LPI’s certification program is supported by an affiliate network spanning five continents and is distributed worldwide in multiple languages in more than 7,000 testing locations. Since 1999, LPI has delivered more than 195,000 exams and 62,000 LPIC certifications around the world.”

Happy Linux!

By: Marcos Tarruella