Thursday 31 December 2009

Better the devil you know?

As a professional software developer I sometimes find myself having to defend past decisions when the inevitable unexpected problems arise.

A few months ago I was pleased when the decision was made to move away from the JBoss portal server implementation to a more open platform which would run directly on Tomcat 6. Leaving the office on Christmas Eve with two known potential show-stoppers was not my ideal way to start a holiday.

This week we have supposedly knocked one of those out, but the fix will still involve change and the uncertainty of another upgrade.

The less important issue - combining multiple JSF-backed forms in separate portlets on a single page - still needs to be dealt with.

It seems like every second day a different person has queried the team about why we moved away from something that worked. I'm trying to stay positive and optimistic that we will have fewer issues with configuring aspects such as load balancing, but a part of me is hoping that someone else gets the task of configuring the clustering.

Tuesday 15 December 2009

The benefits of health insurance?

I've been feeling a bit under the weather recently, so thought I should check out the health insurance paperwork that my employer recently gave me.

Imagine my surprise at reading the period covered finished yesterday - just a few weeks after it started!

It turns out that the policy is being renewed to include an updated list of hospitals etc., but I can't help but feel that I am technically not covered.

I have a client visiting work for some training over the next few days, so being absent is not really an option anyway.

Tomcat configuration

Sorry for the lack of content in this post, but I sometimes use this blog as a bookmark/reminder for myself.

Here are a couple of blog posts from one of the maintainers of Tomcat.

I found it reassuring to see the load testing tools that Mark uses matched up with what I had been using a few months back.

Wednesday 9 December 2009

Google finally cracking down on scammers

Several months back I was disappointed to see Facebook allowing adverts for something that looked very much like a scam, using Google's name for credibility.

While I haven't seen those ads for a while, I was pleased to see an article indicating that Google is suing:

Sunday 6 December 2009

Phone upgrade time

I've decided it is time to move on from my cheap and primitive Nokia.

Last week I ordered myself a shiny new T-Mobile Pulse.

Apparently it is the first Android phone to be offered as a Pay As You Go deal.

I would have preferred the Hero but it is just too much hassle to get onto a contract deal, given that I have not been living in the UK for 3 years, and possibly won't be here for another 24 months.

Today I noticed an email informing me that due to popular demand it could take 5 working days instead of 2 for the package to arrive.

Tuesday 6 October 2009

Cunning spammers on Twitter?

Today I had yet another "follower" on my Twitter account. Someone who I'd never heard of, in a line of business that I have no interest in, etc.

Sure enough, their following-to-follower ratio was up around 1,000 to 50.

Being an IT type person, I try to look for patterns in such things. I noticed that this particular individual had the bulk of their posts listed as being sent using something called Seesmic.

According to their website, the Seesmic catch phrase is "Build your community." Given that the first sentence on their home page is, "A desktop client to manage your lifestream from Facebook & multiple Twitter accounts", I'm picking that they could be helping people to spam via social networks.

The words that raise a red flag for me are multiple Twitter accounts.

If you're using Twitter to follow people and organisations, then 1,000 would seem an excessive number to be actively following.

Sunday 4 October 2009

Advantages of the London lifestyle

I'm getting accustomed to carrying a technical book with me on my tube ride to and from work.

I've never been a speed reader, so it's good to have half an hour or so at each end of the day where there isn't much else that I can usefully be doing.

One of the books that I am currently reading is Refactoring To Patterns by Joshua Kerievsky. It's part of the Martin Fowler signature series.

Today I jumped ahead to read the Replace Conditional Dispatcher With Command section, as some code that I have been looking at recently could be a good candidate for this type of simplification. So far it looks like a more robust solution to what the original developers were trying to achieve, so I may add this to the maintenance roadmap.
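To remind myself of the shape of the refactoring: the conditional dispatcher (a big if/else or switch on a request name) gets replaced by a map of Command objects. A minimal sketch with invented names, rather than the code from the book or from work:

```java
import java.util.HashMap;
import java.util.Map;

// Replace Conditional Dispatcher With Command: instead of a long
// if/else chain on the request name, each handler becomes a Command
// object and dispatch is a map lookup.
interface Command {
    String execute(String payload);
}

class CommandDispatcher {
    private final Map<String, Command> commands = new HashMap<String, Command>();

    public void register(String name, Command command) {
        commands.put(name, command);
    }

    public String dispatch(String name, String payload) {
        Command command = commands.get(name);
        if (command == null) {
            throw new IllegalArgumentException("Unknown command: " + name);
        }
        return command.execute(payload);
    }
}
```

Adding a new request type then becomes a matter of registering another Command, rather than growing the conditional.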

Friday 25 September 2009

What's their motivation?

From a fairly early age I've been accused of having an analytical mind.

In recent years I think it has made me come across as a bit cynical, but I don't see that as being a negative thing - maybe I'm just a cynical optimist?

I recently stumbled across an old blog post by Kent Beck which rebutted some comments that Joel Spolsky had made during a podcast.

After several years of working in project teams applying XP principles, including some TDD, and having recently read some of Kent's books (better late than never), I felt that Joel was coming from a position of ignorance, presuming that people like Kent and Uncle Bob Martin were living in some kind of dream world and didn't know how things needed to operate in order for code to ship.

Realising that the podcast and Kent's response blog post are both actually quite old, I decided not to contribute to the debate.

Then today I saw Uncle Bob Martin post a response to another article by Joel. This time Joel was praising the developer who hacks bits and pieces together but doesn't pay attention to some of the approaches to software development that I would consider to now be mainstream. He even goes so far as to imply an association between ugly multiple inheritance and design patterns. Oh the irony!

In the comments section for Bob's blog I saw an interesting Twitter tweet:
"never take software advice from a bug tracking system salesman"

Sure enough, it turns out that the main product produced by Fog Creek Software - where Joel is CEO - is a bug tracking system.

Now, remembering that I might be a tad cynical, I can see why someone who makes money from sales of bug tracking software might want to downplay the benefits of unit testing, design patterns and other aspects of modern software development that have become mainstream in the last few years.

Thursday 17 September 2009

The state of agile

A few weeks ago I attended an XTC (eXtreme Tuesday Club) meeting in London which was based around people reporting on what their experience of Agile 2009 was.

The whole arrangement was unlike any kind of meeting that I have attended before. A bunch of people with similar interests meet in a room at a pub in central London and socialise. There were no announcements or presenters up the front.

I met a chap called Tom Gilb who showed me a few of the nice features of the new iPhone 3GS - including some video capture, editing, and uploading to Twitter.

The only report back about Agile 2009 that I managed to (over)hear was Uncle Bob Martin saying something along the lines of, "It was okay, but there was nothing new".

It was reassuring to read a blog post this evening that indicated that other luminaries in the agile community were thinking along the same lines. It made me realise that Uncle Bob was not being cynical - and he definitely does not come across as that kind of guy. (He didn't even mind signing a book that I had in my bag from his Robert C. Martin series - even though it was authored by Michael Feathers).

Tuesday 15 September 2009

Server config and logging in Java

I've somehow been "volunteered" into doing the stuff that no one else wants to do.

It's a strange world where entire days can go by without much feeling of achievement, while a few minutes can lead to a revelation - like finding a patch for a related third party system, or opening a configuration file in a different editor whose syntax highlighting differs. It's as if a spotlight has been shone on the tiniest - previously insignificant - line or two of text and suddenly everything is going to work according to plan.

On the developer's desktop the logging configuration isn't particularly important: verbose output doesn't slow the system down much, and it's nice to be able to see what's going on in real time.

On staging and production servers there are different priorities, such as performance, conserving space and alerting someone if a significant event is detected.

In an ideal world there would be a single logging system, used consistently throughout the application. Of course we don't live in an ideal world, so there are multiple logging systems that are used with different levels of granularity by the various third party libraries and runtime systems that form our application.

The application itself uses log4j, but a third party component is tied to the java.util.logging implementation.
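One way to tame this is to install a java.util.logging Handler on the root logger that forwards records into the primary logging system. A self-contained sketch of the idea - the Sink interface here is a stand-in for the log4j Logger call we would actually make:

```java
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Bridges java.util.logging into the application's primary logging
// system. Sink is a placeholder; in the real thing publish() would
// call the corresponding log4j Logger method instead.
class ForwardingHandler extends Handler {
    public interface Sink {
        void write(String level, String message);
    }

    private final Sink sink;

    public ForwardingHandler(Sink sink) {
        this.sink = sink;
    }

    public void publish(LogRecord record) {
        if (isLoggable(record)) {
            sink.write(record.getLevel().getName(), record.getMessage());
        }
    }

    public void flush() {}

    public void close() {}

    // Attach the bridge to the JUL root logger so third-party
    // components logging via java.util.logging are captured too.
    public static void install(Sink sink) {
        Logger root = Logger.getLogger("");
        root.addHandler(new ForwardingHandler(sink));
        root.setLevel(Level.INFO);
    }
}
```

This keeps the third party component's output flowing through the same configuration (levels, appenders, rotation) as everything else.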

Friday 11 September 2009

Java Portlets

It seems to me that Liferay and GateIn (JBoss and eXo) are the only big players in the open source space for Java Portlet Containers.

When I recover from the joy that I experienced from configuring the previous generation on a server, I will have to take a deep look into these options.

Apologies to anyone who came here looking for an insightful comparison.

Thursday 10 September 2009

JBoss 4.2.2 and Apache 2 with AJP

In case I'm not the only person on the Internet with an interest in this, JBoss 4.2.2 does not play nicely with Apache using the AJP protocol.

After a lot of experimentation with various combinations of settings at the Apache 2.2 end (both mod_jk and mod_proxy_ajp) I came across the following:

I found this slightly reassuring.

For various reasons, including time restrictions, I was not able to upgrade to 4.2.3 - though I thought it might have a compatible upgraded / patched version of the jbossweb.jar. Nope.

So, I did some digging around and figured out that the source code is available for anonymous download from a subversion repository at:


Paraphrased steps to apply the patch:
- Download the JBoss Web project from the JBOSSWEB_2_0_1_GA tag - to ensure compatibility
- Adjust the file to specify version 4.2.2.GA
- Copy JBoss 4.2.2 into the appropriate directory, or run the ant command to have a fresh distribution downloaded
- Edit the 2 Java classes mentioned in the patch in the Jira issue that I linked to earlier
- Run the appropriate ant build command

Take your shiny custom-built jbossweb.jar and copy it over the existing one in the jboss-web.deployer/lib directory.

Restart JBoss and enjoy not having to restart JBoss periodically to free up those connection resources.


You should also look at the Red Hat Knowledgebase article, "What is an optimized mod_jk configuration for use in Apache with JBoss?".

In addition, the usual consideration should be afforded to JVM settings for memory sizes and garbage collection.
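For my own future reference, the kind of settings involved look something like the following - the values are purely illustrative and need to be tuned against load testing on your own hardware, not copied:

```shell
# Illustrative JVM options (e.g. in JBoss's run.conf) - example values only
JAVA_OPTS="-Xms512m -Xmx1024m \
  -XX:PermSize=128m -XX:MaxPermSize=256m \
  -XX:+UseConcMarkSweepGC \
  -verbose:gc"
```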

Thursday 13 August 2009

Maven and friends can only get you so far

I think that one of the best things about software development is that it is always changing - unfortunately this ever changing environment doesn't often allow for things to change as simply as dropping in the next version.

Deploying applications to some servers recently, I discovered that some of the components used different versions of some low level libraries. This could have made me appreciate what projects developed with Maven or Ivy get to see while in progress, but in this case it was not some component of our code that had jumped ahead of another - it was actually a container application on the server.

Fortunately, the versions of the Apache Commons jars are backwards compatible, so it was a relatively painless upgrade process.

Tuesday 11 August 2009

VMWare to acquire SpringSource

I found myself doing a double-take as I skim-read the Twitter feeds this evening. VMWare to acquire SpringSource - hey, hey, what the?

Rod Johnson has blogged about it, and this time it's not April Fools' Day, so there may be some truth behind it.

I wonder what this means for other virtualisation and cloud computing providers?

For instance, will this influence the direction of Grails support for Google App Engine? After all, it didn't take long for Grails to do an about-turn and support Tomcat instead of Jetty as its default servlet engine after G2One was acquired by SpringSource.

Tuesday 4 August 2009

My experience on the London IT job market during economic downturn

After a holiday in Ireland and the UK back in 2007, I decided that if nothing much was happening for me in New Zealand then I should head to the UK where I would have more opportunities to see the rest of the world.

In September 2008 my paperwork came through for my tier 1 visa to live and work in the UK.

I booked my flights so that I would arrive on the same date that the visa became valid, hoping to make a quick start at securing work.

Strange as it may sound, I didn't have any real experience of looking for work. My job in New Zealand had resulted from knowing a bit about a particular technology - CORBA - and a little bit of good luck.

I had initially hoped for some part-time work during my post-graduate study, but got invited to take on a full-time job - with the option to continue studying part-time. Given that I was only studying to improve my chances of getting a job, this was an opportunity that I could not refuse.

Nearly 10 years later, and on the other side of the planet, I had to actually apply for jobs and wear a suit to the interviews - as opposed to the "Pinky and the Brain" t-shirt that had been part of my clothing selection back in 1999.

Working in a single company for a long time has some downsides, one of which turned out to be that the choice of technologies used did not match up well with what many job advertisements listed. Spring and Hibernate were the two that I soon identified as being a big deal, so I read some books, attended user group meetings, etc., and even managed to win a 4 day training course.

After following other people's advice by applying for contracting roles, I started to consider permanent roles. After about the third face-to-face interview I was led to believe that an offer of employment was just a formality, so I stopped applying for other roles and considered having a little holiday over Christmas.

In early January the role fell through due to corporate restructuring that had been followed by a company-wide recruitment freeze. It was tough to get myself back into interview mode, so the next couple of companies didn't get a good impression of what I was capable of - one even told the recruitment agent who had represented me that I came across as though I wasn't that interested in being there.

After lowering my salary expectations I had more roles to consider, and soon came to the awkward situation of having to choose between two roles.

I chose the role that was located closer to home and the venue of the various technical user groups that I attend during weeknight evenings.

After being hired as a Java developer I accidentally painted myself into a corner that was labelled "Grails/Groovy development" at a time when Grails 1.1 and 1.1.1 were having stability issues.

At the end of my standard 3-month probation period the company shocked me by saying they were "letting me go". Given that I wasn't making much headway with the battle against the Grails issues, and several of their former colleagues had recently become available, I can understand their decision.

Second time around on the London job market was a little easier. After six weeks and interviewing for eight different roles, I had two offers to choose from again.

I chose the well-established consultancy that offered greater potential for professional development, over the startup that offered more money.

No regrets.

Thursday 30 July 2009

Senior developer now

My new job has been running quite smoothly so far, although the seniority aspect is a little different to what I was expecting as most of my teammates are recent graduates or still studying.

Today I got that "hmm, maybe I am getting old" feeling when one of my colleagues mentioned that he would only have been five or six years old around the timeframe I was describing. It started off as a "what's this WSDL file for?" question that led me to explain a bit about what it is and how it is used. I then dug a bit deeper and mentioned how a similar concept called CORBA had been around since the mid-nineties but was largely considered a legacy approach these days.

I wonder what will be the equivalent of punch cards when younger people hear about my historical experiences of using computers. Loading programs onto the ZX Spectrum by cassette tape is about as historic as I get (and I still have a working 48k Spectrum in storage back in New Zealand).

Eclipse plugins playing catch-up

Since upgrading to Eclipse Galileo (a.k.a. 3.5) I've noticed that there has been a definite lag in plugin developers supporting the latest version.

A couple of examples that have resulted in me keeping a parallel Ganymede (3.4) version installed include:
- Google App Engine
- Spring Tool Suite (well, not really, as I uninstalled this once I saw the recommended configuration tweaks to get the plugins to play nicely with OSGi)

Today I noticed that SpringSource have blogged and tweeted about an impending milestone 1 release of Eclipse Groovy Tools, still tied to 3.4 but with the assurance that 3.5 support will be available soon.

Thursday 23 July 2009

Android phone, to buy or not to buy?

This evening I attended a meeting of the London Android group, hoping to glean enough interesting information to tempt me into upgrading from my current mobile phone.

So far my market research online has been interesting. Some mobile phone deals come with special offers - I don't think I need hair straighteners thanks anyway.

A "deal" that almost sucked me in turned out to be for a longer contract term, and included "optional" insurance (free for the first month, then you have to opt out or be charged).

Vodafone almost had me ordering online - but unfortunately I have only been in the UK for the last 10 months or so, so I can't give them valid UK addresses for the past 3 years!

I have been thinking for a while that a decent phone could save me money - I wouldn't have to buy guide books or maps for long weekends out of London etc. - but £30 a month times 18 months, plus roaming data rates, could pay for a lot of Lonely Planet guide books!

I'll just have to expand the cost rationalisation out to include how much I already spend on my pay-as-you-go mobile, then see if I could connect my laptop to use the unlimited internet access on the phone - saving me what I currently pay for pay-as-you-go mobile broadband...

Tuesday 21 July 2009

JBoss following SpringSource's leadership, by emphasizing separation between free and commercial offerings?

I've recently started working at a new job, which has involved installing and getting up to speed with some JBoss software.

Based on my previous experience, I thought that it would simply be a matter of heading to the JBoss website and downloading a zip containing the binaries. Alas that was not to be, as the website only appears to offer 30 day evaluation downloads and emphasizes the commercially supported versions.

I thought this might not be a new thing, as it had been almost a year since I last dabbled in JBoss application servers and portal servers. When I checked what tweets had been sent on Twitter today I found that John Smart (author of Java Power Tools) had noticed a similar situation.

So, I had another look and found a single page mentioning the availability of community versions of the software and linking to - even then the most obvious download links on the community site appear to direct users to the evaluation downloads. and seem to be the main pages to find your way to the non-evaluation implementations.

This reminded me of Rod Johnson's blog post entitled, "Red Hat Reacts to SpringSource's Leadership", as it seemed that JBoss / RedHat were now making it more difficult to get at their "open source middleware" much like how SpringSource still require registration before allowing people to attempt to download their free Spring Tool Suite.

If you navigate through the site you could even be forgiven for thinking that Hibernate might only be available as part of a commercial product download. It's fair enough for them to be economical with the truth when they're trying to market their commercial products.

From my limited experience, it would appear that these significant providers of enterprise Java software have somehow gotten the idea that by making it more difficult to acquire the community / free version of their software, they will encourage people to look into their commercial offerings. I, for one, am not convinced that the approach will have a positive effect on sales or subscriptions.

If and when Oracle buys Sun Microsystems, they could make some tweaks to the MySQL download process - which currently only has optional registration - to try to introduce a negative initial user experience.

Saturday 18 July 2009

What you don't know that you don't know

Software development is an interesting process.

Once you've identified what needs to be developed there's typically some aspect that you haven't dealt with before. Here are some examples:
- a data type in the persistence layer (e.g. GIS Shapefile data)
- an unfamiliar MVC framework (e.g. moving from Struts to Spring MVC)
- a different JDBC connection pooling implementation
- Grails and Groovy instead of java with Spring and Hibernate

The decision to use the unfamiliar has been made for good reasons; it's your job to get stuck in and make it all work.

The approach that I like to take in these types of situations is to isolate each unfamiliar component and have a scratch project where I can try things out quickly without affecting my development codebase.

With a testing environment in place, I can try out some distinctive characteristics that the application requires.

An example: Loading in some GIS data

This is the sort of thing that you expect to be trivial, after all the product has extensive documentation and is in use in hundreds, if not thousands, of organisations worldwide.

Underestimating the importance of this aspect of a project cost me a weekend researching and installing GIS libraries into PostgreSQL a few years back. Our team developed on Windows workstations, and intended to deploy the finished product onto a Linux server (Debian, in case you're interested). Sure enough, when I tried deploying to a Linux staging server the core of the system that we had been working on for several weeks failed to even initialise.

This was a case of what I like to call, "You don't know what you don't know". We knew what we wanted to produce, we knew what API to use to produce it, but we didn't know that the implementation we had based our GIS manipulation on would not work under Linux.

Fortunately the project schedule had been planned (by yours truly) to get the GIS code in place early on, so apart from my lost weekend the development continued and delivered on time and on budget.

The client subsequently commissioned me to use the same technology for another project, which was an interesting exercise in reusing code originally intended as a "one off". Both sites are still up and running.
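Incidentally, the scratch-project probes I mentioned can be tiny. For GIS data, even just proving that the target platform can read a shapefile header is worth automating - a stdlib-only sketch (the magic number 9994 and the byte offsets come from the ESRI shapefile spec; the class itself is invented for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Minimal sanity check for an ESRI shapefile (.shp) header.
// Per the spec: bytes 0-3 hold the file code 9994 (big-endian),
// bytes 28-31 the version 1000 (little-endian), and
// bytes 32-35 the shape type (little-endian).
class ShapefileHeader {
    final int version;
    final int shapeType;

    ShapefileHeader(byte[] header) {
        ByteBuffer buf = ByteBuffer.wrap(header);
        buf.order(ByteOrder.BIG_ENDIAN);
        int fileCode = buf.getInt(0);
        if (fileCode != 9994) {
            throw new IllegalArgumentException("Not a shapefile, file code = " + fileCode);
        }
        buf.order(ByteOrder.LITTLE_ENDIAN);
        this.version = buf.getInt(28);
        this.shapeType = buf.getInt(32);
    }
}
```

Running a check like this on the staging box, before weeks of development, is exactly the kind of cheap insurance I wish we had bought.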

Thursday 16 July 2009

Refactoring of persistent classes in Java

This morning on my way home from the latest job interview, I got to thinking about the costs and benefits of applying different approaches to persisting the domain objects of an application.

The company that I had the interview with have two development teams, working on two major projects. One of the projects is using iBatis, which I took as a sign that this company isn't just blindly following the majority of the industry, who seem to be using Hibernate. The other project is using JPA. So I had to ask why they have gone with a different approach, to which the response was that it was quicker and more flexible while the product is still being fleshed out.

So, what happens when you need to make some changes to the structure of your domain, and, correspondingly, your database structure? Is there a best practice, or does each organisation have their own approach?

Are there tools out there that recognise the side effects of the changes that you are making to your Java code, and can produce SQL DDL update scripts for you to apply?

This situation reminds me of the early 2000s when UML diagrams were trendy and tools such as TogetherJ attempted to keep class diagrams and source code in sync with each other, except this situation has an additional dimension to it - in addition to the mappings between your classes and database tables, there is existing data which needs to be transformed to fit the new structure.

I see this as a scenario where you really need to keep a close eye on what the ORM system you have chosen is doing. As far as I am aware, if you make changes that affect where data is being stored, you are going to have to involve some human intervention for transforming your old database structure and transferring its data before you can safely put your modified code and its ORM mappings into place.

Of course all of this is nothing new, but I'm a little curious as to whether tools can make the implications of changes visible, or even automate the generation of database update scripts. The alternative would seem to require identifying the changes and following the approaches outlined in Refactoring Databases: Evolutionary Database Design.

It looks like someone has made an attempt at automating some of the changes required to bring a schema in line with a newer version:

Another PostgreSQL Diff Tool

A quick Google search reveals there are a few commercial products for MS SQL Server as well.

A spontaneous Twitter thought

I wonder how long it will take for world records related to Twitter to start appearing?

e.g. the most retweeted tweet.

I'm mildly curious about seeing some visualisations of patterns of activity in the twitterverse. So far I'd expect Michael Jackson's death to have been the most significant world event since Twitter has been in the mainstream.

Wednesday 15 July 2009

Is London shrinking?

The old expression that "it's a small world" really rang true for me today.

On my way home from a job interview this afternoon I thought that I recognised an old friend from New Zealand queuing up to use an ATM. Sure enough it was! After nine and a half months I have finally bumped into someone that I know from back home.

As if that wasn't enough, when I attended the London Spring User Group meeting this evening I ended up chatting with one of the two people who had interviewed me for a different job yesterday afternoon! Wait for it, later in the meeting I recognised the name of one of the attendees who had won a book in the feedback survey prize draw. Down at the pub for the obligatory socialising and networking afterwards, I inquired as to whether he was a recruiter - sure enough, it was the agent that I have been dealing with for the same job. SMALL WORLD.

This month's Spring user group session was focussed on Spring Batch and what is coming up in Spring 3.0.

I found that I appreciated the details of Spring Batch much more this time around, as opposed to when I first saw a presentation about it - after lunch at the Spring in Finance seminar day back in September 2008.

Seeing the list of what will be removed or deprecated in version 3.0 of Spring made me feel better about not knowing "which controllers" I had used in my experience of Spring MVC - a question that came up in a recent job interview by phone. The hierarchy of form controllers is being ditched in favour of the annotation-based approach, so I have only been using the latest APIs - though it still doesn't hurt to read about the old way of doing things (even if the MVC chapter in my reference book is 100 pages long).

It's a numbers game - Java developer interviews

Today I heard that I was definitely the second choice, out of the final four candidates that were interviewed for a Java developer role last week. I hadn't been pressuring the agent for feedback, and he was quite keen on emphasizing that the company will be looking to grow further in the coming months - hopefully I won't still be available then.

On Wednesday morning I have my first face-to-face interview with another company who interviewed me by phone last Monday. After the phone interview they had another eight or nine candidates to speak to, so I've already made it to the final three.

Unrelated to the job interviews, I've started noticing that some of the people that are following me on Twitter don't appear to be interested in the topics that I am tweeting about. If I see that they are following more than three times the number of people who are following them, then it reinforces my theory that they are just trying to increase their popularity for no good reason - so I block them.

Monday 13 July 2009

Java language specification makes for interesting reading

As a different approach to preparing for the latest round of interviews, I've been reading snippets of the Java Language Specification. After a decade of programming in Java, I was surprised to come across a keyword that I didn't recognise, strictfp.

Evidently it's been there since version 1.2, but I guess back then I was more interested in developing Swing components and troubleshooting garbage collection issues from not tidying up those components' listeners etc.
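For the record, a quick sketch of what the keyword does: strictfp forces all intermediate float/double results in the annotated class or method to follow strict IEEE 754 semantics, so the same inputs give bit-identical results on every JVM (without it, hardware with extended-precision registers - x87, notably - may carry intermediates at higher precision). The class here is a made-up example, not from the spec:

```java
// strictfp on the class applies strict IEEE 754 evaluation to all
// floating-point expressions inside it, making results reproducible
// bit-for-bit across JVMs and hardware.
strictfp class StrictDemo {
    static double multiplyAdd(double a, double b, double c) {
        return a * b + c; // the intermediate a * b may not use extended precision
    }
}
```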

The old expression is still valid: you do learn something new every day.

Wednesday 8 July 2009

Hibernate and JPA

I've been posting responses to various queries on the Hibernate Forums lately.

There seem to be a few recurring queries, so I'm looking to put together a white paper of sorts.

Here are my thoughts on what this might include.

JPA versus Hibernate:
- Standards compliant versus proprietary
- EntityManager versus Session

JPA does not require EJB3
- using a JPA implementation does not necessarily require an EJB3 container

JPA implementations:
- Hibernate
- DataNucleus (formerly known as JPOX)
- EclipseLink
- OpenJPA
- TopLink
- ...?

Things to consider when evaluating products for your product or project:
- Licensing
- Stability
- Documentation and support
- Performance
  - Benchmarking that matches your system's likely use cases
- Cost

JDBC connection pooling:
- Avoiding stale connections
- Implementations:
  - c3p0
  - Spring / Tomcat
  - ...?

Lazy initialisation
- collections


XML versus annotation-based configuration:

- Annotations for declaring transactional behaviour of methods
- Which methods in which classes to mark with the transactional annotation

Caching:
- Second level cache
- Query cache
- Session / EntityManager (first level)

Performance issues
- N+1 queries
- batch size

Coding considerations:
- lazy initialisation exceptions
- avoid use of instanceof operator when proxies may be in place
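On that last point, the pitfall is easy to demonstrate without Hibernate at all - a JDK dynamic proxy behaves the same way as a lazy-loading proxy for this purpose (the class names are invented for the example):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// Why instanceof is fragile when proxies may be in place: the proxy
// implements the interface but is NOT an instance of the concrete
// class, so concrete-class checks silently fail.
interface Animal {
    String name();
}

class Dog implements Animal {
    public String name() { return "Rex"; }
}

class ProxyDemo {
    static Animal proxyFor(final Animal target) {
        return (Animal) Proxy.newProxyInstance(
                Animal.class.getClassLoader(),
                new Class<?>[] { Animal.class },
                new InvocationHandler() {
                    public Object invoke(Object p, Method m, Object[] args) throws Throwable {
                        return m.invoke(target, args); // delegate, as a lazy-loading proxy would
                    }
                });
    }
}
```

Code that does `if (animal instanceof Dog)` works on the real object and quietly takes the wrong branch when handed the proxy - which is exactly the class of bug that only shows up once lazy loading is switched on.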

Monday 6 July 2009

Maven - more than meets the eye

Earlier this evening I attended a presentation by John Ferguson Smart, a fellow kiwi who is in London running his Java Power Tools Bootcamp at SkillsMatter.

The title of the session was "Getting Serious About Build Automation: Using Maven in the Real World", based on his recent JavaOne presentation of the same name (slides available for download).

Before attending the talk I was probably a lot like most of the readers of this blog (both of you? :-) ) in that I had used Maven to get some jars/artifacts into projects that I have worked on, set up some internal project dependencies, and groaned aloud in frustration at the maintainers of external projects for not having their config in place as part of their upgrade process... but I hadn't really taken the time to see where to go from there.

Here is a brief summary of some aspects of Maven that I will be looking into further because of the talk:
- the m2eclipse plugin for Eclipse, including its visualization of dependency conflicts (I've seen these conflicts in a project I worked on, but didn't know quite what they meant or how to resolve them by exclusions or specific declarations)
- the ability to standardize project settings using inheritance, including the option to specify versions of artifacts in such a way that they will only be applied to child projects if the child project goes to introduce the dependency (dependencyManagement block)
- using multiple modules to reduce the build time
- the Nexus Maven Repository Manager - referred to as an enterprise repository for caching external dependencies and holding copies of locally built artifacts
- the use of patches to manipulate differences in artifacts being built for different platforms, rather than having to introduce runtime configuration through external dependencies

I find John's blog is a good source of up-to-date info about tools that Java developers should be using to make their lives easier, and their projects progress more smoothly.

Saturday 4 July 2009

What a difference a name makes

I've started working on an application to allow people to manage their Facebook groups from outside of Facebook, as a hobby project to keep my skills fresh while I am between roles.

This morning as I was writing some tests and expanding the application to the point where it would be ready to start interacting with Facebook, I found myself feeling uneasy about the structure of a Service.

The method needed to know about a few fields that I have already encapsulated elsewhere. I found myself wondering, do I pass in the entire object or should the object actually be responsible for calling the method and passing its member fields as parameters?

After a few minutes of umming and aahing, the penny dropped and I renamed my "Service" class to be a Gateway. The way that the call gets processed should have made it stick out like a flashing neon sign that this is a gateway.

Having an organisation object make calls on the gateway, rather than being passed as a parameter into the gateway seems cleaner, but I expect another aspect of the domain that I have yet to uncover will move the responsibility out of the organisation object. Of course day 1 of the project is probably a bit early on to get concerned about this sort of detail.

This is yet another reason why Martin Fowler's Patterns of Enterprise Application Architecture lives on my table, rather than my bookshelf.

Friday 3 July 2009

Professional Experience - A brief summary of commercial software development projects that I have been involved in

This is going to be a long post, something I am considering as an appendix to my CV so that interested parties can get a more detailed view of my areas of expertise.

- Developing RESTful web services using JSON
- Converting stand-alone SOAP services to be integrated into a web application
- Liaising with client project managers and their suppliers
- Consuming XML web services

- Mentoring junior developers
- Retail E-Commerce development using Hybris 4.0, Spring, MySQL, Tomcat 6
- Sole responsibility for implementing payment gateway integration with Commidea XML Web Service v4.

- E-Commerce development using Hybris 3.1
- Configuration of load-balanced production environment on hosted servers
- Developing a back office system using Grails and Groovy
Prototype reporting system for an online casino application with Grails 1.1 and MySQL 5.1.
Used GORM and GSQL (abstraction over Spring and Hibernate) for data access
Added HttpInvokerProxyFactoryBean instances to an existing Spring-based application to expose functionality

- Enhancements to existing applications
Updated database tables, views, stored procedures
Added config for re-structured Spring MVC controllers
Added formatting logic to Velocity templates
Configured TeamCity continuous integration server to check out from subversion and run tests
Set up Javascript and associated cross site scripting code for Facebook functionality

- Agile development in team of 8
Java Persistence API (JPA) (equivalent to Hibernate) for Object-Relational Mapping;
Java Server Faces (JSF) with extensive use of AJAX capabilities provided by the JBoss RichFaces components;
Use of Ivy for managing jar dependencies;
Ant build script;
Hudson with various plugins for continuous integration and monitoring of code test coverage;
Selenium for automated running of acceptance tests;
MS SQL Server 2005 and PostgreSQL databases;
Eclipse as IDE and CVS client;
Prototyping of secure Web services interaction with Ministry of Education;

- Harcourts Grenadier - Real Estate sales management system for a commercial property developer
Project manager / Programmer / Analyst and client liaison responsibilities;
Based on a similar previous project, but with different business rules, data format, and using Java Server Faces and Facelets;

- Infinity Investment Group - Peninsula Bay Property development marketing site
Struts 1.3 and Tiles for MVC architecture and template-based layout
Role-based access control using Struts and a custom authentication system;
Java version of GeoTools and Java Advanced Imaging APIs for generation of map images;
Postgis data source for GIS data;
SQL Server 2005 for registration and bid data;
JPox JDO (now known as DataNucleus and used by Google App Engine) for Object-Relational Mapping;
Javascript and dynamic image maps for presentation of up to date details on map images

Merging of existing contact details database with the new online system;

Provision of back office administration interface to allow staff to quickly view the current status of individual properties and prospective purchasers.

- Airways Collaborative Arrivals Manager
Agile development in a floating team of 4 developers, with a business analyst providing requirements;
Contributing to high level architecture discussions;
Extensive pair programming;
MySQL database;
High Availability Linux cluster;
Use of new java.util.concurrent capabilities. Initially tried Executor, later changed to use Futures;
Some use of GWT for prototyping the required AJAX functionality;
AJAX Javascript with JSON (JavaScript Object Notation);

- Infinity Investment Group - Pegasus Town
Primary developer with support from 2 others for prototyping GIS and modelling the domain;
Project management and client liaison;

Initial project was a website with a novel approach to displaying availability of property. This expanded to include display of live data on projectors in the sales centre, and provision of a live auction system for the sale day.

Avoided considerable expense and risk of failure by strongly pushing for the sale day to be operated on the sale site, as opposed to setting up expensive and complex short term Internet connectivity from the site to the server hosting facility.

Success of this project led to the client returning to us to apply the same underlying technologies for another property development project.

- Windows Mobile / Pocket PC application development using C#, Visual Studio, .net Compact Framework

2005 - 2007
Alchemy were initially approached to provide text messaging capabilities to the existing software, but were then invited to re-architect the system to make it more scalable and able to be maintained without system administrator intervention.

I was the primary developer, responsible for client liaison, acquiring requirements, designing, developing, testing, and deploying all changes.

The first stage involved converting classic ASP VBScript pages backed by multiple Access databases into JSP pages backed by a shared SQL Server database. Struts, Tiles and JDO were used for isolating the various layers of the application.

I suggested to the client that they could expand their potential revenue by adding students, as well as parents, to the people that could be contacted by the system.

Enhancement projects included:
- addition of SMS text message sending capability by sending requests to a SOAP web service
- set up of online stationery ordering using WestPac bank e-commerce services, and providing a simple back office system for schools to monitor orders
- creation of Early Notification System web services for Ministry of Education to allow schools to contact parents by email and SMS text message, with robust status reporting capabilities

- MainPower Content management system based website
Plone, customisation of templates and search functionality
Inquiry forms processed by Python scripts

- LocalEye - Portal website for local government promoting Christchurch and Canterbury New Zealand

Enhancements to Zope-based portal site;
Python scripts for manipulating search results from multiple indexes;
Upgrade of Swish-e search engine;
Evaluation of upgrade to later version of Zope

- NewJobZ - Website for a business based on assisting prospective immigrants to find work in New Zealand

Struts and tiles for MVC and templated look and feel;
TJDO for object-relational mapping;
SQL Server 2000 RDBMS;
Tomcat 5;
Processing of payments online using PayPal;

- RealJourneys Tourism Reservation System: J2EE server with Rich Java Swing GUI client
Java 1.4;
Implemented printing capabilities using XSLT and apache FoP;
Set up fax server interface using HylaFax and Java APIs;
Installed Linux operating system and fax serving software on client hardware.

1999 - 2007
- Harcourts Group
The largest real estate company in New Zealand had a number of projects that I worked on.
- Development of Swing UI components for trust accounting system (Java 1.2).
- Maintaining ASP intranet - including an enhancement that prevented cancelled long running queries from tying up processing resources on the web server.
- Production of Perl scripts to export listing data in various XML formats to several major partner websites.

1999 - 2002 (approximate)

- New Zealand Immigration Service
Maintained public website, processed change requests, added and maintained CGI scripts backed by a PostgreSQL database

Set up Swish-e search engine for indexing content


Conversion of classic ASP based website to a scalable J2EE based solution
A selection of high profile pages within the website were chosen for conversion from classic ASP to a more scalable Java EE based system.
Java Server Pages (JSP) were set up with the same HTML as the existing site (for consistency and a smooth transition);
Existing SQL queries were analysed and brought across to the new JDBC data access objects;
SQL Server 2000 was the existing underlying RDBMS;
Custom tag libraries were developed to provide content to the pages in a way that would work easily for graphic designers using DreamWeaver;
Stored procedures for a custom built online advertising module were tuned so that deadlocks would not occur;
Tomcat was configured as the server to host the new site;
Server side caching of data from the database and the filesystem (camera image file details) were implemented using a group of configurable time-based cache entry maps;
IIS was configured to redirect users from the old ASP URLs to the new site

Snow reports and webcam pages which had been pulling information from a database and a filesystem on every page request were now only updating their data periodically, significantly improving the site's ability to cope under heavy load.
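
The time-based caching described above can be sketched in a few lines. This is a hypothetical reconstruction rather than the original code: each lookup refreshes its entry from a supplied loader (database query, filesystem scan) once a configurable time-to-live has elapsed, so most requests never touch the slow back end.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Illustrative sketch only - names invented. One of these maps per
// data source gives a "group of configurable time-based cache entry maps".
class TimedCache<K, V> {
    private static class Entry<V> {
        final V value;
        final long loadedAt;
        Entry(V value, long loadedAt) { this.value = value; this.loadedAt = loadedAt; }
    }

    private final Map<K, Entry<V>> entries = new ConcurrentHashMap<>();
    private final long ttlMillis;

    TimedCache(long ttlMillis) { this.ttlMillis = ttlMillis; }

    V get(K key, Supplier<V> loader) {
        long now = System.currentTimeMillis();
        Entry<V> e = entries.get(key);
        if (e == null || now - e.loadedAt > ttlMillis) {
            e = new Entry<>(loader.get(), now);  // refresh an expired entry
            entries.put(key, e);
        }
        return e.value;                          // serve the cached value
    }
}
```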

E-commerce - Online shop for purchase of mountain passes - processing credit card payments in real time.

Snow reports, web cam pages, email newsletters.

- Colliers Jardine
Commercial property website with listing management backoffice interface
Team member for initial development, primary maintainer for ongoing changes.
Developed using SQL Server 7, IIS 4, classic ASP, Perlscript, modules developed in house as well as source from CPAN
Responsible for applying enhancements once they had been filtered through the project manager - e.g. addition of one-off campaigns where properties would be featured on the homepage and displayed more prominently within that property type's section of the site.

I know there's a heatwave on, but where's my cloud gone?

Thursday was not a good day to be deploying updates to my hobby application on Google's App Engine.

Early in the day I had been making some good progress with refactoring my client UI, and adding some intelligence to the server side for allowing game rules to start to take effect, but by mid-afternoon I noticed that I was unable to redeploy my application.

Secure in the knowledge that nothing of significance had changed at my end, I continued to work offline - on paper even - for some design considerations that needed to be addressed soon anyway.

All going well, I should have something fully functional by the weekend.

My original plan of using Javascript for the client didn't feel right, as it didn't look particularly pretty even after a fair amount of fiddling around with CSS. So, I have gone down the Java Applet path with a Swing UI. Next version may even use JavaFX.

Tuesday 30 June 2009

Google App Engine play

I've gotten back around to trying out Google App Engine.

Most of what I had heard about it came from my dealings with Grails, but I am sticking with the plain Java capabilities for my initial experimenting.

It seems fairly trivial to build and deploy some servlets using the Google plugin for Eclipse - even though that has meant switching back to a Ganymede version of Eclipse instead of the shiny new Galileo.

Being the curious sort, the first chunk of original-ish code that I deployed within my application was a servlet that iterates over the System properties of the server that is running the application and outputs them to the response.
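
Stripped of the servlet plumbing, the core of that properties dump looks something like the following sketch (in the real servlet the PrintWriter would come from response.getWriter(); the class and method names here are invented):

```java
import java.io.PrintWriter;
import java.util.Map;
import java.util.Properties;

public class SystemPropertiesDump {
    // Writes each system property as "key = value", one per line -
    // handy for seeing what a sandboxed environment actually exposes.
    static void dump(Properties props, PrintWriter out) {
        for (Map.Entry<Object, Object> entry : props.entrySet()) {
            out.println(entry.getKey() + " = " + entry.getValue());
        }
    }

    public static void main(String[] args) {
        dump(System.getProperties(), new PrintWriter(System.out, true));
    }
}
```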

After some sleep I think I will have a go at implementing a distributed game of noughts and crosses using AJAX and some servlets. This seems like it should be a simple enough game to implement, but will have the challenges of keeping track of players and game state.
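
As a starting point, the server-side game state could be modelled along these lines - a minimal sketch with all names invented, covering turn tracking, the board, and a win check:

```java
import java.util.Arrays;

public class NoughtsAndCrosses {
    private final char[] board = new char[9]; // 3x3 board, cells 0-8
    private char next = 'X';

    public NoughtsAndCrosses() {
        Arrays.fill(board, ' ');              // all cells start empty
    }

    /** Plays at a cell; returns false if the cell is taken or it's not that player's turn. */
    public boolean play(char player, int cell) {
        if (player != next || board[cell] != ' ') return false;
        board[cell] = player;
        next = (player == 'X') ? 'O' : 'X';   // hand the turn over
        return true;
    }

    public boolean hasWon(char p) {
        int[][] lines = {{0,1,2},{3,4,5},{6,7,8},   // rows
                         {0,3,6},{1,4,7},{2,5,8},   // columns
                         {0,4,8},{2,4,6}};          // diagonals
        for (int[] l : lines) {
            if (board[l[0]] == p && board[l[1]] == p && board[l[2]] == p) return true;
        }
        return false;
    }
}
```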

Thursday 25 June 2009

Grails Bug Fixing Sprint

I was just perusing the Grails User mailing list and came across a post by Graeme Rocher asking for the community to let him know what bugs are "hurting" the most.

As part of the movement towards greater stability outlined as a key objective on the roadmap for Grails 1.2 "Bedivere", Graeme is heading up a bug fixing crusade targeting what is going to produce the greatest value to the active users - that means me, and maybe even you!

If you want to get your 2 cents in on the informal "vote", head over to the Grails Jira to see whether your issue has already been reported. If it hasn't then the best way to see to it that your issue gets attention is to add a Jira entry - make sure you give sufficient details about how to repeat the bug or unexpected behaviour, or risk seeing it drop down in the priority list.

On a slightly unrelated note, I wonder if someone will produce a plugin with a silly title, such as "holy hand grenade of Antioch"? The penny is just dropping for me as to the naming of releases and the Grails product itself.

Wednesday 24 June 2009

I love it when a plan comes together!

Today I had one of those interviews where everything seemed to just flow naturally from the start to the finish.

I was initially hesitant to agree to this face-to-face interview, as I am in the process of preparing for another interview which will involve me using an editing environment that I am not particularly familiar with. The agent's description of the interview process convinced me to go along, and I am quite glad that I did.

I started off with 20 minutes to come up with a design for a web-based application that closely resembled a game I had worked on as part of a university project.

Here's how it rolled off my pen:
- identified entities, their states and relationships to each other as a UML-like class diagram;
- highlighted an aspect of the "business rules" that was ambiguous, if not downright contradictory;
- thought a bit about the processes involved in initialising the game, general game play, where the responsibility for that processing might belong;

Then I went back to the description of the task to see what I was missing:
- jotted down some possible approaches for client-server interaction
- added some detail about how state changes could be communicated from the server back to the client
- considered approaches for dealing with scalability and preventing cheating

None of this had much to do with my university project, but having a slightly familiar concept helped my approach.

I really enjoyed this interview approach, as it demonstrated the way that I go about coming up with a full solution - if you think of the loosely described model as a series of inter-related problems.

The rest of the interview was more typical - a talk through some of the technologies on my CV: "what do you think about Spring?", "describe the JDBC connection pooling problem", "did you use tiles with struts?" etc., followed by a pair programming exercise that went smoothly, apart from the Eclipse upgrade issue preventing the keyboard from functioning :-)

I expect the interview will lead to the next stage, but even if it doesn't it was great practice for the next one which is in a couple of days time.

Monday 22 June 2009

Grails gotchas / showstoppers affecting uptake

I was somewhat relieved when I came across this blog post which highlights the fact that other developers have also faced a loss of productivity when working with Grails.

The "convention over configuration" approach speeds up development and keeps the learning curve fairly easy for most developers, but the list of open issues on the Jira suggests that there are still some aspects that need tuning - or detuning in some cases (e.g. domain objects not having GORM functionality due to something missing in lazy decoration/injection).

I've been wondering what sort of risks are involved when you use a software product that is essentially just providing a layer of abstraction over several loosely related other products.

For example, what can you do if you want to upgrade to a later version of one of the underlying technologies (for a fixed bug, or to use a new feature), when the layer above that hasn't been tested or made configurable for that new release? At present, I would expect that you would be stuck with whatever version is compatible with your version of Grails. This could change once OSGi becomes a bit more mainstream, allowing for the use of multiple versions of the same libraries within a single application. I wouldn't want to rely on that being the saviour of any application of mine though.

Tuesday 16 June 2009

Get rich quick scams advertising on Facebook

Since I tried tricking the context-sensitive advertising system into not showing me all the dating sites, I have noticed quite a few advertisements that looked too good to be true - in other words, scams.

Out of curiosity I have clicked on some of the links and found that they lead to fake sites designed to give the appearance of an online newspaper with a journalist's report on their "product".

If you see a site asking you to pay a small fee to get a package explaining how you can make thousands of £s (or $s) by using Google from home - know to avoid it. It's just common sense: when these people go out of their way to mislead the public, they are up to no good.

I have read about some people being foolish enough to give their credit card details - and then seeing monthly charges that they had not approved, or had missed in some obscured "terms and conditions".

Be sure to report such misleading advertising if you come across it on Facebook etc.

Sunday 14 June 2009

Why some killer apps should be approached with caution

In my career so far as a software developer I have worked on a number of projects that were supposedly guaranteed to launch the investors into the bigtime.

Surprisingly enough - they didn't.

This post is intended to provide some general advice to others that may be considering investing money and manhours into a "killer app".

Be realistic when you determine your potential market, then knock that figure down some serious percentage points to determine your likely market. How many times do you hear people come out with figures about how many cellphones or mobile devices are in use, when compared to PCs? The fact of the matter is, unless you're catering for a fairly low level device, or a web-based application, you will never come up with software that is going to capture enough of the market.

From my observations, Nokia used to have a reasonable market share - so it would have made sense to develop applications for their devices. Then along came PocketPC phones with Windows Mobile and 3G networks allowing for greater bandwidth... Then came the iPhone, a device that looks pretty and has some nice applications built into it (e.g. Google Maps).

Providing a service that is based on a user interface to cater to the lowest common denominator of devices might get you a wider potential market, but it won't be "cool" or "sexy" enough to attract that wider market share.

Just because it's useful, doesn't mean that people will use it. I recently thought of producing an online lost and found site, where people could go to register missing / located items and tie that information to a specific location. While this sounded like a useful system, it has several flaws - how would people know about the site? why would they choose to bother using the site?

It's not the sort of site that would get a lot of repeat usage - let's face it, if you lose something, you can pretty much consider it gone for good after a few days.

Don't assume that someone bigger than you isn't already working on the same concept. There are a number of features that now come as standard on the iPhone and various other devices that companies around the world must have invested millions into trying to develop for their clients, or themselves.

Expect change. Prior to Windows Mobile 5 there was no way to get at the text messages directly from .Net compact framework. You had to build up a C++ DLL and hope like hell you never had to maintain that code.

Likewise, it doesn't seem like that long ago that having your phone know your location was something out of the movies. Now it's a common feature on any modern halfway decent device for use in applications with location-based services - whether it be for showing you a map so that you can find your way to the bar to meet your friends, or to include metadata with the photo you have just taken with your phone.

Build for a real world client before attempting to cater for all possibilities. This could be re-phrased as "scope your project sensibly". Trying to produce a sports administration system from scratch, complete with all the bells and whistles, which can be used for every type of organisation known to man was overly ambitious.

Minimise your uncertainties. If you have a lot of experience with developing systems on a particular platform, then you shouldn't reach out of that comfort zone without a very good reason, and without listening to the advice that you're given. Nobody likes to hear, "I told you so", when it comes to their money going to waste.

Look before you leap. Know your competitors. Once you think you have the market sussed out, look again. I was surprised that the market researchers for one project I worked on had not found out about a competitor that I knew of. They went on to identify that competitor as significant, but then changed their minds on the basis of a (probably out of date) article on a single website.

Once you have identified the competition, try to take an impartial look at their offering. If their product looks prettier than what you can imagine producing, and already has significant market share - what are you going to do to be better? How do you know that they are not already taking their product to that next level?

Don't re-invent the wheel, but check the terms and conditions. If there's open source code out there, use it - provided of course that the associated copyright and licensing conditions are not going to force you to make your extremely valuable and top secret code available to everyone that asks for a copy.

Another limited use killer app

A week or so ago I signed up with a Google App Engine account. The next step was trying to think up an application to develop and deploy onto this new-ish cloud computing environment. I thought about an auction system, as I have some experience of that from a project that I led back in New Zealand, but I got side-tracked into preparing for some interviews.

Yesterday I was looking for someplace on Google maps, which made me somehow think of providing an online lost and found service. Users can register to post their lost/found item details:
  • using a Google Maps mashup (or whatever you like to call it) to specify the location
  • upload image(s) of the item
  • specify a means of being contacted
Sure enough when I did a quick search I found that someone else is already providing a similar service - but I'm still considering it as a toy application that could motivate me to try to produce a map-based system that can be configured to use different mapping services, as opposed to being tied to a particular provider. That was an issue that came up in one of those "what if" conversations at a pub after a technical user group.

Friday 12 June 2009

Declaring directives in Grails

I thought that I may have been missing some subtle aspect about programming in Groovy when I noticed some code that was trying to specify the transactional behaviour of a service.

According to the Grails documentation, the way to specify this is by including a static variable named transactional and setting it to true or false.

The default behaviour (without the existence of this static variable) is true.

In a number of cases, including the 2nd edition of The Definitive Guide to Grails, I didn't see the static keyword preceding the transactional variable name (I've submitted errata for the offending code snippets).

In the world of Java development, I would hope that my IDE would pick up cases where it appeared as though I had made a simple error. Can anyone out there advise me of whether this sort of thing is being included in the development of the various plugins for IDEs at the moment?

Thursday 11 June 2009

Tube strike? What tube strike?

My local Underground line was mostly unaffected by the RMT strike, as it turns out that not all lines are operated by the same union. This was a great relief when it came to finding my way to a job interview on the other side of town, as I didn't have to use buses or boats or anything that I would normally avoid when wearing a suit on a rainy day.

Coming back home at around 5pm did seem a bit cramped though.

Monday 8 June 2009

Want to know your audience?

For a few years in my career I was responsible for setting up web log analysis software to generate reports for various clients so that they could see what sort of value for money they were getting from their websites - at least in terms of the number of visitors, their location (approximate), which pages they were going to, what search engine queries they had used to find the site, etc., etc.

Webtrends suited us for a number of years, but then had a change in pricing model as they targeted major websites, so I evaluated a few other products and came across Weblog Expert which had enough features to be practical as a replacement and was in the right price range for some of our external clients.

Now that I am posting some content on my own little blog that is hosted somewhere that (presumably) won't allow me access to raw log files, I have had to look into other options. Google Analytics seems to do a reasonable job of producing pretty graphs and lets me know that my audience level is about the same as might hear me thinking aloud, but not quite as large as might hear me singing in the shower - neither of which I do often.

Back into interview mode - during proposed Underground strike

Today I heard that I have my first job interview lined up already.

It's in central London, which is good. Hopefully the tube strike will either be called off or will not have as much effect as I expect. The evening news mentioned 100 extra buses can run if necessary, but I can imagine that causing more problems than it would solve.

I don't know all that much about UK politics, but if the mayor of London can be criticised for not having suitable contingency plans in place for an unusual run of snow days, then I expect Boris will be catching a considerable amount of flak if the underground is shut down for 48 hours.

I'm going to assume that the overground evening trains will not be affected, as I have a friend's birthday dinner to attend across town. Based on the reviews for the restaurant, the food will either be great or rubbish.

In terms of interview preparation, so far I have mainly been looking into new technologies, so between now and Wednesday afternoon I will need to look back at projects that I have designed and developed and be prepared to have a whiteboard session explaining what fitted where, and why. Fun times.

Google Wave

When I'm not cutting code, I still seem to enjoy getting online and making use of what other organisations have come up with. Not surprisingly several of the more interesting sites are developed by, or now owned by Google.

Whether I'm posting footage on Youtube, seeing where places are on Google maps, or Blogging (here), Google seems to be onto a good thing when it comes to providing useful software services online.

There has been a lot of buzz going around since Google announced Google Wave. It could be the cynic in me, but I'll be interested to see how the nuisances in today's world of email, IM, and web-based rich content are kept out.

I don't think that I will be an early adopter of this particular technology until I see for myself how it all fits together.

Friday 5 June 2009

Lag time in layered software

After hearing about the dismay of various users of IntelliJ IDEA when Grails upgraded from 1.0 to 1.1, I got to wondering what sort of cost is involved in developing with technologies that are dependent upon other software.

I see this as an area where open source technologies may have an advantage, as the beta and milestone pre-releases allow developers of related products to update their products in parallel.

I will be curious to see how the developers of Grails and the various plugins cope with the inevitable versioning issues that will arise once Spring 3 is officially released.

In theory it shouldn't be a big deal, but it's inevitable that there will be companies out there with the "If it ain't broke, don't fix it" approach to upgrading.

Ubiquity vs Standards

I noticed a post on SpringSource's blog site the other day that included a chart of job postings based on some specified criteria. The blog post was about a recent Red Hat announcement, so I assumed that the chart was intended to compare Spring against Red Hat.

I'll admit that I haven't done a lot of development using RedHat's technologies (unless you count Hibernate - which is practically everywhere - and RichFaces JSF), but I'm pretty sure that the charts weren't giving a fair comparison. Seam and RichFaces are not all that RedHat (JBoss) is about. I posted a comment, then noticed a day or so later that a blog post from somebody at Red Hat had used the same ad statistics system to show a quite different picture.

When companies advertise for a developer (shameless plug - I'm available at the moment), they're not always after somebody who has specific expertise with the particular application server that they are currently using - because that criterion could eliminate a lot of perfectly good candidates. These servers have to get certified as meeting certain standards before they can claim to be "Java Enterprise Edition" after all.

So, I feel that Rich's chart may be a little more reasonable than Rod's if you look at it in terms of what skills the industry has been seeking from new employees.

I'd rather be hearing about the technologies than the - possibly biased - perceived market trends.

(Spring Security 3.0.0.M1 looks like a great update).

Thursday 4 June 2009

Technologies I am currently looking into

Here's a list of tools / libraries / methodologies that I am hoping to read about and have a play with while I have some spare time on my hands.

TestNG - why use it instead of JUnit? How does its support for running tests in parallel work?

JSF 2 - what's changed since I last used RichFaces, Facelets etc.? Is it now possible to utilise the various open source widgets side-by-side?

Groovy - What cool features are there that I haven't already come across in using Grails?

easyb - will this give maximum value by allowing developers to write tests more rapidly (Groovy vs Java), and have the tests more human readable?

  • STS (Spring Tool Suite)
  • Spring 3.0
  • Spring Security 3 - the initial overview makes me want to have a play with the @PreAuthorize and @PostFilter annotations, and the codebase tidy-up should make it easier to get set up too.
  • Roo
  • tc Server
  • DM Server
  • OSGi support
Google Wave

Google App Engine - just got an account set up, installing the Eclipse plugin ...

Google WebDriver - an alternative to Selenium

Wednesday 3 June 2009

Brushing up on core Java, and reading for interest's sake

In preparation for the upcoming series of job interviews that I will no doubt have, I am trying to stay well disciplined in reminding myself about some gotchas in core Java that tend to come up as "technical tests".

I've just taken a look into Behaviour Driven Design and I can see why an old colleague has been name-dropping about it lately. My initial take on it is that it is still test-driven development, but the subtle change in naming conventions helps the developer to stay focussed on what the customer has specified - keeping the business language in the code, and allowing the tests to read more like specifications. You say something "should" behave in a particular way instead of stating what you are testing.

Hopefully I will have a chance to try out JBehave soon.
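To make the idea concrete, here's a tiny plain-Java sketch (the Account class is made up for illustration - it's not from any real project) showing how the "should" naming convention turns a test into something that reads like a specification:

```java
// A made-up class under test, just to give the spec something to exercise.
class Account {
    private int balance;
    Account(int openingBalance) { balance = openingBalance; }
    boolean withdraw(int amount) {
        if (amount > balance) return false;  // reject overdrafts
        balance -= amount;
        return true;
    }
    int balance() { return balance; }
}

// BDD flavour: the method name states expected behaviour, not mechanics.
class AccountSpec {
    static void shouldRejectWithdrawalsThatExceedTheBalance() {
        Account account = new Account(100);
        boolean accepted = account.withdraw(150);
        if (accepted || account.balance() != 100) {
            throw new AssertionError("withdrawal beyond balance should be rejected");
        }
    }
}
```

Compare "shouldRejectWithdrawalsThatExceedTheBalance" with a traditional "testWithdraw" - the former tells you what the customer asked for before you've read a line of the body.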

Tuesday 2 June 2009

The perils of specialisation

I had an interesting meeting today. It involved the phrase, "we're letting you go".

I wasn't expecting to be made redundant, but evidently these things happen in start-ups when a major release slips.

The last-in, first-out decision was probably made easier for the company by the fact that I was the main Grails developer, and they consider Grails a slightly immature and fragile environment - "okay for prototyping but just too flakey on upgrades" - to be deployed in the wild.

I've had a few job specs sent through from agencies already, so here's hoping the market is more buoyant than when I first arrived in London.

My CV now looks a bit more impressive, as I have been using Grails, Groovy, Apache Velocity, Spring MVC, Subversion, c3p0 JDBC connection pooling, and MySQL - including stored procedures and the event scheduler of version 5.1.

Sunday 24 May 2009

Upgrading fixed updating, but may have broken creation.

It's a bit late in the evening for me to give a detailed account of what exactly I did, why, and what I found - so here's the short version.

Grails 1.1 was misbehaving whenever we tried to update an existing persisted object.

Grails version 1.1.1 was shown as fixing that particular issue.

I volunteered to be the guinea pig - the first in the office to try upgrading.

Hurdle 1 - Original Grails 1.1.1 download included an install script with a Windows specific issue

Hurdle 2 - Maven plugin for Grails also would not play nicely (something about a dependency introduced by one of the minor new features added in Grails 1.1.1)


It works - even the updates to existing objects make it to the database without throwing exceptions etc. Hoorah!

A couple of days later we struck an issue where Groovy was unable to determine that an object was supposed to have GORM methods injected - the symptom being that it was unable to find a property (actually a method) called save.

Mutter mutter mutter....

The workaround for that case has been to add some code to the BootStrap.groovy file to force Grails/GORM to be aware of that type of object - preventing us from relying on lazy initialisation.
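The workaround looked roughly like this - a sketch only, with a hypothetical domain class name standing in for ours:

```groovy
// BootStrap.groovy - touching the domain class at startup forces
// Grails/GORM to inject its dynamic methods (save, findBy*, etc.)
// eagerly, instead of relying on lazy initialisation.
class BootStrap {
    def init = { servletContext ->
        // AuditEntry is a hypothetical stand-in for our affected class.
        AuditEntry.count()
    }
    def destroy = {}
}
```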

Wednesday 20 May 2009

Grails upgrade from 1.1 to 1.1.1, attempt 2

Today I came to realise an important reason for upgrading from Grails 1.1 to 1.1.1, the staging sites running on Tomcat were not allowing updates to data via GORM due to this issue.

My most recent contributions to the project worked fine on my local system, but that was just a standalone Jetty running courtesy of mvn grails:run-app, and not much use to the QA team.

So, I figured that it was worth going back to check whether the suggestions provided on the grails user mailing list for the maven plugin version issue would work.

Updating the plugins on my local system was fairly straightforward, so I checked my config changes into version control and fired up a build and deploy through the continuous integration / build server.

The build required a little manual intervention to clean out old versions of plugins, which I was easily able to detect having seen the same error messages on my local workstation.

Deploying the built application onto a Tomcat system that hosts several related web apps involved a couple of areas of uncertainty, and it took a while to discover what was preventing the application from starting up. It turned out to be multiple versions of an underlying jar in the WAR, because of an indirect dependency pulled in by Maven at build time.

I contemplated upgrading the other application to use the same version of the jar, but decided to just set up an exclusion in the Grails-based application's pom.xml for the time being. All seems to be playing together nicely now.
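For anyone hitting the same thing, the pom.xml exclusion looks something like this (the coordinates are hypothetical placeholders, not the actual jars involved):

```xml
<!-- Keep the transitive copy of the conflicting jar out of the WAR,
     so only one version of it ends up on the classpath. -->
<dependency>
    <groupId>org.example</groupId>
    <artifactId>some-library</artifactId>
    <version>1.2</version>
    <exclusions>
        <exclusion>
            <groupId>org.example</groupId>
            <artifactId>conflicting-jar</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```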

QA can move my task cards across to "completed" - hoorah!

Making adjustments to a moving target

Late last week I found myself in a mini spiral of database updates for a reporting system - removing an outer join in the morning, only to add another outer join in the afternoon.

Sure enough, this week I am taking a look at the bigger picture - where is the data coming from, and why is it so ssslllloooowwwww to come out. Late yesterday afternoon / early evening I looked into how the ETL tables are being populated and realised that the data in the new table that I have started joining against could easily be included in the denormalised structure that we have set up especially for reporting. I also came to appreciate why some other data had been included a couple of weeks back (before I had the bright idea of excluding it and adding a join to a later query).

I'm fairly certain that the changes I have applied today will improve performance considerably, and should also make the data access code more readable for future reporting requirements. I was unsure whether it would make sense to split the ETL out into two tables, as that could be considered normalisation - but in this case it made sense, as the queries will be simpler: they will not have to filter out duplicate rows for the data that is common to a collection of entities related for a set period of time.

The "moving target" aspect of this exercise came in the form of some database structure updates being added by a colleague at around the same time. Essentially it has meant that I have no historic data to verify that the most complex report generated from this data is still working. In theory that should be a trivial obstacle to overcome, but it's the sort of surprise that can turn a morning's work into a couple of days' work if you're not careful.

Friday 15 May 2009

Not quite the bleeding edge of technology, but

The upgrade to Grails 1.1.1 wasn't quite as painless as I had been hoping it would be.

It was versioned as a minor release and had a fair number of Jira issues fixed - including several that had been hindering progress and could ultimately lead to Grails being pushed aside as the platform of choice in the next week or so - so I figured it was worth being an early adopter (upgrading within an hour of the release announcement).

Twitter came in handy for the first issue - a Windows batch script had an out-of-date version number embedded in it, and others had struck the same issue.

We're now stuck on a version mismatch issue somewhere in the murky mists of Maven and potentially out-of-date plugins.

My post to the mailing list has gotten some responses from other users in the same situation, so I wouldn't be surprised to find a solution in my inbox when I get to work in the morning (or am I being the optimistic software developer?).

I'm still unsure whether I want to stick with Grails, or go back to Java and Spring. Using Groovy has reminded me of my years of developing ASP pages in Perl - not much care for type safety (if you don't want it).

Wednesday 13 May 2009

JDBC Connection pooling

On a development system that I have been working on lately, the MySQL database connections have been dropping out due to a lack of activity overnight (more than 8 hours - MySQL's default wait_timeout).

There doesn't seem to be a suitable corresponding timeout setting for the default connection pooling system for the corresponding Grails datasource (DBCP), so I've gone looking at what the alternative pooling systems have to offer, and how they could be configured to replace the default Grails DataSource.

On the surface of things, development of DBCP and c3p0 seems quite inactive. Although DBCP shows a bunch of updates coming in its next release, it has now been over a year since the last release.

The Hibernate guys appear to have stopped supporting the use of DBCP with Hibernate.

c3p0 seems to use SourceForge for downloads, but keeps the documentation elsewhere.

Proxool looks like it could have potential - pity the user mailing lists are full of spam, so the likelihood of getting community involvement is remote.

C3P0 will do for today.
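For my own future reference, the relevant c3p0 settings look something like this - a sketch only, with hypothetical connection details, using the idle-connection-test properties from the c3p0 documentation:

```java
import com.mchange.v2.c3p0.ComboPooledDataSource;

public class PoolConfig {
    public static ComboPooledDataSource createDataSource() throws Exception {
        ComboPooledDataSource ds = new ComboPooledDataSource();
        ds.setDriverClass("com.mysql.jdbc.Driver");   // driver/url are placeholders
        ds.setJdbcUrl("jdbc:mysql://localhost/mydb");
        // Validate idle connections hourly - well inside MySQL's
        // 8-hour wait_timeout, so stale connections get weeded out.
        ds.setIdleConnectionTestPeriod(3600);
        ds.setPreferredTestQuery("SELECT 1");          // cheap validation query
        return ds;
    }
}
```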

Update: I just attended a SpringSource talk on tcServer in London, during which Mark Thomas mentioned that the Tomcat developers have been working on their own connection pooling implementation, and that he is in the process of contributing some fixes to DBCP. (No mention of c3p0).

Monday 6 April 2009

Busy, working again

I've been a bit quiet on the blogging front for a while, since I've been kept busy working again - hoorah!

I'm using some pretty cool technologies such as Grails and GigaSpaces. Groovy development reminds me a lot of my Perl programming days - loose typing, brackets can be optional for method calls, the last line of a method can implicitly act as the return value, and some built-in regular expression handling.

IDE support for Grails 1.1 is quite poor, but during tonight's London Spring User Group presentation on developing plugins for Grails I came to the realisation that this is hardly surprising, as I reckon the Grails plugin system is basically there to do the same job as IDEs have been doing - in terms of creating artefacts based on templates, at least.

Monday 23 February 2009

It's unique, just like all the others

So, I've been going to a few job interviews lately and I can honestly say that no two have been quite the same.

For the first few weeks after arriving in London, I was applying for contracting roles. It didn't take me long to realise that Spring and Hibernate are a big deal in the local job market, so I started to do a bit of upskilling. Winning 4 days' worth of training in the latest Spring technologies was a good start, but the timing of the course - December - was awkward, as I had to second-guess how it might be treated by a prospective employer. Good: training. Bad: 4 days off before he's even started really working.

Reading the books and having a course coming up was never going to be enough to get me up to the standard expected when a company is paying top £s for a specialist, so I eventually agreed to look at permanent roles.

The first interview was in Devon and involved a long train journey - which half convinced me that it was not going to be suitable, given that at least some of the reason for my moving to the UK was to be close to Ireland and Europe for quick weekend getaways. After I had proven my technical prowess in a coding exercise and a general "problem solving around the whiteboard" discussion, I had the panel interview with 2 techos and an HR person. I decided not to proceed with the team lead role they were looking to fill as they were also looking to change their entire development methodology - and I could see that being painful.

The next interview was for a role as a consultant and was run in a cafeteria with a couple of techos. The role would have involved working at client sites and from home - which doesn't really fit with my preference for developing as part of an agile team.

The third interview was unusual in that the company involved approached me directly after finding my CV online - as opposed to the agencies that I usually hear from. Their offices seemed nice, and the discussion went well - but I wasn't really expecting to hear back from them, given the timing and the industry sector they are in.

The fourth interview involved another out of town train ride, but ended promisingly with the guy saying HR should be in touch with me with an offer - hopefully before Christmas. So, leading up to Christmas I felt that my quest was at an end, but the holy grail of a steady income with that company ran away in corporate re-structuring and an unrelated subsequent recruitment freeze.

January proved to be quite a quiet month on the recruitment front, so I applied directly with a few companies and followed up on some companies that I had been in touch with pre-Christmas.

In two interviews I foolishly attempted to bluff my way through describing the design of a system that I had been responsible for developing. I kept tripping over how to describe a particular sub-system which a colleague had designed, but which was critical to the description of the product.

The feedback was mixed:
- one company said I was technically great, but recommended that I read a particular couple of books (one of which I had already read, the other I am in the process of reading right now);
- the other company told the agent that I gave the impression that I wasn't interested in working there

Given that I had just spoken to the agent to let them know that I would lower my salary expectations by 10%, as I felt that the company had a great approach and were really innovative in applying agile methodologies to their development, getting that kind of feedback came as a bit of a shock to me.

So, in the next interview I made a point of giving well thought out responses to the touchy-feely HR questions as well as the technical ones. I thought that I was on a roll when the interviewer said that a lot of candidates hadn't known some of the aspects that I had covered... The interviewer deliberately skipped his questions about Hibernate, as I had stated on my application that I did not have commercial experience with it. So, I think that I was justified in feeling frustrated upon hearing their excuse for not proceeding further with me: the other candidates they spoke to had more experience with Hibernate.

So, now I've finished reading Java Persistence With Hibernate and have started posting solutions to problems other people are reporting on the Hibernate forum site. I like to learn from other people's mistakes, and get a feel for which areas are proving to be challenging. It's been reassuring to find that the issues encountered are much the same as I found with JPA - but saying "I don't have experience with Hibernate, but I've got a year of JPA and 3 years of JDO" doesn't mean much to the recruiters.

Now that I have some casual part time work I should be able to fill a few of those silly gaps in the CV and be set to hit the ground running - maybe I could give contracting another look?

Friday 20 February 2009


I've been keeping an eye on JavaFX since I first heard about its impending public release in December 2008. At the time I was looking into developing rich UI applications within the browser - as Flex seemed to be taking off, but I didn't want to have to fork out money or jump through hoops to get a decent development environment set up.

Today I watched a video over at the website that introduced some of the features of the JavaFX language. Some aspects of it reminded me of my first formal education in computer science - COSC 121 at the University of Canterbury in Christchurch, New Zealand way back in 1995. We were taught about Modula-2, which is a procedural programming language similar to Pascal.

I'm a little curious to know why the implementers of the language chose to have parameter type and return type information specified after the variable name and parameter list respectively - like Modula-2 and co. - as opposed to having it prior to the variable name or function name, like Java and the C family of languages.


function area(radius : Number) : Number {
    return PI * (radius * radius);
}

compared to:

double area(double radius) {
    return PI * (radius * radius);
}

I'm not going to lose sleep over it, and can't look for more info at the moment as the Sun sites are experiencing some scheduled maintenance downtime.

The way that programmers can choose the size of increments in an expression also reminded me of Modula-2:

java.lang.System.out.println ([1..10 step 2]);

seems like one of Modula-2's looping mechanisms:

FOR Index := 5 TO 25 BY 4 DO

Tuesday 17 February 2009

Reporting Potholes

I've noticed on the news here in the UK lately that potholes in roads will have expanded due to the recent cold weather conditions - the old "freeze-thaw" effect.

In some areas people are being encouraged to report potholes to their local council offices - which got me to thinking, how about setting up a website linked to Google maps to allow members of the public to locate potholes in their neighbourhood?

A quick Google search suggests that I'm not the first to think of applying technology to achieve something like this. Then I came across something that looks like it already does the job.

Friday 13 February 2009

Implementation of the Singleton pattern

The last successful job interview that I had included a minor practical pen and paper exercise - "Show me how you would implement a Singleton in Java".

I'd been using the Singleton pattern for about 6 years by this stage, so I just jotted down the most basic implementation that I knew, something along the lines of:

public class Singleton {
    private static final Singleton instance = new Singleton();

    private Singleton() {}

    public static Singleton getInstance() {
        return instance;
    }
}

As I was doing this I thought of all the fancy examples I had seen in various projects that I have worked on and code snippets from various books that I had been reading at the time. So I did my typical self-deprecating thing of saying out loud, "I should probably have some lazy instantiation happening in the getInstance, but that could just be premature optimization".

That led into a discussion with the interviewer, who congratulated me on being one of the few candidates who had actually produced a truly threadsafe implementation. If I had delayed the instantiation until getInstance was called, then I would have had to introduce some synchronization to prevent the possibility of the instantiation happening multiple times, something along the lines of:

public final class Singleton {
    private static Singleton instance;

    private Singleton() {}

    // synchronized to prevent multiple instantiations of Singleton
    public static synchronized Singleton getInstance() {
        if (instance == null) {
            instance = new Singleton();
        }
        return instance;
    }
}

Then the overhead of the synchronization would have been introduced for each and every call to getInstance - not ideal.

Head First Design Patterns gives a good overview of a slightly better approach involving double-checked locking that I have added to my repertoire (suitable for JVMs 5 and upwards).
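For completeness, here's a sketch of that double-checked locking idiom - the key detail being the volatile field, which is what makes it safe on Java 5 and later:

```java
// Double-checked locking: lazy instantiation without paying the
// synchronization cost on every call. Only correct on Java 5+,
// because the memory model there gives volatile the needed semantics.
final class Singleton {
    private static volatile Singleton instance;

    private Singleton() {}

    public static Singleton getInstance() {
        if (instance == null) {                  // first check - no lock taken
            synchronized (Singleton.class) {
                if (instance == null) {          // second check - under the lock
                    instance = new Singleton();
                }
            }
        }
        return instance;
    }
}
```

Most threads skip the synchronized block entirely once the instance exists, so steady-state calls are as cheap as the eager version.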

Tuesday 10 February 2009

Relational Databases - PostgreSQL

Is it just me, or are there a lot more free and open source database systems out there than there were 3 or 4 years ago?

I was following a thread on a Hibernate forum the other day and they mentioned a couple of options that I've used in the past - PostgreSQL and MySQL, a couple I'd heard a little about (e.g. SQLite and Firebird) as well as something called H2.

Of these options, I prefer PostgreSQL. I've been using it on and off for almost a decade, but I don't think that necessarily makes me biased as I could just as easily have been frustrated by it if it wasn't such a flexible and reliable system.

The extensive feature set really saved me some embarrassment with a project that had a very tight schedule a few years back.

My team had developed some Java code based around the Geotools APIs to manipulate some GIS data in a shape file. That all appeared to be working fine - until I attempted to deploy it onto our in-house test server, a Linux system with the same setup as the server that we intended to host the web-based application on when it went live!

I did some digging around and established that the Shapefile manipulation was just for demonstration purposes (so would not have suited our needs anyway), then found that there was also the option of accessing the data via a PostGIS data source. So, I spent my Saturday afternoon building a new PostgreSQL system from source code - with the GIS extensions and the optimal geometry packages compiled in - did a bit of tweaking of the PostGIS access code in Java, and had the project right back on schedule for Monday morning's "So, what did you get up to on the weekend?" coffee break.

I subsequently got the chance to pick up the same set of technologies and have them in place from day 1 on another project, so got to spend some time looking for decent tools for visually verifying the data - uDig was the nicest free system I could find; it connects to PostGIS fairly easily and runs within an Eclipse framework.

Google Visualization

My brother was looking into setting up some charts on a website that he administers, so I offered to take a look into how involved it would be.

I was pleasantly surprised to see that what would previously have involved a lot of fiddling around with server side graphics manipulation APIs can now be achieved by using some nifty Javascript libraries.

So far I've only played around with some hard-wired fake data, so the next step is to wrap some real data in a DataTable and see whether it scales up nicely - with the usual cross browser testing.