This is just a very quick reminder that the Sydney ALT.NET group gets started tomorrow night at 6pm. More details at http://sydneyaltdotnet.blogspot.com/
Hopefully I'll see you there!
Do you live in Brisbane? Are you interested in Scrum and/or other agile methodologies? If so, you might want to get along to the Brisbane Scrum Users Group. Kudos to James Brett for getting this up and running, especially outside of Sydney.
If you're in Sydney and you want a regular meet up then get in touch with Lachlan Heasman who is thinking about doing the same here.
I've been working through some architectural choices with a team starting a new project recently, and the ever-running debate over database access philosophies came up (as it was bound to).
Personally I'm in favour of letting an ORM manage all the CRUD operations, as I see very little benefit in writing CRUD stored procedures (as do others). However, I'm also pragmatic and can see some value in the sprocs argument, though the purported benefits of security and performance don't wash with me (but that's for another blog post).
Anyhoo, this particular team decided they wanted stored procedures for ALL data access. That means CRUD procedures for every table and no direct table access whatsoever. Yet at the same time they wanted to use an ORM for the .NET side of the equation as coding ADO.NET can be painful at the best of times.
Now, as part of the architecture we had decided early on to use the Repository pattern for data access and ensure we had persistence ignorance for all our data classes. This lets us isolate the data access plumbing from the rest of the application which helps with the persistence ignorance in the rest of the app, and lets us work with POCO objects everywhere except in the repositories. However it doesn't solve what to actually use in the repositories themselves.
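To give a feel for that isolation, here's a minimal sketch of what the repository shape might look like. The names here (Customer, ICustomerRepository) are mine for illustration, not the project's actual types:

```csharp
using System.Collections.Generic;

// Illustrative sketch only - names are made up, not the project's.
// The rest of the app sees POCOs and this interface; only the
// repository implementation knows about the ORM (or the sprocs).
public class Customer                       // persistence-ignorant POCO
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    Customer GetById(int id);
    void Save(Customer customer);           // insert or update
    void Delete(Customer customer);
}

// A trivial in-memory implementation, handy for testing the rest
// of the app without touching a database at all.
public class InMemoryCustomerRepository : ICustomerRepository
{
    private readonly Dictionary<int, Customer> store = new Dictionary<int, Customer>();

    public Customer GetById(int id) { return store[id]; }
    public void Save(Customer customer) { store[customer.Id] = customer; }
    public void Delete(Customer customer) { store.Remove(customer.Id); }
}
```

The point is that swapping the ORM (or swapping to sprocs) only ever touches the repository implementations.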
So what to do? Some of the team wanted to use Linq To Sql, others wanted to use Microsoft's new Entity Framework and others suggested NHibernate. Too many opinions meant there was only one real way to decide. We decided to have an ORM to SProcs smackdown - may the best working code win!
The rules were simple... Take a very simple, 3-property, POCO business class and persist it to/from a database using nothing but stored procedures and an implementation of the repository pattern in the ORM of your choice. Get it done in 4 hours or less. The team would then vote for their preference.
Here are the results:
The Entity Framework has support for stored procedures for data access, so surely it would be simple to match things up, right? Wrong!! The developer who got the EF mission tried his best to get the tool to do what we wanted, but mapping the classes was a problem.
The stored procedure support is best described as simplistic; at its worst it's probably described using words not suitable for children to hear.
Even when the developer got something almost working, it still required creating a class that duplicated the POCO class and then manually mapping fields from one class to another. Gah! This is pretty much the same as using ADO.NET and mapping to parameters.
Also, the steps required to get the Entity Framework code "working" were so convoluted and confusing it would be a maintenance nightmare, even if it did work.
No vote was required for this. Everyone hated it.
There was some hope here - a working implementation was shown, however there are some major limitations.
First, Linq to Sql wants access to tables. Without table access you don't get any change tracking in your data context, negating many of its benefits.
Secondly, the designer support for stored procedures without table access is limited. Sure, you can drag the procs onto the designer surface, but it only helps a little, as the select stored procedures don't actually return classes you can easily use, but rather ISingleResult<T> objects. To get these into our business layer POCOs we needed to map them manually, using either a LINQ projection or standard property assignments in code.
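To show what that manual mapping looks like, here's a sketch. GetPersonResult stands in for the row class the designer generates for you (it's a hypothetical name, as is Person - and crucially, the generated class is not your POCO):

```csharp
// Hypothetical sketch only - GetPersonResult stands in for the
// designer-generated row class that comes back via ISingleResult<T>.
public class Person                  // our business layer POCO
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class GetPersonResult         // designer-generated shape
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class PersonMapping
{
    // The manual projection from sproc result rows to the POCO -
    // this is the busywork you end up writing for every select proc.
    public static Person ToPerson(GetPersonResult row)
    {
        return new Person { Id = row.Id, Name = row.Name };
    }
}
```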
Updates via stored procedures were also a problem. Not having table access means you won't get any value from implementing the partial UpdateXXX() methods the Linq data context provides, as they'll never be called. Instead you have to call the update stored procedures directly and manually map your object's properties to the stored procedure's parameters.
Further, if you use an insert procedure and database generated primary key fields then you have to remember to get the new value of the key from the database and update the POCO's properties with the new value.
It's workable, but no one in the team was that impressed with it. It received zero votes.
The implementation using NHibernate was also shown to be workable.
A hibernate mapping file was created that had a class definition with specific overrides for the update, insert and delete methods so they called out to stored procedures. There was also an override for the object loader to use a select stored procedure.
The only change to the POCO was that the properties needed to be made virtual. We need to investigate this a little more, but I'm going to assume it's something to do with NHibernate creating proxy versions of the classes for internal use.
Other than that the actual repository implementation was very simple. Pass an object to NHibernate's session and call Flush at the appropriate times.
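For the curious, the mapping file looked roughly like this. All the class, proc and parameter names below are made up for illustration, and a real mapping would need the column details filled in properly:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative sketch only; the real class and proc names differed. -->
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2"
                   assembly="MyApp" namespace="MyApp.Domain">
  <class name="Person">
    <id name="Id">
      <generator class="assigned" />
    </id>
    <property name="Name" />

    <!-- Override the object loader to use a select sproc -->
    <loader query-ref="GetPerson" />

    <!-- Override insert/update/delete to call sprocs -->
    <sql-insert>exec PersonInsert ?, ?</sql-insert>
    <sql-update>exec PersonUpdate ?, ?</sql-update>
    <sql-delete>exec PersonDelete ?</sql-delete>
  </class>

  <sql-query name="GetPerson">
    <return class="Person" />
    exec PersonGet :id
  </sql-query>
</hibernate-mapping>
```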
When a vote was called for everyone went for this method. If I get some time I'll try and explain how it all works (or you could just do a Google search for it!).
So NHibernate method wins, but the scenario was simplistic. There may be some further issues that we need to deal with such as complex procs, optimistic concurrency and so forth, but even so there is more confidence that NHibernate will deliver the goods. It has a large user base, it's been around for quite some time now and we're pretty sure other people have tried this before.
On the other side of the equation, Linq to Sql and the Entity Framework are surprisingly and embarrassingly inept at this. If you don't have direct table access then you might as well forget about using these tools and stick with ADO.NET or code generation. The simple fact that data access via stored procedures alone is such a common requirement in enterprise databases, and that these two "modern" offerings from Microsoft just don't work with it, is downright perplexing. Seriously, Microsoft recommends for years that people use stored procs, and then releases tools that are only partially functional in a pure stored procedure environment. What gives?!
Well, I think that's enough of a rant for now - time to get back to doing some real work!
I posted previously about the Twitter Build Publisher (aka the BuildTweeter) - a little utility that lets you publish build results and quality change events from Team Foundation Server and Team Build to Twitter.
Well I've finally managed to find enough time to put the project up on codeplex for public consumption. If you want to grab a copy of it, mosey on over and download a copy now from http://www.codeplex.com/BuildTweeter/ and feel free to contribute.
I'm happy to let you know that we now have some details for where and when the ALT.NET user group meetings will be.
We will be meeting on the last Tuesday of every month at the Thoughtworks offices in Sydney. Many thanks go to Thoughtworks for letting us make use of their facilities. Their generosity is greatly appreciated.
Level 8, 51 Pitt Street
Sydney NSW 2000 Australia
Tuesday, September 30th
6pm - 8pm
Welcome & Latest Happenings
Presentation 1 (30 mins)
Break for food & socialising
Presentation 2 (30 mins)
We'll have the 2 presentations finalised in the next few days, however if you've got anything you'd like to see discussed, please bring your ideas along.
We look forward to seeing you there and please spread the word!
Tech.Ed Australia is over for another year and for me this Tech.Ed was very different to the others I have been to. Previously I have attended as a delegate and I've simply gone to sessions and absorbed as much information and opinion from the presenters as I could.
This year I was fortunate enough to be given a breakout session on the Architecture track in which I talked about Dependency Injection, Inversion of Control and Unity (the IoC container that is part of Enterprise Library 4). I also had a "chalk and talk" session where I gave a 20-minute version of my agile presentation.
Both sessions were well attended, and to everyone who came, especially those who gave me constructive feedback, I'd like to pass on a big thank you.
Apart from my sessions I also spent time getting the TFS environment and automated build and deployment scripts set up for the Tech.Ed DevGarten project where we were redeveloping the UNICEF web site, and I spent a fair bit of one-on-one time with some of the conference delegates who wanted to talk about either unity or agile development.
As for the other breakout sessions I only made it to a handful - I went to Scott Hanselman's presentation on ASP.NET MVC so I could pick up some tips on presentation style, a session on dynamic languages (Iron Python, Iron Ruby, etc) and one on F# so I could expand my thinking beyond C# and VB.NET, and also one on managing complex development as I've worked with one of the presenters and was curious to see what would be said.
All up I thought it was a great Tech.Ed and I'd like to thank Microsoft and the DPE/marketing teams for organising such a great event. Well done to all!
The command pattern can be a powerful thing to use and yet it can be a right royal pain in the ar... let's just say it's annoying to implement at times. Why? Because each individual command needs to be represented as an object, and in a system that supports many commands you end up with a class explosion.
For those of you who need a refresher on the command pattern it is stated as follows:
The command pattern encapsulates a request as an object, thereby letting you parameterise other objects with different requests, queue or log requests, and support undoable operations.
Yeah, great - OK so in plain English that means you create classes to represent commands and those command classes simply make calls to other objects when you execute them. It's a fairly simple pattern.
So let's take as an example the command pattern implementation from the awesome Head First Design Patterns book and implement it in .NET instead of Java.
First we declare an interface for our command objects:
public interface ICommand
{
    void Execute();
}
Then we create a command to turn a light on:
public class LightOnCommand : ICommand
{
    private Light light;

    public LightOnCommand(Light light)
    {
        this.light = light;
    }

    public void Execute()
    {
        light.On();
    }
}
And finally we create a simple remote control class:
public class SimpleRemoteControl
{
    private ICommand slot;

    public void SetCommand(ICommand command)
    {
        slot = command;
    }

    public void ButtonWasPressed()
    {
        slot.Execute();
    }
}
So now if we wanted to use this in our code we could have a test like the following:
public void BasicExecuteTest()
{
    SimpleRemoteControl remote = new SimpleRemoteControl();
    Light light = new Light();
    LightOnCommand lightOn = new LightOnCommand(light);

    remote.SetCommand(lightOn);
    remote.ButtonWasPressed();
}
Easy enough. But now, let's say we want to add a command to turn our light off. We would have to implement a new LightOffCommand class. If we were using a TV remote we would then need commands for each of the channels, the volume buttons, a channel up/down button, an input selector, etc. As you can imagine, once you start creating a class per command you can easily end up with hundreds of classes, all of which look pretty much the same.
This is where generics and lambda expressions in .NET 3.5 can help us.
Let's take our ICommand interface and implement it with a generic class:
public class Command<T> : ICommand
{
    private T target;
    private Action<T> command;

    public Command(T target, Action<T> command)
    {
        this.target = target;
        this.command = command;
    }

    public void Execute()
    {
        command(target);
    }
}
We now have a single class that is able to represent a whole range of commands without getting us into the class explosion situation we had previously.
The Action<T> command field holds a lambda expression that doesn't return a value. If we wanted a value returned we would need to use Func<T, TResult> instead.
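As a sketch of what that might look like - QueryCommand is my own illustrative name here, and note it can't implement our ICommand interface since Execute now returns a value:

```csharp
using System;

// A value-returning variant of the generic command, built on
// Func<T, TResult> instead of Action<T>. Names are illustrative.
public class QueryCommand<T, TResult>
{
    private readonly T target;
    private readonly Func<T, TResult> command;

    public QueryCommand(T target, Func<T, TResult> command)
    {
        this.target = target;
        this.command = command;
    }

    public TResult Execute()
    {
        return command(target);
    }
}
```

Usage would be something like `var isOn = new QueryCommand<Light, bool>(light, l => l.IsOn).Execute();` (assuming a hypothetical IsOn property on Light).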
When we want to create a command we now pass the object we want to take action on, as well as a command to execute. So now we can refactor our test and change it as follows:
public void BasicExecuteTest()
{
    SimpleRemoteControl remote = new SimpleRemoteControl();
    Light light = new Light();
    var lightOn = new Command<Light>(light, l => l.On());

    remote.SetCommand(lightOn);
    remote.ButtonWasPressed();
}
So here we see that the LightOnCommand has been replaced with the generic Command<Light> class, and that we are passing both the object to target and a lambda expression for the command itself.
What this means is that we can now declare commands as follows without needing to create individual classes for each command:
var lightOff = new Command<Light>(light, l => l.Off());
var strobeLight = new Command<Light>(light, l => { l.On(); l.Off(); });
So much nicer!
We now have an implementation of the command pattern that is not only powerful, but much easier to code and maintain as well.
Of course, if there's a lot of command reuse in your application this implementation of the command pattern might not be for you as it could easily lead to cut & paste development, though with judicious use of an IoC container you should be able to overcome that particular issue.