May 2008

I'm currently reading...

Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development (3rd Edition)
by Craig Larman

Read more about this book...

(I'm actually reading the 2nd edition)

This is a good book. It started off really boring... really boring! I picked up this book as an introduction to object-oriented programming, and it started off with a lot of talk about UML, documentation and the Rational Unified Process. But then I got to chapter 16... "GRASP: Designing Objects with Responsibilities".

Here's an excerpt that I enjoyed!

"Perhaps the most common mistake when creating a domain model is to represent something as an attribute when it should have been a concept. A rule of thumb to help prevent this mistake is:

If we do not think of some conceptual class X as a number or text in the real world, X is probably a conceptual class, not an attribute."
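
A quick way to see this rule in code (my own sketch, not the book's): modeling a flight's destination as plain text hides a concept, while a small Airport class makes it explicit.

```csharp
// Hypothetical domain classes for illustration only.
public class Airport {
    public string Name { get; set; }
}

public class Flight {
    // An airport is not "a number or text in the real world",
    // so it earns a conceptual class instead of a string attribute.
    public Airport Destination { get; set; }
}
```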

Here's a definition for a Domain Model...

"The Domain Model provides a visual dictionary of the domain vocabulary and concepts from which to draw inspiration for the naming of some things in the software design."

Chapter 16 is great so far. It talks about how to decompose responsibilities for objects using an acronym (I'm not a fan of acronyms) called GRASP, which stands for General Responsibility Assignment Software Patterns.

Craig goes on to talk about the first five GRASP patterns. They are:

  • Information Expert: the class that has the information necessary to fulfill the responsibility.
  • Creator: a class that has the responsibility to create an instance of another class.
  • High Cohesion: increase the measure of how strongly related and focused the responsibilities of an element are.
  • Low Coupling: decrease the amount a class is connected to, has knowledge of, or relies on other elements.
  • Controller: a class with the responsibility of receiving or handling a system event message.
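
To make the first couple of patterns a bit more concrete, here's a tiny sketch of my own (not code from the book). The Sale holds the line items, so Information Expert says the total-calculation responsibility belongs there; Creator suggests Sale also creates its own LineItems, since it aggregates them.

```csharp
using System.Collections.Generic;
using System.Linq;

public class LineItem {
    public decimal Price { get; set; }
    public int Quantity { get; set; }

    // Information Expert: the line item has the price and quantity,
    // so it computes its own subtotal.
    public decimal Subtotal() {
        return Price * Quantity;
    }
}

public class Sale {
    private readonly List<LineItem> items = new List<LineItem>();

    // Creator: Sale aggregates LineItems, so it is a natural place to create them.
    public void AddItem(decimal price, int quantity) {
        items.Add(new LineItem { Price = price, Quantity = quantity });
    }

    // Information Expert: Sale has the items, so Sale computes the total.
    public decimal Total() {
        return items.Sum(item => item.Subtotal());
    }
}
```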

A couple of days ago I posted something on an XmlEnumerable: an object that knows how to traverse an XML document in a linear form. After talking with Adam, he suggested that I simplify the implementation with a little XPath action.

public class XmlElementEnumerable : IEnumerable<IXmlElement> {
    private XmlElement rootElement;
    private IMapper<XmlElement, IXmlElement> mapper;

    public XmlElementEnumerable(XmlElement rootElement) {
        this.rootElement = rootElement;
        mapper = new XmlElementMapper();
    }

    public IEnumerator<IXmlElement> GetEnumerator() {
        foreach (var node in rootElement.SelectNodes("//*")) {
            yield return mapper.MapFrom(node.DownCastTo<XmlElement>());
        }
    }

    IEnumerator IEnumerable.GetEnumerator() {
        return GetEnumerator();
    }
}

Diving a little deeper, I think using XPath expressions is probably a lot more efficient for traversing a document.
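
For example, a single XPath expression against the stock System.Xml API can answer questions that would otherwise take a hand-written walk (a small sketch, not code from the post):

```csharp
using System.Xml;

var document = new XmlDocument();
document.LoadXml("<root><GrandParent/><GrandParent/><Cousin/></root>");

// "//*" selects every element in the document, including the root.
int allElements = document.SelectNodes("//*").Count;            // 4

// A more specific expression narrows the traversal in one expression.
int grandParents = document.SelectNodes("//GrandParent").Count; // 2
```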

If you ever need to traverse each XML element in an XML document, you may want to implement your own XmlEnumerable. I've had some issues with the .NET XML API recently. The built-in .NET XmlElement implements the non-generic IEnumerable, which means you've got to foreach through a bunch of objects.

foreach (object o in rootElement) {
}

This kind of scares me a bit because of the XML object hierarchy: there are several subclasses of XmlNode, and trying to understand that object hierarchy is not interesting to me.


Rather than having to check whether each item is an XML element, we just created our own abstraction that we prefer to work with, and map from the framework's XmlElement to our own IXmlElement.
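
As an aside, if LINQ is available, `Enumerable.OfType<T>()` is another way to skim the elements out of the non-generic node list without hand-written type checks (a sketch of an alternative, not the abstraction built here):

```csharp
using System.Linq;
using System.Xml;

var document = new XmlDocument();
document.LoadXml("<root><!-- comment --><Child/>some text<Child/></root>");

// ChildNodes is a non-generic XmlNodeList: it mixes comments, text nodes
// and elements. OfType<XmlElement>() keeps only the elements.
var elements = document.DocumentElement.ChildNodes.OfType<XmlElement>().ToList();
// elements contains just the two <Child/> nodes.
```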

public interface IXmlElement : IEquatable<IXmlElement>, IEnumerable<IXmlElement> {
    string Name();
    string ToXml();
}

Let's say you need to traverse an XML document that looks like this:

<root>
  <GrandParent>
    <Parent>
      <Child>
        <GrandChild></GrandChild>
      </Child>
    </Parent>
  </GrandParent>
  <GrandParent>
    <Parent>
      <Child>
        <GrandChild></GrandChild>
      </Child>
    </Parent>
  </GrandParent>
  <Cousin></Cousin>
</root>

If we were to traverse this document, we would expect to find 10 elements:

[Test]
public void should_traverse_through_each_element() {
    CreateSUT().Count().ShouldBeEqualTo(10);
}

[Test]
public void should_contain_one_root_element() {
    CreateSUT()
        .Where(x => x.Name().Equals("root"))
        .Count()
        .ShouldBeEqualTo(1);
}

[Test]
public void should_contain_two_grand_parents() {
    CreateSUT()
        .Where(x => x.Name().Equals("GrandParent"))
        .Count()
        .ShouldBeEqualTo(2);
}

We could walk this XML structure and query it using an API that we prefer, by building our own IEnumerable and extension methods for querying.

public class XmlElementEnumerable : IEnumerable<IXmlElement> {
    private XmlElement rootElement;
    private IMapper<XmlElement, IXmlElement> mapper;

    public XmlElementEnumerable(XmlElement rootElement) {
        this.rootElement = rootElement;
        mapper = new XmlElementMapper();
    }

    public IEnumerator<IXmlElement> GetEnumerator() {
        yield return mapper.MapFrom(rootElement);
        foreach (var element in RecursivelyWalkThrough(rootElement)) {
            yield return mapper.MapFrom(element);
        }
    }

    IEnumerator IEnumerable.GetEnumerator() {
        return GetEnumerator();
    }

    private IEnumerable<XmlElement> RecursivelyWalkThrough(XmlNode element) {
        if (element.HasChildNodes) {
            foreach (var childNode in element.ChildNodes) {
                if (childNode is XmlElement) {
                    yield return childNode.DownCastTo<XmlElement>();
                    foreach (var xmlElement in RecursivelyWalkThrough(childNode.DownCastTo<XmlElement>())) {
                        yield return xmlElement;
                    }
                }
            }
        }
    }
}

Now you can traverse your own XML data structures using a more strongly typed API that suits your needs. For example:

public class RawXmlElement : IXmlElement {
    public RawXmlElement(string rawXml) {
        _rawXml = rawXml;
    }

    public string ToXml() {
        return _rawXml;
    }

    public string Name() {
        return Parse.Xml(this).ForItsName();
    }

    public bool Equals(IXmlElement other) {
        return other != null && other.ToXml().Equals(_rawXml);
    }

    public override bool Equals(object obj) {
        return ReferenceEquals(this, obj) || Equals(obj as IXmlElement);
    }

    public override int GetHashCode() {
        return _rawXml != null ? _rawXml.GetHashCode() : 0;
    }

    IEnumerator IEnumerable.GetEnumerator() {
        return GetEnumerator();
    }

    public IEnumerator<IXmlElement> GetEnumerator() {
        return new XmlEnumerable(this).GetEnumerator();
    }

    public override string ToString() {
        return _rawXml;
    }

    private readonly string _rawXml;
}


public class SingleXmlElement<T> : IXmlElement {
    public SingleXmlElement(string elementName, T elementValue) {
        this.elementName = elementName;
        this.elementValue = elementValue;
    }

    public string ToXml() {
        return ToString();
    }

    public string Name() {
        return Parse.Xml(this).ForItsName();
    }

    public IEnumerator<IXmlElement> GetEnumerator() {
        return new XmlEnumerable(this).GetEnumerator();
    }

    public override string ToString() {
        return string.Format("<{0}>{1}</{0}>", elementName, elementValue);
    }

    public bool Equals(IXmlElement other) {
        return other != null && ToString().Equals(other.ToXml());
    }

    public override bool Equals(object obj) {
        return ReferenceEquals(this, obj) || Equals(obj as IXmlElement);
    }

    public override int GetHashCode() {
        return
            (elementName != null ? elementName.GetHashCode() : 0) +
            29*(elementValue != null ? elementValue.GetHashCode() : 0);
    }

    IEnumerator IEnumerable.GetEnumerator() {
        return GetEnumerator();
    }

    private readonly string elementName;
    private readonly T elementValue;
}

Hopefully, this helps someone else who's drowning in XML!

Tim Ferriss writes:

"Brain activation for listening is cut in half if the person is trying to process visual input at the same time. A recent study at The British Institute of Psychiatry showed that checking your email while performing another creative task decreases your IQ in the moment 10 points."

This post is definitely worth reading!

I received a question the other day about building menus in a WinForms application. I wasn't sure of a clean way of doing it, so I thought I would put together a sample app to see if I could come up with something. I'm not sure I'm completely happy with what I've got so far, but my goal was to be able to drop in new menu items and menu groups without a lot of ceremony and configuration.

The guts of it depend on Castle Windsor to glue most of the pieces together using the mass component registration API. I found it really hard to test, but was pleased with how easily it just kind of worked!

public class WindsorContainerFactory : IWindsorContainerFactory {
    private static IWindsorContainer container;
    private IComponentExclusionSpecification criteriaToSatisfy;

    public WindsorContainerFactory() : this(new ComponentExclusionSpecification()) {}

    public WindsorContainerFactory(IComponentExclusionSpecification criteriaToSatisfy) {
        this.criteriaToSatisfy = criteriaToSatisfy;
    }

    public IWindsorContainer Create() {
        if (null == container) {
            container = new WindsorContainer();
            container.Register(
                AllTypes
                    .Pick()
                    .FromAssembly(GetType().Assembly)
                    .WithService
                    .FirstInterface()
                    .Unless(criteriaToSatisfy.IsSatisfiedBy)
                    .Configure(
                        delegate(ComponentRegistration registration) {
                            this.LogInformational("{1}-{0}", registration.Implementation, registration.ServiceType.Name);
                            if (registration.Implementation.GetInterfaces().Length == 0) {
                                registration.For(registration.Implementation);
                            }
                        })
                );
        }
        return container;
    }
}

The other neat piece that kind of made things easy to get up and running was the concept of a default repository. (I picked up this bit of knowledge from Oren at DevTeach.)

public class DefaultRepository<T> : IRepository<T> {
    private IDependencyRegistry registry;

    public DefaultRepository(IDependencyRegistry registry) {
        this.registry = registry;
    }

    public IEnumerable<T> All() {
        return registry.AllImplementationsOf<T>();
    }
}

This was the only implementation of a repository in the system, and it was used for an IRepository<IMenuItem> and an IRepository<ISubMenu>. I just created a new implementation of an IMenuItem or ISubMenu, and it was picked up via Windsor's mass component registration.

public class MainMenuPresenter : IMainMenuPresenter {
    private readonly IMainMenuView mainMenu;
    private readonly IRepository<ISubMenu> repository;
    private readonly ISubMenuItemComparer comparer;

    public MainMenuPresenter(IMainMenuView mainMenu, IRepository<ISubMenu> repository, ISubMenuItemComparer comparer) {
        this.mainMenu = mainMenu;
        this.repository = repository;
        this.comparer = comparer;
    }

    public void Initialize() {
        foreach (var subMenuToAddToMainMenu in repository.All().SortedUsing(comparer)) {
            mainMenu.Add(subMenuToAddToMainMenu);
        }
    }
}

I also spent a little time playing with Gallio. I had some issues with conflicts between the version of Castle.MicroKernel that I was toying with and the one that comes with Gallio. I wasn't able to resolve the issue, but after looking into the concept behind Gallio, I like the idea. Kind of neat stuff!

Here's what I came up with... Thank you, Mr. JP, for the inspiration!

Source can be downloaded here!

I can't stress how many ideas in this project came from concepts learned from the Nothin' But .NET boot camp. If you're in the area, you should definitely go check out the Vancouver course coming up next month!

Last week my family and I were in Toronto, Ontario so that I could attend DevTeach, a conference put on by developers for developers, and it was a tonne of fun. Not only did my wife, daughter and I get to check out Toronto and visit family, but I got to bump into some more of the industry's greats and hear them speak.

Before I continue, I've got to plug this little cafe that we accidentally stumbled into one night. My daughter, wife and her cousin were out looking for the MuchMusic building when we got a little lost. We ended up walking down McCaul Street and spotted this tiny little cafe on the corner of Elm St. It looked pretty cool from the outside and just seemed kind of out of place. We're so glad we stopped in... The place was called "MangiaCake Panini Shoppe" and they specialized in paninis and, you guessed it, cake!

We tried a piece of the cherry cheesecake, the chocolate cake, and the carrot cake, as well as a salad, a couple of paninis, and a lasagna for myself. It was absolutely awesome! The best part was the additional attention we got from the owner, Raj. He was just great and made the experience so much more...

If you're in the Toronto, Ontario area you have to check out MangiaCake Panini Shoppe located at 160 McCaul Street.

Back to the conference...

Day 1: Tuesday, May 13, 2008

8-9:15am: Keynote by Scott Hanselman

Scott talked about ASP.NET Dynamic Data web applications, Astoria, and tools like the Fiddler HTTP proxy, LINQPad, and TcpTrace.

9:30-11:00am: Home-Grown Production System Monitoring: Creating a Bridge Between Development and Operations by Owen Rogers

I really enjoyed Owen's talk. I thought it was informative and backed by real project experience. Some of the things I learned:

Problems with log files:

  • scattered
  • not analyzed
  • not accessible
  • size constrained
  • multiple logs (different time zones?)

You should log for immediate data, and limit the footprint of logging on client machines. Owen mentioned that a great book to read is "Release It!" by Michael Nygard.

11am-12:15pm: Behavior Driven Development Installed by David Laribee and Scott Bellware

This was a great session that showcased the direction BDD is taking and what it means. Some of the things I learned:

  • User stories should not contain UI or technical language.
  • We should try to get our end users to help write the stories.
  • Acceptance criteria contain the technical details.
  • Break apart the product backlog from the release backlog and the iteration backlog.
  • When writing context-based specifications, use the active voice instead of the passive voice. E.g. "when an account has been opened" is in the passive voice; the active voice says "when opening an account".
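
The naming difference shows up directly in context-style spec classes. A sketch with hypothetical names, no particular framework assumed:

```csharp
// Passive voice reads like a finished state:
public class when_an_account_has_been_opened { }

// Active voice describes the behavior as it happens:
public class when_opening_an_account {
    public void should_have_a_zero_balance() {
        // observation goes here
    }
}
```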

1:30-2:45pm: How to make scrum really work by Joel Semeniuk and Turning Visual Studio Into a Software Factory by Kevin McNeish

I bounced out of the Scrum talk as soon as we started getting into Team Foundation Server, and the software factory talk wasn't exactly what I expected.

3:00-4:15pm: Achieving Persistence Ignorance with NHibernate by James Kovacs

This was a good talk that discussed alternatives to Active Record and how to implement an infrastructure-ignorant domain model. It covered different settings in NHibernate, how to create the mapping files, and, most importantly, why you would want an infrastructure-ignorant domain model.

4:30pm-5:45pm: Rapid (maintainable) web development with MonoRail by Oren Eini

This was another good talk that walked through the creation of a project using MonoRail. Oren talked about the different conventions used by MonoRail and put it in contrast to the MS MVC framework. I'm definitely more curious about MonoRail and itchin' to slap something together using it.

Day 2: Wednesday, May 14, 2008

8-9:15am: Cross-platform Development with Mono by Geoff Norton and Planned Agility?! by David Laribee

The Mono talk was great, and actually got me pretty excited about the project. I'm surprised by just how much the Mono team has been able to accomplish, and by the quick turnaround on releases. I'm definitely going to have to spend some time learning more about the project.

The Mono talk ended a little early, so I popped into David Laribee's talk on planned agility. This was a great talk on how to bring Agile into your projects. I guess it's still a little surprising to me how many companies are still working with traditional methodologies, so it makes me feel pretty privileged to work where I do and with the great guys that I work with.

9:30-10:45am: Recommended Practices for Continuous Integration by Owen Rogers

This was another great talk on the concepts of Continuous Integration and how to achieve it with an automated build server. Owen talked about the inception of the CruiseControl.NET project and shared his experiences with how people were using it effectively and how people were abusing it.

11:00am-12:15pm: Busy .NET Developer's Guide to F# by Ted Neward

Mr. Ted knows his stuff. This was a great talk about F# and the functional programming paradigm. A lot of it was over my head, but I enjoyed the discussion around why this is important and what are some of the potential benefits of this style of development. Concurrency and side effect free functions were topics that kept coming up. I will definitely have to commit some time to better understand functional programming.

1:30pm-2:45pm: Blackbelt Configuration for New Projects by Jeffrey Palermo

Mr. Jeffrey gave a great talk on how to take control of your projects, offering suggestions on project structure, how to set up a single-user development environment, the importance of version control, dependency management, the importance of automated deployments, and application architecture.

To be continued...

A great book to read is...

Refactoring: Improving the Design of Existing Code (The Addison-Wesley Object Technology Series)
by Martin Fowler, Kent Beck, John Brant, William Opdyke, Don Roberts

Read more about this title...

"Any fool can write code that a computer can understand. Good programmers write code that humans can understand."

"The first time you do something, you just do it. The second time you do something similar, you wince at the duplication, but you do the duplicate thing anyway. The third time you do something similar, you refactor."

Introduce Local Extension: A server class you are using needs several additional methods, but you can't modify the class.

Create a new class that contains these extra methods. Make this extension class a subclass or a wrapper of the original.

E.g From this...

public interface IController {
    void Execute();
}

public class Controller : IController {
    protected void RenderView(string name, object data) {
        //... note that this is a protected method
    }

    public void Execute() {
        //...
    }
}

To this...

public interface IViewRenderer {
    void Render<T>(string name, T data);
}

public class LocalExtensionController : Controller, IViewRenderer {
    public void Render<T>(string name, T data) {
        RenderView(name, data);
    }
}

Replace Conditional with Polymorphism: You have a conditional that chooses different behavior depending on the type of an object.

Move each leg of the conditional to an overriding method in a subclass. Make the original method abstract.

E.g From this...

public class Bird {
    public Bird(BirdType type) {
        _type = type;
    }

    public double GetSpeed() {
        switch (_type) {
            case BirdType.EUROPEAN:
                return 5;
            case BirdType.AFRICAN:
                return 10;
            case BirdType.NORWEGIAN_BLUE:
                return 20;
        }
        throw new ArgumentException();
    }

    private BirdType _type;
}

public enum BirdType {
    EUROPEAN,
    AFRICAN,
    NORWEGIAN_BLUE
}

To this...

public interface IBird {
    double GetSpeed();
}

public class EuropeanBird : IBird {
    public double GetSpeed() {
        return 5;
    }
}

public class AfricanBird : IBird {
    public double GetSpeed() {
        return 10;
    }
}

public class NorwegianBlueBird : IBird {
    public double GetSpeed() {
        return 20;
    }
}
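
One thing worth noting: the type-based switch rarely disappears entirely; it usually collapses into a single factory at the creation point, while every other conditional becomes a polymorphic call. A sketch assuming the IBird classes above (BirdFactory is my addition, not from the book):

```csharp
using System;

public static class BirdFactory {
    // The only remaining type-based switch lives here, at creation time.
    public static IBird Create(BirdType type) {
        switch (type) {
            case BirdType.EUROPEAN: return new EuropeanBird();
            case BirdType.AFRICAN: return new AfricanBird();
            case BirdType.NORWEGIAN_BLUE: return new NorwegianBlueBird();
            default: throw new ArgumentException("Unknown bird type: " + type);
        }
    }
}
```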

Q: Should I be worried if my username and password are sent back and forth to a server in clear text, in a cookie, upon each request?


A couple of months ago I finished reading...

xUnit Test Patterns: Refactoring Test Code (The Addison-Wesley Signature Series)
by Gerard Meszaros

Read more about this book...


This was a thick book that discusses unit test smells, unit test refactorings, unit test patterns... and just about anything else related to unit testing. Here's a little of what I've learned from this book...

Defect Localization

"Mistakes happen! Of course, some mistakes are much more expensive to prevent than to fix. Suppose a bug does slip through somehow and shows up in the Integration Build. If our unit tests are fairly small (i.e., we test only a single behavior in each one), we should be able to pinpoint the bug quickly based on which test fails. This specificity is one of the major advantages that unit tests enjoy over customer tests. The customer tests tell us that some behavior expected by the customer isn't working; the unit tests tell us why. We call this phenomenon Defect Localization. If a customer test fails but no unit tests fail, it indicates a Missing Unit Test."

Tests as Documentation

"Without automated tests, we would need to pore over the SUT code trying to answer the question, 'What should be the result if ...?' With automated tests, we simply use the corresponding Tests as Documentation; they tell us what the result should be. If we want to know how the system does something, we can turn on the debugger, run the test, and single-step through the code to see how it works. In this sense, the automated tests act as a form of documentation for the SUT."

"When it is not important for something to be seen in the test method, it is important that it not be seen in the test method!"

Test Doubles

A test double is any object or component that we install in place of the real component for the express purpose of running a test. Depending on the reason why we are using it, a Test Double can behave in one of five ways.

  • Dummy Object: an object that is passed to the SUT as an argument but is never actually used.
  • Test Stub: an object that replaces a real component that the SUT depends on, so that different inputs can be applied against the SUT.
  • Test Spy: an object that can act as an observation point for the indirect outputs of the SUT.
  • Mock Object: an object that replaces a real component that the SUT depends on, in order to test and verify indirect outputs.
  • Fake Object: an object that replaces the functionality of the real SUT dependency with an alternate implementation that provides the same functionality.
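
Hand-rolling a couple of these makes the distinctions obvious. Here's a sketch against a hypothetical IMailService dependency (the names are mine, not from the book):

```csharp
using System.Collections.Generic;

// A hypothetical dependency of some SUT.
public interface IMailService {
    void Send(string to, string body);
}

// Dummy Object: passed in to satisfy the signature, never actually used.
public class DummyMailService : IMailService {
    public void Send(string to, string body) { }
}

// Test Spy: records the SUT's indirect outputs for later verification.
public class MailServiceSpy : IMailService {
    public readonly List<string> Recipients = new List<string>();

    public void Send(string to, string body) {
        Recipients.Add(to);
    }
}
```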

Strict vs Loose

Mock objects come in two basic flavors:

  • Strict Mock: fails the test if incorrect calls are received.
  • Loose (lenient) Mock: fails if expected calls are not received, but is lenient if additional calls are received.

Need-Driven Development

"This 'outside-in' approach to writing and testing software combines the conceptual elegance of the traditional 'top-down' approach to writing code with modern TDD techniques supported by Mock Objects. It allows us to build and test the software layer by layer, starting at the outermost layer before we have implemented the lower layers."

Test Smells

"Developers Not Writing Tests may be caused by an overly aggressive development schedule, supervisors who tell developers not to 'waste time writing tests,' or developers who do not have the skills to write tests. Other potential causes might include an imposed design that is not conducive to testing or a test environment that leads to Fragile Tests. Finally, this problem could result from Lost Tests - tests that exist but are not included in the All Tests Suite used by developers during check-in or by the automated build tool."

"Another productivity-sapping smell is Frequent Debugging. Automated unit tests should obviate the need to use a debugger in all but rare cases, because the set of tests that are failing should make it obvious why the failure is occurring. Frequent Debugging is a sign that the unit tests are lacking in coverage or are trying to test too much functionality at once."

Fragile Test: "A test fails to compile or run when the SUT is changed in ways that do not affect the part the test is exercising... Fragile tests increase the cost of test maintenance by forcing us to visit many more tests each time we modify the functionality of the system or the fixture."

This headache is typical if you're working with strict mock objects. I experienced this pain while working on a project using NMock. I couldn't find a clean separation between strict and loose mocks in NMock; there was only the concept of strict mocks and stubs.

Slow Tests: "The tests take too long to run... They reduce the productivity of the person running the test. Instead, the developers wait until the next coffee break or another interruption before running them. Or, whenever they run the tests, they walk around and chat with other team members..."

The main disadvantages of using Fit are described here:

  • The test scenarios need to be very well understood before we can build the Fit Fixture. We then need to translate each test's logic into a tabular representation; this isn't always a good fit.
  • The tests need to employ the same SUT interaction logic in each test. To run several different styles of tests, we would probably have to build one or more different fixtures for each style of test. Building a new fixture is typically more complex than writing a few Test Methods.
  • Fit tests aren't normally integrated into developers' regression tests that are run via xUnit. Instead, these tests must be run separately - which introduces the possibility that they will not be run at each check-in.


Pull System

"A concept from lean manufacturing that states that things should be produced only once a real demand for them exists. In a 'pull system,' upstream assembly lines produce only enough products to replace the items withdrawn from the pool that buffers them from the downstream assembly lines. In software development, this idea can be translated as follows: 'We should only write methods that have already been called by other software and only handle those cases that the other software actually needs.' This approach avoids speculation and the writing of unnecessary software, which is one of software development's key forms of inventory (which is considered waste in lean systems)."

I'm just about finished reading...

Agile Web Development with Rails, 2nd Edition
by Dave Thomas, David Heinemeier Hansson, Leon Breedt, Mike Clark, James Duncan Davidson, Justin Gehtland, Andreas Schwarz

Read more about this book...

This book focuses on the Ruby on Rails framework for developing web applications. It touches very lightly on the Ruby language itself, but mostly talks about things like Model View Controller, Active Record, and Action Pack, and how they're implemented in Rails. The book starts off with a light primer on what MVC is, which I enjoyed, then moves on to how to install Rails, then jumps right into building a quick application. I enjoyed the discussion on testing and the concept of migrations.

Before jumping right in to chapter one, I quickly read through the first appendix, titled "Introduction to Ruby", which helped a little bit, but I probably would have done better if I had first read a book just on the Ruby language. I think I would have found it more interesting as well. There were times when the book got so deep into the nitty-gritty details of the Rails framework that I just completely lost interest. I chose to read this book to get some high-level ideas, but I wasn't as interested in the tiny details of the framework. There are tonnes of great ideas in this book that I recognize being adopted quite a bit in the .NET community, migrations and MVC being a couple of my favorites.

Here's a few excerpts that I enjoyed from this book...

Convention over Configuration

"Rails gives you lots of opportunities to override this basic workflow ... As it stands, our story illustrates convention over configuration, one of the fundamental parts of the philosophy of Rails. By providing convenient defaults and by applying certain conventions, Rails applications are typically written using little or no external configuration - things just knit themselves together in a natural way."


Migrations

"Over the years, developers have come up with ways of dealing with this issue. One scheme is to keep the Data Definition Language (DDL) statements that define the schema in source form under version control. Whenever you change the schema, you edit this file to reflect the changes. You then drop your development database and re-create the schema from scratch by applying your DDL. If you need to roll back a week, the application code and the DDL that you check out from the version control system are in step: when you re-create the schema from the DDL, your database will have gone back in time.

Except... because you drop the database every time you apply the DDL, you lose any data in your development database. Wouldn't it be more convenient to be able to apply only those changes that are necessary to move a database from version X to version Y? this is exactly what Rails migrations let you do."

E.g. 001_create_products.rb

class CreateProducts < ActiveRecord::Migration
  def self.up
    create_table :products do |t|
      t.column :title, :string
      t.column :description, :text
      t.column :image_url, :string
    end
  end

  def self.down
    drop_table :products
  end
end

Pragmatic Ajax-ification

"In the old days, browsers were treated as really dumb devices. When you wrote a browser-based application, you'd send stuff down to the browser and then forget about that session. At some point, the user would fill in some form fields or click a hyperlink, and your application would get woken up by an incoming request. It would render a complete page back to the user, and the whole tedious process would start afresh...

Whenever you work with AJAX, it's good to start with the non-AJAX version of the application and then gradually introduce AJAX features."

No REST For The Wicked

"REST stands for REpresentational State Transfer, which is basically meaningless. What it really means is that you use HTTP verbs (GET, POST, DELETE, and so on) to send requests and responses between applications."

Performance Testing

"Testing isn't just about whether something does what it should. We might also want to know whether it does it fast enough.

Before we get too deep into this, here's a warning. Most applications perform just fine most of the time, and when they do start to get slow, it's often in ways we would never have anticipated. For this reason, it's normally a bad idea to focus on performance early in development. Instead, we recommend using performance testing in two scenarios, both late in the development process."

Statement Modifiers

"Ruby statement modifiers are a useful shortcut if the body of an if or while statement is just a single expression. Simply write the expression, followed by if or while and the condition."

The following is valid Ruby syntax:

puts "Danger, Will Robinson" if radiation > 3000
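Statement modifiers aren't limited to `if`; `while` and `until` work the same way. A quick sketch using the classic squaring example:

```ruby
# A while statement modifier: the expression repeats while the
# condition holds. Keep squaring until we pass 1000.
square = 2
square = square * square while square < 1000
puts square  # => 65536
```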

I would love to express the following C# syntax....

public void AddLicense(ILicense license){
    if(license.IsValid()){
        licenseRepository.Add(license);
    }
}

Like this...

public void AddLicense(ILicense license){
    licenseRepository.Add(license).If(license.IsValid());
}
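For comparison, here's that same guard expressed in Ruby with a statement modifier. `License` and `LicenseRepository` are hypothetical stand-ins for the C# types above, just enough to make the sketch runnable:

```ruby
# Hypothetical stand-ins for the C# ILicense and licenseRepository.
class License
  def initialize(valid)
    @valid = valid
  end

  def valid?
    @valid
  end
end

class LicenseRepository
  attr_reader :licenses

  def initialize
    @licenses = []
  end

  def add(license)
    @licenses << license
  end
end

# The statement modifier: the add only runs when the condition holds.
def add_license(repository, license)
  repository.add(license) if license.valid?
end

repository = LicenseRepository.new
add_license(repository, License.new(true))
add_license(repository, License.new(false))
puts repository.licenses.size  # => 1; the invalid license was never added
```

Note the subtle difference from the wished-for C# syntax: in Ruby the condition is evaluated first, so `add` never runs for an invalid license, whereas `Add(license).If(...)` would have already executed `Add` by the time `If` was called.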

Thank you Mr. Aaron, today we just grabbed the latest beta version of Rhino.Mocks and our test times dropped significantly....

Our times before the update were 450ish seconds to run all the unit tests and create the report; after the update, they're 100ish seconds.
Patterns of Enterprise Application Architecture (The Addison-Wesley Signature Series)
by Martin Fowler

Read more about this book...

Defines Unit of Work as:

"Maintains a list of objects affected by a business transaction and coordinates the writing out of changes and the resolution of concurrency problems." - PoEAA

I've been playing with some different ideas on how to implement a unit of work in a WinForms application.

Here's the usage I had in mind:

public void SomeMethod() {
    using (var unitOfWork = UnitOfWork.StartFor<IPerson>())
    {
        var stacey = new Person("stacey");
        var veronica = new Person("veronica");
        var betty = new Person("betty");

        stacey.NewNumberIs("312-7467");

        unitOfWork.Commit();
    }
}

When the unit of work is asked to commit, the new and modified instances are committed to the person repository: in this case, my imaginary black book.

public class BlackBook : IRepository<IPerson> {
    private IList<IPerson> associates;

    public BlackBook() : this(new List<IPerson>()) {
    }

    public BlackBook(IList<IPerson> associates) {
        this.associates = associates;
    }

    public void Add(IPerson newAssociate) {
        associates.Add(newAssociate);
    }

    public void Update(IPerson updatedAssociate) {
    }
}

Here's how it works... Person inherits from "DomainSuperType". In this Layer Supertype, the no-argument constructor registers the instance with the current unit of work. I really don't like this, because it makes all the domain objects aware of the surrounding infrastructure and makes them much more difficult to test.

Next, all components have to be decorated with the "Serializable" attribute so that I can manage dirty object tracking. This also sucks...

[Serializable]
public class DomainSuperType<T> where T : class {
    public DomainSuperType() {
        UnitOfWork.StartFor<T>().Register(this as T);
    }
}

public interface IPerson
{
    void NewNumberIs(string newNumber);
}

[Serializable]
public class Person : DomainSuperType<IPerson>, IPerson {
    private string name;
    private string knownPhoneNumber;

    public Person(string name) {
        this.name = name;
    }

    public void NewNumberIs(string newNumber) {
        knownPhoneNumber = newNumber;
    }
}

The unit of work delegates to a registry of units of work to retrieve the unit of work applicable to a given type.

public static class UnitOfWork {
    public static IUnitOfWork<T> StartFor<T>() {
        return Resolve.DependencyFor<IUnitOfWorkRegistry>().StartUnitOfWorkFor<T>();
    }
}

The unit of work registry creates a unit of work for a type if one hasn't been started yet; otherwise it returns the already started unit of work. The registry is similar to an identity map, using the type T as the identifier.

public class UnitOfWorkRegistry : IUnitOfWorkRegistry {
    private IDictionary<Type, object> unitsOfWork;
    private IUnitOfWorkFactory factory;

    public UnitOfWorkRegistry(IUnitOfWorkFactory factory) {
        this.factory = factory;
        unitsOfWork = new Dictionary<Type, object>();
    }

    public IUnitOfWork<T> StartUnitOfWorkFor<T>() {
        if (unitsOfWork.ContainsKey(typeof (T)))
        {
            return (IUnitOfWork<T>) unitsOfWork[typeof (T)];
        }
        var unitOfWork = factory.CreateFor<T>();
        unitsOfWork.Add(typeof (T), unitOfWork);
        return unitOfWork;
    }
}
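Incidentally, the "create it the first time, return the same one after that" behavior of the registry is compact to express in Ruby with a Hash default block. A minimal sketch, where `WorkSession` is a hypothetical stand-in for the real unit of work:

```ruby
# Hypothetical stand-in for the unit of work created per type.
class WorkSession
  def initialize(type)
    @type = type
  end
end

# The default block runs only on a cache miss: it creates the session,
# stores it under the type, and returns it. Subsequent lookups for the
# same type hit the stored entry, like an identity map keyed by type.
registry = Hash.new { |sessions, type| sessions[type] = WorkSession.new(type) }

first  = registry[String]
second = registry[String]
puts first.equal?(second)  # => true; the same unit of work is returned per type
```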

The unit of work factory leverages the dependency resolver to retrieve an implementation of the repository applicable to type T.

public class UnitOfWorkFactory : IUnitOfWorkFactory {
    private IDependencyResolver resolver;

    public UnitOfWorkFactory(IDependencyResolver resolver) {
        this.resolver = resolver;
    }

    public IUnitOfWork<T> CreateFor<T>() {
        return new WorkSession<T>(resolver.GetMeAnImplementationOf<IRepository<T>>());
    }
}

Each time the unit of work factory is asked to create a new unit of work it creates a fresh instance of a work session.

public class WorkSession<T> : IUnitOfWork<T> {
    public WorkSession(IRepository<T> repository) : this(repository, new ObjectToRegisteredObjectMapper()) {
    }

    public WorkSession(IRepository<T> repository, IObjectToRegisteredObjectMapper mapper) {
        this.mapper = mapper;
        this.repository = repository;
        registeredInstances = new HashSet<IRegisteredInstanceOf<T>>();
    }

    public void Register(T newInstanceToRegister) {
        registeredInstances.Add(mapper.MapFrom(newInstanceToRegister));
    }

    public void Commit() {
        foreach (var registeredInstance in registeredInstances)
        {
            registeredInstance.CommitTo(repository);
        }
    }

    public void Dispose() {
        registeredInstances = new HashSet<IRegisteredInstanceOf<T>>();
    }

    private readonly IRepository<T> repository;
    private ICollection<IRegisteredInstanceOf<T>> registeredInstances;
    private IObjectToRegisteredObjectMapper mapper;
}

public class RegisteredInstance<T> : IRegisteredInstanceOf<T> {
    private readonly T originalInstance;
    private readonly T workingInstance;

    public RegisteredInstance(T newInstanceToRegister, ICloner cloner) {
        workingInstance = newInstanceToRegister;
        originalInstance = cloner.Clone(newInstanceToRegister);
    }

    public T Original() {
        return originalInstance;
    }

    public T WorkingCopy() {
        return workingInstance;
    }

    public bool HasBeenModified() {
        return !Original().Equals(WorkingCopy());
    }

    public void CommitTo(IRepository<T> repository) {
        if (HasBeenModified()) {
            repository.Update(WorkingCopy());
        }
        else {
            repository.Add(WorkingCopy());
        }
    }

    protected bool Equals(RegisteredInstance<T> registered) {
        return registered != null && Equals(originalInstance, registered.originalInstance);
    }

    public override bool Equals(object obj) {
        return ReferenceEquals(this, obj) || Equals(obj as RegisteredInstance<T>);
    }

    public override int GetHashCode() {
        return originalInstance != null ? originalInstance.GetHashCode() : 0;
    }
}

Each registered instance immediately clones the original instance to keep track of changes between the original and the current working copy. For this to work properly, the cloner has to perform a deep copy; otherwise the dirty tracking won't work. To do the deep copy, I'm using serialization, hence the "Serializable" attribute decorating each entity.

public class Cloner : ICloner
{
    public T Clone<T>(T instanceToClone)
    {
        var serializer = new Serializer<T>();
        return serializer.DeserializeFrom(serializer.Serialize(instanceToClone));
    }
}
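The same serialize-then-deserialize trick exists in Ruby via `Marshal`, which makes it easy to see why a deep copy is required for the dirty tracking. A small sketch (the `Cloner` class and `clone_of` method name here are just illustrative):

```ruby
# Deep copy via serialization: Marshal round-trips the whole object
# graph, so nested state in the clone is independent of the original.
class Cloner
  def clone_of(instance)
    Marshal.load(Marshal.dump(instance))
  end
end

original = { "name" => "stacey", "numbers" => ["312-7467"] }
copy = Cloner.new.clone_of(original)

# Mutate nested state on the copy only.
copy["numbers"] << "555-1234"

puts original["numbers"].size  # => 1; a shallow copy would have shown 2
```

Because the snapshot is fully independent, comparing it with the working copy later reliably detects modifications, which is exactly what `HasBeenModified()` depends on above.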

So far this implementation is just a spike on how to implement a unit of work. It's really not a great implementation, but I'm hoping to solicit some feedback on approaches that have worked for others.

So last week the guys and I at work started to spike ASP.NET MVC. We're starting up a new project and decided to take advantage of Preview 2 of the libraries released so far. Our experiences so far have been.... hmmm... not as expected.

Here are a few things we've learned; hopefully they help someone else out. We're NAnt junkies, so the first thing we did to get going was automate the compiling, testing, running, deploying, and creation of the database with NAnt. We found that when we ran our project through aspnet_compiler.exe, it didn't recognize some of the new C# 3.0 syntax.

<select name="protocolName">
    <% foreach( var dto in ViewData ) {%>
        <option><%= dto.ProtocolName %></option>
    <% } %>
</select>

The above code would raise an error with aspnet_compiler.exe. Now, this is valid C# 3.0, but the precompiler didn't know what to do with the "var" keyword. Next, the precompiler didn't know where to find the "Form()" method on the Html helper class, because it's an extension method.

<% using( Html.Form( Controllers.Order.Name, "submit", FormMethod.Post ) ) {%>

It's kind of an interesting idea that so many methods on the "HtmlHelper" class are extension methods. The solution to getting aspnet_compiler.exe to recognize the C# 3.0 syntax was to dump this block of XML in the web.config:

<system.codedom>
    <compilers>
        <compiler language="c#;cs;csharp" extension=".cs" warningLevel="4"
            type="Microsoft.CSharp.CSharpCodeProvider, System, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089">
            <providerOption name="CompilerVersion" value="v3.5"/>
            <providerOption name="WarnAsError" value="false"/>
        </compiler>
    </compilers>
</system.codedom>

Next up... testing controllers. I think the guys and I were a little surprised at just how awkward it was to test a controller. I thought a lot of time had been spent making controllers more testable. Our first pain point was the fact that "RenderView()" is a protected method on the Controller base class. Here's what I'm talking about...

public class HomeController : Controller
{
    public void Index( )
    {
        RenderView( "Index" );
    }

    public void About( )
    {
        RenderView( "About" );
    }
}

So let's think... how can we test that when the Index action is invoked, it calls "RenderView" with an argument value of "Index"? Some people have suggested creating a Test Double. I say... booo... I use mock object frameworks so that I don't need to groom a garden of hand-rolled test stubs. Here's what we came up with... first cut, remember!

public class OrderController : BaseController, IOrderController
{
    private readonly IOrderIndexCommand indexCommand;
    private readonly ISubmitOrderCommand submitCommand;

    public OrderController( IOrderIndexCommand indexCommand, ISubmitOrderCommand submitCommand )
    {
        this.indexCommand = indexCommand;
        this.submitCommand = submitCommand;
    }

    public void Index( )
    {
        indexCommand.InitializeWith( this );
        indexCommand.Execute( );
    }

    public void Submit( )
    {
        submitCommand.InitializeWith( this );
        submitCommand.Execute( );
    }
}

Ok... so it's slightly more testable. Each action on the controller executes a command, after first initializing it with the controller itself. The other thing to notice is that the OrderController inherits from BaseController. BaseController is actually an adapter that implements an IViewRenderer interface.

public abstract class BaseController : Controller, IViewRenderer
{
    public void Render< TypeToBindToView >( IView view, TypeToBindToView viewData )
    {
        RenderView( view.Name( ), viewData );
    }
}

The OrderIndexCommand is initialized with an IViewRenderer.

public class OrderIndexCommand : IOrderIndexCommand
{
    private IViewRenderer viewEngine;
    private readonly IOrderTasks task;

    public OrderIndexCommand( IOrderTasks task )
    {
        this.task = task;
    }

    public void InitializeWith( IViewRenderer engineToRenderViews )
    {
        viewEngine = engineToRenderViews;
    }

    public void Execute( )
    {
        viewEngine.Render( ControllerViews.Order.Index, task.RetrieveAllProtocols( ) );
    }
}

If you haven't heard, JP's giving away a $70.00 book credit to Amazon. For more details check out his most recent post.

I really enjoy reading books, but if you're low on funds, books can be quite pricey, especially tech books. This is a great offer, and anyone interested should definitely take the man up on it. Even if YOU don't need the books, or the credit, I'm sure you can think of someone who could. Let them know...

Why do I even care?

Because I know how hard it was to purchase books and support a family. I'm in much better shape now, and would love for someone else who needs a leg up to win an opportunity to be successful. Do you know someone that could use a little help?

Wow... I don't know what it is, but right after the ALT.NET conference I was pretty pumped up and excited, and these days I'm feeling a little low. It's amazing how many young, talented people there are out in the industry. It's more amazing to see how fast people are moving and growing.

The guys on my team and I try hard to stay up on what's new... and what the cool kids are doing. But these days it's just making me dizzy... we've got the Eleutian guys slingin' code like crazy. This polyglot programming thing has got me feeling like I need to go add more languages to my vocab. I'm getting sick of checking my Gmail, because each time I do it looks like the ALT.NET mailing list has just puked all over my monitor.

There are new frameworks flying out, like ASP.NET MVC, Moq, Prism, Silverlight, and WPF... then debates about how to write tests, what BDD is, and whether the auto-mocking container is a smell. Then there's the hype around Ruby and Rails, and the comparisons between dynamically and statically typed languages.

It's got me a little dizzy, but now that I think about it... it's kind of cool how fast the industry seems to be evolving!