Which curve do you choose?
Let's have a look at two realities (curves) that we can choose between. To many, it's a law of nature that it becomes extremely costly to make changes in software. They think that software must rot over time, and that it happens pretty fast.
Boehm's curve (from the book Software Engineering Economics) indicates that the cost of correcting an error increases exponentially over time in a project. One way to read it is that costs will continue to increase after the first version is in production, with changes becoming more and more expensive. This means that all decisions must be made very early in the life cycle so as not to become absurdly costly.
Of course, it doesn't have to be like this. Beck's curve (from the book Extreme Programming Explained) indicates another possibility: instead of growing exponentially, the cost levels out.
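As a toy illustration, the shapes of the two curves can be sketched as cost-of-change functions. The formulas and numbers below are my own, invented purely for the shape; neither book gives them:

```python
# Toy illustration of the two cost-of-change curves.
# The formulas are made up for the sake of the shape; neither Boehm's
# nor Beck's book specifies these exact functions.
import math

def exponential_cost(time):
    """Boehm-style: the cost of a change grows exponentially with time."""
    return math.exp(time)

def flattened_cost(time):
    """Beck-style: the cost rises at first but levels out (here toward 10)."""
    return 10 * (1 - math.exp(-time))

# Compare the two curves at a few points in the life cycle.
for t in range(6):
    print(f"t={t}: exponential={exponential_cost(t):8.1f}  flattened={flattened_cost(t):5.1f}")
```

The point of the sketch is only the divergence: late in the life cycle the exponential curve has exploded while the flattened one has barely moved.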
This is where the beauty of the code, its size, and whether it is under control come in. To create the second curve, the state of the software has to meet very high requirements: it must be continuously cleaned up and redesigned, and have good automatic tests, automatic deployment, and so on. A combination of many well-executed actions will lead to the software being changeable. The value of this for the business is, of course, potentially great.
Maybe these two different beliefs are the reason behind stupid decisions you see being made. Once you realize that there may be a difference in basic beliefs, it's suddenly not so strange anymore. That's not to say that you necessarily agree with the decisions, but when you understand where the other person is coming from, at least it's possible to understand why the choice was made.
By the way, if you think that you and your organization are careful when you develop, but you still arrive at the first curve, then unfortunately it's probably the case that the code isn't all that great after all.
To summarize: if the software is changeable without incurring dramatic costs even late in the life cycle, it will change everything in the software development thinking.
Unfortunately there's a catch. The "flat" curve is way harder to achieve. It's possible, yes, but not something to take lightly. Anyway, it starts with a choice and I will return to how it can be achieved in an upcoming chronicle.
Have you decided which curve you would like?
(First published in Swedish as a developer chronicle here.)
2012-03-23  #
  Don't cheat with the user interface
"Anyone can take care of the UI, it's not possible to do that much damage there anyway."
Part of the paradox is that the largest amount of code, the ugliest and therefore also the largest maintenance problem, has a tendency to end up in the user interface (UI).
Another part of the paradox is that the success of an application largely depends on the user experience. Sure, the better the code/design is that the UI consumes, the larger the chance that the UI can also be good. But it doesn't happen by itself that the UI code becomes good. In fact, it's often the opposite.
You might recognize some of these examples.
- The rendering of the UI and the logic have been mixed into a soup. When the rendering is to be swapped, it quickly becomes obvious that the logic is affected too.
- In the large application, the logic has been split into different parts, in a good way, for example according to how often they change. The exception is the user interface, which is one single monolith where it's very hard to change the different parts separately.
- After a lot of pain and effort, the team is gaining a lot from automatic testing. Except for the user interface. Unfortunately, that is often the code that needs refactoring, and therefore the safety net that automatic tests provide.
- Good design is used in the domain model, for instance, so that with little code and right abstractions it successfully describes the solution to the problem. But the user interface is done in a "brute force" way, almost totally without abstractions.
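As a minimal sketch of what the first bullet's fix looks like, here is the logic kept out of the rendering code so that the rendering can be swapped freely. The scenario and all the names are invented for illustration:

```python
# Hypothetical sketch: separating logic from rendering so that swapping
# the rendering does not touch the logic.

def discounted_total(prices, discount_rate):
    """Pure logic: knows nothing about how the result is presented."""
    total = sum(prices)
    return total * (1 - discount_rate)

def render_as_text(amount):
    """One rendering of the result; could be replaced by the HTML one
    below (or anything else) without changing discounted_total."""
    return f"Total: {amount:.2f}"

def render_as_html(amount):
    """An alternative rendering of the very same logic."""
    return f"<span class='total'>{amount:.2f}</span>"

amount = discounted_total([10.0, 20.0], discount_rate=0.1)
print(render_as_text(amount))
```

The logic function can be unit tested and reused as-is, which is exactly what the "soup" version makes impossible.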
If you do as the vendor proposes, you can often expect verbosity in the code and large, hard-to-maintain codebases. Most often, the vendor hasn't eaten its own dog food. It's not until the tool is used in real projects that the knowledge of what is good and bad emerges, and of how to use the good parts effectively.
An indication of the problem is that to most developers, design means quite different things in the domain model and in the user interface. Of course, good software design is extremely useful in the UI as well (and not only "look-and-feel" design)!
The user interface has always been important, but has often been treated as the poor relation. The largest maintenance bottleneck is often found in the user interface. Now and in the future, that area requires and deserves to receive a lot of focus.
An interesting approach is framework oriented user interface development. It's anything but trivial to do well. Yet the potential is enormous!
(First published in Swedish as a developer chronicle here.)
2012-02-19  #
  The next big thing is extremely small
It might not be the silver bullet, but I bet a nickel on the next big thing being small, extremely small. We are at the beginning of the era of tiny.
We've had three different eras:
1. The era of following one big vendor and nothing else.
2. The era of using many large open source frameworks.
3. The era of tiny.
If we go 10-20 years back in time, we find ourselves in the time I call "follow one big vendor and nothing else". You found all you needed at one single place. It seemed to be simple and it seemed to be safe. You were taken care of and didn't have to think for yourself all that much.
Often the tools helped you quickly create loads of ugly code that you then had to maintain. There was a lot of focus on "drag till you drop" instead of writing code.
Large database-oriented programming models were often used, together with large solutions that created the illusion that it wasn't the web you were developing for, even when you were. When the vendors were demonstrating their solutions they often said: "And all this without me writing a single line of code." When you looked at the result behind the scenes it made you cry, but not tears of joy.
After a while, some people lost their tempers and there was something of a revolution. From about ten years ago up until now, there has been more and more focus on open source and strong communities. Large, generic and important frameworks grew up and were used a lot. Let's call this era "use many large open source frameworks".
The focus was on agile development, DDD, elegant code, TDD, craftsmanship, MVC, O/RM and REST. A lot was good and sound, but a common problem was that whoever used the largest number of design patterns in his solution won. It was also now that many of us learned the hard way that large, generic frameworks shouldn't be invented. The chance of success increases radically if you harvest them instead.
The third stage which is currently emerging is more evolution than revolution. I call it "the era of tiny".
A lot is still around from before, such as test-driven, real DDD and focus on superb code. However, there is a little less focus on using communities and open source projects in order to find all you need in the form of large generic frameworks.
Instead, the focus is on turning inwards and solving tasks with tiny, specific framework solutions without all the unnecessary extras. An alternative is to have dependencies on extremely small code snippets that each solve one single, specific task in a good way. Often those are used as code instead of as binaries.
A typical example, making the cooperation between developer and client even better, is to use domain-specific languages so that the documentation is also executable.
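A tiny sketch of what "executable documentation" can look like, here as a hypothetical internal DSL (the account scenario and all the names are invented):

```python
# Hypothetical internal DSL: a specification that a domain expert can
# read, and that the team can also execute as a test.

class Given:
    def __init__(self, balance):
        self.balance = balance

    def when_withdrawing(self, amount):
        self.balance -= amount
        return self  # chaining keeps the spec readable as one sentence

    def then_balance_is(self, expected):
        assert self.balance == expected, f"expected {expected}, got {self.balance}"
        return self

# The "document": readable to the client, runnable by the team.
Given(balance=100).when_withdrawing(30).then_balance_is(70)
print("specification passed")
```

When the spec reads like a sentence, reviewing it with the client and running it in the build become the same activity.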
If you receive the following comment...
    Have you solved the problem with that tiny solution? That's not very "enterprisey".
...then you have done something good.
(First published in Swedish as a developer chronicle here.)
2012-01-18  #
  Developer Chronicle: Too many chefs spoil the code
(First published in Swedish here.)
2011-05-10  #
  Developer chronicles
2011-04-18  #
  Developer Chronicle: Lasagna is not a good role model
- "It can be good to have many layers. You never know when a layer might turn out to be useful."
- "Person X / Book Y says it should be like that."
- "Everything should be similar and follow the same structure."
- When something is changed, there is a tendency that the change affects more than one place in the code due to all the layers.
- You get tons of completely uninteresting code which means that there is more to read and it takes longer to locate what you are looking for.
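The two bullets above can be sketched in a few lines. This is a hypothetical example with invented names: each layer only forwards the call, so adding one parameter at the bottom forces an identical edit in every layer above it.

```python
# Hypothetical pass-through layering: the upper layers add no behavior,
# yet every signature change ripples through all of them.

class Repository:
    def get_customer(self, customer_id):
        # The only layer that actually does anything.
        return {"id": customer_id, "name": "Ada"}

class Service:
    # Pure forwarding; any new parameter must be repeated here.
    def __init__(self, repository):
        self.repository = repository

    def get_customer(self, customer_id):
        return self.repository.get_customer(customer_id)

class Facade:
    # A third layer repeating the same signature once more.
    def __init__(self, service):
        self.service = service

    def get_customer(self, customer_id):
        return self.service.get_customer(customer_id)

customer = Facade(Service(Repository())).get_customer(42)
print(customer["name"])
```

Three classes, one behavior: the extra layers are exactly the "tons of completely uninteresting code" that slows down both reading and changing.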
2011-04-18  #
  More than 10 000 km apart...
2010-03-23  #
  It works both ways
This works both ways. If a business person wants their software development projects to be more successful, it pays off immediately to show an interest in the projects, to get involved with them, and to learn more about software development and its many rapidly changing rules. For instance, in order to adjust a software project for the better, it helps if you understand what is going on, so that you know what you'd like to change.
2010-02-18  #
  Drivers for NOSQL
The term NoSQL was created last year and has morphed into NOSQL lately (Not Only SQL). This trend is rapidly gaining strength. Below I have written down some of the drivers behind the NOSQL movement. Which ones are missing?
- TDD and DDD For many, this changes the picture completely. The focus moves away from the storage technology.
- Application databases instead of integration database Then the storage technology that is used is a private concern of the service/application itself and the decision becomes less dramatic.
- Internet scale The need for scaling out is changing the scene for certain scenarios and eventual consistency is most often good enough.
- We don't need many of the common capabilities for certain applications/services And we just don't want to pay a high cost for impedance mismatch if we don't get other large benefits.
- Querying and analysis finally moves away from the production databases Just an example of one reason why we have less of a need for functionality in specific situations.
- Certain types of applications don't fit the relational model very well A classic example is "bill of materials". Nowadays, social networks are an often-mentioned example.
- Event sourcing is growing in popularity This might mean that you only need a persistent log file for that part, a pretty basic need.
- Need for rapid schema evolution Or very many schemas, or no schemas.
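The event sourcing bullet above can be sketched with nothing more than an append-only log (in memory here; a persistent log file works the same way, and all the names are invented):

```python
# Minimal event-sourcing sketch: current state is never stored, only the
# events; state is rebuilt by replaying the log from the beginning.

log = []  # in production this would be a persistent, append-only log

def append(event_type, amount):
    log.append({"type": event_type, "amount": amount})

def current_balance():
    """Rebuild state by replaying every event in order."""
    balance = 0
    for event in log:
        if event["type"] == "deposited":
            balance += event["amount"]
        elif event["type"] == "withdrawn":
            balance -= event["amount"]
    return balance

append("deposited", 100)
append("withdrawn", 30)
print(current_balance())  # → 70
```

The storage need really is that basic: an ordered, durable sequence of events, which is why a full relational database can be overkill for this part.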
2010-02-18  #
  New TDD course
Oh, and the DDD course is taking place next time in Stockholm in April.
2010-02-18  #
  A new and *good* "VB"
What do you think? Has it become significantly easier to build software since 1991? Have we as a group become significantly more productive? (If so, would you say "yes" even if we take maintenance into account?)
VB had its merits and its flaws, but I think it represented a big shift in productivity for a big crowd. That said, it's not exactly a new VB I envision. It's something kind of in its spirit, but something that doesn't put lots of obstacles in the way of creating *good* software. It should help in creating software that can be maintained and taken further, maybe by more skilled developers if needed, after the first successful application is built and has been used for some time. Probably also in making it easy to involve business analysts, domain experts, testers...
As I wrote here, there is no lack of new RAD tools ("drag till you drop" etc), but they are not focused on creating expressive and beautiful code, on maintainable software. Is it a law of nature that approachable tools must create bad results below the surface? I don't see that. (Please note that I *don't* expect a tool to make design and software development easy, that's definitely not what I'm talking about. I just don't think it has to add lots of accidental complexity either.)
Why so little (no?) interest from the big vendors? It seems quite easy to harvest some obvious and proven good ideas and concepts and put them together in a nice and approachable package to take a big leap, and I think it would represent a huge difference for loads of non-alpha geeks. Sure, it's a moving target, but this one might not be moving nearly as fast as some of the current RAD attempts...
I put together my own toolbox as do many others, of course. We would probably do that anyway, but that's not the case for everybody. As I understand it, lots of developers go with the tools they get in the single package from the single vendor of their choice.
OK, there are attempts here and there for what I in the header called a new and *good* "VB". Those attempts are as far as I know from small vendors (and open source of course) and I guess this will spread.
Or maybe the "situation" will have been solved at SAW in a few days?
:-)
2010-01-20  #
  The DDD course in Malmö
2010-01-20  #
  The big picture of software development
The model is based on the fundamental questions of Why, What and How.
Why do we exist?
That's a pretty philosophical question. However, in the context of why we as developers exist, why we are needed, why companies pay us to write code, I believe the reason is that we are expected to create business value. That answer would seem to be pretty reasonable to most people in most situations.
What is it that we do to create business value?
I think the answer to that question is code, great code! It's not only the code itself, but its surroundings too, like tests, docs, build scripts, and such. Yet the very code itself is of crucial importance.
If we are in control of the code, then the business people can come with a request for a new feature that will create a new business opportunity and we can achieve it in a matter of, say, a few weeks. A couple of changes in the codebase might translate into millions of dollars. That codebase leads to high business value and the cost for achieving it is low.
The opposite is a codebase that has rotted to the extent that nobody dares touch it at all. The developers walk around it; they even leave the company just so as not to have to work with it. That codebase might need years before it can achieve the afore-mentioned business opportunity. Time to market, risk and cost of change mean that the business value of this codebase is very low.
(There are of course ways of transforming a bad codebase into a great one, but that's another story.)
How do we create great code?
How you achieve great code varies. Some techies would spontaneously suggest Test-Driven Development (TDD), Domain-Driven Design (DDD), architecture-focus, refactoring, and so on. You just add here what works for your team, the stuff that let your codebase add business value.
What I find over and over again is that a good many companies have focused all their attention on implementing a new process. Which process doesn't seem to matter; it varies over time, and lately it has been Scrum. It has been seen as a panacea. At first it seems to go very well, but after just a few sprints the velocity drops. The reason is that the codebase and the architecture haven't received any attention at all... Sure, a suitable process can often help a good team become even more productive, but it won't compensate for a lack of skills. So, I added "Iterative process" as an example of something that is part of the picture for achieving good code. My point is just that it's most definitely not the only thing, of course it isn't. (Also read my factor10 colleague Niclas' blog post called "The holistic view".)
If I had to pick one answer to how, I believe that the single most important way of achieving great code is to have the best people you can possibly find, and to have only a few of them: as few as possible, but not fewer. On top of achieving great code, that would actually also be the cheap way. Of course you have to pay more for the best, but they will be extremely productive (for real, not as a fake), and since you only have so few, the financial side is very appealing too.
I find that quite a lot of people strongly believe that products will help a lot. After 20+ years in this business, I believe that products are more often the problem than the solution so care must be taken.
In the same way there is the comment about companies believing they need expert help from the vendor of the product they are having trouble with. To me, that's a smell regarding that product.
"We can't solve problems by using the same kind of thinking we used when we created them."
--Albert Einstein
Another silver bullet that doesn't work, quite the contrary, is trying to solve the problem by increasing the head count. That's just a good way of increasing the problem and the cost. Quality over quantity, every day!
To summarize what I said above, here's a sketch:
Some more comments
Techies immediately seem to agree with this model. But, and this might come as a surprise, business people seem to like it even more. They might say something like: "So that's why the developers are complaining that I don't value quality. I used to think it was just whining, but now I realize it's about sustainable business value."
Quite often I find that developers ask questions such as: What feature shall I do first? Where shall I focus my refactoring efforts? What automatic tests are the most important? According to this model, there's a clear and simple answer to those questions that points us in the right direction. Choose whatever gives the most business value.
A colleague of mine goes as far as using the code quality of the important codebases for determining if a company will have a chance for success or if it will go out of business. If we aren't fully in control of the codebase, that's a setup for failure. (Obviously, these companies are not automatically doomed, not if they grasp the situation and act with power and intelligence.)
So, the codebase is probably quite important, perhaps the most important asset of the company. It's not something you allow just anybody to work with and make changes to. It's like needing brain surgery; would you allow someone who had just done a crash course in brain surgery to operate? Of course not, you would probably try to find the best and most experienced person there is. That's quite the opposite of how codebases are dealt with in many companies.
2009-12-16  #
  Fast Track to DDD in Sweden
2009-10-23  #
  It should be easy to build *good* software.
That particular tool doesn't matter, but thinking more about it, unfortunately it actually feels like it's been the same thing for quite a few years and for quite a few development tool vendors. It's obvious we need a much more interesting goal, isn't it?
It should be easy to build *good* software!
2009-08-20  #
  DDD in the cloud
2009-08-20  #
  Fast Track to DDD in South Africa
2009-08-16  #
  The CCC-article translated to Chinese
2009-04-14  #
  Article: Chunk Cloud Computing (CCC)
2009-03-29  #
  Some observations over the last few years
2009-03-27  #
  NWorkspace == Nilsson Workspace?
:D
And when I'm at it, that's not the case for NUnit, NHibernate, NPersist, NDepend or NCover either...
2009-01-06  #
  Strategy workshop about Data Access
Here are a few extracts from the description in English:
"There's a lot going on right now in .NET land when it comes to data programmability.
...
There are many choices available now and each choice has an impact on code quality, maintainability and, overall, bottom line costs.
...
At the same time lots of .NET developers are starting to embrace Domain-Driven Design (DDD) and Test-Driven Development (TDD), which change the scene much more than most people at first realize.
...
Suddenly, the choice of data programming options is a strategic decision. What data programming option must I choose so that my teams can spend more time on solving the problem and less time on how to get and save the data?
..."
2009-01-06  #
  Oredev 2008
2008-11-16  #
  DDD anti-motivation
New (to most developers)
Even though DDD in itself isn't at all new, it's new to those that haven't tried it before, of course. And if you work in a big team with no prior experience of DDD, then it's very risky using it for a new and time-constrained project. There are ways of reducing the risk, but it's still there and it's real. As a matter of fact, DDD is new to most tool vendors as well. Unfortunately, that means that quite often the tools don't help us that much. Sometimes they are even more in the way than they are helpful.
2008-10-24  #
  Swedish surnames
Another very common surname in Sweden is Nilsson, the second most common actually. Even so, more or less every week I hear from somebody who thinks that one of my colleagues at factor10, Niclas Nilsson, is my brother. That's not the case. I just wanted to work with him because he is brilliant and very nice.
:-)
Something I'm not that used to, though, is what happened the other day when Niclas met an old student of mine. My old student asked Niclas if I'm Niclas's father! Hey, I'm not even three years older than Niclas...
2008-10-24  #
  Strategic Design (DDD Motivation 7)
Eric Evans has said that if he were to rewrite the book, he would put the chapters on Strategic Design at the front of the book and not at the end because nobody reads that far but that part of the book is perhaps the most important.
I know, this is no reason for you to try out an alternative to database-driven design because strategic design isn't the opposite at all, it's not at the same level. No matter if you go for db-driven design or domain models, I think strategic design is very important and therefore it felt important to mention it as a reason to have a closer look at DDD.
2008-10-16  #
  A new course on implementing DDD and LINQ
2008-10-16  #
  Have you heard about LINQ?
2008-09-22  #
  DP Advisory Council at Microsoft
2008-06-19  #