Which curve do you choose?

Important code should be tiny, beautiful, and in control. Anything else is a waste of money. To me this is so obvious and internalized that I forget not everybody sees it that way. It's hard to have a discussion about quality with someone whose opinions differ greatly from yours; you will both just be puzzled and the conversation goes nowhere. It recently occurred to me that many people automatically agree that bad software quality is expensive, but don't really believe it deep down.

Let's have a look at two realities (curves) that we can choose between. To many it's a law of nature that it becomes extremely costly to make changes in software. They think that software must rot over time, and that it happens pretty fast.

Boehm's curve (from the book Software Engineering Economics) indicates that the cost of correcting an error increases exponentially over time in a project. I guess it's possible to read it as saying that costs continue to increase after the first version is in production, so changes become more and more expensive. That would mean that all decisions must be taken very early in the life cycle in order not to cost a crazy amount.

Of course it doesn't have to be like this. Beck's curve (from the book Extreme Programming Explained) indicates another way it can be; instead of the cost growing exponentially, it will level out.

This is where the beauty of the code, its size and whether it is under control come in. To create the second curve, the demands on the state of the software are very high: it must be continuously cleaned up and redesigned, and it needs good automatic tests, automatic deployment, and so on. A combination of many well executed practices leads to software that stays changeable. The value of this for the business is, of course, potentially great.

Maybe these two different beliefs are the reason behind stupid decisions you see being made. Once you realize that there might be differences in basic beliefs, it's suddenly not so strange anymore. That's not to say that you necessarily agree with the decisions, but when you understand where the other person is coming from, at least it's possible to understand why the choice was made.

By the way, if you think that you and your organization are careful when you develop, but you still arrive at the first curve, then unfortunately it's probably the case that the code isn't all that great after all.

To summarize: if the software is changeable without incurring dramatic costs even late in the life cycle, it changes everything about how we think about software development.

Unfortunately there's a catch. The "flat" curve is way harder to achieve. It's possible, yes, but not something to take lightly. Anyway, it starts with a choice and I will return to how it can be achieved in an upcoming chronicle.

Have you decided which curve you would like?

(First published in Swedish as a developer chronicle here.)

2012-03-23  #

Don't cheat with the user interface

It's a paradox that many developers who have a large interest in design and maintainable applications often say something like:
"Anyone can take care of the UI, it's not possible to do that much damage there anyway."

Part of the paradox is that the largest amount of code, the ugliest and therefore also the largest maintenance problem, has a tendency to end up in the user interface (UI).

Another part of the paradox is that the success of an application largely depends on the user experience. Sure, the better the code/design is that the UI consumes, the larger the chance that the UI can also be good. But it doesn't happen by itself that the UI code becomes good. In fact, it's often the opposite.

You might recognize some of these examples.
  • The UI rendering and the logic have been mixed into a soup. When the rendering is to be swapped, it becomes painfully obvious that the logic is affected too (see the sketch right after this list).
  • In a large application the logic has been split into different parts - in a good way, for example according to how often they change. The exception is the user interface, which is one single monolith where it's very hard to change the parts separately.
  • After a lot of pain and effort, the team is gaining a lot from automatic testing. Except for the user interface. Unfortunately the UI often needs refactoring, and thereby the safety net that automatic tests provide.
  • Good design is used in the domain model, for instance, so that with little code and the right abstractions it successfully describes the solution to the problem. But the user interface is done in a "brute force" way, almost totally without abstractions.
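
As a small illustration of the first point, here is a minimal sketch (all class and method names are hypothetical, and a console view stands in for whatever rendering technology you use) of keeping the UI logic in a plain presentation model that the rendering code merely displays:

    import java.util.List;

    // A minimal sketch (hypothetical names): the UI logic lives in a plain, easily
    // testable presentation model; the rendering code only displays what the model says.
    public class OrderSummaryModel {
        private final List<Double> lineAmounts;

        public OrderSummaryModel(List<Double> lineAmounts) {
            this.lineAmounts = lineAmounts;
        }

        // UI logic, testable without any rendering technology at all.
        public double total() {
            return lineAmounts.stream().mapToDouble(Double::doubleValue).sum();
        }

        public boolean qualifiesForFreeShipping() {
            return total() >= 100.0;
        }

        public static void main(String[] args) {
            new ConsoleOrderSummaryView().render(new OrderSummaryModel(List.of(60.0, 55.0)));
        }
    }

    // The rendering part is deliberately thin; swapping it (console, web, desktop)
    // does not affect the logic above.
    class ConsoleOrderSummaryView {
        void render(OrderSummaryModel model) {
            System.out.printf("Total: %.2f%n", model.total());
            if (model.qualifiesForFreeShipping()) {
                System.out.println("Free shipping!");
            }
        }
    }

Swapping the console view for a web view then means writing another thin view, not untangling the logic from the old one.
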
Most developers are resilient sorts, but I don't think we should accept what we get from the tool vendors as-is. Tools should often rather be seen as platforms that can be used to develop specific frameworks on top.

If you do as the vendor proposes, you can often expect verbosity in the code and large, hard to maintain codebases. Most often, the vendor hasn't eaten its own dog food. It's not until the tool is used in real projects that the experience and knowledge of what is good and bad emerges - and how the good is used in an effective way.

An indication of the problem is that, to most developers, design means quite different things in the domain model and in the user interface. Of course good software design is extremely useful in the UI as well (and not only "look-and-feel" design)!

The user interface has always been important, but has often been treated as the poor relation. The largest maintenance bottleneck is often found in the user interface. Now and in the future, that area requires and deserves to receive a lot of focus.

An interesting approach is framework oriented user interface development. It's anything but trivial to do well. Yet the potential is enormous!

(First published in Swedish as a developer chronicle here.)

2012-02-19  #

The next big thing is extremely small

In software development we are constantly looking for the Next Big Thing. We haven't given up hope of the silver bullet... one day it must show itself.

It might not be the silver bullet, but I bet a nickel on the next big thing being small, extremely small. We are at the beginning of the era of tiny.

We've had three different eras:
1. The era of following one big vendor and nothing else.
2. The era of using many large open source frameworks.
3. The era of tiny.

If we go 10-20 years back in time, we find ourselves in the time I call "follow one big vendor and nothing else". You found all you needed at one single place. It seemed to be simple and it seemed to be safe. You were taken care of and didn't have to think for yourself all that much.

Often the tools helped you quickly create loads of ugly code that you then had to maintain. There was a lot of focus on "drag till you drop" instead of writing code.

Large database-oriented programming models were often used, along with big solutions whose purpose was to create the illusion that you weren't developing for the web when in fact you were. When the vendors were demonstrating their solutions they often said: "And all this without me writing a single line of code." When you looked at the result behind the scenes it made you cry, but not tears of joy.

After a while some people lost their tempers and there was something of a revolution. From about ten years ago up until now there has been more and more focus on open source and strong communities. Large, generic and important frameworks grew up and were used a lot. Let's call this era "use many large open source frameworks".

The focus was on agile development, DDD, elegant code, TDD, craftsmanship, MVC, O/RM and REST. A lot was good and sound, but a common problem was that whoever used the largest number of design patterns in his solution won. It was also now that many of us learned the hard way that large, generic frameworks shouldn't be invented up front; the chance of success increases radically if you harvest them instead.

The third stage which is currently emerging is more evolution than revolution. I call it "the era of tiny".

A lot is still around from before, such as test-driven, real DDD and focus on superb code. However, there is a little less focus on using communities and open source projects in order to find all you need in the form of large generic frameworks.

Instead the focus is on turning inwards and solving tasks with tiny, specific framework solutions without all the unnecessary extras. An alternative is to have dependencies on extremely small code snippets that each solve one single, specific task in a good way. Often those are consumed as source code instead of as binaries.

A typical example, which also makes the cooperation between developer and client even better, is to use domain-specific languages so that the documentation is also executable.
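
As a sketch of what that can look like (the discount rule and all the fluent names below are made up for the example), a tiny, specific DSL can make a requirement read almost like documentation while still failing loudly when the rule is broken:

    // A minimal sketch of executable documentation via a tiny, specific DSL.
    // The discount rule and all names are hypothetical examples.
    public class DiscountSpec {

        public static void main(String[] args) {
            givenACustomerWhoHasOrderedFor(12_000)
                .whenPlacingAnOrderOf(1_000)
                .thenTheDiscountShouldBe(100); // say, 10% loyalty discount above 10 000
        }

        private int orderedSoFar;
        private int currentOrder;

        // The tiny DSL below is what makes the "documentation" above executable.
        static DiscountSpec givenACustomerWhoHasOrderedFor(int amount) {
            DiscountSpec spec = new DiscountSpec();
            spec.orderedSoFar = amount;
            return spec;
        }

        DiscountSpec whenPlacingAnOrderOf(int amount) {
            this.currentOrder = amount;
            return this;
        }

        void thenTheDiscountShouldBe(int expected) {
            int actual = orderedSoFar > 10_000 ? currentOrder / 10 : 0; // the rule under test
            if (actual != expected) {
                throw new AssertionError("Expected " + expected + " but was " + actual);
            }
            System.out.println("OK: discount is " + actual);
        }
    }

A domain expert can read the first three lines and object to the rule itself, which is exactly the kind of cooperation meant above.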

If you receive the following comment...

    Have you solved the problem with that tiny solution? That's not very "enterprisey".

...then you have done something good.

(First published in Swedish as a developer chronicle here.)

2012-01-18  #

Developer Chronicle: Too many chefs spoil the code

Let's start with a fairly common question:

"Ouch, we're so behind with the project, what on earth shall we do?"

A response that is just as common is:

"Add more resources! Increase head count as much, and as fast, as possible!"

Honestly, how many of us have spontaneously responded just like that?

Over 35 years ago Frederick Brooks wrote the book "The Mythical Man-Month", where he shows a formula for how to calculate how much the delay increases in a delayed project if you add one more person. How is it, then, that so many think it'll help?
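
The relationship most often cited from that book is the intercommunication formula: with n people on the team there are n(n-1)/2 possible communication paths, so the coordination overhead grows roughly with the square of the team size. A quick worked example:

    paths(n) = n(n-1)/2:  paths(5) = 10,  paths(10) = 45,  paths(20) = 190

Going from five to ten people doesn't double the communication burden, it more than quadruples it, and every new path is another opportunity for misunderstanding and waiting.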

What is even worse, apart from the delay being likely to increase, is that projects with too many people involved have a tendency to produce a large code base. That is almost to be expected, like a law of nature. Of course a code base can be too large for many other reasons too, but this is one important factor.

Each developer would naturally like to be productive and, just like the others, add X number of lines a day. So the growth of the code base size is directly related to the number of developers.

Even if I am allergic to projects that become overcrowded over time, there's something I dislike even more: projects that are overpopulated from day one. On top of the problem of a too-large code base, there is the additional problem of X number of people waiting for something to do, which costs a lot to no avail. Pretty soon everyone starts to create their own architecture. Similarly, requirements are created wildly on speculation. The project is simply in a very bad position from the outset.

Why does this happen? I think one reason could be that many people see software development as a keyboard-intensive task. Sure, it never hurts to be fast on the keyboard, but mostly because the typing shouldn't distract from the thinking and the mental models.

If you are twice as fast at the keyboard as your colleague is, are you twice as good as your colleague?

Perhaps, but if you believe a quote attributed to Bill Gates, in which he is said to have claimed that a super programmer is worth 10 000 times the salary of a mediocre one, your double-speed keyboard skills are a very small advantage. Other things are probably way more important.

When I think about it, we have a saying for this phenomenon; the adage is probably much older than Brooks's book and it goes: "too many chefs spoil the broth." So, avoid adding loads more head count on the problem at all costs.

(First published in Swedish here.)

2011-05-10  #

Developer chronicles

I've been writing the developer chronicle for Computer Sweden (in Swedish) for a couple of months. I'm going to republish the articles as blog posts. The first one is called "Lasagna is not a good role model".

2011-04-18  #

Developer Chronicle: Lasagna is not a good role model

How about a spot of banter about an architectural phenomenon that I often see in various projects? Since I've made the same mistake in many projects, I guess it's okay to joke about it.

The phenomenon I'm referring to is that there seems to be an architecture principle in many projects which says "the more layers the better." The principle is very common in .NET and Java projects, especially in the ones known as enterprise projects.

Why does this happen? There are many explanations, but the following stand out as the most common:
  • "It can be good to have many layers. You never know when a layer might turn out to be useful."
  • "Person X / Book Y says it should be like that."
  • "Everything should be similar and follow the same structure."
I think all the reasons above are really bad. The first is an excellent example of accidental complexity. Speculation in advance increases complexity. It is best to resist speculating and make things as simple as possible instead so that they are changeable when, or if, needed.

The second reason ignores that "context is king". No solution is always the best; it obviously depends on the situation.

But perhaps worst of all is the third reason. The code should not be similar, because there are different problems and different situations. And if it really can be similar, that sounds like a serious violation of the DRY (Don't Repeat Yourself) principle; it probably shouldn't be hand written at all.

Someone wrote a while back that "Architecture is like lasagna. The more layers, the better." I totally disagree, both in terms of architecture and the lasagna. The taste is not in the number of layers. In fact, that quote makes me long for spaghetti.

What are the drawbacks when having too many layers?
  • When something is changed, there is a tendency that the change affects more than one place in the code due to all the layers.
  • You get tons of completely uninteresting code which means that there is more to read and it takes longer to locate what you are looking for.
In short, it becomes more expensive to maintain.
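
To make the second point concrete, here is a sketch (hypothetical names) of the kind of pass-through code that tends to pile up; only the last class does any actual work, yet a renamed field or an extra parameter means touching all of them:

    // A sketch with hypothetical names: four "layers" that add nothing but indirection.
    class LayerDemo {
        public static void main(String[] args) {
            System.out.println(new CustomerController().customerName(42));
        }
    }

    class CustomerController {
        private final CustomerService service = new CustomerService();
        String customerName(int id) { return service.customerName(id); }
    }

    class CustomerService {
        private final CustomerFacade facade = new CustomerFacade();
        String customerName(int id) { return facade.customerName(id); }
    }

    class CustomerFacade {
        private final CustomerRepository repository = new CustomerRepository();
        String customerName(int id) { return repository.customerName(id); }
    }

    class CustomerRepository {
        // The only method that actually does something.
        String customerName(int id) { return "Customer #" + id; }
    }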

There seems to be a common progression among developers. At school, an early project starts with all the code in a single layer.

Then you move on to an extreme number of layers, maybe eight to twelve, before finally becoming situational and letting the problem and its solution decide the right number.

The approach I recommend is that you do not focus on layers at the start of new projects, but focus on understanding the problem and describing the solution as simply as possible.

As the need for more layers proves itself you add them, but only after having critically evaluated the value relative to the cost.

You won't win by using more layers than anyone else.

(First published in Swedish here.)

2011-04-18  #

More than 10 000 km apart...

...but the problems are the same.

I was in a phone meeting some time ago with a big Asian company to discuss their problems. There were eight participants, and only one of them spoke and understood English. So when I said something, the English-speaking guy relayed it in their language and I didn't understand at all, and the same happened when they discussed things among themselves before answering me. The situation made the meeting very tricky and made me nervous and uneasy. Large distances in language, culture and geography... Until they told me about their problems. All of a sudden, the meeting was transformed into well-known territory.

2010-03-23  #

It works both ways

I've spent a good part of my career trying hard to better understand the businesses I'm developing software for, since that helps *a lot* in delivering more value. This is not anything I'm done with. On the contrary, it's something that I find increasingly important. You can't program what you don't understand. Period.

This works both ways. If a business person wants their software development projects to be more successful, it pays off immediately to show an interest in the projects, to get involved with them, and to learn more about software development and all its new and rapidly changing rules. For instance, in order to be able to adjust a software project for the better, it helps if you understand what is going on so you know what you'd like to change.

2010-02-18  #

Drivers for NOSQL

About two years ago I gave a few presentations about how relational databases are facing more and more competition and that several factors indicate that we will be choosing other ways of storing data in the not too distant future.

The term NoSQL was created last year and has morphed into NOSQL lately (Not Only SQL). This trend is rapidly gaining strength. Below I have written down some of the drivers behind the NOSQL movement. Which ones are missing?
  • TDD and DDD: for many, this changes the picture completely. The focus moves away from the storage technology.
  • Application databases instead of integration databases: the storage technology becomes a private concern of the service/application itself, and the decision becomes less dramatic.
  • Internet scale: the need for scaling out is changing the scene for certain scenarios, and eventual consistency is most often good enough.
  • For certain applications/services we don't need many of the common capabilities, and we just don't want to pay a high cost for impedance mismatch if we don't get other large benefits in return.
  • Querying and analysis finally move away from the production databases: just an example of one reason why we have less of a need for certain functionality in specific situations.
  • Certain types of applications don't fit the relational model very well: a classic example is "bill of materials"; nowadays social networks are an often mentioned example.
  • Event sourcing is growing in popularity: this might mean that you only need a persistent log file for that part, a pretty basic need (see the sketch right after this list).
  • Need for rapid schema evolution, or very many schemas, or no schemas at all.
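
As a sketch of the event sourcing point (hypothetical names, and a plain in-memory list standing in for the persistent log file), state is never updated in place; it is derived by replaying the events:

    import java.util.ArrayList;
    import java.util.List;

    // A minimal sketch (hypothetical names) of the "persistent log" idea behind
    // event sourcing: appending events is the only write operation we ever need.
    public class AccountEventLog {

        // In a real system this would be an append-only file or store.
        private final List<String> events = new ArrayList<>();

        void append(String event) {
            events.add(event);
        }

        // Current state is a pure function of the event history.
        int balance() {
            int balance = 0;
            for (String event : events) {
                String[] parts = event.split(":");
                if (parts[0].equals("deposited")) balance += Integer.parseInt(parts[1]);
                if (parts[0].equals("withdrawn")) balance -= Integer.parseInt(parts[1]);
            }
            return balance;
        }

        public static void main(String[] args) {
            AccountEventLog log = new AccountEventLog();
            log.append("deposited:100");
            log.append("withdrawn:30");
            System.out.println(log.balance()); // prints 70
        }
    }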

2010-02-18  #

New TDD course

factor10's course "Test First with TDD" (or "TDD på riktigt" in Swedish) is given as an open course for the first time in Malmö in Sweden in April. You can find more information and book a seat here.

Oh, and the DDD course is taking place next time in Stockholm in April.

2010-02-18  #

A new and *good* "VB"

As I write this, the yearly SAW is under way. One of the subjects I put on the table at last year's workshop was that I thought (and still think) it's about time for a new and *good* "VB".

What do you think? Has it become significantly easier to build software since 1991? Have we as a group become significantly more productive? (If so, would you say "yes" even if we take maintenance into account?)

VB had its merits and its flaws, but I think it represented a big shift in productivity for a big crowd. That said, it's not exactly a new VB I envision. It's something kind of in its spirit, but something that doesn't put lots of obstacles in the way of creating *good* software. It should help in creating software that can be maintained and taken further, maybe by more skilled developers if needed, after the first successful application is built and used for some time. Probably also in making it easy to involve business analysts, domain experts, testers...

As I wrote here, there is no lack of new RAD tools ("drag till you drop" etc), but they are not focused on creating expressive and beautiful code or maintainable software. Is it a law of nature that approachable tools must create bad results below the surface? I don't see that. (Please note that I *don't* expect a tool to make design and software development easy, that's definitely not what I'm talking about. I just don't think it has to add lots of accidental complexity either.)

Why so little (no?) interest from the big vendors? It seems quite easy just to harvest some obvious and proven good ideas and concepts, put them together in a nice and approachable package, and take a big leap, and I think it would represent a huge difference for loads of non-alpha geeks. Sure, it's a moving target, but this might not be moving at all as fast as some of the current RAD attempts...

I put together my own toolbox as do many others, of course. We would probably do that anyway, but that's not the case for everybody. As I understand it, lots of developers go with the tools they get in the single package from the single vendor of their choice.

OK, there are attempts here and there for what I in the header called a new and *good* "VB". Those attempts are as far as I know from small vendors (and open source of course) and I guess this will spread.

Or maybe the "situation" will have been solved at SAW in a few days?
:-)

2010-01-20  #

The DDD course in Malmö

It's time for factor10's course "Fast track to Domain-Driven Design (DDD)" again (in Swedish). This time in Malmö in a few weeks.

2010-01-20  #

The big picture of software development

Very often when I have tried to understand certain situations that have cropped up over the years and tried to explain where I believe the problem lies, I find myself using a certain simple model again and again. I should have written it up a long time ago, but better late than never, eh?

The model is based on the fundamental questions of Why, What and How.

Why do we exist?
That's a pretty philosophical question. However, in the context of why we as developers exist, why we are needed, why companies pay us to write code, I believe the reason is that we are expected to create business value. That answer would seem to be pretty reasonable for most people in most situations.

What is it that we do to create business value?
I think the answer to that question is code, great code! It's not only the code itself, but its surroundings too, like tests, docs, build scripts, and such. Yet the very code itself is of crucial importance.

If we are in control of the code, then the business people can come with a request for a new feature that will create a new business opportunity and we can achieve it in a matter of, say, a few weeks. A couple of changes in the codebase might translate into millions of dollars. That codebase leads to high business value and the cost for achieving it is low.

The opposite is a codebase that has rotted to the extent that nobody dares touch it at all. The developers walk around it; they even leave the company just so as not to have to work with it. That codebase might need years before it can deliver the aforementioned business opportunity. Time to market, risk and cost of change mean that the business value of this codebase is very low.

(There are of course ways of transforming a bad codebase into a great one, but that's another story.)

How do we create great code?
How you achieve great code varies. Some techies would spontaneously suggest Test-Driven Development (TDD), Domain-Driven Design (DDD), architecture focus, refactoring, and so on. You just add here what works for your team, the stuff that lets your codebase add business value.

What I find over and over again is that a good many companies have focused all their attention on implementing a new process. Which process doesn't seem to matter, it varies over time and lately it has been Scrum. It has been seen as a panacea. At first it seems to go very well, but after just a few sprints, the velocity drops. The reason is that the codebase and the architecture haven't received any attention at all... Sure, a suitable process can often help a good team to become even more productive, but it won't compensate for lack of skills. So, I added "Iterative process" as an example of something that is part of the picture for achieving good code. My point is just that it's most definitely not the only thing, of course it isn't. (Also read my factor10 colleague Niclas' blog post called "The holistic view".)

If I had to pick one answer to how, I believe the single most important way of achieving great code is to have the best people you can possibly find, and to have only a few of them: as few as possible, but not fewer. On top of giving you great code, that would actually also be cheap. Of course you have to pay more for the best, but they will be extremely productive (for real, not fake productivity), and since there are so few of them, the financial side becomes very appealing too.

I find that quite a lot of people strongly believe that products will help a lot. After 20+ years in this business, I believe that products are more often the problem than the solution so care must be taken.

Similarly, there is the situation of companies believing they need expert help from the vendor of the very product they are having trouble with. To me, that's a smell regarding that product.

"We can't solve problems by using the same kind of thinking we used when we created them."
--Albert Einstein

Another silver bullet that doesn't work (quite the contrary) is trying to solve the problem by increasing the head count. That's just a good way of increasing the problem and the cost. Quality over quantity, every day!

To summarize what I said above, here's a sketch:

[Sketch summarizing the model: Why (business value), What (great code), and How (the best people, TDD, DDD, refactoring, an iterative process, and so on).]

Some more comments
Techies immediately seem to agree with this model. But, and this might come as a surprise, business people seem to like it even more. They might say something like: "So that's why the developers are complaining that I don't value quality. I used to think it was just whining, but now I realize it's about sustainable business value."

Quite often I find that developers ask questions such as: What feature shall I do first? Where shall I focus my refactoring efforts? What automatic tests are the most important? According to this model, there's a clear and simple answer to those questions that points us in the right direction. Choose whatever gives the most business value.

A colleague of mine goes as far as using the code quality of the important codebases for determining if a company will have a chance for success or if it will go out of business. If we aren't fully in control of the codebase, that's a setup for failure. (Obviously, these companies are not automatically doomed, not if they grasp the situation and act with power and intelligence.)

So, the codebase is probably quite important, perhaps the most important asset of the company. It's not something you allow just anybody to work with and make changes to. It's like if you need brain surgery; would you allow someone who had just done a crash course in brain surgery to operate? Of course not, you would probably try to find the best and most experienced person there is. That's quite the opposite approach to how codebases are dealt with in many companies.

2009-12-16  #

Fast Track to DDD in Sweden

factor10's course "Fast Track to Domain-Driven Design" comes to the Northern hemisphere too: Stockholm in November, taught by little me. More information (in Swedish) here.

2009-10-23  #

It should be easy to build *good* software.

Earlier this year I was asked by a large vendor to review an early draft of a new tool. The idea of the tool was to help developers become more productive, that is, to make it easy to build software. But as soon as I started looking, I got the feeling that it wouldn't help the productivity when creating *good* software (and the more I looked, the more apparent it became).

That particular tool doesn't matter, but thinking more about it, unfortunately it actually feels like it's been the same thing for quite a few years and for quite a few development tool vendors. It's obvious we need a much more interesting goal, isn't it?

It should be easy to build *good* software!

2009-08-20  #

DDD in the cloud

I'm going to give a workshop about DDD in the cloud at Progressive.NET in Stockholm next week. See you!

2009-08-20  #

Fast Track to DDD in South Africa

I think Aslam Khan has written some of the most interesting pieces related to DDD lately. Now he will teach the two day course "Fast Track to Domain-Driven Design" in Cape Town in early September. You find more information here.

2009-08-16  #

The CCC-article translated to Chinese

Xiaogang Guo has translated my CCC-article into Chinese. The article will be published in Programmer Magazine in an upcoming issue. You find the translation here.

2009-04-14  #

Article: Chunk Cloud Computing (CCC)

As a follow up to this blog post I have written an article called "Chunk Cloud Computing (CCC)".

2009-03-29  #

Some observations over the last few years

Before: Focus on layering
Now: More focus on partitioning

Before: Focus on loose coupling
Now: A balance (include high cohesion)

Before: Huge teams
Now: Extremely small teams

Before: Integration database
Now: Application databases

To be continued...

2009-03-27  #

NWorkspace == Nilsson Workspace?

I heard a rumor about someone being upset with me because I had named a little framework after my own name. He thought the N in NWorkspace stood for Nilsson. Well... No.
:D
And when I'm at it, that's not the case for NUnit, NHibernate, NPersist, NDepend or NCover either...

2009-01-06  #

Strategy workshop about Data Access

I've developed a workshop called "Strategy workshop: .NET data access" and Programutvikling in Oslo, Norway, has invited me to run the workshop as an open workshop in a few weeks. You find more information here (in Norwegian).

Here are a few extracts from the description in English:

"There's a lot going on right now in .NET land when it comes to data programmability.
...
There are many choices available now and each choice has an impact on code quality, maintainability and, overall, bottom line costs.
...
At the same time lots of .NET developers are starting to embrace Domain-Driven Design (DDD) and Test-Driven Development (TDD), which change the scene much more than most people at first realize.
...
Suddenly, the choice of data programming options is a strategic decision. What data programming option must I choose so that my teams can spend more time on solving the problem and less time on how to get and save the data?
..."

2009-01-06  #

Øredev 2008

There is going to be a DDD track at Øredev this year - the first time this has happened at a conference. I'm going to be giving a presentation in that track, which I have also been given the honor of hosting. The other speakers are Randy Stafford, Dan Bergh Johnsson, Einar Landre and Eric Evans.

Of course there are more DDD people around, such as my factor10 colleague Aslam Khan from South Africa. I'm pretty sure he will be DDD-centric in his talks as well as in his workshop. Other DDD-centric presentations will be given by Patrik Löwendahl, Rickard Öberg and Claudio Perrone.

2008-11-16  #

DDD anti-motivation

OK, I've published a long list of very positive blog posts about DDD. Now I think it's time to balance that a little bit with some problems. Here's the first one:

New (to most developers)

Even though DDD in itself isn't at all new, it's new to those that haven't tried it before, of course. And if you work in a big team with no prior experience of DDD, then it's very risky using it for a new and time constrained project. There are ways of reducing the risk, but it's still there and it's real.

As a matter of fact, DDD is new to most tool vendors as well. Unfortunately that means that quite often the tools don't help us that much. Sometimes they are even more in the way than they are helpful.

2008-10-24  #

Swedish surnames

A friend of mine here in Sweden told me a story from when he worked for a large telco here and they sent him to one of their offices very far away to work for a few weeks. When he arrived, he was treated like a king and he couldn't understand why. After a while he understood that they thought he was a direct descendant of the founder of the company Ericsson, since his last name was just that. Here we wouldn't think like that, since Ericsson is a VERY common surname.

Another very common surname in Sweden is Nilsson, the second most common actually. Even so, more or less every week I hear from somebody who thinks that one of my colleagues at factor10, Niclas Nilsson, is my brother. That's not the case. I just wanted to work with him because he is brilliant and very nice.
:-)

Something I'm not that used to, though, is what happened the other day when Niclas met an old student of mine. My old student asked Niclas if I'm Niclas' father! Hey, I'm not even three years older than Niclas...

2008-10-24  #

Strategic Design (DDD Motivation 7)

A single domain model and a single application typically aren't isolated things, without anything around them. You need to think about the big picture as well. Strategic design is about that.

Eric Evans has said that if he were to rewrite the book, he would put the chapters on Strategic Design at the front of the book and not at the end, because nobody reads that far, yet that part of the book is perhaps the most important.

I know, this in itself is no reason for you to try out an alternative to database-driven design, because strategic design isn't its opposite at all; it's not even at the same level. Whether you go for db-driven design or domain models, I think strategic design is very important, and therefore it felt important to mention it as a reason to have a closer look at DDD.

2008-10-16  #

A new course on implementing DDD and LINQ

My factor10-colleague Anders Janmyr and I have developed a new course on implementing DDD and LINQ. Anders will teach the course for Cornerstone in Gothenburg and Stockholm next week as well as in December. You can find more information about it here (in Swedish).

2008-10-16  #

Have you heard about LINQ?

I'm pretty sure you have, of course. The father of LINQ, Erik Meijer (Microsoft), will be giving a mini-event together with factor10 in Malmö on October 3rd. It's free of charge. Read more about it and register here (some of the information is in Swedish). First come, first served...
:-)

2008-09-22  #

DP Advisory Council at Microsoft

I've been invited by Microsoft's Data Programmability team to be a member of an Advisory Council for Data Programmability and Domain-Driven Design (DDD). I'm really looking forward to it, it's going to be interesting and a lot of fun. More information here.

2008-06-19  #
