VS2010/.NET4 Coming March 22nd

October 19, 2009 drub0y Leave a comment

In case you haven’t heard, VS 2010 and .NET 4.0 are officially scheduled for release on March 22nd. If you’re into playing with Betas (I know I am) and have an MSDN subscription you can go download it now. Those of you without subscriptions will need to wait about a week or so.

What I like most about this version of VS2010 is that someone at Microsoft finally wised up and realized that selling a gazillion different flavors wasn’t working and just confused/angered everyone. So now they’ve got just three:

  • Microsoft Visual Studio 2010 Ultimate with MSDN. The comprehensive suite of application life-cycle management tools for software teams to help ensure quality results from design to deployment
  • Microsoft Visual Studio 2010 Premium with MSDN. A complete toolset to help developers deliver scalable, high-quality applications
  • Microsoft Visual Studio 2010 Professional with MSDN. The essential tool for basic development tasks to assist developers in implementing their ideas easily
Categories: .NET, Team System

Microsoft Announces Shared CDN for Common AJAX Scripts

September 16, 2009 drub0y Leave a comment

ScottGu broke the news last night that Microsoft is making a shared CDN available for the purposes of hosting the AJAX scripts. The full details of the scripts that are supported right now are available here, but basically it’s ASP.NET AJAX 4.0 Preview 5 (which just came out) and jQuery 1.3.2.

If you’re using ASP.NET AJAX on the server side, you can just tell the ScriptManager to use the CDN by setting the EnableCdn property to true. I find this implementation a little misleading because it only applies to the scripts within System.Web and System.Web.Extensions. If you want support for other scripts you would need to put them on your own CDN, and still hook the ScriptManager::ResolveScriptReference event and set the ScriptReferenceEventArgs::Script property accordingly.
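For illustration, here’s roughly what that looks like in code-behind (the ScriptManager1 control ID, script path and CDN URL below are made-up placeholders, not anything the framework prescribes):

protected void Page_Load(object sender, EventArgs e)
{
    // Serve the System.Web/System.Web.Extensions scripts from Microsoft's CDN.
    ScriptManager1.EnableCdn = true;

    // Everything else still has to be redirected by hand.
    ScriptManager1.ResolveScriptReference += (s, args) =>
    {
        // Point one of our own script references at our own CDN copy.
        if (args.Script.Path == "~/Scripts/MyWidgets.js")
        {
            args.Script.Path = "http://cdn.example.com/scripts/MyWidgets.js";
        }
    };
}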

Categories: ASP.NET, Web Development

ASP.NET AJAX 4.0 Preview 5 Released, Includes “Disposable Objects” Performance Fix!

September 10, 2009 drub0y Leave a comment

Good news, ASP.NET AJAX 4.0 Preview 5 is here! Better yet, Microsoft has overhauled the implementation of tracking disposable objects to include the performance enhancement that was discussed in my “ASP.NET AJAX ‘Disposable Objects’ Performance Heads Up” posts (Part I & II).

So how’d they do it? They tag each disposable object with an integer value which represents the object’s position in an internal array, using a property named “__msdisposeindex”, when it’s registered. When it’s unregistered they delete that property from the object and from the array. One small performance heads up: there is a hidden cost of reallocating the array once 1000 objects have been removed. This keeps the array from growing forever should you keep an application open in the browser for a long time and create/destroy lots of disposable objects.

So, it’s not exactly as discussed previously since they use an array/index approach instead of a global, incremented counter with a hash, but… problem solved!
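In rough JavaScript terms (a simplified sketch of the idea, not the actual MicrosoftAjax.js source), the bookkeeping now works something like this:

var _disposableObjects = [];

function registerDisposableObject(object) {
    // Tag the object with its slot in the array so unregistering never has to scan.
    object.__msdisposeindex = _disposableObjects.length;
    _disposableObjects[_disposableObjects.length] = object;
}

function unregisterDisposableObject(object) {
    var index = object.__msdisposeindex;
    if (typeof index !== "undefined") {
        delete _disposableObjects[index];   // leaves a hole; the array gets compacted after ~1000 removals
        delete object.__msdisposeindex;
    }
}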

Categories: ASP.NET, Web Development

Switching to Disqus comment system

August 31, 2009 drub0y Leave a comment

I’m working on switching this blog over to the Disqus commenting system. I have it in place for commenting now, but still need to find a way to export all the old comments and import them into Disqus so I can keep the legacy discussions alive.

Categories: Personal

Full Expression Tree Support Coming in .NET/C# 4.0

August 24, 2009 drub0y 3 comments

Yesssssssss! I’ve been waiting and hoping that this was coming in 4.0 and now it’s official: Full Expression Tree Support. This seems like a HUGE step forward to me. With the advent of this feature, one can finally implement support for converting an expression written in C# to something that runs on the GPU. For example, this would enable a scenario in the future where we can actually write WPF shader code using C#/VB and have it be considered safe, because a “GLINQ” interpreter could inspect the statements before converting them to shader code on the fly at runtime.

I haven’t blogged in forever (something I want to get back into), but this news warranted a post immediately. Really excited, hope I can find some time to tinker with starting an expression-tree-based GPU library.

UPDATE/CORRECTION:

Ah, darn it. Looks like I read the post wrong. While full support is coming to the expression library, it looks like C# 4.0 will still not implement compiling of full method bodies into expression trees. Do they really think that having the ability to hand-write expression trees with the expression library is of any use? ‘Cause I don’t really see it happening… gotta have the language support for it to take off. :\

Guess it’ll be another four years of not being able to implement something like rich GPU support inside of .NET languages.

Categories: .NET

ASP.NET AJAX “Disposable Objects” Performance Heads Up – Part II

April 14, 2009 drub0y Leave a comment

Ok, I had to put together a Part II to this topic because I was totally wrong in Part I about objects being able to be used as keys because… well, I’m an idiot and didn’t do all my fact checking to make sure my implementation was 100% sound. Thanks to Dave Reed who commented on the original post pointing out my flawed thinking.

Mea culpa

Basically Dave points out that JavaScript objects are really specialized hashtables called associative arrays where the keys absolutely MUST be a string OR a type which can be converted to a string. Now because we’re using Object subtypes here, they would have to override the toString() method to provide a meaningful string if we expected them to be used as keys. Well, the instances we were stuffing into _disposableObjects weren’t providing any such toString implementation, nor do we want to impose one. So, is all lost? No!

Redemption!

Ok, so I totally failed with my first approach, but I shall now redeem myself! As soon as my other idea was beaten down, set on fire and stomped out with golf cleats (actually Dave was rather nice, it just FELT like that’s what happened), I quickly came up with another solution. Here’s the nitty gritty (a rough sketch in code follows the list):

  1. Sys._Application keeps an internal counter called _disposableObjectsNextId which starts off at the minimum value of a 32-bit integer: -2147483648. I chose this for ease and because it provides us with billions of identifiers which, unless your app runs for a really long time and/or instantiates and disposes of billions of objects, should have us covered.
  2. Sys._Application has a constant hanging off of it called _DisposableObjectIdKey which I’ve decided to make “!!disposableObjectId”. I don’t see that colliding with many other chosen JavaScript key names, but because it’s a constant we could change it to be something long and totally ASP.NET AJAX specific to avoid the possibility of collision.
  3. Each time Sys._Application::registerDisposableObject is called we “tag” the incoming object with the next identifier using the constant _DisposableObjectIdKey. Next we use that identifier as the key on the _disposableObjects hashtable with the object as the value.
  4. When Sys._Application::unregisterDisposableObject is called we look up the value of the id on the incoming object using the _DisposableObjectIdKey, and then delete that key from the _disposableObjects hashtable.
  5. For the Sys._Application.dispose implementation we simply for..in the keys on the _disposableObjects hashtable and call dispose on each item that is registered.
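Expressed as a simplified sketch (standalone functions here rather than the actual patched Sys._Application members), the steps above boil down to:

var _DisposableObjectIdKey = "!!disposableObjectId";
var _disposableObjects = {};
var _disposableObjectsNextId = -2147483648;   // Int32.MinValue, per step 1

function registerDisposableObject(object) {
    var id = _disposableObjectsNextId++;
    object[_DisposableObjectIdKey] = id;      // step 3: "tag" the incoming object with its id
    _disposableObjects[id] = object;          // keyed by the id, not the object itself
}

function unregisterDisposableObject(object) {
    var id = object[_DisposableObjectIdKey];
    if (typeof id !== "undefined") {
        delete _disposableObjects[id];        // step 4: no array scan required
        delete object[_DisposableObjectIdKey];
    }
}

function disposeAll() {
    for (var id in _disposableObjects) {      // step 5: the Sys._Application.dispose pass
        _disposableObjects[id].dispose();
    }
}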

The new performance picture

We’re doing a little more than before here, so naturally we’re gonna take a hit someplace. How bad is it? Do we still outperform the array?

IE 6.0.2900.5512.xpsp.080413-2111

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             160           1 (+0)            160x
1000            711           10 (+0)           71.1x
5000            *33298        90 (+9)           369.9x
10000           *138279       180 (+19)         768.2x

IE 8.0.6001.18702

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             91            5 (+1)            18.2x
1000            429           11 (+2)           39.0x
5000            *27168        57 (+10)          476.6x
10000           *110025       114 (+10)         965.1x

FireFox 3.0.7

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             21            1 (+0)            21.0x
1000            79            2 (+0)            39.5x
5000            1891          11 (+0)           171.9x
10000           7608          22 (-1)           345.8x
Safari 4 Public Beta (5528.16)

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             63            1 (+0)            63.0x
1000            263           2 (+0)            131.5x
5000            6805          5 (-4)            1361.0x
10000           *28315        12 (-6)           2359.6x

*Indicates that I had to escape the “Slow running script” dialog to even be able to continue execution on these.

Well, we’re still handily outperforming the Array implementation. We took hits in both IEs, but the hit was worse in IE8, which is absolutely baffling. FireFox didn’t really change; in fact the 10000 object test gained a millisecond. Finally, Safari 4 gained in the 5000 and 10000 object tests!

Is this a hack?

So, some will look at this and say: hey man, that’s really hacky that you’re just slapping a random key/value (aka “expando” property) on an arbitrary JavaScript object like that. Wellllll, this is JavaScript and from where I’m standing that’s one of the powerful features of this language. It’s kinda like DependencyObject if you’re familiar with WPF. In most cases that I can think of this is perfectly harmless because, as long as people don’t know about it or go messing with it on purpose, it can’t hurt anyone. The only case where this could potentially hurt is if someone’s implementation uses a for..in on the object to enumerate all of its keys. That would now turn up our _DisposableObjectIdKey and that could be bad. There is only one aspect of the framework that really does that and that’s when working with JSON [de]serialization. In the case of JSON objects though, you’re talking about “pure” data objects and those are not going to be registered as “disposable objects” anyway. So, the real question for me is: Is this “hack” worth the performance gains as long as it comes with a small documentation note that explains how this extra field could affect callers? And for me, the answer is “hell yes”.

Conclusion

Ok, so I screwed up my first approach, but hopefully this second one helps me save some face. We had to do a little extra work inside the framework code base and gave up a wee bit of performance in IE, but we’re still posting huge gains over the existing implementation. I am providing my updated version of the performance test and framework scripts below and will go update the CodePlex issue with this latest version as well. Now, I just have to hope I got it right so Dave doesn’t come back and teach me another lesson.

Links

  • Download the updated performance test and framework scripts.
  • Still want to see this fixed? Go vote on the issue over on CodePlex.
Categories: ASP.NET, Web Development

ASP.NET AJAX “Disposable Objects” Performance Heads Up

April 11, 2009 drub0y 4 comments

Update: Make sure you read Part II as there was ultimately a fundamental flaw in this implementation which prevents it from working as I originally thought.

One of the important features of the ASP.NET AJAX client side framework is the concept of disposing of components/controls so that they unregister event handlers and release DOM element references so that you don’t end up with circular references and, thus, memory leaks. The ASP.NET AJAX client framework takes the same approach as .NET does where there is a Sys.IDisposable interface which you can implement to indicate that your class requires disposal semantics. By implementing this interface certain aspects of the framework, as well as other consumers of your code, will recognize that they need to call your dispose implementation.
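To make that concrete, here’s a generic sketch of the pattern (the Demo.Widget name and its click handler are made up for illustration; the dispose plumbing is the standard MS AJAX class pattern):

Type.registerNamespace("Demo");

Demo.Widget = function (element) {
    Demo.Widget.initializeBase(this, [element]);
    this._clickHandler = null;
};

Demo.Widget.prototype = {
    initialize: function () {
        Demo.Widget.callBaseMethod(this, "initialize");
        this._clickHandler = Function.createDelegate(this, this._onClick);
        $addHandler(this.get_element(), "click", this._clickHandler);
    },
    _onClick: function () { /* ... */ },
    dispose: function () {
        // Unregister event handlers and drop DOM references so nothing leaks.
        if (this._clickHandler) {
            $removeHandler(this.get_element(), "click", this._clickHandler);
            this._clickHandler = null;
        }
        Demo.Widget.callBaseMethod(this, "dispose");
    }
};

Demo.Widget.registerClass("Demo.Widget", Sys.UI.Control);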

How the framework tracks “disposable objects”

The performance problem I want to discuss lies in the way the framework itself tracks known instances of “disposable” objects. First off, anything that derives from Sys.Component is automatically in this boat. Sys.Component is important because it is the base class of other important base classes like Sys.UI.Control and Sys.UI.Behavior. Sys.Component implements Sys.IDisposable, but also has some tight coupling with Sys.Application. Every time you create an instance of a Sys.Component a call is made during its initialization to Sys.Application.registerDisposableObject to which it passes itself in. This method takes whatever instance it is handed and calls Array.add to add the object to an array it maintains internally called _disposableObjects. Conversely, when Sys.Component’s dispose method is called it makes a call to Sys.Application.unregisterDisposableObject at which point the method calls Array.remove to remove the instance from the _disposableObjects array. The astute performance geeks are probably already starting to see where this is going, but before I get to the specifics let’s discuss why this register/unregister dance is even happening in the first place.

Why does it work this way?

So, why does Sys.Application need to even track these objects? Isn’t the person who created them supposed to dispose of them? Well, for the most part yes. However, there’s also the pattern of just creating controls via global functions, such as pageLoad, and then just forgetting about them. In either case, when the application is shutting down, either from an explicit call to Application.dispose (which is rare) or a navigation away from the page, it needs to be able to tell all those objects that it’s time to clean up.
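For example (reusing the hypothetical Demo.Widget from the sketch above), this is the “create it and forget it” pattern in question:

function pageLoad() {
    // $create constructs, initializes and registers the component with
    // Sys.Application; nobody keeps a reference, so only the application's
    // dispose pass can clean it up when the page goes away.
    $create(Demo.Widget, null, null, null, $get("someElement"));
}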

So what’s the problem?

Ok, so, what exactly is the problem? The problem is an array is used to store this list of disposable objects and, as mentioned earlier, when a component asks to be unregistered Array.remove is used. Array.remove uses Array.indexOf to find the position of the item in the list. Array.indexOf is implemented as a for loop indexing into the array and doing equality checks on each item until the item looking to be removed is found. So, the more disposable objects in the array, the worse your performance gets. In Big-O Notation, we’re talking O(n) here. Worse yet, if you consider the typical pattern where the most recently created objects are the ones most likely to be disposed of, you’re constantly having to scan through to the end of the array. And that’s not all! Once Array.remove has located the index of the item in the array it still has to splice the item out to actually remove it from the array, which incurs even more overhead.
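Boiled down (an illustrative sketch, not the framework source), the current bookkeeping is effectively:

var _disposableObjects = [];

function registerDisposableObject(object) {
    Array.add(_disposableObjects, object);      // append: cheap
}

function unregisterDisposableObject(object) {
    // Array.remove walks the array doing equality checks to find the index,
    // then splices the item out: O(n) for every single unregister.
    Array.remove(_disposableObjects, object);
}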

Seriously, is this gonna even affect me?

Right about now, some of you might be skeptical and wonder why this is such a big deal. I mean, who creates all that many components anyway? Well, I can tell you I’ve already run into this problem in a rich client ASP.NET AJAX application in production. You see, the power (and joy IMHO) of ASP.NET AJAX is that you’re encouraged to create more complex interactions by composing controls and behaviors much the same as you would with WPF/Silverlight. You just have to keep in mind that each control and behavior you attach to an HTML element adds to the _disposableObjects array we’ve been talking about here. Worse still is the power of templates, which make it sooo easy to repeat a bunch of controls/behaviors for each item being bound. You always need to be aware of how many controls/behaviors each instance of a template instantiates of course, but you also need to consider that even binding some text to a <span> element comes with the same cost because Sys.Binding is a Sys.Component subclass.

A proposed solution

So, how can we remedy this problem? Sure sounds like a problem for a HashSet<T> to me in .NET land. Hmm… too bad there isn’t some kind of a hashtable implementation in JavaScript, right? Well, actually, there is! A lot of people don’t realize it, but every JavaScript Object is really just a glorified hashtable of key/value pairs. All we need to do is use the power of the ability to dynamically add key/values to any JavaScript Object using the [key]=value notation. Keys don’t have to be strings or numeric types, any type can! So, with that in mind, if the internal _disposableObjects field on Sys.Application was just an Object and registerDisposableObject just added the instance being passed in as a key with null as a value and then unregisterDisposableObject just deleted that key from _disposableObjects, we could rely on the power of the JavaScript runtime implementation to find that entry in the list instead of having to scan the entire list looking for it ourselves. Now, naturally it depends on how the JavaScript runtime is implemented, but most implementations today are actually using “real” hashtables/hashcodes behind the scenes so the performance is wayyyyy better than having to index into an array and do equality checks ourselves.

Working code

So as not to be all talk and no action, I’ve actually updated the latest version of the ASP.NET AJAX Preview (version 4 as of this writing) with these changes and am providing my updated copy at the bottom of this post. I’ve also relayed this information to Bertrand Le Roy who is forwarding it to the ASP.NET AJAX team, who I hope will consider making the fix in the next drop since it’s completely internal to Sys.Application and very easily tested for compatibility. Just to make sure, I also entered an issue over on CodePlex which, if you’re interested in seeing this get fixed, you can go vote on.

Numbers don’t lie

Here’s a quick set of results from a performance test I slapped together where I instantiate some number of disposable objects and then dispose of them in reverse order to simulate the fact that, most often, the youngest of objects die off first. The machine where I ran these tests is a Core 2 Duo 2.66GHz, 4GB RAM, running Vista 32-bit SP2. The IE6 test was done in the IE6 test VM supplied by Microsoft which is running XP SP3. The Safari test was done on a 13” MacBook with a Core 2 Duo 2GHz, 2GB RAM running OS X 10.5.6. All tests were done with the release mode version of the MicrosoftAjax script and the numbers shown are the median of three consecutive runs. I also executed the test several times before recording the numbers to give the JavaScript engines a chance to employ any kind of code optimization they might use.

IE 6.0.2900.5512.xpsp.080413-2111

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             160           1                 160x
1000            711           10                71.1x
5000            *33298        81                411.1x
10000           *138279       161               858.9x

IE 8.0.6001.18702

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             91            4                 22.8x
1000            429           9                 47.6x
5000            *27168        47                578.0x
10000           *110025       94                1170.5x

FireFox 3.0.7

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             21            1                 21.0x
1000            79            2                 39.5x
5000            1891          11                171.9x
10000           7608          23                330.8x

Safari 4 Public Beta (5528.16)

# of Objects    Array (ms)    Hashtable (ms)    Gain
500             63            1                 63.0x
1000            263           2                 131.5x
5000            6805          9                 756.1x
10000           *28315        18                1573.1x

*Indicates that I had to escape the “Slow running script” dialog to even be able to continue execution on these.

No surprise that FireFox and Safari are crushing IE in both scenarios. It’s also no surprise that IE6 lags everyone else in both scenarios. Safari appears to have the best hashtable implementation of the three, though FireFox seems to have the best overall execution performance since it beats the others handily in the Array approach. One thing’s for certain, all the browsers show massive gains when moving to the hashtable approach.

Final thoughts

Assuming the ASP.NET AJAX team applies this simple change to the next version of the framework, there’s really not much to worry about going forward. Even if you had an application with 10000 registered disposable objects and, at any given time, you dispose of a more realistic number of components, say 200-300, from a template at once, the overhead of the unregisterDisposableObject implementation will now be so minuscule that all you have to worry about is the actual cost of the dispose implementations themselves.

Links

  • Download the optimized source and test page here.
  • Vote on the issue to make sure it gets addressed here on CodePlex.
  • Make sure to read Part II for details on how this original implementation was flawed.
Categories: ASP.NET, Web Development

Create your own ObjectContext Factory To Avoid performance penalties

April 6, 2009 drub0y 1 comment

Here’s a quick performance tip for those of you using ADO.NET Entity Framework. If you’re like me, you’re using EF in web services as your data access layer. Also like me, I’m sure you want your web services to perform their absolute best. Well, if you use the default constructor when instantiating EF ObjectContext classes, you’re incurring a metadata lookup penalty every single time, which includes the cost of three internally handled exceptions. As we all know, throwing exceptions, handled or not, is one of the most costly things there is at runtime in .NET.

For starters, here’s the pattern you’ll see a lot in samples and documentation which, if used in production code, will cost you:

using(MyEntities entities = new MyEntities())
{
    … perform LINQ queries and whatnot here …
}

In this sample MyEntities is the EDMX code-generated ObjectContext subclass that is specific to our model. The code-generated constructor will look like this:

public MyEntities() : base("name=MyEntities", "MyEntities")
{
    this.OnContextCreated();
}

As you can see, the base class constructor that is invoked is the one that uses the connectionString+defaultContainerName overload. Now this causes the following chain of events to occur (a sketch of a factory that sidesteps this follows the list):

  1. The ObjectContext needs to internally resolve an EntityConnection instance which it will use to do its communication with the DB. So it instantiates the EntityConnection by passing in "name=MyEntities".
  2. The EntityConnection checks to see if the connection string is just a named connection string or the full blown real deal. If it’s just a name, now the EntityConnection needs to take a detour through configuration land to resolve the full connection string from ConfigurationManager.ConnectionStrings.
  3. Now that the ObjectContext has an EntityConnection to work with, it needs a MetadataWorkspace to be able to know how to map between the conceptual model and the storage model. So, it’s going to go ask the EntityConnection for it by calling its GetMetadataWorkspace method.
  4. Well, with the path we took, the EntityConnection was created for us without a MetadataWorkspace instance, so it’s gonna have to go and resolve it.
  5. Assuming you’re using assembly embedded EDMXs (the default) this involves going through the assembly manifest to resolve the metadata.
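The post’s MyEntities sample names are reused below in a minimal sketch of the kind of factory the title is getting at. The caching strategy shown (resolving the full connection string once and handing it to the generated connection-string constructor) is an assumption on my part; depending on your scenario you might go further and cache the EntityConnection or MetadataWorkspace as well:

using System.Configuration;

public static class MyEntitiesFactory
{
    // Resolve the full EF connection string once, instead of letting every
    // new MyEntities() take the "name=MyEntities" detour through configuration.
    private static readonly string _connectionString =
        ConfigurationManager.ConnectionStrings["MyEntities"].ConnectionString;

    public static MyEntities Create()
    {
        // The generated context also exposes a constructor that accepts the
        // full connection string directly.
        return new MyEntities(_connectionString);
    }
}

With that in place the earlier sample simply becomes using (MyEntities entities = MyEntitiesFactory.Create()) { … }.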
Categories: .NET, Entity Framework

Getting a distinct list of changed files from TFS using PowerShell

April 1, 2009 drub0y 6 comments

If you’re like me and need to do code-reviews of other people’s stuff or maybe you just want to see everything that’s changed during a certain period of a project, then here’s a nice PowerShell tip for you.

First, make sure you’ve downloaded the latest version of the Team Foundation Powertools. Starting with the October ‘08 release, the tools now include a PowerShell snap-in that adds several commands which enable rich interaction with your Team Foundation Server. Of particular interest to us for this exercise though is the Get-TfsItemHistory command.

Now, let’s assume a scenario where we have a Team Project named $/Foo and we’ve been working on some patches in a v1.1 branch within that project. Now it’s time to review the work we’ve done in the current iteration which started on March 1st ‘09 and ended on March 31st ‘09. Here’s how we might start gathering those changes:

Get-TfsItemHistory "$/Foo/v1.1" -Version "D3/1/09~D3/31/09" -Recurse

Now, what this is gonna do is bring us back all the changesets that are related to any file underneath the v1.1 branch. While listing out the changesets is nice, it’s not going to tell me exactly which source files I need to review. So the next thing we need to do is make sure we bring back all the changes in the changesets by adding the -IncludeItems parameter to the Get-TfsItemHistory call. We’ll also want to flatten out the list because, again, we’re interested in the individual changes, not the changesets themselves. So we use Select-Object’s -Expand parameter to flatten out the list:

Get-TfsItemHistory "$/Foo/v1.1" -Version "D3/1/09~D3/31/09" -Recurse -IncludeItems | Select-Object -Expand "Changes"

Great, so now we have a list of all the changes that were made to each file in this release, but this is still a little noisy. For starters, change types such as deletes, branches and merges are shown here. Well, if the file was deleted there’s not much to look at now, so… I don’t want those in my list. Also, if a file was simply branched into the project from someplace else we don’t really care, because wherever it came from already underwent a review. Merges are also questionable as, hopefully, the merge was reviewed for any conflicts at the time the merge occurred. Plus, if there was a conflict, that means they must have changed the file, which will show up as a standalone edit anyway and still end up on our list. So, how do we filter that noise out? Like so:

Get-TfsItemHistory "$/Foo/v1.1" -Version "D3/1/09~D3/31/09" -Recurse -IncludeItems | Select-Object -Expand "Changes" | Where-Object { ($_.ChangeType -band ([Microsoft.TeamFoundation.VersionControl.Client.ChangeType]::Delete -bor [Microsoft.TeamFoundation.VersionControl.Client.ChangeType]::Merge -bor [Microsoft.TeamFoundation.VersionControl.Client.ChangeType]::Branch)) -eq 0 }

Alright, almost there! Now the only problem is that if the same file is changed multiple times it’s going to be listed multiple times, and really we just want the distinct names. This is a little tricky because the Change object contains the Item as a nested property, but luckily there’s a Select-TfsItem command to help read out what we’re interested in. After that we just do a little grouping and sorting and we have a list we can work with:

Get-TfsItemHistory "$/Foo/v1.1" -Version "D3/1/09~D3/31/09" -Recurse -IncludeItems | Select-Object -Expand "Changes" | Where-Object { ($_.ChangeType -band ([Microsoft.TeamFoundation.VersionControl.Client.ChangeType]::Delete -bor [Microsoft.TeamFoundation.VersionControl.Client.ChangeType]::Merge -bor [Microsoft.TeamFoundation.VersionControl.Client.ChangeType]::Branch)) -eq 0 } | Select-TfsItem | Group-Object Path | Select-Object Name | Sort-Object

Finally, in true PowerShell fashion, this will write the paths out to the host (console or ISE), but naturally you could also Export-*, Out-*, etc as well.

Categories: PowerShell, Team System