Ok, so I was reading /.’s coverage of Mark Lucovsky jumping ship from MS to Google. Naturally I expected the typical anti-MS spin, and of course it was there, this time in the form of a link to an entry on Mark’s blog about Microsoft not being very good at shipping software. In the entry Mark analyzes the way Microsoft ships its software (specifically he mentions the .NET Framework) vs. the way Amazon ships a fix for its software (their web site). Maybe if Mark expanded a little on what he believes the right answer is, I could understand his angle better, but how you can compare shipping software to a controlled environment vs. shipping to millions of users’ desktops is beyond me. That said, I did want to talk about the .NET Framework part for a minute.
For starters, the .NET Framework is available on Windows Update. It’s just an optional component to install because you don’t need it to use Windows. You could technically ship your application with an installer that detects that the framework is not installed and tells the user to get it from Windows Update, but why make the user go through all that trouble? Just slap the redistributable merge module into your installer and ship it with your application. This is exactly the way DirectX works, for example. Sure, you can grab the latest from Windows Update, but almost all video driver installs and/or game installs ship with the latest version, so that when the user installs their new card or the latest game they can just start playing right out of the box.
So now let’s consider how updates are handled for Microsoft’s biggest application: Office. Office Update is there, and people can check for and grab the latest updates whenever they want. The updates aren’t forced, but why should they be? If people are perfectly happy with Office the way it is, why would you badger your users to download megs of updates just because you refactored your code and improved the speed of a certain operation by 10%?
Ultimately I believe Microsoft is getting better and better at breaking their products into smaller, more manageable pieces. This is eventually going to lead to smaller, faster updates for the OS as well as applications. They are also working very hard on delivering the Smart Client technology with .NET 2.0. Using ClickOnce, the install procedure can literally be the equivalent of clicking on a link and approving/denying some rights the product may require to run. For updates, each time you launch the application, or based on a schedule, it can check if there are updates and ask if the user wants to download them. Longhorn is going to ship with all of this functionality baked right in. As a result, we should see more and more programs, both by Microsoft and third parties, start to use this functionality to perform installation and updating of products.
I, like you, can’t really see the comparison of an MS developer fixing a bug in the .NET Framework to an Amazon developer fixing a web service that has a bug somewhere in its internals. Those are two different animals, and that’s one reason we love web services so much. Now, fixing a bug that requires a new web service interface or method declaration is a bit closer to fixing a .NET bug.
But on the other hand, I can see the argument from the other side. I hate finding a bug in .NET or SQL Server or Word, checking MSDN to see if it’s been fixed, and then finding out that I have to go through MS channels to get the unsupported fix. I know why they do it: stuff has to be properly QAed, lots of other things have to be tested with the fix, yada, yada… but it seems like MS takes it to a bit of an extreme. Take the VS2000 SP that came out on MSDN; um, what about all the bugs that VS2003 has? Can we get a new SP for that?
When you put a button on your product that says “check for updates” and it is totally useless for 18 months, then you’ve just wasted screen space and an MS developer’s time, because it certainly looks like nothing will ever come through that VS2003 update button.
How about giving me the option to install the latest VS2003 not-quite-ready-for-prime-time fixes without having to prove I have the exact problem described in a fix bulletin?
OK I’m done.
One last thing. As for this “FIX” posted by MS on the support web site:
FIX: Web service clients regenerate serialization assemblies every time that the application runs (KB872800).
Any resolution that requires me to copy code from the KB article, and then build a command line tool from that code is not a FIX. That’s more like something you post on CodeProject.
Ok now I’m really done.
Microsoft’s smart client technology in Whidbey is very minimal.
I want a platform that has both client and server functionality to cache data. I want to be able to use the smart client offline with that cached data. I want to synchronize that data when I’m back online. I want the platform to do this for me. I want an occasionally connected client platform.
Microsoft Update is a poor piece of technology. Why doesn’t it send byte-level deltas down the wire when it needs to update a DLL? Instead it sends down a monolithic mshtml.dll. The process would be:
1) Client requests updates.
2) Server notifies client of files which require updating.
3) Client responds with MD5/SHA1 hashes of those files.
4) Server checks dependencies and sends down deltas.
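The four steps above can be sketched in a few lines. This is an illustrative toy, not how Microsoft Update actually works: the file contents, the hash choice, and the prefix-based delta format are all my own stand-ins, and a real implementation would use a proper binary diff (bsdiff, rsync-style rolling hashes, etc.).

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Step 3: the client identifies its local files by content hash."""
    return hashlib.md5(data).hexdigest()

# Steps 1 and 2: client asks for updates; server knows which files changed.
server_files = {"mshtml.dll": b"new contents of the library"}
client_files = {"mshtml.dll": b"old contents of the library"}

stale = [name for name in server_files
         if file_hash(client_files.get(name, b"")) != file_hash(server_files[name])]

def make_delta(old: bytes, new: bytes) -> bytes:
    # Toy byte-level delta: length of the shared prefix, then the new tail.
    i = 0
    while i < min(len(old), len(new)) and old[i] == new[i]:
        i += 1
    return i.to_bytes(4, "big") + new[i:]

def apply_delta(old: bytes, delta: bytes) -> bytes:
    keep = int.from_bytes(delta[:4], "big")
    return old[:keep] + delta[4:]

# Step 4: the server ships only deltas; the client patches its local copies.
for name in stale:
    delta = make_delta(client_files[name], server_files[name])
    client_files[name] = apply_delta(client_files[name], delta)
```

The point is that step 4 only has to send what changed, instead of the whole monolithic DLL.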
RE: Offline caching of data
You can do this even today without anything from Whidbey. The easiest way is simply to use a DataSet as your client-side store. When connected, you download the latest image of the data into the DataSet. Then you can disconnect and make all the changes you want to the local DataSet. You can even close the app and just persist the DataSet as XML, or even using the BinaryFormatter. All this time that you’re making changes, the DataSet is tracking them. Now you reconnect, and when your app says “merge my changes,” you connect back up to the server with a DataAdapter and the DataSet will issue all the UPDATE/DELETE/INSERT commands necessary for whatever work you did while you were offline. The trick then is dealing with optimistic concurrency, but I need not go into all that detail here.
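To make the change-tracking idea concrete, here is a minimal sketch of what the DataSet is doing under the hood, written in plain Python rather than the actual .NET API. The class name and JSON persistence are my own illustrative choices (the real DataSet uses WriteXml or the BinaryFormatter); the row states mirror DataRowState.

```python
import json

class TrackedTable:
    """Toy DataSet-style table: each row remembers its state so offline
    edits can be replayed as INSERT/UPDATE/DELETE on reconnect."""

    def __init__(self, rows=None):
        # Each entry: [state, row]; state is Unchanged/Added/Modified/Deleted.
        self.rows = {rid: ["Unchanged", dict(row)]
                     for rid, row in (rows or {}).items()}

    def add(self, rid, row):
        self.rows[rid] = ["Added", dict(row)]

    def update(self, rid, **fields):
        state, row = self.rows[rid]
        row.update(fields)
        if state == "Unchanged":
            self.rows[rid][0] = "Modified"

    def delete(self, rid):
        if self.rows[rid][0] == "Added":
            del self.rows[rid]   # never reached the server; just drop it
        else:
            self.rows[rid][0] = "Deleted"

    def get_changes(self):
        """Like GetChanges: everything the server hasn't seen yet."""
        return {rid: (s, r) for rid, (s, r) in self.rows.items()
                if s != "Unchanged"}

    def save(self):                      # persist while offline (JSON here,
        return json.dumps(self.rows)     # where the DataSet would use XML)

    @classmethod
    def load(cls, blob):
        t = cls()
        t.rows = {rid: [s, r] for rid, (s, r) in json.loads(blob).items()}
        return t

# Offline session: download an image of the data, edit, persist, reload, merge.
table = TrackedTable({"1": {"name": "widget", "qty": 3}})
table.update("1", qty=5)
table.add("2", {"name": "gadget", "qty": 1})
blob = table.save()                 # app closed while offline
table = TrackedTable.load(blob)     # reopened later
changes = table.get_changes()       # what a DataAdapter would replay
```

On reconnect, a DataAdapter walks exactly this kind of change list and issues the corresponding SQL for each row state.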
Now that approach pretty much assumes you’re working against a database directly. If you were dealing with a web service, you’d obviously have to do a little more work yourself to determine the changes to the data and translate them into the correct calls to the web service in the correct XML format, but that too is possible. For example, if you had a web method to create new entries that took a certain XML schema, you could start by calling Data[Set|Table]::GetChanges, specifying that you wanted only data that was recently added (DataRowState.Added). Next you could serialize the result of that call to an XML document using WriteXml. Finally you could pass that XmlDocument as a parameter to the web method that performed inserts. Obviously I’m leaving out a bunch of synchronization steps, but you get the idea.
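The GetChanges(DataRowState.Added) → WriteXml → web method flow just described can be sketched in plain Python. Everything here is illustrative: the change list shape, the XML schema, and the create_entries function are hypothetical stand-ins for whatever your real web service expects.

```python
import xml.etree.ElementTree as ET

# Changes tracked while offline, tagged with their row state.
changes = {"2": ("Added", {"name": "gadget", "qty": 1}),
           "1": ("Modified", {"name": "widget", "qty": 5})}

# Keep only freshly added rows, like GetChanges(DataRowState.Added).
added = {rid: row for rid, (state, row) in changes.items() if state == "Added"}

# Serialize them to the XML schema the insert method expects (like WriteXml).
root = ET.Element("NewEntries")
for rid, row in added.items():
    entry = ET.SubElement(root, "Entry", id=rid)
    for field, value in row.items():
        ET.SubElement(entry, field).text = str(value)
payload = ET.tostring(root, encoding="unicode")

def create_entries(xml_payload: str) -> int:
    """Stand-in for the real insert web method; returns rows accepted."""
    return len(ET.fromstring(xml_payload))

inserted = create_entries(payload)
```

You’d repeat the same filter-and-serialize step for Modified and Deleted rows against whatever update and delete methods the service exposes.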
In the end, Microsoft providing the DataSet/DataAdapter approach for you is probably about as good as they can do. There’s just no way you can expect them to write something generic enough to communicate with all the other random forms of APIs out there, because there are just too many.
Unsurprisingly, Lucovsky is preaching the gospel according to Adam Bosworth….
Shipping the .NET redistributable is NOT an option for many kinds of applications. Let me give you one real world example: I used to work on Microsoft Office Live Meeting, which ships both a Win32 rich client and a Java-based almost-rich client. The Java client is 1.1 MB compressed, the Win32 client is 1.5MB. We did a lot of work to get the size down on both of these, because our customer base wouldn’t bother to do a huge download and install. We would have liked to do a .NET client, but most enterprise end users didn’t have .NET installed (as of 9 months ago, anyhow), and many of them didn’t even have admin privileges. Bottom line: rich clients only work if the runtime is preinstalled on the majority of your customers’ machines. The magic word is ubiquity. Flash player has it, Java has it, Acrobat Reader has it. .NET doesn’t. I wish that wasn’t true, but it is.
Longhorn will fix this eventually, but it will take a while for it to become the majority OS in the corporate world.