Disruptive technology – xRM (aka MS CRM 5.0) & Windows Azure

•December 23, 2009 • 1 Comment

Keeping an eye out for disruptive technology is one of the key driving forces for folks involved in software development. Most people refer to this as the next “big thing”, and it’s like a holy grail. Take the time to watch this video from MS about where they see the world in 2019 and you’ll get the picture. Hopefully it has cranked up your brain, and ideas will start popping into your head.

I’ll not go quite as far as 2019, but starting in 2010 and for a few years after, I see the combination of xRM and Windows Azure making a significant impact on how line-of-business applications are developed and deployed. There are lots of reasons why this will happen, but I want to distill it down to their respective killer propositions.

So why do I think Windows Azure is a strong contender in the cloud space? Well, assuming it’s just on feature par with its competitors, its trump card is integration, integration, integration: do this and you lower the barrier to entry; do this and you lower TCO. Simple fact: Azure will be huge.

Why do I think xRM is going to be disruptive? Two words: “declarative” and “compatibility”. You can get a hell of a lot done without coding, or by coding only in small blocks. The data model and workflow components are the two best examples, and just look at the PDC 09 announcement of SQL Server Modeling. Software is shifting in this general direction, and using xRM today will start training your mind to architect solutions in this manner.

Now, xRM and Windows Azure are platforms in their own right with a shared ancestry, and this combination (with a sprinkling of the SharePoint platform) will give us our first truly super platform for building applications.

A bright future: StreamInsight & Complex Event Processing (CEP)

•July 23, 2009 • 1 Comment

[Update: Microsoft announced the official name StreamInsight, with a CTP available for download sometime in August. See here for details.]

CEP has been creeping into the blogosphere over the last few years but has not caught on in mainstream computing. I reckon this is about to change. See this link for a quick introduction:

http://www.thecepblog.com/what-is-complex-event-processing/

CEP’s main adopters so far are in finance, energy, risk, fraud detection, etc., but I reckon the really big breakthrough is going to be consumer-driven. There is an explosion of data coming from consumer devices; just think of all that real-time GPS data on an iPhone coupled with either of the following (a rough sketch of the kind of query this enables follows the list):

- the Mint website, which stores a significant proportion of one’s financial health

- one’s Amazon online shopping history
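
To make that concrete, here is a hand-rolled C# sketch of the kind of standing query a CEP engine evaluates continuously: a sliding time window over a stream of events. This illustrates the windowing concept only; it is not the API of StreamInsight or any of the vendors below, and the GpsEvent shape and speed threshold are made up for the example.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative event shape; a real feed would push these in real time.
public class GpsEvent
{
    public DateTime Timestamp { get; set; }
    public double SpeedKmh { get; set; }
}

public static class CepSketch
{
    // CEP-style standing query: over a sliding 5-minute window,
    // raise an alert whenever the average speed exceeds 120 km/h.
    public static IEnumerable<string> HighSpeedAlerts(IEnumerable<GpsEvent> stream)
    {
        var window = new Queue<GpsEvent>();
        foreach (var evt in stream)
        {
            window.Enqueue(evt);

            // Evict events that have fallen out of the 5-minute window.
            while (evt.Timestamp - window.Peek().Timestamp > TimeSpan.FromMinutes(5))
                window.Dequeue();

            double avg = window.Average(e => e.SpeedKmh);
            if (avg > 120.0)
                yield return string.Format("{0}: average speed {1:F0} km/h", evt.Timestamp, avg);
        }
    }
}
```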

A who’s who of vendors includes:

StreamBase

Progress Apama

Truviso

IBM

Aleri/Coral8

Finally, why blog about this now? Well, back in February Microsoft announced a product code-named “Orinoco”:

Introducing Microsoft’s Platform for Complex Event Processing

Involved In Software Development – You Need This!!

•February 17, 2009 • Leave a Comment

A website worth browsing around. I really liked the page below, as it’s definitely a skill you need to master in development:

How to detect bullshit

Workflow as the wildcard? Absolutely…

•January 7, 2009 • Leave a Comment

This is a response to the blog post:

http://mdavey.wordpress.com/2008/12/29/further-azure-thoughts/

So the question was:

“Or go for a “wild card” WF implementation?”

I believe this has a reasonable chance of being the future of a lot of development. We just need to answer the “when”. With WF 4.0 being a complete rewrite, and items like Dublin, Velocity and the Parallel Extensions library in the works, then yes, it’s a stretch, but their unification could actually work and bears investigating.

Just imagine a team building an app DDD-style and exposing public domain services that then get stitched together by a domain expert via a workflow GUI like K2’s, getting them involved in the application-building process. This has to be very compelling for any DDD guy.
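
As a rough illustration of what “stitching domain services together” could look like, here is a minimal sketch against the WF 4.0 preview API (which may well change before release). The CheckCredit activity and its Amount argument are hypothetical stand-ins for a real domain service; a domain expert would compose the equivalent of the Sequence below on a design surface rather than in code.

```csharp
using System;
using System.Activities;            // WF 4.0 preview API
using System.Activities.Statements;

// Hypothetical domain service wrapped as an activity so it can be
// dropped onto a workflow design surface.
public class CheckCredit : CodeActivity
{
    public InArgument<decimal> Amount { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        // In reality this would call into the public domain service.
        Console.WriteLine("Credit checked for {0}", context.GetValue(Amount));
    }
}

class Program
{
    static void Main()
    {
        // Stitch activities together imperatively; a designer would
        // normally produce this composition as XAML instead.
        var workflow = new Sequence
        {
            Activities =
            {
                new CheckCredit { Amount = new InArgument<decimal>(1000m) }
                // ... further domain services chained here
            }
        };

        WorkflowInvoker.Invoke(workflow);
    }
}
```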

In terms of workflow editors, I’d recommend folks look at the one in MS CRM 4.0, which is very interesting and is another step in the right direction for MS. Just look at the image MS has for the Azure platform and guess what: CRM Services (don’t get hung up on the name CRM; MS will end up renaming it to something like xRM, and it will compete with the likes of Salesforce’s force.com offering) sits in as one of the five services.

So workflow is going to be huge, but it will take a long time to get into the high end of financial trading systems (if ever). One place to start, I reckon, is low-volume, high-value transactions…

Death Knell for Relational Databases??

•December 31, 2008 • 3 Comments

Well, the subject line may be a bit premature, but I was investigating the new Windows Azure platform, and when reading about SQL Data Services the first thing that jumped into my head was: could this be a small but major step in the decline of relational databases? For years OO databases have been touted as the grim reaper for relational databases, but none have really broken into the mainstream. Enter SQL Data Services.

As it’s a cloud service, it has great scalability and capacity; it has communication (HTTP) covered with both REST and SOAP; and finally, it’s part of the greater MS ecosystem for development support, so you can use all the favorites such as Visual Studio, C#, etc. This all helps to lower the barrier to entry for MS’s cloud computing push, as developers use familiar tools to develop applications.

So why is it a replacement for relational databases? Well, just a few points:

  • It follows a more abstract model with Authorities (~Server), Containers (~Database) and Entities (~Rows). An Entity defines one instance of data and is stored as a generic name/value property bag (see the sketch after this list).
  • There really is no fixed schema in the containers or the Entities; just put in what you like.
  • We are bound to see ORMs developed to query/update the entities, making them easy for devs to work with, and we should see more maturity in projects such as Prevayler.
  • It has internet scale that easily goes horizontal, which is possible but difficult with regular SQL Server.
  • There are no items such as stored procs, identity columns or referential integrity. This forces people into thinking about their domains, entities, repositories, factories, services, etc., and that whole way of thinking has really matured with the advent of Domain-Driven Design.
  • A distributed cache will really help to make this whole thing work, and Microsoft are working on their own one, called Velocity.
  • It’s not just Microsoft; look at Amazon’s SimpleDB service.
  • Folks should still worry about performance, but can focus more on the business, as massive performance gains are coming with the likes of Intel’s Dunnington: six cores on one chip.
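
To make the property-bag idea concrete, here is a minimal C# sketch of what an Entity could look like on the client side. The shape mirrors the Authority/Container/Entity model described above, but it is an illustration only, not the actual SDS wire format or client library.

```csharp
using System;
using System.Collections.Generic;

// A schema-less entity: an identity plus a generic name/value property bag.
public class Entity
{
    public string Id { get; set; }
    public Dictionary<string, object> Properties { get; set; }
}

class Demo
{
    static void Main()
    {
        // Two entities in the same container with completely different
        // "shapes" - no fixed schema, just put in what you like.
        var customer = new Entity
        {
            Id = "cust-42",
            Properties = new Dictionary<string, object>
            {
                { "Name", "Contoso" },
                { "CreditLimit", 5000m }
            }
        };

        var invoice = new Entity
        {
            Id = "inv-7",
            Properties = new Dictionary<string, object>
            {
                { "Customer", "cust-42" },
                { "DueDate", new DateTime(2009, 1, 31) }
            }
        };

        Console.WriteLine("{0} has {1} properties", customer.Id, customer.Properties.Count);
        Console.WriteLine("{0} has {1} properties", invoice.Id, invoice.Properties.Count);
    }
}
```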

I guess the real question is: can the individual companies, and the industry as a whole, glue all these things together to bring about the demise of the relational database? Well, I reckon there’s a better than 50/50 chance that within the decade it will become evident that relational databases are sliding, and you need look no further for proof than all of Oracle’s acquisitions over the past few years.

Grid Computing = Silverlight + Parallel Extensions, nearly…

•March 3, 2008 • Leave a Comment

Using my trusty Google Reader I came across a couple of interesting projects which I hoped could be combined to solve very real-world problems.

The Projects

Legion is a Grid Computing framework that uses the Silverlight CLR to execute user-definable tasks.

Parallel Extensions

This is a library that allows developers to take advantage of multi-core machines while abstracting away the hard problems of multi-threaded programming. It includes:

-Parallel Language Integrated Query (PLINQ)

This is declarative in nature and provides a SQL-like syntax for your parallel work.
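
A minimal sketch of the declarative style, using the AsParallel() entry point from the CTP (details of the API may shift before release):

```csharp
using System;
using System.Linq;

class PlinqDemo
{
    static void Main()
    {
        var numbers = Enumerable.Range(1, 1000000);

        // Declarative: describe *what* to compute and let PLINQ
        // decide how to spread the work across the cores.
        long sumOfSquares = numbers
            .AsParallel()
            .Select(n => (long)n * n)
            .Sum();

        Console.WriteLine(sumOfSquares);
    }
}
```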

-Task Parallel Library (TPL)

This is imperative in nature and just makes it easier to write managed code that can take advantage of multiple processors.
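
And a matching sketch of the imperative style via Parallel.For (in the CTP the Parallel class lives under System.Threading; the namespace may change by release):

```csharp
using System;
using System.Threading.Tasks; // System.Threading in the CTP

class TplDemo
{
    static void Main()
    {
        var results = new double[1000];

        // Imperative: an ordinary loop body, but the library partitions
        // the iterations across the available cores for you.
        Parallel.For(0, results.Length, i =>
        {
            results[i] = Math.Sqrt(i);
        });

        Console.WriteLine(results[999]);
    }
}
```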

The Problem

There are certain types of problems in the real world that are CPU-intensive; a classic one is determining risk. The Monte Carlo method is one way of tackling this problem, using repeated random sampling to work out the risk.

It is used in industries such as finance to price complex financial derivatives and, within portfolios, to work out the Value at Risk (VaR). In manufacturing it can be used to determine how many product units should be produced. Take a look at this Microsoft Excel Monte Carlo simulation example for an introductory primer on what it’s all about.

Monte Carlo methods are computationally intensive but naturally parallel and so using grid computing to solve the problem is a good fit.
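
As a toy stand-in for a pricing or VaR run, here is a Monte Carlo estimate of pi written against the PLINQ surface shown above; each partition is the kind of independent work unit a grid would farm out to Silverlight clients rather than local cores. The worker count and sample sizes are arbitrary for the example.

```csharp
using System;
using System.Linq;

class MonteCarloPi
{
    static void Main()
    {
        int workers = Environment.ProcessorCount;
        const int samplesPerWorker = 1000000;

        // Each worker gets its own Random instance, since
        // System.Random is not safe to share across threads.
        long inCircle = ParallelEnumerable.Range(0, workers)
            .Select(seed =>
            {
                var rng = new Random(seed);
                long hits = 0;
                for (int i = 0; i < samplesPerWorker; i++)
                {
                    double x = rng.NextDouble(), y = rng.NextDouble();
                    if (x * x + y * y <= 1.0) hits++;
                }
                return hits;
            })
            .Sum();

        double pi = 4.0 * inCircle / ((long)workers * samplesPerWorker);
        Console.WriteLine("pi ~= {0}", pi);
    }
}
```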

A Potential Solution

There seems to be a good fit between the Legion project and Parallel Extensions to solve this class of problems, but unfortunately the releases of Silverlight 2.0 and Parallel Extensions are just too far apart for now (see here). I hope these will be supported together in the future, opening up a whole new area for the average developer. For now I’d recommend keeping an eye on the Silverlight 2.0 release at MIX 08 this week to see what threading support will be provided.

Multi-Tenancy & the Singleton Design Pattern

•January 27, 2008 • Leave a Comment

Since before Xmas we have been working on upgrading one of our core products, which is built on the MS CRM platform (yes, platform… a post about that topic is in the works).

It’s the first time I have worked on a multi-tenant system, and one of the first things you run into is not being able to use the standard singleton design pattern in a lot of common places. We use it for configuration information about the system, but now that you can have multiple organizations (tenants) effectively sharing the same DLL loaded into memory, you have to think differently.

We took a very simple approach and just changed the existing singleton to hold a new private member storing a generic collection of configuration objects, which can be accessed by passing the organization (tenant) as a parameter.
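
Here is a minimal sketch of that approach; the Configuration class, the organization-name key and the loading logic are placeholders for whatever your system actually uses:

```csharp
using System.Collections.Generic;

// One singleton for the whole DLL, but configuration is looked up
// per organization (tenant) instead of being a single global object.
public sealed class ConfigurationStore
{
    private static readonly ConfigurationStore instance = new ConfigurationStore();

    private readonly Dictionary<string, Configuration> configs =
        new Dictionary<string, Configuration>();
    private readonly object sync = new object();

    private ConfigurationStore() { }

    public static ConfigurationStore Instance
    {
        get { return instance; }
    }

    public Configuration GetConfiguration(string organizationName)
    {
        lock (sync) // callers may arrive on many threads
        {
            Configuration config;
            if (!configs.TryGetValue(organizationName, out config))
            {
                config = LoadConfiguration(organizationName);
                configs[organizationName] = config;
            }
            return config;
        }
    }

    private static Configuration LoadConfiguration(string organizationName)
    {
        // In reality: read the tenant's settings from its own database.
        return new Configuration();
    }
}

// Placeholder for the per-tenant settings object.
public class Configuration { }
```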

Anyone know of any better ways to accomplish this?