I've been working on a graphical workflow designer for several years, and recently the requirements got a lot larger. I've always been interested in adopting a mature, abstract workflow system, but I'm still not quite sure how workflow tools are meant to work.
I'm especially bothered by the term "workflow engine" - it seems like a glorified word for "execution environment" to me...
For me, workflow can mean any procedural logic that is implemented programmatically. In that sense, isn't any programming language a workflow engine?
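To make the point concrete, here is the kind of plain Java I have in mind (the class and method names are just made up for illustration) - a hard-coded "workflow" for approving an order:

```java
// A toy "workflow" hard-coded in plain Java (illustrative names only).
// The process logic (ordering of steps, branching) lives directly in the code.
public class OrderApproval {

    static String runWorkflow(double amount) {
        // Step 1: validate the input
        if (amount <= 0) {
            return "rejected: invalid amount";
        }
        // Step 2: branch on a business rule
        if (amount > 10000) {
            // Step 3a: escalate large orders for manual approval
            return "escalated for manual approval";
        }
        // Step 3b: auto-approve everything else
        return "approved";
    }

    public static void main(String[] args) {
        System.out.println(runWorkflow(500));    // approved
        System.out.println(runWorkflow(50000));  // escalated for manual approval
    }
}
```

Isn't this already a workflow? What would jBPM add on top of it?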
I really want to learn how to automate workflows using something like jBPM, perhaps by inheriting a FOSS Java Swing GUI like Enhydra, but I haven't been able to understand what tools like jBPM/Enhydra/etc. do that good old object-oriented code doesn't also do. For example, the jBPM website says:
- Pluggable architecture
- Extensible and customizable on every level
- Easy programming model
This reminds me of what your instructor says when you first take a class in Java programming and OOP.
Then, after citing these reasons, they mention a whole bunch of other confusing acronyms, and if you Google those acronyms, you get the exact same list of pros (i.e. extensibility, pluggability, modularity, etc.). It seems like an endless circle.
My final confusion (if you've gotten this far, thanks...):
When working on large projects, I always look to the relational data model as the "end point", i.e., the place that gives immediate relevance to every software component in the system. But these WSDL/BPM/XPDL people seem to reference databases only as secondary services in their frameworks, which makes it difficult to understand what the non-abstract benefits of their workflow-based technologies are.
Will somebody please give us a real-world example of how jBPM or Enhydra or some other XPDL-compliant application speeds deployment, development, etc.?
Originally posted by jay vas: Will somebody please give us a real world example of how ... or some other XPDL compliant application speeds deployment, development, etc. ?
As stated previously, BPEL currently seems to have market momentum. However, that doesn't mean the concept of plug-and-play business processes works as naïve assumptions might suggest. In practice, the following aspects may hinder the realization of that vision:
- To compose and/or orchestrate services, you need services. That means the whole vision of plug-and-play business processes requires a stage of SOA expansion where enough existing services are available to compose new ones. If you don't have the right services, you will have to implement new basic services to realize a solution.
In other words, to see the "speed up" you need to have already invested heavily in developing services that are compliant with your specific BPEL/XPDL engine; otherwise development will be slowed down by having to build the missing services. That would mean you have already decided that the total (and considerable) effort required to enable BPEL/XPDL in your business is worth it, and have proceeded with implementation. The problem is that it is not always clear whether that significant effort is in fact required for any one business: it makes no sense to accept complexity for flexibility that you are never going to use or leverage.
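To sketch what I mean (toy interfaces and names, not any vendor's actual API): the orchestration layer itself is only a few lines, so composition is fast precisely when the underlying services already exist - if CreditCheck below didn't exist yet, "composing" the process would first mean implementing it:

```java
// Toy sketch of service composition (illustrative names, not a real BPEL/XPDL API).
// The "process" is quick to write only because CreditCheck and Billing already exist.
interface CreditCheck { boolean approve(String customer, double amount); }
interface Billing    { String invoice(String customer, double amount); }

public class OrderProcess {
    private final CreditCheck credit;
    private final Billing billing;

    public OrderProcess(CreditCheck credit, Billing billing) {
        this.credit = credit;
        this.billing = billing;
    }

    // The orchestration itself is trivial -- the real investment is the services.
    public String run(String customer, double amount) {
        if (!credit.approve(customer, amount)) {
            return "declined";
        }
        return billing.invoice(customer, amount);
    }

    public static void main(String[] args) {
        // Stub lambdas stand in for already-built services.
        OrderProcess p = new OrderProcess(
            (c, a) -> a < 1000,
            (c, a) -> "invoice for " + c);
        System.out.println(p.run("acme", 500));   // invoice for acme
        System.out.println(p.run("acme", 5000));  // declined
    }
}
```

The sales pitch tends to show only the `run` method, not the cost of building everything it calls.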
Also, many of the current vendor tools ultimately lead to a hub-and-spoke topology. The existence of a hub implies some type of centralization, which is exactly what distributed processing is trying to avoid, since centralization undermines scalability. That explains why Event-Driven Architectures (EDA) have been getting more hype lately. It remains to be seen how this will impact the concept of workflow engines in the long term.