Thursday, April 24, 2008
Publish-Subscribe with Legacy Applications

Graeme posted a couple of questions regarding my recent post on Web services integration. Specifically, he asked how best to get at events in third-party applications that provide no native mechanism for hooking the events that occur within them.
This is a good question, as it is a common problem to overcome. The fact of the matter is that when we leverage third-party applications as part of our SOA, we are constrained by the limitations of those applications. If an application does not provide a native way of getting at events, we have no choice but to extract them from the application database.
As Graeme points out, this is less than desirable as the database schema of an application is subject to change. The application vendor will likely give you no guarantees about the stability of the database schema, nor warn you when it is about to be updated.
Although this is indeed true, I don't think it is a big enough problem for us to avoid publish-subscribe. Any time we upgrade or replace an application with which we integrate as part of a service, we may have to do work to cater for changes in how we integrate with that application. This problem is not limited to hooking events.
So what techniques can we use to get at events from the database? Well, whatever works, really. One way is to apply database triggers that write into a log table whenever something changes in the database. We can then poll that log table and publish events based on the entries found there.
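As a rough sketch of that approach (using SQLite and Python purely for illustration, with a made-up customer table, customer_event_log table and publish function standing in for the vendor's schema and your messaging infrastructure), it might look something like this:

    import sqlite3

    # Illustrative only: a vendor "customer" table we cannot change the
    # application for, plus a log table and triggers that we add ourselves.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, email TEXT);

        CREATE TABLE customer_event_log (
            log_id      INTEGER PRIMARY KEY AUTOINCREMENT,
            customer_id INTEGER,
            change_type TEXT,
            processed   INTEGER DEFAULT 0
        );

        CREATE TRIGGER customer_inserted AFTER INSERT ON customer
        BEGIN
            INSERT INTO customer_event_log (customer_id, change_type)
            VALUES (NEW.id, 'inserted');
        END;

        CREATE TRIGGER customer_updated AFTER UPDATE ON customer
        BEGIN
            INSERT INTO customer_event_log (customer_id, change_type)
            VALUES (NEW.id, 'updated');
        END;
    """)

    def publish(event):
        # Stand-in for publishing onto your message broker / service bus.
        print("publishing", event)

    def poll_log_table():
        # Publish an event for each unprocessed log entry, then mark it as
        # processed so it is not published a second time.
        rows = conn.execute(
            "SELECT log_id, customer_id, change_type "
            "FROM customer_event_log WHERE processed = 0 ORDER BY log_id").fetchall()
        for log_id, customer_id, change_type in rows:
            publish({"entity": "customer", "id": customer_id, "change": change_type})
            conn.execute("UPDATE customer_event_log SET processed = 1 WHERE log_id = ?",
                         (log_id,))
        conn.commit()

    # Simulate the application writing to its database, then poll.
    conn.execute("INSERT INTO customer (name, email) VALUES ('Acme', 'info@acme.com')")
    conn.execute("UPDATE customer SET email = 'sales@acme.com' WHERE id = 1")
    conn.commit()
    poll_log_table()

In practice the poll would run on a timer, and publish would hand the event to whatever messaging infrastructure you are using.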
Another approach is to add a nullable timestamp column which is updated by a trigger every time a row is inserted or updated. We then poll the table for records with a timestamp after the last time we polled.
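A minimal sketch of the timestamp-column approach, again with invented names (a last_changed column stamped by triggers, and a high-water mark remembered between polls):

    import sqlite3

    # Illustrative only: the vendor's customer table with an extra nullable
    # last_changed column that triggers stamp on every insert and update.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, last_changed TEXT);

        CREATE TRIGGER stamp_on_insert AFTER INSERT ON customer
        BEGIN
            UPDATE customer SET last_changed = datetime('now') WHERE id = NEW.id;
        END;

        -- Restricted to the name column so the trigger does not react to the
        -- stamping update itself.
        CREATE TRIGGER stamp_on_update AFTER UPDATE OF name ON customer
        BEGIN
            UPDATE customer SET last_changed = datetime('now') WHERE id = NEW.id;
        END;
    """)

    last_polled = "1970-01-01 00:00:00"  # high-water mark from the previous poll

    def poll_changed_rows():
        global last_polled
        rows = conn.execute(
            "SELECT id, name, last_changed FROM customer "
            "WHERE last_changed > ? ORDER BY last_changed", (last_polled,)).fetchall()
        for customer_id, name, changed_at in rows:
            print("publishing customer-changed event for", customer_id, name)
            last_polled = max(last_polled, changed_at)

    conn.execute("INSERT INTO customer (name) VALUES ('Acme')")
    conn.execute("UPDATE customer SET name = 'Acme Ltd' WHERE id = 1")
    conn.commit()
    poll_changed_rows()

One thing to watch with this scheme is timestamp granularity: a strict greater-than comparison can miss rows that change in the same instant as the high-water mark, so some implementations deliberately re-read a small overlapping window and de-duplicate.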
Some applications make getting at events from the database quite easy, as they keep some kind of historical data. For instance, a CRM application may keep a record of changes to customer records for audit purposes. In this case, we would just poll the audit table for any new records.
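The polling side of that might look something like the following (the customer_audit table and its columns are invented for the example; the real audit schema will be whatever the CRM vendor provides):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    # Simulate the audit table the CRM application already maintains; in
    # reality we would only ever read from it.
    conn.executescript("""
        CREATE TABLE customer_audit (
            audit_id      INTEGER PRIMARY KEY AUTOINCREMENT,
            customer_id   INTEGER,
            field_changed TEXT,
            new_value     TEXT
        );
        INSERT INTO customer_audit (customer_id, field_changed, new_value)
        VALUES (1, 'email', 'sales@acme.com'), (2, 'phone', '555-0199');
    """)

    last_seen_audit_id = 0  # persist this between runs in a real service

    def publish_customer_changed(customer_id, field_changed, new_value):
        # Stand-in for publishing onto the service bus.
        print("customer", customer_id, "changed:", field_changed, "->", new_value)

    def poll_audit_table():
        global last_seen_audit_id
        rows = conn.execute(
            "SELECT audit_id, customer_id, field_changed, new_value "
            "FROM customer_audit WHERE audit_id > ? ORDER BY audit_id",
            (last_seen_audit_id,)).fetchall()
        for audit_id, customer_id, field_changed, new_value in rows:
            publish_customer_changed(customer_id, field_changed, new_value)
            last_seen_audit_id = audit_id

    # In practice this would run on a timer; here we just poll once.
    poll_audit_table()

The nice thing here is that we add nothing to the vendor's database at all; we only read from a table the application already maintains.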
The effort required to hook these events is, in my opinion, certainly worth it. Events in SOA are highly reusable, so the fruits of your labour can be reused across a number of services. Hooking events also enables a decentralised data architecture, which considerably reduces coupling between services.
2 comments:
Another good post, Bill. I am enjoying reading your blog!
One question though: how does the approach you recommend fit into the decentralised data architecture?
Thanks,
Dirk
Hi Dirk,
With a decentralised data architecture, each service holds locally all the data it needs in order to be able to perform its particular function.
Publish-subscribe messaging is key in enabling a decentralised data architecture as each service keeps its data up to date by subscribing to relevant events from other services.
A service publishes an event, and all subscribed services receive the notification and update the information in their local databases.
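Roughly, the subscriber side might look something like this (the Sales service, customer_copy table and event shape are all made up for the sake of the example; the actual subscription mechanism depends on your broker):

    import sqlite3

    # Local database owned by the subscribing service (say, a Sales service
    # keeping its own copy of customer details).
    local_db = sqlite3.connect(":memory:")
    local_db.execute(
        "CREATE TABLE customer_copy (customer_id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

    def on_customer_changed(event):
        # Handler registered with the message broker for customer-changed events.
        # Upsert the local copy so the service never has to query the publisher.
        local_db.execute(
            "INSERT INTO customer_copy (customer_id, name, email) VALUES (?, ?, ?) "
            "ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name, email = excluded.email",
            (event["id"], event["name"], event["email"]))
        local_db.commit()

    # Simulate receiving an event published by the service fronting the CRM.
    on_customer_changed({"id": 1, "name": "Acme", "email": "sales@acme.com"})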
So when we have legacy third-party applications that do not natively provide a way of hooking events (and thus of supporting publish-subscribe), we must resort to the strategies described in this post.
Otherwise we cannot have a decentralised data architecture.