Katrina and the Waves

That was the name of an ’80s one-hit-wonder band that released “Walking on Sunshine,” yet the waves that Katrina brought to New Orleans earlier this week, and the aftermath of her deadly passage across the Gulf Coast, bring nothing but darkness into the hearts of many.

I’m troubled by how rapidly a civil society can disintegrate, plunging into anarchy and violence. It’s got a strong J.G. Ballard feel and, while he’s certainly had plenty of opportunity to glimpse violence and evil, it’s hard for me to fathom how quickly a supposedly sophisticated way of life can collapse into “Lord of the Flies.” Katrina was a catalyst, and New Orleans now burns. Even if we are able to stop the flames and destruction this time, will there come a time in the near future when the fire ravages our whole society, burning with such intensity that we are unable to stop it until it has consumed us all?

I pray this not be the case and wish I could find comfort in these prayers.

Peace,

David

From Conjunction Junction to Infrastructure Juncture

I’m currently working with the infrastructure group to improve the scalability of our system. Although it’s built on solid Java technology with an architecture that clearly puts 99.9% of comparable systems to shame, clients will always find ways to push the boundaries.

Some clients will say, “Since our end-of-day risk analysis scheduled tasks only take 20 minutes to run thanks to the wonderful performance of your system, can’t you guys just give us real-time portfolio pricing?” Others might simply think it’s perfectly natural that one should never have to go to the database: “Yes, we’d really like to have 250,000 trades in memory at all times, including all associated advice documents, settlements, documents generated, and those received from our counterparties. Why? Is that a problem?”

Frankly, I think it’s totally legit… you’re always going to want to push the envelope. So I’ve recently moved to the infrastructure group, as they need my help migrating our back end and making it adaptable to high-performance distributed caching tools such as GemFire and Coherence. In theory it doesn’t look too complex, since we already have a fairly clean cache API (with an internal implementation). My idea is to upgrade this and adapt the framework a little so that we’re JCache compliant. Once that’s done, we should theoretically be able to plug in any third-party cache and voilà… or not voilà, since the French can’t seem to agree on anything these days other than holding some sort of weekly strike… But anyway, the point is that it shouldn’t be too hard, though I am a bit worried about cache consistency and database transactions. The API doesn’t provide a way to delegate this to the cache implementation, so I’m guessing it’s up to us to deal with it.

Anyway, that’s what I’m up to right now… I’m still doing some infrastructure-level enhancements on the report framework, but most of our Back Office development is migrating back to Europe. That makes sense: settlement typically occurs closer to GMT, so most of the knowledgeable Back Office staff is found in Paris and London.
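To make the “pluggable cache” idea concrete, here’s a minimal sketch of what such a seam could look like. The names (`TradeCache`, `InMemoryTradeCache`) are hypothetical, not our actual API; the point is that the rest of the system codes against a small interface, so a GemFire- or Coherence-backed adapter could be dropped in behind it later.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical internal cache API: the single seam the rest of the
// system codes against, so a third-party cache can be swapped in later.
interface TradeCache<K, V> {
    V get(K key);
    void put(K key, V value);
    V remove(K key);
    int size();
}

// Default in-process implementation; a GemFire- or Coherence-backed
// adapter would implement the same interface.
class InMemoryTradeCache<K, V> implements TradeCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    public V get(K key) { return store.get(key); }
    public void put(K key, V value) { store.put(key, value); }
    public V remove(K key) { return store.remove(key); }
    public int size() { return store.size(); }
}
```

Note that nothing here addresses the consistency worry: keeping the cache in step with database transactions would still have to live above this interface, since the interface itself has no notion of commit or rollback.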

When I’m doing serious refactoring and design work where I don’t interact much with others, I tend to immerse myself in code with the help of music and I’ve been listening to Launchcast a lot lately. I was thinking today: They know my favorite artists, and they know (or should be able to know) new release dates. So how come they don’t give fans “sneak peeks” at upcoming album releases, potentially even making deals with major labels so that I can buy the album 2 weeks before it goes out to the general public? I’d love to start hearing a rotation of Morcheeba songs before the album hits the streets…

Ciao,

David

Ideas emerging from necessity

It’s funny how software comes to life…

I designed a reporting framework for Calypso. It makes it easy to plug into the system and extract data. Among its many features, you can customize how the data is displayed, choose whether to aggregate and on which fields (keeping subtotals, averages, and various other aggregation functions), and easily export the data to HTML, Excel, and PDF. Simply put, you write 3 classes: one that defines your search criteria, a main class that takes those criteria and “queries” the system via the API, and a class that extracts discrete column values from an object. You compile these classes, plug the name of the new report into a database table, and you’re good to go: you get a GUI for the report that lets you sort, aggregate, and basically slice and dice your data every which way you like, and export it to all the aforementioned file formats.

It’s probably very similar to Crystal Reports, and it’s definitely similar to Excel in that we’ve implemented some functionality based on what we found there. The nice thing, though, is that because it’s tailor-made to plug into the API, it provides a nice level of abstraction. I can typically crank out a brand-new vanilla report in half a day. No kidding.
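As an illustration of the three-class pattern, here’s a hypothetical miniature version. All names (`TradeFilter`, `TradeReport`, `TradeColumns`) are invented for the example and bear no relation to the actual Calypso classes; the system query is stood in for by filtering a plain list.

```java
import java.util.*;

class Trade {
    String id; String currency; double notional;
    Trade(String id, String ccy, double notional) {
        this.id = id; this.currency = ccy; this.notional = notional;
    }
}

// 1) The search criteria: what the user asked for.
class TradeFilter {
    String currency;
    TradeFilter(String currency) { this.currency = currency; }
}

// 2) The main class: takes the criteria and "queries" the system.
class TradeReport {
    List<Trade> load(TradeFilter filter, List<Trade> system) {
        List<Trade> out = new ArrayList<>();
        for (Trade t : system)
            if (t.currency.equals(filter.currency)) out.add(t);
        return out;
    }
}

// 3) The column extractor: pulls discrete column values off an object,
// which is all the generic GUI needs to sort, aggregate, and export.
class TradeColumns {
    Object value(Trade t, String column) {
        switch (column) {
            case "Id":       return t.id;
            case "Currency": return t.currency;
            case "Notional": return t.notional;
            default:         return null;
        }
    }
}
```

The generic window only ever talks to these three roles, which is why a new vanilla report is cheap: the sorting, aggregation, and export machinery is shared.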

The report framework is only about a year and a half old, and I pretty much worked on it solo for the first 9 months. Now we have probably 5 to 6 developers using it, making enhancements, refactoring the code, whatnot. It’s been integrated into a variety of GUI windows, and we have over 60 different reports built on top of this framework. What’s really interesting, though, is its reason for existing. Did we brainstorm and decide we needed to offer this functionality, both for internal use and for our clients? (Several clients are loving it, by the way, and are already building upon this framework extensively.)

Nope.

The reporting framework was born out of sheer desperation when I was assigned 15 back office reports to implement from scratch, due for delivery 2 weeks later. Well, actually the delivery was 4 weeks out, but this enhancement request for a very large New York-based bank landed in my mailbox exactly 11 days before I was to head off to Tuscany for a wedding.

“Let me get this straight,” I thought. “They want 15 brand new reports, with a GUI and the ability to run on a scheduled basis?” I did the quick math… “That’s a minimum of 75 new classes I need to write and debug if I am to deliver all these reports and follow the existing design guidelines.” Not completing the reports was not an option: it had been negotiated from above, and I certainly sympathize with a client that’s paid a lot of money for a very good SLA. But we’re talking Tuscany here… Siena, Firenze, and Brunello. Not going to Italy was not an option either!

So I figured my best bet was to build upon the existing interface-less report methodology, refactor the code extensively, and make these new reports “pluggable” into a generic GUI window. Next thing you know, the reporting framework was born.

I’m just thinking back on this as I draft a developer’s guide on the subject. Since that beautiful September day when I began coding it, the reporting framework has grown at mind-blowing speed. It can be updated in real time by plugging into our event server. Search criteria can be exported to XML and shared across multiple users. Clients can extend the framework and build their own custom aggregation functions; we provide sum, average, minimum, and maximum functions out of the box. All kinds of tweaks can be made to the look of the output, as views can be customized with row and column color settings based on whether a given condition is met.
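The custom-aggregation hook could plausibly be as small as the sketch below. The interface and class names are invented for illustration, not the framework’s real API; the idea is simply that the framework feeds each column value into the function and asks for the result at the end, so clients can add their own alongside the built-in sum, average, minimum, and maximum.

```java
// Hypothetical aggregation hook: the framework pushes each cell value
// in, then reads the result for the subtotal row.
interface AggregationFunction {
    void add(double value);
    double result();
}

// Example implementation: a running average, the same shape a
// client-written custom function would take.
class Average implements AggregationFunction {
    private double sum;
    private int count;
    public void add(double v) { sum += v; count++; }
    public double result() { return count == 0 ? 0.0 : sum / count; }
}
```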

It’s a trip to see the organic growth of this framework even with just a handful of developers having access to the source code… Have a great weekend.

Peace.

Writing a language interpreter

Today was a kickass day… I love days like this. I woke up this morning to feedback from one of our consultants in Tokyo. Sumitomo Trust went live on FX yesterday; they processed 240 incoming MT202 SWIFT messages on day one, and the STP process is working well. The messages come in via MQ, and the trades are automatically matched with appropriate payments generated. I felt quite gratified, because I seriously suffered trying to implement this stuff.

I spent the rest of the day implementing iteration in our advice document generation language. Calypso generates advice documents in different formats (SWIFT, HTML, text) to be sent from the system via various gateways (SWIFT, e-mail, printer, fax), and the HTML documents are generated from template files using a rudimentary in-house processing language and keyword substitution. I implemented conditionals a while back, but it finally came time to add iteration.

Talk about a blast to code! I’ve got a command queue, with contexts and variables being pushed on and off the stack (since we can have nested conditionals and, now, nested iterations). It had been a while since I’d revisited that code, and I took advantage of this enhancement to do a lot of refactoring. From the outset (and for historical reasons) I’ve been using JavaCup to generate a parse tree. With the added complexity, however, I’ve moved all of the processing outside of the cryptic cup file, especially now that I’ve got a command queue (since we can only execute the commands within an iteration block after we’ve exited it).

Granted, there’s still plenty of performance optimization to do, and I’ve got to investigate whether or not I can use a cache. One limitation of the current setup is that conditional statements within an iteration block can’t reference the iterator. That bugs me, but only a little, since all that’s really being done is keyword substitution in generated HTML. I’m also pushing the limits of JavaCup, I think… Thankfully, I’ve got the companion book at home, so it looks like that’ll be coming off the shelf tonight and I’ll be burning the midnight oil.
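For flavor, here’s a toy interpreter in the same spirit: keyword substitution plus nested conditional and iteration blocks. The tag syntax and all names are invented, and recursion stands in for the explicit command queue and context stack of the real implementation (each recursive call is, in effect, one stack frame of pushed context); the tokens are assumed to be pre-split rather than parsed.

```java
import java.util.*;

class MiniTemplate {
    // Renders a pre-tokenized template against a context map.
    static String render(List<String> tokens, Map<String, Object> ctx) {
        return renderBlock(tokens, new int[]{0}, ctx);
    }

    // Consumes tokens until a matching {end} (or end of input); recursion
    // handles nesting the way a context stack would.
    private static String renderBlock(List<String> toks, int[] pos, Map<String, Object> ctx) {
        StringBuilder out = new StringBuilder();
        while (pos[0] < toks.size()) {
            String t = toks.get(pos[0]++);
            if (t.equals("{end}")) break;
            if (t.startsWith("{if ")) {
                String var = t.substring(4, t.length() - 1);
                String body = renderBlock(toks, pos, ctx);   // consumes through {end}
                if (Boolean.TRUE.equals(ctx.get(var))) out.append(body);
            } else if (t.startsWith("{each ")) {
                String var = t.substring(6, t.length() - 1);
                int start = pos[0];
                List<?> items = (List<?>) ctx.get(var);
                if (items == null || items.isEmpty()) {
                    renderBlock(toks, pos, ctx);             // just skip past the block
                } else {
                    for (Object item : items) {
                        pos[0] = start;                      // replay the block per item
                        Map<String, Object> inner = new HashMap<>(ctx);
                        inner.put("item", item);             // loop variable in scope
                        out.append(renderBlock(toks, pos, inner));
                    }
                }
            } else if (t.startsWith("{") && t.endsWith("}")) {
                Object v = ctx.get(t.substring(1, t.length() - 1));
                out.append(v == null ? "" : v);              // keyword substitution
            } else {
                out.append(t);                               // literal text
            }
        }
        return out.toString();
    }
}
```

Note that in this sketch the `{if}` body *can* see the loop variable, which hints that the limitation mentioned above is a property of the command-queue design rather than of the language itself.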

Cover-your-ass clauses and gourmet FX traders

Certain long-term maturing products are negotiated with early-termination conditions: at predefined dates, either party can terminate the trade. This is common practice among interbank players to limit credit risk exposure. It’s more or less financial mumbo-jumbo that lets traders add special clauses to cover their asses. “We wanna do this deal but… just in case… let’s add a couple of get-out-of-jail-free cards in here.”

So that’s what I’ve been working on lately… Trying to ensure that our system allows Early Termination and Cash Settlement of Interest Rate and Credit Derivatives. I’d actually implemented a lot of that a while back and just needed to do some enhancements. Typically when I approach such a problem, I try to think about how to encapsulate the new functionality by creating a new interface. It seems pretty evident: if you want to do more or less the same thing across a variety of dissimilar objects, figure out how to abstract out the common functionality and determine what information you need from those objects. Bam! You got yourself an interface.

Hmmm… I’m just blown away by how few people grasp this concept. Funny thing, too, is that this approach is usually harder at the onset but allows you to kick back later on by making fixes or enhancements in one place instead of 50. I just cringe when I see the same code cut-and-pasted across so many classes…. Oh, and people… If you tend to have a bunch of if clauses checking whether an object is an instance of something or other… you’re doing it wrong!
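Here’s the idea in miniature, with invented product names and stubbed-out valuations. Instead of an `if (p instanceof Swap) … else if (p instanceof CreditDefaultSwap) …` chain repeated in fifty places, each product implements the interface and answers the question itself:

```java
// Hypothetical interface abstracting the common "early termination"
// behavior across dissimilar products.
interface EarlyTerminable {
    double cashSettlementAmount();
}

class Swap implements EarlyTerminable {
    public double cashSettlementAmount() { return 1000.0; }   // stub valuation
}

class CreditDefaultSwap implements EarlyTerminable {
    public double cashSettlementAmount() { return 2500.0; }   // stub valuation
}

// The calling code never needs to know the concrete type:
// fixes and enhancements happen in one place, not fifty.
class TerminationEngine {
    double terminate(EarlyTerminable product) {
        return product.cashSettlementAmount();                // no instanceof
    }
}
```

Adding a new terminable product now means writing one class, not hunting down every instanceof chain in the codebase.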

I’ve also been wrapping up Fund FX. That’s been brutal… These Japanese bankers are serious FX gourmets. They requested bulk matching of incoming MT202 messages against payments in the system. And all of that had to be recursive, since it might require netting of payments, which then needed to be split to match against the incoming message amounts, and whatever residual payment remained could later be bulk-matched with other payments. The thing drove me absolutely bonkers, mainly because I didn’t write the payment netting functionality of the system. The guy that did is brilliant, with lots of business knowledge and a good technical foundation, but his code… errrr… shall we say, inherits from structured programming as opposed to OOP. Ever heard of 5,000-line methods? Calypso’s Back Office got ’em! Oh the humanity…
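A drastically simplified sketch of the matching idea: recursively pick a set of outgoing payments that covers the incoming message amount, splitting one payment when needed so the residual can be matched later. This is my own toy reduction of the problem; the real logic also involves netting rules, currencies, value dates, and counterparty constraints, all omitted here.

```java
import java.util.*;

class BulkMatcher {
    // Returns the amounts consumed from 'payments' to cover 'target'.
    // Whole payments are consumed where possible; a larger payment is
    // split, leaving a residual to be bulk-matched later.
    static List<Double> match(List<Double> payments, double target) {
        return match(payments, 0, target, new ArrayList<>());
    }

    private static List<Double> match(List<Double> ps, int i, double target, List<Double> used) {
        if (Math.abs(target) < 0.005) return used;        // target fully covered
        if (i == ps.size()) return null;                  // ran out of payments
        double p = ps.get(i);
        if (p <= target + 0.005) {                        // consume the whole payment
            used.add(p);
            List<Double> ok = match(ps, i + 1, target - p, used);
            if (ok != null) return ok;
            used.remove(used.size() - 1);                 // backtrack
        } else {                                          // split: take part, leave residual
            used.add(target);
            return used;
        }
        return match(ps, i + 1, target, used);            // or skip this payment
    }
}
```

Even in this stripped-down form you can see why it gets hairy: the recursion has to backtrack when a tentative combination doesn’t pan out, and every split creates a residual that re-enters the matching pool.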

Anyway, I’m off to San Diego for a long weekend and I’m looking forward to hanging out by the beach. The San Francisco fog, combined with the constant bombardment of CRT rays has me feeling white and pasty.

Peace,

David

Fund FX and CLS

Here’s a picture: You buy a nice bottle of wine from me for $50. (What can I say? You like wine, and I sell some serious gourmet Brunello.) I’m just a broker, so I’ll have the wine shipped to you and you’ll send the payment directly to my supplier… The catch is, all I know about this supplier is that the guy’s name is “Giovanni’s Vinos,” in Milan, Italy, and that he has a Bank of America account somewhere in New York City. So now I expect you to transfer $50 into Giovanni’s account at BofA based solely on this information. Sorry, dude.

In the world of Foreign Exchange (FX) Trading, it’s called Fund FX, and that’s exactly what I’ve been trying to implement for one of our Japanese clients for the last several months. Yeah, fun stuff.

On the interface, I need to extract settlement information from an incoming MT202 SWIFT message. This message contains antiquated pseudo-cryptic character sequences, clearly conceived by twisted alien creatures with pasty white faces to keep track of where we keep our pieces of green paper. A SWIFT MT202 message is used to order the movement of funds (your $50) to the Bank of America account of ‘Giovanni’s Vinos.’ It would look something like this:

{1:F01YOUUSNYAXXX0000000000}{2:I202DFO1SFNYAXXXN2020}{3:{108:MT202}}{4:
:20:2021
:32A:040723USD50,
:57D:BANK OF AMERICA
NEW YORK
:58D:GIOVANNI'S VINOS
MILANO
-}{5:}

I need to interpret these instructions as soon as they come into the system, match them against transfers, and issue settlement instructions to BofA in New York. So now I’ve got to figure out a way to identify Giovanni’s account at BofA… Granted, in real life the bank name would allow me to uniquely identify it, and all I really need to do is locate its SWIFT code; with the recent introduction of Continuous Linked Settlement (CLS), all this has gotten a lot easier. The thing gets to be a real nightmare, though, if you’re trying to implement an interface to both Fund FX and CLS at the same time. Which came first, the chicken or the egg?
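The first step of that interpretation, pulling the tagged fields out of block 4 of the message, can be sketched roughly as below. This is a deliberately minimal reading of the format shown above: real SWIFT parsing handles all five blocks, field sub-formats, and many more tags, and the class and method names here are invented.

```java
import java.util.*;

class Mt202Fields {
    // Extracts tag -> value pairs from the text of block 4,
    // e.g. ":32A:040723USD50," becomes "32A" -> "040723USD50,".
    // Lines not starting with ':' are continuations of the previous field.
    static Map<String, String> parseBlock4(String block4) {
        Map<String, String> fields = new LinkedHashMap<>();
        String tag = null;
        StringBuilder val = new StringBuilder();
        for (String line : block4.split("\n")) {
            if (line.startsWith(":")) {
                if (tag != null) fields.put(tag, val.toString().trim());
                int second = line.indexOf(':', 1);
                tag = line.substring(1, second);
                val = new StringBuilder(line.substring(second + 1));
            } else if (tag != null) {
                val.append("\n").append(line);   // continuation line (e.g. city)
            }
        }
        if (tag != null) fields.put(tag, val.toString().trim());
        return fields;
    }
}
```

From there, field 32A carries the value date, currency, and amount, and 57D/58D carry the intermediary and beneficiary, which is the “settlement information” the matching step works from.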

What really irks me, though, is that both CLS and Fund FX are brand-spanking-new services. Don’t you think they could at least have opted for a less arcane protocol than SWIFT? I fully understand the ubiquitous presence of SWIFT in the world of foreign exchange settlements. Still, if I were architecting a new system, I’d much rather use XML and SOAP to communicate and let people generate their own SWIFT if they need to. Apparently, though, that’s just me…

Bloomberg, SWIFT, and other dinosaurs

Why are some of the most widely used protocols and data streams on Wall Street so antiquated? I’ve been working recently to interface my employer’s product with Bloomberg security data. FTP? Flat-file format? Errrr… I’m surprised they didn’t ask me to use punch cards to send my request via US Mail.

Now my first thought was that our client (on whose behalf I’m implementing this interface) was too cheap to spring for an upgrade to its Bloomberg license. Still, that doesn’t really make sense when upgrading to real-time quotes and reference-data updates would be a significant win for its traders.

No… I think this is all Bloomberg’s doing. They’ve got antiquated software that’s clearly (as far as I could discern from the API documentation) running batch jobs to query their data feeds and… well, you know… concurrency is, like, really hard. 😉 Then again, it could simply be that, like Microsoft and other pseudo-monopolies out there, Bloomberg knows it’s got the best data on the street and its clients will do just about anything to get to it. Clearly, if indices, quotes, and credit events were drugs, Bloomberg would be the Pablo Escobar of the digital age.