Thursday, March 06, 2014

Understanding ISIM Reconciliations

Most ITIM/ISIM gurus will understand what goes on during service reconciliation. Let's be honest: most gurus have had to write countless adapters and debug countless problems when they arise.

What happens, though, if you have one ISIM environment that can reconcile a service successfully but a second ISIM environment which cannot reconcile the exact same service? And let us, for a moment, assume that both ISIM environments are configured identically.

Of course, if something works in one environment and doesn't work in another, there must be a difference somewhere. But the ISIM version is the same, the service profile is the same, the service definition is the same, and it's talking to the same TDI instance, which performs the grunt work of retrieving all the data from the target service. On the face of it, there is no reason for one environment to behave any differently than the other.

Yet it can happen! In fact, I recently saw an ISIM environment get itself into a hung state trying to reconcile a service even though all the reconcilable objects had made their way into ISIM.

When ISIM reconciles a service, the result is that supporting data objects are stored in LDAP under the service container (erglobalid=123,ou=services,erglobalid=00000000000000000000,ou=org,dc=com) and, ultimately, accounts are stored under the ou=accounts and ou=orphans containers. I say "ultimately" for a reason. The accounts are actually stored temporarily under the service container too before being moved to a more appropriate container.

And therein lay the difference between my two environments. The working environment had no accounts stored under the service container prior to executing the reconciliation. Somehow, the non-working environment did have some accounts left hanging about under the service container.

A ruthless LDAPDELETE on all objects subordinate to the service was all that was required for the reconciliation to complete successfully.
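
For the record, the clean-up itself can be as simple as the following sketch, here using the OpenLDAP command line tools - the host, credentials and service DN are placeholders for your own values, and the one-level scope deliberately deletes the children of the service container rather than the container itself:

# List the DNs directly beneath the service container ("1.1" asks for no attributes),
# then feed them straight to ldapdelete. Back up (or export to LDIF) first!
ldapsearch -LLL -H ldap://ldaphost:389 -D "cn=root" -w secret \
  -b "erglobalid=123,ou=services,erglobalid=00000000000000000000,ou=org,dc=com" \
  -s one "(objectclass=*)" 1.1 | sed -n 's/^dn: //p' | \
  ldapdelete -H ldap://ldaphost:389 -D "cn=root" -w secret

One caveat: if any child entry has children of its own, delete depth-first (or simply repeat the pass until nothing remains).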

Next time you have a misbehaving reconciliation process, why not have a look under your service container to see what previous reconciliations have left behind. You might be surprised at what you find.

Thursday, September 26, 2013

ITIM Best Practices For Workflow Design

It sounds rather arrogant to call this blog entry "Best Practices" when it is merely my meandering thoughts that I'm about to spew forth. But a number of recent experiences with other people's work have almost moved me to tears.

So here is my list of "best practices".

Workflow Start Nodes

The Start Node (and the End Node, for that matter) is not somewhere that code should be placed. Code can be placed in these nodes, of course, but it never should be. For example, how could I know that the following workflow has been customised:



If you have code that needs to be run as soon as the workflow starts, place a Script Node immediately after the Start Node and name it appropriately. This is MUCH better:



Script Node Size

If I open a Script Node and find that there are 1,000 lines of code in there, I will cry. If you have 1,000 lines of code, you are probably doing something wrong. I would much rather see multiple Script Nodes with sensible names describing their function, laid out in an organised manner that gives me a good visual representation of the process, than have to wade through hundreds or thousands of lines of code.

Workflow Extensions

If you have written some JavaScript that looks cool and funky and re-usable, then make it cool and funky and re-usable by creating a Workflow Extension. Also, feel free to give extensions back to the community! (I shall publish some of my favourites soon!)
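
As a taster, here is the sort of thing I mean - a hypothetical helper (the function name and fallback value are mine, not any product API) that crops up in almost every workflow and is a prime candidate for promotion to an extension:

// A hypothetical re-usable helper: fetch a single-valued attribute from a
// DirectoryObject, falling back to a default if the attribute is absent.
function getSingleValue(dirObj, attrName, defaultValue) {
    var values = dirObj.getProperty(attrName);  // getProperty returns an array of values
    return (values != null && values.length > 0) ? values[0] : defaultValue;
}

var owner = person.get();  // 'person' being relevant data in a provisioning workflow
var mail = getSingleValue(owner, "mail", "no-mail-address-recorded");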

Hardcoding!

If I see a hardcoded erglobalid reference in a Script Node, I will, in all probability, hunt the developer down and do very un-Tivolian things to him or her. The assumption that such code will work when promoted through various environments is deeply flawed, and it is lazy. Do Not Hardcode!
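
To be clear about the alternative, here is a sketch of resolving an object by name at runtime instead of baking in a DN - assuming the RoleSearch JavaScript extension is available to your activity, and with a purely illustrative role name:

// Don't do this - it is doomed the moment the code is promoted:
//   var roleDN = "erglobalid=1234567890123456789,ou=roles,erglobalid=...,ou=org,dc=com";
// Do something like this instead:
var roleSearch = new RoleSearch();
var roles = roleSearch.searchByName("Payroll Administrators");
if (roles.length == 1) {
    var roleDN = roles[0].dn;  // resolved at runtime, whatever the environment
} else {
    activity.setResult(activity.FAILED, "Expected one role named 'Payroll Administrators', found " + roles.length);
}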

Commenting & Logging

You might think your code is readable, but the chances are that it could still do with some comments here and there. Even if the code is understandable, the reasoning behind performing the function may not be so clear - and requirements documents and design documents have a habit of disappearing!

When it comes to logging, log appropriately and update the status of the workflow throughout the workflow process. The following statements are great to see in Script Nodes because they let us handle any problems gracefully and ensure that we always navigate our way to the End Node.

activity.setResult(activity.SUCCESS, "Attributes Successfully Verified");
activity.setResult(activity.FAILED, "Attribute Verification Failed due to " + errMsg);
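
Putting it all together, this is the shape I like a Script Node to take - a sketch only, in which verifyAttributes() is a hypothetical function defined earlier in the node:

try {
    verifyAttributes();  // hypothetical worker function - your logic goes here
    activity.setResult(activity.SUCCESS, "Attributes Successfully Verified");
} catch (errMsg) {
    // Enrole.log writes to the ISIM trace log under the given category
    Enrole.log("workflow", "Attribute verification failed: " + errMsg);
    activity.setResult(activity.FAILED, "Attribute Verification Failed due to " + errMsg);
}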

And Finally

The chances are that someone will come along after you have done all your hard work and they will look at your workflow. Do you want them to cry? Or do you want them to be delighted in what they have seen? Make sure you delight people!

Friday, June 07, 2013

Careful Where You Point That Finger

Over the last 9 months, I have managed to shed almost 50lbs in weight - that's in excess of 20kg for those living in the metric world. As part of the process of losing weight, I took to tracking my walking and cycling habits using Endomondo on my phone - and what an excellent piece of software that is.

I did have a problem, though!

When I started the application and enabled GPS, I would wait quite a considerable amount of time before my location could be determined. Indeed, I had completed the bulk of some of my journeys by the time I got the "GPS Excellent" notification.

Initially, my thoughts went along these lines:

"Endomondo have coded their freebie application to not pick up my location in the hope that I will purchase the Pro version of their app"

I did purchase the Pro version - mostly because I thought the app was terrific with only a slight nod towards the hope that my GPS issues would be resolved. My GPS issues weren't resolved.

So I then thought:

"I must have dropped my phone at some point and 'disturbed' its ability to perform it's GPS magic"

This seemed like a reasonable thought as all my friends and colleagues had perfectly acceptable Endomondo experiences on their phones. Indeed, my Samsung Galaxy II seemed to be really struggling when compared to a colleague's Nexus - and they were both Android phones.

I figured it must be time for a new phone. And proceeded to spend the next couple of weeks evaluating my options. And then... I woke up one morning to discover that my phone wanted to apply an update to my Android operating system.

An hour or so later, I had a shiny new OS. Admittedly, performance was awful initially as all my apps needed updating too. And all my icons had been re-sorted alphabetically (for which I could quite happily exact some kind of torturous revenge on the developer of the upgrade process). Apart from the initial dodgy performance and the tears that followed the pain of re-organising my icons, I noticed two things:
  • Playing 7x7 was super-fast - 7x7 is one of those simple yet addictive games that can keep me going for hours
  • GPS performance was perfect!

Wonderful! When starting Endomondo Pro now, I have to wait no longer than 2 or 3 seconds before I have my "GPS Excellent" notification. And to think I had blamed the app and the phone!

So. What's the point of this story, I hear you ask? Well, it is far too easy to make a judgement about an app, or a product, or almost anything you might encounter during the course of your life. My problems had nothing to do with Endomondo's developers' coding ability, nor anything to do with how Samsung's hardware technicians had put my phone together. Yet, even for a techie like me, my thought processes led me to think bad thoughts about both!

I have seen senior managers in enterprises make some very strange decisions in the past. They may decide that they no longer need "Huge Software Developer's Amazing IT Solution" because it costs too much and doesn't perform the way they expected, and instead buy "Enormous Software Developer's Stupendous IT Solution" in the anticipation that it will cost less and perform perfectly.

And frequently, I don't see the cost benefit and the performance problems still exist.

Sometimes the problems we are faced with in the IT world are not being caused by the applications we are using. Sometimes we need to dig a little deeper to find out what the real cause of a problem is - whether it be the hardware platform we are using; the operating system in use; the networking setup; the interfaces; and of course, the users and their expectations.

And finally... maybe some of the big boys can learn from the developers of the apps we now use on a daily basis on our smartphones. Endomondo does exactly what I need in a manner which makes sense for me. Can we say the same thing for enterprise software?

Tuesday, April 16, 2013

Tempus Fugit

I remember being a follower of a blogger who wrote about IBM Tivoli security solutions and becoming quite concerned for his well-being when he "stopped" blogging for a while. When he had gone fully six months without blogging, I had myself convinced that something terrible had happened to him personally.

And now I find that I have done the same thing. My periodical briefings have stopped. But fear not - it's not because of any ill health. It's not because of a change in career. It's not even to do with boredom. It has everything to do with being far too busy and that's the best reason of all for the temporary blip in my output.

But, time flies... and too much time has passed since I last committed my thoughts to writing.

So what has happened in the last six months? Well, the IBM Tivoli re-branding exercise is in full swing with IBM Security Identity Manager and IBM Security Access Manager products having been released.

And what has changed in those products?

ISAM has a brand new deployment process which is greatly simplified. However, TAMeb users who have deployed their software on Windows, beware! The upgrade process might not behave as you expect. Why? The move to a 64-bit architecture, that's why! Think seriously before attempting an in-situ upgrade!

You might like to also check compression settings on your WebSEALs - a respected colleague of mine has already encountered some fun and games with those!

And ISIM? Is it just a pure re-branding exercise? Not at all. Some functional additions are definitely welcome, like the controls added to services offering retry operations on failed resources. Account ownership and role attributes look interesting (despite how they have been implemented). Privileged Identity Management is a great addition, as is the inclusion of a supported web services API.

But core processing that ITIM administrators will know and love is still there!

And what of my work recently? Well, it's a matter of spending a lot of time concentrating on federated security and environmental upgrades. Working on pre-sales; sprucing up designs; sizing projects; and helping those around me get the best use out of their IBM Tivoli/Security solutions.

So much has happened in recent months, though, that I hardly know where to start in documenting it all. There has been fun with SAML v2. Flirtations with FESI extension re-writes. Dalliances with web services and APIs. Encounters with business process re-engineering.

My next article, however, will likely be an IBM Tivoli Directory Integrator article centred on best practice for collaborative development. That sounds like a tricky one!

Wednesday, October 10, 2012

Internet Explorer Is A Very Naughty Boy

It has been three months since I last felt the urge to post anything on my blog. There isn't any particular reason for the hiatus, with the possible exception that the sun was shining (sometimes) and I was out and about rather than being tied to my desk.

But the days are shorter than the nights now. There is definitely a nip in the air. Being "out and about" isn't quite as enjoyable as it was just a few weeks ago.

And so here I am... ready and willing to commit some more of my findings to this blog.

So what shall I tell you? Well, what about the fact that Internet Explorer is a very naughty boy! Hardly a startling revelation. Those of us working in the web world already understand how often we come across situations whereby Firefox, Chrome, Safari and Opera can all render a page correctly, but Internet Explorer fails to do so! It gets rather tedious after a while, right?

This week, I had the joys of diagnosing why a page protected by WebSEAL wouldn't render in Internet Explorer. Capturing the HTTP Headers whizzing back and forth in Internet Explorer and Firefox provided the answer quite quickly: Internet Explorer would sometimes not bother to send the session cookie back to WebSEAL.

Why would it "sometimes" just not bother to do this? Well, there is some well documented evidence that Internet Explorer (up to version 8) treats cookies in a rather unexpected fashion. Internet Explorer can start dropping in-memory cookies as it has a finite limit on the number of in-memory cookies it can handle!

Those clever people in the development labs of IBM, however, have come across this before and the problem can be alleviated by setting the resend-webseal-cookies parameter to yes in the WebSEAL configuration file. This ensures that the session cookie is re-sent with every response!
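
For reference, the change looks something like this in webseald.conf - from memory the parameter lives in the [session] stanza, but do check your own configuration file (and remember that WebSEAL needs a restart to pick the change up):

[session]
resend-webseal-cookies = yes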

For many of you, you will have come across this quirk before. Many times, potentially. For those just getting started out with your WebSEAL deployment, though, make sure you have the ability to take a grab of the HTTP Headers from within your browser. It's amazing what you can see inside them!


I promise to blog more... now that winter is almost upon us!

Friday, June 29, 2012

TDI and MQTT to RSMB

That's far too many acronyms, really. What do they mean? Well, readers of this blog will understand that TDI has got nothing to do with diesel engines but is, in fact, Tivoli Directory Integrator.



MQTT? MQ Telemetry Transport - "a machine-to-machine (M2M)/"Internet of Things" connectivity protocol".

RSMB? Really Small Message Broker - "a very small messaging server that uses the lightweight MQTT publish/subscribe protocol to distribute messages between applications".

So what do I want to do with this stuff? Well, you will now know that I got myself a Raspberry Pi and I was scratching around thinking of things I'd like my Pi to do. I came across an excellent video showing how Andy Stanford-Clark is utilising his Pi to monitor and control devices around his home - it is definitely worth a look.

I have no intention (yet) of trying to copy Andy's achievements as I'm quite sure I don't have the spare hours in the day! However, I was intrigued to see if I could use my favourite tool (TDI) to ping messages to RSMB using MQTT.

Step 1 - Download RSMB
https://www14.software.ibm.com/webapp/iwm/web/preLogin.do?source=AW-0U9

Step 2 - Startup RSMB
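
For those following along at home, RSMB ships as a single small executable - if memory serves it is simply called broker, and it will optionally take a configuration file as its argument:

# the configuration file is optional; with no arguments the broker listens on the default MQTT port, 1883
./broker broker.cfg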



Step 3 - Fire up a Subscriber listening to the TDI topic
 


Step 4 - Write an Assembly Line to use the MQTT Publisher Connector
 


Step 5 - Populate the output map of my connector and run the Assembly Line.
The result will be a message published to RSMB which I can see in my subscriber utility:

I can also see the RSMB log shows the connections to the server:

Of course, TDI doesn't have an MQTT Publisher Connector - I had to write one. The good news is that this was possibly the simplest connector of all time to write. That said, it is extraordinarily basic and is missing a myriad of features. For example, it does not support authentication to RSMB. Its error handling is what I can only describe as flaky. It is a publisher only - I haven't provided subscriber functions within the connector. But it shows how TDI could be used to ping very simple lightweight messages to a message broker using MQTT.

So what? Sounds like an intellectual exercise, right? Well, maybe. But MQTT is a great way of pushing information to mobile devices (as demonstrated by Dale Lane) so what I have is a means of publishing information from my running assembly lines to multiple mobile devices in real-time - potentially.

At this point, though, it is worth pointing out that the development of a connector is complete overkill for this exercise (though it does look pretty).

Dropping the wmqtt.jar file that can be found in the IA92 package into {TDI_HOME}/jars/3rdparty will allow you to publish to RSMB using the following few lines of TDI scripting:

// Let's create an MQTT client instance
var mqttpersistence = null;
var mqttclient = Packages.com.ibm.mqtt.MqttClient.createMqttClient("tcp://rsmb-server:1883", mqttpersistence);

// Let's connect to the RSMB server and provide a ClientID
var mqttclientid = "tdi-server";
var mqttcleanstart = true;
var mqttkeepalive = 0;
mqttclient.connect(mqttclientid, mqttcleanstart, mqttkeepalive);

// Let's publish
var mqtttopic = "TDI";
var mqttmessage = "This is a sample message!";
var mqttqos = 0;
var mqttretained = false;
mqttclient.publish(mqtttopic, mqttmessage.getBytes("ISO-8859-1"), mqttqos, mqttretained);

// Let's disconnect cleanly now that the message has been sent
mqttclient.disconnect();

Not very complicated. In fact, very simple indeed! (The connector I developed doesn't get very much more complicated than this!)

Wednesday, June 13, 2012

Raspberry Pi, XBMC and Android

Well this is completely off-topic for me... Pi, XBMC and Android really doesn't fit with the myriad of previous posts on Identity & Access Management and IBM Tivoli security software. So for those of you who are only interested in the latter, you might as well stop reading now.

For those of you interested in how you can build your own media centre for just a few pennies, read on.

The Raspberry Pi is a credit-card-sized computer that plugs into your television. Having grown up with a ZX Spectrum in my younger days, I was keen to recreate the feelings I had as a teenager fiddling with Sir Clive's version of BASIC.

The Pi has an ethernet port, an HDMI port, two USB ports, an SD card reader, a power connector and additional audio/video outputs, as well as a GPIO header for adding your own hardware attachments.

For my project, I just needed an ethernet cable, an HDMI cable, some form of power and an SD card to host my operating system - Debian 6.

Following the instructions provided at raspberrypi.org, I was able to flash my debian6-19-04-2012 image to my 8GB SD card. I then resized the partitions using a GParted LiveCD, because attempting to deploy anything onto the image as provided is an exercise in futility - it is TINY!
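
For the curious, the flashing itself boils down to something like the following on a Linux box - a sketch in which /dev/sdX is an assumption; triple-check the device name with fdisk -l first, as dd will cheerfully overwrite the wrong disk:

# /dev/sdX is a placeholder for YOUR SD card device - unmount it before writing
sudo dd if=debian6-19-04-2012.img of=/dev/sdX bs=4M
sudo sync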

I slotted the SD card into my Pi, attached the HDMI and ethernet cables and then attached my Amazon Kindle power cable and watched the lights come on and the boot sequence start.


Unfortunately, the image provided doesn't have SSH enabled by default, which meant hooking up an old keyboard to the Pi and sorting that out - like so:

sudo mv /boot/boot_enable_ssh.rc /boot/boot.rc
sudo reboot


The keyboard was disconnected and it was time for some PuTTY action back on my home desktop.

Next up, I decided to install XBMC and thankfully this is a well-trodden path. In fact, detailed instructions can be found online and it is worth following this thread on the Raspberry Pi Forum.


The following instructions are slightly more detailed and describe the approach I took. NOTE: I have placed the actual files I downloaded on to my own website so I can recreate this procedure should the images "disappear" from the original sources.


First step, install XBMC:

cd /home/pi
wget http://www.stephen-swann.co.uk/downloads/xbmc-bcm.tar.gz
gunzip xbmc-bcm.tar.gz
tar -xvf xbmc-bcm.tar
sudo mv xbmc-bcm /opt
rm xbmc-bcm.tar

Next up, update the Raspberry Pi firmware:

cd /home/pi
wget {host}/raspberrypi-firmware-0c3566c.zip
unzip raspberrypi-firmware-0c3566c.zip
rm raspberrypi-firmware-0c3566c.zip

cd raspberrypi-firmware-0c3566c
sudo cp boot/* /boot
sudo cp -R opt/vc/* /opt/vc/
sudo cp -R lib/modules/3.1.9+/* /lib/modules/3.1.9+/
sudo reboot


So far, so good. Now to install some dependencies which takes a not inconsiderable amount of time:

sudo apt-get -y install autoconf libboost-dev libass-dev libmpeg2-4-dev libmad0-dev libjpeg-dev libsamplerate0-dev libogg-dev libvorbis-dev libmodplug-dev libcurl4-gnutls-dev libflac-dev libmysqlclient-dev libbz2-dev libtiff4-dev libssl-dev libssh-dev libsmbclient-dev libyajl-dev libfribidi-dev libsqlite3-dev libpng12-dev libpcre3-dev libpcrecpp0 libcdio-dev libiso9660-dev libfreetype6-dev libjasper-dev libmicrohttpd-dev python-dev python-sqlite libplist-dev libavahi-client-dev

Next? Install the Raspberry Tools:

cd /home/pi
wget {host}/raspberrypi-tools-772201f.zip
unzip raspberrypi-tools-772201f.zip
rm raspberrypi-tools-772201f.zip

cd raspberrypi-tools-772201f
sudo cp -R arm-bcm2708/linux-x86/arm-bcm2708-linux-gnueabi/sys-root/lib/libstdc++.so.6.0.14 /usr/lib
sudo ldconfig


At this stage, we are almost finished. I wanted to make sure that everything was bang up-to-date though and threw in some reboots, just to be sure:

sudo reboot
sudo apt-get update
sudo apt-get upgrade
sudo reboot

And finally, it was time to run XBMC:

sudo LD_LIBRARY_PATH=/opt/xbmc-bcm/xbmc-bin/lib /opt/xbmc-bcm/xbmc-bin/lib/xbmc/xbmc.bin

Configuring the XBMC settings through the XBMC interface required my old keyboard to be hooked up once again, unfortunately, but this was a "one-off". I enabled the XBMC webserver and added my ReadyNAS Duo as a UPNP source for video, music and pictures.


I then downloaded the XBMC Remote software for Android to my phone, punched in the webserver details for my new XBMC installation, and hey presto - I was able to control the XBMC from my phone and enjoy watching a movie on a television that is already DLNA capable :-)

Now, I just need to move the Pi to the bedroom where there is an old television that couldn't even spell DLNA if it had the power of speech.

Cool? I think so...