Archive for April, 2010
I’ve been reading Nicholas Carr’s latest book The Big Switch. I enjoy his penchant for business history and find that he makes some very interesting arguments about where technology is heading. So far The Big Switch reads as an expansion of his earlier work IT Doesn’t Matter, which made him a notorious killjoy in the IT community. His central thesis is that, over time, information technology becomes commoditized, and any competitive advantage it confers on individual firms is eventually eroded to insignificance. He cites the history of electricity as an example. Companies that first harnessed electricity had to produce it themselves – if you had it, you held a significant advantage over your competitors. Eventually electricity was mass produced, economies of scale drove its price down, and it became cheap and readily available to all. He argues that the internet is doing the same thing to computing power now. At first, only firms that could build their own computer hardware, develop their own software, or maintain their own networks had a major advantage; with the internet, this is no longer the case.
Carr’s analogy has a point, and the proliferation of cloud computing is an example of his idea in action. But readily available computing power, like cheap electricity, may facilitate business innovation; it by no means ensures it. A business is more than the sum of its parts. Just because I have cheap and plentiful electricity doesn’t mean I can build a Toyota factory, any more than having cheap and plentiful computing power means I can build an Amazon website. You still need suppliers, distributors, financiers, workers and the like, and you need to know how they all fit together.
Thomas Redman, in his book Data Driven, describes data as the one company asset that no one else can replicate (short of actually stealing it). Your competitors can buy the same computer systems or services, hire away your employees, buy from your suppliers, and sell to your customers, but nowhere can they fully replicate your distinct business intelligence. He describes metadata as data about how your various business assets fit and work together, and argues that this is what defines your enterprise. In this sense, data and information technology are more than raw computing horsepower. They are about the whole business, and as long as this is so, they will always be part art and part science.
There are politics in every organization, but one thing I love about consulting is that you are always one step removed from them. As an expert outsider, you are often above the fray. When you give advice, it is usually seen as neutral and professional – after all, that is what they are paying you for. But some clients are more difficult than others, and steering clear of politics with them can be hard, if not impossible.
Over my many years of consulting, I have found that most consultant/client relationships fall into three broad categories. Whenever possible, it is best to position yourself in the top category:
Consultant as Guru
You are seen as a recognized expert and treated well. You are allowed to work autonomously. Your client appreciates and respects you, and good work is lauded. Personally, I thrive in this kind of environment. I want to deliver success to these clients, and I will go the extra mile for clients that respect me. It is a win-win relationship.
Consultant as JAE (Just Another Employee)
You are seen as one of the crowd. Depending on the managerial staff, you may be micromanaged. Other employees may resent you, and any “favouritism” shown to you (a larger workspace, better equipment, a window, etc.) will become a grievance to the others. The only way out of this trap is to prove your worth by excelling beyond their expectations. If you do, you may graduate to Guru status.
Consultant as the Enemy
For any number of possible reasons, your client is not happy and is directing its collective wrath at you. Marketing may have over-promised, leaving the client’s expectations sky-high. Negative past experience with other consultants may have made them prejudiced against you. Maybe the project is being shoved down their throats by upper management. Maybe they are unhappy that the software package they just purchased also requires tens of thousands of dollars in development time. In any case, they are quietly or openly hostile. You are resented as a “huge” expense. Sometimes the client even wants the project to fail for political reasons; sometimes they want someone to blame. Pulling this one out of the fire will be very hard, if not impossible. This is the worst-case scenario. These will be the hardest clients: they will demand the most and thank you the least. Avoid these situations whenever possible.
You may find that any one client is a mix of these, depending on the circumstances. You may even find that individuals within an organization are scattered all over this scale. To position yourself well, manage expectations as early as you can and strive to exceed them. Prove your worth as you are able. But while you do, bear in mind the adage “there is no pleasing some people.” Sometimes bad clients are not worth the trouble.
Have you ever had trouble setting the current period in a PowerPlay Transformer model even though everything looked like it was set up properly? In a multi-query data model, check all your data sources to be sure that only one of them is marked to “set current period” (right-click your data source and click on the General tab; a checkbox shows whether or not that data source sets the current period). If more than one data source is marked to set the current period, your model may not be able to resolve the current period and you could get unexpected results.
A quick way to check your current period availability is to right-click on the top level of your time dimension, and then click on the Time tab. Your current period should show at the bottom of that screen. If your current period is unresolved, the current period box will be blank. Please note: you will have to generate categories or build the cube at least once for the current period to be known in your model.
We all face the question of when to upgrade our computer systems. Is it a savvy business investment or a frivolous waste of money? This is a very real issue currently facing the government of Canada. The Auditor General of Canada released a report yesterday indicating that the Canadian government needs to spend a sizable amount of money to upgrade its aging information technology infrastructure, perhaps in the billions of dollars, simply to continue delivering key government programs. Treasury Board president Stockwell Day said that the government would find the money needed for such investment. Sounding a bit like a customer forced into an expensive and unnecessary upgrade, he added “As you know, with technology, there are always people who are saying you should have newer and better.”
So what is wrong with aging computer systems anyway? Auditor General Sheila Fraser makes her case with the following points, with my additional commentary:
- Aging systems become increasingly expensive to operate – Anyone who owns an aging automobile understands that sometimes it is just cheaper and more reliable to buy a new car.
- Vendor support may no longer exist – You may be annoyed that Microsoft dropped support for Windows XP, but count yourself lucky you are not Immigration Canada and running a system on a DMSII database, developed by the Burroughs Corporation in 1972.
- Skilled employees in aging technology may be more and more difficult to find – COBOL programmers are few in number, on the retirement track, and no longer trained.
- Aging computer systems may have difficulty meeting current regulatory requirements – Laws are changing all the time; is your aging computer system flexible enough to keep up?
- Data access may be difficult – Historic systems are notoriously bad at reporting; this is where the entire business intelligence movement came from.
- Meeting client expectations may be difficult – In a web-enabled world, clients have exceedingly high expectations from your computer systems. Can your systems do what they want?
- Security issues – The arms race between security software and hackers never ends. If your technology is not keeping up, your data security may be at risk.
- Disaster recovery issues – Can you recover a system that has any or all of the above problems? Would you want to?
I know that, as a computer consultant, I would be one of those people Mr. Day speaks of, arguing in favour of the newer and the better. But I think the Auditor General lays out a compelling case for keeping pace with the technological times.
I came away from Ottawa Code Camp with an interesting tidbit – the XML data storage capabilities of SQL Server. Although this feature has existed since SQL Server 2005, this was the first time I had actually seen it demonstrated. As virtually the entire Cognos 8 world is XML driven, this has some interesting possibilities. The XML data type can force incoming data to fit a defined XML schema in much the same way a table structure does in a relational database, but it will also store any well-formed XML when no schema is defined. Data can be extracted from the XML data type as pure data or raw XML using XQuery (the XML query language), and it can be manipulated with XML Data Manipulation Language (XML DML).
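To make this concrete, here is a minimal sketch of the feature in action, driven from Python via pyodbc. The connection string, table name, and XML shape are all invented for illustration; only the XML data type itself and its value() and modify() methods come from SQL Server.

```python
# A minimal sketch (not production code) of SQL Server's XML data type,
# exercised from Python via pyodbc. Connection details are assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=Sandbox;Trusted_Connection=yes"
)
cur = conn.cursor()

# An untyped XML column: accepts any well-formed XML, since no schema is bound.
cur.execute("CREATE TABLE #Reports (Id INT IDENTITY, Spec XML)")

cur.execute(
    "INSERT INTO #Reports (Spec) VALUES (?)",
    "<report name='Sales'><query source='GOSALES'/></report>",
)

# XQuery via the value() method: pull the report name out as plain data.
cur.execute("SELECT Spec.value('(/report/@name)[1]', 'varchar(50)') FROM #Reports")
print(cur.fetchone()[0])  # -> Sales

# XML DML via modify(): change the stored XML in place.
cur.execute(
    """UPDATE #Reports
       SET Spec.modify('insert attribute owner {"BI team"} into (/report)[1]')"""
)
conn.commit()
```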
Is anyone out there using this feature, or is this a little-known extra hiding away under SQL Server’s many other features?
Recent business history has shown that any corporation accused of committing fraud invariably invokes the “we knew nothing” response from senior management. Pleading incompetence in place of malicious intent is hardly comforting, in my opinion. We have yet to see how Goldman Sachs will respond to the allegations levelled by the SEC, but I will not be surprised if it follows this well-worn script.
I’ve already written that I believe bad management is at the heart of deliberately fraudulent data. But what can an organization do to prevent rogue employees or poor-quality suppliers from ruining their data quality? Put simply, due diligence. Without due diligence, you have corporations that don’t know what’s going on. Limited or poor due diligence may give senior executives plausible deniability, but it also comes across as bad management.
A number of years ago, there was a major fraud in Canada’s mining sector known as the Bre-X scandal. On the face of it, Bre-X appeared to have discovered a mammoth gold find in Indonesia. The stock rose exponentially, then collapsed when the find was proven a fraud. Although Bre-X passed the stock market listing requirements in Canada and the United States, there were early indications of problems that were ignored, including:
- No independent testing or drilling for gold was conducted for more than two years. Drilling by Freeport-McMoRan eventually proved the claim false.
- Standard industry procedures were ignored. Core samples were crushed entirely instead of being split for independent analysis by partners.
- An Australian company that had previously held the claim and had tested the site was always sceptical of Bre-X’s find.
- The gold flakes extracted from the core fell easily from the sample and were more like river-extracted gold than mined gold.
- A fire at Bre-X’s field office destroyed many sampling records.
Bre-X management always denied knowledge of the fraud, but they ought to have been in tune with these issues. That they were not suggests either complicity or gross incompetence. I suspect that the legal odyssey now unfolding with Goldman Sachs will be another lesson in due diligence gone wrong.
A fish rots from the head down. - Greek Proverb
Data quality is an issue in every organization. But there is a point, in my opinion, at which data quality stops being the problem. This is when management implicitly or explicitly allows, facilitates or encourages bad data, through wilful blindness or even conspiracy. This is what brought down Enron, and it continues to play out to some extent in the world financial crisis.
Sometimes those in trouble attempt to fix data in their favour. This can happen at a small scale, when a line manager applies hand-picked statistics to back up a previously made decision. It can also happen at a corporate or national scale, where excessive debts are off-loaded into secret accounts to make balance sheets more palatable. But do not mistake these actions for poor data quality management or even bad BI. This is bad management, pure and simple.
The real danger here is that management might start believing their own lies, or worse yet, get everyone else to. Like a rogue financial advisor spinning a Ponzi scheme, this might work for a while, but a day of reckoning will come. Once lost, hard-earned trust can be gone for good.
Did you know that approximately 15% of all Cognos telephone service calls are resolved in the first 10 minutes because they are known compatibility issues? If you want to save yourself some time on the phone, check your Cognos 8 BI compatibility by following these steps:
1. Check your Component List: You can see exactly which version and build of each of your Cognos tools you are running by checking your Component List. This can be found in your Cognos install directory at \Cognos\c8\cmplst.txt (a short script for summarizing this file follows these steps).
2. Check your Cognos 8 compatibility: You can check the compatibility of your Cognos version against any other software you might be using here.
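As a small convenience for step 1, here is a rough Python sketch that pulls the version stamps out of cmplst.txt. It assumes a default install path and that the file consists of plain NAME=value lines (this can vary between Cognos versions), so inspect your own copy first.

```python
# Summarize the version entries in the Cognos component list (cmplst.txt).
# The path and the NAME=value format are assumptions -- adjust as needed.
from pathlib import Path

CMPLST = Path(r"C:\Program Files\Cognos\c8\cmplst.txt")  # assumed location

for raw in CMPLST.read_text(errors="ignore").splitlines():
    line = raw.strip()
    if "=" not in line or line.startswith("#"):
        continue  # skip blanks, comments, and anything that isn't NAME=value
    name, _, value = line.partition("=")
    if "version" in name.lower():  # keep only the version stamps
        print(f"{name.strip()}: {value.strip()}")
```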
For more assistance with Cognos compatibility issues or other troubleshooting, you can also download a diagnostics tool from IBM Cognos here.
Hope this helps!
Remember all those promises of household appliances that would work in sync and communicate with each other? The refrigerator that can manage its own inventory. The house that can manage its power consumption. Although still in the realm of science fiction, two articles I have read recently indicate we may be moving in this direction. The first, in Wired Magazine, profiles Mark Hamblin of Touch Revolution, who is creating a platform, expected later this year, through which home appliances can communicate with one another. The second describes development work at IBM focused on the “internet of things”: devices that communicate with each other to gather and share information, with the ultimate objective of increased efficiency in areas such as energy consumption or traffic patterns.
What does this have to do with business intelligence? A lot, really. Consider a typical manufacturing facility with disparate machines of varying technological capabilities. Unless the machines can communicate with one another (or, at minimum, create machine-readable logs of their processes), you are dependent on machine operators to gather your manufacturing business intelligence. These human operators are fallible. They may forget to take readings or mark down information at the right time. They may even skew their stats to make their performance look better than it is, by not recording spoilage or downtime.
But if machines can manage their own data collection, these particular problems should disappear. The more automated these processes become, the cleaner the statistics coming out of them should be. So if we are moving toward greater integration of devices, whether in the home or otherwise, it should be good news for both business intelligence and data quality.
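As a toy illustration of the point, here is what the simplest form of machine-driven data collection might look like: the machine appends a structured record for every production cycle, so nothing depends on an operator’s memory or candour. The field names and log format are invented for the example.

```python
# A toy sketch of a machine logging its own production counts as structured
# JSON records; every field name here is invented for illustration.
import json
import time

def log_cycle(machine_id: str, units: int, spoiled: int, downtime_s: float) -> None:
    record = {
        "ts": time.time(),      # the machine's clock, not an operator's memory
        "machine": machine_id,
        "units": units,
        "spoiled": spoiled,     # recorded whether or not it flatters anyone
        "downtime_s": downtime_s,
    }
    with open("production.log", "a") as f:
        f.write(json.dumps(record) + "\n")

log_cycle("press-07", units=480, spoiled=12, downtime_s=35.0)
```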
I have been reading Thomas Redman’s quite excellent book Data Driven. Dr. Redman is one of the foremost experts on data quality. He has a Ph.D. in statistics, and he created and led the Data Quality Lab at AT&T Bell Laboratories from 1987 to 1995. He is widely known as the “Data Doc”.
To put it simply, Dr. Redman defines data quality as the “right” data at the “right” time, drawing a distinction between getting the data “right” and getting the “right” data. On the face of it, this may appear obvious and almost trite, much like the Monty Python sketch in which an actor explains that it’s not just a matter of memorizing words but also of getting them in the right order. But it points to an important distinction. To get data right, one must debug, analyze, and hunt for errors. To get the right data, one must communicate effectively with the one asking the questions. In fact, Dr. Redman explains that technological issues are often confused with communication issues where data quality is concerned. Definitions of business terms, and the assumptions behind them, must be made explicit among business intelligence practitioners. Communication is thus a key component of effective data quality.