Cognos Crosstab Double Trouble

Have you ever added a header to a crosstab, only to find it doubled your totals?  This can happen when you report on relational data with hierarchy structures, as in the example below.  As long as you have no headers, your totals show up correctly.  But add headers, and your totals can easily go wrong.

Let’s say you have three levels of GL account codes you want to report on, with totals for each.  Your report looks like this:

initial_report1

At this point, your report is very basic and it works as expected.  But let’s say you want to show your hierarchies stacked in order to save space in the row headers (a common technique I use).  You add headers and remove the descriptions on the first two columns so your report looks like this:

stacked_report1

Now your totals don’t work anymore.  Why?

In short, Set Aggregation, which is a property of your crosstab.

Set Aggregation is automatically set to “No” for relational data sources.  So the crosstab adds up your total once for the header, and then adds it up again for the footer, giving you the double effect.  For example, if two accounts each total 500, the header contributes 1,000 and the footer contributes another 1,000, so the crosstab shows 2,000 instead of 1,000.  Change Set Aggregation to “Yes” on your crosstab and your totals will work properly again.

Set Aggregation defaults to “Yes” for dimensional data sources, so this problem only occurs for relational data sources.


Why I lament the end of TechNet (and why Microsoft should too)

Years ago, when I decided to pursue a career in IT, I did something silly.  I bought a fully licensed copy of Visual Studio.  It cost me a lot of money, I barely used it, and it was obsolete very quickly.  Then I learned about TechNet subscriptions.  This was a much better way to stay up to speed on Microsoft technology.

Alas, the era of TechNet subscriptions is coming to an end.  While I understand that this change is at least in part due to piracy of Microsoft software, I believe that the end of TechNet subscriptions will hurt me and Microsoft equally.

For me, it will hamstring my efforts to learn new Microsoft products.  There may very well be trialware available from Microsoft, but this will not suffice for most of my needs.  I am in and out of technology quickly as I move from one project to another.  I have always found trialware wholly inadequate and do not use it if I can help it.  A fair amount of work goes into setting up training and prototypes, and I like to go back to refer to what I have done.  Trials don’t allow for that.

For Microsoft, it will be a lost opportunity.  I am in a position to recommend product solutions to my clients.  Without hands-on experience with products or the ability to prototype, I will be in a far weaker position to recommend Microsoft.

Perhaps Microsoft will adopt the Oracle model and make some of its products available free to registered users (sometimes in watered-down form).  This would help somewhat.  But Microsoft has so many products that it will surely have to restrict what is available.  And sometimes the features of the enterprise editions are exactly what corporate clients need.  Without the ability to play with them, I will not know.

Hopefully Microsoft does not take the partner approach that IBM takes.  Getting IBM software for training or prototyping is painful, and the software available is limited.  And this is coming from someone who gets IBM software through the appropriate partner channels!  Without access to an IBM partner, you are really out of luck.

MSDN subscriptions are another option, although they are limited in scope and cost substantially more.  For an independent like me, these cost more than they are worth.

TechNet was the gold standard.  It allowed me as a Cognos consultant to keep a firm footing in the Microsoft world.  Keeping up with Microsoft will now be harder, and for some products maybe not worth it at all.  Just when Microsoft needs independent IT consultants like me more than ever to drive new business.

photo by: jglsongs


Performance Management with Mike Baggott

Have you ever struggled to align your BI strategy with your strategic business plan?  Are you seeking best-of-breed knowledge from across the performance management industry?  Do you need an assessment of your performance management process?  In short, are you looking for better BI strategy across your enterprise?

My friend and colleague Mike Baggott has a new series of seminars on strategic performance management that may be able to help.  He is a 30-year veteran of business intelligence and the former director of product management at Cognos.  His seminars aim to merge business strategy with meaningful business intelligence, resulting in business-driven BI.  Mike has drawn on a deep well of BI knowledge in preparing this material, not only from his own career but also from leading practitioners across the information industry such as Roland Mosimann, Stephen Few, Wayne Eckerson, Bill Inmon and many others.

These seminars focus on the elements of a successful BI approach and the alignment of strategic business goals with BI strategy.  His material is technology neutral and does not pertain only to Cognos solutions.

His seminar outline consists of the following modules:

Business Driven BI planning

Management Overview (1 1/2 hours)

Management – Analytic Assets Inventory and Gap Analysis (1 hour)

Business Driven BI Requirements Analysis and Design

Business and IT – Analytic Content (1 day)

Business and IT – Advanced Analytic Content (1 day)

Business and IT – Agile Requirements Gathering & Design (1/2 day)

Business and IT – Governance and Compliance Access Control (1/2 day)

If you have any interest in these seminars you can contact Mike directly through his LinkedIn profile.

photo by: FutUndBeidl


Software Licenses for Prototyping or Training

I recently received a question about how to acquire a software license for self-training purposes.  Here is a list of the resources I use for training and prototyping:

Microsoft TechNet:  By far the easiest to use and most comprehensive, this gives you access to virtually all Microsoft operating systems, SQL Server versions and Office software, along with most other Microsoft programs.  For an annual subscription fee, you can have it all, provided you adhere to the license agreements.  This is especially useful if you need to get your hands on older versions of software, as you can go back pretty far.  If Microsoft makes it or made it in the past, you can probably find it here.  There are certain restrictions on how many licenses of each item you are allowed to use, but I find this more than meets my needs.  It is an invaluable resource, and well worth the annual fee.

Oracle:  You can download certain Oracle software if you register and adhere to the license agreement.  This is free of charge, but more limited in scope than Microsoft TechNet, mostly due to the more limited nature of Oracle’s business.

IBM:  To get licenses of IBM software, you pretty much need to be in touch with an IBM partner. If you have an account with a partner, you can download certain software packages and images.  This is limited and not particularly easy to navigate.  With persistence, you can usually find what you need.

For prototyping and training, it is also very handy to use VMware to run your software on virtual machines.  This allows you to set up multiple environments and try things out, and gives you an easy out if things go wrong.

photo by: zugaldia


Introducing Microsoft Data Quality Services

Last evening I attended a presentation on Data Quality Services (DQS) by Microsoft BI specialist Stéphane Fréchette. This new feature is available in SQL Server 2012 and is part of the Enterprise Information Management stack of MS BI, which includes DQS, Master Data Services (MDS) and the familiar Integration Services (SSIS).

First and foremost, DQS is intended as a business tool. DQS is designed for a data steward (typically a business user) who wants to define rules for cleansing data. It allows users to create a knowledge base of these rules, and it allows users to cleanse their data and to match duplicated data. This is a two step process that always starts with cleansing and then, if necessary, moves on to matching. DQS never overwrites data but instead writes data out to target destinations.

DQS’s strength is in the cleansing and matching, its two core functions. It does not appear to offer much in terms of data profiling, although it does allow you to review data as it is being cleansed and matched. SSIS is able to call DQS cleansing operations, but not matching operations. SSIS is able to deduplicate data through its own fuzzy matching feature or through the Master Data Services (MDS) operations.

DQS does have a few limitations. This first release is only able to read from and write to Microsoft Excel and SQL Server 2012. As already noted, SSIS cannot make use of the matching operations. While it is an interesting foray into the realm of applied data governance, its heavy focus on cleansing makes it feel more tactical than strategic. It will be an interesting addition to the overall SQL Server universe.

photo by: Nils Geylen


An Overview of Esri Maps For Cognos, Part 3 – Shapes

(Special “The Colour and The Shape” edition)

By Peter Beck, CBIP

In Part 1 of this series we introduced the Esri Maps for Cognos (EM4C) product, which enables us to tie together BI-type reporting with the rich capabilities of Esri’s mapping software. In Part 2 we demonstrated how easy it is to use this software to connect point-type data in a Cognos report to a map. In essence, we can take points of data identified by lat/long values and connect them to a map, and then colour-code the points to represent different categories or types of data. In our example, we looked at crime data for San Francisco. The result enabled the user to make inferences from the geographic distribution and type of crime reports that would be difficult to make if the data were simply listed by address, or even grouped into neighbourhood categories.

Wonder how we’re doing in the dragon district?

In this installment, we will look at a slightly different way of displaying data within the context of geography – instead of displaying discrete points (which require lat/long values) we will categorize larger geographic areas, defined by shapes on the map.

 

As before, we need a Report Studio report, with a query:

EM4C_3_1_query

Note that we don’t have any “lat/long” type data in this example – instead, we have Retailer Province-State, which contains values representing the name of each state:

EM4C_3_2_results
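For context, the relational query feeding a report like this might look roughly like the SQL sketch below. The table and column names are assumptions for illustration only – the actual example is built from query items in a Cognos package rather than hand-written SQL.

-- One row per state/province, with the measures the shape layer can colour by
SELECT
    rs.retailer_province_state,
    SUM(f.revenue)      AS revenue,
    SUM(f.gross_profit) AS gross_profit
FROM sales_fact f
JOIN retailer_site rs
    ON rs.retailer_site_key = f.retailer_site_key
GROUP BY rs.retailer_province_state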

This time, instead of adding a Cognos X/Y Layer to our Esri map in the report, we will add a Cognos Shape Layer:

EM4C_3_3_Shape_Layer

A Cognos Shape Layer acts much like an X/Y layer, except that instead of binding to lat/long points it binds the report data to a map containing “shapes”, based on descriptions common to both. In this case we set the map associated with the “shape” layer to one containing the shapes of US states. In the wizard provided we can match the shape names in the map we have selected (STATE_NAME) to the appropriate column (Retailer Province-State) in our query:


EM4C_3_4_Join

We select the measures we are interested in…

EM4C_3_5_Fields

… and then configure the “shape join”, assigning colour-values to relative levels of each measure (in this case, Revenue):

EM4C_3_6_Style

We now have a map that lets us see, by quantile, how revenue compares by state:

EM4C_3_7_Map_1

Easy!

However, because we have selected several measures, we can also use the map legend to select the other measures and see how they compare as well:

EM4C_3_8_Map_2

For example, here is the map showing Gross Profit:

EM4C_3_9_Map_3

Note that the legend shows the quantile breakdowns for each colour. As well, hovering over each state brings up information on the state:

EM4C_3_10_Map_4

Users are not limited to a single shape layer – multiple layers can be combined on a single map and then activated or deactivated by the user to show different data by different “shape”.

Shapes are not limited to conventional maps, of course. Floor plans provide an ideal source of shapes. Retailers can use shapes to identify revenue by area of a store, or property managers can look at building usage, perhaps over time. All that is needed is an Esri map with shapes that correspond to the physical areas the user is interested in, with an attribute that can be joined to a report column containing matching values.


An Overview of Esri Maps For Cognos, Part 2 – Points

(Special “If You’re Going To San Francisco” edition)

By Peter Beck, CBIP

In part 1 of this series, we looked at how Esri Maps For Cognos – EM4C – allows us to embed a map from an Esri map server inside a Report Studio report. But the map is pretty useless if it doesn’t allow us to connect to our data and perform some kind of analysis that can’t be done with a regular list report, or with some kind of graph.

From a mapping perspective there are a couple of concepts that we need to keep in mind if we are going to bind business data to a map: one is the idea of a point, the other the idea of a shape.

Creating map-points (old school)

We’ll start with a point. A point is a lat/long value on a map: it is (strictly speaking) an entity with no area. It could be a point that represents a store location, a home address, whatever you like. The important thing to keep in mind is that even if a store (or your house) occupies area, from a mapping/point perspective it is simply a point on the map.

So what kind of data can we plot using points? Crime data is one example – a police call is typically to a particular address. If we can plot these locations on a map, by type, we might gain insights into what kinds of crimes are being reported not just by location, but by location relative to each other – what kinds of crimes cluster together, geographically.

Crime data for San Francisco for March 2012 is available on the web, and this data set comes with both the category of crime and the lat/long of the police report. This makes the data set ideal for plotting on a map.

First, I set up a quick Framework Manager model that retrieves the data from my database. Then, we need a query in Report Studio that retrieves the data:

Creating a simple query

Note that we have a Category, Description, and X and Y values representing Longitude and Latitude respectively.
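For reference, the query behind this could be as simple as the following SQL sketch. The table and column names here are assumptions based on the publicly available San Francisco incident data, not necessarily the model used in this report.

SELECT
    IncidentNum,   -- unique identifier for each police report
    Category,      -- crime category, e.g. DRUG/NARCOTIC
    Description,   -- sub-category detail of the call
    X,             -- longitude of the report
    Y              -- latitude of the report
FROM sf_crime_incidents
WHERE incident_date >= '2012-03-01'
  AND incident_date <  '2012-04-01'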

I add a map placeholder (as we did in Part 1) and then save the report. (I could, of course, add any additional report items, queries etc to the report that I wish.) I then open the map placeholder in Esri Maps Designer, add a base map, and then add a new layer: the special Cognos X Y Layer. I rename it Crime_Locations:

Adding the X Y Layer

A wizard enables me to select the query associated with the Crime_Locations layer, which will display points:

Selecting Data

Note the inclusion of a Unique Field – this is the IncidentNum from the original data.

Further configuration allows me to then assign the Lat/Long from the data set, and identify each point by the Category of crime.

Categorization and Symbolization

I now have a set of symbols – coloured squares – that correspond with the categories of my data. When I view my report, I can see each crime, by colour-coded type, at the location where it was reported:

Whoa, that’s a lot of crime…

Even at this zoom level I can draw some conclusions about what areas have more crime – the north-east seems to have more reports than the south-east, for example. But by selecting specific crimes and zooming in, interesting patterns begin to emerge.

What patterns are emerging?

The orange squares represent drug-related charges. The green and purple squares are assault and robbery charges respectively. The drug-related charges are more concentrated in one relatively small area, while the assault and robbery charges seem more spread out – but with a concentration of them in the area where the drug charges are also being laid.

If we zoom in even closer, we can see that certain streets and corners have more calls than others in close proximity – that the crimes seem to cluster together:

What’s special about these areas?

But zooming out again, we see an interesting outlier – a rash of drug charges along one street, with what appears to be relatively few assaults or robberies:

Something looks out of place…

Zooming in we see that this activity is almost completely confined to a 7-block stretch of Haight St., with virtually no activity in the surrounding area, and few robberies or assaults:

What is it that makes this street so special?

This kind of spatial relationship is extremely hard to discern from a list or chart, even a chart that breaks events like police calls down by geographic category of some kind. But using mapping, with a simple zoom we can go from an overall view of patterns of activity to a much higher degree of detail that begins to tell some kind of story, or at least warrant further investigation.

But wait, there’s more…

By hovering over an individual square, I can get additional category information from my underlying data, assuming I have included it in my query. In this case there is a sub-category of the call:

The reader is left to draw his or her own conclusions…

By adjusting the query I can re-categorize my data to yield results by, for example, day of the week, or sub-category.  For example, here we can contrast Possession of Marijuana (green) with Possession of Base/Rock Cocaine (pink):

Patterns of behaviour…

Marijuana possession seems more diffuse, although concentrated in a few areas. The cocaine charges are much more concentrated.

In our next entry in this series, we’ll take a look at allocating data to shapes, to colour-code areas to represent different levels of activity.


An Overview of Esri Maps For Cognos, Part 1 – Intro

By Peter Beck, CBIP

Cognos report writers have long been frustrated by the poor built-in support for GIS-type displays in Cognos reporting tools. True, there is a basic map tool included as part of Report Studio, but it is quite limited in functionality. It can be used to colour geographic areas, but lacks layering, zooming, sophisticated selection tools, and the kind of detail we’ve all become used to with the advent of Google Maps and the like.

Wonder where the ring sales are these days?

There are a few map-related add-ons for Cognos reporting available. Recently I had the opportunity to take Esri’s offering in this space for a test drive with a 2-day training session at Esri Canada’s Ottawa office. I came away impressed with the power and ease-of-use offered by this product.

EM4C – Esri Maps For Cognos – came out of development by SpotOn Systems, formerly of Ottawa, Canada. SpotOn was acquired by Esri in 2011. The current version of the product is 4.3.2. The product acts as a kind of plug-in to the Cognos portal environment, enabling Report Studio developers to embed Esri maps, served up by an Esri server, in conventional Report Studio reports. From a report developer perspective EM4C extends Report Studio, and does so from within the Cognos environment. This is important: EM4C users don’t have to use additional tools outside the Cognos portal. From an architectural perspective things are a little more complex: the Cognos environment must be augmented with EM4C server, gateway and dispatcher components that exist alongside the existing Cognos components.

Then, of course, there are the maps themselves. Since this is a tool to enable the use of Esri maps, an Esri GIS server must be available to serve the maps up to the report developer and ultimately the user. For shops that are already Esri GIS enabled this is not a challenge, and indeed I can see many users wanting to buy this product because they have a requirement to extend already available mapping technology into their BI shops. However, if you don’t have an Esri map server, don’t despair – the product comes with out-of-the-box access to a cloud-based map server provided as part of the product licence. This is a limited solution that won’t satisfy users who have, for example, their own shape files for their own custom maps, but on the other hand if you have such a requirement you probably already have a map server in-house. If you are new to the world of GIS this solution is more than enough to get started.

So where do we start with EM4C? First, you need a report that contains data that has some geographic aspect to it. This can be as sophisticated as lat/long encoded data, or as simple as something like state names.

When we open our report, we notice we have a new tool: the Esri Map tool:

Selecting the Esri Map tool

As mentioned, the EM4C experience is designed to enable the report writer to do everything from within Cognos. Using this tool we can embed a new map within our report:

Map Placeholder

 

So now what? We have a map place-holder, but no map. So the next step is to configure our map.

This step is done using Esri Maps Designer. This tool is installed in the Cognos environment as part of the EM4C install, and enables us to configure our map – or maps, as we can have multiple maps within a single report.

Selecting Esri Map Designer

Esri Maps Designer is where we select the map layers we wish to display in our report. When we open it we can navigate to any Report Studio report in which we have embedded an Esri map:

 

Selecting a map to configure

In this case VANTAGE_ESRI_1 is the name of the map in my report; the red X indicates it has not been configured yet. Clicking Configure brings up our configuration. This is where we select a Base Map, and then link our Cognos data to a layer to overlay on the map.

As mentioned, out-of-the-box the EM4C product enables the user to use maps served from the Esri cloud. We will select one of these maps from Esri Cloud Services as the Base Map of our report:

Maps available from Esri’s cloud services

When the base map is embedded, it becomes a zoom-able, high-detail object within the report:

An embedded map

Unfortunately, while the map looks great it bears no relationship to the report data. So now what?

In part 2 of this overview we will look at how to connect the report data points to the report map. It is the combination of the ease-of-use of BI tools (and the data they can typically access) with mapping that makes a tool like EM4C so powerful. We will symbolize data to create colour-coded map points that reveal geographic location and spatial relationships, potentially allowing users to draw conclusions they otherwise would not have been able to draw from list-type data.


Toggle Between Data Sources for Different Cognos Packages

If you have one Cognos Framework Manager model with multiple packages, what can you do to publish some packages using one data source and some using another? You want to avoid duplicating your current Framework Manager model because you want only one model to maintain. How can you set something up that will allow you to toggle between data sources?

  1. Add all the data sources you want to toggle between to your Framework Manager model. Obviously these data sources will have to have the same underlying database structure or your model will not work. The best way to add them is through the Run Metadata Wizard, which will force you to add at least one table reference. You can remove that table reference when you have the data source added. You may need to set the Query Processing setting to “Limited Local” for your new data source to work properly.

  2. Create a session parameter. This will act as the global variable indicating which data source your model is currently pointing at. The value of this session parameter should be the name of the data source you want to reference for your current publish.

  3. Change all of your database queries to use the session parameter reference instead of the data source reference. So instead of this:

SELECT * FROM [My Data Source].MY_DB_TABLE

You should have:

SELECT * FROM #sb($MySessionParameter)#.MY_DB_TABLE

You should include the square bracket function in your session parameter call. You can alternatively add the square brackets to your session parameter value. Do not use square brackets outside of the function call – it will not work, at least not as I tested it. (See the expansion example after the steps below.)

  4. Set your session parameter to the value you want for the Cognos package you wish to publish.

  5. Publish your Cognos package.

Repeat steps 4 and 5 as many times as you need to reset your data source and publish your packages. You can now easily toggle between your data sources and publish Cognos packages as needed.
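To illustrate the macro from step 3 with a hypothetical data source name: if MySessionParameter is set to the value My Other Data Source, then at publish time the query expands to:

SELECT * FROM [My Other Data Source].MY_DB_TABLE

The sb() macro function surrounds the parameter value with square brackets, which is why you should not add brackets of your own around the macro call.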

photo by: juhansonin


The Dangers of Data on the Cloud

Across the internet, many data services exist on the Cloud – those faraway servers managed by someone else, often for free. Not long ago I experienced a minor example of what can happen when you rely on such data services. Until recently, LinkedIn had a handy feature that I really liked – the Amazon Reading List. Here I could keep track of books I’d read, share these books with my LinkedIn community, and find other books to read from other people’s lists. Certainly not mission-critical data, but data that I appreciated. In an effort to improve security (I later learned), LinkedIn dropped support for external apps sometime in November. This came rather suddenly and without warning to LinkedIn users like me, and perhaps to Amazon as well (Amazon continues to develop a new platform to replace the reading list). For the moment, my reading list is gone (Amazon has promised to replace it sometime).

Now imagine if this list was instead a list of clients, or a list of contacts, or a list of project work items. A company providing such a service can radically change its terms of service (as in the Instagram debacle) or its pricing, can suffer a major security breach, or can even go out of business. And where would we be then? We are becoming increasingly dependent on internet services that can be revoked or changed at a moment’s notice. Granted, some of these services are frivolous, but some are important too.

What contingencies do you have in place to protect your data on the cloud?

photo by: kevin dooley
