
Day 6 and Still Off the Nails: “Forced March” or “Business Savvy” Move to DB2 10?

by Administrator on July 12, 2012

Well, I have refrained from posting here for a few days — out of consideration for the poor folks who might be victimized by my writing, which is likely to be colored by all of the nicotine withdrawal.  Today, however, I am feeling quite a bit better about things and wanted to give some kudos where they are deserved.

This will be a two-part post.  In part 1, I want to give some attention to improvements that IBM has made in DB2 for the mainframe and, more recently, that CA Technologies has made in its support tools for DB2.  Both have been under-reported in the technology trade press, which has a tendency to under-report just about anything in the mainframe space anyway.

DB2 10 for z/OS was introduced this past April with all of the standard ruffles and flourishes that Big Blue visits on its flagship products.  There is a pretty good site at IBM with lots of links to Redbooks, customer case studies, TCO and business-value whitepapers, etc., if you want to read and watch.  Basically, their argument for the product distills down to a single value prop:  lots of new and improved functionality with 5-10% less CPU demand.  That’s bumper-sticker quality.

To my way of thinking, the focus of the announcement really seemed to be on Big Data woo rather than product efficiencies.  IBM introduced a couple of interesting ideas around “unique temporal capabilities” in the product (i.e., data warehousing) that were supposed to support the need of businesses to keep all data forever without slowing down DB queries and other operations.  I was rather dissatisfied as I read their whitepaper on this, since there seems to be a disconnect in their messaging between (1) Big Data Analytics and (2) infrastructure efficiencies gleaned from an improved DB2 code base.  Let me explain.

As I understand Jeff Jonas, Big Data isn’t about creating data warehouses; it is about surfacing all data in real time in order to subject it to analytics that will reveal important information.  The whole business of reviewing old data in a data warehouse is, to hear Jonas and others describe it, so day-before-yesterday, so data mining in the 1990s sense of the word.  So the emphasis on ease of data warehousing and data mining as a new “temporal capability” of DB2, and on its fit with evolving Big Data Analytics, strikes me as a half-baked effort by marketing folks to join two concepts together while understanding neither.  It’s a stretch.

The same may be said of the claims around infrastructure efficiencies.  Yes, DB2 10 does give you back CPU cycles by reducing what the database engine requires.  But CPU isn’t the only resource that costs money and may be in short supply.  What about disk storage?  If, as IBM says, everyone is holding on to DB2 output forever and “using” data subsets (which usually means replicating them) to create “temporal” warehouses, aren’t we eating up more storage capacity than we were before?  Again, I suspect we aren’t getting a holistic picture of what a migration from DB2 9 to DB2 10 would mean for overall resource utilization efficiency.
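In case you haven’t looked at the feature, here is roughly what the system-time flavor of DB2 10’s temporal support looks like.  (This is a back-of-the-napkin sketch in DB2 SQL; the table and column names are mine, not IBM’s.)

   -- base table versioned by SYSTEM_TIME
   CREATE TABLE policy
     (policy_id  INTEGER       NOT NULL,
      premium    DECIMAL(9,2),
      sys_start  TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW BEGIN,
      sys_end    TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW END,
      trans_id   TIMESTAMP(12) GENERATED ALWAYS AS TRANSACTION START ID,
      PERIOD SYSTEM_TIME (sys_start, sys_end));

   -- one common way to create the companion history table, then switch versioning on;
   -- every UPDATE or DELETE now copies the old row image into POLICY_HIST
   CREATE TABLE policy_hist LIKE policy;
   ALTER TABLE policy ADD VERSIONING USE HISTORY TABLE policy_hist;

   -- "time travel": the table as applications saw it on a given date
   SELECT policy_id, premium
     FROM policy FOR SYSTEM_TIME AS OF TIMESTAMP('2012-01-01-00.00.00')
    WHERE policy_id = 42;

Notice where the old row images go:  onto disk, in a history table that only ever grows.  That is exactly the storage question I am raising above.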

These are just a couple of nagging questions I have for IBM.  If they wish to send me a response with some pointers to materials that back up their arguments, I would be delighted to print them here.  I am not anti-DB2, whatever the release; I just need to feel warm and fuzzy that IBM has a sense of how its visions in multiple areas actually tie together to benefit the customer.

However you cut it, there seems to be some pressure out there to do the migration to DB2 10.  I am picking this up in trade press accounts (what few there are) and in this week’s announcement from CA Technologies that they have gone General Availability with their excellent DB2 tools, updated to support version 10.  In case you missed the announcement, my editors at ESJ.com elected to reprint it here.  From this release announcement, it sounds like Big Blue is force-marching its installed base of DB2 9 and earlier to the new release by end-of-lifing the older wares.  Personally, I hate it when a vendor does this, though I can understand what a hassle it is to support multiple iterations of a product.

Anyway, CA has updated its DB2 tools to facilitate the migration of older DB2 databases to the latest release, promising to increase the stability of rollouts and rebinds, to pre-plan the transition, and to identify in advance the changes and modifications that applications will require before you commit.  As a bonus, all of these tools can be surfaced and operated through what we have called the Mainframe Product of the Year from CA Technologies:  CA Mainframe Chorus.
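For a sense of what “stability of rollouts and rebinds” is protecting you from:  moving to DB2 10 generally means rebinding your packages so they pick up the new optimizer, and the bare-bones safety net in DB2 itself looks something like this (collection and package names invented for illustration):

   REBIND PACKAGE(MYCOLL.MYPKG) PLANMGMT(EXTENDED)
   REBIND PACKAGE(MYCOLL.MYPKG) SWITCH(PREVIOUS)

The first command rebinds under the new release but keeps the prior access paths on file; the second falls back to them if the new paths regress, without undoing the migration.  Multiply that by a few thousand packages and you can see why pre-planning and impact-analysis tooling has a market.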

The CA Technologies DB2 toolkit and CA Mainframe Chorus are definitely worth a look if you have decided to make the version upgrade.  But before you go making any changes, why not ask Big Blue to tell you what all of your effort will REALLY buy you in terms of the Big Data Analytics nirvana or the infrastructure efficiency goals you may be targeting in your strategic plans?

Nuff said.

(And because I am still struggling with nicotine withdrawal, one more edgy comment for IBM:  It would be a heck of a lot easier for customers to migrate to DB2 10 if it were installable using CA Technologies’ free Mainframe Software Manager.  Are you guys any closer to working with CA on a common InstallShield-style install manager for mainframes?)

 
