
  planetDB2 is an aggregator of blogs about the IBM DB2 database server. We combine and republish posts by bloggers around the world. Email us to have your blog included.

April 16, 2014

DB2Night Show News

18 APR 10a CDT: Update - New Topic on DB2 LUW Indexes with Ember Crooks

Attend Episode #133 of The DB2Night Show™ to learn "Why Low Cardinality Indexes Negatively Impact Performance". Ember Crooks, IBM DB2 GOLD Consultant and Sr. Director at Rosetta, replaces Gopi...


Willie Favero

Survey: Understanding the business applications running on System z (Mainframe)

(Posted Friday, April 16, 2014) Help us (IBM) to better understand the business applications you are (or have been) running on System z, your IBM Mainframe, by completing a short 5 minute survey! This survey has been designed to help IBM better understand how your IBM System z inv...

(Read more)

DB2 Guys

Fraud detection? Not so elementary, my dear.


Radha Gowda, Product Marketing Manager, DB2 and related offerings

Did you know that fraud and financial crime have been estimated at over $3.5 trillion annually [1]? Identity theft alone cost Americans over $24 billion, that is, $10 billion more than all other property crimes [2]? And 70% of all companies have experienced some type of fraud [3]?

While monetary loss due to fraud is significant, the loss of reputation and trust can be even more devastating.  In fact, according to a 2011 study by the Ponemon Institute, organizations lose an average of $332 million in brand value in the year following a data breach.  Unfortunately, fraud continues to accelerate due to advances in technology, organizational silos, lower risks of getting caught, weak penalties, and economic conditions.  In this era of big data, fraud detection needs to go beyond traditional data sources: not just transaction and application data, but also machine, social, and geospatial data for greater correlation and actionable insights.  The only way to sift through vast amounts of structured and unstructured data and keep up with the evolving complexity of fraud is through smarter application of analytics to identify patterns, construct fraud models, and conduct real-time detection of fraudulent activity.
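To make the cross-source correlation above concrete, here is a deliberately toy sketch (not an IBM product feature; the function, field names, and thresholds are all invented for illustration) of a rule that combines transaction, geospatial, and time-of-day signals into a simple risk score:

```python
# Toy fraud-scoring sketch: each rule correlates a different data
# source, as the text describes. Thresholds are purely illustrative.

def fraud_score(txn, home_country, recent_avg):
    """Return a 0-3 risk score from simple cross-source rules."""
    score = 0
    if txn["amount"] > 10 * recent_avg:   # transaction data: unusual size
        score += 1
    if txn["country"] != home_country:    # geospatial data: odd location
        score += 1
    if txn["hour"] < 5:                   # machine data: odd time of day
        score += 1
    return score

txn = {"amount": 950.0, "country": "RO", "hour": 3}
print(fraud_score(txn, home_country="US", recent_avg=40.0))  # 3
```

A real model would of course be learned from historical data rather than hand-coded, but the point stands: each additional data source adds a dimension the others cannot see.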

IBM Watson Foundation portfolio for end-to-end big data and analytics needs


While IBM has an impressive array of offerings addressing all your big data and analytical needs, our focus here is on how DB2 solutions can help you develop and test fraud models, score customers for fraud risk, and conduct rapid, near-real-time analytics to detect potential fraud.  You have the flexibility to choose the type of solution that best fits your needs: select software solutions to take advantage of your existing infrastructure, or choose expert-integrated, appliance-based solutions for a simplified experience and fast time to value.

Highly available and scalable operational systems for reliable transaction data

DB2 for Linux, UNIX and Windows software is optimized to deliver industry-leading performance across multiple workloads – transactional, analytic and operational analytic – while lowering administration, storage, development, and server costs.  DB2 pureScale, with its cluster-based, shared-disk architecture, provides application-transparent scalability beyond 100 nodes, helps achieve failover between two nodes in seconds, and offers business continuity with built-in disaster recovery over distances of a thousand kilometers.

IBM PureData System for Transactions, powered by DB2, is an expert-integrated system of server, storage, network, and tools selected and tuned specifically for the demands of high-availability, high-throughput transaction processing, so you do not have to research, purchase, install, configure and tune the different pieces to work together. With its pre-configured topology and database patterns, you can set up high-availability cluster instances and database nodes to meet your specific needs and deploy the same day rather than spend weeks or months. As your business grows, you can add new databases in minutes and manage the whole system using its intuitive system management console.

Analytics for fraud detection

DB2 Warehouse Analytics: DB2 advanced editions offer capabilities such as online analytical processing (OLAP), continuous data ingest, data mining, and text analytics that are well suited to real-time enterprise analytics and can help you extract structured information out of previously untapped business text.  The business value of these capabilities in enabling fraud detection is immense.

IBM PureData System for Operational Analytics, powered by DB2, helps you deliver near-real-time insights with continuous data ingest and immediate data analysis.  It is reliable, scalable, and optimized to handle thousands of concurrent operational queries with outstanding performance. You can apply fraud models to identify suspicious transactions while they are in progress, not hours later. This applies across any industry segment, including financial services, health care, insurance, retail, manufacturing, and government services.  PureData System for Operational Analytics helps not just with real-time fraud detection, but also with cross-sell and up-sell offers: identifying customer preferences, anticipating their behavior, and predicting the optimal offer or service in real time.

DB2 with BLU Acceleration, available in advanced DB2 editions, uses advanced in-memory columnar technologies to help you analyze data and generate new insights in seconds instead of days.  It can provide performance improvements ranging from 10x to 25x and beyond, with some queries achieving 1,000 times improvement [4], for analytical queries with minimal tuning.  DB2 with BLU Acceleration is extremely simple to deploy and provides good out-of-the-box performance for analytic workloads. From a DBA’s perspective, you simply create a table, load, and go. There are no secondary objects, such as indexes or MQTs, that need to be created to improve query performance.

DB2 with BLU Acceleration can handle terabytes of data, helping you conduct customer scoring across your entire customer data set and develop and test fraud models that explore a full range of variables based on all available data.  Sometimes creating a fraud model may involve looking at hundreds of terabytes of data, where IBM® PureData™ System for Analytics would fare better.  Once a fraud model is created, you can use DB2 with BLU Acceleration to apply it to every incoming transaction for speed-of-thought insight.

IBM Cognos® BI: DB2 advanced editions come with 5 user licenses for Cognos BI, which enable users to access and analyze the information they need to make the decisions that lead to better business outcomes.  Cognos BI with Dynamic Cubes, an in-memory accelerator for dimensional analysis, enables high-speed interactive analysis and reporting over terabytes of data.  DB2 with BLU Acceleration integrated with Cognos BI Dynamic Cubes offers fast-on-fast performance for all your BI needs.

With the array of critical challenges facing financial institutions today, the smarter institutions are the ones that successfully protect their core asset: data. IBM data management solutions help you integrate information and generate new insights to detect and mitigate fraud. We invite you to explore and experience DB2 and the rest of the Watson Foundation offerings made with IBM.

Stay tuned for the second part of this blog that will explore the product features in detail.

[1] ACFE 2012 report to the nations
[2] BJS 2013 report on identity theft
[3] Kroll 2013/2014 global fraud report

[4] Based on internal IBM tests of analytic workloads comparing queries accessing row-based tables on DB2 10.1 vs. columnar tables on DB2 10.5. Results not typical. Individual results will vary depending on individual workloads, configurations and conditions, including size and content of the table, and number of elements being queried from a given table.

Follow Radha on Twitter @rgowda



April 15, 2014

Ember Crooks

DB2 Basics: Capitalization

When does case matter in DB2? Well, it doesn’t unless it does. Nice and clear, huh? When Text Must Be in the Correct Case: text must be in the correct case whenever it is part of a literal...
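The literal-case rule the teaser mentions is easy to demonstrate. The snippet below uses SQLite through Python purely for portability (the table name and data are invented; DB2 behaves the same way for these two cases): identifiers such as table names are case-insensitive, but string literals compared with `=` are not.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staff (name TEXT)")
cur.execute("INSERT INTO staff VALUES ('Ember')")

# Identifiers are case-insensitive: STAFF, staff, and Staff all resolve
# to the same table.
rows_upper = cur.execute("SELECT name FROM STAFF").fetchall()

# String literals are compared case-sensitively: 'EMBER' does not
# match the stored value 'Ember'.
match = cur.execute(
    "SELECT COUNT(*) FROM staff WHERE name = 'Ember'").fetchone()[0]
no_match = cur.execute(
    "SELECT COUNT(*) FROM staff WHERE name = 'EMBER'").fetchone()[0]

print(rows_upper)        # [('Ember',)]
print(match, no_match)   # 1 0
```

(One difference worth knowing: DB2 folds unquoted identifiers to uppercase in the catalog, while SQLite merely matches them case-insensitively; the practical effect in queries like these is the same.)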


Chris Eaton

How Data Skipping works in BLU Acceleration - Part 2

In my first part on Data Skipping I gave you the reasons for, and a short example on the benefits of data skipping. In this blog posting I will describe how the synopsis table works and how it is used by DB2.


Data skipping in BLU is made possible by storing "metadata" about the various values in a given column so that at runtime we can skip over portions of the...
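The idea behind the synopsis table can be sketched in a few lines. This is a simplified toy, not DB2's actual implementation (the chunk size, helper names, and data are invented): keep per-block min/max metadata for a column, and consult it before touching each block.

```python
# Toy sketch of data skipping: a "synopsis" records the min and max
# value of each block of rows; a scan skips any block whose range
# cannot satisfy the predicate.

CHUNK = 4  # rows per synopsis entry (BLU uses much larger blocks)

def build_synopsis(values):
    """Record (min, max) for every CHUNK-sized block of the column."""
    return [(min(values[i:i + CHUNK]), max(values[i:i + CHUNK]))
            for i in range(0, len(values), CHUNK)]

def scan_equals(values, synopsis, target):
    """Scan only blocks whose [min, max] range could contain target."""
    hits, blocks_read = [], 0
    for b, (lo, hi) in enumerate(synopsis):
        if lo <= target <= hi:            # block might contain target
            blocks_read += 1
            start = b * CHUNK
            hits.extend(v for v in values[start:start + CHUNK]
                        if v == target)
        # else: skip the block entirely, touching no data at all
    return hits, blocks_read

col = [1, 2, 3, 4, 50, 51, 52, 53, 9, 8, 7, 6]
syn = build_synopsis(col)      # [(1, 4), (50, 53), (6, 9)]
hits, read = scan_equals(col, syn, 51)
print(hits, read)              # [51] 1  -- two of three blocks skipped
```

The payoff grows with data size: on a billion-row column, a selective predicate can leave the vast majority of blocks untouched.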

DB2Night Show News

18 APR 10a CDT: Make DB2 LUW Tuning Less Taxing

Attend Episode #133 of The DB2Night Show™ to learn about DB2 V10.1 performance enhancements with guest Gopi Attaluri from the IBM Silicon Valley Lab. Mr. Attaluri will focus on Index Jump...


April 14, 2014

Craig Mullins

Aggregating Aggregates Using Nested Table Expressions

Sometimes when you are writing your SQL to access data you come across the need to work with aggregates. Fortunately, SQL offers many simple ways of aggregating data. But what happens when you uncover the need to perform aggregations of aggregates? What does that mean? Well, consider an example. Let's assume that you want to compute the average of a sum. This is a reasonably common...
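The "average of a sum" case can be sketched with a nested table expression: the inner query produces one sum per group, and the outer query averages those sums. The snippet below runs the query against SQLite via Python for portability (the table and data are invented); the shape of the nested table expression is the same in DB2.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (cust TEXT, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [("a", 10), ("a", 20), ("b", 40)])

# The nested table expression (the FROM-clause subquery) computes one
# SUM per customer; the outer SELECT averages those per-customer sums.
avg_of_sums = cur.execute("""
    SELECT AVG(total)
    FROM (SELECT cust, SUM(amount) AS total
          FROM orders
          GROUP BY cust) AS per_cust
""").fetchone()[0]

print(avg_of_sums)  # 35.0 -> AVG of per-customer sums 30 and 40
```

Note that `SELECT AVG(SUM(amount))` in a single query block is not legal SQL; the nesting is what makes the two-level aggregation expressible.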

(Read more)

Frank Fillmore

IBM DB2 Analytics Accelerator (#IDAA) Workshop a Success!

Last week in Baltimore The Fillmore Group conducted a deep-dive technical workshop on IBM DB2 Analytics Accelerator.  We covered Basic and Advanced hands-on labs with the support of the IBM...

(Read more)

Ember Crooks

Journey of a DB2's Got Talent Winner

I’ve been encouraged by a few to tell my story. How I was encouraged into DB2's Got Talent 2014, what it was like, decisions I made on the fly, any advice, and what I learned from my...


April 11, 2014

Data and Technology

The Ethical DBA?

Today’s posting is a re-blog of a post I wrote several years ago for another blog (that has since been discontinued). But I think the subject matter is important enough that it warrants...

(Read more)

DB2Night Replays

The DB2Night Show #132: All About IDUG & DB2's GOT TALENT Winners!

Hover your mouse over finalist photos for more information and links to LinkedIn profiles! All About IDUG: Phoenix, AZ - May 12-16, 2014. DB2's GOT TALENT Winners Announced! 100% of our audience learned something! IDUG volunteers Bob Vargo and Terry Johnson shared with us important news, updates, and tips for the upcoming IDUG Conference. There's a new process this year to sign up for the Thursday Dine-Arounds with your favorite speakers! ...

(Read more)

Henrik Loeser

DB2 Quiz: Find the website for this screenshot

Today's DB2 quiz is not that technical, but it requires that you are up-to-date on IBM's offerings for DB2. What is the context for this screenshot? On which website did I take it? Probably easy...

(Read more)

April 10, 2014

Willie Favero

APAR Friday: LOB insert performance issue resolved

(Posted Friday, April 10, 2014) This post is going to be pretty short this afternoon.   I've been talking with a lot more people lately that are messing around with LOBs.   So when I saw this APAR show up that takes care of the reuse of free space, I thought ...

(Read more)

DB2 Guys

Improve IT Productivity with IBM PureData System for Transactions


Kelly Schlamb , DB2 pureScale and PureData Systems Specialist, IBM
I’m a command line kind of guy, always have been. When I’m loading a presentation or a spreadsheet on my laptop, I don’t open the application or the file explorer and work my way through it to find the file in question and double click the icon to open it. Instead, I open a command line window (one of the few icons on my desktop), navigate to the directory I know where the file is (or will do a command line file search to find it) and I’ll execute/open the file directly from there. When up in front of a crowd, I can see the occasional look of wonder at that, and while I’d like to think it’s them thinking “wow, he’s really going deep there… very impressive skills”, in reality it’s probably more like “what is this caveman thinking… doesn’t he know there are easier, more intuitive ways of accomplishing that?!?”

The same goes for managing and monitoring the systems I’ve been responsible for in the past. Where possible, I’ve used command line interfaces, I’ve written scripts, and I’ve visually pored through raw data to investigate problems. But inevitably I’d end up doing something wrong, like miss a step, do something out of order, or miss some important output - leaving things not working or not performing as expected. Over the years, I’ve considered that part of the fun and challenge of the job. How do I fix this problem? But nowadays, I don’t find it so fun. In fact, I find it extremely frustrating. Things have gotten more complex and there are more demands on my time. I have much more important things to do than figure out why the latest piece of software isn’t interacting with the hardware or other software on my system in the way it is supposed to. When I try to do things on my own now, any problem is immediately met with an “argh!” followed by a Google search hoping to find others who are trying to do what I’m doing and have a solution for it.

When I look at enterprise-class systems today, there’s just no way that some of the old techniques of implementation, configuration, tuning, and maintenance are going to be effective. Systems are getting larger and more complex. Can anybody tell me that they enjoy installing fix packs from a command line or ensuring that all of the software levels are at exactly the right level before proceeding with an installation of some modern piece of software (or multiple pieces that all need to work together, which is fairly typical today)? Or feel extremely confident in getting it all right? And you’ve all heard about the demands placed on IT today by “Big Data”. Most DBAs, system administrators, and other IT staff are just struggling to keep the current systems functioning, not able to give much thought to implementing new projects to handle the onslaught of all this new information. The thought of bringing a new application and database up, especially one that requires high availability and/or scalability, is pretty daunting. As is the work to grow out such a system when more demands are placed on it.

It’s for these reasons and others that IBM introduced PureSystems. Specifically, I’d like to talk here about IBM PureData System for Transactions. It’s an Expert Integrated System that is designed to ensure that the database environment is highly available, scalable, and flexible to meet today’s and tomorrow’s online transaction processing demands. These systems are a complete package and they include the hardware, storage, networking, operating system, database management software, cluster management software, and the tools. It is all pre-integrated, pre-configured, and pre-tested. If you’ve ever tried to manually stand up a new system, including all of the networking stuff that goes into a clustered database environment, you’ll greatly appreciate the simplicity that this brings.

The system is also optimized for transaction processing workloads, having been built to capture and automate what experts do when deploying, managing, monitoring, and maintaining these types of systems. System administration and maintenance is all done through an integrated systems console, which simplifies a lot of the operational work that system administrators and database administrators need to do on a day-to-day basis. What? Didn’t I just say above that I don’t like GUIs? No, I didn’t quite say that. Yeah, I still like those opportunities for hands-on, low-level interactions with a system, but it’s hard not to appreciate something that is going to streamline everything I need to do to manage a system and at the same time keep my “argh” moments down to a minimum. The fact that I can deploy a DB2 pureScale cluster within the system in about an hour and deploy a database in minutes (which, by the way, also automatically sets it up for performance monitoring) with just a few clicks is enough to make me love my mouse.

IBM has recently released some white papers and solution briefs around this system and a couple of them talk to these same points that I mentioned above. To see how the system can improve your productivity and efficiency, allowing your organization to focus on the more important matters at hand, I suggest you give them a read:

Improve IT productivity with IBM PureData System for Transactions solution brief
Four strategies to improve IT staff productivity white paper

The four strategies described in these papers, which speak to the capabilities of PureData System for Transactions, are:

  • Simplify and accelerate deployment of high availability clusters and databases
  • Streamline systems management
  • Reduce maintenance time and risk
  • Scale capacity without incurring downtime

I suspect that I won’t be changing my command line and management/maintenance habits on my laptop and PCs any time soon, but when it comes to this system, I’m very happy to come out of my cave.

Ember Crooks

Last Chance to Vote in DB2's Got Talent!

Have you enjoyed posts from this blog? Have they helped you? Now is your chance to pay it forward. Go vote in DB2's Got Talent and help YOUR favorite competitor win. I’m voting for...


Willie Favero

April 2014 (RSU1403) service package has been tested and is now available

(Posted Friday, April 10, 2014) Testing for RSU service package RSU1403 is now complete. This RSU closes out 2013 and starts with January 2014. This April 2014, 1st Quarter "Quarterly Report" (118 KB PDF file) contains ALL service through the end of December 2013 not already marked RSU. This...

(Read more)

April 08, 2014

Scott Hayes

FREE IDUG Phoenix AZ Exhibit Hall Pass

Will you be in the Phoenix AZ area during May 13-15? Are you a DB2 professional? DBI Software invites you to join us, free of charge, in the IDUG Exhibit Hall, booth #105, with this FREE PASS! Check out the latest DB2 solutions and enjoy magic performed by professional magician Frank Velasco at DBI's Booth #105!

(Read more)

Susan Visser

The Rise of the Information Producer


As part of a continuing series, Claudia Imhoff has created a paper that describes the impact the emerging Information Producer role can have on warehouse environments and IT organizations. 

In a recent blog, Rachel Bland of IBM sees that the Information Producer role in organizations has the potential to cause angst for IT.  The Information Producer’s need for self-service must be balanced against other needs such as governance, system stability and performance, and the need to sustain services for other end users.

To address competing needs such as these, IT can look to innovative technologies that speed deployment and query performance, simplify maintenance, and offer affordability.

One IBM advantage is having a broad perspective and a portfolio of products that includes the IBM Business Intelligence Pattern with BLU Acceleration. Aligning with these objectives, this pattern offering can help reduce the burden on IT and provide every user who interacts with the system with a “speed of thought” response.

Read more of Rachel’s blog.


Thanks to Cindy for sharing!



DB2Night Show News

Fri 11 APR 10am CDT: ALL ABOUT IDUG and DB2's GOT TALENT 2014 Winners!

Incredible! Once again, we are reminded that EVERY VOTE COUNTS! As of 1pm CDT 8 April 2014, over 1,054 votes have been cast and there is ONLY ONE vote difference between 1st place and 2nd place! ...


Ember Crooks

IBM DB2 Certification – A Comprehensive Guide as of Today

Nearly every DB2 conference I go to has me thinking about DB2 certification. That is probably because most conferences include free or reduced cost certification testing. I have covered this topic...



DB2 Subsystem Tuning Tips

Many, many years ago, I was a DB2 systems programmer. As part of my training, I was sent to an IBM DB2 systems tuning course....

Willie Favero

The IBM System/360: it started it all 50 years ago today... and the world has never been the same since

(Posted on Monday, April 7, 2014) Today is the 50th Anniversary of the IBM System/360, the machine that started the whole mainframe thing. Tuesday, April 7, 1964 (we did all of our announcements on Tuesdays even back then) marked the beginning of a whole new way of doing computing, a way never tried...

(Read more)

April 07, 2014

Henrik Loeser

50 Years of IBM Mainframe: The Art of Selling

IBM (and the world) are celebrating 50 years of mainframe, "Make the Extraordinary possible". To honor the mainframe, I want to point you to a series of IBM-produced videos from a few years back. The...

(Read more)

Triton Consulting

Happy 50th Birthday Mainframe!

To celebrate Mainframe’s 50th birthday some of our Triton Consultants share their favourite mainframe stories: My first contact with the mainframe was as a lowly graduate Trainee Programmer at a large chocolate manufacturer. All programs were stored on punch cards; partly … Continue...

(Read more)

Craig Mullins

DB2 Buffer Pool Monitoring

After setting up your buffer pools, you will want to regularly monitor your configuration for performance. The most rudimentary way of doing this is using the -DISPLAY BUFFERPOOL command. There are many options of the DISPLAY command that can be used to show different characteristics of your buffer pool environment; the simplest is the summary report, requested as follows: -DISPLAY...

(Read more)

April 05, 2014

Dave Beulke

Five More SQL Performance Tips for your Big Data

I have talked about many DB2 SQL performance tips before (10 Performance Guidelines and Five Big Data SQL Performance Tips). Dealing recently with tables with tens of billions of rows and crazy generated SQL from GUI interfaces has resulted in these five more SQL performance tips for your big data...

(Read more)

