IBM hosted the inaugural TechXchange Conference in Las Vegas last week (September 11 – 15, 2023). Here’s what I liked:

  • The Developer community was the primary focus of TechXchange. This is a significant change from the pre-pandemic IBM technical conferences – Think and World of Watson (held variously in Las Vegas and San Francisco). Two of the TechXchange keynote presenters pointed out that there were no “suits” at the conference (in reference to both people and apparel).
  • Artificial Intelligence (AI) was (not surprisingly) given a shout-out in almost every keynote, but I thought the most prescient comment was that in three years Quantum Computing will be where AI is today.


  • Since I’m a career database administrator (DBA), I’m really jazzed about IBM’s Lakehouse approach to data. For decades IBM focused primarily on IMS (hierarchical) for large-scale online transaction processing (OLTP) and Db2 (relational) for analytical workloads (and, eventually, OLTP too).
  • IBM made a brief foray into the Data Lake space (vast amounts of unstructured or semi-structured data with varying standards of curation) with its BigInsights Hadoop distribution.
  • But a Lakehouse combines the traditional shared-nothing MPP IBM Db2 Warehouse (relatively structured, with higher levels of curation, consistency, and veracity) with an open-source stack of data repositories: Lake + House.

Key characteristics of Lakehouse:

  1. Let the data decide where it should reside. Based on data attributes such as volume, structure, quality, timeliness, etc., choose the data repository that best addresses those attributes. You no longer have to force data into a repository to which it is not well suited.
  2. Seamlessly access data regardless of how and where it’s stored.
    1. Data Scientists can easily combine customer account data, transaction history, and sentiment analysis.
    2. An Auto Repair Technician can pull up vehicle history by Make/Model/Trim/Year, geography, mileage, and so forth on a tablet in a garage bay while the car is on the lift.
  3. Ever-increasing data storage costs have your CFO in a frenzy? Performant Object Storage will likely meet your response-time targets for most applications – at about 3% of the cost of Block Storage. Automated multi-temperature storage lets you transition data to the storage tier best suited to its age, likelihood of retrieval, and other factors.
  4. On-prem, in the cloud, or SaaS? You decide where – and by whom – the infrastructure underpinning your OLTP and analytics workloads is provisioned.
  5. Containerization enables scaling that infrastructure as required by both cyclical and black-swan events.
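The “let the data decide” idea in item 1 can be sketched as a simple routing function. Everything below – the attribute names, the thresholds, and the repository labels – is an illustrative assumption of mine, not an IBM-documented placement rule:

```python
from dataclasses import dataclass

@dataclass
class DatasetProfile:
    """Hypothetical attributes used to decide where a dataset should live."""
    volume_tb: float   # total size in terabytes
    structured: bool   # fixed schema vs. semi-/unstructured
    curated: bool      # has the data been cleansed and governed?
    low_latency: bool  # does the workload need sub-second queries?

def choose_repository(p: DatasetProfile) -> str:
    """Toy routing rules: let the data's attributes pick the repository."""
    if p.structured and p.curated and p.low_latency:
        return "warehouse"          # e.g. an MPP warehouse like Db2 Warehouse
    if p.volume_tb > 100 and not p.structured:
        return "object-store-lake"  # raw/semi-structured data on object storage
    return "lakehouse-managed"      # open table format with some curation

# Curated, structured, latency-sensitive data lands in the warehouse tier.
print(choose_repository(DatasetProfile(5, True, True, True)))
```

The point is the inversion of responsibility: the profile drives the placement, rather than every dataset being forced into one engine up front.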
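The storage economics in item 3 are easy to sanity-check with back-of-the-envelope arithmetic. The per-GB prices below are assumed round numbers chosen so that object storage is roughly 3% of block storage, per the figure above – they are not vendor quotes:

```python
# Assumed prices, USD per GB per month (illustrative only).
BLOCK_PER_GB_MONTH = 0.10    # assumed block-storage price
OBJECT_PER_GB_MONTH = 0.003  # assumed object-storage price (~3% of block)

def monthly_cost(gb: float, hot_fraction: float) -> float:
    """Cost if only `hot_fraction` of the data stays on block storage
    and the rest is tiered down to object storage."""
    hot = gb * hot_fraction
    cold = gb - hot
    return hot * BLOCK_PER_GB_MONTH + cold * OBJECT_PER_GB_MONTH

all_block = monthly_cost(100_000, 1.0)   # 100 TB entirely on block storage
tiered = monthly_cost(100_000, 0.10)     # only 10% stays "hot" on block
print(f"all block: ${all_block:,.0f}/mo, tiered: ${tiered:,.0f}/mo")
```

Under these assumptions, tiering 90% of a 100 TB estate down to object storage cuts the monthly bill by nearly 90% – which is why automated multi-temperature storage gets the CFO’s attention.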

A much more detailed dive into Db2 Warehouse Gen 3 will follow in an upcoming blog post.