Monday, May 08, 2006

Just how much of a liability is DB2's GUI?

InfoWorld Database Underground blog: What is the Future of DB2?


This InfoWorld blog post does have some undisputed facts: Sean McCown has written about MS SQL Server in the past, and his spell checker is broken. The rest is a bit more speculative and debatable.


So, DB2 has impressive features, but its marketing strategy and clunky GUI are holding it back. Is it as simple as that? Is the release of Express-C of any help in increasing DB2's market share? Does Gartner's recent forecast of strong growth for DB2 take its allegedly meager GUI tools into account?


Although the InfoWorld post is vague on details, I still consider its opinions food for thought. Many DB2 DBAs I have talked to proudly claim they shunned most if not all of the GUI tools long ago. Would this stance be less common if DB2 had better GUI tools? If so, perhaps it's time more people participated in IBM's user feedback opportunities, either at DB2 conferences or right at an IBM lab.

1 Comment:

At 5:30 PM, Blogger dave74737 said...

Generally, the capability of the "common" GUI tools such as the Control Center and Development Center is good. To be fair, IBM have made reasonable efforts to keep them up to date with server-side improvements, and the GUI tools even add some value that is not easily accessible from the command line (e.g. the stored procedure debugger).

The problem is that the GUI tools IBM provides just don't scale into a large environment - whether you define large as a single large system or a large number of medium-sized instances. This is something other dataserver vendors (particularly MS) are excellent at. Some examples:

i) The Control Center and Development Center consume roughly 500 KB of your PC's memory per table in the database you are connected to. If you have some time, create a developer project that references a schema with 5,000 tables and watch what happens...

ii) In firms with many dataserver instances / databases (in our case, over 1,000), it's common to leverage LDAP to house the node and db catalogs in a central place so that you don't have the nightmare of updating the local catalog once an hour on all of the client hosts (again, many thousands) - see the first sketch after this list for the chore it replaces.
Unfortunately, the Control Center and Development Center don't appear to scale their querying of LDAP very well. I believe there were some improvements in FP10, but even so, just starting the Control Center takes around 10 minutes on my powerful PC - all of it spent searching through the catalog in LDAP.

iii) Most of the IBM GUI tools are Java-based. This is not a performance problem but a compatibility one. More than once, JVM security patches on our Windows machines have completely broken all of the IBM GUI tools.

iv) No scripting or mass deployment capability. It is very rare to make bespoke changes to an individual database. In a large environment, everything should be scripted to replay against a configurable number of systems, with the change recorded in CVS or some other system for roll-back / audit in the future (a rough deployment sketch follows this list).

v) Following on from the above, the ability to track, examine and compare objects, properties and statistics across systems is missing. For example, other vendors' GUI tools let us create custom views that track buffer pool hit ratios across a configurable number of systems and show them on a single page (the last sketch below is the do-it-yourself equivalent).

vi) (Personal rant :-) ) Why oh why does the Control Center insist on a DAS server running on the server side? These days I would argue that it's completely unnecessary. Coupled with the fact that we can no longer run DAS servers because they don't integrate with our Unix security model, this rules out the Control Center completely.
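
To illustrate (ii): a rough sketch of the client-side re-cataloging chore that central LDAP catalogs (DB2_ENABLE_LDAP=YES) let you retire. The inventory path and file format are made up for illustration, and it assumes the db2 CLP is on the PATH (instance profile sourced).

# refresh_catalog.py - hypothetical local catalog refresh for one client host.
# This is the hourly chore that central LDAP catalogs make unnecessary.
import subprocess

INVENTORY = "/etc/db2/instances.txt"   # made-up path: "alias node host port dbname" per line

def clp(cmd):
    # Run a single DB2 CLP command; relies on the db2 front-end being on the PATH.
    return subprocess.call("db2 " + cmd, shell=True)

for line in open(INVENTORY):
    if not line.strip() or line.startswith("#"):
        continue
    alias, node, host, port, dbname = line.split()
    clp("uncatalog database " + alias)      # ignore "not found" return codes
    clp("uncatalog node " + node)
    clp("catalog tcpip node %s remote %s server %s" % (node, host, port))
    clp("catalog database %s as %s at node %s" % (dbname, alias, node))

clp("terminate")   # end the CLP back-end so the refreshed directories are picked up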
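
For (iv), the sort of roll-your-own mass deployment wrapper we end up writing: replay one change script against every database in a list via the CLP. The file names (change.sql, dblist.txt) are hypothetical, and the change file is assumed to hold ';'-terminated statements.

# deploy_change.py - replay one ';'-terminated change script against many databases.
import os, subprocess, sys, tempfile

def deploy(change_file, db_list_file):
    statements = open(change_file).read()
    failures = []
    for alias in open(db_list_file).read().split():
        # Build a throwaway CLP script so CONNECT, the change and CONNECT RESET
        # all run inside one "db2 -tvf" session.
        fd, path = tempfile.mkstemp(suffix=".clp")
        os.write(fd, ("CONNECT TO %s;\n%s\nCONNECT RESET;\n" % (alias, statements)).encode())
        os.close(fd)
        rc = subprocess.call("db2 -tvf " + path, shell=True)
        os.remove(path)
        print("%-18s rc=%d" % (alias, rc))
        if rc != 0:
            failures.append(alias)
    return failures

if __name__ == "__main__":
    sys.exit(1 if deploy("change.sql", "dblist.txt") else 0)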
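
And for (v), the do-it-yourself single-page view of buffer pool hit ratios. The SNAPSHOT_BP table function and its signature are from memory for V8 - check the SQL Reference for your fixpak - and the counters are only populated with the buffer pool monitor switch (DFT_MON_BUFPOOL) on.

# bp_hitratio.py - print buffer pool hit ratios for each database in dblist.txt.
import subprocess

# Classic formula: 100 * (1 - physical reads / logical reads), data + index pools.
QUERY = (
    "SELECT bp_name, "
    "DECIMAL(100 * (1 - DOUBLE(pool_data_p_reads + pool_index_p_reads) / "
    "NULLIF(pool_data_l_reads + pool_index_l_reads, 0)), 5, 2) AS hit_ratio "
    "FROM TABLE(SNAPSHOT_BP(CAST(NULL AS VARCHAR(255)), -1)) AS t"
)

for alias in open("dblist.txt").read().split():
    print("== %s ==" % alias)
    # The db2 calls share one shell, so they share one CLP back-end and connection.
    subprocess.call(
        'db2 connect to %s > /dev/null && db2 -x "%s"; db2 terminate > /dev/null'
        % (alias, QUERY),
        shell=True)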

Rant over. :-)

 
