Converting Paradox and dBASE Applications to InterBase

By: Conference Speaker

Abstract: This session shows how to convert an application built with Paradox tables to InterBase. Converting browsing, Locate, FindKey, SetRange, RecordCount, RecNo, and dataset handling is covered using an actual application.

This is the technical paper from a talk given at the 13th Annual Borland Developer's Conference
By Bill Todd – The Database Group, Inc.

Bill Todd is president of The Database Group, Inc., an InterBase and Delphi training and consulting firm based near Phoenix. He is co-author of four database programming books and more than 80 articles, and is a member of Team Borland, providing technical support on the Borland Internet newsgroups. He is a frequent speaker at Borland Conferences in both the U.S. and Europe. Bill is also a nationally known technical trainer and has taught Delphi and InterBase programming classes across the country and overseas. Bill can be reached at bill@dbginc.com.

Note: No portion of this paper may be distributed or reproduced by any means, or in any form, without the author's and/or Borland's prior written permission.

Can You Really Just Change the Alias and Go?

One of the promises of the Borland Database Engine (BDE) is that if you want to change databases you can simply change your BDE alias to point to the new database and you are done. Is it really that easy? The answer is both yes and no. You may be able to do this with a typical application written to use Paradox or dBASE tables, but you are not likely to be happy with the results.

Problems To Watch Out For

Moving the Data

Paradox allows spaces and other characters in table and field names that are not allowed in InterBase. InterBase object names are limited to alphanumeric characters and the underscore. If you have tables or fields with names that are unacceptable to InterBase you have two choices. You can change the names in the database and throughout your code, or, if you are using a dialect 3 InterBase database, and you certainly should for any new database you create, you can use quoted identifiers. Quoted identifiers make the names of tables, fields and other database objects case sensitive and allow the names to include characters that would not otherwise be allowed. However, you must always enclose these object names in double quotes in your SQL statements.
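For example, a Paradox table with a space in its name could be recreated in a dialect 3 database using quoted identifiers. This is only a sketch; the table and column names here are hypothetical.

```sql
CREATE TABLE "Order Details"
(
  "Order No"  INTEGER NOT NULL,
  "Line No"   INTEGER NOT NULL
);

/* Every later reference must repeat the double quotes and the exact case. */
SELECT "Order No", "Line No" FROM "Order Details";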

Using the DataPump may create the wrong field types in your InterBase database. For example, currency values in Paradox tables are stored as double precision floating point values. The DataPump will create double precision fields in InterBase, but this is not the best choice. Floating point notation cannot store most decimal fractions exactly. This can lead to cumulative errors in calculations, such as the result of a calculation being off by a penny. In InterBase you should store values that include decimal fractions and that require precision in NUMERIC fields. In dialect 3 databases the NUMERIC data type stores decimal fractions as scaled integers so they are always stored exactly.
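As a sketch, a currency column in the converted database would be declared like this (the table name is hypothetical):

```sql
CREATE TABLE INVOICE
(
  INVOICENO  INTEGER NOT NULL,
  /* Stored as a scaled integer in a dialect 3 database, so the
     two decimal places are always exact. */
  AMOUNT     NUMERIC(18,2)
);
```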

Security

You can password protect Paradox tables. While there are tools that will break Paradox table encryption, it is good enough to keep casual snoopers out of your data. InterBase provides no such protection of its own. InterBase was designed as a database server and relies on the operating system's security features to protect its files. If you want to secure the data on an InterBase server:

  1. Do not allow anyone but the DBA login rights to the server machine.
  2. Do not share any drives or folders.
  3. Make sure the machine is physically secure.

If you are using InterBase on a stand alone machine such as a notebook:

  1. Use Windows 2000 or later and use the operating system's file encryption features to secure the folders that contain your InterBase files.
  2. Do not allow the user access to the folders that contain the InterBase files.

Don't Use Table Components

Using TTable components to access data on a SQL database server produces poor performance. SQL database servers understand SQL, so all TTable actions must be translated by the BDE into SQL statements. This requires fetching metadata from the server and frequently requires sending multiple commands to the server. You can see this with SQL Monitor. The result is more network traffic and more commands that must be executed by the database server, which equals poor performance. TTable also operates on the entire table. Fetching large amounts of data from the database server leads to higher server workload and increased network traffic, which again leads to poor performance.

The dbExpress SQLTable component and the InterBase Express IBTable component are not as bad as TTable but they are still not the best choice. For good performance use components that execute SQL statements.

Get Rid of the BDE

The BDE has a number of disadvantages.

  1. It is large which increases the size of your application's deployment file.
  2. It is complex to deploy and configure.
  3. It is 40% slower than InterBase Express or dbExpress.
  4. It is no longer being developed by Borland.

If you want your converted application to provide the best performance with minimum distribution, configuration and support problems convert to InterBase Express or dbExpress. Since you need to replace all of your TTable components anyway this involves little extra work compared to the benefits you will receive.

Change Your Application Architecture to be Set Oriented

Applications written for desktop databases like Paradox and dBASE typically open tables and let the user find records by browsing, by entering search criteria and using the TTable FindKey, FindNearest, Locate or SetRange methods, or by setting the TTable Filter property.

Client applications written to work with SQL database servers normally start by presenting the user with a blank screen and requiring the user to enter some selection criteria. The values the user enters are used in the WHERE clause of a SQL SELECT statement to retrieve a small set of records. The user works with this subset of records then enters new selection criteria and selects a new set of records to work with. While there is no magic number that defines the maximum number of rows that you should let the user select, my rule of thumb is that the user should not be able to select more than about 200 rows and should normally be selecting a much smaller number. Working with small sets of records reduces both database server and network load and improves performance.
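A sketch of this style of query, using a hypothetical parameter supplied from the search form:

```sql
/* Returns only the customers whose company name starts with the
   string the user typed, rather than the whole table. */
SELECT CUSTNO, COMPANY, PHONE
FROM CUSTOMER
WHERE COMPANY STARTING WITH :SearchName
```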

Understand Transactions

Everything you do in InterBase takes place within a transaction. You cannot view, insert, update or delete a record without first starting a transaction. Although all of the components you can use to access InterBase databases will start and commit transactions automatically it is much better to explicitly start transactions in code and explicitly commit transactions in code. If you are working with ClientDataSets you do not need to start or commit transactions. Calling the ClientDataSet's ApplyUpdates method starts a transaction before applying the updates and either commits or rolls back the transaction after the updates have been applied.
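A minimal sketch of explicit transaction control with the InterBase Express components, assuming an IBTransaction component named ibTransaction:

```pascal
ibTransaction.StartTransaction;
try
  //...view, insert, update or delete records here...
  ibTransaction.Commit;
except
  //If anything goes wrong undo all of the work in the transaction.
  ibTransaction.Rollback;
  raise;
end; //try
```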

InterBase transactions include two unusual options called commit retaining and rollback retaining. Do not use them, at least not until you understand how to use them properly. Improper use of commit retaining and rollback retaining can disable garbage collection on the transaction inventory pages which, in turn, will cause increasing resource usage on the server and decreasing performance.

If You Use InterBase Express

There are a few additional things you will want to do if you use the InterBase Express components and ClientDataSets instead of dbExpress.

  1. Set the IBTransaction component's AutoStopAction property to saCommit. This ensures that the transaction will be committed when the dataset is closed by the Provider.
  2. Set the Unidirectional property of your IBQuery or IBDataSet components to True. This disables record buffering, which saves system resources and time. Record buffering is not necessary because the ClientDataSet buffers its own records.
  3. Leave the ClientDataSet's PacketRecords property set to the default of -1. This ensures that the IBDataSet or IBQuery will be closed and its transaction committed as soon as all records have been read.

Sample Application Features

The Paradox application that we are going to move to InterBase is a simple one using some of the tables in the DBDemos alias. It lets the user work with Customer, Order and Item records. Figure 1 shows the data module. The data module contains a Database component, seven Table components and three DataSource components. The first four Table components provide data from the Customer, Orders, Item and Parts tables. The tblPart component is used to provide data to a lookup field in tblItem that displays the Description that corresponds to the PartNo. The tblCustomer, tblOrder and tblItem components are linked via their MasterSource and MasterFields properties in a one-to-many-to-many relationship.


Figure 1 - the Paradox application's data module

The data tables do not use AutoInc fields for their primary keys. Instead they use long integers and store the last number used in three single-field, single-row tables. The bottom three Table components connect to these tables. The Utils.pas unit contains a function that is used to get the next id number safely in a multiuser environment.
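The Utils.pas code is not reproduced here, but the usual Paradox technique looks something like the following sketch; the function name and the use of the first field are assumptions for illustration. Calling Edit places a record lock on the single row, so two users cannot read and post the same value at the same time.

```pascal
function GetNextID(NextIDTable: TTable): Longint;
begin
  NextIDTable.Open;
  try
    {Edit locks the single row; a second user's Edit call cannot
     succeed until this user posts.}
    NextIDTable.Edit;
    Result := NextIDTable.Fields[0].AsInteger + 1;
    NextIDTable.Fields[0].AsInteger := Result;
    NextIDTable.Post;
  finally
    NextIDTable.Close;
  end;
end;
```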

Figure 2 shows the application's main form. The Customer tab shows the current Customer record and the grid shows the Order records for the customer. The Customer List tab shows all customers in a grid.


Figure 2 - the main form

The Search menu provides a Find option that displays the dialog shown in figure 3. Users can enter a customer number or a partial company name and find all matching customer records. Since customer number is the primary key, only one record will be found when searching by number. If a user supplies a string in the Company edit box, all records whose company name starts with that string will be found. The searching is done by applying a filter to the tblCustomer component.
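The filter expression is built along these lines (a sketch; the edit control name is hypothetical):

```pascal
//The trailing * makes the filter match any company name that
//starts with the string the user entered.
tblCustomer.Filter := 'Company = ' + QuotedStr(edtCompany.Text + '*');
tblCustomer.Filtered := True;
```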


Figure 3 - the Find Customer dialog

By right clicking the Orders grid or choosing Orders from the File menu users can display the Orders form shown in figure 4. This form shows the orders for the current customer in the top grid and the items for the current order in the lower grid. Note that the Items grid includes the Description field from the Part table. This is a lookup field added in the Fields Editor.


Figure 4 - the Orders form

Creating the InterBase Database

I created the InterBase database interactively in IBConsole. I then wrote a SQL script to create the tables and another to create the generators and ran them from IBConsole. The scripts are shown below. This script creates the tables and adds the primary and foreign key constraints.

CREATE TABLE CUSTOMER
(
  CUSTNO	INTEGER NOT NULL,
  COMPANY	VARCHAR(30),
  ADDR1	        VARCHAR(30),
  ADDR2	        VARCHAR(30),
  CITY	        VARCHAR(15),
  STATE	        VARCHAR(20),
  ZIP	        VARCHAR(10),
  COUNTRY	VARCHAR(20),
  PHONE	        VARCHAR(15),
  FAX	        VARCHAR(15),
  TAXRATE	DOUBLE PRECISION,
  CONTACT	VARCHAR(20),
  LASTINVOICEDATE	TIMESTAMP
);

ALTER TABLE CUSTOMER ADD CONSTRAINT PK_CUSTOMER PRIMARY KEY (CUSTNO);

CREATE TABLE ORDERS
(
  ORDERNO	INTEGER NOT NULL,
  CUSTNO	INTEGER NOT NULL,
  SALEDATE	TIMESTAMP,
  SHIPDATE	TIMESTAMP,
  EMPNO	        INTEGER NOT NULL,
  SHIPTOCONTACT	VARCHAR(20),
  SHIPTOADDR1	VARCHAR(30),
  SHIPTOADDR2	VARCHAR(30),
  SHIPTOCITY	VARCHAR(15),
  SHIPTOSTATE	VARCHAR(20),
  SHIPTOZIP	VARCHAR(10),
  SHIPTOCOUNTRY	VARCHAR(20),
  SHIPTOPHONE	VARCHAR(15),
  SHIPVIA	VARCHAR(7),
  PO	        VARCHAR(15),
  TERMS	        VARCHAR(6),
  PAYMENTMETHOD	VARCHAR(7),
  ITEMSTOTAL	NUMERIC(18,2),
  TAXRATE	DOUBLE PRECISION,
  FREIGHT	NUMERIC(18,2),
  AMOUNTPAID	NUMERIC(18,2)
);

ALTER TABLE ORDERS ADD CONSTRAINT PK_ORDERS PRIMARY KEY (ORDERNO);
ALTER TABLE ORDERS ADD CONSTRAINT FK_ORDERS_CUSTNO FOREIGN KEY (CUSTNO) 
  REFERENCES CUSTOMER (CUSTNO)
  ON DELETE CASCADE ON UPDATE CASCADE;

CREATE TABLE VENDOR
(
  VENDORNO	INTEGER NOT NULL,
  VENDORNAME	VARCHAR(30),
  ADDRESS1	VARCHAR(30),
  ADDRESS2	VARCHAR(30),
  CITY	        VARCHAR(20),
  STATE	        VARCHAR(20),
  ZIP	        VARCHAR(10),
  COUNTRY	VARCHAR(15),
  PHONE	        VARCHAR(15),
  FAX	        VARCHAR(15),
  PREFERRED	CHAR(1)
);

ALTER TABLE VENDOR ADD CONSTRAINT PK_VENDOR PRIMARY KEY (VENDORNO);

CREATE TABLE PART
(
  PARTNO	INTEGER NOT NULL,
  VENDORNO	INTEGER,
  DESCRIPTION	VARCHAR(30),
  ONHAND	INTEGER,
  ONORDER	INTEGER,
  COST	        NUMERIC(18,2),
  LISTPRICE	NUMERIC(18,2)
);

ALTER TABLE PART ADD CONSTRAINT PK_PART PRIMARY KEY (PARTNO);
ALTER TABLE PART ADD CONSTRAINT FK_PART_VENDORNO FOREIGN KEY (VENDORNO) 
  REFERENCES VENDOR (VENDORNO)
  ON DELETE CASCADE ON UPDATE CASCADE;

CREATE TABLE ITEM
(
  ORDERNO	INTEGER NOT NULL,
  ITEMNO	INTEGER NOT NULL,
  PARTNO	INTEGER NOT NULL,
  QTY	        INTEGER,
  DISCOUNT	DOUBLE PRECISION
);

ALTER TABLE ITEM ADD CONSTRAINT PK_ITEM PRIMARY KEY (ORDERNO, ITEMNO, PARTNO);
ALTER TABLE ITEM ADD CONSTRAINT FK_ITEM_ORDERNO FOREIGN KEY (ORDERNO) 
  REFERENCES ORDERS (ORDERNO)
  ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE ITEM ADD CONSTRAINT FK_ITEM_PARTNO FOREIGN KEY (PARTNO) 
  REFERENCES PART (PARTNO)
  ON DELETE CASCADE ON UPDATE CASCADE;

This script, CREATEGEN.SQL, creates the generators. The only reason for using two separate scripts is to allow me to easily drop and recreate the tables during testing. Since dropping the tables does not drop the generators, creating the generators in the same script that creates the tables would cause errors if the generators already exist.

CREATE GENERATOR GEN_CUSTNO;
CREATE GENERATOR GEN_ORDERNO;
CREATE GENERATOR GEN_VENDORNO;
CREATE GENERATOR GEN_PARTNO;
CREATE GENERATOR GEN_ITEMNO;

For testing, two other scripts, DROPTABLES.SQL and EMPTYTABLES.SQL, both shown below, are provided.

DROP TABLE ITEM;
DROP TABLE PART;
DROP TABLE VENDOR;
DROP TABLE ORDERS;
DROP TABLE CUSTOMER;

DELETE FROM ITEM;
DELETE FROM PART;
DELETE FROM VENDOR;
DELETE FROM ORDERS;
DELETE FROM CUSTOMER;

Another script, shown below, adds the triggers to the tables. Each table gets a before insert trigger that checks the value of the primary key and generates the next value if the primary key field is null. Although the converted client app calls the appropriate generator and adds the primary key to each new record I always add the triggers to generate the primary key in case someone adds a new row by some means other than the client application.

SET TERM ^ ;

CREATE TRIGGER CREATE_CUSTNO FOR CUSTOMER
ACTIVE BEFORE INSERT POSITION 10
AS
BEGIN
  IF (NEW.CUSTNO IS NULL) THEN
    NEW.CUSTNO = GEN_ID(GEN_CUSTNO, 1);
END^

CREATE TRIGGER CREATE_ORDERNO FOR ORDERS
ACTIVE BEFORE INSERT POSITION 10
AS
BEGIN
  IF (NEW.ORDERNO IS NULL) THEN
    NEW.ORDERNO = GEN_ID(GEN_ORDERNO, 1);
END^

CREATE TRIGGER CREATE_ITEMNO FOR ITEM
ACTIVE BEFORE INSERT POSITION 10
AS
BEGIN
  IF (NEW.ITEMNO IS NULL) THEN
    NEW.ITEMNO = GEN_ID(GEN_ITEMNO, 1);
END^

CREATE TRIGGER CREATE_PARTNO FOR PART
ACTIVE BEFORE INSERT POSITION 10
AS
BEGIN
  IF (NEW.PARTNO IS NULL) THEN
    NEW.PARTNO = GEN_ID(GEN_PARTNO, 1);
END^

CREATE TRIGGER CREATE_VENDORNO FOR VENDOR
ACTIVE BEFORE INSERT POSITION 10
AS
BEGIN
  IF (NEW.VENDORNO IS NULL) THEN
    NEW.VENDORNO = GEN_ID(GEN_VENDORNO, 1);
END^

SET TERM ; ^

These scripts and the InterBase database, CUSTDEMO.GDB, are in the IB subdirectory of the sample code for this paper. This directory also contains a SQL script that will delete all of the tables in the database. This is handy for testing.

Warning

In this database I used declarative referential integrity to link all of the tables. Be careful of this. If you have a lookup table that is used to validate the values entered into one of your main data tables and the lookup table contains a small number of values you should not use declarative referential integrity. When you use declarative referential integrity an index is automatically created on the foreign key. This index will have very low selectivity (a small number of unique values) and can actually slow a query if the optimizer does not recognize this and ignore the index. Even if the optimizer ignores the index you will incur a performance penalty from maintaining it.

A good example would be an order table with a million rows and a sales region table with three rows. Each unique index value would select roughly 300,000 rows. Clearly less disk I/O would be required to scan the table sequentially. In such a case it is much better to enforce referential integrity with triggers.

Converting the Data

The first step in converting the application is to move the data to an InterBase database. I am not going to spend much time on this phase. If you have the Enterprise version of Delphi you can attempt to do this with the DataPump. I chose not to in this case because both the primary key fields and the fields that contain dollar amounts are Number fields in the Paradox tables. The primary key fields in the InterBase database should be NUMERIC(18,0) to hold the values supplied by generators. Generators are the InterBase mechanism for providing unique autoincrementing keys, and they generate a 64 bit integer. The fields that hold currency values should be NUMERIC(18,2) in the InterBase database to avoid floating point imprecision. I could not get the DataPump to handle these conversions. In addition, when you pump data to an InterBase dialect 3 database the DataPump uses quoted identifiers for all of the table and column names. This makes the names case sensitive in InterBase. While this does not hurt anything, I find it an unnecessary annoyance when working with an InterBase database.

Another option is to export the Paradox data to fixed length ASCII files and import them into InterBase using external tables. This solution is limited to tables that do not contain blobs, and exporting the data is time consuming.
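As a sketch, an external table is declared over the exported file and its rows are copied into the real table with an INSERT ... SELECT. The file name and field widths here are hypothetical.

```sql
CREATE TABLE EXT_CUSTOMER EXTERNAL FILE 'C:\export\customer.txt'
(
  CUSTNO   CHAR(10),
  COMPANY  CHAR(30),
  CRLF     CHAR(2)  /* consumes the line terminator in the fixed length file */
);

INSERT INTO CUSTOMER (CUSTNO, COMPANY)
  SELECT CAST(CUSTNO AS INTEGER), COMPANY FROM EXT_CUSTOMER;
```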

There are third party tools for migrating data to InterBase. One that I have heard good things about is InterBase DataPump. This is a freeware tool which you can find at http://www.clevercomponents.com/products/ibdatapamp.asp. Clever Components, the makers of InterBase DataPump, will even convert your data for you for a fee. I have not tried InterBase DataPump but it is worth considering if you have many tables to convert.

The final option is to write your own data conversion program as I did. My program is called CopyData and is located in the Convert subdirectory of the code that accompanies this paper. I used the InterBase Express components for InterBase access and the BDE components for Paradox table access in the conversion program. It is not difficult to write a generic conversion program that you can easily reuse if the source and destination tables have the same structure. There are just two tasks to perform. The first is to loop through a list of tables and copy their data. The second is to loop through a list of tables whose primary keys will be set from generators in the InterBase database, find the highest value in the key field of each table, and then set the corresponding generator to that value.

The following figure shows the data module from the conversion program. The top row of components contains an IBDatabase, an IBTransaction and an IBSQL component used to access the destination InterBase database. The program dynamically creates the INSERT statement used to add the data to each table and assigns it to the IBSQL component. The INSERT statement for the Customer table is shown below. Note that the field names and parameter names are identical.

insert into CUSTOMER (
CUSTNO, COMPANY, ADDR1, ADDR2, CITY, STATE, ZIP, COUNTRY, PHONE, FAX, TAXRATE, CONTACT, LASTINVOICEDATE)
values (:CUSTNO, :COMPANY, :ADDR1, :ADDR2, :CITY, :STATE, :ZIP, :COUNTRY, :PHONE, :FAX, 
:TAXRATE, :CONTACT, :LASTINVOICEDATE)


Figure 5 - the CopyData program's data module

The second row of components contains a Database component and a BDE Table component. Two text files are used to control the program. The first, TABLELIST.TXT, shown below, contains the names of the source and destination tables in pairs in the form SourceTableName=DestinationTableName. You can create this file with any text editor.

Customer=Customer
Orders=Orders
Vendors=Vendor
Parts=Part
Items=Item

The second control file, GENLIST.TXT, contains information about the generators that must be set. Each line is in the form TableName,ColumnName,GeneratorName and the data is shown below.

CUSTOMER,CUSTNO,GEN_CUSTNO
ORDERS,ORDERNO,GEN_ORDERNO
VENDOR,VENDORNO,GEN_VENDORNO
PART,PARTNO,GEN_PARTNO
ITEM,ITEMNO,GEN_ITEMNO

The main form's Copy Data button calls the data module's CopyAllTables method shown below. This method calls the LoadListOfTables method, described next, then loops through the StringList that contains the names of the source and destination tables. For each non-blank entry in the TableList StringList the CopyData method is called, passing the source and destination table names as parameters. Since the source and destination table names are in the form SourceTableName=DestinationTableName, the two names can be easily accessed using the Names and Values properties of the StringList.

procedure TdmMain.CopyAllTables;
var
  I:         Integer;
begin
  {Load source and destination table names from the file TABLELIST.TXT into
   the TableList StringList. Entries in the text file are in the form
   SourceTableName=DestTableName.}
  LoadListOfTables;
  {Loop through all of the table names in TableList and copy the data. Skip
   any entries where the source or destination table name is blank.}
  for I := 0 to TableList.Count - 1 do
    if (TableList.Names[I] <> '') and (TableList.Values[TableList.Names[I]] <> '') then
      CopyData(TableList.Names[I], TableList.Values[TableList.Names[I]]);
end;

The list of tables to be converted is loaded into a StringList by the LoadListOfTables method shown below. The TableList StringList is a member of the data module. It is created in the data module's OnCreate event handler and is freed in the OnDestroy event handler. The file is loaded by calling the StringList's LoadFromFile method.

procedure TdmMain.LoadListOfTables;
const
  TABLE_LIST_FILE = 'TableList.txt';
var
  FileName: String;
begin
  FileName := ExtractFilePath(Application.ExeName) + TABLE_LIST_FILE;
  {If the file exists load it otherwise raise an exception.}
  if FileExists(FileName) then
  begin
    TableList.Clear;
    TableList.LoadFromFile(FileName);
  end else
    raise Exception.Create('There is no ' + TABLE_LIST_FILE + ' file.');
end;

The work of the conversion program is done by the data module's CopyData method. The CopyData method begins by calling CreateInsertStatement, shown below, passing the source and destination table names as parameters. CreateInsertStatement loops through the Fields property of the source BDE Table component and adds the name of each data field to both the list of fields and the list of parameters in the INSERT statement. When the INSERT statement has been constructed it is assigned to the SQL property of the ibsqlDest component. This method also adds the name of the field in the source table and the name of the field in the destination table to the SourceFieldList and DestFieldList StringLists; these lists are used later by the CopyData method. Because Paradox tables allow spaces in field names and InterBase does not, the source field name is passed to the CleanFieldName method, also shown below, to convert any spaces to underscores. It is because the source and destination field names may differ that they are recorded in the SourceFieldList and DestFieldList StringLists for later reference.

procedure TdmMain.CreateInsertStatement(SourceName, DestName: String);
{Create an INSERT statement that will insert all of the data fields into
 the destination table. Assign the INSERT statement to the ibsqlDest
 component.}
var
  Insert:        String;
  Fields:        String;
  Params:        String;
  FldName:       String;
  I:             Integer;
begin
  //Open the source table.
  tblSource.Close;
  tblSource.TableName := SourceName;
  tblSource.Open;
  //Clear the StringLists that will hold the field names.
  SourceFieldList.Clear;
  DestFieldList.Clear;
  //Initialize the INSERT statement.
  Insert := 'INSERT INTO ' + DestName + ' (';
  //Initialize the line that contains the VALUES clause.
  Params := 'VALUES (';
  //Loop through all of the fields in the source table.
  for I := 0 to tblSource.Fields.Count - 1 do
  begin
    {Only process data fields. You do not want lookup or calculated fields.}
    if tblSource.Fields[I].FieldKind = fkData then
    begin
      //If this is not the first data field add a comma and space.
      if Fields <> '' then
      begin
        Fields := Fields + ', ';
        Params := Params + ', ';
      end; //if
      //Get the field name with spaces converted to underscores.
      FldName := CleanFieldName(tblSource.Fields[I].FieldName);
      //Add the field names to the StringLists.
      SourceFieldList.Add(tblSource.Fields[I].FieldName);
      DestFieldList.Add(FldName);
      //Add the field name to the list of fields and the list of parameters
      //in the INSERT statement.
      Fields := Fields + FldName;
      Params := Params + ':' + FldName;
    end; //if
  end; //for
  //Add the closing paren to the lists of fields and parameters.
  Fields := Fields + ')';
  Params := Params + ')';
  //Add the INSERT statement to the destination IBSQL component.
  ibsqlDest.SQL.Clear;
  ibsqlDest.SQL.Add(Insert);
  ibsqlDest.SQL.Add(Fields);
  ibsqlDest.SQL.Add(Params);
end;

function TdmMain.CleanFieldName(FieldName: String): String;
{Replaces all spaces in the field name with underscores.}
var
  I:        Integer;
begin
  Result := FieldName;
  for I := 1 to Length(Result) do
    if Result[I] = ' ' then Result[I] := '_';
end;

The CopyData method is called once for each table to be converted. The source and destination table names are passed as parameters. CopyData calls CreateInsertStatement, then starts a transaction, opens the source Table component and prepares the query. The while loop iterates through all of the rows in the Table component. The for loop iterates through the SourceFieldList StringList and assigns the value of each source field to the parameter in the INSERT statement whose name is stored at the corresponding position in the DestFieldList StringList. After the parameter values have been assigned the code executes the query and moves to the next row in the source Table component. After all of the rows have been processed the method closes the Table component and commits the transaction. If anything goes wrong the transaction is rolled back and the exception is re-raised.

procedure TdmMain.CopyData(SourceName, DestName: String);
var
  I:             Integer;
begin
  CreateInsertStatement(SourceName, DestName);
  ibtranDest.StartTransaction;
  try
    {Open the source TTable.}
    tblSource.Open;
    {Prepare the query.}
    if not ibsqlDest.Prepared then ibsqlDest.Prepare;
    {Loop through all records in the source table.}
    while not tblSource.Eof do
    begin
      {Loop through the parameter names and assign the value from the
       source field with the same name as the parameter to the parameter
       in the INSERT statement.}
      for I := 0 to SourceFieldList.Count - 1 do
        ibsqlDest.Params.ByName(DestFieldList[I]).Value :=
          tblSource.FieldByName(SourceFieldList[I]).Value;
      {Insert the row into the InterBase table.}
      ibsqlDest.ExecQuery;
      {Move to the next record in the source table.}
      tblSource.Next;
    end; //while
    tblSource.Close;
    {If all of the tables are copied successfully commit the transaction.}
    ibtranDest.Commit;
  except
    {If anything goes wrong rollback the transaction.}
    ibtranDest.Rollback;
    raise;
  end; //try
end;

Because of its architecture you should be able to use this program to convert Paradox or dBASE tables to InterBase by just creating the TABLELIST.TXT and GENLIST.TXT files. There are just two limitations: first, the source and destination tables must contain the same fields; second, the only difference in field names must be that spaces in source field names are converted to underscores in the corresponding InterBase table.

Converting the Data Module

The data module is a good place to start the conversion because once you change your dataset components the compiler will lead you to any other areas of your program that require changes. Figure 6 shows the converted data module. In this example I am converting the application to use the dbExpress components. The only differences in converting to the InterBase Express components were described earlier. Begin by adding the following components to the data module.

  • SQLConnection.
  • 3 SQLDataSets for the Customer, Orders and Item tables.
  • 3 SQLDataSets for the next id numbers for the Customer, Orders and Item tables.
  • One DataSource to link sdsCustomer and sdsOrder.
  • 2 Providers for Customer and Item.
  • 3 ClientDataSets for Customer, Orders and Item.


Figure 6 - the converted data module

Set the properties of all of these components to link them in the usual way. Remember to set the DataSource property of sdsOrder to the dsCustomerLink DataSource component so the Order table will appear as a nested dataset in the Customer data. In this case I am not including the Item table as a nested dataset since it is displayed on another form and there is no need to incur the overhead of getting the item records unless the user opens the Order form. Change the DataSet properties of the existing DataSource components to the Customer, Order and Item ClientDataSets.

You can now delete the Database component and all of the BDE Table components. Make sure that you also remove the DBTABLES unit from the uses clause so the application will not require the BDE to be installed in order to run.

Next, add the SQL statements to the SQLDataSet components. For sdsCustomer use:

select * from Customer where CustNo = -1

This statement will have to be changed as the application runs because the user can select customers by their customer number or by company name. To allow the SQL statement to be changed via the ClientDataSet's CommandText property, set the poAllowCommandText flag in the provCustomer Provider component's Options property to True. The purpose of this statement is to return no records when the application starts. For sdsOrder use the following statement:

select * from ORDERS where CustNo = :CustNo

This will select the Order records that belong to the customer records returned by sdsCustomer. Note that the parameter, :CustNo, must have the same name as the CustNo field in the Customer table. The SQL statement for sdsItem is:

select I.*, P.Description
from ITEM I join PART P
on I.PARTNO = P.PARTNO
where I.ORDERNO = :OrderNo

It is much more efficient to let the server look up the Description field value in the Part table as part of the query instead of using a lookup field, but it does require some extra work in the client application. First, open the Fields Editor for sdsItem and add all field objects. Next, select the Description field and set all of its ProviderFlags to False so that this field will not be included in the WHERE clause or the list of fields to update. Next, select the Provider, provItem, and create an OnGetTableName event handler as shown below. The provider cannot determine which table to update in a join, so you need the OnGetTableName event handler to tell it.

procedure TdmMain.provItemGetTableName(Sender: TObject; DataSet: TDataSet;
  var TableName: String);
begin
  TableName := 'ITEM';
end;
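The ProviderFlags step can also be done in code instead of the Object Inspector. The following is a minimal sketch, assuming the field objects for sdsItem have already been created; the SetDescriptionFlags method name is hypothetical:

```pascal
procedure TdmMain.SetDescriptionFlags;
begin
  // Clear all ProviderFlags (pfInKey, pfInWhere, pfInUpdate) on the joined
  // Description field so the provider never includes it in the WHERE clause
  // or in the list of fields to update in the ITEM table.
  sdsItem.FieldByName('Description').ProviderFlags := [];
end;
```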

Moving to the ClientDataSet components, start by adding the field objects to the cdsCustomer component. Next, set the DataSetField property of cdsOrder to sdsOrder then add field objects to cdsOrder. Finally add field objects to cdsItem and set the ReadOnly property of the Description field object to True so users cannot edit the lookup field.

The last properties you need to set are the CommandText properties of the SQLDataSets used to supply the next primary key value for each of the tables. Since the SQL statements are identical for all three except for the name of the generator, only the statement for sdsNextCustNo is shown below. Executing this query returns the next value from the GEN_CUSTNO generator.

select GEN_ID(GEN_CUSTNO, 1) as NextCustNo from rdb$database
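If the generators do not already exist in the database, they can be created with isql. The paper names only GEN_CUSTNO; the Order and Item generator names below are assumptions that follow the same naming pattern, and the starting values should be set above the highest key values migrated from the Paradox tables:

```sql
CREATE GENERATOR GEN_CUSTNO;
CREATE GENERATOR GEN_ORDERNO;
CREATE GENERATOR GEN_ITEMNO;

/* Seed each generator past the largest existing key value
   so new keys never collide with migrated rows. */
SET GENERATOR GEN_CUSTNO TO 10000;
SET GENERATOR GEN_ORDERNO TO 10000;
SET GENERATOR GEN_ITEMNO TO 10000;
```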

Changing the Code

The following sections describe the code changes required to complete the conversion.

Data Module

Start by removing the SetCustomerFilter method. It is no longer required since record selection will be done with SQL statements. Next, change the FindCustomerByName and FindCustomerByNumber methods as shown below. The old code has been left in the methods as comments to make the changes clear; this is true of all of the methods that will be changed.

procedure TdmMain.FindCustomerByName(CompanyName: String);
{Function:  Sets the Customer table index to ByCompany.
            Sets a filter on the Company field.
 Called By: frmFindCust OK button OnClick.}
begin
  //frmMain.SetCustIndexToCompany;
  //SetCustomerFilter('Company = ' + QuotedStr(CompanyName + '*'));
  cdsCustomer.Close;
  cdsCustomer.CommandText :=
    'SELECT * FROM CUSTOMER WHERE COMPANY STARTING WITH ' + QuotedStr(CompanyName);
  cdsCustomer.Open;
end;

procedure TdmMain.FindCustomerByNumber(CustomerNumber: String);
{Function:  Sets the Customer table index to primary index.
            Sets a filter on the CustNo field.
 Called By: frmFindCust OK button OnClick.}
begin
  //frmMain.SetCustIndexToCustNo;
  //SetCustomerFilter('CustNo = ' + CustomerNumber);
  cdsCustomer.Close;
  cdsCustomer.CommandText :=
    'SELECT * FROM CUSTOMER WHERE CUSTNO = ' + CustomerNumber;
  cdsCustomer.Open;
end;

Both of these methods have been changed to close the Customer ClientDataSet, change the SQL statement in the ClientDataSet's CommandText property, then reopen the ClientDataSet. This causes the ClientDataSet to send the new SQL statement to the Provider which, in turn, sends it to its SQLDataSet component. The Provider then opens the SQLDataSet, retrieves the new set of records, returns them to the ClientDataSet and closes the SQLDataSet. This lets the database server do the record selection and reduces network traffic to just the rows returned by the query.
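This CommandText swap only reaches the SQLDataSet if the provider permits it. The poAllowCommandText option can be set in the Object Inspector as described earlier, or in code; the following is a sketch using the provCustomer component from the text, with a hypothetical helper name:

```pascal
procedure TdmMain.EnableCommandText;
begin
  // Add poAllowCommandText to the provider's Options set so SQL assigned
  // to the ClientDataSet's CommandText is passed through to sdsCustomer.
  provCustomer.Options := provCustomer.Options + [poAllowCommandText];
end;
```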

You also need to remove the ShowAllCustomers method that was used to remove the filter in the Paradox application. It is not needed since the InterBase application will never show all of the records in the Customer table. After you delete the method, delete the ShowAllRecords action from the ActionList component on the main form and remove the Show All Records item from the Search menu on the main form.

The next step is to change the three OnNewRecord event handlers as shown below. The new code runs the query that returns the next value from the appropriate generator, assigns the new value to the key field, then closes the query. I also renamed these event handlers to correspond to the ClientDataSets they will be attached to. Once you have changed the code, select the cdsCustomer, cdsOrder and cdsItem ClientDataSets and assign their OnNewRecord event handlers to the modified methods using the Object Inspector. Finally, remove the Utils unit from the project and from the uses clause since the function it contains for generating unique values is no longer required.

procedure TdmMain.sdsCustomerNewRecord(DataSet: TDataSet);
begin
  //tblCustomer.FieldByName('CustNo').AsInteger :=
  //  dgGetNextUniqueNumber(tblNextCust);
  sdsNextCustNo.Open;
  cdsCustomerCustNo.AsInteger :=
    sdsNextCustNo.FieldByName('NextCustNo').AsInteger;
  sdsNextCustNo.Close;
end;

procedure TdmMain.sdsOrderNewRecord(DataSet: TDataSet);
begin
  //tblOrder.FieldByName('OrderNo').AsInteger :=
  //  dgGetNextUniqueNumber(tblNextOrder);
  sdsNextOrderNo.Open;
  cdsOrderOrderNo.AsInteger :=
    sdsNextOrderNo.FieldByName('NextOrderNo').AsInteger;
  sdsNextOrderNo.Close;
end;

procedure TdmMain.sdsItemNewRecord(DataSet: TDataSet);
begin
  //tblItem.FieldByName('ItemNo').AsInteger :=
  //  dgGetNextUniqueNumber(tblNextItem);
  sdsNextItemNo.Open;
  cdsItemItemNo.AsInteger :=
    sdsNextItemNo.FieldByName('NextItemNo').AsInteger;
  sdsNextItemNo.Close;
end;

In order to reduce server load and network traffic we are not going to open the Item ClientDataSet unless the user opens the Order form. This requires changing the data module's OnCreate event handler as follows so that only the Customer and Order ClientDataSets are opened.

procedure TdmMain.DataModuleCreate(Sender: TObject);
var
  I:         Integer;
begin
  {Open all TTables.}
  {
  for I := 0 to ComponentCount - 1 do
    if Components[I] is TDataSet then
      TDataSet(Components[I]).Open;
  }
  cdsCustomer.Open;
  cdsOrder.Open;
end;

The Main Form

The next change to the main form is the code called by the radio buttons that control the sort order of the Customer records. Begin by removing the SetCustIndexToCompany and SetCustIndexToCustNo methods. Next, change the OnClick event handlers of the two radio buttons as shown below.

procedure TfrmMain.rbByCustNoClick(Sender: TObject);
begin
  //SetCustIndexToCustNo;
  rbByName.Checked := False;
  dmMain.cdsCustomer.IndexFieldNames := 'CUSTNO';
end;

procedure TfrmMain.rbByNameClick(Sender: TObject);
begin
  //SetCustIndexToCompany;
  rbByCustNo.Checked := False;
  dmMain.cdsCustomer.IndexFieldNames := 'COMPANY';
end;

This code controls the sort order of the records in the Customer ClientDataSet by setting its IndexFieldNames property to the name of the field to sort by. This requires one more change in the data module: set the IndexFieldNames property of cdsCustomer to CustNo so the default sort order corresponds to the initial state of the radio buttons.

The last change to the main form is to add the Save Changes button to the toolbar. The code for the button's OnClick event handler is shown below. This code checks the ClientDataSet's ChangeCount property and, if there are changes, calls the ApplyUpdates method to send them to the database. Finally, the method calls Refresh to reread the records from the server so updates made by other users will be seen.

procedure TfrmMain.SpeedButton1Click(Sender: TObject);
begin
  if dmMain.cdsCustomer.ChangeCount > 0 then
  begin
    dmMain.cdsCustomer.ApplyUpdates(0);
    dmMain.cdsCustomer.Refresh;
  end; //if
end;

The Order Form

The changes to the Order form are minor.

  1. Add an OnCreate event handler that opens the cdsItem ClientDataSet.
  2. Call cdsItem.Close in the form's OnClose event handler.
  3. Add a Save Changes button similar to the one added to the Customer form.
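Steps 1 and 2 might look like the following sketch; the exact handler names depend on how you wire the form's events in the Object Inspector:

```pascal
procedure TfrmOrders.FormCreate(Sender: TObject);
begin
  // Fetch the item records only when the Order form is actually used,
  // to avoid the overhead of loading them at application startup.
  dmMain.cdsItem.Open;
end;

procedure TfrmOrders.FormClose(Sender: TObject; var Action: TCloseAction);
begin
  // Release the item records when the form closes.
  dmMain.cdsItem.Close;
end;
```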

The OnClick event handler for the Save Changes button is shown below.

procedure TfrmOrders.SpeedButton1Click(Sender: TObject);
begin
  if dmMain.cdsOrder.ChangeCount > 0 then
  begin
    dmMain.cdsCustomer.ApplyUpdates(0);  //Order is nested in Customer.
    dmMain.cdsCustomer.Refresh;
  end; //if
  if dmMain.cdsItem.ChangeCount > 0 then
  begin
    dmMain.cdsItem.ApplyUpdates(0);
    dmMain.cdsItem.Refresh;
  end; //if
end;
