This is the technical paper from a talk given at the 12th Annual Borland Developer's Conference by Bill Todd, The Database Group, Inc.
Bill Todd is President of The Database Group, Inc., a database consulting and development firm based near Phoenix. He is co-author of four database
programming books and over 80 articles, and is a member of Team Borland, providing technical support on the Borland Internet newsgroups. He is a frequent speaker at Borland Developer Conferences in the U.S. and Europe. Bill is also a nationally known trainer and has taught programming classes across the country and overseas.
Note: No portion of this paper may be distributed or reproduced by any means, or in any form, without the author's and/or Borland's prior written permission.
Introduction
Problems With Interbase Express
Using ClientDataSet
The Provide/Resolve Strategy
Building the Data Module
Building the User Interface
Editing Data
Updating Datasets That Cannot Be Updated
Handling Update Errors
Sorting Data On-the-Fly
Showing Record Status
Filtering Data
Using Local Lookup Tables
Maintained Aggregates
Cloning Datasets
Saving Data Locally
Using ClientDataSets as In-Memory Tables
Deploying ClientDataSet
Summary
Introduction
Now that the ClientDataSet component is available in the Professional as well as the Enterprise versions of Delphi and C++Builder, it is time to look at the advantages of using ClientDataSets in combination with the Interbase Express components to build your Interbase applications. As you will see later in this paper, there are many advantages to using ClientDataSets in your applications; indeed, this is the reason that the ClientDataSet component is the foundation of Borland's new dbExpress architecture.
There are also many reasons for using Interbase Express (IBX) to build your
Interbase applications. First, IBX is much smaller and easier to deploy than
the BDE. Second, IBX provides faster access to Interbase databases than the
BDE. Finally, Interbase Express provides access to features of Interbase that
are not available through the BDE such as multiple simultaneous transactions, a
wider variety of transaction isolation levels and access to CommitRetaining and
RollbackRetaining.
Problems With Interbase Express
The single biggest problem that developers encounter when working with the
IBX components is that all access to data in an Interbase database must take
place in the context of a transaction. This means that you cannot even read and
display the contents of a record without starting a transaction. The problem is
that when you commit or rollback the transaction all of the datasets attached
to the IBTransaction component are closed and the user is suddenly looking at a
blank screen with no data displayed.
You can see this effect in the IBXDemo program that accompanies this paper.
The following screen shows the main form after the Open button has been
clicked. The next screen shows the main form after clicking the Commit button.


Of course there are ways to deal with this, and the IBXDemo application shows one of them. If you check the Reopen checkbox before you click the Commit button, the program code saves the primary key values of the employee and salary history tables, commits the transaction, reopens both IBDataSets and locates the records that were current at the time of the commit. The same thing happens if you click the Rollback button. This works, but it is a lot of extra work.
Another solution is to follow the normal client/server design philosophy of starting the user off with a blank form and requiring him/her to enter some selection criteria so you can fetch a modest number of records for the user to work with. The user can then make whatever changes are required to the set of records and commit the changes. When the changes are committed the user is back to a blank form and must select another set of records. This approach
requires less code but can lead to unacceptably long transactions. For example,
suppose a user needs to make changes to several hundred customer records in the
state of Arizona. If the user has the ability to select all of the customers in
Arizona at once he/she may spend several hours working with those records
before finishing. This causes two potential problems. First, if the user's
workstation crashes all of the changes will be rolled back and all of the
user's work will be lost. Second, the transaction may exist for several hours
with several hundred records changed and locked so that they cannot be updated
by other users.
Another solution that many users discover is to call CommitRetaining or
RollbackRetaining instead of Commit or Rollback. This seems like an ideal
solution at first because it does not close the datasets and allows the user to
continue to view and modify the selected set of records. However,
CommitRetaining and RollbackRetaining do not end your transaction. That means
that if you are using snapshot transaction isolation you cannot see any changes
made by other users until you finally commit or rollback. It also means that
you have disabled Interbase's garbage collection by forcing it to maintain all
transaction and record version information since your transaction started. This
can cause severe performance problems in a multi-user system with many users
making many changes. This is clearly a case where the cure is worse than the disease.
Using ClientDataSet
The ClientDataSet component is one of the Midas components used to implement
the provide/resolve strategy of database editing that was first introduced as
part of Midas in Delphi 3. Before we get into the details of building an
application that uses the DataSetProvider, ClientDataSet and IBX components it
is worthwhile to examine the basics of the provide/resolve strategy.
The Provide/Resolve Strategy
The provide/resolve mechanism uses a DataSetProvider component to provide
data supplied by a dataset component to a ClientDataSet. The ClientDataSet
component holds the records received from the provider in memory. Users can
insert, delete or update records and the changes to the data are also held in
memory by the ClientDataSet. When the user is finished working with the current
set of records a call to the ClientDataSet's ApplyUpdates method sends the
changes back to the provider. The provider then generates the SQL statements
necessary to apply the changes to the database, starts a transaction, sends the
SQL statements to the database server and, if there are no errors, commits the
transaction.
From this very basic explanation you can see the advantages of this
architecture. First, transactions are very short lived. When you open a
ClientDataSet that is connected to a DataSetProvider, the provider in turn opens the dataset component it is connected to. The dataset component executes its SQL SELECT statement and returns the records to the provider, which sends the records to the ClientDataSet, closes the dataset and commits the read transaction. When you call the ClientDataSet's ApplyUpdates method the
changes are sent to the provider which generates the necessary SQL statements,
starts a transaction, sends the SQL statements to the database server and
commits the transaction. This makes both the read and update transactions very
short which means that other users are not blocked by record locks held by long
running update transactions. Short transactions also means that the number of
transactions open at any one time will be smaller and, therefore, the load on
the database server will be less.
The second big advantage of this strategy is that it gives you control of
how much data you work with. Since you control the SQL SELECT statement that is
executed you control how many records are fetched from the database server at
any one time which gives you control of how much network traffic you generate
and how much memory you use on the client PC for buffering records. This can be
particularly valuable if you are developing an application which will be used
over a WAN or other low speed link.
Building the Data Module
The following figure shows the data module from the sample CDSIBX
application that accompanies this paper.

This data module is designed to support a form that shows the one-to-many
relationship between the Employee table and the Salary_History table. The
employees to view are selected by department. In addition, the Employee dataset
includes a lookup field on the Dept_No field which displays the name of the
department. The IBDatabase component in the upper left corner is configured
exactly the same as it would be in an application that uses only Interbase
Express components.
Next come three IBTransaction components, one for each table the application uses. When using ClientDataSets you must provide a separate transaction component for each table since different datasets could be interacting with the database simultaneously. Next comes an IBQuery component for each table. Note that the Unidirectional property of all of the IBQuery components is set to True to reduce memory consumption and improve performance. Since these query components are only used to supply the result set of the query to the DataSetProvider components, there is never any need to scroll backward in the result set. Although the IBQuery component provides a read-only result set, that is all that is necessary since updates to the database are handled automatically by the DataSetProvider component. The SQL property of the EmpQry
component is set to
select * from EMPLOYEE
where Dept_No = :Dept_No
order by Last_Name, First_Name
so the records for a single department can be selected. The SQL statement
for the SalHistQry component is:
select * from SALARY_HISTORY
where Emp_No = :Emp_No
order by Change_Date desc
In this case the parameter, :Emp_No, has the same name as the primary key
field in the Employee table and the DataSource property of the SalHistQry
component is set to EmpSrc, the DataSource component that is connected to the
EmpQry component. This causes the SalHistQry component to get its parameter
value from the current record of EmpQry. This fetches just the salary history
records for the employees that were selected. The SQL property of the DeptQry
component is
select DEPT_NO, DEPARTMENT from DEPARTMENT
order by Dept_No
which selects all of the department numbers and names.
A single DataSetProvider component is provided for the EmpQry component. The
DataSet property for the provider is set to EmpQry. No provider is required for
the SalHistQry because its records will be embedded in a dataset field in the
records supplied by the EmpQry's DataSetProvider. This happens automatically
for any child dataset component that is linked to its master dataset component
through the child dataset component's DataSource property. A second
DataSetProvider component is linked to the DeptQry component to provide the
department records.
Finally, three ClientDataSet components are used to supply data to the
application's user interface. The first one, EmpCds, is connected to the
EmpProv provider by setting the ClientDataSet's ProviderName property to
EmpProv. If you have used ClientDataSet's in multitier applications note that
the RemoteServer property is left blank since the provider is in the same
application as the ClientDataSet. After connecting the EmpCds ClientDataSet to
its provider the next step is to open the Fields Editor and add all of the
field objects as shown below.

Notice the last field object, SalHistQry. This is a field object whose type
is TDataSetField and it contains the salary history records returned by the
SalHistQry dataset. You must instantiate the field objects in the Fields Editor so this field object will be available; otherwise you will not be able to access the salary history records. Now you can add the SalHistCds component and set its DataSetField property to SalHistQry using the drop-down list. This connects
the SalHistCds ClientDataSet to the SalHistQry dataset field in the employee
dataset. The third ClientDataSet, DeptCds, is connected to the DeptProv
DataSetProvider component via its ProviderName property. Finally, two
DataSource components are connected to the EmpCds and SalHistCds
ClientDataSets.
Building the User Interface
Since we are interested in the data access components, the user interface in the sample CDSIBX application is nothing fancy. The main form, as it stands at this point, is shown below.

The form contains two grids and two DBNavigators to display the employee and
salary history records respectively. To make this more like a normal
client/server application where the user only works with a subset of records at
one time there is a combo box that displays the departments. Selecting a
department from the combo box automatically selects the employees for that
department using code in the combo box's OnChange event handler, shown below.
procedure TMainForm.DeptComboChange(Sender: TObject);
begin
  with MainDm do
  begin
    EmpCds.Close;
    EmpCds.Params.ParamByName('Dept_No').AsString :=
      Copy(DeptCombo.Items[DeptCombo.ItemIndex], 1, 3);
    EmpCds.Open;
    SalHistCds.Open;
  end; //with
end;
This code closes the EmpCds ClientDataSet, assigns the first three
characters of the selected item in the combo box to EmpCds' Dept_No parameter
and reopens EmpCds. The Params property of the ClientDataSet is a list of
TParam objects and each TParam object represents one of the parameters in the
SQL statement of the dataset component the ClientDataSet's DataSetProvider is
linked to, in this case EmpQry. This means that you can easily change the
value(s) of the underlying SQL statement's parameter(s) and fetch a new set of
records at any time.
If you need more flexibility you can use the ClientDataSet's CommandText
property to change the entire SQL statement. Suppose you want to be able to
select employees by either department or grade. The sample application has a
Select option on the main menu that offers these two choices. The form actually
contains two panels positioned on top of each other. One contains the combo box
for choosing a department and the other contains an edit box to allow the user
to enter a grade. When the application starts the department panel is visible
and the grade panel is not. Here is the code for the OnExecute event handler of
the EmpByGrade action item that is attached to the Employee By Grade menu item.
procedure TMainForm.EmpByGradeExecute(Sender: TObject);
begin
  DeptPanel.Visible := False;
  GradePanel.Visible := True;
  with MainDm.EmpCds do
  begin
    Close;
    with MainDm.EmpQry.SQL do
    begin
      Clear;
      Add('SELECT * FROM EMPLOYEE WHERE JOB_GRADE = :JOB_GRADE ORDER BY LAST_NAME, FIRST_NAME');
    end; //with
    Params.Clear;
    FetchParams;
    GradeEdit.SetFocus;
  end; //with
end;
This event handler starts by making the department panel invisible and the
grade panel visible. Next, it closes the EmpCds ClientDataSet, clears the
IBQuery component's SQL property and adds a new SQL statement to the SQL
property. Finally the code clears the Params property to remove the Dept_No
Param object and calls the ClientDataSet's FetchParams method to create the
parameter objects for the new SQL statement. To select employees by grade just
type a grade number in the edit box and press Enter. The following code from
the TEdit's OnKeyDown event handler closes the ClientDataSet, assigns the grade
to the Job_Grade parameter and reopens the ClientDataSet.
procedure TMainForm.GradeEditKeyDown(Sender: TObject; var Key: Word;
  Shift: TShiftState);
var
  Grade: Integer;
begin
  if Key = VK_RETURN then
  begin
    with MainDm.EmpCds do
    begin
      try
        Grade := StrToInt(GradeEdit.Text);
      except
        Grade := 0;
      end; //try
      Close;
      Params.ParamByName('Job_Grade').AsInteger := Grade;
      Open;
      MainDm.SalHistCds.Open;
    end; //with
  end; //if
end;
Editing Data
As you insert, delete and modify records, the changes you make are stored in the ClientDataSet's Delta property. To send these changes to the database you must call the ClientDataSet's ApplyUpdates method. Until you call ApplyUpdates you can undo the changes in reverse order, undo all changes to the current record or undo all changes to all records. The Edit menu in the sample application offers all of these choices, and the code from the OnExecute event handlers of the action items associated with each menu choice is shown below.
procedure TMainForm.UndoCurrentRecordExecute(Sender: TObject);
{Undo all changes for the current record.}
begin
  MainDm.EmpCds.RevertRecord;
end;

procedure TMainForm.UndoLastChangeExecute(Sender: TObject);
{Undo the last change and move to the affected record.}
begin
  MainDm.EmpCds.UndoLastChange(True);
end;

procedure TMainForm.CancelChangesExecute(Sender: TObject);
{Cancel all updates.}
begin
  MainDm.EmpCds.CancelUpdates;
end;

procedure TMainForm.ApplyUpdatesExecute(Sender: TObject);
{If there are unapplied updates in the Delta property apply them.}
begin
  with MainDm.EmpCds do
  begin
    if ChangeCount > 0 then
    begin
      ApplyUpdates(-1);
      Refresh;
    end; //if
  end; //with
end;
The UndoLastChange method, shown in the second event handler above, takes a single Boolean parameter. If the parameter is True, the last change is undone and that record becomes the current record (unless it was a newly inserted record, which no longer exists once the insertion is undone). If the parameter is False, the last change is undone but the current record does not change, unless undoing the last change removes the current record because it is a newly inserted record.
The last event handler above calls the ApplyUpdates method. Before calling
ApplyUpdates the code checks to see if the ChangeCount property is greater than
zero. ChangeCount returns the number of unapplied changes in the Delta property.
If ChangeCount is zero there are no updates to apply so there is no reason to
call ApplyUpdates.
ApplyUpdates takes a single parameter which is the number of errors to allow
before aborting the update process. Suppose there are ten records in Delta
waiting to be applied and you call ApplyUpdates with the MaxErrors parameter
set to one. If the second record causes an error the update process will stop
and no attempt will be made to apply the other updates to the database. If the
MaxErrors parameter is set to minus one there is no limit to the number of
errors that will be allowed. Every update will be attempted even if all of them
cause an error.
ApplyUpdates prevents updating records that have been changed by another
user since you read the records by including a WHERE clause in all of the
INSERT, DELETE and UPDATE statements generated by the DataSetProvider. The
UpdateMode property of the DataSetProvider controls which fields are included
in the WHERE clause. UpdateMode can be set to upWhereAll (the default),
upWhereChanged or upWhereKeyOnly. If UpdateMode is set to upWhereAll all of the
non-BLOB fields in the record are included in the WHERE clause. This means you
will get an error when you apply updates if another user has changed any field
in the record even if the field the other user changed is not a field that you
changed. Using upWhereChanged includes only the fields you have changed in the
WHERE clause so you will get an error only if another user has changed a field
that you also have changed. This is the setting you will probably want to use
in most cases. The upWhereKeyOnly option includes only the primary key fields in the WHERE clause. This means that if another user changes a field and you change the same field, your changes will overwrite the other user's changes. There is, however, one limitation. The WHERE clause never includes BLOB fields, so if another user has changed a record since you read it and the only change is to a BLOB field, you will not be warned when you update the record.
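Returning to the UpdateMode property, the following one-liner is a sketch of selecting the less restrictive mode at run time. EmpProv is the provider from the sample data module; you could just as easily set the property in the Object Inspector.
// Include only the fields the client actually changed in the generated WHERE clause.
MainDm.EmpProv.UpdateMode := upWhereChanged;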
Updating Datasets That Cannot Be Updated
Working with records supplied by a stored procedure on the server has one
major problem. You cannot update the records. However, you can edit datasets
from a stored procedure when you use a ClientDataSet and DataSetProvider. This
is easy because the DataSetProvider already generates SQL statements to perform
the updates. The only problem is that when records come from a stored procedure
the provider has no way to find the name of the table it should update.
Fortunately, the DataSetProvider has an OnGetTableName event that fires when it
is about to generate the SQL statements to apply updates. All you have to do is
write a one line event handler that assigns the name of the table to update to
the event's TableName parameter, and you can now apply updates to records supplied by a stored procedure.
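For example, a minimal OnGetTableName event handler might look like the sketch below. The handler name and the EMPLOYEE table are assumptions for illustration; in the sample application you would attach it to the EmpProv provider.
procedure TMainDm.EmpProvGetTableName(Sender: TObject; DataSet: TDataSet;
  var TableName: String);
begin
  // Tell the provider which table its generated INSERT, UPDATE and DELETE
  // statements should target.
  TableName := 'EMPLOYEE';
end;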
This technique is also useful for SQL SELECT statements that would not
normally be updateable. Suppose you have a multi-table join query that is
necessary so the user can see information from two or more tables but the user
only needs to make changes to fields from one table. All you have to do is
instantiate field objects at design time for the ClientDataSet and set the
ReadOnly property for all of the fields the user cannot change. Then create an
OnGetTableName event handler for the DataSetProvider and supply the name of the
table to which the updates should be applied.
Life is not as easy if you have a multi-table join and users need to change
records in two or more tables. In this case you will need to write a
BeforeUpdateRecord event handler for the DataSetProvider component. The Delta property of the ClientDataSet is passed to this event handler in its DeltaDs parameter. The type of the DeltaDs parameter is TClientDataSet, so you have access to all of the properties and methods of the ClientDataSet component in your event handler. In this event handler you will need to scan the records in DeltaDs and generate your own SQL statements to apply the updates. You can use the event's UpdateKind parameter to find out if the current record should be inserted, deleted or modified. The field objects for the DeltaDs parameter have
both an OldValue and a NewValue property. This lets you use
DeltaDs.FieldByName('SomeField').OldValue
to get the old value of the field and
DeltaDs.FieldByName('SomeField').NewValue
to get the new value of the field. Obviously records to be deleted have only
old values and records to be inserted have only new values. Records that are
modified are handled in a way that is not intuitive but makes sense once you
understand it. For each record that is modified there are two records adjacent
to each other in the DataSet. The first record has an update status of
usUnmodified and is the record as it was before any changes were made. The
second record's update status is usModified and it contains values only for the fields that were changed. The OldValue and NewValue properties of the
field objects are variants. When you examine the NewValue for a field in the
usModified record it will be Unassigned if the value of that field was not
changed, Null if the value of the field was changed from something to null and
will contain the new value if the field was changed from null or any value to
some other value. You can use the VarIsEmpty and VarIsNull functions to see if
a field has been changed or has been changed to null. Using this information
you can generate your own SQL INSERT, DELETE and UPDATE statements for each
table you have to update, assign the SQL statement to the SQL property of an
IBSQL component and execute it. As you process each record set the Applied
parameter to True so the provider will know not to perform the update.
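The following skeleton illustrates the approach for the modified-record case. It is a sketch only: the handler name, the IBSQL component UpdSql (assumed to be attached to an active IBTransaction) and the choice of the Salary field are assumptions, not code from the sample application.
procedure TMainDm.EmpProvBeforeUpdateRecord(Sender: TObject;
  SourceDS: TDataSet; DeltaDS: TClientDataSet; UpdateKind: TUpdateKind;
  var Applied: Boolean);
begin
  if UpdateKind = ukModify then
  begin
    // NewValue is Unassigned (VarIsEmpty) for fields the user did not change.
    if not VarIsEmpty(DeltaDS.FieldByName('Salary').NewValue) then
    begin
      UpdSql.SQL.Text :=
        'UPDATE EMPLOYEE SET SALARY = :SALARY WHERE EMP_NO = :EMP_NO';
      UpdSql.Params.ByName('SALARY').AsDouble :=
        DeltaDS.FieldByName('Salary').NewValue;
      // OldValue returns the field's original value, changed or not.
      UpdSql.Params.ByName('EMP_NO').AsInteger :=
        DeltaDS.FieldByName('Emp_No').OldValue;
      UpdSql.ExecQuery;
    end;
  end;
  // ukInsert and ukDelete would be handled the same way, building INSERT and
  // DELETE statements from the NewValue and OldValue properties respectively.
  Applied := True; // Tell the provider not to generate its own SQL.
end;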
Handling Update Errors
When you call ApplyUpdates, the Delta property is sent to the DataSetProvider, which generates the SQL statements to apply each change to the database and executes them. If a change cannot be applied, the failing record and its error are saved. When all of the updates have been attempted, or when the maximum allowable error count is reached, whichever comes first, the errors are returned to the ClientDataSet, which fires an OnReconcileError event for each failed record. Therefore, you must provide an OnReconcileError event handler to deal with any errors that occur. Fortunately Borland's
engineers have made this easy. If you select File | New from the Delphi menu
and go to the Dialogs page of the repository you will find a form named
Reconcile Error Dialog. All you need to do is add this form to your
application, add its unit to the uses clause in each unit that contains a
ClientDataSet and add the code shown below to each ClientDataSet's
OnReconcileError event handler.
procedure TMainDm.EmpCdsReconcileError(DataSet: TClientDataSet;
  E: EReconcileError; UpdateKind: TUpdateKind;
  var Action: TReconcileAction);
begin
  Action := HandleReconcileError(DataSet, UpdateKind, E);
end;
When an error occurs while applying updates this code causes a dialog box to
display which shows the record, the changes made to the record, the type of
update attempted (insert, delete or modify) and the error message. The user can
edit the record and choose one of the following six actions. The selected
action is assigned to the event handler's Action parameter.
- raSkip - Do not apply the update but leave the changes in the Delta property.
- raAbort - Abort the entire update operation. No changes will be applied.
- raMerge - Update the record even though it has been changed by another user.
- raCorrect - Replace the changes in Delta with the changes entered by the user.
- raCancel - Undo all changes for this record.
- raRefresh - Undo all changes for this record and reread the record from the database.
You can modify the appearance or features of the reconcile error dialog from
the repository or write your own OnReconcileError event handler from scratch to
meet your needs.
Sorting Data On-the-Fly
The fact that the ClientDataSet holds its records in memory in its Data
property is a tremendous advantage. One example of that is that you can sort
the records by any field or combination of fields by simply assigning the field
name to the ClientDataSet's IndexFieldNames property. To sort by more than one
field separate the field names with semicolons. The following code is the
OnTitleClick event handler for the DBGrid that displays the employee records.
procedure TMainForm.EmpGridTitleClick(Column: TColumn);
begin
  MainDm.EmpCds.IndexFieldNames := Column.FieldName;
end;
This single line of code lets the user click on the title of any column to
sort the employee records by that column. This technique has one restriction
and two limitations. The restriction is that you cannot use IndexFieldNames on
an embedded detail dataset such as the salary history records in the sample
application. In fact, there is no way to change the sort order of an embedded
detail dataset except changing the ORDER BY clause of the underlying query. The
first limitation is that the sort is always ascending. There is no way to
specify a descending sort using the IndexFieldNames property. Second, if you
have a very large number of records in the ClientDataSet, for example 50,000,
the sort may take several seconds.
Both of the limitations can be easily overcome using indexes. You can create
indexes on a ClientDataSet at design time using the IndexDefs property or at
run time using the AddIndex method. Both methods let you create indexes that
sort ascending by some fields and descending by others as well as indexes that
are case insensitive on one or more fields. Because all the records are in
memory creating an index is very fast. For example, creating an index on 50,000
records takes just a few seconds. Creating an index on 1,000 records is
essentially instantaneous. The sample application includes an index created at
design time which sorts the employee records descending on salary and ascending
on name. The following figure shows the IndexDef for the index in the Object
Inspector and the code following the picture of the Object Inspector shows how
you could create the same index using the ClientDataSet's AddIndex method.

MainDm.EmpCds.AddIndex('DescBySalary', 'Salary;Last_Name;First_Name',
  [ixDescending], 'Salary');
In this call to AddIndex the first parameter is the name of the index, the
second is the list of field names to be indexed, the third is the index options
parameter and the fourth is the names of the fields to be sorted in descending
order. The ClientDataSet AddIndex method can take two additional parameters
that are not used in this example. The first is the list of fields that will be
case insensitive and the second is the grouping level. Grouping level is used
with maintained aggregates to accumulate subtotals. Maintained aggregates are
described later in this paper.
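As a sketch, the call below (the index name is illustrative) uses all six parameters to build an index that is case insensitive on the last name and grouped on the department number:
MainDm.EmpCds.AddIndex('ByDeptName', 'Dept_No;Last_Name;First_Name', [],
  '',           {no descending fields}
  'Last_Name',  {case insensitive field(s)}
  1);           {grouping level: group on Dept_No, the first index field}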
The following code is the OnExecute event handler of the DescendingBySalary
action that is called by the Descending By Salary menu item.
procedure TMainForm.DescendingBySalaryExecute(Sender: TObject);
begin
  if DescendingBySalary.Checked then
  begin
    DescendingBySalary.Checked := False;
    MainDm.EmpCds.IndexFieldNames := gByNameFieldNames;
  end else begin
    DescendingBySalary.Checked := True;
    MainDm.EmpCds.IndexName := gDescBySalaryIndex;
  end; //if
end;
This action toggles back and forth between the DescBySalary index and
sorting by name using the IndexFieldNames property. The Sort1 menu item's
OnClick event handler, shown below, checks the current value of the
EmpCds.IndexName property and sets the Checked property of the action item in
case the sort order was changed by clicking on the grid's title bar.
procedure TMainForm.Sort1Click(Sender: TObject);
begin
  if MainDm.EmpCds.IndexName = 'DescBySalary' then
    DescendingBySalary.Checked := True
  else
    DescendingBySalary.Checked := False;
end;
Showing Record Status
You can easily display the status of a record, that is, whether it has been
modified, inserted, or not modified, using a calculated field. The Status field
of the employee ClientDataSet is a calculated field and the following code is
the OnCalcFields event handler.
procedure TMainDm.EmpCdsCalcFields(DataSet: TDataSet);
begin
  case EmpCds.UpdateStatus of
    usModified: EmpCdsStatus.AsString := 'Modified';
    usUnmodified: EmpCdsStatus.AsString := 'Unmodified';
    usInserted: EmpCdsStatus.AsString := 'Inserted';
    usDeleted: EmpCdsStatus.AsString := 'Deleted';
  end;
end;
Calling the ClientDataSet's UpdateStatus method returns one of the four
constants, usModified, usUnmodified, usInserted or usDeleted. Of course you
will never see a record whose status is usDeleted.
Filtering Data
One way that ClientDataSets help you reduce network traffic and server load
is by filtering the records in the ClientDataSet instead of executing another
query. For example, suppose a user selects all customer records for a state and
then needs to see records for just one sales territory in that state. You can
display just the records for one territory in one of three ways. First, if you
have an index on the Territory field and it is the active index you can use the
ClientDataSet's SetRange method as shown below.
CustomerCds.SetRange([23], [23]);
This method call will restrict your view of the data to just those records
in territory 23. To remove the range and see all records call the CancelRange
method. The second technique is to apply a filter using the ClientDataSet's
Filter property. ClientDataSet filters use SQL WHERE syntax and are, therefore,
more powerful than the BDE filters you may have used in the past. The following
code would display the records for territory 23 using a filter.
with CustomerCds do
begin
  Filter := 'Territory = 23';
  Filtered := True;
end;
To remove the filter set the Filtered property to False. The third way to filter records is to write an OnFilterRecord event handler.
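As a sketch (the CustomerCds ClientDataSet and the Territory field follow the hypothetical example above), an OnFilterRecord handler that shows only territory 23 might look like this; remember that the Filtered property must still be set to True for the handler to be called.
procedure TMainForm.CustomerCdsFilterRecord(DataSet: TDataSet;
  var Accept: Boolean);
begin
  // Accept only the records that belong to territory 23.
  Accept := DataSet.FieldByName('Territory').AsInteger = 23;
end;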
Using Local Lookup Tables
Another technique for reducing network traffic and server load is to use
ClientDataSets for all lookup tables that you use for data validation, combo
boxes and to support lookup fields in ClientDataSets. For tables with a few thousand records or fewer you can select all of the records from the server. From then
on, as long as you do not call the ClientDataSet's Refresh method, your
application will use the records held in memory in the ClientDataSet's Data
property and never reread these records from the server.
Maintained Aggregates
Maintained aggregates allow you to define summary values that will
automatically be maintained by the ClientDataSet. To create a maintained
aggregate use the Fields Editor to add a new field, whose type is aggregate, to
the ClientDataSet as shown below.

Maintained aggregates are shown in a separate section of the Fields Editor
below the other fields as shown below.

When selected, the maintained aggregate, Total_Salary, appears as shown
below in the Object Inspector.
To use the maintained aggregate field you must set its Active property to
True and enter an expression that describes the value you want in the Expression property. You must also set the ClientDataSet's AggregatesActive property to True. In this example the expression is a simple one, Sum(Salary), to calculate the total salary for all of the employee records in the ClientDataSet. You can also use the Min, Max, Avg and Count functions, and you can build more complex expressions. For example, the SalHistCds ClientDataSet has a maintained
aggregate named Total_Increases whose expression is
sum(New_Salary) - sum(Old_Salary)
to calculate the total amount of salary changes for the currently selected
employee. The same result could have been obtained with the expression
Sum(New_Salary - Old_Salary). The two major restrictions on maintained
aggregate expressions are that you cannot nest summary functions, so
Avg(Sum(New_Salary - Old_Salary)) is not legal, and you cannot use
non-aggregated field values in the expression because there is no way to
determine which record the field value should come from.
You can also calculate aggregates for groups of records. For example, you
can have subtotals or averages by group. The following screen from the sample
application shows the form displayed when you choose View | Employee Statistics
from the menu. Notice the rightmost three columns in the grid which show the
total salary by department, the average salary by department and the number of
employees in the department.

The first step in creating a maintained aggregate that uses groups is to create an index that groups the records the way you want them. In the sample application an index on the Dept_No, Last_Name and First_Name fields is defined in the IndexDefs property of the EmpStatsCds ClientDataSet. The ClientDataSet's IndexName property is set to this index, which is named AscByDept. In addition, the GroupingLevel property of the index is set to one to indicate that we only
want to group by the first field in the index. It could be set to a higher
value up to the number of fields in the index. Creating a maintained aggregate
that uses grouping is the same as creating one that does not use grouping
except that you must set two additional properties. The first is the
GroupingLevel property of the aggregate field which must be set to the number
of fields in the index that you want to group by. In this example we want to
group by the first field in the index, Dept_No, so GroupingLevel is set to one.
The second property is IndexName. In this example the IndexName property of the
aggregate field is set to AscByDept. The aggregate will only be calculated when
this index is the active index of the ClientDataSet.
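To summarize the settings described above, the following run-time sketch shows the properties involved. The aggregate field name TotalSalaryByDept is an assumption; in the sample application these values are set in the Object Inspector.
with TotalSalaryByDept do  // a TAggregateField added in the Fields Editor
begin
  Expression := 'Sum(Salary)';   // value to maintain
  IndexName := 'AscByDept';      // aggregate is valid only under this index
  GroupingLevel := 1;            // subtotal per Dept_No, the first index field
  Active := True;
end;
EmpStatsCds.AggregatesActive := True;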
Cloning Datasets
Another very handy feature of ClientDataSets is the ability to clone
cursors. Suppose you need to look at two different records in the same dataset
at the same time. If you are not working with ClientDataSets you must either
find the record and save it to an array or use a second query component to
fetch a second copy of the records which creates additional network traffic and
additional load on the server. If you are using ClientDataSets and need an
additional view of your data, all you have to do is add another ClientDataSet to
your application and call its CloneCursor method as shown in the following code
from the OnCreate event handler of the EmpCloneForm form in the sample
application.
procedure TEmpCloneForm.FormCreate(Sender: TObject);
begin
  with EmpCloneCds do
  begin
    CloneCursor(MainDm.EmpCds, False, False);
    Open;
  end; //with
end;
CloneCursor takes three parameters. The first is the source ClientDataSet.
The second and third are called Reset and KeepSettings and both are boolean.
When Reset and KeepSettings are False the RemoteServer, ProviderName,
IndexName, Filter, Filtered, MasterSource, MasterFields and ReadOnly properties
are copied from the source ClientDataSet to the clone. When Reset is True the
properties of the clone are set to their default values. When KeepSettings is
True the property values of the clone are not changed.
Saving Data Locally
The ClientDataSet component has a SaveToFile and a LoadFromFile method.
SaveToFile saves the contents of the ClientDataSet's Data property and any
indexes that were defined at design time in the IndexDefs property to a local
file in a proprietary format. If the file extension is XML only the data will
be saved and it will be saved in XML format. LoadFromFile will load the saved
data back into the ClientDataSet.
Aside from being a very easy way to convert data to XML this ability has
many other uses. One is to create briefcase model applications that can
download a subset of data from the database server; save the data locally;
disconnect from the network; add, delete and change records; reconnect to the
network and apply the changes to the database. You can also use this capability
to reduce network traffic for applications that run over a low speed WAN and
which use tables that are relatively static and contain moderate numbers of
records. Just read the data for the static tables into the ClientDataSet and
save it locally. From then on load the data from the local files. All you need
is a control table in the database updated by a trigger that shows the date and
time each table was last updated. When your application starts check the date
and time last updated for each table and only download the data from the server
if the table has been changed since your application last retrieved it. You can
also use a ClientDataSet as a single user database without using any external
database or server.
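A minimal sketch of the briefcase idea described above follows. The file name, the Offline flag and the procedure names are assumptions for illustration; the essential calls are SaveToFile, LoadFromFile and ApplyUpdates.
procedure TMainForm.OpenEmployees(Offline: Boolean);
begin
  if Offline then
    // Work from the locally saved copy of the data, including any pending edits.
    MainDm.EmpCds.LoadFromFile('Employee.cds')
  else
    // Connected: fetch the records from the server through the provider.
    MainDm.EmpCds.Open;
end;

procedure TMainForm.GoOffline;
begin
  // Save the contents of the Data property (and design-time indexes) locally.
  MainDm.EmpCds.SaveToFile('Employee.cds');
end;

procedure TMainForm.Reconnect;
begin
  // Back on the network: send the accumulated changes to the database.
  MainDm.EmpCds.ApplyUpdates(-1);
end;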
Using ClientDataSets as In-Memory Tables
You can also use a ClientDataSet without connecting it to a provider. At
design time add fields to the FieldDefs property using the property editor.
Then all you have to do is right click on the ClientDataSet and choose Create
Dataset from the context menu. You now have an in-memory table with all of the
features described above. You can also add the FieldDefs at runtime using the
FieldDefs.Add method and call the ClientDataSet's CreateDataSet method to
create in-memory tables on-the-fly. Use these temporary tables for data
manipulation, reporting or any other situation where you can improve
performance and reduce network traffic by storing data in memory.
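For example, here is a sketch of creating such a table on-the-fly at run time. TempCds is assumed to be a TClientDataSet with no provider, and the field names are illustrative.
procedure TMainForm.CreateTempTable;
begin
  with TempCds do
  begin
    FieldDefs.Clear;
    FieldDefs.Add('Id', ftInteger, 0, True);   // required integer field
    FieldDefs.Add('Name', ftString, 30);       // 30-character string field
    CreateDataSet;                             // build the in-memory table
    AddIndex('ByName', 'Name', []);            // optional in-memory index
  end; //with
end;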
Deploying ClientDataSet
If you use ClientDataSets in your application you must deploy MIDAS.DLL with
your application. MIDAS.DLL contains the support code for ClientDataSet. You
must place MIDAS.DLL in the directory that contains your EXE, the Windows
System directory, the Windows directory or any directory on the search path.
Summary
Using ClientDataSets for your Interbase applications offers a host of
benefits. ClientDataSets can reduce both server load and network traffic. Using
ClientDataSets improves concurrency by minimizing the length of time
transactions are open. In addition they let you sort and filter your data
quickly and easily, allow you develop briefcase model applications and make
queries and stored procedures updateable. Finally, you can use a ClientDataSet
as a very powerful in-memory table.