Wednesday, February 19, 2014

Thoughts about application architecture with Lazarus

The Delphi books of my days (or why I'm not guilty of my application's poor design)

Like most Lazarus developers, I started coding in Delphi (in fact, I learned computer programming with Turbo Pascal), and to get the most out of the tool I read some books: I bought three or four and read parts of others in bookstores. This is supposed to be good practice when learning a new technology.

The problem, which I noticed only years later, is that they did not teach good application design, like separation of concerns (view, business, and persistence layers), or how to achieve it with Delphi. Most of the books focused on the visual aspects (how to create a good-looking form, reports, etc.) and on how to set up datasets and the DB-aware controls. The closest they came to good practice advice was putting datasets and datasources in data modules instead of forms.

We can't even blame the book authors: Delphi's greatest selling point was (is?) its Rapid Application Development (RAD) features.

Recipes for a bulky spaghetti

In the early days, when developing my applications, I was a diligent student: I put database logic in data modules and designed the forms as the books specified. But, as every developer who has built an application with more than three forms knows, things soon become hard to evolve and maintain.

Keeping the database components in data modules did not help much. You end up with shared dataset state, and all the problems that come with it, spread across different parts of the application.

Below is a snapshot of a data module from my first big application (still in production, by the way).

It could have been even worse had I not started to use a TDataset factory in the middle of development.
In the end, the project has code like:

  // a form to select a profile
  DataCenter.PrescriptionProfilesDataset.Open;
  with TLoadPrescriptionProfileForm.Create(AOwner) do
  try
    Result := ShowModal;
    if Result = mrYes then
      DataCenter.LoadPrescriptionProfile;
  finally
    DataCenter.PrescriptionProfilesDataset.Close;
    Destroy;
  end;
  
  //snippet of DataCenter.LoadPrescriptionProfile (copy the selected profile to PrescriptionItemsDataset)
  with PrescriptionItemsDataset do
  begin
    DisableControls;
    try
      FilterPrescriptionProfileItems(PrescriptionProfilesDataset.FieldByName('Id').AsInteger);
      while not PrescriptionProfileItemsDataset.Eof do
      begin
        NewMedication := PrescriptionProfileItemsDataset.FieldByName('Medication').AsString;
        if Lookup('Medication', NewMedication, 'Id') = Null then
        begin
          Append;
          FieldByName('PatientId').AsInteger := PatientsDatasetId.AsInteger;
          FieldByName('Medication').AsString := NewMedication;
          FieldByName('Dosage').AsString := PrescriptionProfileItemsDataset.FieldByName('Dosage').AsString;
          [..]
          Post;
        end;
        PrescriptionProfileItemsDataset.Next;
      end;
      ApplyUpdates;
      PrescriptionProfileItemsDataset.Close;
    finally
      EnableControls;
    end;
  end;

It's not necessary to be a software architecture guru to know that this is unmanageable.

Eating the pasta with business objects and inversion of control

In the projects that succeeded the first one, most of the data-related code is encapsulated in business objects. The data module no longer contains TDataset instances; it acts only as a TDataset factory and implements a few specific data actions. To work with a dataset it's enough to request one by key, which leads to code like this:

  FWeightHistoryDataset := DataModule.GetQuery(Self, 'weighthistory');
  FWeightHistoryDataset.ParamByName('prontuaryid').AsInteger := FId;
  FWeightHistoryDataset.Open;

This fixes the shared state issue, since each dataset now has a clear, limited scope. But it does not remove the business objects' dependency on a global instance (DataModule), which makes testing harder.

In the project I'm now starting, I solved the dependency on the global instance by using the service locator pattern through the IoC container I cited in a previous post. I defined a resource factory service that is resolved as soon as the business object is created, opening the door to setting up testing environments in a clean manner.
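
A minimal sketch of the idea, with hypothetical names (IResourceFactory, TPatient); the container API mirrors the pseudo code from the IoC post further down this page:

  type
    // hypothetical service that replaces the global DataModule dependency
    IResourceFactory = interface
    ['{5D3A0C4B-7E21-4F9A-9C1D-2B8E6F0A1C37}']
      function GetQuery(AOwner: TComponent; const AKey: String): TDataset;
    end;

    TPatient = class
    private
      FResources: IResourceFactory;
    public
      constructor Create;
    end;

  constructor TPatient.Create;
  begin
    // resolved once, at construction time; a test setup can register a
    // fake IResourceFactory in the container before creating the object
    FResources := Container.Resolve(IResourceFactory) as IResourceFactory;
  end;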

All done?

Not yet. The business logic is contained in specific classes, there's no shared state across the application, and there are no hardcoded global dependencies, but the view layer (forms) is still (dis)organized in the classic way, with each TForm calling, and being called by, other forms directly. This problem, and the solutions I'm working on, will be the subject of a future post.

Saturday, February 15, 2014

Number of units, optimization and executable size

I tend to split my code into small units instead of writing one big unit with lots of classes or functions. The drawback is the large number of files but, in my opinion, the benefits outweigh it.

One thing that always bothered me was whether this practice has any effect on the executable size.

It seems not. I wrote two versions of the same program: in the first, all classes (one a descendant of another) are defined in the same unit, while in the second each class lives in a separate unit. When compiled with debugging info, the separate-units version is a little bigger. The difference in size disappears when compiling without debugging info.
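
The split version looked roughly like this (a reconstruction, not the original test code; the single-unit version simply declares both classes in one unit):

// baseunit.pas
unit BaseUnit;
{$mode objfpc}{$H+}
interface
type
  TBase = class
  public
    procedure Hello; virtual;
  end;
implementation
procedure TBase.Hello;
begin
  WriteLn('base');
end;
end.

// childunit.pas (one class per unit)
unit ChildUnit;
{$mode objfpc}{$H+}
interface
uses
  BaseUnit;
type
  TChild = class(TBase)
  public
    procedure Hello; override;
  end;
implementation
procedure TChild.Hello;
begin
  WriteLn('child');
end;
end.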


As a side note, I noticed that compiling with the -O2 flag leads to smaller executables than -O1, the Lazarus default. It's just a few kilobytes, but worth noting.

Sunday, February 02, 2014

Using TComponent with automatic reference count

For some time I have known the concepts of Inversion of Control (IoC) and Dependency Injection (DI), as well as the benefits they bring, but I had never used them in my code. Now that I'm starting a new project from scratch, and the deadline is not so tight, I decided to raise the bar for my code design.

I'll implement an IoC container along the lines of VSoft's one. While adding the possibility of doing DI through constructor injection would be great, I won't implement it: it's not a hard requirement of mine, and fpc currently does not support the features (basically Delphi's new RTTI) needed to implement it without hacks.

Automatic reference counting

Most Delphi IoC implementations use COM interfaces and rely on automatic reference counting to manage object life cycles. So do I. The drawback of this approach is that the class to be instantiated must handle the reference count. When designing new classes, or when the class hierarchy can be modified, it is sufficient to inherit from the TInterfaced* classes. The problem arises when it's necessary to use a class that has a fixed hierarchy and does not handle reference counting, like the LCL ones.

Since I plan to decouple my TForm descendants, I need a way to use them with the IoC container. Below is the (rough) design, in pseudo code:

  //Define the interface
  IPersonView = interface
  ['{9B5BBA42-E82B-4CA0-A43D-66A22DCC10DE}']
    procedure DoIt;
  end;

  //Implement an IPersonView
  TPersonViewForm = class(TForm, IPersonView)
    procedure DoIt;
  end;

  //Register the implementation
  Container.Register(IPersonView, TPersonViewForm);

  //Instantiate the view
  Container.Resolve(IPersonView);

At first look it should work seamlessly. And in fact it does: a TPersonViewForm is instantiated and returned as an IPersonView. The only issue is that the object instance will never be freed, even when the interface reference goes out of scope. This happens because the _AddRef and _Release methods of TComponent do not handle reference counting by default.
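
To make the failure mode concrete, a sketch using the pseudo API above:

  procedure ShowPerson;
  var
    View: IPersonView;
  begin
    View := Container.Resolve(IPersonView) as IPersonView;
    View.DoIt;
    // View goes out of scope here: _Release is called, but TComponent's
    // default implementation ignores the count, so the form is never freed
  end;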

VCLComObject to the rescue

Examining the code, we observe that TComponent's _AddRef and _Release forward to the VCLComObject property. There's no good documentation or example of using this property, so I wrote an example to see if it would solve my problem.

Basically, I wrote TComponentReference: a descendant of TInterfacedObject with a dummy implementation of IVCLComObject that takes a TComponent reference in the constructor and frees it in BeforeDestruction.
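
In declaration form it looks like this (a reconstruction from the description above; the remaining IVCLComObject methods get dummy bodies):

type
  TComponentReference = class(TInterfacedObject, IVCLComObject)
  private
    FComponent: TComponent;
  public
    constructor Create(Component: TComponent);
    procedure BeforeDestruction; override;
    // dummy implementations of the IVCLComObject methods omitted
  end;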

constructor TComponentReference.Create(Component: TComponent);
begin
  FComponent := Component;
end;

procedure TComponentReference.BeforeDestruction;
begin
  inherited BeforeDestruction;
  // the wrapper is being destroyed (last reference released),
  // so free the wrapped component along with it
  FComponent.Free;
end;


And this is how I tested it:

function GetMyIntf: IMyIntf;
var
  C: TMyComponent;
  R: IVCLComObject;
begin
  C := TMyComponent.Create(nil);
  R := TComponentReference.Create(C);
  // from this point on, C's _AddRef/_Release calls are forwarded to R
  C.VCLComObject := R;
  Result := C as IMyIntf;
end;

var
  MyIntf: IMyIntf;
begin
  MyIntf := GetMyIntf;
  MyIntf.DoIt;
end.

It worked! I get an IMyIntf reference and no memory leaks. Easier than I initially thought.

The code can be downloaded here.

Sunday, July 08, 2012

The cost to suppress a warning (and how not to pay for it)

In the previous post, I pointed out that passing a managed type (a dynamic array) as a var parameter is more efficient than returning the value as a function result. However, this technique has a known side effect: the compiler outputs a message (Warning: Local variable "XXX" does not seem to be initialized) each time a call to the procedure is compiled.

The direct way to suppress the warning is to change the parameter from var to out. Pretty simple, but out does more than inhibit the compiler message. It implicitly initializes managed-type parameters to nil, or adds a call to FPC_INITIALIZE if the parameter is a record that has at least one field of a managed type. It does not add implicit code for simple types like Integer or class instances (TObject etc.).
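
To illustrate the difference (procedure names are mine; the types come from the previous post):

type
  TMyRec = record
    O: TObject;
    S: String;
  end;
  TMyRecArray = array of TMyRec;

// 'var': no implicit code is generated, but the compiler warns when the
// caller passes a variable that was never assigned
procedure BuildRecArrayVar(var Arr: TMyRecArray);
begin
  SetLength(Arr, 1);
end;

// 'out': no warning, but the compiler implicitly sets Arr to nil before
// the body runs, because a dynamic array is a managed type
procedure BuildRecArrayOut(out Arr: TMyRecArray);
begin
  SetLength(Arr, 1);
end;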

Although the performance impact is mostly negligible, it's extra code anyway. In my case I initialize the parameter explicitly, so out would add redundant code. There's an alternative way to suppress the message: add the {%H-} directive in front of the variable being passed to the procedure. In the example from the previous post it would be:

BuildRecArray({%H-}Result);

It can be annoying if the function is called often or the routine is part of a public API; otherwise it's fine. At least for me.

Update: out does not generate initialization code for records that contain only fields whose types are not automatically managed by the compiler, e.g., Integer.

Saturday, July 07, 2012

Does it matter how dynamic arrays are passed/returned to/from a routine?

I was implementing a routine that should return a dynamic array and wondered whether the code produced for a function differs from that of a procedure with a var parameter. So I set up a simple test:

type
  TMyRec = record
    O: TObject;
    S: String;
  end;

  TMyRecArray = array of TMyRec;

function BuildRecArray: TMyRecArray;
begin
  SetLength(Result, 1);
  Result[0].O := nil;
  Result[0].S := 'x';
end;

procedure BuildRecArray(var Result: TMyRecArray);
begin
  SetLength(Result, 1);
  Result[0].O := nil;
  Result[0].S := 'x';
end;

var
  Result: TMyRecArray;

begin
  BuildRecArray(Result); //or Result := BuildRecArray
end.


Looking at the generated assembly revealed that the function version (returning the array as the result) produces bigger code than the procedure version (passing the array as a var parameter). Moreover, the difference comes from an implicit exception frame, which is known to impact performance.

And what about the caller? Again, the function version generates more code (it creates a temporary variable and calls FPC_DYNARRAY_DECR_REF).

In short: yes, it matters.

Thursday, June 14, 2012

The cost of using generics

For a few versions now, fpc has provided support for generics. They allow the developer to save some typing and also improve type safety by preventing unsafe typecasts.

Unfortunately, these benefits are not free: every time a generic is specialized, the whole implementation code is copied into the unit/program.

To make this clearer, I created two examples that implement a list of a custom class (TMyObj): one uses a TFPList, the other specializes a TFPGList. The difference in usage is that the former needs a typecast.
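
The two variants look roughly like this (a sketch of the test; the names are mine):

program listtest;

{$mode objfpc}{$H+}

uses
  Classes, fgl;

type
  TMyObj = class
    Value: Integer;
  end;

  // each specialization copies the whole TFPGList implementation
  TMyObjList = specialize TFPGList<TMyObj>;

var
  Plain: TFPList;
  Generic: TMyObjList;
begin
  Plain := TFPList.Create;
  Generic := TMyObjList.Create;
  Plain.Add(TMyObj.Create);
  Generic.Add(TMyObj.Create);
  TMyObj(Plain[0]).Value := 1; // TFPList stores pointers: unchecked typecast
  Generic[0].Value := 2;       // TFPGList is type-safe: no cast needed
  TMyObj(Plain[0]).Free;
  Generic[0].Free;
  Plain.Free;
  Generic.Free;
end.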

I compiled both with fpc 2.6.0 under Windows. The result is a 2 KB difference in executable size, the generic version being bigger. Then I looked at the generated asm: the code that uses the list classes is the same; the difference comes from the copied implementation of TFPGList.

Many will say that code size is no longer an issue given the availability of big hard drives, but I still think that seeking smaller code is good practice. Regarding generics, IMHO they should be used where the benefits are clear, like classes that are instantiated many times in user (programmer) code, and avoided in the internal structures of, e.g., the RTL or third-party libraries.

Sunday, July 10, 2011

Generic cross data report with lazreport

In the lazreport repository there's a demo app showing how to create a cross data report. It uses two instances of TfrUserDataset: one for the master (row) data and one for the cross (column) data. At first glance the component does not seem to provide another way to build such reports; a deeper look shows the contrary. Here are the steps to build a cross data report with an arbitrary number of rows and columns.

WARNING: following this guide requires basic lazreport knowledge.

Prepare the report

Nothing special here

  • Create an empty report

  • Add a Master Data band

  • Add a Cross Data band

  • Add a Text Object inside the Cross Data

  • In the Text Object put a variable named value: [value]



Add handler to retrieve the value

Those familiar with lazreport will have no problems:

procedure TForm1.frReport1GetValue(const ParName: String; var ParValue: Variant);
begin
  if ParName = 'value' then
    ParValue := IntToStr(FRow) + ' - ' + IntToStr(FCol);
end;

Just the column and row indexes, for demonstration purposes. Used together with a matrix-like data structure, retrieving the actual data is straightforward.
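
For example, assuming the data is kept in a matrix-like field (a hypothetical FData):

// FData: array of array of String; filled before the report runs
if ParName = 'value' then
  ParValue := FData[FRow - 1, FCol - 1]; // FRow/FCol are 1-based here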

Set the number of rows and columns

The most attentive developers will notice that no dataset (not even a virtual dataset) was linked to either band. In fact, running the report at this stage will produce a blank page.

If the number of columns and rows is known beforehand, just set the Virtual Dataset option for each band. This is not an optimal solution, since we don't always have that information. Here's how to set the Virtual Dataset record count at runtime:

procedure TForm1.frReport1BeginDoc;
var
  BandView: TfrBandView;
begin
  BandView := frReport1.FindObject('MasterData1') as TfrBandView;
  BandView.DataSet := '9';
  BandView := frReport1.FindObject('CrossData1') as TfrBandView;
  BandView.DataSet := '2';
end;

This will create a report with nine rows and two columns. Yep, you read that right: lazreport stores the record count of a band's Virtual Dataset in a string field, the same field that stores the name of an associated TfrDataset.

WARNING: don't look at the lazreport source. It may scare the faint-hearted ;-)

Track the row and column positions

The tricky part. The first thing to do is to add two integer fields (FRow and FCol) to the Form/Data Module containing the TfrReport instance, as in the sketch below.
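
// in the form declaration, something like:
private
  FRow: Integer; // current master (row) index
  FCol: Integer; // current cross (column) index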

To get the column, add a handler to the OnPrintColumn event and store the ColNo parameter:

procedure TForm1.frReport1PrintColumn(ColNo: Integer; var ColWidth: Integer);
begin
  FCol := ColNo;
end;

Notice that the Lazarus IDE will create an event declaration with the second parameter named Width, which does not compile with {$mode ObjFpc}. Renaming it to ColWidth makes the compiler happy.

There's no event that passes the current row position. The first attempt is to hook into OnBeginBand:

procedure TForm1.frReport1BeginBand(Band: TfrBand);
begin
  Inc(FRow);
end;

Running the report like this will show wrong row indexes, because FRow is incremented for every band, not only the Master/Row band. The fix is easy:

procedure TForm1.frReport1BeginBand(Band: TfrBand);
begin
  if Band.Typ = btMasterData then
    Inc(FRow);
end;

It's done: add the Data Header and Cross Header bands, glue in the actual data, and the generic cross data report is ready. The sample project.