Sunday, March 25, 2007

How the automatic persistence magic is woven - Part 2

I did not declare any static methods on my persistent classes (see previous posting), so someone else must have added them, because they do appear in the compiled assembly. Look here, this is my persistent class as seen through Lutz Roeder´s Reflector:

    1 [Persistent]

    2 internal class Person : PersistenceCapable

    3 {

    4     // Fields

    5     public DateTime dob;

    6     public string firstname;

    7     [Depend]

    8     public Address homeAddress;

    9     public string lastname;

   10     private static readonly sbyte[] OpenAccessEnhancedFieldFlags;

   11     private static readonly string[] OpenAccessEnhancedFieldNames;

   12     private static readonly Type[] OpenAccessEnhancedFieldTypes;

   13     [NonSerialized]

   14     protected sbyte OpenAccessEnhancedFlags;

   15     private static readonly int OpenAccessEnhancedInheritedFieldCount;

   16     private static readonly Type OpenAccessEnhancedPersistenceCapableSuperclass;

   17     private static int OpenAccessEnhancedSlotCount;

   18     [NonSerialized]

   19     protected StateManager OpenAccessEnhancedStateManager;

   20     [Transient]

   21     public OnProgressDelegate progressing;

   22 

   23     // Methods

   24     static Person();

   25     private Person();

   26     public Person(string firstname, string lastname, DateTime dob, Address homeAddress);

   27     public override void OpenAccessEnhancedCopyField(int);

   28     public sealed override void OpenAccessEnhancedCopyFields(object, int[]);

   29     protected override void OpenAccessEnhancedCopyKeyFieldsFromObjectId(object);

   30     public override void OpenAccessEnhancedCopyKeyFieldsFromObjectId(PersistenceCapable.ObjectIdFieldConsumer, object);

   31     public override void OpenAccessEnhancedCopyKeyFieldsToObjectId(object);

   32     public override void OpenAccessEnhancedCopyKeyFieldsToObjectId(PersistenceCapable.ObjectIdFieldSupplier, object);

   33     public static DateTime OpenAccessEnhancedGetdob(Person);

   34     public static string OpenAccessEnhancedGetfirstname(Person);

   35     public static Address OpenAccessEnhancedGethomeAddress(Person);

   36     public static string OpenAccessEnhancedGetlastname(Person);

   37     public static int OpenAccessEnhancedGetManagedFieldCount();

   38     public sealed override object OpenAccessEnhancedGetObjectId();

   39     public sealed override PersistenceManager OpenAccessEnhancedGetPersistenceManager();

   40     public sealed override object OpenAccessEnhancedGetTransactionalObjectId();

   41     public sealed override bool OpenAccessEnhancedIsDeleted();

   42     public sealed override bool OpenAccessEnhancedIsDirty();

   43     public sealed override bool OpenAccessEnhancedIsNew();

   44     public sealed override bool OpenAccessEnhancedIsPersistent();

   45     public sealed override bool OpenAccessEnhancedIsTransactional();

   46     public sealed override void OpenAccessEnhancedMakeDirty(string);

   47     protected override object OpenAccessEnhancedMemberwiseClone();

   48     public override PersistenceCapable OpenAccessEnhancedNewInstance(StateManager);

   49     public override PersistenceCapable OpenAccessEnhancedNewInstance(StateManager, object);

   50     public override object OpenAccessEnhancedNewObjectIdInstance();

   51     public override object OpenAccessEnhancedNewObjectIdInstance(string);

   52     public sealed override void OpenAccessEnhancedPreSerialize();

   53     public override void OpenAccessEnhancedProvideField(int);

   54     public sealed override void OpenAccessEnhancedProvideFields(int[]);

   55     public override void OpenAccessEnhancedReplaceField(int);

   56     public sealed override void OpenAccessEnhancedReplaceFields(int[]);

   57     public override void OpenAccessEnhancedReplaceFlags();

   58     public override void OpenAccessEnhancedReplaceStateManager(StateManager);

   59     public static void OpenAccessEnhancedSetdob(Person, DateTime);

   60     public static void OpenAccessEnhancedSetfirstname(Person, string);

   61     public static void OpenAccessEnhancedSethomeAddress(Person, Address);

   62     public static void OpenAccessEnhancedSetlastname(Person, string);

   63 }

It´s considerably larger than my original definition, which contained just a couple of public fields (lines 5..9 and 21) and two ctors (25, 26). So who added the PersistenceCapable interface and all those static methods, and why?

 

The culprit is the so-called enhancer of OpenAccess, a tool called during the build process in VS2005. When you "enable a project" for OpenAccess (see the OpenAccess|Enable Project menu item in VS2005), the VOA wizard adds the following section to the project file:

    1 <ProjectExtensions>

    2     <VisualStudio>

    3       <UserProperties OpenAccess_EnhancementOutputLevel="1"

    4                       OpenAccess_UpdateDatabase="True"

    5                       OpenAccess_Enhancing="True"

    6                       OpenAccess_ConnectionId="DatabaseConnection1"

    7                       OpenAccess_ConfigFile="App.config" />

    8     </VisualStudio>

    9 </ProjectExtensions>

These properties are interpreted by OpenAccess after any post-build events the project might contain. If the OpenAccess_Enhancing property is set to true, the VOA VS2005 integration calls its VEnhance.exe application, located in the sdk/dotnet20 folder of the VOA installation directory.

For my sample project the command line for the enhancer would be

venhance.exe -assembly:simplecrud.dll

This would take the assembly created by the C# compiler and add code to make object persistence as transparent as demonstrated. For that, the enhancer needs to add code at two different locations: in any class marked as OpenAccess.Persistent and wherever fields of instances of those classes are accessed.

Persistent classes are enhanced by adding a number of static fields and methods as well as implementing the OpenAccess interface PersistenceCapable. The overall purpose of this is to make change tracking and lazy loading possible without falling back on reflection which would be slow.

Detecting changes to a persistent object´s data is as necessary for an O/R Mapper as it is for you when using ADO.NET. Knowing whether an object is new (has not been persisted yet) or just modified, and which of its fields have changed since it was loaded, is essential for generating the right SQL command. Like a DataAdapter, an O/R Mapper needs to issue an INSERT for a new object, an UPDATE for a modified one, or a DELETE for a deleted one. Plus, an UPDATE statement should not always overwrite all column values, but just the ones which have been modified.
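For comparison, here´s what that decision looks like in plain ADO.NET (a minimal sketch; the persons table, its columns, and the connection string are made up for illustration):

// Requires System.Data and System.Data.SqlClient.
using (SqlConnection cn = new SqlConnection(
    @"Data Source=.\SQLEXPRESS;Initial Catalog=SampleDB;Integrated Security=True"))
{
    SqlDataAdapter da = new SqlDataAdapter("SELECT id, firstname, lastname FROM persons", cn);
    SqlCommandBuilder cb = new SqlCommandBuilder(da);  // derives INSERT/UPDATE/DELETE (table needs a primary key)

    DataTable persons = new DataTable();
    da.Fill(persons);

    persons.Rows[0]["lastname"] = "Doe";   // assuming a row came back: its RowState becomes Modified
    DataRow newRow = persons.NewRow();
    newRow["firstname"] = "Jane";
    newRow["lastname"] = "Miller";
    persons.Rows.Add(newRow);              // RowState of the new row: Added

    // Update() checks each DataRow.RowState and issues an INSERT, UPDATE or DELETE accordingly.
    da.Update(persons);
}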

In order to determine what to do, a DataAdapter checks the row state of each DataRow (as in the sketch above); each DataRow does its own change tracking. But how to do this for regular objects? They usually don´t carry any meta information on their data. Basically there are two approaches:

  • either the O/R Mapper manages a copy of each persistent object´s data, or....
  • each object keeps two copies of its data, one for its current state and one for its state when it was loaded.

And there are two approaches to comparing the original state of an object with its current state:

  • The O/R Mapper can access an object´s state in a somewhat brute force way using reflection, or...
  • the O/R Mapper interacts with an object in a predefined way, e.g. through the methods of a common base class or an interface.

Using reflection seems to be the most convenient way to do change tracking - from an application developer´s point of view. O/R Mappers working like this usually don´t require you to do anything special in order to make the objects of a class persistent. Ideally you don´t even need to mark them with an attribute. Just take any object and throw it at the O/R Mapper and it tries to persist it.

This sounds great, but comes with a major disadvantage: it´s slow. It´s slow because reflection is slow compared to direct method calls. And it´s slow because changes to objects can only be recognized by comparing their complete state to an internal state copy kept by the O/R Mapper. The O/R Mapper cannot ask such a persistent object which parts of its state have changed.
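To see why, here´s roughly what such reflection-based change detection boils down to (not any particular O/R Mapper´s code, just a sketch of the principle):

// Requires System.Reflection and System.Collections.Generic. The snapshot of the field
// values taken at load time is assumed to be kept by the O/R Mapper for each object.
static bool IsDirty(object obj, IDictionary<string, object> snapshot)
{
    foreach (FieldInfo f in obj.GetType().GetFields(
        BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic))
    {
        object current = f.GetValue(obj);              // reflective access: much slower than a direct field read
        if (!object.Equals(current, snapshot[f.Name]))
            return true;
    }
    return false;                                      // and the mapper has to run this over every object it knows about
}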

O/R Mappers that value performance over flexibility thus do not rely on reflection, but require extra code on persistent classes. The shape of this code needs to be known to them for early binding and strongly typed object access, so a persistent class either needs to derive from a base class provided by the O/R Mapper or implement an interface known to or provided by it. In any case, additional code beyond whatever business functionality a persistent class is supposed to implement needs to be written.
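Here´s a simplified illustration of what such extra code typically looks like; this is not OpenAccess´ actual PersistenceCapable interface, just the general pattern of an interface plus setters that report changes:

// Simplified pattern only - the real OpenAccess interface (PersistenceCapable/StateManager) is richer.
public interface ITrackedObject
{
    bool IsDirty { get; }
    void MarkClean();
}

public class TrackedPerson : ITrackedObject
{
    private string lastname;
    private bool dirty;

    public string Lastname
    {
        get { return lastname; }
        set { lastname = value; dirty = true; }   // every setter reports the change
    }

    public bool IsDirty { get { return dirty; } }
    public void MarkClean() { dirty = false; }
}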

Now, this code could be fully generated, e.g. from a database schema; LLBLGen Pro, for example, offers this route to object persistence. The Wilson O/R Mapper, on the other hand, added an interface that can be implemented on top of its reflection-based change tracking. You can implement it yourself or, again, use some kind of code generation.

NDO and OpenAccess, though, also do not use reflection to access object state. They don´t require you to add special persistence code, either. They use code generation - but not source code generation. They generate binary code and insert it transparently into the assembly compiled from the original source code, thus enhancing it. You (usually) never (need to) see the code necessary to make O/R Mapping perform well.

Bottom line: If you want high-performance O/R Mapping, additional code is necessary to couple persistent objects to the O/R Mapper. This code can either be written by hand or be generated. It can be generated as source code or as binary code. OpenAccess does the latter. Is that good or bad? Well, it depends. It´s completely transparent to you and it´s safe. But it is what it is; you cannot (and are not supposed to) interfere with this code. So if you really want to get your hands dirty tweaking persistence code, OpenAccess is not for you. There are no templates to adapt to any special needs of yours. Of course you can optimize OpenAccess´ workings with regard to object load behaviour, but not with regard to the code it generates.

The question now is: where does this transparent persistence-enabling code need to be added? Firstly, to the class itself (see listing above), since it´s supposed to avoid the need for reflection when checking for object modifications. Secondly, at all locations where persistent objects´ fields are accessed (see the last listing of part 1). That´s necessary to track changes and to see whether an object still needs to be loaded.

The necessity of this should be obvious. But what are the implications? I´d say the most important implication of code enhancement is that you want to isolate it. You should see to it that as few assemblies as possible require it. That means you want to put as many persistent classes into as few VS2005 projects as possible. Persistent objects mostly belong to the domain model of an application. That alone should be sufficient reason to encapsulate them in a component/assembly of their own. In addition, however, you need to hide all (persistent) field access behind methods, so that the enhancer does not need to wrap field access in other assemblies as well.

Here´s how I split up my previous sample solution:

The persistent classes as well as the database are put into a VS2005 project of their own. Only this one needs to be enhanced, since all field access is limited to code in the persistent classes defined therein. I just made the public fields private and made them accessible through property methods. That´s it. I put the database in with the persistent classes, since its schema is defined by them.
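Here´s what that change looks like for the Person class (a sketch; the property names are my own choice):

[OpenAccess.Persistent]
class Person
{
    private string firstname;          // was public before
    private string lastname;
    private DateTime dob;

    [OpenAccess.Depend]
    private Address homeAddress;

    public string Firstname
    {
        get { return firstname; }
        set { firstname = value; }
    }

    // lastname, dob and homeAddress get wrapped in properties the same way,
    // and the two ctors stay as they were.
}

Calling code in other projects then goes through p.Firstname instead of touching p.firstname directly, so only this one assembly needs to be enhanced.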

However, the referencing project - SimpleCRUD.Application in the above image - also needs to reference the OpenAccess library, since it still contains code that uses it. It´s there that I open the IObjectScope to interact with the database. In a later article I´ll tackle the question of how to isolate this functionality in a dedicated data access component.

Please note: Of course the App.Config created by the OpenAccess wizard while enabling the persistent classes project also needs to be included in the project where the IObjectScope is set up. The enhancer uses the App.Config of the project being enhanced to find the database whose schema it checks against the persistent class definitions; that´s at compile time. But at runtime OpenAccess also needs to know where the database is located, so there needs to be a second App.Config there.

This of course means the database managed by the enhancer needs to be available at runtime, too. For global databases this is not a problem: both App.Config files can reference the same database. But for local databases as above you need to think about how to move it from the compile-time location to the execution directory. I did this by setting the output directory of the SimpleCRUD.PersistentObjects project to the same folder as the referencing project´s output directory.

Wrap-up: OpenAccess is able to load/store persistent objects transparently. This works without you having to write or generate any code, because an enhancer automatically adds code to the persistent classes and to any access to their fields in a post-build step. To limit the intrusion of the enhancer into your code, a specific VOA best practice is to bundle your persistent classes into as few assemblies as possible. This should also be motivation enough to follow the general best practice of hiding your fields and funnelling all access to them through methods/properties.

Please note: If you happen to access your assemblies in a VS2005 post-build event, they are not yet enhanced! Call the enhancer manually if, for example, you want to copy them to some repository; in that case switch off the enhancer flag in the MSBuild file using the OpenAccess wizard.

Saturday, March 24, 2007

How the automatic persistence magic is woven - Part 1

One of the benefits of O/R Mapping is supposed to be transparency of loading and storing data. You should not need to care exactly when and how data is stored. Just focus on creating, manipulating, and selecting your objects. Except for some general interaction with the O/R Mapping infrastructure, you just do whatever you would do with your objects if they were not persistent:

   19 Person p;

   20 p = new Person("Eric", "Doe", new DateTime(1972, 2, 17),

   21                new Address("123 Wisconsin Ave NW", "Washington, DC", "22099", "USA"));

   22 os.Add(p);

   23 

   24 p.firstname = "Denis";

   25 p.homeAddress.city = "San Francisco, CA";

Lines 19..21 create two objects, lines 24 and 25 change them. Just line 22 is a tribute to the O/R Mapper: you register the root object of a persistent object tree with it. But this is all straightforward OOP; you deal with your own objects/classes and not some generic stuff like an ADO.NET DataSet.

What you don´t see, though, is how your objects are persisted. You don´t call a Save() method on individual objects - at least not with OpenAccess; other O/R Mappers might require you to do so. In particular, you don´t need to care about making each and every individual object persistent: dependent objects like the Address in line 21 are automatically persisted with their parent objects. At least that´s how OpenAccess works; it´s called persistence by reachability.

However, at some point you have to tell the O/R Mapper to persist the changes to the overall set of persistent objects you´ve loaded and modified, deleted, or created. With OpenAccess this is done by committing a transaction, which means: before you change anything on persistent objects you need to open such a transaction, either through the OpenAccess API:

   15 using (OpenAccess.IObjectScope os = OpenAccess.Database.Get("DatabaseConnection1").GetObjectScope())

   16 {

   17     os.Transaction.Begin();

   18     ...

   26     os.Transaction.Commit();

   27 }

... or use the .NET System.Transactions namespace:

   45 OpenAccess.Database db = OpenAccess.Database.Get("DatabaseConnection1");

   46 db.Properties.TransactionProvider = OpenAccess.TransactionProvider.TransactionScope;

   47 using (OpenAccess.IObjectScope os = db.GetObjectScope())

   48 {

   49     using (System.Transactions.TransactionScope tx = new System.Transactions.TransactionScope())

   50     {

   51         ...

   57     }

   58 }

Warning: Using TransactionScope like above (what OpenAccess calls implicit transactions) only works with SQL Server and Oracle 9i (or higher), but not with SQL Express! Also, you should not mix OpenAccess transaction usage with System.Transactions, since you would have to switch between the two modes by setting the database´s TransactionProvider property (line 46 above) all the time, and the TransactionProvider needs to be set before an IObjectScope for the database is opened.

So the overall signal to the O/R Mapper to store any changes made to persistent objects is a transaction´s commit. Only during this commit are SQL INSERT, UPDATE, and DELETE statements issued to transfer the current state of the objects to the database. That also means a real database transaction is only needed at that point.

 

Now, the magic of O/R Mapping happens during lazy loading and during commit. How does the O/R Mapper know when to load more objects, e.g. the Address for a Person? And how does it determine which objects to store and which field values have changed?

This magic obviously has to be woven by code behind the scenes. The class definitions are devoid of any infrastructure code for that purpose. Remember: you just need to adorn a class with the OpenAccess.Persistent attribute to make it a persistent class.

Here´s the solution to the magic trick. Let´s look into the magician´s top hat:

  225 using (IObjectScope os = Database.Get("DatabaseConnection1").GetObjectScope())

  226 {

  227    os.Transaction.Begin();

  228    Person p = new Person("Eric", "Doe", new DateTime(0x7b4, 2, 0x11), 
                               new Address("123 Wisconsin Ave NW", "Washington, DC", "22099", "USA"));

  229    os.Add(p);

  230    Person.OpenAccessEnhancedSetfirstname(p, "Denis");

  231    Address.OpenAccessEnhancedSetcity(Person.OpenAccessEnhancedGethomeAddress(p), "San Francisco, CA");

  232    os.Transaction.Commit();

  233 }

Thanks to Lutz Roeder´s Reflector, this is the code I was able to extract from the assembly that the above code (lines 19 to 27) was compiled into. Compare lines 230 and 231 in this listing with lines 24 and 25 in the first listing above: there is an astonishing difference. The simple field assignments got replaced by calls to static methods. But I did not declare any static methods on the Person and Address classes. Where do they come from? For an answer see my next article...

Saturday, March 10, 2007

Updating VOA to version 4.3.2

Today I downloaded a new version of VOA from Vanatec´s site. Vanatec´s support had told me they had fixed some of the stuff I complained about in this blog. Well, let me see...

1. The "installation experience" was better than last time. They indeed put all install files in a directory tagged with the version number. This makes for less clutter on my hard disk, I´d say.

2. Installation was smooth. The setup process detected I already had a VOA version installed and even a VS2005 instance running which used it. I closed VS and OpenAccess was patched instead of reinstalled.

3. After re-opening my VOA sample solution in VS2005, though, I was unable to recompile it. The referenced OpenAccess.20 assembly was out of sync with the newly installed version. Since I like to keep all my libs local to a project [download samples], I now finally copied the new VOA files to my libs folder: OpenAccess.20.dll, OpenAccess.20.xml, and OpenAccess.20.kernel.netmodule. In the future I will reference the assembly from there, so I only need to replace it in one location after installing a new version.

4. The ADO.NET |DataDirectory| placeholder now works for VOA. So I set the Copy to Output Directory property to "Copy always" for my local databases. Now the VOA enhancer first applies all necessary schema changes to the mdf residing in the project directory, then VS2005 copies this mdf to the output directory. That way my sample programs always start with a fresh database copy.

5. There was indeed a problem with the AttachDbFilename option I talked about earlier. At first I did not realize it, but then it became obvious: VOA registered the local database with the global SQL Server Express instance. That way the local database stayed attached even after the application process had finished. VOA did not use the User Instance option for SQL Server connection strings. But now it does! So AttachDbFilename works as expected. My App.Config for the CRUD sample now looks like this:

    8 <connections>

    9   <connection id="DatabaseConnection1">

   10     <databasename>VOASimpleCRUDDB</databasename>

   11     <servername>.\SQLEXPRESS</servername>

   12     <integratedSecurity>True</integratedSecurity>

   13     <backendconfigurationname>mssqlConfiguration</backendconfigurationname>

   14     <connectionParams>AttachDbFilename=|DataDirectory|\VOASimpleCRUDDB.mdf</connectionParams>

   15   </connection>

   16 </connections>
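For comparison, my reconstruction of the equivalent plain ADO.NET connection string would look roughly like this (this is not something VOA generates):

// Reconstruction for comparison only:
string cnString = @"Data Source=.\SQLEXPRESS;" +
                  @"AttachDbFilename=|DataDirectory|\VOASimpleCRUDDB.mdf;" +
                  @"Integrated Security=True;User Instance=True";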

 
I´m impressed by how fast Vanatec´s support reacted. Great! Thank you, guys!

The basic CRUD operations - Part 2/2

In Part 1 of "The basic CRUD operations" I showed you how to add to and retrieve objects from a database. Now here´s how to change and delete them:

Updating objects

Updating objects is the easiest part of all. I can´t even show what to do, because you don´t need to do anything special :-) You just load the objects you want to modify (lines 81ff below), modify them (lines 88f) - and that´s it. However, you need to do this within the scope of a VOA transaction (lines 79 and 91). But that should be obvious.

   77 using (OpenAccess.IObjectScope os = OpenAccess.Database.Get("DatabaseConnection1").GetObjectScope())

   78 {

   79     os.Transaction.Begin();

   80 

   81     OpenAccess.IQuery q;

   82     q = os.GetOqlQuery("select p from PersonExtent p where p.firstname='Sammy'");

   83     OpenAccess.IQueryResult qr = q.Execute();

   84 

   85     Person p = (Person)qr[0];

   86     Console.WriteLine("{0} {1}, {2}", p.firstname, p.lastname, p.homeAddress.city);

   87 

   88     p.lastname += "x";

   89     p.homeAddress.city += "y";

   90 

   91     os.Transaction.Commit();

   92 }


How does VOA know during Transaction.Commit() which objects to persist? Because each persistent object does its own change tracking. So during Commit() VOA just checks the objects it knows about for a change flag and stores just the ones where the flag is true. More about this in a later post. For now take this as a part of VOA´s magic ;-)

As long as you just want to read from the database and don´t want to change the objects loaded, you don´t need a transaction. But as soon as you start manipulating persistent fields, a VOA transaction needs to be active! Lines 99 to 107 just open a "connection" to the database and read an object. Then, just before changing the value of a field of this object, a transaction is opened (line 110) and committed once all changes have been applied (line 115).

   99 using (OpenAccess.IObjectScope os = OpenAccess.Database.Get("DatabaseConnection1").GetObjectScope())

  100 {

  101     // reading

  102     OpenAccess.IQuery q;

  103     q = os.GetOqlQuery("select p from PersonExtent p where p.firstname='Sammy'");

  104     OpenAccess.IQueryResult qr = q.Execute();

  105 

  106     Person p = (Person)qr[0];

  107     Console.WriteLine("{0} {1}, {2}", p.firstname, p.lastname, p.homeAddress.city);

  108 

  109     // manipulating

  110     os.Transaction.Begin();

  111 

  112     p.lastname += "X";

  113     p.homeAddress.city += "Y";

  114 

  115     os.Transaction.Commit();

  116 }

 
What happens if you forget to open a transaction and try to change a field? An exception is thrown telling you there´s no open transaction.

Essentially this is no different from what you should do when you work with ADO.NET. Any manipulation of data should be done within a transaction. For single DML statements this is ensured by the RDBMS automatically. But if you want to store the changed content of a DataSet, you should always explicitly wrap the Update() calls of all the DataAdapters involved in a transaction, since more than one DML statement gets issued.
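In ADO.NET that wrapping looks roughly like this; the adapter, DataSet, and table names are made up, and the adapters are assumed to have their commands configured against the connection cn:

// Sketch: two DataAdapters updating one DataSet inside a single SqlTransaction.
using (SqlConnection cn = new SqlConnection(connectionString))
{
    cn.Open();
    SqlTransaction tx = cn.BeginTransaction();
    try
    {
        foreach (SqlDataAdapter da in new SqlDataAdapter[] { personsAdapter, addressesAdapter })
        {
            da.InsertCommand.Transaction = tx;   // every command has to be enlisted explicitly
            da.UpdateCommand.Transaction = tx;
            da.DeleteCommand.Transaction = tx;
        }

        personsAdapter.Update(ds, "Persons");    // several DML statements may be issued here...
        addressesAdapter.Update(ds, "Addresses");

        tx.Commit();                             // ...so they all succeed or fail together
    }
    catch
    {
        tx.Rollback();
        throw;
    }
}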

Deleting objects

O/R Mapping differs from ADO.NET in that you first need to load data in order to delete it. With ADO.NET you can just issue a SQL DELETE statement to delete thousands of rows in many tables. But O/R Mappers traditionally do not provide means for that kind of bulk data manipulation. O/R Mapping, like ODBMSs, is about working with single objects; SQL, on the other hand, is for batch operations.

To delete an object from the database, pass it to the "connection"´s Remove() method:

  124 OpenAccess.IObjectScope os;

  125 os = OpenAccess.Database.Get("DatabaseConnection1").GetObjectScope();

  126 os.Transaction.Begin();

  127 

  128 OpenAccess.IQuery q;

  129 q = os.GetOqlQuery("select p from PersonExtent p");

  130 OpenAccess.IQueryResult qr = q.Execute();

  131 foreach (Person p in qr)

  132 {

  133     Console.WriteLine("Deleting: {0}", p.firstname);

  134     os.Remove(p);

  135 }

  136 

  137 os.Transaction.Commit();

  138 os.Dispose();

 
Of course this has to happen within a transaction like any other data manipulation.

What if this object references other objects? How can referential integrity be enforced? As shown in Part 1, you can mark such dependencies with the attribute OpenAccess.Depend. However, the dependent objects still have to be loaded before they are deleted. This is done by the VOA engine; nevertheless it causes traffic on the database. So before deleting tons of objects living in deep hierarchies like this, think about it. Maybe you want to fall back on ADO.NET and issue a couple of DELETE statements in a transaction. I call this hybrid data access: wherever O/R Mapping provides a benefit, use it; but if something is too hard or too costly to do with O/R Mapping, go back to ADO.NET.
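Such a hybrid bulk delete could look like this (a sketch; the table and column names are made up and have to match whatever schema OpenAccess actually generated):

using (SqlConnection cn = new SqlConnection(connectionString))
{
    cn.Open();
    using (SqlTransaction tx = cn.BeginTransaction())
    {
        // Delete dependent rows first to keep referential integrity intact.
        SqlCommand delAddresses = new SqlCommand(
            "DELETE FROM addresses WHERE person_id IN (SELECT id FROM persons WHERE lastname = @name)", cn, tx);
        delAddresses.Parameters.AddWithValue("@name", "Doe");
        delAddresses.ExecuteNonQuery();

        SqlCommand delPersons = new SqlCommand("DELETE FROM persons WHERE lastname = @name", cn, tx);
        delPersons.Parameters.AddWithValue("@name", "Doe");
        delPersons.ExecuteNonQuery();

        tx.Commit();
    }
}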

However, if you delete data using SQL, thereby bypassing the O/R Mapper, you need to be careful to also discard any in-memory objects that might refer to that data!

The basic CRUD operations - Part 1/2

Before I drill down into single features of O/R Mapping, I´d like to give you an overview of how to do the most common operations on RDBMS data with OpenAccess: how do you implement CRUD functionality using an O/R Mapper?

Persistent classes

First things first. Here´s the class model I want to use for today´s experiment. It´s a little more elaborate than before:

    7 [OpenAccess.Persistent]

    8 class Person

    9 {

   10     public string firstname;

   11     public string lastname;

   12     public DateTime dob;

   13 

   14     [OpenAccess.Depend]

   15     public Address homeAddress;

   16 

   17     private Person() { }

   18 

   19     public Person(string firstname, string lastname, DateTime dob, Address homeAddress)

   20     {

   21         this.firstname = firstname;

   22         this.lastname = lastname;

   23         this.dob = dob;

   24         this.homeAddress = homeAddress;

   25     }

   26 }

 

    7 [OpenAccess.Persistent]

    8 class Address

    9 {

   10     public string streetAddress;

   11     public string city;

   12     public string zip;

   13     public string country;

   14 

   15     private Address() { }

   16 

   17     public Address(string streetAddress, string city, string zip, string country)

   18     {

   19         this.streetAddress = streetAddress;

   20         this.city = city;

   21         this.zip = zip;

   22         this.country = country;

   23     }

   24 }

 

I want to persist objects of two different kinds: persons and addresses. Each Person can have a single home address assigned. The existence of an Address object should depend on the Person it´s assigned to. So I´m defining an aggregation here. That´s why the homeAddress field is adorned with the OpenAccess.Depend attribute. It tells the O/R Mapper to delete an Address together with its Person. Making sure the same Address is not assigned to more than one Person is thus a matter of the business logic.

Also noteworthy is the empty constructor. It´s necessary for the O/R Mapper to instantiate empty objects, which then get filled from the database. This parameterless ctor can be private, though, if you don´t want to publish it.
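Conceptually - and this is just an illustration of why such a ctor is needed, not necessarily how OpenAccess does it - a mapper can reach even a private parameterless ctor via reflection:

// Illustration only: create an empty Person via its private parameterless ctor,
// then fill its fields from a database row. Requires System.
Person empty = (Person)Activator.CreateInstance(typeof(Person), true);   // true = non-public ctors allowed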

In my previous sample I did not need an explicit parameterless ctor, because I did not define any ctor at all and the compiler generated a default one. Here, though, I´m defining a ctor with parameters, so I have to add the parameterless ctor myself.

Creating objects

Creating objects and persisting them is as easy as with non-persistent objects.

   15 using (OpenAccess.IObjectScope os = OpenAccess.Database.Get("DatabaseConnection1").GetObjectScope())

   16 {

   17     os.Transaction.Begin();

   18 

   19     Person p;

   20     p = new Person("John", "Doe", new DateTime(1972, 2, 17),

   21                    new Address("123 Wisconsin Ave NW", "Washington, DC", "22099", "USA"));

   22     os.Add(p);


However, to make an object and its children persistent, you only have to add the root object to the OpenAccess "connection", the ObjectScope. VOA will automatically persist all objects "hanging" below this root, like the Address in the above example. That´s called persistence by reachability. The O/R Mapper will traverse the tree of objects and persist all objects whose classes are marked as OpenAccess.Persistent. If it hits an object with a field of a type it cannot persist, though, it throws an exception.

So in case you want to exclude a field from being persisted, e.g. a callback delegate, mark it with the OpenAccess.Transient attribute:

    7 public delegate void OnProgressDelegate(double pct);

    8 

    9 [OpenAccess.Persistent]

   10 class Person

   11 {

   12     ...

   19     [OpenAccess.Transient]

   20     public OnProgressDelegate onProgress=null;

Retrieving objects

Loading data from the database into objects works almost like with ADO.NET: you describe the data you want using a query language and then execute the query. The query language offered by VOA is OQL (Object Query Language), a language defined by the ODMG that purposely resembles SQL. A query to select all Person objects would look like this:

select * from PersonExtent

Putting "Extent"  after the class name is necessary to make clear you mean "all object of this class". At least this is how OpenAccess wants it; I don´t see this convention in the official OQL specs.

What such a query returns, though, is (usually) not a projection of a table, i.e. a list of fields arranged in rows, but a list of objects of the type you specified. The above query would return Person objects - and just Person objects. There is no such thing as a join in O/R Mapping. When querying you´re always asking for a single kind of object to be returned.

To apply a filter to the objects "in an extent" (of a class) you can specify a where clause as in SQL. For example, the following piece of code asks for all Person objects with "Sammy" as their firstname:

   50 OpenAccess.IQuery q;

   51 q = os.GetOqlQuery("select p from PersonExtent p where p.firstname='Sammy'");

   52 OpenAccess.IQueryResult qr = q.Execute();

   53 foreach (Person p in qr)

   54     Console.WriteLine("{0} {1}, {2}; {3}, {4}", p.firstname, p.lastname, p.dob, p.homeAddress.city, p.homeAddress.country);

 
See how the usual dot-notation is used in the query string? The p is defined as the placeholder for all Person objects to check. It´s like the running element variable of a foreach statement. You could translate the above query into something like this:

List<Person> results = new List<Person>();
foreach(Person p in PersonExtent)
    if (p.firstname=="Sammy") results.Add(p);

Of course this also works with object references in persistent objects. You´re not limited to filtering objects just according to their own fields; you can descend the object tree hanging down from them. Here´s a query to retrieve all persons living in a particular city:

select p from PersonExtent p where p.homeAddress.city='Hamburg'

The dot-notation is not even limited to single instances like the single address of a person. If a person had more than one address, e.g. stored in an array, the same syntax would work to select all persons with at least one address in the specified city.

To execute a query with OpenAccess, you need to ask the "connection" to create an IQuery object for you. Using the ObjectScope as a factory for query objects guarantees that they are bound to a "connection". Then you Execute() the query and get an IQueryResult result set which you can iterate through or access directly, e.g. qr[0].

The objects returned are of the class you specified in your query. They are not (!) some kind of proxy objects like with NHibernate which just look like a persistent class. So if I had defined a small class hierarchy like

Person <- Student

and would issue the query

select * from PersonExtent

then I could apply the following cast if necessary:

Student s = (Student)qr[0];

And since the objects returned are of the specified class, of course you can use them as if they had been in memory all along. When accessing their fields, their values are either already there or get loaded transparently by the O/R Mapper. Take line 54 in the example above: although the code asked for Person objects, it´s no problem to access the Address object they point to, which was never explicitly loaded. In fact, this kind of transparent loading - or lazy loading, as it´s called - is at the core of O/R Mapping. Not having to go back and explicitly load lower levels of an object tree is what makes O/R Mappers so convenient to use compared with ADO.NET - at least as long as such transparent loading works, which depends on the application´s architecture.
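Conceptually - and much simplified compared to what really happens behind the scenes - lazy loading boils down to a check on first access, something like this (IAddressLoader is a made-up stand-in for the O/R Mapper´s loading machinery):

// Simplified illustration, not OpenAccess´ generated code.
interface IAddressLoader
{
    Address LoadAddressFor(object owner);
}

class LazyPerson
{
    private Address homeAddress;
    private bool homeAddressLoaded;
    private IAddressLoader loader;        // hypothetical hook into the mapper

    public Address HomeAddress
    {
        get
        {
            if (!homeAddressLoaded)
            {
                homeAddress = loader.LoadAddressFor(this);   // hits the database once
                homeAddressLoaded = true;
            }
            return homeAddress;
        }
    }
}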

See here for Part 2 on CRUD operations...

Monday, March 05, 2007

Using a local SQL Express database

I like to keep my software artifacts together during development. Preferably I organize them under a single folder for a whole project. That´s also true for databases. Fortunately SQL Server 2005 lets you accomplish this for its databases by using the AttachDbFilename parameter in a connection string.
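In plain ADO.NET, such a connection string looks roughly like this (the path is just an example):

// Example only - adjust the path to your own project folder.
string cnString = @"Data Source=.\SQLEXPRESS;" +
                  @"AttachDbFilename=C:\Projects\MySample\MyDatabase.mdf;" +
                  @"Integrated Security=True;User Instance=True";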

Using this feature I now want to switch from a global sample database to a local database. But that´s easier said than done. Where should I enter this connection string option? There is no place for it in the VOA wizard, and I don´t see a connection string in the App.Config generated by the wizard.

Fortunately the VOA support was able to help me. Here´s the relevant part of my App.Config:

    8 <connections>

    9   <connection id="DatabaseConnection1">

   10     <databasename>VOASamples01SimpleWithLocalDB</databasename>

   11     <servername>.\SQLEXPRESS</servername>

   12     <integratedSecurity>True</integratedSecurity>

   13     <backendconfigurationname>mssqlConfiguration</backendconfigurationname>

   14     <connectionParams>AttachDbFileName=C:\Samples\VOASamples01\SimpleWithLocalDB\VOASamples01SimpleWithLocalDB.mdf</connectionParams>

   15   </connection>

   16 </connections>


I created a new project called SimpleWithLocalDB which contains exactly the same code as my first one. I also used the VOA wizard in the same way and just put in a different database name.

Then I added a SQL database through the Add Item menu item; it was put into the project directory. I adjusted the App.Config as shown above so it pointed to the newly created local database and compiled the project. All went well.

However, when I then ran my test I realized that the database which was automatically copied to the execution directory was not changed (copying a local database for execution is the default in VS2005). Instead, the database in the project directory was updated with the objects´ data the code stored. Sure, this should have been obvious, since the connection string pointed to the database in the project folder... but from using ADO.NET I was kind of used to accessing a temporary copy in the execution directory. But of course, there I used |DataDirectory| as a placeholder for the actual path to the database.

Unfortunately, |DataDirectory| does not work with VOA. VOA needs a written-out full path to the database with AttachDbFilename. It currently can only deal with a single database for automatic schema adaptation as well as for persisting objects. So I left the path to the project folder database in the App.Config and switched the Copy to Output Directory property of the database to "Do not copy" (see figure above).

Now my simple sample works just fine with a local database. However, I suggest Vanatec improve its connection string handling a bit and understand |DataDirectory|. That would make rolling back test data easier.

Is the current limitation a major drawback, though? No, I don´t think so. Most projects use central/global databases anyway like I did in my initial sample.

The only real (if small) drawback I see is that you need to adjust such database paths in the samples you can download. Putting a placeholder of my own into the connection string would not help, since VOA needs the connection string not only at runtime but also at compile time.

But anyway... Let´s move on with actually persisting objects.

 

[see "Updating VOA to version 4.3.2" for an update on some of the issues raised in this post]