program ideas:-

1) the data carries the process [henceforth DCP programming]

each data item carries a link to every process by which it has been
manipulated. Time-related problems can be obviated:
for example, a change in tax at a specific point
only requires the data from that tax point onwards to use a
different [new] module, as applied to the RELEVANT DATA.
Allying the process to the data item is in direct contrast
to current program-bound processing!
In practice, when I programmed this in my trial system
I allowed the user to choose a process or use a default,
the use of a default being similar to, say, a current method.
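The tax example above can be sketched minimally in Python. The module names, rates, and dates below are invented for illustration and are not taken from the trial system; the point is only that the data item, not the program, selects the process:

```python
from dataclasses import dataclass
from datetime import date
from typing import Callable, Optional

# Hypothetical tax modules; the rates and cut-over date are invented.
def tax_2023(amount: float) -> float:
    return amount * 0.20

def tax_2024(amount: float) -> float:
    return amount * 0.25

@dataclass
class DataItem:
    """A data item that carries a link to the process applied to it."""
    amount: float
    created: date
    process: Optional[Callable[[float], float]] = None  # user-chosen, else default

def default_process(created: date) -> Callable[[float], float]:
    # The new module applies only from the tax point onwards.
    return tax_2024 if created >= date(2024, 1, 1) else tax_2023

def apply(item: DataItem) -> float:
    proc = item.process or default_process(item.created)
    return proc(item.amount)

print(apply(DataItem(100.0, date(2023, 6, 1))))  # 20.0 via tax_2023
print(apply(DataItem(100.0, date(2024, 6, 1))))  # 25.0 via tax_2024
```

As in the trial, the user may override the default by attaching a process to the item directly (`DataItem(..., process=tax_2024)`).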

2) the program suite should have a minimal file footprint

contrary to current thinking on the disassembling of
data into smaller, activity-related structures, this is
no longer necessary:
1) above makes agglomeration into time-sequential processing possible.
For example, I reduced the number of files required by my test suite
to 7 [header + detail = 14].
This potentially allowed complete restructuring of the files,
and of the views over them, according to schema changes, although in trial
I limited this 'on the fly' capability to the latter.
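A minimal sketch of the header + detail agglomeration: one header structure describes each logical record type, and one detail store holds all records in time sequence. The record types and field names are assumptions for illustration; the trial system's actual file layout is not reproduced here:

```python
from datetime import datetime

# Header: describes the structure of each logical record type.
header = {
    "invoice": ["customer", "amount"],
    "payment": ["customer", "amount"],
}

# Detail: one time-sequential store for all record types, replacing
# many activity-specific files.
detail = []

def append(kind: str, when: datetime, *values):
    assert len(values) == len(header[kind])
    detail.append((when, kind, dict(zip(header[kind], values))))

append("invoice", datetime(2024, 1, 5), "smith", 120.0)
append("payment", datetime(2024, 1, 9), "smith", 120.0)

# Time-sequential processing: replay every record in date order,
# regardless of type.
for when, kind, rec in sorted(detail):
    pass  # each record would invoke the process linked to it (idea 1)
```

Restructuring a view 'on the fly' then amounts to changing the header entry consulted, without touching the detail store.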

3) the program suite should have a minimal program footprint

this is effected by allowing the user to change the structure
of the file they are viewing.
This morphing goes hand in hand with security issues,
and it has the convenient aspect that it does not inhibit users
who would not conventionally be regarded as part of an organisation
from viewing data in an organised way, subject to security.
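One way the morphing-plus-security idea might look: the user chooses the fields of the view, and a security filter decides which of those fields their role may actually include. The roles, fields, and records below are invented for illustration:

```python
# Sample records; field names are assumptions for illustration.
records = [
    {"customer": "smith", "amount": 120.0, "cost": 80.0},
    {"customer": "jones", "amount": 300.0, "cost": 210.0},
]

# Fields each role is permitted to see; an "outside" user who is not
# conventionally part of the organisation still gets an organised view.
visible = {
    "staff":   {"customer", "amount", "cost"},
    "outside": {"customer", "amount"},
}

def view(role: str, fields: list) -> list:
    """Morph the view to the user's chosen structure, subject to security."""
    allowed = [f for f in fields if f in visible[role]]
    return [{f: r[f] for f in allowed} for r in records]

print(view("outside", ["customer", "cost", "amount"]))
# 'cost' is silently dropped for the outside role
```

The same single program serves every user; only the morphed view differs, which is what keeps the program footprint minimal.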

4) there should not be any inhibition of the siting of data

2) above should not limit where the data is situated; it merely
reduces the number of files per cluster.
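A trivial sketch of unrestricted siting: a small catalogue maps each logical file to wherever it happens to live, so reducing the file count says nothing about location. The site paths below are invented:

```python
# Catalogue of logical file name -> physical site; paths are invented.
catalogue = {
    "header": "//site-a/store/header.dat",
    "detail": "//site-b/store/detail.dat",
}

def locate(logical_name: str) -> str:
    """Resolve a logical file to its current site, wherever that is."""
    return catalogue[logical_name]

print(locate("detail"))  # //site-b/store/detail.dat
```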

5) the suite should interact with the user as rapidly as possible

this is incorporated within 2) and 3); however, as in all things, there may
need to be a measure of compromise between the minimisation aspects
and the flexibility incurred. This is particularly relevant to threading.
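The threading compromise might be sketched like this: slow work (say, restructuring a view on the fly) runs on a worker thread so the interaction with the user stays rapid. The 'restructure' below is a stand-in, not the trial system's code:

```python
import threading
import queue

results = queue.Queue()  # thread-safe channel back to the main loop

def slow_restructure(name: str):
    # Stand-in for restructuring a file view on the fly.
    results.put(name + " restructured")

worker = threading.Thread(target=slow_restructure, args=("detail",))
worker.start()
# ... the interface keeps responding to the user here ...
worker.join()
print(results.get())  # prints "detail restructured"
```

The compromise is real: each morphable view that is made flexible in this way costs a thread and a hand-off, so minimisation and flexibility pull against each other.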