Any more comments on this topic?
I've decided to archive what I call "outputs" (reports etc.) as well as job records. The routine also compiles the data into a .zip file, for storage and/or recall as the need arises.
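The zip-compilation step can be sketched as follows - a minimal sketch in Python (my choice for illustration; the actual routines presumably live in a dBASE-style environment), assuming records are plain dictionaries and the intermediate file is a .csv. The function name and layout are hypothetical, not the original routine:

```python
import csv
import zipfile
from pathlib import Path

def archive_records(records, fieldnames, archive_path):
    """Write selected records to an intermediate .csv,
    then compile it into a .zip archive for storage/recall."""
    archive_path = Path(archive_path)
    csv_path = archive_path.with_suffix(".csv")
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(csv_path, arcname=csv_path.name)
    csv_path.unlink()  # remove the intermediate .csv once zipped
    return archive_path
```

Recalling the data is then just unpacking the .zip and re-importing the .csv.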
And at the moment (that is, these last few weeks) I'm looking into "data migration" - a topic that may warrant its own thread. In a phrase:- how to preserve legacy data.
In an attempt to keep things straight in my own mind, I use the following rough definitions:-
1) Archiving: preserving "historical" (or closed) data*
2) Importing: bringing in data from "outside"
3) Exporting: sending out data to various "outside" formats (.csv**, .xls etc.)
4) Sharing: passing updated data between copies of the same database system running on different computers at different locations
5) Migration: passing data between various - but different - database systems (whilst preserving as much valid data as possible) and re-coding if necessary (and possible)***
My methods are:-
1) Selection of records then compilation to .zip file(s)
2) Importing from .csv to native datafiles (.dbf)
3) Exporting from native datafiles (.dbf) to a selected format (.csv, .txt, .xls etc.)
4) Selection of records then compilation to .zip file(s) for unpacking at the receiving computer
5) Getting legacy data into .xls, saving as .csv, then importing into a native datafile (.dbf) for cleaning up and further processing
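Methods 2 and 3 above are essentially a round trip through .csv. A minimal sketch in Python (names are mine, and plain dictionaries stand in for the native .dbf records):

```python
import csv

def export_records(records, fieldnames, csv_path):
    """Method 3: export records to a .csv file (an 'outside' format)."""
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)

def import_records(csv_path):
    """Method 2: bring .csv data back in as a list of records,
    ready for loading into the native datafile."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))
```

Note that everything comes back as text on import; re-typing numbers and dates is part of the "cleaning up" stage.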
It's that last bit - "cleaning up and further processing" - that has resulted in various bottlenecks (log-jams?), and I'm wondering how others - and other systems - have dealt with it (if at all)!
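For what it's worth, one way to take the edge off the clean-up stage is a single normalising pass over every imported record before any further processing. A sketch (the field names and "empty" markers are illustrative assumptions, not anyone's actual data):

```python
def clean_record(record):
    """Tidy one imported record: strip stray whitespace,
    normalise field names, and turn obvious 'empty' markers into None."""
    cleaned = {}
    for field, value in record.items():
        if isinstance(value, str):
            value = value.strip()
            if value in ("", "N/A", "-"):  # assumed empty markers
                value = None
        cleaned[field.strip().lower()] = value
    return cleaned
```

Anything this pass cannot decide (ambiguous dates, unknown codes) can be flagged for manual review rather than silently guessed at.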
* With an option to clear out (remove, delete) old records once they have been archived.
** Traditional Comma-Separated Value data files.
*** Some database systems use native "codes" - eg, job codes etc. - within their routines and it's nice to be able to re-assign these when data is being migrated in, if possible (and if their structure and rationale is published, or otherwise understood).
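That code re-assignment, where the incoming structure is understood, can be as simple as a lookup table applied during migration. A sketch (function name and mapping are hypothetical):

```python
def remap_codes(records, field, code_map):
    """Re-assign native codes (e.g. job codes) while migrating data in.
    Unknown codes are left as-is so nothing is silently lost."""
    for record in records:
        old = record.get(field)
        record[field] = code_map.get(old, old)
    return records
```

Keeping unknown codes intact (rather than dropping or defaulting them) makes it easy to list them afterwards and extend the mapping by hand.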