[Openmcl-devel] Moving large amount of data with OpenMCL
Andrew P. Lentvorski, Jr.
bsder at mail.allcaps.org
Sat Jan 15 14:14:18 PST 2005
I'm about to start the process of moving an old program of mine from
Python to OpenMCL.
The program processes large (>1 GB) datasets stored as XML.
However, pulling this data in from disk is slow; sorting it is even
slower.
So the question is: how should I store this, and how should I
pull it into memory for efficiency?
There are probably two different questions here.
First, how do I store and process the stuff meant for interchange (i.e.,
it goes to disk and possibly to another program)?
When you strip away the XML stuff, the records are effectively:
((xmin xmax) (ymin ymax) objectid (objectid linkage) (polygon description))
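Concretely, if I store those records as s-expressions instead of XML, I'm
imagining each one mapping onto something like this (struct and slot names
are just my guesses, nothing settled):

;; Rough sketch only; field names are placeholders.
(defstruct geom-record
  xmin xmax            ; bounding box
  ymin ymax
  objectid             ; this object's id
  linkage              ; the (objectid linkage) part
  polygon)             ; the polygon description

(defun list->geom-record (form)
  "Turn one ((xmin xmax) (ymin ymax) objectid linkage polygon) list
into a geom-record."
  (destructuring-bind ((xmin xmax) (ymin ymax) objectid linkage polygon) form
    (make-geom-record :xmin xmin :xmax xmax
                      :ymin ymin :ymax ymax
                      :objectid objectid
                      :linkage linkage
                      :polygon polygon)))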
In any other language, I would normally write some form of loop with
parsing callbacks. However, that strikes me as very "un-Lispy". This
feels like it should be macro magic, but I'm just not experienced
enough at Lisp to know what I should be doing.
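The most direct thing I can picture is just reading forms until EOF and
collecting them, along the lines of the sketch below (again, only a guess
at what's idiomatic, reusing the list->geom-record converter from above):

(defun load-records (pathname)
  "Read one record form at a time and collect the records into a vector."
  (with-open-file (in pathname :direction :input)
    (loop with records = (make-array 0 :adjustable t :fill-pointer 0)
          for form = (read in nil :eof)
          until (eq form :eof)
          do (vector-push-extend (list->geom-record form) records)
          finally (return records))))

;; e.g. (sort (load-records "data.sexp") #'< :key #'geom-record-xmin)

Is that reasonable, or is there a better (macro-based?) idiom for this?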
Second, is there a way that I can "freeze" the data image of a running
OpenMCL instance and push it to disk for a later reload? It would be
nice not to have to reload, reprocess, and re-sort this data every
time. Being able to just let OpenMCL suck in a memory image or
something would be a lot better.
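Is ccl:save-application meant for this kind of thing? I'm picturing
roughly the following (the details are almost certainly off):

;; Hypothetical: after the big load/sort, keep the result reachable
;; from a global so it survives in the saved image.
(defvar *sorted-records* nil)

;; ... (setf *sorted-records*
;;           (sort (load-records "data.sexp") #'< :key #'geom-record-xmin)) ...

;; Then dump the whole heap as an image to reload later.
;; (I gather this writes the image and then exits the running lisp.)
(ccl:save-application "preprocessed.image")

Then I'd start the OpenMCL kernel against that saved image instead of the
default one. Is that the sanctioned approach, or is there a better way?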
Thanks,
-a