FoxPro cursor size
This seems like such an easy problem, but I can't find a solution anywhere. My co-worker and I are working on an application that makes use of the FoxPro XML dump facilities. It works great, but we want to split the table into multiple files based on some size constraints.
This seems like it should be the easy part: how do you find the size of a cursor in FoxPro?
RECSIZE() returns the length of an individual row in bytes; that times RECCOUNT() gives you the size of the fixed-width data (memo contents are stored in a separate file, so only a small pointer counts toward RECSIZE()). All the elements already discussed are accurate.
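As a minimal sketch of that math (assuming a cursor open under the placeholder alias MyCursor):
* Approximate size of the fixed-width portion of a cursor, in bytes.
* Memo contents live in a separate file and are not counted here.
nBytes = RECSIZE( "MyCursor" ) * RECCOUNT( "MyCursor" )
? nBytes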
With respect to memo fields, if you need to know how large THEY are, you might want to add a new integer column such as "MemoLength" to your table structure, then populate it with
REPLACE ALL MemoLength WITH LEN( ALLTRIM( YourMemoField ))
You can then use MemoLength to work out your breakdown groups, taking this column into consideration along with RECSIZE() times the number of rows you want to extract.
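A sketch of that setup, using the placeholder names from above (YourTable, YourMemoField):
* Add the integer column once (requires exclusive use), then populate it.
ALTER TABLE YourTable ADD COLUMN MemoLength I
REPLACE ALL MemoLength WITH LEN( ALLTRIM( YourMemoField ))
* Rough per-record estimate: fixed width plus the actual memo text length
? RECSIZE( "YourTable" ) + MemoLength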
Additionally, you might want to run a query based on the table's primary key column, which you can use as a link, and do something like...
select YourPrimaryKey, len( alltrim( YourMemoField )) as MemoLength from YourTable into cursor SomeHoldingCursor readwrite
...or run the same SELECT with INTO TABLE MemSizeTable if you want the results persisted.
Build an index on the MemSizeTable and you can use a join on that to get more info. This way, it won't distort your original record size, nor disrupt your original table structure, yet with a relation, you can still extract the elements you need.
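For instance, a sketch of that join, keeping the size data external to the original table (the names are placeholders from above):
* Link the memo sizes back to the source rows via the primary key
SELECT t.YourPrimaryKey, m.MemoLength ;
   FROM YourTable t ;
   JOIN MemSizeTable m ON t.YourPrimaryKey = m.YourPrimaryKey ;
   INTO CURSOR SizedRows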
If you mean file size, you can find the file a cursor relates to by calling the DBF() function with the cursor as an alias, checking that the return value extension is .dbf, and then use file functions to read the file size. The cursor may be in-memory though (the reported 'filename' will have a .tmp extension, if I remember right) so an alternative would be to use RECCOUNT() (to get the number of rows) combined with AFIELDS() (to get the size of each row) to approximate the file size. (In-memory cursors can sometimes be forced to disk by including a NOFILTER clause in the generating query)
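A sketch of that check, again assuming the placeholder alias MyCursor:
cFile = DBF( "MyCursor" )
IF UPPER( JUSTEXT( cFile )) == "DBF" AND FILE( cFile )
   * On-disk cursor: read the physical file size with ADIR()
   ADIR( aInfo, cFile )
   nBytes = aInfo[ 1, 2 ]
ELSE
   * In-memory cursor (.tmp): approximate from the record layout
   nBytes = RECSIZE( "MyCursor" ) * RECCOUNT( "MyCursor" )
ENDIF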
Here's a fully functional example based on a sample cursor and a bogus record. The critical routine is DumpXML(); it needs the alias of the cursor to be dumped, the size you want to cap each file at (in KB), and the file-name prefix for the XML output. It auto-generates a numbered sequence, e.g. MyXMLOutput1.xml, MyXMLOutput2.xml, MyXMLOutput3.xml, for however many files it needs. Took me about 15 minutes.
CREATE CURSOR SomeTest ;
( SomeField1 c(10),;
AnotherField i,;
SomeNumber N(8,2),;
MemoFld m,;
SomeDateTime t;
)
INSERT INTO SomeTest VALUES ( "testchar10", 9403, 12345.78, "some memo value string", DATETIME() )
DumpXML( ALIAS(), 300, "MyXML" )
FUNCTION dumpXML
LPARAMETERS cAliasName, nSizeLimit, cNameOfXMLOutput
IF NOT USED( cAliasName )
RETURN ""
ENDIF
*/ Size limit is given in KB; convert to bytes
nSizeLimit = nSizeLimit * 1024
SELECT ( cAliasName )
*/ Get a copy of the structure without disrupting original
USE IN SELECT( "MySample" ) && pre-close in case left open from prior cycle
SELECT * ;
FROM ( cAliasName ) ;
WHERE RECNO() = 1 ;
INTO CURSOR MySample READWRITE
*/ Guard: if the source cursor is empty there is nothing to dump
IF EOF( "MySample" )
USE IN MySample
RETURN ""
ENDIF
SELECT MySample
*/ Populate each field with maximum capacities... typically
*/ critical for your char based fields
AFIELDS( aActualStru )
cMemoFields = ""
lHasMemoFields = .f.
FOR I = 1 TO FCOUNT()
cFieldName = FIELD(I)
DO CASE
CASE aActualStru[i,2] = "C"
replace &cFieldName WITH REPLICATE( "X", aActualStru[i,3] )
CASE aActualStru[i,2] = "L"
replace &cFieldName WITH .T.
CASE aActualStru[i,2] = "D"
replace &cFieldName WITH DATE()
CASE aActualStru[i,2] = "T"
replace &cFieldName WITH DATETIME()
CASE aActualStru[i,2] = "M"
*/ Default memo as a single character to ensure
*/ closing field name </endoffield> included in XML
replace &cFieldName WITH "X"
*/ if a MEMO field, add this element to a string
*/ to be macro'd to detect its size... Each record
*/ can contain MORE than one memo field...
*/ Ex: + LEN( ALLTRIM( MemoFld ))
lHasMemoFields = .T.
cMemoFields = cMemoFields + " + len( ALLTRIM( " + cFieldName + " ))"
CASE aActualStru[i,2] = "I"
*/ Integer, force to just 10 1's
replace &cFieldName WITH 1111111111
CASE aActualStru[i,2] = "N"
*/ Actual numeric and not an integer, double or float
*/ Allow for full length plus decimal positions
NumValue = VAL( REPLICATE( "9", aActualStru[i,3] - aActualStru[i,4] - 1 );
+ "." + REPLICATE( "9", aActualStru[i,4] ))
replace &cFieldName WITH NumValue
ENDCASE
ENDFOR
*/ Strip leading " + " from the string in case multiple fields
IF lHasMemoFields
cMemoFields = SUBSTR( cMemoFields, 3 )
ENDIF
cXML = ""
LOCAL oXML as XMLAdapter
oXML = CREATEOBJECT( "XMLAdapter" )
oXML.AddTableSchema( "MySample" )
oXML.ToXML( "cXML", "", .f. )
*/ Now, determine the size of the per record at its full length -- less memo
nSizeOfPerRecord = LEN( STREXTRACT( cXML, "<MySample>", "</MySample>", 1, 4 ))
*/ and the rest of the header per XML dump
nSizeOfSchema = LEN( cXML ) - nSizeOfPerRecord
*/ Now, back to the production alias to be split
SELECT( cAliasName )
nXMLCycle = 0
*/ Starting a new output file: seed the running size with the
*/ schema/header overhead that every XML dump carries
nNewSize = nSizeOfSchema
*/ Always blank out the temp cursor for each batch...
SELECT MySample
ZAP
SELECT ( cAliasName )
SCAN
IF lHasMemoFields
nAllMemoSizes = &cMemoFields
ELSE
nAllMemoSizes = 0
ENDIF
IF nNewSize + nSizeOfPerRecord + nAllMemoSizes > nSizeLimit
*/ The upcoming record will have exceeded capacity, finish XML
*/ with all records up to this point
nXMLCycle = nXMLCycle + 1
cNewFile = FULLPATH( cNameOfXMLOutput + ALLTRIM( STR( nXMLCycle )) + ".XML" )
oXML = CREATEOBJECT( "XMLAdapter" )
oXML.AddTableSchema( "MySample" )
*/ Generate the XML cycle of these qualified records...
oXML.ToXML( cNewFile, "", .t. )
*/ restart for next pass of data
nNewSize = nSizeOfSchema
*/ Always blank out the temp cursor for each batch...
SELECT MySample
ZAP
ENDIF
*/ Add record to total size...
nNewSize = nNewSize + nSizeOfPerRecord + nAllMemoSizes
*/ we have a record to be included in segment dump...
*/ scatter from the original table and gather into the temp
SCATTER MEMO NAME oFromOriginal
SELECT MySample
APPEND BLANK
GATHER MEMO NAME oFromOriginal
*/ back to original table driving the XML Dump process
SELECT ( cAliasName )
ENDSCAN
*/ if MySample still holds records not yet flushed by the limit, write those too
IF RECCOUNT( "MySample" ) > 0
*/ Write the final, partially filled batch
nXMLCycle = nXMLCycle + 1
cNewFile = FULLPATH( cNameOfXMLOutput + ALLTRIM( STR( nXMLCycle )) + ".XML" )
oXML = CREATEOBJECT( "XMLAdapter" )
oXML.AddTableSchema( "MySample" )
*/ Generate the XML cycle of these qualified records...
oXML.ToXML( cNewFile, "", .t. )
ENDIF
*/ Done with the "MySample" for cursor to XML analysis...
USE IN SELECT( "MySample" )
ENDFUNC