Form has my table locked down tight even after docmd.close
Sorry for the wall of text, guys, but this one requires explaining; there's way too much code to post...
I'm importing fixed-width files into Access in a process that requires data entry. I import the file using TransferText with two import specs (one is global, the other is for a special circumstance).
I have a function that uses DAO to cycle through all the Field objects in the source TableDef and build a duplicate table that includes an AutoIncrement PK, so I have the ability to edit these records. I push the data into that table with INSERT INTO.
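Roughly, the duplication function works along these lines (a simplified sketch rather than the actual code; BuildEditableCopy is just an illustrative name, error handling is omitted, and field names are assumed to need no bracketing):

Public Sub BuildEditableCopy()
    Dim db As DAO.Database
    Dim tdfSrc As DAO.TableDef
    Dim tdfNew As DAO.TableDef
    Dim fld As DAO.Field
    Dim fldID As DAO.Field
    Dim strFieldList As String

    Set db = CurrentDb
    Set tdfSrc = db.TableDefs("tempExtract")
    Set tdfNew = db.CreateTableDef("tempExtractID")

    ' AutoNumber-style ID: a Long field flagged with dbAutoIncrField
    Set fldID = tdfNew.CreateField("ID", dbLong)
    fldID.Attributes = fldID.Attributes Or dbAutoIncrField
    tdfNew.Fields.Append fldID

    ' Duplicate the source fields; collect the names for the INSERT below
    For Each fld In tdfSrc.Fields
        tdfNew.Fields.Append tdfNew.CreateField(fld.Name, fld.Type, fld.Size)
        If Len(strFieldList) > 0 Then strFieldList = strFieldList & ", "
        strFieldList = strFieldList & fld.Name
    Next fld

    db.TableDefs.Append tdfNew

    ' Copy the data across; the AutoNumber column fills itself in
    db.Execute "INSERT INTO tempExtractID (" & strFieldList & ") " & _
               "SELECT " & strFieldList & " FROM tempExtract", dbFailOnError

    Set db = Nothing
End Sub

The field list collected in that loop is the same kind of list that ends up stored as the Xargs entry in CONFIG_t (described further down).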
It works great. Errors are found, and the user goes into data entry to manually correct them, which beats sifting through 400-character lines and reorganizing everything the way it's supposed to be.
The problem: when the data entry changes are made, a commit button is pressed that calls a function in a module outside the form. It closes the data entry form, pushes the information back to the original table minus the auto-incremented PK, and is SUPPOSED to DROP the replicated table with the IDs and generate a new one, searching once again for errors...
It pushes back to the original just fine, but it will not DROP the ID table. It always comes back with a message indicating the table is locked. I've noticed the table stays locked until all functions/subs exit. While stepping through the code I cannot delete it manually at any point; once execution has finished, I am able to remove it.
I am assuming that since I called this through a command on the form, the lock will not be released until all code finishes and the form's termination can run and do its thing. Any thoughts? Yes, this is very barbaric, but it works quite well; I just need to be able to rip this other table off the planet so I can re-drop an updated copy...
In the worst case I can make the user close the form and hit another button on the main form, but this is being designed heavily with user competence in mind. However, this now has my full attention, and I would like to at least find a solution even if it's not the optimal one.
-EDIT-
Two forms are used in this problem
FormA (Role: Load in and search for problems)
Examine button is pressed that:
- Uses TransferText based on predefined specs to import the file into tempExtract
- DAO runs over the Fields collection in the TableDef for tempExtract and creates a new table, tempExtractID
- Searches through the file to find errors. Errors are saved to a table, Problem_t, which contains Problem_ID (set from the ID field added to tempExtractID) and Description
- After these tasks execute, the initial form is successfully requeried to show a list of problems and the number of occurrences. A button gains visibility; its OnClick opens the Data Entry form.
- At this point in the code, after the DAO work, I can DROP the table tempExtractID. DAO is NOT used again and was only used to build the new table.
FormB - Data Entry Form
As soon as I open this form, the table tempExtractID becomes locked and I cannot drop it. The form's RecordSource queries tempExtractID against the IDs in Problem_t to return only what we need to key.
I cannot drop the table until the form has fully terminated. A button on the Data Entry form is pressed to commit changes, and only 5 lines of code get to fire before I hit my lock error.
*Xargs refers to the list of field names pulled earlier through DAO. As DAO loops through the Field objects, the physical names are appended to an Xargs string, which is stored in the CONFIG_t table. Basically everything except the AutoNumber is being inserted back.
DoCmd.Close acForm, "frmDataEntry", acSaveNo
Call reInitializeExtract
DoCmd.RunSQL "DELETE FROM tempExtract"
DoCmd.RunSQL "INSERT INTO tempExtract SELECT " _
    & DLookup("Value", "CONFIG_t", "Item = 'Xargs'") & " FROM tempExtractID"
DoCmd.DeleteObject acTable, "tempExtractID"
This is the only code that runs between the time the form is opened (when the table first becomes locked) and the time all subs and functions have completed (when the lock finally releases).
I suggest setting the RecordSource of the form to vbNullString and then deleting the table. This should work, unless you also have combo boxes and so forth bound to this table.
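For example, in the commit routine (using the names from the question, and assuming frmDataEntry is still open when this runs):

Forms("frmDataEntry").RecordSource = vbNullString   ' unbind the form from tempExtractID
DoCmd.Close acForm, "frmDataEntry", acSaveNo
DoCmd.DeleteObject acTable, "tempExtractID"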
Without code it's hard to say, but if you're using DAO, you need to clean up your code objects. That means setting your database objects to Nothing, and closing and setting to Nothing any recordset objects.
Dim db As DAO.Database
Dim rs As DAO.Recordset

' External database: open it explicitly, then close it and release it when done
Set db = DBEngine.OpenDatabase("[path to database]")
Set rs = db.OpenRecordset("[SELECT statement]")
rs.Close
Set rs = Nothing
db.Execute "[DML or DDL statement]", dbFailOnError
db.Close
Set db = Nothing
' CurrentDb: release the variable when done, but do not close it
Set db = CurrentDb
Set rs = db.OpenRecordset("[SELECT statement]")
rs.Close
Set rs = Nothing
Set db = Nothing ' you don't close a db variable initialized with CurrentDb
While VBA is supposed to clean up these objects when they go out of scope, it's not 100% reliable (because VBA uses reference counting to keep track of whether an object can be released, and it doesn't always know when all the references have been cleared).
Objects left open is the most likely source of the locks, so you should make sure you're cleaning up your object variables after you've finished with them.
EDIT after seeing that you're using DoCmd.RunSQL:
Using DoCmd.RunSQL is likely the cause of the problem. It is certainly something that takes away your programmatic management of your connections. If you use DAO instead, you'll have control over the connection, as well as avoiding the real pitfall of DoCmd.RunSQL, which is that it doesn't handle errors. If a DML or DDL statement cannot complete successfully in full, the whole thing should fail. For example, if you're appending 100 records and 10 of them fail for key violations, DoCmd.RunSQL will transparently append the 90 and NOT REPORT THE 10 FAILURES. It's the same with updates and any other DML/DDL statement. DoCmd.RunSQL "helpfully" silently completes as many of the updates as it can, leaving you with no idea that some of it failed to complete.
Granted, in some cases you might want that to happen, e.g., if you're appending records that you know might have PK collisions and don't want to spend the CPU cycles on an outer join that eliminates the duplicates from the set of records you're appending.
But most of the time, that is not the case.
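To make the difference concrete, here is a contrived sketch with hypothetical tables tblTarget and tblStaging, where tblStaging holds rows that would violate tblTarget's primary key:

Dim db As DAO.Database
Set db = CurrentDb

' RunSQL (with warnings suppressed) appends whatever rows it can and
' silently skips the failures -- no error ever reaches your code
DoCmd.SetWarnings False
DoCmd.RunSQL "INSERT INTO tblTarget SELECT * FROM tblStaging"
DoCmd.SetWarnings True

' Execute with dbFailOnError raises a trappable error and rolls the
' whole statement back if any row fails
On Error Resume Next
db.Execute "INSERT INTO tblTarget SELECT * FROM tblStaging", dbFailOnError
If Err.Number <> 0 Then Debug.Print "Append failed: " & Err.Description
On Error GoTo 0

Set db = Nothing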
As I said in my comment above, I use a function that is designed to transparently replace DoCmd.RunSQL and uses a DAO Execute statement and error handling. I have posted it a couple of times on SO (here's one), and here's the version I have in production use in my currently most-active development project:
Public Function SQLRun(strSQL As String, Optional db As Database, _
      Optional lngRecordsAffected As Long) As Long
  On Error GoTo errHandler
  Dim bolCleanup As Boolean

  If db Is Nothing Then
    Set db = CurrentDb
    bolCleanup = True
  End If

  'DBEngine.Workspaces(0).BeginTrans
  db.Execute strSQL, dbFailOnError
  lngRecordsAffected = db.RecordsAffected
  'DBEngine.Workspaces(0).CommitTrans

exitRoutine:
  If bolCleanup Then
    Set db = Nothing
  End If
  SQLRun = lngRecordsAffected
  'Debug.Print strSQL
  Exit Function

errHandler:
  MsgBox "There was an error executing your SQL string: " _
     & vbCrLf & vbCrLf & Err.Number & ": " & Err.Description, _
     vbExclamation, "Error in SQLRun()"
  Debug.Print "SQL Error: " & strSQL
  'DBEngine.Workspaces(0).Rollback
  Resume exitRoutine
End Function
(the transactions are commented out because they were causing problems that I didn't have time to troubleshoot)
You could replace these lines of yours:
DoCmd.RunSQL "DELETE FROM tempExtract"
DoCmd.RunSQL "INSERT INTO tempExtract SELECT (" _
& DLookup("Value", "CONFIG_t", "Item = 'Xargs'" & ") FROM tempExtractID"
...with this:
SQLRun "DELETE FROM tempExtract"
SQLRun "INSERT INTO tempExtract SELECT (" _
& DLookup("Value", "CONFIG_t", "Item = 'Xargs'" & ") FROM tempExtractID"
You could also do this:
Debug.Print SQLRun("DELETE FROM tempExtract") & " records deleted."
Debug.Print SQLRun("INSERT INTO tempExtract SELECT " _
  & DLookup("Value", "CONFIG_t", "Item = 'Xargs'") _
  & " FROM tempExtractID") & " records inserted."
Since the function returns the .RecordsAffected for each Execute, you can print it to the Immediate Window, assign the return value to a variable, or pass an existing variable through to it and work with that variable, thus:
Dim lngRecordsAffected As Long
...
Call SQLRun("DELETE FROM tempExtract", , lngRecordsAffected)
Debug.Print lngRecordsAffected & " records deleted."

Call SQLRun("INSERT INTO tempExtract SELECT " _
  & DLookup("Value", "CONFIG_t", "Item = 'Xargs'") _
  & " FROM tempExtractID", , lngRecordsAffected)
Debug.Print lngRecordsAffected & " records inserted."
The point is that if there are errors on the Execute statement, the whole thing will fail (and pop up an error message -- you might want to change it so that if there's an error it returns -1 or some such instead of popping an MsgBox).
I use this function most often by passing in a pre-cached database variable, so I don't want to clean it up afterwards. If you're using a different database other than CurrentDB(), you really do want to make sure any database variable pointing to your external db is closed and set to Nothing. Without that, locks are maintained on the top-level database objects, and the LDB file remains open and active.
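For example (placeholder path in the same style as the snippet above; since the database variable is passed in, SQLRun leaves it alone and cleanup is your job):

Dim dbExternal As DAO.Database
Set dbExternal = DBEngine.OpenDatabase("[path to database]")

SQLRun "[DML or DDL statement]", dbExternal

dbExternal.Close
Set dbExternal = Nothing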