Inserting primary key values using SubSonic and SQLite - a problem because SubSonic assumes a column is either always AutoIncrement or never
I'm using SubSonic 2.2 and SQLite and have encountered a problem when dealing with tables that have an INTEGER PRIMARY KEY column which isn't AUTOINCREMENT. According to the SQLite FAQ:
If you declare a column of a table to be INTEGER PRIMARY KEY, then whenever you insert a NULL into that column of the table, the NULL is automatically converted into an integer which is one greater than the largest value of that column over all other rows in the table, or 1 if the table is empty.
So SQLite treats these columns as auto-incremented only sometimes (i.e. only when a NULL value is supplied). The problem is that SubSonic treats them as always auto-incremented.
In my application the ID values are generated by a remote database, so I don't want SQLite to auto-generate them. That should be no problem: I'll simply provide the values when I create records in this table. However, when I use SubSonic's sonic.exe to generate my DAL, the primary key column is set to AutoIncrement = true. This seems to mean I can't set the ID column at all: SubSonic's ActiveHelper.GetInsertCommand() ignores it, since it thinks the value is auto-generated.
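To make the behaviour the FAQ describes concrete, here's a minimal sketch using System.Data.SQLite directly (the table name and connection string are just examples, not from my schema):
using System;
using System.Data.SQLite;

class RowIdDemo
{
    static void Exec(SQLiteConnection conn, string sql)
    {
        using(var cmd = new SQLiteCommand(sql, conn))
            cmd.ExecuteNonQuery();
    }

    static void Main()
    {
        using(var conn = new SQLiteConnection("Data Source=:memory:"))
        {
            conn.Open();
            Exec(conn, "CREATE TABLE Items (Id INTEGER PRIMARY KEY, Name TEXT)");

            // Explicit key: SQLite stores exactly the value provided.
            Exec(conn, "INSERT INTO Items (Id, Name) VALUES (500, 'remote id')");

            // NULL key: SQLite picks max(Id) + 1, i.e. 501 here.
            Exec(conn, "INSERT INTO Items (Id, Name) VALUES (NULL, 'auto id')");
        }
    }
}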
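The symptom looks roughly like this (the class and property names below are made up for illustration; only Save() is real SubSonic API):
// Hypothetical generated ActiveRecord class for a table whose
// INTEGER PRIMARY KEY is not AUTOINCREMENT:
var item = new Item();
item.Id = 500;            // key generated by the remote database
item.Name = "remote id";
item.Save();              // GetInsertCommand() skips Id because AutoIncrement == true,
                          // so SQLite assigns its own value instead of 500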
The line that determines whether a column is auto-increment is in SubSonic.SQLiteDataProvider.GetTableSchema():
column.AutoIncrement = Convert.ToBoolean(row["PRIMARY_KEY"]) && GetDbType(row["DATA_TYPE"].ToString()) == DbType.Int64;
I guess the solution is either to
Not use INTEGER PRIMARY KEY columns for keys that are generated elsewhere, or
Modify the templates so these sorts of columns are not set to AutoIncrement = true. This would mean SubSonic won't ever treat them as auto-increment, so I'd need to be careful not to expect auto-generated values later. Unfortunately I don't think the templates can easily determine whether the column really is AUTOINCREMENT, so I might have to resort to some ugly hard-coding instead...
Any other thoughts or suggestions?
Unfortunately it looks like our SQLiteDataProvider assumes that if a column is an Int64 PK, then it's auto-increment. I'm looking over the source right now (I didn't write this provider) and I can see that the schema is being loaded with Connection.GetSchema, which uses the built-in System.Data.Common.DbConnection support to get the schema for the tables.
This is suboptimal most of the time because it returns limited information; in this case it's not telling us whether the column is AUTOINCREMENT or not. There's probably a better way to ask SQLite for the table's metadata, but it wasn't used, unfortunately.
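For what it's worth, one way to get at that information is to read the table's CREATE statement out of sqlite_master and look for the AUTOINCREMENT keyword. A rough sketch (not something the current provider does):
// Sketch only: detect AUTOINCREMENT by inspecting the table's DDL in sqlite_master.
private static bool TableUsesAutoIncrement(SQLiteConnection conn, string tableName)
{
    const string sql = "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = @name";
    using(var cmd = new SQLiteCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@name", tableName);
        var ddl = cmd.ExecuteScalar() as string;
        // AUTOINCREMENT can only legally appear on the INTEGER PRIMARY KEY column,
        // so its presence in the DDL is a usable (if crude) signal.
        return ddl != null && ddl.IndexOf("AUTOINCREMENT", StringComparison.OrdinalIgnoreCase) >= 0;
    }
}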
Short answer: define a new PK, if you can, and use the other key as a reference.
As I mentioned before, I checked in a revised SQLiteDataProvider in December. Check that at line 407 of SQLiteDataProvider.cs you have:
// Autoincrement detection now available in recent System.Data.SQLite 1.0.60.0 -- paul
column.AutoIncrement = Convert.ToBoolean(row["AUTOINCREMENT"]);
There are also several other improvements and bug fixes in the surrounding lines. The new code was never added to the main project distribution on GitHub, I guess; I don't follow the project too closely. SQLite has been a wonderful provider aside from the file-level locking. I have a home-grown version of System.Data.SQLite that uses the new foreign key features of SQLite, and the official version should be done this month.
Here is the revised version: SQLiteDataProvider.cs
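If you want to verify what the newer System.Data.SQLite reports before wiring it into the provider, a quick dump along these lines (the table name is just an example) shows the flag the revised line above reads:
// Sketch: list the PRIMARY_KEY / AUTOINCREMENT flags that System.Data.SQLite
// 1.0.60.0+ exposes through GetSchema("Columns").
// Requires using System.Data and System.Data.SQLite.
using(var conn = new SQLiteConnection("Data Source=app.db"))
{
    conn.Open();
    DataTable cols = conn.GetSchema("Columns");
    foreach(DataRow row in cols.Rows)
    {
        if((string)row["TABLE_NAME"] != "Products")   // example table
            continue;
        Console.WriteLine("{0}: PK={1}, AUTOINCREMENT={2}",
            row["COLUMN_NAME"], row["PRIMARY_KEY"], row["AUTOINCREMENT"]);
    }
}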
BTW, check out this project in case you need to convert from sql server:
Convert SQL Server DB to SQLite DB http://www.codeproject.com/KB/database/convsqlservertosqlite.aspx
I find I can't use a CreateConnection written the way SqlDataProvider's is because of the file locking. The CreateConnection in SQLiteDataProvider as it stands is wrong, since it ignores new connection strings.
The System.Data.SQLite doc says "You May create multiple threads, and those threads can create their own SQLiteConnection and subsequent objects for accessing a database. Multiple connections on multiple threads to the same database file are perfectly acceptable and will behave predictably."
So what I have tried is the following, and it's really kludgy: a dictionary of connections keyed by thread id and connection string. All of the unit tests pass, including most of the transaction tests (those need better coverage). I wrote a couple more transaction tests with critical-section locks, and I think it might be thread safe; it just needs more realistic tests.
private Dictionary<string, SQLiteConnection> threadConnectionTable = new Dictionary<string, SQLiteConnection>();

public override DbConnection CreateConnection(string newConnectionString)
{
    SQLiteConnection conn;

    // One connection per (thread, connection string) pair.
    string connKey = "t" + Thread.CurrentThread.ManagedThreadId + "__" + newConnectionString;

    if(threadConnectionTable.ContainsKey(connKey))
    {
        conn = threadConnectionTable[connKey];
        if(conn.State != ConnectionState.Open)
            conn.Open();
        return conn;
    }

    conn = new SQLiteConnection(newConnectionString);
    conn.Open();
    threadConnectionTable[connKey] = conn;
    return conn;
}
private Object thisLock = new Object();

[Test]
[ThreadedRepeat(10)]
public void MultiThreadRepeat()
{
    lock(thisLock)
    {
        var qcc = new QueryCommandCollection();
        int threadId = Thread.CurrentThread.ManagedThreadId;
        Debug.WriteLine("MultiThreadRepeat: thread id = " + threadId);

        // Build a batch of updates and run them in a single transaction.
        int count = 0;
        for(int n = 0; n < 10; n++)
        {
            Query qry1 = new Query(Product.Schema);
            qry1.QueryType = QueryType.Update;
            qry1.AddWhere(Product.Columns.ProductID, n);
            qry1.AddUpdateSetting("ProductName", threadId + ": unit test ");
            QueryCommand cmd = qry1.BuildUpdateCommand();
            qcc.Add(cmd);
            count++;
        }
        DataService.ExecuteTransaction(qcc);

        // Verify (case-insensitively) that the update from this thread stuck.
        var p1 = new Product(1);
        Assert.IsTrue(string.Equals(p1.ProductName, threadId + ": unit test ",
            StringComparison.InvariantCultureIgnoreCase));
    }
}