
Python sqlite index out of range error after table drop/add?

I've got a wrapper I wrote around the sqlite3 module that lets me serialize access from multiple threads and automatically migrate tables when I change their definition. I've noticed that when I drop a table and re-add it with more columns, I get an "index out of range" error. Something like this:

import sqlite3

conn = sqlite3.connect("test.db", detect_types=sqlite3.PARSE_COLNAMES)
curs = conn.cursor()

curs.execute("CREATE TABLE test (derp TEXT);"); conn.commit()
curs.execute("INSERT INTO test (derp) VALUES ('deedle');"); conn.commit()
print curs.execute("SELECT * FROM test;").fetchall()
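# Drop the table and recreate it with an extra column, then insert again.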
curs.execute("DROP TABLE test;"); conn.commit()
curs.execute("CREATE TABLE test (derp TEXT, val REAL);"); conn.commit()
curs.execute("INSERT INTO test (derp) VALUES ('deedle');"); conn.commit()
print curs.execute("SELECT * FROM test;").fetchall()

conn.close()

Will print this:

[(u'deedle',)]
Traceback (most recent call last):
  File "test.py", line 23, in <module>
    print curs.execute("SELECT * FROM test;").fetchall()
IndexError: list index out of range

The error occurs when executing the second SELECT statement. Does anyone know why this is?
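For context, the wrapper is doing something roughly along these lines (a minimal sketch, not my actual code; the class and method names are just illustrative):

import sqlite3
import threading

class ThreadSafeDB(object):
    """Sketch only: serialize all access through one lock, and drop/recreate
    a table when its column list no longer matches the new definition."""

    def __init__(self, path):
        self.lock = threading.Lock()
        self.conn = sqlite3.connect(path, detect_types=sqlite3.PARSE_COLNAMES,
                                    check_same_thread=False)

    def execute(self, sql, params=()):
        # Only one thread at a time touches the shared connection.
        with self.lock:
            curs = self.conn.cursor()
            curs.execute(sql, params)
            self.conn.commit()
            return curs.fetchall()

    def migrate(self, table, columns):
        # columns is a list like ["derp TEXT", "val REAL"].
        with self.lock:
            curs = self.conn.cursor()
            curs.execute("PRAGMA table_info(%s);" % table)
            existing = [row[1] for row in curs.fetchall()]
            wanted = [c.split()[0] for c in columns]
            if existing and existing != wanted:
                curs.execute("DROP TABLE %s;" % table)
            curs.execute("CREATE TABLE IF NOT EXISTS %s (%s);"
                         % (table, ", ".join(columns)))
            self.conn.commit()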


Well, it works just fine for me:

[(u'deedle',)]
[(u'deedle', None)]
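If it really does fail on your machine with exactly that standalone script, one thing I'd check (just a guess, since the snippet doesn't show the wrapper) is whether a cursor or prepared statement is being reused across the DROP/CREATE; re-running the second SELECT on a fresh connection rules that out:

# Guess only: rule out stale cursor/statement state by using a brand new connection
check = sqlite3.connect("test.db", detect_types=sqlite3.PARSE_COLNAMES)
print check.cursor().execute("SELECT * FROM test;").fetchall()
check.close()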
