
INSERT w/subquery INTO another table - overflow error

I've been trying to insert data from one Access table to a linked SQL Server table using the following SQL statement from this question, modified for my purposes - but I keep getting 'Overflow' as an error message:

INSERT INTO dbo_tblGageActivity(strGageID, strCustJobNum, datDateEntered, datTimeEntered)
SELECT [Gage ID] as GageID, [Customer Job#] as JobNum, [Date] as DateEntered, 
[Time Entered] as TimeEntered FROM TblInstTrak;

I've tried this a number of ways, all resulting in the 'overflow' error. I must be missing something, but for the life of me, I don't know what. Is >100,000 records just too much for one Insert subquery to handle?

-- Edited 01/25/2011 @ 1540 hours --

The data types and sizes of the fields are as follows:

TblInstTrak          Type:Size     Required     dbo_tblGageActivity          Type:Size      Required
----------------------------------------------------------------------------------------------------
Gage ID              Text:50       True         strGageID                    Text:50        True
Customer Job#        Text:50       False        strCustJobNum                Text:50        True
Date                 Date/Time:8   False        datDateEntered               Date/Time:8    True
Time Entered         Date/Time:8   False        datTimeEntered               Date/Time:8    True
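One thing worth ruling out from the table above: three of the destination columns are Required, while their source columns are not, so any NULLs in the source would also make the insert fail (though that normally reports a different error than 'Overflow'). A quick Access query can count them:

SELECT Count(*) AS NullRows
FROM TblInstTrak
WHERE [Customer Job#] Is Null
   OR [Date] Is Null
   OR [Time Entered] Is Null;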

-- Edited 01/26/2011 @ 1355 hours --

Alright - I've been banging my head on this for a few days (before I came to SO with it, and the day it's been here), and the ONLY differences I can come up with are that the Date Entered field on the local app side is formatted as 'm/d/yyyy' and the Time Entered field on the local app side is formatted as 'Long Time' - neither of these fields in the SQL Server table has any format set.

For performance reasons, I am starting to wonder if there would be a better way to pull all the current data now to get the data into the table - then only pull updates from the local app into the mirror SQL Server copy. No data is entered into the SQL Server side copy, it's only used for performance for my test application - the only data that changes is on the local application side. Would this approach be easier than wiping the SQL Server database table and repopulating it every time I want to update it?

-- Edited 01/28/2011 @ 1236 hours --

After finding some dates from around the Roman Empire days (thanks David Fenton for pointing me onto this) I modified my import SQL statement to the following:

INSERT INTO dbo_tblGageActivity(strGageID, strCustJobNum, datDateEntered, datTimeEntered)
SELECT [Gage ID] as GageID, [Customer Job#] as JobNum, [Date] as DateEntered, 
[Time Entered] as TimeEntered FROM TblInstTrak WHERE [Date] >= #1/1/1900#;

I got a couple of really great answers, and I can only choose one (mental anguish), so please don't get pissed - this really REALLY helped me learn.


The overflow occurs because one or more of the values in the selected rows does not fit into the table you're inserting into. In other words, TblInstTrak contains values that do not fit into their respective columns in dbo_tblGageActivity. Compare the definitions of the two tables and make sure the destination table's columns are wide enough to accommodate the data from your source table.
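To compare widths quickly, you can check the longest values in the source text columns against the destination sizes (a sketch in Access SQL, using the field names from the question):

SELECT Max(Len([Gage ID])) AS MaxGageID,
       Max(Len([Customer Job#])) AS MaxJobNum
FROM TblInstTrak;

If either maximum exceeds the corresponding destination column size (both are Text:50 here), that column is your culprit.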


Do you have date values in Access that are outside the range of the SQL Server date/time field? That's a very, very common issue I encounter when upsizing old Access databases to SQL Server -- lots of finger errors with dates that were actually during the late Roman Empire!
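A quick way to spot those is to query for anything earlier than SQL Server's datetime floor (1/1/1753 for the classic datetime type - a sketch, adjust the cutoff to taste):

SELECT [Gage ID], [Date], [Time Entered]
FROM TblInstTrak
WHERE [Date] < #1/1/1753#
   OR [Time Entered] < #1/1/1753#;

Any rows this returns would overflow the destination date/time fields and should be corrected or excluded before the insert.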


If you have linked it using the OLEDB provider, there is a property for the maximum insert commit size. See more details here: http://www.bidn.com/blogs/ShawnHarrison/ssis/803/maximum-insert-commit-size

