SQL FILESTREAM and ColdFusion: reading very large files into a variable
To insert a file into a FILESTREAM column with ColdFusion, I first have to read the file into a binary variable and then insert it into SQL Server. When I do this with a large file (1 GB), the system runs out of memory.
Has anyone been able to do this successfully?
Thanks!
While you could probably accomplish this by increasing the amount of RAM allocated to CF in the ColdFusion Administrator, I'd suggest that CF is not the right tool for this job, and there are at least two ways to get around it.
Without knowing anything about your system, I'd offer this generic advice: don't store files (especially of that size) in the DB. Instead, store a reference to them in the DB and keep the files themselves on the filesystem. If they are being accessed from the web, store them on the web server.
If you really do need to store them in the DB, I would move the file to the SQL Server machine and do the insert there, rather than using CF to do it. You could, for example, use CF to FTP the file to the SQL server, then call a stored procedure (or similar) with the file location to do the insert, as in the sketch below.
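A minimal sketch of that server-side insert, assuming FILESTREAM is enabled, the database has a FILESTREAM filegroup, and the file has already been copied to a path the SQL Server service account can read. The table and procedure names are illustrative only; OPENROWSET(BULK ...) resolves the path on the server itself, so the 1 GB payload never passes through ColdFusion:

```sql
-- Hypothetical FILESTREAM table; names are illustrative.
CREATE TABLE dbo.Documents (
    Id   UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    Name NVARCHAR(260) NOT NULL,
    Data VARBINARY(MAX) FILESTREAM NULL
);
GO

-- Insert a file that already sits on the SQL Server machine.
-- OPENROWSET(BULK ...) requires a literal path, hence the dynamic SQL.
CREATE PROCEDURE dbo.InsertDocumentFromFile
    @Name NVARCHAR(260),
    @Path NVARCHAR(260)
AS
BEGIN
    DECLARE @Sql NVARCHAR(MAX) = N'
        INSERT INTO dbo.Documents (Name, Data)
        SELECT @Name, BulkColumn
        FROM OPENROWSET(BULK ''' + REPLACE(@Path, N'''', N'''''') + N''', SINGLE_BLOB) AS f;';
    EXEC sp_executesql @Sql, N'@Name NVARCHAR(260)', @Name = @Name;
END
```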
You do not read very large files into variables, in CF or any other environment. You use streams. I have an article explaining how to do this in C#, Download and Upload images from SQL Server via ASP.Net MVC, and a follow-up that uses the FILESTREAM storage option, FILESTREAM MVC: Download and Upload images from SQL Server. You can apply the same techniques to CF.
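The core of the streaming approach, independent of the database part, is to move the bytes through a small fixed-size buffer instead of one huge variable. A minimal sketch in Java (which ColdFusion can call directly, since it runs on the JVM); the file paths are illustrative:

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    public static void main(String[] args) throws Exception {
        // Copy a file of any size while holding at most 8 KB in memory.
        try (InputStream in = new FileInputStream("huge-upload.bin");
             OutputStream out = new FileOutputStream("huge-copy.bin")) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```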
If you are on an older version of ColdFusion (pre-9) or a non-64-bit version, you may have trouble allocating more than 1.5 GB of RAM to the ColdFusion server anyway. You may need to write a class in Java or C# that handles the file stream to the database and include it as a Java or .NET object.
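A sketch of what such a Java class might look like using JDBC, assuming a hypothetical Documents table with a VARBINARY(MAX) FILESTREAM column and an illustrative connection URL. setBinaryStream lets the driver feed the file to SQL Server in chunks rather than materializing it in the JVM heap:

```java
import java.io.File;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FilestreamInserter {
    // Hypothetical JDBC URL and table; adjust to your environment.
    private static final String URL =
        "jdbc:sqlserver://localhost;databaseName=Docs;integratedSecurity=true";

    public static void insert(String name, String path) throws Exception {
        File file = new File(path);
        try (Connection conn = DriverManager.getConnection(URL);
             FileInputStream in = new FileInputStream(file);
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO dbo.Documents (Name, Data) VALUES (?, ?)")) {
            ps.setString(1, name);
            // The driver streams from the InputStream, so the file is never
            // loaded into memory as a single variable.
            ps.setBinaryStream(2, in, file.length());
            ps.executeUpdate();
        }
    }
}
```

Once a class like this is on the ColdFusion classpath, it could be invoked from CFML with createObject("java", "FilestreamInserter").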
I thought that Adobe had addressed this issue in ColdFusion 9, but it has been a while since I have dealt with files that large.