Best way to deal with potentially very large form creating/posting using jquery $.download
I have a process as follows:
User does a complex search (done via AJAX) that returns a bunch of ids (could be 1, could be 10000)
Once they have their users, they can select a few things and then download a file (a report based on the ids and the things they select)
To accomplish this, I use a highly modified version of $.download
seen here:
jQuery.download = function (url, data, method, loadingHolderDivId) {
    if (url && typeof data == 'object') {
        // For this version, data needs to be a JSON object.
        $('#' + loadingHolderDivId).html($('#LoadingScreen').html());
        var theForm = $('<form></form>')
            .attr('action', url)
            .attr('method', method)
            .attr('id', 'jqueryDownloadForm')
            .attr('target', 'iframeX');
        // Loop through the data object, emitting one hidden input per value.
        $.each(data, function (propertyName, propertyVal) {
            if (propertyVal != null) {
                if (typeof propertyVal == 'object') {
                    // Handle arrays: repeat the same name so the model
                    // binder fills an IList on the server.
                    for (var i = 0, len = propertyVal.length; i < len; ++i) {
                        theForm.append($('<input />')
                            .attr('type', 'hidden')
                            .attr('id', propertyName + i.toString()) // was i.toString (missing parens)
                            .attr('name', propertyName)
                            .val(propertyVal[i]));
                    }
                }
                else {
                    theForm.append($('<input />')
                        .attr('type', 'hidden')
                        .attr('id', propertyName)
                        .attr('name', propertyName)
                        .val(propertyVal));
                }
            }
        });
        var iframeX;
        var downloadInterval;
        // Remove the old iframe if one exists.
        $('#iframeX').remove();
        // Create a new hidden iframe to receive the download response.
        iframeX = $('<iframe src="javascript:false;" name="iframeX" id="iframeX"></iframe>')
            .appendTo('body')
            .hide();
        if ($.browser.msie) {
            downloadInterval = setInterval(function () {
                // While loading, readyState is "loading"; afterwards it is "interactive".
                if (iframeX && iframeX[0].readyState !== "loading") {
                    $('#' + loadingHolderDivId).empty();
                    clearInterval(downloadInterval);
                }
            }, 23);
        }
        else {
            iframeX.load(function () {
                $('#' + loadingHolderDivId).empty();
            });
        }
        theForm.appendTo('body').trigger('submit').remove();
        return false;
    }
    // else: required params missing; do nothing.
};
Basically what it does is parse what's in data and build a form out of it. This works great when there aren't a lot of ids, but when there are 8000 it takes 5 or 10 seconds in IE. No surprise really; it's well known that IE is slow at DOM manipulation.
The other issue is that, in IE, the $('#' + loadingHolderDivId).html($('#LoadingScreen').html());
won't actually render until after it's done building the form. I'm guessing this is because the browser never gets a chance to repaint: before it can finish, it's already too busy building the form.
The reason I am building out the form this way is so that the default model binder will be happy and bind my form right into a lovely model. The list of ids is bound to an IList(Of Integer).
Here is a sample of what the controller action looks like:
Function ExportUsers(ByVal model As ExportUsersPostModel) As ActionResult
and here's an example of what the model looks like:
<Serializable()> _
Public Class ExportUsersPostModel
    Public Property FilterUserIds As IList(Of Integer) = New List(Of Integer)
    Public Property FilterColumnIds As IList(Of Integer) = New List(Of Integer)
    Public Property ShowThis As Boolean
    Public Property OtherStuff As String = String.Empty
    Public Property FormatId As Integer
End Class
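For reference, the wire format the default model binder expects for those IList(Of Integer) properties is simply the same field name repeated. A tiny illustrative sketch (the helper name and sample values are made up, not from the post):

```javascript
// Hypothetical illustration of the posted body the default MVC model
// binder expects for IList(Of Integer): the same field name repeated
// once per value.
function toFormBody(data) {
    var pairs = [];
    for (var name in data) {
        // [].concat() treats scalars and arrays uniformly.
        [].concat(data[name]).forEach(function (v) {
            pairs.push(encodeURIComponent(name) + '=' + encodeURIComponent(v));
        });
    }
    return pairs.join('&');
}

// toFormBody({ FilterUserIds: [1, 2], FormatId: 4 })
//   -> "FilterUserIds=1&FilterUserIds=2&FormatId=4"
```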
So the actual question is twofold:
How do I make my "loading" message show up before it begins the horribly slow form building of death?
How can I speed up the form building, or build the form in a way that won't be slow, but that will still keep the model binder happy?
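One sketch of how both problems could be attacked (an assumption-laden illustration, not the original code): a setTimeout(0) yields to the browser so the loading message can paint before the heavy work starts, and concatenating the inputs into one HTML string avoids thousands of individual DOM appends. buildHiddenInputs and downloadDeferred are hypothetical names:

```javascript
// Hypothetical helper: build all hidden inputs as ONE HTML string so the
// browser parses the form in a single operation instead of appending
// 8000 nodes one at a time.
function buildHiddenInputs(data) {
    var html = [];
    for (var name in data) {
        var val = data[name];
        if (val == null) { continue; }            // skip missing params
        var values = (typeof val === 'object') ? val : [val];
        for (var i = 0; i < values.length; i++) {
            // Same name repeated -> binds to IList(Of Integer) server-side.
            html.push('<input type="hidden" name="' + name + '" value="' + values[i] + '"/>');
        }
    }
    return html.join('');
}

// Show the loading message first, then yield with setTimeout(0) so the
// browser can repaint before the heavy form build begins (needs jQuery
// and the iframeX target from the original $.download in the page).
function downloadDeferred(url, data, method, loadingHolderDivId) {
    $('#' + loadingHolderDivId).html($('#LoadingScreen').html());
    setTimeout(function () {
        $('<form/>', { action: url, method: method, target: 'iframeX' })
            .html(buildHiddenInputs(data))
            .appendTo('body')
            .trigger('submit')
            .remove();
    }, 0);
}
```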
If you're able to pass the model across as JSON, you can create a custom ModelBinder to handle mapping the JSON to your data structure. I did that recently for an object type that could not be mapped automatically. Json.Net provides a class called JObject which takes a JSON string and maps it to a dynamic C# object. You can then map the dynamic object to your strongly typed object.
To create a custom ModelBinder, simply create a class that implements IModelBinder and implement the BindModel method. Here is a copy of my implementation. Yours will obviously vary slightly:
internal class FilterBinder : IModelBinder
{
    public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
    {
        if (controllerContext == null)
            throw new ArgumentNullException("controllerContext");
        if (bindingContext == null)
            throw new ArgumentNullException("bindingContext");

        var form = controllerContext.HttpContext.Request.Form;
        var query = controllerContext.HttpContext.Request.QueryString;
        bool hasForm = form.Count > 1 ||
            (form.Count == 1 && !string.IsNullOrWhiteSpace(form.AllKeys[0]));
        bool hasQuery = query.Count > 1 ||
            (query.Count == 1 && !string.IsNullOrWhiteSpace(query.AllKeys[0]));

        if (hasForm || hasQuery)
        {
            ValueProviderResult val = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
            string value = val == null || string.IsNullOrEmpty(val.AttemptedValue)
                ? string.Empty
                : val.AttemptedValue;
            if (string.IsNullOrEmpty(value)) return null;

            dynamic obj = JObject.Parse(value);
            return new FilterSet(obj);
        }
        return null;
    }
}
I have a bunch of checks to make sure what I'm getting is valid, which you may or may not need. Then, after getting the JObject, I pass it along to my constructor which does the mapping.
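Client-side, pairing with a JSON-based binder like the one above means posting the whole model as one string in a single field, so the form stays a single input no matter how many ids there are. A minimal sketch (the field name 'model' must match whatever bindingContext.ModelName resolves to on your server; serializeModel and postJsonModel are hypothetical names):

```javascript
// Hypothetical: flatten the whole model into the single name/value pair
// a JSON-parsing custom binder can JObject.Parse on the server. Building
// one input is constant-time DOM work regardless of the id count.
function serializeModel(model) {
    return { name: 'model', value: JSON.stringify(model) };
}

// Usage with the existing hidden-iframe trick (requires jQuery and the
// iframeX target already set up in the page):
function postJsonModel(url, model) {
    var field = serializeModel(model);
    $('<form/>', { action: url, method: 'POST', target: 'iframeX' })
        .append($('<input/>', { type: 'hidden', name: field.name, value: field.value }))
        .appendTo('body')
        .trigger('submit')
        .remove();
}
```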
Ok, maybe it's just me, but iframe + trying to ajax a file download + 10+ seconds per query (possibly) = kludge. Maybe it's my years as a UI engineer focusing on what the client sees, but I've got a real problem making somebody wait that long on one of my applications. There has to be a better way, and I think it can be improved on the UI end of things.
In a nutshell, you need to do a big search, drill down to specifics, then dump data. So, user-pattern-wise, I'd start by looking at options to achieve each task.
The search is easy....send a pattern to the server, get a result back. Nothing magic here, though Ajax would probably be a nice touch.
Now, the "drill down" of data. Since you're potentially dealing with a LOT of data here, you need a way to allow the user to quickly and easily get through the mountain of "stuff" in an organized and efficient manner. To me, this is screaming for a grid. My preference is DataTables. What this buys you is efficiency, organization, paging, and most of all the ability for the user to easily interact with the data to pick out "stuff." Your ajax query from step 1 would populate DataTables via ajax "pipelining." That is, it would grab what the user wants to see -- say 25 results -- and grab ahead (and possibly behind) a bit to speed up the interface. All the data would be available; it would just be grabbed a section at a time, which speeds up each query. The user could sort, filter, order, and limit the data in the interface, making it easier to highlight and select data.
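A rough sketch of what a server-side ("pipelined") DataTables 1.x configuration looks like; the endpoint and column names here are placeholders, not from the post:

```javascript
// Sketch of a server-side DataTables 1.x configuration. The server
// receives paging/sorting parameters and returns one 25-row slice per
// request, so the client never holds all 10000 rows at once.
var dtOptions = {
    bServerSide: true,                     // ask the server for one page at a time
    sAjaxSource: '/Users/SearchResults',   // hypothetical endpoint
    iDisplayLength: 25,                    // the visible "slice"
    aoColumns: [                           // placeholder column bindings
        { mData: 'UserId' },
        { mData: 'UserName' },
        { mData: 'Email' }
    ]
};

// In the page: $('#results').dataTable(dtOptions);
```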
Now, onto step 3, a downloadable report. To make it REALLY easy, DataTables has a plugin called "TableTools" that will automatically spit out Excel, PDF, Text, and Printable versions of the data you're offering the user. It's a few lines of code and a couple of images... takes about 10 minutes to configure. Voila, done. Yeah, it's --that-- simple.
As for being able to handle that many records, yeah it does. I've got an app with 2.5 million records being handled in this manner via a standard MySQL database (not clustered). Because you're only grabbing a "slice" of data at a time, it never has to do any exotic, gigantic queries aside from a record count. The user can page through results with just a tiny, occasional delay. It really is a thing of beauty.