
How to load a generic List without a loop

OK, this is what I have right now, which is working quite well except it's a bit slow:

Public Function GetList() As List(Of SalesOrder)
    Try
        Dim list As New List(Of SalesOrder)
        Dim ds As DataSet

        ds = cls.GetSalesOrderList() 'cls is the data access class

        For i = 0 To ds.Tables(0).Rows.Count - 1
            Dim row As DataRow = ds.Tables(0).Rows(i)
            Dim kk As New SalesOrder()

            kk.ID = Val(row.Item("id") & "")
            kk.SalesOrderNo = row.Item("salesorderid") & ""
            kk.SalesOrderDate = row.Item("OrderDate") & ""
            kk.CustomerId = Val(row.Item("customerid") & "")

            list.Add(kk)
        Next

        Return list
    Catch ex As Exception
        Throw 'rethrow without resetting the stack trace
    End Try
End Function

Now, once I start retrieving more than 10,000 records from the table, the loop takes a long time to load the values into the generic class. Is there any way I can get rid of the loop? Can I do something like the following with the generic class?

txtSearch.AutoCompleteCustomSource.AddRange(Array.ConvertAll(Of DataRow, String)(BusinessLogic.ToDataTable.ConvertTo(WorkOrderList).Select(), Function(row As DataRow) row("TradeContactName")))


I would have thought the problem isn't the loop itself but the volume of data. Your loop processes each piece of data only once, so there isn't any massive inefficiency (such as looping over the dataset once and then again for each row, or that kind of thing). Any method you choose is, at the end of the day, going to have to loop through all your data.

Other methods might be slightly more efficient than yours, but not by much, I'd think. I'd look at whether you can refactor to reduce your data set (e.g. limit it to a certain period or similar), or whether you can do whatever searching or aggregating you intend for that list in the database instead of in code. E.g. if you're just going to sum the values of that list, then you can almost certainly do better with a stored procedure that does the summing in the database rather than in code.
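To illustrate pushing the work into the database: if the end goal were just a total, a single aggregate query returns one value instead of 10,000+ rows. This is only a sketch; the connection string, table name (SalesOrders), and column name (OrderTotal) are hypothetical, not from the original code.

```vbnet
Imports System.Data.SqlClient

Public Function GetOrderTotal(connectionString As String) As Decimal
    Using conn As New SqlConnection(connectionString)
        'The database does the summing, so only a single value
        'crosses the wire instead of every row.
        Using cmd As New SqlCommand("SELECT SUM(OrderTotal) FROM SalesOrders", conn)
            conn.Open()
            Return Convert.ToDecimal(cmd.ExecuteScalar())
        End Using
    End Using
End Function
```

The same idea applies to filtering and searching: a WHERE clause in the query (or a stored procedure) shrinks the data set before it ever reaches the loop.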

I know this hasn't directly answered your question, but that's mainly because I don't know of a more efficient method. I took the question as asking about optimisation in general rather than this specific technique. :)


Converting the loop into some kind of LINQ construct isn't necessarily going to improve performance if you're still enumerating over every row at once. You could return IEnumerable(Of SalesOrder) if you don't need to give the consumer the ability to add/remove from the list (which it looks like might be the case), and then in that case you could create an enumerator to handle this. That way, the dataset is loaded all at once, but the items are only converted into objects when they're being enumerated over, which may be part of your performance hit.

Something like this:

Return ds.Tables(0).AsEnumerable().Select(Function(dr As DataRow) New SalesOrder With {...})

My VB with LINQ is a little rusty, but something to that effect, where the ... is the code to initialise the SalesOrder properties (note that AsEnumerable needs a reference to System.Data.DataSetExtensions). That will only create a new SalesOrder object as the IEnumerable(Of SalesOrder) is being enumerated over (lazy, if you will).
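An equivalent way to get the same lazy behaviour, without LINQ, is an Iterator function (available from VB 11 / Visual Studio 2012 onward). This is a sketch built from the mapping code in the question; each SalesOrder is only constructed when the caller enumerates:

```vbnet
'Deferred version of GetList: the DataSet is loaded up front,
'but rows are converted to SalesOrder objects one at a time,
'only as the consumer iterates.
Public Iterator Function GetOrders(ds As DataSet) As IEnumerable(Of SalesOrder)
    For Each row As DataRow In ds.Tables(0).Rows
        Dim order As New SalesOrder()
        order.ID = Val(row.Item("id") & "")
        order.SalesOrderNo = row.Item("salesorderid") & ""
        order.SalesOrderDate = row.Item("OrderDate") & ""
        order.CustomerId = Val(row.Item("customerid") & "")
        Yield order
    Next
End Function
```

The total work is the same, but it is spread across the enumeration instead of happening in one up-front burst, which can make the initial load feel much faster.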


Hey Paul, do you mean something like the code below?

Dim list As New List(Of SalesOrder)

Dim kk As SalesOrder = New SalesOrder()

Function DrToOrder(dr As DataReader)
    kk.ID = Val(dr.Item("id") & "")
    kk.SalesOrderNo = dr.Item("salesorderid") & ""
    list.Add(kk)
End Function

Function LoadData()
    datareader.Rows.Select(DrToOrder)
End Function

Is that the sort of thing you're talking about?

