ASP.NET and performance
Q:
In my previous work with ASP.NET, all I thought about was how to get the required task done, one way or another. I didn't care about performance, but now that I have learned how to work with the platform to some extent, I think I am supposed to focus on performance issues.
When I was learning the concepts and the syntax, my instructor told me that performance issues are not so important; the priority is always to finish the required task on time, and the great advances in hardware and network infrastructure will cover whatever performance gap you leave in your code.
Now I want to know some tips (do's and don'ts) concerning performance in ASP.NET and the web in general. I would be grateful for an example illustrating each idea.
For example, I had been told that:
int count = dataTable.Rows.Count;
for (int i = 0; i < count; i++)
{
    // Do something
}
is more performant than:
for (int i = 0; i < dataTable.Rows.Count; i++)
{
    // Do something
}
Simple: don't optimise prematurely. And in this case, that is premature. There is a subtle difference between hoisting the length versus querying it each time, but in any sane circumstance, (a) we're talking nanoseconds, if that, and (b) the answer changes depending on whether it is a plain array, a List<T>, an IList<T>, etc. Don't learn a general rule.
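A hedged illustration of why there is no general rule (a sketch, not a benchmark; the collections here are made up):

using System;
using System.Collections.Generic;

int[] numbers = new int[1000];

// For a plain array, comparing against numbers.Length typically lets the JIT
// elide per-access bounds checks, so hoisting the length is not an automatic win.
for (int i = 0; i < numbers.Length; i++)
{
    numbers[i] = i;
}

IList<int> list = new List<int>(numbers);

// For an interface-typed collection, Count is an interface call each iteration;
// hoisting it into a local saves a few nanoseconds at most.
int count = list.Count;
for (int i = 0; i < count; i++)
{
    Console.WriteLine(list[i]);
}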
But the bigger problem here: that specific example is absolutely irrelevant to overall performance. You are talking about a DataTable, presumably one that is being populated from a database, which is out-of-process and probably on a different machine. You are comparing nanoseconds (possibly less) against network latency (typically around 0.3ms on a local LAN), plus query time, plus bandwidth (which depends on the query).
You can't change the speed of light (latency), but you can write an efficient data query that accesses only the required data, using appropriate indexing and possibly denormalization. Oh, and the N+1 query problem - that's a biggie.
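A hedged sketch of N+1 (the table and column names are hypothetical): the first loop pays the round-trip latency once per customer, while the second fetches the same data in a single query.

using System.Collections.Generic;
using System.Data.SqlClient;

static void LoadOrders(SqlConnection connection, IEnumerable<int> customerIds)
{
    // N+1: one round trip per customer - latency dominates almost immediately.
    foreach (int customerId in customerIds)
    {
        using (var cmd = new SqlCommand(
            "SELECT OrderId, Total FROM Orders WHERE CustomerId = @id", connection))
        {
            cmd.Parameters.AddWithValue("@id", customerId);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read()) { /* ... read this customer's orders ... */ }
            }
        }
    }

    // Better: one query, one round trip, only the columns you actually need.
    using (var cmd = new SqlCommand(
        "SELECT o.CustomerId, o.OrderId, o.Total FROM Orders o " +
        "JOIN Customers c ON c.CustomerId = o.CustomerId", connection))
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read()) { /* ... read all orders at once ... */ }
    }
}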
Likewise, even in memory - most bottlenecks tend to be due to things like inappropriate looping (when a hash-based lookup would be better), or not using appropriate caching to remove the need to query the same data over and over again.
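A minimal sketch of the hash-lookup point (the Customer and Order types are invented for illustration): scanning a list inside a loop is O(n * m), while building a Dictionary once makes each lookup roughly O(1).

using System.Collections.Generic;
using System.Linq;

class Customer { public int Id; public string Name; }
class Order { public int CustomerId; public decimal Total; }

static void MatchOrdersToCustomers(List<Customer> customers, List<Order> orders)
{
    // Slow: for every order, scan the whole customer list (O(n * m)).
    foreach (var order in orders)
    {
        var customer = customers.FirstOrDefault(c => c.Id == order.CustomerId);
        // ... use customer
    }

    // Faster: build a hash-based index once; each lookup is then roughly O(1).
    var customersById = customers.ToDictionary(c => c.Id);
    foreach (var order in orders)
    {
        Customer customer;
        if (customersById.TryGetValue(order.CustomerId, out customer))
        {
            // ... use customer
        }
    }
}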
On "the web in general": caching, compression (transport and content - for example js/css minification), cookie-free domains for static content (maybe a CDN), farms of servers, fat pipes, and proper CPUs...
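On the caching point, a minimal sketch of setting client/proxy cache headers from a Web Forms code-behind (the page name and the 10-minute duration are arbitrary assumptions; server-side output caching via the OutputCache page directive is the usual companion to this):

using System;
using System.Web;
using System.Web.UI;

public partial class ProductList : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Allow browsers and proxies to cache this response for 10 minutes.
        Response.Cache.SetCacheability(HttpCacheability.Public);
        Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(10));
        Response.Cache.SetMaxAge(TimeSpan.FromMinutes(10));
    }
}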
The first version can be more performant, since you don't re-evaluate dataTable.Rows.Count in every iteration of the loop. If a call to dataTable.Rows.Count is expensive (and it potentially can be), the first version will indeed be better.
As for general tips about performance:
- Always decide what performance means for you (latency? throughput? something else?)
- Always figure out how you measure it.
- Figure out when your code is performant enough (when are you done).
- Profile your code to find the worst performance points (a rough measurement sketch follows this list).
- Fix those points and re-profile.
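A real profiler shows the whole call tree, but as a minimal sketch of timing one suspect code path (BindCustomerGrid is a hypothetical method), System.Diagnostics.Stopwatch is enough for rough numbers:

using System;
using System.Diagnostics;

Stopwatch sw = Stopwatch.StartNew();
BindCustomerGrid();   // hypothetical suspect code path
sw.Stop();
Console.WriteLine("BindCustomerGrid: {0} ms", sw.ElapsedMilliseconds);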