Software architecture, parallel processing, and asynchronous patterns [duplicate]
Possible Duplicate:
Any good resources on design patterns for parallel architectures?
I'm finding more and more of my code becoming asynchronous and parallel. I work on web applications, and have to deal with web services (both client > server, and server > server). I'm looking for good resources on software architecture and design patterns for asynchronous and parallel programming.
To rein things in a little, let's focus on patterns and architectures for parallel programming achieved using asynchronous techniques, particularly in languages with built-in support for callbacks.
IMHO the only/best book on this topic (software architecture, parallel processing, and asynchronous patterns) is "Concurrent Programming on Windows" by Joe Duffy.
Yes, it has "Windows" in the title, but don't let that fool you. It's 958 pages of serious treatment of the couple of decades' worth of "sticky notes" the author accumulated in real-world settings: important concepts in concurrent programming that you won't find elsewhere (systems, hardware, virtual machines, languages, etc.). (The "Why I wrote this book" section is quite interesting, IMHO.)
There's a tremendous amount of information there, no matter your language and platform. The first half of the book simply gets you to think about the problem. When the Windoze stuff shows up, it's still helpful for understanding what "reusable" architecture examples might look like, with the good and the bad, and the kinds of sacrifices made. There are a few things that are Windoze-specific (e.g., thread "fibers"), but even those serve to show (1) what the problem was (e.g., expensive thread context switches), and (2) a possible solution (e.g., separate the data stack from the thread so the data-stack switch is fast). I don't use fibers, nor much (if any) of the Windoze-specific "reusable libraries", but those trade-offs are really important for you to understand (such as when to use the "generic" thread pool on any given platform, and when to write your own).
It talks a lot about common design options, like the (typical) work-stealing task queue, where "tasks" are created and added to a queue, and each of the threads "steals/eats" items from the FIFO queue as it becomes available. In short, the book talks about how to think about the problem differently, as opposed to historical OO or imperative approaches.
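That task-queue shape is not hard to sketch. Here is a minimal illustration in Python (names like `run_pool` are mine, not from the book): a fixed set of worker threads pulls callables from a shared FIFO queue until it is drained, so whichever thread is free "eats" the next item.

```python
import queue
import threading

def run_pool(tasks, num_workers=4):
    """Run callables from a shared FIFO queue across a pool of worker threads."""
    work = queue.Queue()
    results = []
    results_lock = threading.Lock()

    for task in tasks:
        work.put(task)

    def worker():
        while True:
            try:
                # Whichever thread is free grabs ("steals/eats") the next item.
                task = work.get_nowait()
            except queue.Empty:
                return  # queue drained, this worker is done
            result = task()
            with results_lock:
                results.append(result)

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

A true work-stealing scheduler (per-thread deques, stealing from the tail of a busy neighbor) adds more machinery; the shared-queue version above is the simplest form of the same idea.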
For example, after reading the book a couple of times (and gleaning some other information elsewhere), we rewrote all our stuff: we NEVER sleep() our threads anymore. They are always working at 100%, or swapped out (as in a "thread pool" that wakes up when there is new work to be done). Even for concurrency, there are novel ways to "think about the problem" that make you rewrite your (very well-written) English term paper so that it is even better.
As you mention, callbacks are one approach. I typically implement those with "functor-like" objects that handle issues like cross-thread signaling or cross-thread marshaling of data. The signal/slot paradigm works well for that too, and some implementations, like Qt's, handle cross-thread signalling very nicely. However, this topic is so huge that I hesitate to dig too deeply into any given design approach or language in this type of discussion...
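To make the cross-thread-marshaling point concrete, here is a toy signal/slot sketch in Python (the `Signal` and `drain` names are mine, and this only loosely mirrors Qt's queued connections): `emit()` from a worker thread does not invoke the slot directly; it queues the call so the receiving thread runs it on its own side.

```python
import queue
import threading

class Signal:
    """Toy signal/slot with cross-thread marshaling: emitting a signal
    queues the slot invocation instead of calling the slot directly,
    so slots always run on the thread that drains the dispatch queue."""

    def __init__(self, dispatch_queue):
        self._slots = []
        self._queue = dispatch_queue

    def connect(self, slot):
        self._slots.append(slot)

    def emit(self, *args):
        for slot in self._slots:
            # Marshal the call; do NOT invoke on the emitting thread.
            self._queue.put((slot, args))

def drain(dispatch_queue):
    """Run all pending slot invocations on the current (receiver) thread."""
    while True:
        try:
            slot, args = dispatch_queue.get_nowait()
        except queue.Empty:
            return
        slot(*args)
```

In a GUI app the event loop plays the role of `drain()`; the point is simply that the callback's data crosses threads through the queue, so the slot never touches receiver-side state from the wrong thread.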
But, like you, I had to start somewhere. This book (which I read this Spring) helped me really "formalize" my concurrency understanding, even though I've got a number of years of exposure to hardware/software concurrency on a number of systems/platforms with a bunch of different design approaches.
Let me know if you find another book like this one ... I'd like to read that one too (one really needs to "invert" one's brain to flow naturally with design options in the asynchronous/concurrent/parallel world), and most of what I've found on this topic is narrowly tied to a given language or technology.