Is Automata Theory dead? [closed]
I loved the course I took in Automata Theory and Formal Languages, so naturally I started looking around the interwebs to learn what happened since the time the books on which the course was based were written.
What I discovered was that the list of stuff I wasn't familiar with seemed to be very short. For example, from the list of automatons in the Wikipedia entry for the subject, half were covered by the course, and the other half were mostly related to the one language not covered by the course.
Further, when researching the applications of the theory, I got mostly the same results: programming language syntax, compilers, text search, and... that's about it.
So is it really that dead? Or does it continue to evolve? Are there new applications for the theory?
Automata are really useful. I completed my degree in software engineering and computer science nearly 20 years ago. One of the first courses was Models of Machines, which covered FSAs and ventured a bit into Turing machines, computability, the halting problem, etc.
Everyone thought the course was either boring, irrelevant, too difficult or pointless. The circles and arcs made little sense to anyone, and what's the point of a tape with just ones on it? What's wrong with a hard disk? At the end of the course, the lecturer gave out a questionnaire: how useful do you think this course will be in one month, in one year, in ten years? Back then, I answered "not useful" for all of them. Now my answers would show increasing usefulness over time, ending with "very useful".
I've used automata lots in my day job, and they are the right tool for certain classes of problems, with little else to compete with them. I've used them for compressing multi-million-word lists plus category data (OK, quite banal), and I've also implemented an extension where the symbols are complex objects and the state transitions are predicates. This allowed a complex set of rules to be compiled to a deterministic FST, with all rules evaluated simultaneously and deterministically with no redundant computation.
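To give a flavour of that last idea, here is a minimal sketch of a transducer whose transitions are guarded by predicates over structured symbols. It's my own illustration, not the poster's implementation; the rule set, field names, and first-match-wins policy are all assumptions.

```python
# Sketch only: a finite-state transducer whose transitions are keyed on
# predicates over complex symbol objects instead of literal characters.
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class Transition:
    guard: Callable[[Any], bool]    # predicate over the input symbol
    target: str                     # next state
    emit: Callable[[Any], Any]      # output function (the FST emission)

def run_fst(rules, start, accepting, symbols) -> Optional[list]:
    state, out = start, []
    for sym in symbols:
        for t in rules.get(state, []):
            if t.guard(sym):        # deterministic: first matching guard wins
                out.append(t.emit(sym))
                state = t.target
                break
        else:
            return None             # no transition applies: reject
    return out if state in accepting else None

# Hypothetical rules: tag each token object by a property, not by its text.
rules = {"q0": [
    Transition(lambda s: s["pos"] == "noun", "q0", lambda s: (s["word"], "N")),
    Transition(lambda s: s["pos"] != "noun", "q0", lambda s: (s["word"], "X")),
]}
print(run_fst(rules, "q0", {"q0"},
              [{"word": "cat", "pos": "noun"}, {"word": "runs", "pos": "verb"}]))
# -> [('cat', 'N'), ('runs', 'X')]
```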
My vote is for still relevant!
Automata and formal languages are the foundation of regular expressions, parsers, compilers, virtual machines, etc., which improve regularly.
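As a concrete reminder of that first link, here is a toy example of mine: the kind of DFA a regex engine builds. The two-state machine below accepts exactly the language of the regular expression (0|1)*0.

```python
# DFA for the regular expression (0|1)*0: binary strings ending in 0.
DFA = {  # (state, symbol) -> next state
    ("a", "0"): "b", ("a", "1"): "a",
    ("b", "0"): "b", ("b", "1"): "a",
}

def accepts(s: str) -> bool:
    state = "a"                  # start state
    for ch in s:
        state = DFA[(state, ch)]
    return state == "b"          # "b" is the only accepting state

assert accepts("0110")           # ends in 0: accepted
assert not accepts("01")         # ends in 1: rejected
```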
Automata and formal languages are also required in the domain of theorem proving for program verification, which aims to prove that a program or a protocol actually does what it claims to do. This domain is critical (voting machine software, banking transactions, safety systems in vehicles, etc.) and still under active development.
So no, automata theory and formal languages are not dead!
It's not dead, more 'put out to stud': it's a simple formalism that is used as the basis for others rather than being a particularly active research topic itself.
Henry Thompson's work on XML schemata uses and extends automata theory.
Many embedded software projects make heavy use of finite state machines, which are related to automata, and some of the techniques to work with them draw on or extend automata theory.
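The table-driven pattern those projects tend to use looks roughly like this. This is an illustrative sketch of mine, a hypothetical button debouncer whose states, events, and actions are all made up.

```python
# Table-driven finite state machine, the shape common in embedded firmware.
TRANSITIONS = {
    # (state, event): (next_state, action)
    ("idle",     "press"):   ("debounce", "start_timer"),
    ("debounce", "timeout"): ("pressed",  "emit_click"),
    ("debounce", "release"): ("idle",     "stop_timer"),
    ("pressed",  "release"): ("idle",     None),
}

def step(state: str, event: str) -> str:
    next_state, action = TRANSITIONS.get((state, event), (state, None))
    if action:
        print("action:", action)   # real firmware would invoke a handler here
    return next_state

state = "idle"
for event in ["press", "timeout", "release"]:
    state = step(state, event)
print(state)  # -> idle
```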
Pi-calculus extends automata theory with the concept of bisimulation and adds capabilities for analysing concurrent processes. It's the closest bit of recent research to the automata theory I learnt at university.
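To show what bisimulation means operationally, here is a naive greatest-fixpoint check for strong bisimilarity on finite labelled transition systems. This is my own sketch; real pi-calculus tools use far more efficient partition-refinement algorithms.

```python
# Naive strong-bisimilarity check: start from the full relation and delete
# pairs until every remaining pair can match each other's moves.
def bisimilar(states, trans, p, q) -> bool:
    succ = {}
    for s, label, t in trans:
        succ.setdefault(s, []).append((label, t))

    def matches(a, b, rel):
        # every move of a is matched by an equally labelled move of b
        return all(any(lb == la and (ta, tb) in rel for lb, tb in succ.get(b, []))
                   for la, ta in succ.get(a, []))

    rel = {(a, b) for a in states for b in states}
    changed = True
    while changed:
        changed = False
        for pair in list(rel):
            a, b = pair
            if not (matches(a, b, rel) and matches(b, a, rel)):
                rel.discard(pair)
                changed = True
    return (p, q) in rel

S = {"p0", "p1", "q0", "q1", "q2"}
T = {("p0", "a", "p1"), ("q0", "a", "q1"), ("q0", "a", "q2")}
print(bisimilar(S, T, "p0", "q0"))  # True: both do one 'a' and stop
```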
I think as new areas of computing, such as quantum computing and hypercomputation, open up, there will be new applications, requirements, and theoretical breadth for automata theory and things like evolutionary automata and computation, cellular automata, and whatnot.
I don't think it is dead, just a bit cold for the time being.
I think Automata Theory is involved in a lot of areas without people realising it. For example, I can see its application in cryptography and cryptanalysis.
A lot of process control stuff is heavily based on the theory, especially in terms of proving the robustness of control systems.
Take a look at workflow processes and see how automata theory could be put to use there in formalizing the concepts and patterns described: Workflow Patterns
One of the newer techniques I came across a few years ago is called Parsing Expression Grammars, aka PEGs, commonly implemented via packrat parsing. Bryan Ford has done some work on it which can be viewed at http://pdos.csail.mit.edu/~baford/packrat/.
A PEG parser is basically a top-down recursive descent parser with backtracking, and one of its advantages is that lexical analysis and parsing can be performed in one step (scannerless parsing), with no separate lexer needed.
PEGs compare with CFGs in that PEGs are recognition-based and thus more suited to parsing a language, whereas CFGs are generative and more suited to describing how the strings of a language are produced; a PEG's ordered choice also makes it unambiguous by construction.
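Here's a toy sketch of the idea (my own, not code from Ford's papers): a PEG for sums of digits, parsed by recursive descent, with an lru_cache standing in for the packrat memo table and ordered choice in the Expr rule.

```python
# PEG:  Expr <- Digit '+' Expr / Digit      Digit <- [0-9]
from functools import lru_cache
from typing import Optional

def parse(text: str) -> bool:
    @lru_cache(maxsize=None)                 # the packrat memo table
    def expr(pos: int) -> Optional[int]:
        p = digit(pos)
        if p is not None and p < len(text) and text[p] == "+":
            q = expr(p + 1)
            if q is not None:                # first alternative succeeded
                return q
        return p                             # ordered choice: fall back

    def digit(pos: int) -> Optional[int]:
        return pos + 1 if pos < len(text) and text[pos].isdigit() else None

    return expr(0) == len(text)              # must consume the whole input

print(parse("1+2+3"))  # True
print(parse("1+"))     # False: '1' parses, but '+' is left over
```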
Rather than looking at the theory as dead, consider instead that it has become so practical that we've moved beyond the theory into application. An excellent book that bridges theory and application is Miro Samek's "Practical Statecharts in C/C++". There is now a second edition available, which I've not read, but I found nothing lacking in the first edition; to this day I find it one of the most valuable texts I have ever read.