Semantics Engineering with PLT Redex:
This text is the first comprehensive presentation of reduction semantics in one volume; it also introduces the first reliable and easy-to-use tool set for such forms of semantics. Software engineers have long known that automatic tool support is critical for rapid prototyping and modeling, and this book is addressed to the working semantics engineer (graduate student or professional language designer). The book comes with a prototyping tool suite to develop, explore, test, debug, and publish semantic models of programming languages. With PLT Redex, semanticists can formulate models as grammars and reduction models on their computers with the ease of paper and pencil.
The text first presents a framework for the formulation of language models, focusing on equational calculi and abstract machines, then introduces PLT Redex, a suite of software tools for expressing these models directly in Redex. Finally, experts describe a range of models formulated in Redex.
PLT Redex comes with the PLT Scheme implementation, available free at http://www.plt-scheme.org/. Readers can download the software and experiment with Redex as they work their way through the book.
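The core idea the blurb describes, modeling a language as a grammar plus a set of reduction rules and then stepping terms mechanically, can be sketched outside Redex too. Redex itself is a Racket DSL; the following toy Python analogue (my own illustration, not anything from the book) models a tiny arithmetic language and traces its reductions the way Redex's tracing tool would:

```python
# Grammar of a toy language: e ::= n | (add e e)
# Terms are plain ints or ("add", e1, e2) tuples.

def step(e):
    """Perform one reduction step, or return None if e is a value."""
    if isinstance(e, int):
        return None                  # values don't reduce
    op, l, r = e
    if isinstance(l, int) and isinstance(r, int):
        return l + r                 # rule: (add n1 n2) -> n1 + n2
    if not isinstance(l, int):
        return (op, step(l), r)      # reduce inside the left subterm
    return (op, l, step(r))          # reduce inside the right subterm

def trace(e):
    """Return the full reduction sequence of a term."""
    seq = [e]
    while (nxt := step(e)) is not None:
        seq.append(nxt)
        e = nxt
    return seq

print(trace(("add", ("add", 1, 2), 4)))
# each step rewrites the leftmost redex until a value remains
```

Redex does all of this declaratively (`define-language`, `reduction-relation`) and adds visualization and random testing on top; the point of the sketch is just the shape of a reduction-semantics model.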
For me this is a long, long, long goal :).
Coyotos and Genode are two operating systems I had never heard of; they were mentioned in an Ikarus Users mailing list thread that doesn’t seem to have been mirrored online.
Coyotos is a secure, microkernel-based operating system that builds on the ideas and experiences of the EROS project.
We understand the complexity of code and policy as the most fundamental security problem shared by modern general-purpose operating systems. Because of high functional demands and dynamic workloads, however, this complexity cannot be avoided. But it can be organized. Genode is a novel OS architecture that is able to master complexity by applying a strict organizational structure to all software components including device drivers, system services, and applications. The Genode OS framework is the effort to advance the Genode OS architecture as a community-driven open-source project.
This looks like it might be a good article.
Todd asked “Why [did Church choose] lambda and not some other Greek letter?”. Here are three answers:
The story is that in the 1910s and 1920s, mathematicians and logicians used ^ as a notation for set abstraction, as in ^i : i is prime. Church used ^` (i.e., a primed version of this symbol) for function abstraction, because functions are just sets with extra properties. The first typesetter/secretary read it as λ, and Church was fine with it. True or not? I don’t know, but it’s fun.
This paper (link provided by Dave Herman here):
(By the way, why did Church choose the notation “λ”? In [Church, 1964, §2] he stated clearly that it came from the notation “x̂” used for class-abstraction by Whitehead and Russell, by first modifying “x̂” to “ˆx” to distinguish function abstraction from class-abstraction, and then changing “ˆ” to “λ” for ease of printing. This origin was also reported in [Rosser, 1984, p. 338]. On the other hand, in his later years Church told two enquirers that the choice was more accidental: a symbol was needed and “λ” just happened to be chosen.)
And this paper (link also provided by Dave Herman here):
We end this introduction by telling what seems to be the story of how the letter ‘λ’ was chosen to denote function abstraction. In Principia Mathematica the notation for the function f with f(x) = 2x + 1 is 2x̂ + 1. Church originally intended to use the notation x̂.2x + 1. The typesetter could not position the hat on top of the x and placed it in front of it, resulting in ˆx.2x + 1. Then another typesetter changed it into λx.2x + 1.
Here is one person’s take on the genius of Turing’s machine.
I forget how I reached this page.
I asked “What is a good book on automata theory?” because I don’t recall much of it from college. Marco replied here: Elements of the Theory of Computation by Lewis and Papadimitriou.
Do you know of any more?
The 1979 edition of “Introduction to Automata Theory, Languages, and Computation”, by Hopcroft and Ullman. The algorithms are pretty imperative, though.
Addendum: 20/02/09 at 2:25CST
I very much appreciate Formal Languages and Their Relation to Automata by John E. Hopcroft and Jeffrey D. Ullman. I first read it 40 years ago, but I still consult it now and then.
SCIgen is a program that generates random Computer Science research papers, including graphs, figures, and citations. It uses a hand-written context-free grammar to form all elements of the papers. Our aim here is to maximize amusement, rather than coherence.
One useful purpose for such a program is to auto-generate submissions to conferences that you suspect might have very low submission standards.
There are seemingly a number of fun things that you could do with this!
Mike expounds on it here. It is a good article.
Mike’s blog is located here.
In 1997, on his retirement from Bell Labs, Doug McIlroy gave a fascinating talk about the “History of Computing at Bell Labs.”
Here is one man’s transcription of that talk.
I only skimmed it (at the moment I’m not interested in the past); but nonetheless it does look like a great talk with a lot of fun and interesting bits in there.