" /> Ezra's Research: September 2007 Archives


September 20, 2007

'raises' predicate (Haskell)

A Haskell predicate that tests whether an expression raises an error (for my future reference):

import Control.Exception (SomeException (..), catch, evaluate)

-- Force the argument to weak head normal form and report whether
-- doing so raised an exception.
raises :: a -> IO Bool
raises expr =
   catch (evaluate expr >> return False)
         (\(SomeException _) -> return True)
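For example, in GHCi, with the definition above loaded:

*Main> raises (1 `div` 0 :: Int)
True
*Main> raises (head [] :: Int)
True
*Main> raises (42 :: Int)
False

One caveat: evaluate forces only weak head normal form, so something like raises (undefined : []) comes back False--the error is buried deeper than raises looks.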

September 18, 2007

Mixed feelings

"Poplar's users and potential users had mixed feelings about the syntax. Even aspects we consider successful were not universally appreciated. No one was ever sure what the precedence rules were or should be."
—James H. Morris, Eric Schmidt, Philip Wadler. "Experience with an Applicative String Processing Language," POPL 1980.
Har!

Is too valid

Grrr:

[image]

And yet:

[image]

Is too valid. Check the spec, y'all. Cf. RFC 821's grammar for <mailbox>. In particular:

     <c> ::= any one of the 128 ASCII characters, but not any
               <special> or <SP>
     <special> ::= "<" | ">" | "(" | ")" | "[" | "]" | "\" | "."
              | "," | ";" | ":" | "@"  """ | the control
              characters (ASCII codes 0 through 31 inclusive and
              127)
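(Transcribed into a quick Haskell predicate, in case you want to check candidate characters mechanically--my own throwaway code, not anything from the spec:)

import Data.Char (ord)

-- RFC 821's <special> class, transcribed from the grammar above.
special :: Char -> Bool
special ch = ch `elem` "<>()[]\\.,;:@\""
          || ord ch <= 31       -- control characters
          || ord ch == 127      -- DEL

-- A valid <c>: any ASCII character that is neither <special> nor <SP>.
validC :: Char -> Bool
validC ch = ord ch < 128 && not (special ch) && ch /= ' '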

September 13, 2007

Logarithmic Slowdown of Functional Languages?

The Wikipedia page on "Functional Programming" has some assertions about efficiency that sound questionable. I'm not the only one who thinks so--someone else tagged the claim with "citation needed" and started a discussion.

It currently says, "Any imperative algorithm is expressible in these languages with a logarithmic slowdown in the worst case." David Hopwood notes in the discussion that "Imperative algorithms are expressible with no asymptotic slowdown," using some clever/complicated technique involving mutable arrays (citing a caml-list posting by one Tom Breuel).

All of this sounds a bit strange to me--as far as I know, there's no conclusive reason to think that functional languages must execute a particular algorithm more slowly than an imperative language. (Where does this "logarithmic slowdown" come from, anyway?) But I don't have a mastery of all the relevant literature. Can someone else jump in and straighten this out?
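If I had to guess, the logarithmic factor refers to the standard trick of simulating mutable storage in a pure language with a balanced-tree map, which turns each constant-time read or write into an O(log n) one. A sketch of the simulation in Haskell (my own illustration, not anything from the Wikipedia discussion):

import qualified Data.Map as Map

-- A purely functional stand-in for a mutable integer-indexed array:
-- writes return a new map, and each read or write costs O(log n)
-- instead of O(1).
type ArraySim a = Map.Map Int a

newArraySim :: Int -> a -> ArraySim a
newArraySim n x = Map.fromList [ (i, x) | i <- [0 .. n - 1] ]

readSim :: ArraySim a -> Int -> a
readSim arr i = arr Map.! i

writeSim :: ArraySim a -> Int -> a -> ArraySim a
writeSim arr i x = Map.insert i x arr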

September 12, 2007

Foundational Papers in Programming Languages

Here's a rather good paper of the sort that I wish someone had shown me when I first started as a grad student: Conception, evolution, and application of functional programming languages, by Paul Hudak (1989). It gives a good history of how functional languages arose (from Lisp, ISWIM, KRC, and so on) and coalesced (into ML and Haskell and the like), and describes in some detail the salient features of (and issues within) functional programming, including pattern-matching, algebraic datatypes, memoization, nondeterminism, and of course, type-inference and polymorphism. It also includes a short bit on formal semantics, which might give a neophyte the flavor of such a formalism without getting too heavy. The paper is fairly long, but it covers a lot and is written in the old style of clear, readable prose with a minimum of specialized notation.

I found the paper from a link on Wikipedia.

Other papers that fall into this class ("classic papers I wish I'd seen at the beginning") include "The Next 700 Programming Languages" and "The Mechanical Evaluation of Expressions,"[1] both by Peter J. Landin. (I'll post more as I think of them.)

Three papers that I gladly did read at the beginning are "Definitional Interpreters for Higher-Order Programming Languages" (1972--the original typewritten version is better than the one typeset in TeX that you find sometimes), "The discoveries of continuations" (1993), and "Gedanken" (1970), all by John C. Reynolds. The last introduces (?) the nice idea of an escape construct (now often called let/cc) and "references" as first-class values (or is that older?).
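(For a flavor of the escape idea in modern dress, here's a tiny sketch using callCC from Haskell's Control.Monad.Cont--my own illustration, not Reynolds's notation:)

import Control.Monad.Cont (callCC, runCont)

-- Multiply a list of numbers, escaping with 0 the moment a zero
-- is seen instead of finishing the traversal.
productEscape :: [Int] -> Int
productEscape xs = runCont (callCC go) id
   where
   go escape = loop xs
      where
      loop []       = return 1
      loop (0 : _)  = escape 0             -- jump straight out
      loop (n : ns) = fmap (n *) (loop ns)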

[1] Sadly no longer available online.

September 11, 2007

Elements

I quite like this:
One of the important perceptions of category theory is that an arrow x : T → A in a category can be regarded as an element of A defined over T. The idea is that x is a variable element of A, meaning about the same thing as the word “quantity” in such sentences as, “The quantity x² is nonnegative”, found in older calculus books.
—Toposes, Triples and Theories, Michael Barr and Charles Wells, Version 1.1, 2002
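(To spell the analogy out a bit--my gloss, not Barr and Wells's: an ordinary element a ∈ A is the special case of an arrow 1 → A from the terminal object, and composing a variable element x : T → A with an arrow f : A → B gives f ∘ x, a variable element of B over the same T--exactly like applying a function to a variable.)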

September 10, 2007

Abstraction and Variation

Fellow researchers may be interested in this fanciful article on the techniques, advantages, and supposed pitfalls of abstraction in software engineering.