Hacker News

I have a corollary to Greenspun's Tenth Law, which is that:

  Lisp programmers see everything as an ad hoc,
  informally-specified, bug-ridden, slow implementation of half of Lisp,
  and don't see other benefits it might have.
  That is, Greenspun's Tenth Law is true - for Lisp programmers.
I came to this conclusion because of a tragic pair of research papers, which had a fantastic usability idea. The second half of the first paper took the focus off the usability, and developed it into a very simple functional language. In their next paper, they dropped the fantastic usability idea completely, and made it into a lisp. :-(

Some XML standards fell into a similar trap, by wanting languages that process XML to be themselves written in XML - such as XSLT. It's a nice abstract concept to be able to process yourself... but at the price of abominations like having to write "i &lt; 10" when you mean "i < 10".
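To make the complaint concrete: because an XSLT program is itself an XML document, a comparison like "i < 10" isn't legal as written inside it and must be entity-escaped first. A quick illustration using Python's stdlib escaping helpers (nothing XSLT-specific, just the same escaping rule):

```python
from xml.sax.saxutils import escape, unescape

# The condition you mean to write in a test attribute...
condition = "i < 10"

# ...and what you actually have to type, because a bare '<' is
# illegal character data inside an XML document.
escaped = escape(condition)
print(escaped)  # i &lt; 10

# The processor unescapes it again before ever evaluating it.
assert unescape(escaped) == condition
```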

Adam Bosworth pointed out that XML's XPath resisted this - by making XPath itself an embedded non-XML mini-language. Imagine an XML representation of path components - now that would be verbose!
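For contrast, here is a query written as a compact XPath string, next to a sketch of what an XML encoding of the same path would look like. This uses Python's `xml.etree.ElementTree`, which supports a subset of XPath; the sample document and the hypothetical `<path>` encoding are invented for illustration:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<library><book genre='lisp'><title>SICP</title></book>"
    "<book genre='xml'><title>XSLT</title></book></library>"
)

# XPath: one string, an embedded non-XML mini-language.
titles = [t.text for t in doc.findall("./book[@genre='lisp']/title")]
print(titles)  # ['SICP']

# A hypothetical XML representation of the same path would need an
# element per step and per predicate -- far more verbose, e.g.:
#   <path>
#     <step name="book"><predicate attr="genre" value="lisp"/></step>
#     <step name="title"/>
#   </path>
```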

In a politically expedient move, I'd like to point out that pg didn't fall into this trap: the DSL he made for Viaweb users to customize their stores was not Lisp (though it mapped easily onto a subset of Lisp, if I understand correctly). It's a non-Lisp mini-language.

Great link from the article, about "principle of least power", for mini-languages: http://www.w3.org/DesignIssues/Principles.html#PLP Constraints are very empowering, because you know what to expect.

Regarding XML: I'd always thought it was just one of many possible syntaxes for representing hierarchical data; and it really didn't matter which syntax you used. As in a lingua franca (or any standard), provided that it is barely adequate, the key thing is that everyone agrees on it. XML became the Chosen, de facto standard, because everyone was already familiar with HTML, propelled by the mass adoption of the web. So the question becomes: why did we get HTML (based on SGML), instead of S-expressions? The article gives reasons, but I guess the short of it is that if a group of people work towards a specific purpose for years, and are successful at it (as SGML was), it is probably a good base to start from if you want to do something similar, i.e. describe documents.

Also, more directly, if I imagine a large webpage described with S-expressions, I think HTML is a bit clearer.

Nitpick: The article omits that quotes (or apostrophes) must be escaped in XML attributes.
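Python's stdlib shows the attribute-quoting rule in action: `xml.sax.saxutils.quoteattr` picks a quote character based on what the value contains, and falls back to entity-escaping when the value contains both kinds (a general XML illustration, not tied to any tool mentioned here):

```python
from xml.sax.saxutils import quoteattr

# A value containing double quotes gets wrapped in single quotes...
print(quoteattr('say "hi"'))        # 'say "hi"'

# ...and a value containing both kinds must have one kind escaped.
print(quoteattr('it\'s "quoted"'))  # "it's &quot;quoted&quot;"
```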

Very telling points about LaTeX - that, like XML/HTML, it also uses named end-tags; and that Lisp documentation itself is written in LaTeX instead of S-expressions - drinking the Kool-Aid, but not eating their own dog food.



> Very telling points about LaTeX - that like XML/HTML

No, it's a demonstration of ignorance. LaTeX wasn't written by Lispers; it is merely used by them. The fact that they find its design decisions acceptable must be weighed against the cost of the alternatives. That doesn't imply that they wouldn't have been happier with a more lispish syntax.

At the time that those decisions were made, LaTeX was pretty much the best alternative. The fact that Lispers, like almost everyone else in related communities, made that decision merely says that Lispers don't cut off their noses to spite their faces.


The article suggests Lispers could have used sexp as a front-end to LaTeX, in the same way that XML was used as a front-end to LaTeX. Very easy to do.

> If S-expressions were easier to edit, it would be most logical to edit the document in S-expressions and then write a small Scheme program to convert S-expressions into a formatting language like LaTeX. This is, what XML and SGML people have done for decades [...]
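The article's suggestion really is a small program. Here is a minimal sketch of such a converter, in Python rather than Scheme; the tiny markup vocabulary (`section`, `emph`) is invented for illustration, and a real front-end would cover LaTeX's actual command set:

```python
# Convert a nested-list "s-expression" document into LaTeX markup.
# Strings are plain text; a list's first element names the tag.
def sexp_to_latex(node):
    if isinstance(node, str):
        return node
    tag, *children = node
    body = " ".join(sexp_to_latex(c) for c in children)
    return "\\%s{%s}" % (tag, body)

doc = ["section", "Why", ["emph", "s-expressions"], "matter"]
print(sexp_to_latex(doc))
# \section{Why \emph{s-expressions} matter}
```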


> The article suggests Lispers could have used sexp as a front-end to LaTeX

(1) As another comment points out, they have, when doing so provided benefits. (2) Lispers tend to be multi-lingual; they'll use other languages when appropriate. If XMLers can only work in XML....

>This is, what XML and SGML people have done for decades

Decades? Two decades ago was 1988. The first draft of XML is from roughly 1998, one decade ago. GML, a predecessor to SGML, didn't become public until 1973, but the "multiple use" stuff was still in the future.

SGML rode the WWW wave, but that didn't happen for technical reasons.


Note that such front-ends are inherently leaky. If all they do is transform syntax, they're probably a bad idea.

Note that the point of using s-expressions as a front-end would be programmatic generation, not by-human editing. (Neither s-expressions nor XML is actually all that friendly for editing text.)

Do XML folks really write front-ends for ease of editing?


S-Expressions are OK to work with as a human if you have a decent editor that helps you indent and balance parens.


> The article suggests Lispers could have used sexp as a front-end to LaTeX

How does he know they didn't?



