Combinators Eat
Near the end of Wolfram’s weird and wonderful book, Combinators: A Centennial View, is a section called "One More Thing" (page 291 in my physical copy). In it, Wolfram writes:
These days, the most common way to parse an English sentence like "I am trying to track down a piece of history" is a hierarchical tree structure — analogous to the way a context-free computer language would be parsed: [tree diagram of parts of "I am trying to track down a piece of history" as leaves on the tree]. But there is an alternative — and, as it turns out, significantly older — approach: to use a so-called dependency grammar in which verbs act like functions, "depending" on a collection of arguments: [much flatter diagram with "I am trying to track down a piece of history" on a line with arrows pointing to each dependency].
And the next paragraph has an evocative description of how this works:
…in a natural language like English, everything is just given in sequence, and a function somehow has to have a way to figure out what to grab. And the idea is that this process might work like how combinators written out in sequence "grab" certain elements to act on.
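To make that "grabbing" concrete, here’s a minimal sketch of my own (not Wolfram’s, and the representation and helper names step and reduce_seq are just illustrative assumptions): combinator expressions as flat sequences, where S and K reach forward and grab the arguments that follow them.

```python
# My sketch (not from the book): combinator terms as flat, left-associated
# sequences. S and K "grab" the arguments immediately following them.

def step(terms):
    """Apply one reduction at the head of the sequence, if possible."""
    if terms and terms[0] == "K" and len(terms) >= 3:
        # K x y -> x : K grabs two arguments and keeps the first.
        x, rest = terms[1], terms[3:]
        return [x] + rest
    if terms and terms[0] == "S" and len(terms) >= 4:
        # S x y z -> x z (y z) : S grabs three arguments and redistributes them.
        x, y, z, rest = terms[1], terms[2], terms[3], terms[4:]
        return [x, z, [y, z]] + rest
    return None  # no reduction applies at the head

def reduce_seq(terms):
    """Keep reducing until the head no longer matches S or K."""
    while (nxt := step(terms)) is not None:
        terms = nxt
    return terms

# S K K a reduces to a -- S K K is the famous identity combinator:
print(reduce_seq(["S", "K", "K", "a"]))  # ['a']  (via K a (K a) -> a)
```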
That description also reminds me a lot of the way a tacit language like Forth grabs what it needs from the stack without ever giving explicit names to any of it.
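In that same spirit, here’s a toy Forth-flavored evaluator, again a hedged sketch of my own rather than anything from the book (the run helper and word set are invented for illustration): each word pops its operands off the stack and pushes results, and no operand is ever named.

```python
# A toy Forth-like evaluator (my sketch): each word silently "grabs"
# its operands from the stack, so nothing is ever given a name.

def run(program, words):
    """Evaluate a whitespace-separated program against a dictionary of words."""
    stack = []
    for token in program.split():
        if token in words:
            words[token](stack)       # the word pulls what it needs
        else:
            stack.append(int(token))  # anything else is a numeric literal
    return stack

def dup(stack): stack.append(stack[-1])
def add(stack): stack.append(stack.pop() + stack.pop())
def mul(stack): stack.append(stack.pop() * stack.pop())

words = {"dup": dup, "+": add, "*": mul}

# "3 dup * 4 +" squares 3 and then adds 4, never naming an operand:
print(run("3 dup * 4 +", words))  # [13]
```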
(Oh, how nice: it looks like Wolfram has posted this whole section of the book as an article here: https://writings.stephenwolfram.com/2021/03/a-little-closer-to-finding-what-became-of-moses-schonfinkel-inventor-of-combinators/)
Anyway, I like the idea of functions (or perhaps just subroutines) pulling what they need from some pool of current state. (I think this fits pretty well with the concept of "concatenative" programming as I understand it; see the sketch below.)
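One concrete way to read "concatenative", as a rough sketch under my own assumptions (the lit and concat helpers are hypothetical names, not anyone’s API): if every word is a function from stack to stack, then concatenating words is literally composing those functions.

```python
# Sketch of the concatenative idea: each word maps stack -> stack,
# and a program is just the composition of its words.

from functools import reduce

def lit(n):     return lambda stack: stack + [n]
def dup(stack): return stack + [stack[-1]]
def mul(stack): return stack[:-2] + [stack[-2] * stack[-1]]

def concat(*words):
    """Compose words left to right: running the program IS composing functions."""
    return reduce(lambda f, g: lambda s: g(f(s)), words)

square = concat(dup, mul)         # "dup *" becomes a single reusable word
program = concat(lit(3), square)  # "3 dup *"
print(program([]))                # [9]
```

The nice design consequence is that any contiguous slice of a program is itself a valid program, which is why factoring out words like square above is so cheap in this style.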
Related Wikipedia articles: