So, I have been doing a little linguistics. A shocking revelation, I know, but I may as well jabber on about it a bit.
Way back in the days before colour television, Jim Lambek turned the linguistic notion of categorial grammar into logic. We've all seen something of the sort in school - nouns, adjectives, subjects and silly old intransitive verbs - to name but a few of the categories you may very well have been subjected to. The question then is how members of these categories can be glued together to form a valid sentence. Or, to turn the problem on its head, how we can get a computer to decide whether or not a sentence is grammatical. Lambek addresses these questions in his paper:
The Mathematics of Sentence Structure, J. Lambek, American Mathematical Monthly, 1958
Forever more, the system presented in that paper has been known as the Lambek calculus. If you are familiar with sequent calculus, the Lambek calculus fits in like this. Start with Ye Olde sequent calculus for the negation-free fragment of intuitionistic logic. First off, get rid of contraction and weakening. What happens is that conjunction shatters into an "additive" version and a "multiplicative" version. Keep only the multiplicative version (call it •). Then, do something dastardly and devious - drop exchange as well. If you do this, then implication breaks up into two versions of implication, both of which are residuated with respect to your conjunction - let's be silly and call the second one "coimplication". Lambek denotes these little beauties by / and \ respectively. What residuation means is that x • y entails z if and only if x entails z / y, and also if and only if y entails x \ z. If you like category theory, this just means that Hom(x • y, z) is isomorphic to Hom(x, z / y) and also to Hom(y, x \ z). So, we have some adjunctions floating around.
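To make the gluing concrete, here is a toy sketch of category reduction in Python. Everything in it - the encoding, the atomic categories NP and S, the little example lexicon - is my own illustrative choice, not anything from Lambek's paper:

```python
# Toy sketch of Lambek-style category reduction. The encoding, atomic
# categories and example lexicon below are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Over:
    """x / y: wants a y on its right, yields an x."""
    num: object
    den: object

@dataclass(frozen=True)
class Under:
    """x \\ y: wants an x on its left, yields a y."""
    den: object
    num: object

def reduce_pair(a, b):
    """One application step between adjacent categories, or None."""
    if isinstance(a, Over) and a.den == b:    # forward:  (x/y) y  =>  x
        return a.num
    if isinstance(b, Under) and b.den == a:   # backward: x (x\y)  =>  y
        return b.num
    return None

def parses(cats):
    """All categories a sequence can reduce to; no exchange, so order matters."""
    if len(cats) == 1:
        return {cats[0]}
    results = set()
    for i in range(len(cats) - 1):
        c = reduce_pair(cats[i], cats[i + 1])
        if c is not None:
            results |= parses(cats[:i] + [c] + cats[i + 2:])
    return results

NP, S = "NP", "S"
likes = Over(Under(NP, S), NP)   # transitive verb: (NP \ S) / NP
```

With this, `parses([NP, likes, NP])` - "John likes Mary" - reduces to the sentence category S, while the word salad `parses([NP, NP])` yields nothing, which is exactly the "computer decides grammaticality" game from above.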
Linguistically, however, getting rid of all of the structural rules is a bit on the dodgy side. For instance, it has been shown that the Lambek calculus only really generates context-free languages. Most unsatisfactory! What we can do, however, is let the structural rules back in, in a controlled and regimented way (much like linear logic...). So, for instance, we may have a unary operator that says that everything in its scope has access to weakening. Here is an example where this all makes sense:
(1) Take the book to John
Let us now postulate that the noun phrase "John" has access to weakening, in which case we can derive that the following is also grammatically valid:
(1') Take the book to John, Mary, Jane and Susan
This makes a lot of sense linguistically. Logically, however, these structural "modalities" open up a can of worms. For instance, if we make the conjunction commutative, by adding back exchange, then the two sorts of implication collapse into a single one. If we now add modalities for contraction and weakening, then we get (intuitionistic) linear logic! Although... exchange is not really warranted on linguistic grounds, because word order matters, and we really do want the two sorts of implication to be different.
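The collapse under exchange is a two-line calculation. Writing • for the multiplicative conjunction and using the residuation laws (x • y ⊢ z iff x ⊢ z / y iff y ⊢ x \ z), it goes like this:

```latex
% With exchange, the two residuals of the conjunction coincide:
\begin{align*}
x \vdash z / y
  &\iff x \bullet y \vdash z
      && \text{(residuation for $/$)} \\
  &\iff y \bullet x \vdash z
      && \text{(exchange)} \\
  &\iff x \vdash y \backslash z
      && \text{(residuation for $\backslash$)}
\end{align*}
% Hence z / y and y \backslash z are interderivable: one implication.
```

So once exchange is around, having both / and \ buys you nothing, which is precisely why dropping it keeps the two implications honestly different.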
I suppose the area where this sort of stuff is best understood is the proof-theoretic side of things, where Belnap's "Display Logic" provides a nice formalism for systematically adding in these sorts of modalities and obtaining cut elimination for free. More or less all of what is known is contained in this thesis:
Reasoning with Polarity in Categorial Type Logic, Raffaella Bernardi, PhD thesis, Utrecht University, 2002
I'll speak about the semantic side of things another time, when I have grokked a bit more...