### Linear Logic - Naturally!

Linear logic has enjoyed enormous popularity over the last couple of decades. For those without training in structural proof theory, the system can be quite intimidating, especially because of the funny notation and weird jargon. In this post, I am going to show you that, in fact,

*you could have invented linear logic*!

To start with, we need to understand classical logic. Our basic ingredients are conjunction, disjunction and negation. I'll denote these by ⊗, ⊕ and ¬ respectively. Since we are doing proof theory, we build our logic by saying how to "introduce" these connectives and describing how they play with each other. This is easy for negation. We want ¬A to act as a dual to A, so we just say that we have A ⊗ ¬A → I and I → A ⊕ ¬A. Here, I have overloaded the symbol "I" to mean "the false proposition" when it appears on the right of an arrow and "the true proposition" when it appears on the left of an arrow. Such overloading of symbols is quite common in proof theory, so it is best to get used to it. Now, at least for classical logic, I can be taken to mean, respectively, the bottom and top element of the associated boolean algebra, viewed as a lattice. Alternatively, it can be viewed as either the initial or terminal object of your categorical semantics. Anyway, we complete our description of negation by throwing in the De Morgan laws (but these are derivable from the rest of the things we'll throw in later).
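Under the Curry–Howard reading (my gloss, not part of the system we're building), ¬A can be modelled as a function from A into an abstract "falsity" type, and the law pairing A with ¬A to produce falsity becomes an evaluation map:

```haskell
-- A sketch of negation under Curry-Howard. The type r stands in for
-- falsity; "Not" is my name for the encoding, not standard vocabulary.
type Not r a = a -> r

-- Pairing A with its negation yields falsity: just apply the function.
contradiction :: (a, Not r a) -> r
contradiction (x, k) = k x

main :: IO ()
main = putStrLn (contradiction ((3 :: Int), show))
```

Notably, the other law (the excluded-middle direction) has no such direct program for an abstract r, a hint of the classical/intuitionistic gap we'll run into near the end of the post.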

Anyway, let's move on. We need to say how to introduce conjunction and disjunction. As it turns out, there are two ways to do this - this is important later. Essentially, the distinction is whether we treat conjunction as a cartesian product or a tensor product, and dually for disjunction. We'll see in just a bit that this distinction does not matter at the moment - classical logic cannot tell the difference! With a view to the future, I'll reserve ⊗ and ⊕ for the tensor interpretation and use x and + for the cartesian/cocartesian interpretation of conjunction and disjunction, respectively.

If we choose the tensor option, then we say that if A → C and B → D, then A ⊗ B → C ⊗ D and, dually, A ⊕ B → C ⊕ D. For the cartesian interpretation, we would throw in implications A x B → A, A x B → B, A → A + B and B → A + B. We also say that if A → B and A → C, then A → B x C. Dually, if A → C and B → C, then A + B → C. The cartesian/cocartesian structure should be clear from these implications.
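In a functional language the cartesian rules are literally pairing and case analysis. A quick Haskell sketch (Curry–Howard again; all the names are mine):

```haskell
import Data.Char (toUpper)

-- From A -> B and A -> C we get A -> B x C (cartesian introduction).
pairIntro :: (a -> b) -> (a -> c) -> a -> (b, c)
pairIntro f g x = (f x, g x)

-- The projections A x B -> A and A x B -> B are fst and snd.

-- Dually, from A -> C and B -> C we get A + B -> C.
copairIntro :: (a -> c) -> (b -> c) -> Either a b -> c
copairIntro f _ (Left x)  = f x
copairIntro _ g (Right y) = g y

-- The injections A -> A + B and B -> A + B are Left and Right.

main :: IO ()
main = do
  print (pairIntro (+ 1) (* 2) (3 :: Int))   -- (4,6)
  putStrLn (copairIntro show (map toUpper) (Right "ok" :: Either Int String))
```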

Now, classical logic also includes certain "structural rules", which describe the structure of sequents. If you don't know what sequents are, don't worry - we won't need them today. There are three basic flavours of structural rules, each of which has a version for conjunction and for disjunction, respectively. The first is "exchange", which says that things commute. That is, we throw in implications A ⊗ B → B ⊗ A and A ⊕ B → B ⊕ A. We're going to keep exchange in all of our systems today. The second type of structural rule is called "weakening". For this, we throw in implications A ⊗ B → A and A ⊗ B → B and, dually, A → A ⊕ B as well as B → A ⊕ B. Notice the similarity with the cartesian introduction rules! We'll exploit this in just a sec. Before that, let's take a peek at the last type of structural rule. It is called "contraction": we throw in all implications of the form A → A ⊗ A and A ⊕ A → A.
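The three structural rules have one-line programs in their conjunction form. A sketch: in plain Haskell all three are freely available, which is precisely the classical situation, whereas GHC's LinearTypes extension would reject the latter two for linear functions.

```haskell
-- Exchange: A x B -> B x A. Harmless; linear logic keeps it.
exchange :: (a, b) -> (b, a)
exchange (x, y) = (y, x)

-- Weakening: throw information away.
weaken :: (a, b) -> a
weaken (x, _) = x

-- Contraction: duplicate information.
contract :: a -> (a, a)
contract x = (x, x)

main :: IO ()
main = do
  print (exchange (1 :: Int, "hi"))   -- ("hi",1)
  print (weaken (1 :: Int, "hi"))     -- 1
  print (contract 'x')                -- ('x','x')
```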

With all of these structural rules, the cartesian and tensor interpretations collapse! Let's show that A ⊗ B → A x B. First, we certainly have A → A and B → B. We can then use weakening to obtain A ⊗ B → A and A ⊗ B → B. Then, we can use the cartesian introduction rule to obtain A ⊗ B → A x B. I'll leave the derivation of A x B → A ⊗ B to you (hint: use a contraction). The derivations for disjunction are completely dual.
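Haskell makes the collapse vivid: because weakening and contraction are freely available, the two products are literally the same type, and both directions of the derivation are ordinary programs (a sketch; the helper names are mine):

```haskell
-- Weakening gives both projections out of the tensor pair...
weakenL :: (a, b) -> a
weakenL (x, _) = x

weakenR :: (a, b) -> b
weakenR (_, y) = y

-- ...and cartesian introduction packages them back up: A ⊗ B -> A x B.
tensorToCart :: (a, b) -> (a, b)
tensorToCart p = (weakenL p, weakenR p)

-- The converse uses contraction: duplicate A x B, then project each copy.
contract :: a -> (a, a)
contract x = (x, x)

cartToTensor :: (a, b) -> (a, b)
cartToTensor p = case contract p of
  (p1, p2) -> (weakenL p1, weakenR p2)

main :: IO ()
main = print (cartToTensor (tensorToCart (1 :: Int, True)))  -- (1,True)
```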

Now, you may argue that you would like your logic to be able to tell the difference between a tensor product and a cartesian product. In order to achieve this, all you need to do is remove weakening and contraction. If you do this, then the tensor and cartesian introduction laws no longer lead to the same thing, so they create different connectives. The ⊗, ⊕ are the main part of the *multiplicative* fragment of linear logic, whereas the x and + connectives come from the *additive* fragment of linear logic. To complete the scenario, all you need to do is add in some units for the additive fragment.

Well, we have almost finished creating linear logic and it is not even dinner time yet! We have one more thing left to do, however. Recall that Glivenko's Theorem says that you can embed classical logic in intuitionistic logic, via the double-negation translation. That is, A is classically provable if and only if ¬¬A is intuitionistically provable. Can we get a similar embedding of classical logic into this logic we have just cooked up?
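The double-negation translation has a familiar programming reading: continuation-passing style. With an abstract answer type r standing in for falsity (my encoding, for intuition only):

```haskell
-- Negation as "A implies falsity", with r playing falsity.
type Not r a = a -> r

-- A -> ¬¬A is constructively unproblematic: feed the value to the continuation.
dnIntro :: a -> Not r (Not r a)
dnIntro x k = k x

-- The reverse direction ¬¬A -> A is NOT definable for an abstract r,
-- which is exactly the gap between intuitionistic and classical provability.

main :: IO ()
main = print (dnIntro (41 :: Int) (+ 1))  -- 42
```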

As it stands, there is no obvious way in which to embed classical logic. What we need to do is create a mechanism whereby we can access weakening and contraction without actually adding full weakening and contraction to our logic. Let's worry about contraction and weakening for ⊕ only, for now. Remember, what we want to do is take a classical formula A and translate it into something provable in our logic. But to do so, we need to somehow give it access to weakening and contraction. So, let's have the translation mark A in a way that gives it access to these rules. Let's be crazy and denote this mark by ?. We'll also throw in a rule that says that any formula can be given access to weakening/contraction: A → ?A. Also, there's no point in marking a formula twice, so let's also say that ??A → ?A. Finally, postulate the ⊕ version of weakening and contraction only for formulae that have a ? out front. That is, contraction looks something like ?A ⊕ ?A → ?A. Everything for ⊗ pushes through in a completely dual manner. We could, say, mark formulae with a ! to give them access to the ⊗ weakening/contraction rules. These marks are the "exponentials" of linear logic and actually comprise an S4 modality - more on these critters here.
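A toy model of the !-mark (the dual of ? above) treats it as a wrapper licensing duplication and discarding, with dereliction and digging as its S4-style operations. This is only a sketch, since plain Haskell already permits all of this everywhere:

```haskell
-- Bang a marks a value as having access to the ⊗ structural rules.
newtype Bang a = Bang a deriving (Show, Eq)

-- Dereliction: A -> ?A dualises to !A -> A.
derelict :: Bang a -> a
derelict (Bang x) = x

-- Digging: ??A -> ?A dualises to !A -> !!A.
dig :: Bang a -> Bang (Bang a)
dig = Bang

-- Contraction and weakening, available only under the mark.
contractBang :: Bang a -> (Bang a, Bang a)
contractBang b = (b, b)

weakenBang :: Bang a -> ()
weakenBang _ = ()

main :: IO ()
main = do
  print (derelict (dig (Bang (5 :: Int))))  -- Bang 5
  print (contractBang (Bang 'x'))           -- (Bang 'x',Bang 'x')
```

Readers familiar with category theory may recognise dereliction and digging as the counit and comultiplication of a comonad, which is one standard way to interpret ! semantically.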

Apart from some added formalism and some theorems to say that everything works out as it should, you have now created linear logic!