Chomsky had turned linguistics into a search for the computational processes underlying language, and in so doing, had unleashed a flood of new observations about previously undiscovered patterns. But his proposal had a deeper significance. Remember Descartes' argument that human intelligence could never be duplicated by a machine that needed a special adaptation for every particular action. A human language is an infinite set of sentences, and you can't prepare for a conversation ahead of time with a pre-set list of questions and planned responses. What Chomsky realized is that the computer represented a new kind of machine that could deal with such infinite, open-ended challenges. A finite state machine does indeed have this ability. But Chomsky began to explore a different model of computing, a model that has come to be known as a phrase structure grammar. With this model, Chomsky tried to show that Descartes' miracle was not a miracle at all, but rather simple computation.

Here's the challenge. English consists of an infinite set of sentences, and there are also infinitely many sentences that don't belong to English. We want a device that can recognize, for any given sentence, whether it is part of English or not. As Descartes might say, this is something even the lowest type of man can do.

Chomsky showed how such a recognizer can be produced for English with a phrase structure grammar, or PSG. A PSG consists of rules for breaking down a symbol into its parts. We begin with the basic idea that a sentence, S, consists of a noun phrase, NP, and a verb phrase, VP. This is the rule S → NP VP; it says that the S symbol can be broken down into two symbols, NP followed by VP. These symbols can also be broken down in various ways. For example, an NP could simply be a name, and a VP could be a verb followed by an NP, which is a direct object, as shown by the rules NP → Name and VP → V NP. Now we add some special rules that introduce words, such as Name → Susan, Name → Ted, and V → saw. We now have a complete PSG that can recognize a few sentences of English.

Think about the sentence 'Susan saw Ted'. We can see right off that 'Susan' and 'Ted' are names, and the word rules apply like this. Now we can apply our NP rule, giving this. This makes it possible to apply the VP rule, and finally the S rule, giving this tree-like structure. This list of rules constitutes a recognizer for a small subset of English sentences. It will recognize a sentence if it can use its rules to build a tree structure on top of the sentence, culminating in the S symbol. It can recognize sentences like 'Susan saw Ted' and 'Ted saw Susan'. It would correctly reject sequences like 'saw Susan', 'Ted Susan', or 'Susan saw'. For 'Susan saw', we could get started like this. But now we're stuck: there's no rule that applies to an NP followed by a V. This expresses the idea that the verb 'saw' is transitive. It needs an object following it, an NP like 'Ted'.

We can add rules to our grammar to cover more sentences of English, for example, rules for other transitive verbs like 'hit', 'like', and 'help', and rules for other names. Of course, we need rules for other categories like prepositions and adjectives. But this still seems like the machine Descartes criticized, needing a special adaptation for every particular action. We can add a few new rules, and they will cover a few new sentences. The miracle of the human ability is that there's no limit to the number of sentences it can handle. We don't appear any closer to explaining Descartes' miracle. Actually, the solution to this was there all the time.
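Before turning to that solution, it may help to see the small PSG we've built so far written out as a program. The sketch below is only illustrative: the rules mirror the ones just described, but the dictionary encoding, the function names, and the recursive-descent recognition strategy are my own assumptions, not anything specified in the lecture.

```python
# A minimal sketch of the small phrase structure grammar (PSG) described above.
# The rules mirror the lecture's; the encoding and recognizer are illustrative.

GRAMMAR = {
    "S":    [["NP", "VP"]],        # S  -> NP VP
    "NP":   [["Name"]],            # NP -> Name
    "VP":   [["V", "NP"]],         # VP -> V NP  (a verb and its direct object)
    "Name": [["Susan"], ["Ted"]],  # word rules introducing names
    "V":    [["saw"]],             # word rule introducing a transitive verb
}

def recognize(symbol, words, start):
    """Return every position a parse of `symbol` could end at, starting at `start`."""
    if symbol not in GRAMMAR:                      # a word: must match the input
        ok = start < len(words) and words[start] == symbol
        return {start + 1} if ok else set()
    ends = set()
    for parts in GRAMMAR[symbol]:                  # try each way of breaking the symbol down
        positions = {start}
        for part in parts:
            positions = {e for p in positions for e in recognize(part, words, p)}
        ends |= positions
    return ends

def is_sentence(sentence):
    """A sentence is recognized if an S can be built over all of its words."""
    words = sentence.split()
    return len(words) in recognize("S", words, 0)

print(is_sentence("Susan saw Ted"))  # True  -- the rules build up to an S
print(is_sentence("Ted saw Susan"))  # True
print(is_sentence("saw Susan"))      # False -- no NP to start the sentence
print(is_sentence("Susan saw"))      # False -- 'saw' is transitive and has no object
```

The recognizer succeeds exactly when the rules can build a structure over the whole sentence that culminates in the S symbol, which is just the recognition criterion described above.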
It's already built into the PSG that we've been looking at. Remember the structure of a PSG rule: we can have rules with one symbol on the left-hand side and one or more symbols on the right, like this. This means that an S can be broken down into an NP and a VP. But what if a symbol on the right-hand side of one rule also appears on the left-hand side of another? After all, we haven't ruled that out. Consider the verb 'think', as in 'Sarah thinks Susan saw Ted'. 'Think' is a transitive verb; it needs an object, but its object is itself a sentence. This is the rule we need for 'think': VP → V S. Notice the S symbol on the right-hand side of the rule, expressing the fact that verbs like 'think' can take sentences as their objects.

With this simple change, we have a solution to Descartes' miracle. Our PSG can now recognize an infinite set of sentences. Here's a parse tree for 'Sarah thinks Susan saw Ted'. The same PSG will also accept 'Tom believes Sarah thinks Susan saw Ted', and 'Ted thinks Tom believes Sarah thinks Susan saw Ted'. These are not necessarily interesting sentences, but they're undoubtedly acceptable sentences of the English language, and our simple PSG will recognize infinitely many of them.
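To make the recursion concrete, here is the earlier sketch with its grammar extended along these lines. It reuses the recognize and is_sentence functions from the previous block, and the split into Vt (transitive verbs like 'saw') and Vs (sentence-taking verbs like 'think') is my own simplification rather than the lecture's rule set; the essential point is the S symbol on the right-hand side of a VP rule, which is what lets the grammar accept an unbounded set of sentences.

```python
# The recursive extension: a VP rule whose right-hand side contains S.
# Reuses recognize() and is_sentence() from the sketch above; the Vt/Vs split
# is an illustrative assumption, not part of the lecture's rule set.

GRAMMAR = {
    "S":    [["NP", "VP"]],
    "NP":   [["Name"]],
    "VP":   [["Vt", "NP"],            # VP -> Vt NP  (saw Ted)
             ["Vs", "S"]],            # VP -> Vs S   (thinks Susan saw Ted)
    "Name": [["Susan"], ["Ted"], ["Sarah"], ["Tom"]],
    "Vt":   [["saw"]],
    "Vs":   [["thinks"], ["believes"]],
}

print(is_sentence("Sarah thinks Susan saw Ted"))                          # True
print(is_sentence("Tom believes Sarah thinks Susan saw Ted"))             # True
print(is_sentence("Ted thinks Tom believes Sarah thinks Susan saw Ted"))  # True
```

Because the VP rule can reintroduce an S, the recognizer can keep embedding sentences inside sentences without any new rules being added, which is exactly the escape from "a special adaptation for every particular action."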