Computation takes place not only in the contrived settings of scientific experimentation, but also in natural circumstances. Here we approach computation in natural contexts: how does nature compute? Turing machines and Chomsky grammars are rewriting systems, and the same holds for the systems of Post, Thue, Markov, Lindenmayer, and other classes of axiomatic systems. If, among all natural objects, we focus on the description of natural language, we must note that the major trends in contemporary linguistics view syntax as a rewriting process. Is rewriting unavoidable in this case? Does our mind work by rewriting? Does nature compute in this way? We shall argue that the answer may be negative. The arguments come from computability theory as well as from linguistics; we present the former formally and the latter informally.

As regards the arguments from computability theory, we will see that the operation of adjoining yields a large generative capacity. This is the case with contextual grammars. It has recently been proved that every recursively enumerable language is the quotient by a regular language of a language generated by a contextual grammar of a particular form. Thus, adjoining (paste) and quotient (cut) lead to computational universality. Recursively enumerable languages can also be characterized as the quotient by a regular language of a language generated by an insertion grammar. The same result is obtained with the splicing operation, a formal model of DNA recombination, which is again a cut-and-paste operation. On the basis of the proof of this result, several further characterizations of recursively enumerable languages have been obtained. Computability theory, then, could be reconstructed without rewriting (and without nonterminal symbols), with no loss of power. Our first aim is to present some formal aspects of this reconstruction; we then try to draw some consequences for the future development of the generative theory of natural language.
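To fix intuitions about the operations involved, the minimal sketch below implements the two cut-and-paste primitives in their standard textbook form: internal contextual adjoining, where a context (u, v) is wrapped around an occurrence of a selector substring, and the splicing rule (u1, u2; u3, u4) of H systems. The Python code and all identifiers in it are ours, purely for illustration; they are not part of the formal development discussed in this paper.

```python
def adjoin(word, selector, context):
    """Internal contextual adjoining: if 'selector' occurs in 'word',
    wrap one occurrence with the context (u, v), i.e.
    x selector y  =>  x u selector v y."""
    u, v = context
    i = word.find(selector)
    if i == -1:
        return None                      # selector absent: no derivation step
    return word[:i] + u + selector + v + word[i + len(selector):]

def splice(x, y, rule):
    """Splicing rule (u1, u2; u3, u4): from x = x1 u1 u2 x2 and
    y = y1 u3 u4 y2 produce x1 u1 u4 y2 (one crossover, as in the
    formal model of DNA recombination)."""
    u1, u2, u3, u4 = rule
    i = x.find(u1 + u2)
    j = y.find(u3 + u4)
    if i == -1 or j == -1:
        return None                      # rule not applicable
    return x[:i + len(u1)] + y[j + len(u3):]

# Adjoining only pastes material around a selector; splicing cuts two
# strings and pastes the resulting pieces crosswise.
print(adjoin("acb", "c", ("a", "b")))                  # -> "aacbb"
print(splice("aacbb", "dcee", ("a", "c", "d", "c")))   # -> "aacee"
```

Note that neither operation replaces symbols: adjoining only pastes material in, and splicing cuts and pastes existing strings, which is why the quotient (cut) by a regular language is the natural companion of these operations in the universality results mentioned above.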