DNA as topological quantum computer: X

In previous postings I, II, III, IV, V, VI, VII, VIII, IX I have discussed various aspects of the idea that DNA could act as a topological quantum computer using the fundamental braiding operation as a universal 2-gate. Many problems of quantum computation in the standard sense might relate to a wrong view about quantum theory. If the TGD Universe is the physical universe, the situation would improve in many respects. There is the new fractal view about quantum jump and observer as "self"; there is the p-adic length scale hierarchy and the hierarchy of Planck constants as well as the self hierarchy; there is a new view about entanglement and the possibility of irreducible entanglement carrying genuine information and making possible quantum superposition of fractal quantum computations and quantum parallel dissipation; there is zero energy ontology, the notion of M-matrix allowing one to understand quantum theory as a square root of thermodynamics, and the notion of measurement resolution allowing one to identify the M-matrix in terms of Connes tensor product; there is also the notion of magnetic body providing one promising realization for braids in tqc, etc. Taking the risk of boring the reader by repeating things that I have already said, I will summarize these new aspects of TGD below. There is also a second motivation. Quantum TGD and the TGD inspired theory of consciousness involve quite a bundle of new ideas, and the continual checking of internal consistency by writing everything through again and again is of utmost importance. The following considerations can also be seen as this kind of checking. I can only offer apologies to the benevolent reader: this is a work in progress.
A. Fractal hierarchies

Fractal hierarchies are the essence of TGD. There is a hierarchy of spacetime sheets labelled by preferred p-adic primes. There is a hierarchy of Planck constants reflecting a book-like structure of the generalized imbedding space and identified in terms of a hierarchy of dark matters. At the level of conscious experience these hierarchies correspond to a hierarchy of conscious entities, selves: a self experiences its subselves as mental images. Fractal hierarchies mean a completely new element in the model for quantum computation. The decomposition of a quantum computation into a fractal hierarchy of quantum computations is one implication of this hierarchy, and means that each quantum computation proceeds from longer to shorter time scales T_n = 2^n T_0 as a cascade-like process, such that at each level there is a large number of quantum computations performed with various values of input parameters defined by the output at the previous level. Under some additional assumptions to be discussed later this hierarchy involves at a given level a large number of replicas of a given submodule of tqc, so that the output of a single fractal submodule automatically gives probabilities for various outcomes as required.
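The cascade structure described above can be illustrated with a purely classical toy sketch: a computation at time scale T spawns, for each output of the calling level, a family of sub-computations at time scale T/2. All names here (`cascade`, `fan_out`, the toy parameters) are hypothetical illustrations, not part of the TGD formalism.

```python
# Toy sketch of the fractal cascade of section A: each level runs at half the
# time scale of its parent, and each input spawns its own family of
# sub-computations at the next level down. Function and parameter names are
# hypothetical.

def cascade(inputs, T, T_min, fan_out, leaf):
    """Run one level of the cascade at time scale T.

    inputs  : input parameters for this level
    T       : time scale of this level (halved at each deeper level)
    T_min   : shortest time scale considered
    fan_out : maps one output to the list of inputs for the next level
    leaf    : maps an input at the shortest scale to a final result
    """
    if T < T_min:
        return [leaf(x) for x in inputs]
    results = []
    for x in inputs:
        # each input defines its own sub-cascade at scale T/2
        results.extend(cascade(fan_out(x), T / 2, T_min, fan_out, leaf))
    return results

# Toy usage: four halvings of the time scale, each doubling the parameter set.
leaves = cascade([0], T=8.0, T_min=1.0,
                 fan_out=lambda x: [2 * x, 2 * x + 1],
                 leaf=lambda x: x)
# leaves enumerates all 2^4 = 16 parameter combinations of the deepest level
```

The point of the sketch is only the bookkeeping: a single top-level call automatically produces the whole family of lowest-level runs, mirroring how a single fractal submodule would yield probabilities for all outcomes.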
B. Irreducible entanglement and the possibility of quantum parallel quantum computation

The basic distinction from standard measurement theory is irreducible entanglement, which is not reduced in the quantum jump.

B.1 NMP and the possibility of irreducible entanglement

Negentropy Maximization Principle (NMP) states that entanglement entropy is minimized in the quantum jump. For standard Shannon entropy this would lead to a final state which corresponds to a ray of the state space. If the entanglement probabilities are rational, or even algebraic, one can replace Shannon entropy with its number-theoretic counterpart, in which the p-adic norm of the probability replaces the probability in the argument of the logarithm: log(p_n) → log(N_p(p_n)). This entropy can have negative values. It is not quite clear whether the prime p should be chosen to maximize the number-theoretic negentropy or whether p is the p-adic prime characterizing the lightlike partonic 3-surface in question. Obviously NMP favors the generation of irreducible entanglement, which can however be reduced in the U process. Irreducible entanglement is something completely new, and the proposed interpretation is in terms of conscious experiences with positive content, such as understanding. Quantum superposition of unitarily evolving quantum states generalizes to a quantum superposition of quantum jump sequences defining dissipative time evolutions. Dissipating quarks inside quantum coherent hadrons would provide a basic example of this kind of situation.

B.2 Quantum parallel quantum computations and conscious experience

The combination of quantum parallel quantum jump sequences with the fractal hierarchies of scales implies the possibility of quantum parallel quantum computations. In ordinary quantum computation halting selects a single computation, but in the present case an arbitrarily large number of computations can be carried out simultaneously at various branches of the entangled state.
The probability distribution for the outcomes is obtained using only a single computation. One would have a quantum superposition of spacetime sheets (assignable to the maxima of Kähler function), each representing classically the outcome of a particular computation. Each branch would correspond to its own conscious experience, but the entire system would correspond to a self experiencing consciously the outcome of the computation as intuitive and holistic understanding, abstraction. Emotions and emotional intellect could correspond to this kind of non-symbolic representation for the outcome of computation, as analogs of collective parameters like temperature and pressure.

B.3 Delicacies

There are several delicacies involved.
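The number-theoretic entropy of B.1 can be made concrete with a small numerical sketch. It implements the substitution log(p_n) → log(N_p(p_n)) stated above, with N_p the p-adic norm |q|_p = p^(-v_p(q)); the function names are my own hypothetical choices, only the formula comes from the text.

```python
# Sketch of the number-theoretic Shannon entropy of B.1: the p-adic norm of
# the probability replaces the probability inside the logarithm. For rational
# probabilities the result can be negative, i.e. the entanglement can carry
# genuine (neg)information. Function names are hypothetical.
from fractions import Fraction
from math import log

def padic_valuation(q: Fraction, p: int) -> int:
    """Power of p in q (negative if p divides the denominator)."""
    v, num, den = 0, q.numerator, q.denominator
    while num % p == 0:
        num //= p
        v += 1
    while den % p == 0:
        den //= p
        v -= 1
    return v

def padic_norm(q: Fraction, p: int) -> Fraction:
    """p-adic norm N_p(q) = p^(-v_p(q))."""
    return Fraction(p) ** (-padic_valuation(q, p))

def number_theoretic_entropy(probs, p):
    """S_p = -sum_n p_n log(N_p(p_n)): Shannon form with the p-adic norm."""
    return -sum(float(q) * log(float(padic_norm(q, p))) for q in probs)

# Two-state entanglement with probabilities 1/2, 1/2 and p = 2:
# |1/2|_2 = 2, so S_2 = -log 2 < 0, i.e. the negentropy is positive.
S2 = number_theoretic_entropy([Fraction(1, 2), Fraction(1, 2)], p=2)
```

For ordinary Shannon entropy the same distribution would give +log 2; the sign flip under the p-adic norm is exactly what allows NMP to favor this entanglement rather than reduce it.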
C. Connes tensor product defines universal entanglement

Both timelike entanglement between quantum states with opposite quantum numbers, represented by the M-matrix, and spacelike entanglement reduce to a Connes tensor product dictated highly uniquely by the measurement resolution characterized by an inclusion of HFFs of type II_1.
C.1 Timelike and spacelike entanglement in zero energy ontology

If hyperfinite factors of type II_1 are all that is needed, then the Connes tensor product defines a universal S-matrix, and the most general situation corresponds to a direct sum of them. The M-matrix for each summand is a product of a Hermitian square root of the density matrix and a unitary S-matrix, multiplied by a square root of probability having an interpretation as an analog of a Boltzmann weight or a probability defined by the density matrix (note that it is essential to have Tr(Id)=1 for factors of type II_1). If factors of type I_∞ are present, the situation is more complex. This means that quantum computations are highly universal and M-matrices are characterized by the inclusion N ⊂ M in each summand defining the measurement resolution. Hermitian elements of N act as symmetries of the M-matrix. The identification of the reducible entanglement characterized by Boltzmann-weight-like parameters in terms of thermal equilibrium would allow one to interpret quantum theory as a square root of thermodynamics.

If the entanglement probabilities defined by the S-matrix and assignable to N rays do not belong to the algebraic extension used, then a full state function reduction is prevented by NMP. If the generalized Boltzmann weights are also algebraic, then also the thermal entanglement is irreducible. In p-adic thermodynamics for the Virasoro generator L_0, and using some cutoff for conformal weights, the Boltzmann weights are rational numbers expressible using powers of the p-adic prime p.

C.2 Effects of finite temperature

Usually finite temperature is seen as a problem for quantum computation. In the TGD framework the effect of finite temperature is to replace zero energy states, formed as pairs of positive and negative energy states, with a superposition in which the energy varies. One has an ensemble of spacetime sheets which should represent nearly replicas of the quantum computation. There are two cases to be considered.
If the degrees of freedom assignable to topological quantum computation do not depend on the energy of the state, the thermal width does not affect the relevant probabilities at all. In practice the probabilities are affected even in the case of tqc, since 1-gates are not purely topological and the effects of temperature in spin degrees of freedom are unavoidable. As T grows, the probability distribution for the outcomes flattens, and it becomes difficult to select the desired outcome as the one appearing with maximal probability.

D. Possible problems related to quantum computation

At least the following problems are encountered in quantum computation.
D.1 The notion of coherence region in TGD framework

In the standard framework one can speak about coherence in two senses. At the level of Schrödinger amplitudes one speaks about a coherence region inside which it makes sense to speak about Schrödinger time evolution. This notion is rather ill-defined. In the TGD framework a coherence region is identifiable as the region inside which the modified Dirac equation holds true. Strictly speaking, this region corresponds to a lightlike partonic 3-surface, whereas the 4-D spacetime sheet corresponds to the coherence region for classical fields. The p-adic length scale hierarchy and the hierarchy of Planck constants mean that arbitrarily large coherence regions are possible. The precise definition for the notion of coherence region and the presence of scale hierarchies imply that coherence in the case of a single quantum computation is not a problem in the TGD framework. Decoherence time or coherence time corresponds to the temporal span of the spacetime sheet, and a hierarchy coming in powers of two for a given value of Planck constant is predicted by basic quantum TGD. The p-adic length scale hypothesis and the favored values of Planck constant would naturally reflect this fundamental fractal hierarchy.

D.2 Decoherence of density matrix and replicas of tqc

A second phenomenological description boils down to the assumption that the non-diagonal elements of the density matrix in some preferred basis (involving spatial localization of particles) approach zero. The existence of more or less faithful replicas of a spacetime sheet in a given scale allows one to identify the counterpart of this notion in the TGD context. Decoherence would mean a loss of information in the averaging of the M-matrix and the density matrix associated with these spacetime sheets. Topological computations are probabilistic. This means that one has a collection of spacetime sheets such that each spacetime sheet corresponds to more or less the same tqc and therefore the same M-matrix.
If M is too random (in the limits allowed by Connes tensor product), the analog of generalized phase information represented by its "phase", the S-matrix, is useless. In order to avoid decoherence in this sense, the spacetime sheets must be approximate copies of each other. Almost copies are expected to result by dissipation leading to asymptotic self-organization patterns depending only weakly on the initial conditions and having also a spacetime correlate. Obviously, the role of dissipation in eliminating the effects of decoherence in tqc would be something new. The enormous symmetries of the M-matrix, the uniqueness of the S-matrix for a given resolution and parameters characterizing the braiding, fractality, and the generalized Bohr orbit property of spacetime sheets, plus dissipation, give good hopes that almost replicas can be obtained.

D.3 Isolation and representations of the outcome of tqc

The interaction with the environment makes quantum computation difficult. In the case of topological quantum computation this interaction corresponds to the formation of braid strands connecting the computing spacetime sheet with spacetime sheets in the environment. The environment is four-dimensional in the TGD framework, and an isolation in the time direction might be required. The spacetime sheets responsible for replicas of tqc should not be connected by lightlike braid strands having timelike projections in M^4. The length scale hierarchy coming in powers of two and finite measurement resolution might help considerably. Finite measurement resolution means that those strands which connect spacetime sheets topologically condensed to the spacetime sheets in question do not induce entanglement visible at this level and should not affect tqc in the resolution used. Hence only the elimination of strands responsible for tqc at a given level, and connecting the computing spacetime sheet to spacetime sheets at the same level in the environment, is necessary and would require magnetic isolation.
Note that superconductivity might provide this kind of isolation. This kind of elimination could involve the same mechanism as the initiation of tqc, which cuts the braid strands, so that initiation and isolation might be more or less the same thing. Strands would reconnect after the halting of tqc and would make possible the communication of the outcome of the computation along strands, by using say em currents, in turn generating generalized EEG, nerve pulse patterns, gene expression, etc. Halting and initiation could thus be more or less synonymous with isolation and communication of the outcome of tqc.

D.4 How to express the outcome of quantum computation?

The outcome of quantum computation is basically a representation of the probabilities for the outcome of tqc. There are two representations for the outcome of tqc. A symbolic representation, which quite generally is in terms of probability distributions represented in terms of "classical spacetime" physics. Rates for various processes, having basically an interpretation as geometro-temporal densities, would represent the probabilities, just as in the case of a particle physics experiment. For tqc in living matter this would correspond to gene expression, neural firing, EEG patterns, and so on. A representation as a conscious experience is another (and actually the ultimate) representation of the outcome. It need not have any symbolic counterpart since it is felt. Intuition, emotions and emotional intelligence would naturally relate to this kind of representation made possible by irreducible entanglement. This representation would be based on fuzzy qubits and would mean that the outcome would be true or false only with a certain probability. This unreliability would be felt consciously.
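The fuzzy-qubit idea above has a simple statistical reading, which the following minimal sketch illustrates: a single halting yields "true" only with probability p, and only an ensemble of (near-)replicas of the same tqc recovers p as a relative frequency. The function names and the use of a pseudo-random generator are my own illustrative assumptions.

```python
# Minimal sketch of a fuzzy qubit in the sense of D.4: a single readout is
# True only with probability p_true, and repeated replicas of the same tqc
# estimate p_true as a relative frequency. Names are hypothetical.
import random

def fuzzy_qubit_readout(p_true: float, rng: random.Random) -> bool:
    """One halting of tqc: True with probability p_true, else False."""
    return rng.random() < p_true

def estimate_outcome(p_true: float, replicas: int, seed: int = 0) -> float:
    """Fraction of replicas reading out True; approximates p_true."""
    rng = random.Random(seed)
    hits = sum(fuzzy_qubit_readout(p_true, rng) for _ in range(replicas))
    return hits / replicas

# With many replicas the frequency estimate concentrates around p_true.
estimate = estimate_outcome(p_true=0.8, replicas=10_000)
```

This also makes the consciously felt "unreliability" quantitative: with few replicas the estimate fluctuates strongly, and only the fractal supply of replicas discussed in section A would make the outcome probabilities sharp.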
In the proposed model of tqc the emergence of an EEG rhythm (say theta rhythm) and correlated firing patterns would correspond to the isolation at the first half period of tqc, and the random firing at the second half period to the subsequent tqcs at shorter time scales coming as negative powers of 2. The fractal hierarchy of time scales would correspond to a hierarchy of frequency scales for generalized EEG, and the power spectra at these scales would give information about the outcome of tqc. Synchronization would obviously be an essential element in this picture and could be understood in terms of the classical dynamics which defines the spacetime surface as a generalized Bohr orbit. Tqc would be analogous to the generation of a dynamical hologram or "conscious hologram" (see this). The EEG rhythm would correspond to the reference wave, and the contributions of spikes to EEG would correspond to the incoming wave interfering with it.

D.5 How is data fed into submodules of tqc?

The scale hierarchy obviously gives tqc a fractal modular structure, and the question is how data is fed to submodules at shorter length scales. There are certainly interactions between different levels of the scale hierarchy. The general ideas about the master-slave hierarchy assigned with self-organization support the hypothesis that these interactions are directed from longer to shorter scales and have an interpretation as a specialization of input data to tqc submodules represented by the smaller spacetime sheets of the hierarchy. The call of a submodule would occur when the tqc of the calling module halts and the result of the computation is expressed as a 4-D pattern. The lower level module would start only after the halting of tqc (with respect to subjective time), and the durations of the resulting tqcs would come as T_n = 2^n T_0, so that a geometric series of tqcs would become possible. There would be an entire family of tqcs at the lower level corresponding to different values of the input parameters from the calling module.
D.6 The role of dissipation and energy feed

Dissipation plays a key role in the theory of self-organizing systems. Its role is to serve as a Darwinian selector. Without an external energy feed the outcome is a situation in which all organized motions disappear. In the presence of an energy feed, highly unique self-organization patterns depending very weakly on the initial conditions emerge. In the case of tqc, one function of dissipation would be to drive the braidings to static standard configurations, prevent overbraiding, and perhaps even effectively eliminate fluctuations in non-topological degrees of freedom. Note that magnetic fields are important for 1-gates; magnetic flux conservation however saves magnetic fields from dissipation. An external energy feed is needed in order to generate new braidings. In the proposed model of cellular tqc the flow of intracellular water induces the braiding and requires an energy feed. Also now dissipation would drive this flow to standard patterns coding for tqc programs. Metabolic energy would also be needed in order to control whether lipids can move or not, by generating cis-type unsaturated bonds.

For the model of DNA as topological quantum computer see the chapter DNA as Topological Quantum Computer.
