A new computational methodology for executing calculations with infinite and infinitesimal quantities is described in this paper. It is based on the principle ‘The part is less than the whole’ introduced by the Ancient Greeks and applied to all numbers (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). It is shown that it becomes possible to write down finite, infinite, and infinitesimal numbers by a finite number of symbols as particular cases of a unique framework. The new methodology has allowed us to introduce the Infinity Computer working with such numbers (its simulator has already been realized). Examples dealing with divergent series, infinite sets, and limits are given.
Pattern recognition is represented as the limit to which an infinite Turing process converges. A Turing machine in which the bits are substituted with qubits is introduced. That quantum Turing machine can recognize two complementary patterns in any data. That ability of universal pattern recognition is interpreted as an intellect featuring any quantum computer. The property is valid only within a quantum computer: to utilize it, the observer should be sited inside it. Being outside it, the observer would obtain a quite different result depending on the degree of entanglement of the quantum computer and observer. All extraordinary properties of a quantum computer are due to involving a converging infinite computational process containing necessarily both a continuous advancing calculation and a leap to the limit. Three types of quantum computation can be distinguished according to whether the series is a finite one or represents an infinite rational or an irrational number.
Infinite machines (IMs) can do supertasks. A supertask is an infinite series of operations done in some finite time. Whether or not our universe contains any IMs, they are worthy of study as upper bounds on finite machines. We introduce IMs and describe some of their physical and psychological aspects. An accelerating Turing machine (an ATM) is a Turing machine that performs every next operation twice as fast. It can carry out infinitely many operations in finite time. Many ATMs can be connected together to form networks of infinitely powerful agents. A network of ATMs can also be thought of as the control system for an infinitely complex robot. We describe a robot with a dense network of ATMs for its retinas, its brain, and its motor controllers. Such a robot can perform psychological supertasks - it can perceive infinitely detailed objects in all their detail; it can formulate infinite plans; it can make infinitely precise movements. An endless hierarchy of IMs might realize a deep notion of intelligent computing everywhere.
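The timing claim behind the ATM can be checked with a quick geometric-series sketch (assuming, for illustration, that the first operation takes one second and each subsequent operation takes half the time of the previous one):

```python
def total_time(num_ops):
    """Total seconds an accelerating Turing machine spends on its
    first num_ops operations: step n takes 2**-n seconds."""
    return sum(2.0 ** -n for n in range(num_ops))

# The partial sums approach 2, so infinitely many operations
# fit inside 2 seconds of real time.
print(total_time(10))   # 1.998046875
print(total_time(60))
```

The bound 2 is the sum of the geometric series, which is why the supertask completes in finite time.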
This paper argues that the idea of a computer is unique. Calculators and analog computers are not different ideas about computers, and nature does not compute by itself. Computers, once clearly defined in all their terms and mechanisms, rather than enumerated by behavioral examples, can be more than instrumental tools in science, and more than a source of analogies and taxonomies in philosophy. They can help us understand semantic content and its relation to form. This can be achieved because they have the potential to do more than calculators, which are computers that are designed not to learn. Today’s computers are not designed to learn; rather, they are designed to support learning; therefore, any theory of content tested by computers that currently exist must be of an empirical, rather than a formal, nature. If they are someday designed to learn, we will see a change in roles, requiring an empirical theory about the Turing architecture’s content, using the primitives of learning machines. This way of thinking, which I call the intensional view of computers, avoids the problems of analogies between minds and computers. It focuses on the constitutive properties of computers, such as showing clearly how they can help us avoid the infinite regress in interpretation, and how we can clarify the terms of the suggested mechanisms to facilitate a useful debate. Within the intensional view, syntax and content in the context of computers become two ends of physically realizing correspondence problems in various domains.
Any computer can create a model of reality. The hypothesis that a quantum computer can generate such a model, designated as quantum, which coincides with the modeled reality, is discussed. Its grounds are the theorems about the absence of “hidden variables” in quantum mechanics. Quantum modeling requires the axiom of choice. The following conclusions are deduced from the hypothesis. A quantum model, unlike a classical model, can coincide with reality. Reality can be interpreted as a quantum computer. Physical processes represent computations of the quantum computer. Quantum information is the real fundament of the world. The conception of the quantum computer unifies physics and mathematics and thus the material and the ideal world. The quantum computer is a non-Turing machine in principle. Any quantum computation can be interpreted as an infinite classical computational process of a Turing machine. The quantum computer introduces the notion of an “actually infinite computational process”. The discussed hypothesis is consistent with all of quantum mechanics. The conclusions address a form of neo-Pythagoreanism: unifying the mathematical and the physical, the quantum computer is situated in an intermediate domain of their mutual transformation.
Natural argument is represented as the limit to which an infinite Turing process converges. A Turing machine in which the bits are substituted with qubits is introduced. That quantum Turing machine can recognize two complementary natural arguments in any data. That ability of natural argument is interpreted as an intellect featuring any quantum computer. The property is valid only within a quantum computer: to utilize it, the observer should be sited inside it. Being outside it, the observer would obtain a quite different result depending on the degree of entanglement of the quantum computer and observer. All extraordinary properties of a quantum computer are due to involving a converging infinite computational process containing necessarily both a continuous advancing calculation and a leap to the limit. Three types of quantum computation can be distinguished according to whether the series is a finite one or represents an infinite rational or an irrational number.
A comment on Paul Schoemaker's target article in Behavioral and Brain Sciences, 14 (1991), pp. 205-215, "The Quest for Optimality: A Positive Heuristic of Science?" (https://doi.org/10.1017/S0140525X00066140). This comment argues that the optimizing model of decision leads to an infinite regress once internal costs of decision (i.e., information and computation costs) are duly taken into account.
The notion of an ideal reasoner has several uses in epistemology. Often, ideal reasoners are used as a parameter of (maximum) rationality for finite reasoners (e.g. humans). However, the notion of an ideal reasoner is normally construed with such a high degree of idealization (e.g. infinite/unbounded memory) that this use is unadvised. In this dissertation, I investigate the conditions under which an ideal reasoner may be used as a parameter of rationality for finite reasoners. In addition, I present and justify the research program of computational epistemology, which investigates the parameter of maximum rationality for finite reasoners using computer simulations.
2nd edition. Many-valued logics are those logics that have more than the two classical truth values, to wit, true and false; in fact, they can have from three to infinitely many truth values. This property, together with truth-functionality, provides a powerful formalism to reason in settings where classical logic—as well as other non-classical logics—is of no avail. Indeed, originally motivated by philosophical concerns, these logics soon proved relevant for a plethora of applications ranging from switching theory to cognitive modeling, and they are today in more demand than ever, due to the realization that inconsistency and vagueness in knowledge bases and information processes are not only inevitable and acceptable, but also perhaps welcome. The main modern applications of (any) logic are to be found in the digital computer, and we thus require practical knowledge of how to computerize—which also means automate—decisions (i.e. reasoning) in many-valued logics. This, in turn, necessitates a mathematical foundation for these logics. This book provides both this mathematical foundation and practical knowledge in a rigorous, yet accessible, text, while at the same time situating these logics in the context of the satisfiability problem (SAT) and automated deduction. The main text is complemented with a large selection of exercises, a plus for the reader wishing to not only learn about, but also do something with, many-valued logics.
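As a minimal, concrete illustration of truth-functional many-valued reasoning, here is a sketch of Kleene's strong three-valued logic (one of the standard systems in this family; the code is illustrative and not taken from the book):

```python
from fractions import Fraction

U = Fraction(1, 2)  # the third truth value, "undetermined" (0 = false, 1 = true)

# In Kleene's strong three-valued logic the connectives are truth-functional:
# negation is 1 - v, conjunction is min, disjunction is max.
def neg(a):
    return 1 - a

def conj(a, b):
    return min(a, b)

def disj(a, b):
    return max(a, b)

def impl(a, b):          # material implication: not-a or b
    return max(1 - a, b)

# Unlike classical logic, the law of excluded middle can fail:
p = U
print(disj(p, neg(p)))   # 1/2 rather than 1
```

Because the connectives are ordinary min/max operations on truth values, such logics computerize directly, which is the book's point about SAT and automated deduction.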
The halting theorem counter-examples present infinitely nested simulation (non-halting) behavior to every simulating halt decider. Whenever the pure simulation of the input to a simulating halt decider H(x,y) would never stop running unless H aborted its simulation, H correctly aborts this simulation and returns 0 for not halting.
A new computational methodology allowing one to work in a new way with infinities and infinitesimals is presented in this paper. The new approach, among other things, gives the possibility to calculate the number of elements of certain infinite sets, and avoids indeterminate forms and various kinds of divergences. This methodology has been used by the author as a starting point in developing a new kind of computer – the Infinity Computer – able to execute computations and to store in its memory not only finite numbers but also infinite and infinitesimal ones.
This is an explanation of a key new insight into the halting problem provided in the language of software engineering. Technical computer science terms are explained using software engineering terms. To fully understand this paper a software engineer must be an expert in the C programming language, the x86 programming language, exactly how C translates into x86, and what an x86 process emulator is. No knowledge of the halting problem is required.
This dissertation is a contribution to formal and computational philosophy. In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights in the foundations of probability theory as well as in epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose application of the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well-suited to deal with a concept of beliefs based on probabilities ‘sufficiently close to unity’. The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
The goal of this paper consists of developing a new (more physical and numerical in comparison with the standard and non-standard analysis approaches) point of view on Calculus with functions assuming infinite and infinitesimal values. It uses recently introduced infinite and infinitesimal numbers that are in accordance with the principle ‘The part is less than the whole’ observed in the physical world around us. These numbers have a strong practical advantage with respect to traditional approaches: they are representable on a new kind of computer – the Infinity Computer – able to work numerically with all of them. An introduction to the theory of physical and mathematical continuity and differentiation (including subdifferentials) for functions assuming finite, infinite, and infinitesimal values over finite, infinite, and infinitesimal domains is developed in the paper. This theory allows one to work with derivatives that can assume not only finite but also infinite and infinitesimal values. It is emphasized that the newly introduced notion of physical continuity allows one to see the same mathematical object as continuous or discrete, depending on the wish of the researcher, i.e., as happens in the physical world, where the same object can be viewed as continuous or discrete depending on the instrument of observation used by the researcher. Connections between pure mathematical concepts and their computational realizations are continuously emphasized throughout the text. Numerous examples are given.
A Simulating Halt Decider (SHD) computes the mapping from its input to its own accept or reject state based on whether or not the input simulated by a UTM would reach its final state in a finite number of simulated steps. A halt decider (because it is a decider) must report on the behavior specified by its finite string input. This is its actual behavior when it is simulated by the UTM contained within its simulating halt decider while this SHD remains in UTM mode.
There exist many applications where it is necessary to approximate numerically the derivatives of a function that is given by a computer procedure. In particular, all fields of optimization have a special interest in this kind of information. In this paper, a new way to do this is presented for a new kind of computer - the Infinity Computer - able to work numerically with finite, infinite, and infinitesimal numbers. It is proved that the Infinity Computer is able to calculate values of derivatives of higher order for a wide class of functions represented by computer procedures. It is shown that the ability to compute derivatives of arbitrary order automatically and accurate to working precision is an intrinsic property of the Infinity Computer related to its way of functioning. Numerical examples illustrating the new concepts and numerical tools are given.
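The claim that derivatives can be read off by running a procedure on numbers with an infinitesimal part has a familiar finite-precision analogue: forward-mode automatic differentiation with dual numbers. The sketch below shows that analogue, not the Infinity Computer's actual arithmetic:

```python
class Dual:
    """Numbers of the form a + b*eps with eps**2 == 0.
    Evaluating f(Dual(x, 1)) yields f(x) and f'(x) exactly."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule encoded in the infinitesimal part
        return Dual(self.val * other.val,
                    self.val * other.der + self.der * other.val)
    __rmul__ = __mul__

def f(x):
    """An ordinary computer procedure: f(x) = 3x^2 + 2x + 1."""
    return 3 * x * x + 2 * x + 1

d = f(Dual(2.0, 1.0))
print(d.val, d.der)  # 17.0 14.0  (f(2) = 17, f'(2) = 6*2 + 2 = 14)
```

The derivative comes out exact to working precision because no finite difference quotient is ever formed, which parallels the intrinsic property claimed for the Infinity Computer.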
The paper considers a new type of objects – blinking fractals – that are not covered by traditional theories studying the dynamics of self-similarity processes. It is shown that the new approach allows one to give various quantitative characteristics of the newly introduced and traditional fractals using the infinite and infinitesimal numbers proposed recently. In this connection, the problem of the mathematical modelling of continuity is discussed in detail. A strong advantage of the introduced computational paradigm consists of its well-marked numerical character and its own instrument – the Infinity Computer – able to execute operations with infinite and infinitesimal numbers.
A recently developed computational methodology for executing numerical calculations with infinities and infinitesimals is described in this paper. The approach developed has a pronounced applied character and is based on the principle “The part is less than the whole” introduced by the ancient Greeks. This principle is applied to all numbers (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). The point of view on infinities and infinitesimals (and, in general, on Mathematics) presented in this paper makes strong use of physical ideas, emphasizing interrelations that hold between a mathematical object under observation and the tools used for this observation. It is shown how a new numeral system allowing one to express different infinite and infinitesimal quantities in a unique framework can be used for theoretical and computational purposes. Numerous examples dealing with infinite sets, divergent series, limits, and probability theory are given.
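As an illustration of the kind of identities the new numeral system makes expressible (a sketch; notation follows the cited work, writing a circled 1, "grossone", for the number of elements of the set of natural numbers):

```latex
\newcommand{\grossone}{\mbox{\textcircled{\scriptsize 1}}}
% The set of even natural numbers has \grossone/2 elements, and sums
% over all natural numbers become closed-form expressions in \grossone:
\[
  \sum_{i=1}^{\grossone} 1 = \grossone,
  \qquad
  \sum_{i=1}^{\grossone} i = \frac{\grossone\,(\grossone+1)}{2}.
\]
```

The point is that a sum traditionally classed as divergent acquires a definite value expressed by a finite string of symbols in the extended numeral system.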
The Koch snowflake is one of the first fractals to have been described mathematically. It is interesting because its perimeter is infinite in the limit while its limit area is finite. In this paper, a recently proposed computational methodology allowing one to execute numerical computations with infinities and infinitesimals is applied to study the Koch snowflake at infinity. Numerical computations with actual infinite and infinitesimal numbers can be executed on the Infinity Computer, a new supercomputer patented in the USA and the EU. It is revealed in the paper that at infinity the snowflake is not unique, i.e., different snowflakes can be distinguished for different infinite numbers of steps executed during the process of their generation. It is then shown that for any given infinite number n of steps it becomes possible to calculate the exact infinite number, Nn, of sides of the snowflake, the exact infinitesimal length, Ln, of each side, and the exact infinite perimeter, Pn, of the Koch snowflake as the result of multiplying the infinite Nn by the infinitesimal Ln. It is established that for different infinite n and k the infinite perimeters Pn and Pk are also different and the difference can be infinite. It is shown that the finite areas An and Ak of the snowflakes can also be calculated exactly (up to infinitesimals) for different infinite n and k, and the difference An − Ak turns out to be infinitesimal. Finally, snowflakes constructed starting from different initial conditions are also studied and their quantitative characteristics at infinity are computed.
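For a finite number n of generation steps, the quantities Nn, Ln, and Pn have simple closed forms; the paper's contribution is evaluating the same formulas at infinite n. A sketch for finite n:

```python
from fractions import Fraction

def koch(n, side=Fraction(1)):
    """Sides, side length, and perimeter of the Koch snowflake after n
    generation steps, starting from an equilateral triangle."""
    sides = 3 * 4 ** n          # N_n: each step turns every side into 4
    length = side / 3 ** n      # L_n: each step divides side lengths by 3
    perimeter = sides * length  # P_n = N_n * L_n = 3 * (4/3)**n * side
    return sides, length, perimeter

print(koch(0))  # the initial triangle: 3 sides of length 1
print(koch(3))  # N_3 = 192 sides, L_3 = 1/27, P_3 = 64/9
```

Since 4/3 > 1, Pn grows without bound as n does, which is the classical statement that the limit perimeter is infinite; the paper refines this by distinguishing different infinite values of n.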
A set theory model of reality, representation and language based on the relation of completeness and incompleteness is explored. The problem of the completeness of mathematics is linked to its counterpart in quantum mechanics. That model includes two Peano arithmetics or Turing machines independent of each other. The complex Hilbert space underlying quantum mechanics as the base of its mathematical formalism is interpreted as a generalization of Peano arithmetic: it is a doubled infinite set of doubled Peano arithmetics having a remarkable symmetry with respect to the axiom of choice. The quantity of information is interpreted as the number of elementary choices (bits). Quantum information is seen as the generalization of information to infinite sets or series. The equivalence of that model to a quantum computer is demonstrated. The condition for the Turing machines to be independent of each other is reduced to the state of Nash equilibrium between them. Two relative models of language are deduced: as a game in the sense of game theory, and as an ontology of metaphors (all mappings which are not one-to-one, i.e. not representations of reality in a formal sense).
The paper addresses Leon Henkin's proposition as a "lighthouse" which can elucidate a vast territory of knowledge uniformly: logic, set theory, information theory, and quantum mechanics. Two strategies toward infinity are equally relevant, for it is as universal and thus complete as it is open and thus incomplete. Henkin's, Gödel's, Robert Jeroslow's, and Hartley Rogers' propositions are reformulated so that both completeness and incompleteness are unified and thus reduced to a joint property of infinity and of all infinite sets. However, only Henkin's proposition, equivalent to an internal position toward infinity, is consistent. This can be traced back to set theory and its axioms, where that of choice is a key. Quantum mechanics is forced to introduce infinity implicitly via Hilbert space, on which its formalism is founded. One can demonstrate that some essential properties of quantum information, entanglement, and the quantum computer originate directly from infinity once it is involved in quantum mechanics. Thus, these phenomena can be elucidated as both complete and incomplete, whereupon choice is the border between them. A special kind of invariance with respect to the axiom of choice, shared by quantum mechanics, is discussed as involving that border between the completeness and incompleteness of infinity in a consistent way. The so-called paradox of Albert Einstein, Boris Podolsky, and Nathan Rosen is interpreted entirely in the same terms, only of set theory. The quantum computer can demonstrate especially clearly the privilege of the internal position, or "observer", or "user", toward infinity implied by Henkin's proposition as the only consistent one toward infinity. An essential area of contemporary knowledge may be synthesized from a single viewpoint.
The Turing machine is one of the simple abstract computational devices that can be used to investigate the limits of computability. In this paper, they are considered from several points of view that emphasize the importance and the relativity of mathematical languages used to describe the Turing machines. A deep investigation is performed on the interrelations between mechanical computations and their mathematical descriptions emerging when a human (the researcher) starts to describe a Turing machine (the object of the study) by different mathematical languages (the instruments of investigation). Together with traditional mathematical languages using such concepts as ‘enumerable sets’ and ‘continuum’, a new computational methodology allowing one to measure the number of elements of different infinite sets is used in this paper. It is shown how mathematical languages used to describe the machines limit our possibilities to observe them. In particular, notions of observable deterministic and non-deterministic Turing machines are introduced and conditions ensuring that the latter can be simulated by the former are established.
There exists a huge number of numerical methods that iteratively construct approximations to the solution y(x) of an ordinary differential equation (ODE) y′(x) = f(x,y) starting from an initial value y_0=y(x_0) and using a finite approximation step h that influences the accuracy of the obtained approximation. In this paper, a new framework for solving ODEs is presented for a new kind of computer – the Infinity Computer (it has been patented and its working prototype exists). The new computer is able to work numerically with finite, infinite, and infinitesimal numbers, giving the possibility to use different infinitesimals numerically and, in particular, to take advantage of infinitesimal values of h. To show the potential of the new framework, a number of results are established. It is proved that the Infinity Computer is able to calculate derivatives of the solution y(x) and to reconstruct its Taylor expansion to a desired order numerically, without finding the respective derivatives analytically (or symbolically) by successive differentiation of the ODE, as is usually done when the Taylor method is applied. Methods using approximations of derivatives obtained thanks to infinitesimals are discussed, and a technique for automatic control of rounding errors is introduced. Numerical examples are given.
In this paper, a number of traditional models related to percolation theory have been considered by means of a new computational methodology that does not use Cantor’s ideas and describes infinite and infinitesimal numbers in accordance with the principle ‘The part is less than the whole’. It gives the possibility to work with finite, infinite, and infinitesimal quantities numerically by using a new kind of computer - the Infinity Computer - introduced recently in [18]. The new approach does not contradict Cantor. In contrast, it can be viewed as an evolution of his deep ideas regarding the existence of different infinite numbers in a more applied way. Site percolation and gradient percolation have been studied by applying the new computational tools. It has been established that in an infinite system the phase transition point is not really a point, as it is in the traditional approach. In light of the new arithmetic it appears as a critical interval rather than a critical point. Depending on the “microscope” we use, this interval can be regarded as finite, infinite, or infinitesimally short. Using the new approach, we observed that in the vicinity of the percolation threshold there are many different infinite clusters instead of the single infinite cluster that appears in the traditional treatment.
New algorithms for the numerical solution of Ordinary Differential Equations (ODEs) with an initial condition are proposed. They are designed to work on a new kind of supercomputer – the Infinity Computer – that is able to deal numerically with finite, infinite, and infinitesimal numbers. Thanks to this, the Infinity Computer allows one to calculate the exact derivatives of functions using infinitesimal values of the stepsize. As a consequence, the new methods described in this paper are able to work with the exact values of the derivatives instead of their approximations.
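For contrast with the infinitesimal-stepsize methods proposed here, the classical dependence of accuracy on a finite step h is easy to exhibit with forward Euler (a sketch under standard assumptions; nothing below is specific to the Infinity Computer):

```python
import math

def euler(f, x0, y0, x_end, h):
    """Forward Euler: y_{k+1} = y_k + h * f(x_k, y_k)."""
    n = round((x_end - x0) / h)   # number of steps
    x, y = x0, y0
    for _ in range(n):
        y += h * f(x, y)
        x += h
    return y

# Test problem y' = y, y(0) = 1; the exact solution gives y(1) = e.
f = lambda x, y: y
err1 = abs(euler(f, 0.0, 1.0, 1.0, 0.01) - math.e)
err2 = abs(euler(f, 0.0, 1.0, 1.0, 0.005) - math.e)
print(err1, err2)  # halving h roughly halves the error (first-order method)
```

With a finite h the error never vanishes, only shrinks at the method's order; an infinitesimal h, as used in the paper, is what lets the derivatives be recovered exactly.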
This book tries to present math to the millions and does a pretty good job. It is simple and sometimes witty, but often the literary allusions intrude and the text bogs down in pages of relentless math--lovely if you like it and horrid if you don't. If you already know a lot of math you will still probably find the discussions of general math, geometry, projective geometry, and infinite series to be a nice refresher. If you don't know any and don't have a natural talent for it, you will find it very dense or impossible. Being somewhere in the middle, I skimmed through most of it and slowed down when it got interesting. If you have only a little time I would suggest the last chapter, 'The Abyss', about Georg Cantor and transfinite arithmetic. At points they wax philosophical and ask the perennial question: is math out there in the world or in here in our heads? Why not ask this about art or music or literature or computer programs or philosophy itself? In a very general way math must come from the same place that words and ideas and images come from---our brain evolved to make them and they must in many ways (every way?) reflect the structure of our brains, which resides in our DNA, which was shaped by natural selection, which was shaped by the geology of the earth and the structure of our universe, which comes from particle physics, which comes from the laws of nature, which are just there. I have written extensively on the nature of math and language and mind, and how they are all one, in my many other reviews, so please see them if these topics interest you. Those interested in all my writings in their most recent versions may consult my e-book Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2016 662p (2016). All of my papers and books have now been published in revised versions both in ebooks and in printed books.
Talking Monkeys: Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B071HVC7YP. The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle--Articles and Reviews 2006-2016 (2017) https://www.amazon.com/dp/B071P1RP1B. Suicidal Utopian Delusions in the 21st century: Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B0711R5LGX.
If the computational theory of mind is right, then minds are realized by machines. There is an ordered complexity hierarchy of machines. Some finite machines realize finitely complex minds; some Turing machines realize potentially infinitely complex minds. There are many logically possible machines whose powers exceed the Church–Turing limit (e.g. accelerating Turing machines). Some of these supermachines realize superminds. Superminds perform cognitive supertasks. Their thoughts are formed in infinitary languages. They perceive and manipulate the infinite detail of fractal objects. They have infinitely complex bodies. Transfinite games anchor their social relations.
The quantum computer is considered as a generalization of the Turing machine: the bits are substituted by qubits. In turn, a "qubit" is the generalization of "bit" referring to infinite sets or series. It extends the concept of calculation from finite processes and algorithms to infinite ones, impossible for any Turing machine (such as our computers). However, the concept of the quantum computer meets all the paradoxes of infinity, such as Gödel's incompleteness theorems (1931), etc. A philosophical reflection on how a quantum computer might implement the idea of "infinite calculation" is the main subject.
Analysis is given of the Omega Point cosmology, an extensively peer-reviewed proof (i.e., mathematical theorem) published in leading physics journals by professor of physics and mathematics Frank J. Tipler, which demonstrates that in order for the known laws of physics to be mutually consistent, the universe must diverge to infinite computational power as it collapses into a final cosmological singularity, termed the Omega Point. The theorem is an intrinsic component of the Feynman-DeWitt-Weinberg quantum gravity/Standard Model Theory of Everything (TOE) describing and unifying all the forces in physics, which is itself also required by the known physical laws. With infinite computational resources, the dead can be resurrected--never to die again--via perfect computer emulation of the multiverse from its start at the Big Bang. Miracles are also physically allowed via electroweak quantum tunneling controlled by the Omega Point cosmological singularity. The Omega Point is a different aspect of the Big Bang cosmological singularity--the first cause--and the Omega Point has all the haecceities claimed for God in the traditional religions. From this analysis, conclusions are drawn regarding the social, ethical, economic and political implications of the Omega Point cosmology.
This paper investigates the view that digital hypercomputing is a good reason for rejection or re-interpretation of the Church-Turing thesis. After suggesting that such re-interpretation is historically problematic and often involves attacking a straw man (the ‘maximality thesis’), it discusses proposals for digital hypercomputing with Zeno-machines, i.e. computing machines that compute an infinite number of computing steps in finite time, thus performing supertasks. It argues that effective computing with Zeno-machines falls into a dilemma: either they are specified such that they do not have output states, or they are specified such that they do have output states but involve contradiction. Repairs through non-effective methods or special rules for semi-decidable problems are sought, but not found. The paper concludes that hypercomputing supertasks are impossible in the actual world and are thus no reason for rejection of the Church-Turing thesis in its traditional interpretation.
A definition of the quantum computer is proposed: a countable set of Turing machines, on the basis of quantum parallelism, reversibility, and entanglement. A qubit is the set of all the i-th binary location cells transformed in parallel by unitary matrices. The Church thesis is suggested in a form relevant to the quantum computer. The notion of the non-finite (but not infinite) potency of a set is introduced.
This monograph provides an overview of the conceptual developments that lead from the traditional views of the infinite (and their paradoxes) to the contemporary view in which those old paradoxes are solved but new problems arise. A particular insight into the problem of continuity is also given, followed by applications in the theory of computability.
Let f(1)=2, f(2)=4, and let f(n+1)=f(n)! for every integer n≥2. Edmund Landau's conjecture states that the set P(n^2+1) of primes of the form n^2+1 is infinite. Landau's conjecture implies the following unproven statement Φ: card(P(n^2+1))<ω ⇒ P(n^2+1)⊆[2,f(7)]. Let B denote the system of equations: {x_i!=x_k: i,k∈{1,...,9}}∪{x_i⋅x_j=x_k: i,j,k∈{1,...,9}}. The system of equations {x_1!=x_1, x_1⋅x_1=x_2, x_2!=x_3, x_3!=x_4, x_4!=x_5, x_5!=x_6, x_6!=x_7, x_7!=x_8, x_8!=x_9} has exactly two solutions in positive integers x_1,...,x_9, namely (1,...,1) and (f(1),...,f(9)). No known system S⊆B with a finite number of solutions in positive integers x_1,...,x_9 has a solution (x_1,...,x_9)∈(N\{0})^9 satisfying max(x_1,...,x_9)>f(9). For every known system S⊆B, if the finiteness/infiniteness of the set {(x_1,...,x_9)∈(N\{0})^9: (x_1,...,x_9) solves S} is unknown, then the statement ∃x_1,...,x_9∈N\{0} ((x_1,...,x_9) solves S) ∧ (max(x_1,...,x_9)>f(9)) remains unproven. Let Λ denote the statement: if the system of equations {x_2!=x_3, x_3!=x_4, x_5!=x_6, x_8!=x_9, x_1⋅x_1=x_2, x_3⋅x_5=x_6, x_4⋅x_8=x_9, x_5⋅x_7=x_8} has at most finitely many solutions in positive integers x_1,...,x_9, then each such solution (x_1,...,x_9) satisfies x_1,...,x_9≤f(9). The statement Λ is equivalent to the statement Φ and heuristically justifies it. This justification does not yield the finiteness/infiniteness of P(n^2+1). We present a new heuristic argument for the infiniteness of P(n^2+1), which is not based on the statement Φ. Algorithms always terminate. We explain the distinction between existing algorithms (i.e. algorithms whose existence is provable in ZFC) and known algorithms (i.e. algorithms whose definition is constructive and currently known). Assuming that the infiniteness of a set X⊆N is false or unproven, we define which elements of X are classified as known.
No known set X⊆N satisfies Conditions (1)-(4) and is widely known in number theory or naturally defined, where this term has only informal meaning. *** (1) A known algorithm with no input returns an integer n satisfying card(X)<ω ⇒ X⊆(-∞,n]. (2) A known algorithm for every k∈N decides whether or not k∈X. (3) No known algorithm with no input returns the logical value of the statement card(X)=ω. (4) There are many elements of X and it is conjectured, though so far unproven, that X is infinite. (5) X is naturally defined. The infiniteness of X is false or unproven. X has the simplest definition among known sets Y⊆N with the same set of known elements. *** Conditions (2)-(5) hold for X=P(n^2+1). The statement Φ implies Condition (1) for X=P(n^2+1). The set X={n∈N: the interval [-1,n] contains more than 29.5+(11!/(3n+1))⋅sin(n) primes of the form k!+1} satisfies Conditions (1)-(5) except the requirement that X is naturally defined. 501893∈X. Condition (1) holds with n=501893. card(X∩[0,501893])=159827. X∩[501894,∞)={n∈N: the interval [-1,n] contains at least 30 primes of the form k!+1}. We present a table that shows satisfiable conjunctions of the form #(Condition 1) ∧ (Condition 2) ∧ #(Condition 3) ∧ (Condition 4) ∧ #(Condition 5), where # denotes the negation ¬ or the absence of any symbol. No set X⊆N will satisfy Conditions (1)-(4) forever, if for every algorithm with no input, at some future day, a computer will be able to execute this algorithm in 1 second or less. The physical limits of computation disprove this assumption.
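The concrete claims about the artificial set X are directly checkable. The sketch below (my own Python, taking the abstract's definition of X verbatim) enumerates the primes of the form k!+1 not exceeding 501893 — there are exactly three: 2, 3, and 7 — and then tests membership in X.

```python
import math

def is_prime(m: int) -> bool:
    """Trial division; fine for the small factorials that fit under the bound."""
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

def factorial_primes_up_to(bound: int) -> list[int]:
    """Primes of the form k!+1 with k!+1 <= bound (bound 501893 forces k <= 9)."""
    primes, f, k = [], 1, 1
    while True:
        f *= k                    # f = k!
        if f + 1 > bound:
            break
        if is_prime(f + 1):
            primes.append(f + 1)
        k += 1
    return primes

FP = factorial_primes_up_to(501893)

def in_X(n: int) -> bool:
    """n ∈ X iff [-1, n] contains more than 29.5 + (11!/(3n+1))*sin(n)
    primes of the form k!+1.  Valid for n <= 501893, where FP lists them all."""
    count = sum(1 for p in FP if p <= n)
    return count > 29.5 + (math.factorial(11) / (3 * n + 1)) * math.sin(n)

print(FP)            # [2, 3, 7]
print(in_X(501893))  # True, as stated in the abstract
```

Since only three such primes exist below the bound, membership in X for large n hinges entirely on the sin(n) term going sufficiently negative, which is why X thins out and Condition (1) can hold with n = 501893.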
I use modal logic and transfinite set-theory to define metaphysical foundations for a general theory of computation. A possible universe is a certain kind of situation; a situation is a set of facts. An algorithm is a certain kind of inductively defined property. A machine is a series of situations that instantiates an algorithm in a certain way. There are finite as well as transfinite algorithms and machines of any degree of complexity (e.g., Turing and super-Turing machines and more). There are physically and metaphysically possible machines. There is an iterative hierarchy of logically possible machines in the iterative hierarchy of sets. Some algorithms are such that machines that instantiate them are minds. So there is an iterative hierarchy of finitely and transfinitely complex minds.
The paper investigates how the mathematical languages used to describe and to observe automatic computations influence the accuracy of the obtained results. In particular, we focus our attention on single- and multi-tape Turing machines, which are described and observed through the lens of a new mathematical language strongly based on three methodological ideas borrowed from Physics and applied to Mathematics, namely: the distinction between the object of an observation (here, a mathematical object) and the instrument used for this observation; the interrelations holding between the object and the tool used for the observation; and the accuracy of the observation determined by the tool. Results of the observations executed in the traditional and the new language are compared and discussed.
A computational methodology called Grossone Infinity Computing, introduced with the intention of allowing one to work with infinities and infinitesimals numerically, has been applied recently to a number of problems in numerical mathematics (optimization, numerical differentiation, numerical algorithms for solving ODEs, etc.). The possibility of using a specially developed computational device called the Infinity Computer (patented in the USA and EU) for working with infinite and infinitesimal numbers numerically gives an additional advantage to this approach in comparison with traditional methodologies studying infinities and infinitesimals only symbolically. The grossone methodology uses Euclid’s Common Notion no. 5, ‘The whole is greater than the part’, and applies it to finite, infinite, and infinitesimal quantities and to finite and infinite sets and processes. It does not contradict Cantor’s and non-standard analysis views on infinity and can be considered as an applied development of their ideas. In this paper we consider infinite series, and particular attention is dedicated to divergent series with alternating signs. The Riemann series theorem states that conditionally convergent series can be rearranged in such a way that they either diverge or converge to an arbitrary real number. It is shown here that Riemann’s result is a consequence of the fact that the symbol ∞ used traditionally does not allow us to express quantitatively the number of addends in the series; in other words, it just shows that the number of summands is infinite and does not allow us to count them. The usage of the grossone methodology allows us to see that (as happens in the case where the number of addends is finite) rearrangements do not change the result for any sum with a fixed infinite number of summands.
Some traditional summation techniques, such as Ramanujan summation, assign negative results to divergent series containing infinitely many positive integers. It is shown that careful counting of the number of addends in infinite series allows us to avoid results of this kind if grossone-based numerals are used.
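As a rough illustration of what "counting the addends" buys, here is a toy model (mine, not Sergeyev's actual numeral system or the Infinity Computer): numbers are finite sums of terms c·①^p stored as a power-to-coefficient map, so the usual finite-sum identities apply verbatim to sums with a fixed infinite number of addends, e.g. Gauss's formula 1+2+...+① = (①²+①)/2.

```python
from collections import defaultdict
from fractions import Fraction

class Gross:
    """Toy grossone-style number: a finite sum of terms c * G**p, stored as a
    dict mapping power -> coefficient.  A sketch only; the real positional
    grossone numeral system is richer than this."""
    def __init__(self, terms=None):
        self.t = defaultdict(Fraction)
        for p, c in (terms or {}).items():
            if c:
                self.t[Fraction(p)] = Fraction(c)
    def __add__(self, o):
        r = defaultdict(Fraction, self.t)
        for p, c in o.t.items():
            r[p] += c
        return Gross({p: c for p, c in r.items() if c})
    def __mul__(self, o):
        r = defaultdict(Fraction)
        for p1, c1 in self.t.items():
            for p2, c2 in o.t.items():
                r[p1 + p2] += c1 * c2
        return Gross({p: c for p, c in r.items() if c})
    def __eq__(self, o):
        return dict(self.t) == dict(o.t)
    def __repr__(self):
        return " + ".join(f"{c}*G^{p}"
                          for p, c in sorted(self.t.items(), reverse=True)) or "0"

G = Gross({1: 1})                  # the grossone unit, G standing in for ①
one = Gross({0: 1})
half = Gross({0: Fraction(1, 2)})

# Gauss's formula with a fixed infinite number of addends:
# 1 + 2 + ... + G  =  G(G + 1)/2  =  G^2/2 + G/2
total = half * G * (G + one)
print(total)  # 1/2*G^2 + 1/2*G^1
```

Because the number of summands is a fixed numeral rather than the generic symbol ∞, a rearrangement of the same ① addends leaves this result unchanged, which is the point the abstract makes about Riemann-style rearrangements.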
It Was the 15th of December. James Bardis - 2016 - The IAFOR International Conference on Education - Dubai 2016, Official Conference Proceedings, pp. 87-93. ISSN 2189-1036.
A reflection on the merits of an a priori poeto-epistemology, in relation to tacitly held assumptions about the a fortiori validity of computational logic to transcend the limits of contradiction and infinite regress and establish a valid ontology.
This paper is concerned with learners who aim to learn patterns in infinite binary sequences: shown longer and longer initial segments of a binary sequence, they either attempt to predict whether the next bit will be a 0 or a 1, or they issue forecast probabilities for these events. Several variants of this problem are considered. In each case, a no-free-lunch result of the following form is established: the problem of learning is a formidably difficult one, in that no matter what method is pursued, failure is incomparably more common than success; and difficult choices must be faced in choosing a method of learning, since no approach dominates all others in its range of success. In the simplest case, the comparison of the set of situations in which a method fails and the set of situations in which it succeeds is a matter of cardinality (countable vs. uncountable); in other cases, it is a topological matter (meagre vs. co-meagre) or a hybrid computational-topological matter (effectively meagre vs. effectively co-meagre).
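The flavour of such results can be sketched with the classical diagonal construction (an illustration of mine, not the paper's measure-theoretic or topological arguments): for any deterministic next-bit predictor there is a sequence, built by flipping each of its guesses, on which the predictor is wrong at every single stage.

```python
from typing import Callable, List

Predictor = Callable[[List[int]], int]  # maps a finite prefix to a predicted next bit

def adversarial_sequence(predict: Predictor, length: int) -> List[int]:
    """Diagonalize: at each stage emit the opposite of the predictor's guess."""
    seq: List[int] = []
    for _ in range(length):
        seq.append(1 - predict(seq))
    return seq

def always_zero(prefix: List[int]) -> int:
    return 0

def majority_vote(prefix: List[int]) -> int:
    return 1 if sum(prefix) * 2 > len(prefix) else 0

for p in (always_zero, majority_vote):
    seq = adversarial_sequence(p, 20)
    # Because p is deterministic, replaying it on each prefix reproduces its
    # original guesses, and every one of them differs from the actual bit:
    errors = sum(p(seq[:i]) != seq[i] for i in range(len(seq)))
    print(p.__name__, errors)  # each predictor errs on all 20 bits
```

The cardinality version of the no-free-lunch claim then follows in spirit: each method succeeds on at most a meagre sliver of the uncountably many sequences it may face.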
Natural recursion in syntax is recursion by linguistic value, which is not syntactic in nature but semantic. Syntax-specific recursion is not recursion by name as the term is understood in theoretical computer science. Recursion by name is probably not natural because of its infinite typeability. Natural recursion, or recursion by value, is not species-specific. Human recursion is not syntax-specific. The values on which it operates are most likely domain-specific, including those for syntax. Syntax seems to require no more (and no less) than the resource management mechanisms of an embedded push-down automaton (EPDA). We can conceive of the EPDA as a common automata-theoretic substrate for syntax, collaborative planning, i-intentions, and we-intentions. They manifest the same kind of dependencies. Therefore, syntactic uniqueness arguments for human behavior can be better explained if we conceive of automata-constrained recursion as the most distinctive human capacity for cognitive processes.
This article would appeal to people interested in new ideas in sciences like physics, astronomy and mathematics that are not presented in a formal manner.

Biologists would also find the paragraphs about evolution interesting. I was afraid they'd think my ideas were a bit "out there". But I sent a short email about them last year to a London biologist who wrote an article for the journal Nature. She replied that it was "very interesting".

The world is fascinated by electronics. Computer scientists, as well as computer buyers, would be intrigued by the fundamental role given to human electronics in creation of the universe. This obviously can only be done if time travel to the past is possible. I explain in scientific terms how it could be done (the world is also fascinated by the prospect of time travel).

My ideas on trips through time grew from the related topic of interstellar, and even intergalactic, travel (and those ideas are inspired by an electrical-engineering experiment at Yale University in 2009). After the ideas on time travel came the realization that this technology could be used to totally eliminate the problems of muscle and bone weakness, radiation exposure etc. associated with a lengthy journey to Mars.

The exquisitely ordered cosmos proposed would have great appeal to religion and philosophy. Dealing as it does with time that does not exclusively operate in a straight line, the book could not only present a new view of evolution (present theory assumes time is always a straight line from past to future). Nonlinear time might also give religionists a new concept of who God is. This could possibly be that of humans from the remote future who are quantum entangled with all past, present and future states of the whole - infinite and eternal - universe; and thus have all God's powers. Such infinite power could be pantheistic but would naturally include the ability to manifest as an individual.

(I know this article is very far removed from what is traditionally considered scientific. Just remember: science is the search for knowledge of how this universe works, and that search must be pursued wherever it leads - even if it leads into traditionally nonscientific areas such as religion.)

Finally - if we're entangled with the whole universe, we'd have to be entangled with each other. On a mundane level, this gives us extrasensory and telekinetic abilities. On a higher level, it eliminates crime and war and domestic violence since people don't normally desire to harm themselves in any way.
A Monograph Dealing With Unification In Relation To Dark Energy, Dark Matter, Cosmic Expansion, E=mc2, Quantum Gravity, "Imaginary" Computers, Creation Of The Infinite And Eternal Universe Using Electronic BITS + PI + "Imaginary" Time, Earthly Education, Science-Religion Union, The Human Condition, Superconductivity, Planetary Fields, How Gravitation Can Boost Health, Space-Time Propulsion From The Emdrive To The Brouwer Fixed-Point Theorem, "Light Matter", Etc. These Effects Were Originally Discussed In Several Short Internet Articles. Table Of Contents Introduction Superconductivity And Planetary Magnetic / Electric Fields Co-Movement Of Photons And Graviton General Relativity Deletes Dark Energy, Dark Matter And Universal Expansion The Relation Of The Higgs Field To Gravity Spin Interactions And Making Bosons Or Fermions The Final Missing Steps In E=mc2 What Will Education Be Like In 2049? Learn By Holographic Teachers Using Quantum Mechanics, "Imaginary" Computers And A Unification Of Physics That Will Bring Education To Everyone, Everywhere Hypotheses Supporting Gravitation As A Push - (1) M-Sigma, The Non-Fundamental Nuclear Forces (2) Geysers On Saturn's Moon Enceladus (3) Gravity, Falling Bodies (4) Earth's Tides, Astronomical Unit, Cosmic Backgrounds A Proposal For The True Human Condition That Reconciles Science With Religion Back To The Moon And On To The Stars Normalising Patients With Gravitation.
In this paper, a metaphysics is proposed that includes everything that can be represented by a well-founded multiset. It is shown that this metaphysics, apart from being self-explanatory, is also benevolent. Paradoxically, it turns out that the probability that we were born in another life than our own is zero. More insights are gained by inducing properties from a metaphysics that is not self-explanatory. In particular, digital metaphysics is analyzed, which claims that only computable things exist. First of all, it is shown that digital metaphysics contradicts itself by leading to the conclusion that the shortest computer program that computes the world is infinitely long. This means that the Church-Turing conjecture must be false. Secondly, the applicability of Occam’s razor is explained by evolution: in an evolving physics it can appear at each moment as if the world is caused by only finitely many things. Thirdly and most importantly, this metaphysics is benevolent in the sense that it organizes itself to fulfill the deepest wishes of its observers. Fourthly, universal computers with an infinite memory capacity cannot be built in the world. And finally, all the properties of the world, both good and bad, can be explained by evolutionary conservation.
The problem of approximating a propositional calculus is to find many-valued logics which are sound for the calculus (i.e., all theorems of the calculus are tautologies) with as few tautologies as possible. This has potential applications for representing (computationally complex) logics used in AI by (computationally easy) many-valued logics. It is investigated how far this method can be carried using (1) one or (2) an infinite sequence of many-valued logics. It is shown that the optimal candidate matrices for (1) can be computed from the calculus.
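The soundness check underlying this approach is mechanical for a finite matrix: every axiom of the calculus must evaluate to a designated value under all assignments, and the rules must preserve designation. A small sketch (my own example, not the paper's construction) with the 3-valued Łukasiewicz matrix: it validates axiom K but not the classical self-distribution axiom S, so it is not sound for classical logic under the standard Hilbert-style presentation.

```python
from itertools import product
from fractions import Fraction

VALUES = [Fraction(0), Fraction(1, 2), Fraction(1)]   # 3-valued Łukasiewicz matrix
DESIGNATED = {Fraction(1)}

def imp(a, b):
    """Łukasiewicz implication: a -> b = min(1, 1 - a + b)."""
    return min(Fraction(1), 1 - a + b)

def is_tautology(formula, nvars):
    """formula maps a tuple of truth values to a truth value; check every assignment."""
    return all(formula(vs) in DESIGNATED for vs in product(VALUES, repeat=nvars))

# Axiom K: a -> (b -> a)
K = lambda v: imp(v[0], imp(v[1], v[0]))
# Axiom S: (a -> (b -> c)) -> ((a -> b) -> (a -> c))
S = lambda v: imp(imp(v[0], imp(v[1], v[2])), imp(imp(v[0], v[1]), imp(v[0], v[2])))

print(is_tautology(K, 2))  # True
print(is_tautology(S, 3))  # False: fails at a = b = 1/2, c = 0

# Modus ponens preserves designation in this matrix:
mp_ok = all(imp(a, b) not in DESIGNATED or a not in DESIGNATED or b in DESIGNATED
            for a in VALUES for b in VALUES)
print(mp_ok)  # True
```

Since the whole check is a finite enumeration, one can search the space of small matrices for ones that pass it while keeping the tautology count low, which is the computation the abstract's final claim refers to.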
In this survey, a recent computational methodology paying special attention to the separation of mathematical objects from the numeral systems involved in their representation is described. It has been introduced with the intention of allowing one to work with infinities and infinitesimals numerically in a unique computational framework in all the situations requiring these notions. The methodology does not contradict Cantor’s and non-standard analysis views and is based on Euclid’s Common Notion no. 5, “The whole is greater than the part”, applied to all quantities (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). The methodology uses a computational device called the Infinity Computer (patented in the USA and EU) working numerically (recall that traditional theories work with infinities and infinitesimals only symbolically) with infinite and infinitesimal numbers that can be written in a positional numeral system with an infinite radix. It is argued that numeral systems involved in computations limit our capabilities to compute and lead to ambiguities in theoretical assertions as well. The introduced methodology gives the possibility to use the same numeral system for measuring infinite sets, working with divergent series, probability, fractals, optimization problems, numerical differentiation, ODEs, etc. (recall that traditionally different numerals, e.g. the lemniscate ∞, aleph-zero, etc., are used in different situations related to infinity). Numerous numerical examples and theoretical illustrations are given. The accuracy of the achieved results is continuously compared with those obtained by traditional tools used to work with infinities and infinitesimals. In particular, it is shown that the new approach allows one to observe mathematical objects involved in the Continuum Hypothesis and the Riemann zeta function with a higher accuracy than is done by traditional tools.
It is stressed that the hardness of both problems is not related to their nature but is a consequence of the weakness of the traditional numeral systems used to study them. It is shown that the introduced methodology and numeral system change our perception of the mathematical objects studied in the two problems.
Several ways used to rank countries with respect to medals won during the Olympic Games are discussed. In particular, it is shown that the unofficial rank used by the Olympic Committee is the only rank that does not allow one to use a numerical counter for ranking – this rank uses the lexicographic ordering to rank countries: one gold medal is more precious than any number of silver medals and one silver medal is more precious than any number of bronze medals. How can we quantify what these words, more precious, mean? Can we introduce a counter that, for any possible number of medals, would allow us to compute a numerical rank of a country from the number of gold, silver, and bronze medals in such a way that the higher resulting number would put the country in the higher position in the rank? Here we show that it is impossible to solve this problem using a positional numeral system with any finite base. Then we demonstrate that this problem can be easily solved by applying numerical computations with recently developed actual infinite numbers. These computations can be done on a new kind of computer – the recently patented Infinity Computer. Its working software prototype is described briefly and examples of computations are given. It is shown that the new way of counting can be used in all situations where lexicographic ordering is required.
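The impossibility claim is easy to see in miniature (a sketch of mine, not the paper's proof): a positional counter with finite base B, rank = (g·B + s)·B + b, stops respecting "one gold beats any number of silvers" as soon as a medal count reaches B, whereas lexicographic comparison of the medal triples never fails; grossone plays the role of a base guaranteed to exceed every possible finite medal count.

```python
def positional_rank(medals, base):
    """Encode (gold, silver, bronze) as a single number in the given base."""
    g, s, b = medals
    return (g * base + s) * base + b

# One gold should beat any number of silvers, but a finite base fails
# as soon as the silver count reaches the base:
base = 1000
a = (1, 0, 0)      # one gold
b = (0, 1000, 0)   # a thousand silvers
print(positional_rank(a, base) > positional_rank(b, base))  # False: misranked

# Lexicographic comparison (what the official rank really is) always agrees:
print(a > b)  # True: Python tuples compare lexicographically

# With a base exceeding every medal count in the data - the role an infinite
# base plays in the paper - the positional counter and the ordering coincide:
big = 10**6
print(positional_rank(a, big) > positional_rank(b, big))  # True
```

Since no finite base can exceed *every possible* medal count in advance, the finite-base counter can always be defeated, which is exactly why an actual infinite base resolves the problem.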
In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis: * Hilbert's epsilon-calculus * Goedel's omega-consistency * The Law of the Excluded Middle * Hilbert's omega-Rule * An Algorithmic omega-Rule * Gentzen's Rule of Infinite Induction * Rosser's Rule C * Markov's Principle * The Church-Turing Thesis * Aristotle's particularisation * Wittgenstein's perspective of constructive mathematics * An evidence-based perspective of quantification. By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus; and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences, of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
In this article, some classical paradoxes of infinity such as Galileo’s paradox, Hilbert’s paradox of the Grand Hotel, Thomson’s lamp paradox, and the rectangle paradox of Torricelli are considered. In addition, three paradoxes regarding divergent series and a new paradox dealing with multiplication of elements of an infinite set are also described. It is shown that the surprising counting system of an Amazonian tribe, Pirahã, working with only three numerals (one, two, many) can help us to change our perception of these paradoxes. A recently introduced methodology allowing one to work with finite, infinite, and infinitesimal numbers in a unique computational framework not only theoretically but also numerically is briefly described. This methodology is actively used nowadays in numerous applications in pure and applied mathematics and computer science as well as in teaching. It is shown in the article that this methodology also allows one to consider the paradoxes listed above in a new constructive light.
When this article was first planned, writing was going to be exclusively about two things - the origin of life and human evolution. But it turned out to be out of the question for the author to restrict himself to these biological and anthropological topics. A proper understanding of them required answering questions like “What is the nature of the universe – the home of life – and how did it originate?”, “How can time travel be removed from fantasy and science fiction, to be made scientific and practical?”, and “How can the proposed young age of genus Homo be made to actually be reasonable – when simply stating it would be solid ground for instant rejection and dismissal?” The result is that the article also talks about subjects like Artificial Intelligence, General Relativity, and cosmology.

From where did life originate? God? Evolution? Panspermia? If the tendency of humans and scientists to regard undiscovered science as pseudoscience can be overcome, Einstein gave another alternative to consider when he introduced General Relativity. Time isn’t linear – progressing in a straight line from past to present to future. That assumption ignores Relativity which states that space AND TIME are curved. Where did life and the genetic code come from? Can the answer build AI?

The first question can be answered by the section of this article titled SETI, Evolution, and Time which says life (possibly multicellular and intelligent) and the genetic code came from humans acquiring knowledge of these things over the centuries, then applying that knowledge – via terraforming, accumulation of raw materials like amino acids and nucleic acids, genetic engineering - to a time in the past when life didn’t exist. From that origin, life evolved through innumerable mutations and adaptations, with humans once again acquiring knowledge of it in cyclic (nonlinear) time.
The second question is answered by saying artificial intelligence (AI) as the product of life is only half of the equation. The other half refers to Relativity’s curved space-time and violation of the notion that time always travels from past to future. We have always lived in an artificially intelligent, non-probabilistic universe where everything in time and space is connected into one thing by quantum entanglement – making the brain and genes products of binary-digit activity or artificial intelligence (life is not merely dependent on biology’s “lock and key” mechanisms but also possesses AI).

The earliest documented representative of the genus Homo is Homo habilis, which evolved around 2.8 million years ago. Scientists used to believe there was a straight line from H. habilis to us, Homo sapiens. This article will use the “advanced” waves loved by Physics Nobel laureate Richard Feynman, view the history of science through the lens of Conic Sections applied to Relativity’s curved space-time, and incorporate the necessity of so-called imaginary time* – popularized by Prof. Stephen Hawking. While the evolutionary proposals are more in agreement with this early straight line than with modern theories, Albert Einstein’s General Relativity is used to transform the straight line into a curved line, ultimately concluding that Homo habilis (H. habilis) originated only (and unbelievably, as far as today’s science and technology is concerned) ~250,000 years ago. Other branches and dead ends of Homo – e.g. Neanderthals – are the result of mutations and adaptations, with the resultant modifications to anatomy and physiology. The surprisingly young age of H. habilis allows nearly 200,000 years for habilis, or one of its descendants, to reach Australia … if this country’s indigenous Aboriginal population did, as claimed, reach this “island continent” 60,000 years ago.
* The ultraviolet catastrophe, also called the Rayleigh–Jeans catastrophe, is a failure of classical physics to predict observed phenomena: it can be shown that a blackbody - a hypothetical perfect absorber and radiator of energy - would release an infinite amount of energy, contradicting the principles of conservation of energy and indicating that a new model for the behaviour of blackbodies was needed. At the start of the 20th century, physicist Max Planck derived the correct solution by making some strange (for the time) assumptions. In particular, Planck assumed that electromagnetic radiation can only be emitted or absorbed in discrete packets, called quanta. Albert Einstein postulated that Planck's quanta were real physical particles (what we now call photons), not just a mathematical fiction. From there, Einstein developed his explanation of the photoelectric effect (when quanta or photons of light shine on certain metals, electrons are released and can form an electric current). So it appears entirely possible that another supposed mathematical trickery (imaginary time and the y-axis of Wick rotation) will find practical application in the future.

The article includes mathematical references to cosmology (spoiler alert – you’ll read about things like Vector-Tensor-Scalar Geometry, topology, the “eternal present”, Einstein’s Unified Field, the inverse-square law, and there being no Big Bang and no multiverse - but there will also be no equations).

The other subheadings in this essay are –

NONLINEAR TIME AND ELECTRICAL ENGINEERING (about a 2009 electrical engineering experiment at America’s Yale University, and cosmic wormholes)

BITS AND TOPOLOGY (base-2 maths aka Binary digiTS, Mobius strips, and figure-8 Klein bottles)

WICK ROTATION, CAUSALITY, AND UNITING TIME (do past, present and future co-exist in an “eternal present”?)
DIGITAL BRAIN, DIGITAL UNIVERSE (if the brain and the universe are ultimately composed of binary digits, we'll someday be able to do the same things with the brain and universe that we now do with computers)

PROPOSAL: HUMAN AND ANIMAL INSTINCTS ARE THE RESULT OF THE UNIVERSE BEING UNIFIED BY BINARY DIGITS (AND TOPOLOGY) (If everything in the universe is ultimately composed of electronic BITS, then the universe must possess Artificial Intelligence - some prefer the term Cosmic Consciousness)

INFORMATION THEORY CONQUERS A RED GIANT (preserving Earth by keeping the Sun near today’s level of activity forever).
Abstract: At present, much attention is paid to the use of solar energy. Unlike traditional energy sources, solar energy is ecologically clean. The scarcity and rising cost of fuel make the harnessing of the practically inexhaustible resources of solar energy one of the main problems of science and engineering. Further research and experiments on the use of solar energy, as well as the use of solar power plants in a number of countries, show that solar energy can be widely used today on the basis of modern technical capacities. In order to determine the need to use a charger, it is necessary to know a number of parameters: short-circuit current, open-circuit (no-load) voltage, efficiency, etc. In this work, these parameters are estimated from the results of studies of the current-voltage characteristic of a solar panel obtained in automatic mode using a computer and the digital measuring device "EPH 2 Advanced photovoltaics trainer".
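The listed parameters can be read off a measured current-voltage characteristic numerically. Below is a minimal sketch with hypothetical (V, I) samples - not data from the EPH 2 trainer - showing how short-circuit current, open-circuit voltage, the maximum power point, and efficiency are estimated.

```python
def panel_parameters(points, area_m2, irradiance_w_m2):
    """Estimate I_sc, V_oc, maximum power and efficiency from (V, I) samples.
    points must be sorted by rising voltage and start at V = 0."""
    # Short-circuit current: the current measured at V = 0.
    i_sc = points[0][1]
    # Open-circuit voltage: interpolate where the current crosses zero.
    v_oc = None
    for (v1, i1), (v2, i2) in zip(points, points[1:]):
        if i1 > 0 >= i2:
            v_oc = v1 + (v2 - v1) * i1 / (i1 - i2)
            break
    # Maximum power point over the sampled curve.
    p_max = max(v * i for v, i in points)
    # Efficiency: output power over incident solar power on the panel area.
    efficiency = p_max / (area_m2 * irradiance_w_m2)
    return i_sc, v_oc, p_max, efficiency

# Hypothetical I-V samples for a small panel (V in volts, I in amperes):
iv = [(0.0, 3.0), (5.0, 2.95), (10.0, 2.8), (15.0, 2.4), (17.0, 1.8),
      (19.0, 0.8), (20.0, 0.0)]
i_sc, v_oc, p_max, eff = panel_parameters(iv, area_m2=0.5, irradiance_w_m2=1000.0)
print(i_sc, v_oc, round(p_max, 1), round(eff, 3))  # 3.0 20.0 36.0 0.072
```

With a dense automatically acquired curve, the same three computations (value at V = 0, zero-crossing of I, maximum of V·I) give the figures the abstract names; the panel area and irradiance here are made-up inputs for the efficiency estimate.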
The article aims to substantiate the philosophy of synthesis, which is built on the basis of analysis, but gives it a constructive direction. The turning point from analysis to synthesis is the problematization of the elements identified in the analysis, their criticism, replacement, or rearrangement, leading to the construction of alternative concepts and propositions that expand the field of the thinkable and innovate the categorical apparatus of philosophy. This article provides examples of philosophical synthesis at different levels: alternative terms and concepts (“infinition”), postulates (the “diamond rule” in ethics), and disciplines (“horrology” as the study of the self-destructive mechanisms of civilization). Next, we consider the transition from the philosophy of synthesis to the synthesis of philosophy itself with contemporary scientific and technical practices. Technology of the 21st century is no longer instrumental/utilitarian, but a fundamental technology ("onto–technology"), which, thanks to science’s penetration into the micro- and macrocosm, can change the foundational parameters of being, thereby acquiring a philosophical dimension. Accordingly, philosophy as a study of the most general principles of the universe becomes a practical requirement in any “world-forming,” synthesizing acts of technology, including the design of computer games and multi-populated virtual worlds (e.g., “Second Life”), that involve a new ontology, logic, ethics, and axiology. The vocation of philosophy in the 21st century is not just to comprehend our unique world, but to lay the foundations for new world-forming practices, to initiate and design the ontology of possible worlds, and to pave the way for alternative forms of synthetic life and artificial intelligence. Contrary to Hegel, philosophy is no longer the “owl of Minerva” taking flight at dusk, but a skylark proclaiming the dawn of a new creative day.