Statistical mechanics is a strange theory. Its aims are debated, its methods are contested, its main claims have never been fully proven, and their very truth is challenged, yet at the same time, it enjoys huge empirical success and gives us the feeling that we understand important phenomena. What is this weird theory, exactly? Statistical mechanics is the name of the ongoing attempt to apply mechanics, together with some auxiliary hypotheses, to explain and predict certain phenomena, above all those described (...) 

This is the introduction to the Routledge Companion to Thought Experiments. 

It is usually claimed that in order to assess a thought experiment we should assess the nomological possibility, or realizability in principle, of its scenario. This is undoubtedly true for many TEs, such as Bohr’s reply to Einstein’s photon box. Nevertheless, in some cases, such as Maxwell’s demon, this requirement should be relaxed. Many accounts of TEs fail in this regard. In particular, experimental and some mental model accounts are too strict, since they always require realizability in principle. This paper (...) 

This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and (...) 

Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, and the (...)

I claim that one way thought experiments contribute to scientific progress is by increasing scientific understanding. Understanding does not have a currently accepted characterization in the philosophical literature, but I argue that we already have ways to test for it. For instance, current pedagogical practice often requires that students demonstrate being in either or both of the following two states: 1) Having grasped the meaning of some relevant theory, concept, law or model, 2) Being able to apply that theory, concept, (...) 

The principle of 'information causality' can be used to derive an upper bound, known as the 'Tsirelson bound', on the strength of quantum mechanical correlations, and has been conjectured to be a foundational principle of nature. In this paper, however, I argue that the principle has not to date been sufficiently motivated to play this role; the motivations that have so far been given are either unsatisfactorily vague or else amount to little more than an appeal to intuition. I then consider how (...)

Must a Maxwell demon fail to reverse the second law of thermodynamics? Standard attempts to show it must fail make use of notions of information and computation. None of these attempts have succeeded. Worse, they have distracted both supporters and opponents of these attempts from a much simpler demonstration of the necessary failure of a Maxwell demon, one that employs no notions of information or computation. It requires only Liouville's theorem and its quantum analog.

This paper addresses the question of how we should regard the probability distributions introduced into statistical mechanics. It will be argued that it is problematic to take them either as purely ontic, or purely epistemic. I will propose a third alternative: they are almost objective probabilities, or epistemic chances. The definition of such probabilities involves an interweaving of epistemic and physical considerations, and thus they cannot be classified as either purely epistemic or purely ontic. This conception, it will be argued, (...) 

Combining physics, mathematics and computer science, quantum computing and its sister discipline of quantum information have developed in the past few decades from visionary ideas into two of the most fascinating areas of quantum theory. General interest and excitement in quantum computing was initially triggered by Peter Shor (1994), who showed how a quantum algorithm could exponentially "speed up" classical computation and factor large numbers into primes far more efficiently than any (known) classical algorithm. Shor's algorithm was soon followed by several (...)