Charles L. Dodgson (Lewis Carroll) first published his visual method in The Game of Logic (1886), extending it ten years later in Symbolic Logic, Part I. Originally designed to teach the theory of inference in Aristotelian logic and to improve on the earlier diagrammatic methods of Leonhard Euler (1772) and John Venn (1880), Carroll's method has not been considered seriously as a visual logic system.
In the two parts of Symbolic Logic (the second of which was first published only in 1977), Dodgson developed a formal logic in which he set down valid rules for making inferences.
In this paper, I will describe the methods he invented to mechanize reasoning in his formal logic, and demonstrate the superiority of his diagrammatic method over Euler's and Venn's methods.
This talk is a progress report on an effort to locate, transcribe, and annotate all of the surviving correspondence written by John Playfair (1748-1819), Professor of Mathematics and then of Natural Philosophy at Edinburgh University. The letters reflect Playfair's wide-ranging interests in philosophy, geology, anthropology, chemistry, architecture, literature, and drama. They record his connections with prominent friends: John Leslie, Dugald Stewart, William Robertson, John Rennie, Mary and Agnes Berry, Lord and Lady Minto, Archibald Constable, and so on. They share opinions on how the British government handled the American Revolution and the Napoleonic Wars. Woven throughout the correspondence are references to Playfair's lifelong concern for mathematics and its history, including his analysis of observations made at Schehallien, papers prepared for the Transactions of the Royal Society of Edinburgh, and his supplement to the Encyclopaedia Britannica.
Following the declaration of war in 1914, scientific communications were interrupted. This occurred formally, so that (for example) German journals were no longer sent to libraries or individuals in enemy states. There was also an informal side, with many individuals deciding to cease correspondence with colleagues on the other side. This talk will give some examples of the international consequences, both during and after the war, and will discuss the activities of a few people who chose to resist this path.
In 1708 Pierre Rémond de Montmort published his book Essay d'analyse sur les jeux de hazard, an analysis of games of chance of the time using probability theory. Three years later Abraham de Moivre published his treatise De Mensura Sortis solving various problems in games of chance, again using probability theory. Montmort felt that De Moivre had plagiarized his work. In 1718 De Moivre published an expanded version of his original Latin treatise under the name The Doctrine of Chances. The dispute is described from surviving publications and letters. Both the Essay d'analyse and The Doctrine of Chances contain engravings that describe in pictures the nature and importance of their work. These pictures are analyzed in the context of the dispute.
In 1749, King Frederick the Great sought Euler's mathematical counsel concerning the establishment of a state lottery. The combinatorial issues involved in the analysis of this game of chance, known as the Genoese Lottery, piqued Euler's curiosity. As a consequence, he wrote four memoirs over the course of his career examining questions arising from this lottery. We will survey these works, paying particular attention to the use of the partition function in the second one.
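To give a flavor of the combinatorics (a minimal sketch of my own, not taken from Euler's memoirs; the five-numbers-from-ninety format is the standard Genoese form, and the function name is hypothetical), the chance that all of a player's chosen numbers appear among the five drawn is a simple ratio of binomial coefficients:

```python
from math import comb

def hit_probability(picked: int, drawn: int = 5, total: int = 90) -> float:
    """Chance that every one of the player's `picked` numbers appears among
    the `drawn` numbers taken at random from 1..`total` (Genoese format)."""
    return comb(drawn, picked) / comb(total, picked)

for k in range(1, 6):
    print(f"match all {k} chosen numbers: {hit_probability(k):.3e}")
```

Matching a single chosen number has probability 5/90 = 1/18, while matching all five has probability 1/C(90, 5), roughly 1 in 44 million; Euler's memoirs push well beyond such elementary counts.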
We study the modern Iranian (Persian) calendar after first considering older Iranian calendars. In examining the modern Iranian (solar) calendar, we also discuss the arithmetical formulas needed to convert dates between it and the Gregorian calendar.
There will also be a short investigation of the Islamic (Muslim lunar) calendar vis-à-vis the modern Iranian calendar. A few centuries after the modern Iranian calendar was introduced, Omar Khayyám (1048-1131) studied the accuracy of its leap-year rule and made important contributions. Finally, we examine the political situation of the 20th century, which switched the official calendar back and forth before it settled again on the modern Iranian calendar.
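To give a taste of the arithmetic involved, here is a minimal sketch, with a loudly flagged assumption: it uses the widely quoted 33-year-cycle approximation often associated with Khayyám's reform, whereas the official modern Iranian calendar fixes leap years by astronomical observation of the vernal equinox.

```python
# Assumption: the 33-year-cycle approximation (not the official
# astronomical rule) -- leap years fall at these remainders mod 33.
LEAP_REMAINDERS = {1, 5, 9, 13, 17, 22, 26, 30}

def is_leap_solar_hijri(year: int) -> bool:
    """Approximate leap-year test for a Solar Hijri (modern Iranian) year."""
    return year % 33 in LEAP_REMAINDERS

# The Gregorian year is obtained by adding 621 (622 before the March
# new year); e.g., Solar Hijri 1403 began in March 2024.
print(is_leap_solar_hijri(1403))  # True under this approximation
```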
In 1748, after ten years of hard work, Maria Gaetana Agnesi (1718-1799) published the first Calculus book designed for teaching and written in Italian: Instituzioni Analitiche ad uso della Gioventù Italiana (Analytic Institutions for the Use of the Italian Youth). In the introduction to her work, Agnesi wrote: "... when considering the Integral Calculus, the Reader will find a completely new method for Polynomials, which has not appeared anywhere else; it belongs to the famous and never sufficiently praised Count Jacopo Riccati, a Nobleman very proficient in all sciences, and well known in the literary world. He wanted to do me the favor of letting me know about it [the method], a favor that I did not deserve, and I want to give him, and the Public, the appropriate justice, as it should properly be done." What was this new method, presented in the sixty-fourth article of the book? Was it really about polynomials? Is it as useful as Agnesi seemed to think?
Is there any sense in which it is both interesting and true that there is a plurality of logics? There is, of course, a multiplicity of systems traveling under the name 'logic': various modal, deontic, combinatorial, constructive, paraconsistent, relevant, higher-order, free, and other 'logics', not to mention impoverished ancestors like the Aristotelean syllogistic, that differ from the standard first-order predicate logic favored by mathematicians and philosophers. But for all that, there might be a plurality of logics in only a trivial or uninteresting sense.
In this paper the prospects for logical pluralism are investigated. In particular, currently popular defenses of pluralism, such as the one due to J. C. Beall and Greg Restall, are examined and found to yield just such an uninteresting logical plurality. An alternative version of pluralism is sketched, beginning with the observation that a variety of traditional accounts of what distinguishes logical from non-logical principles, usually regarded as equivalent, actually draw the logic-non-logic line in different places.
Robert K. Merton's 'On the Shoulders of Giants' has had a good deal to offer the community of historians of mathematics, as well as historians of science in general. His posthumous book on serendipity has come in for rather harsher treatment by both groups. This talk is designed to point out some of the theoretical features of the book which apply to mathematics, and to illustrate how this fits into Merton's general view of the rationality of the scientific and mathematical enterprises.
That the invention of geometric cosmological models based on general relativity occurred at the same time that Vesto Slipher and Milton Humason were documenting systematic large nebular red-shifts seems to have been a coincidence. Edwin Hubble in 1936 explicitly associated the expanding-universe interpretation of his red-shift law with relativistic cosmology; for Hubble, universal expansion was a theoretical notion rooted in relativity theory. The paper explores various historical questions concerning the relationship between theory and observation in cosmology around 1930.
By the middle of the 19th century, it was recognized that Euler's gamma function had some special properties. One of them is convexity. A curve is convex if the following holds: take two points on the curve and join them by a straight line; then the portion of the curve between the points lies below the line. A convex function cannot look like a camel's back! Convexity is a fundamental geometric property of a function. In this work, we present some concepts developed by Jensen in 1906, in Sur les fonctions convexes et les inégalités entre les valeurs moyennes, and, principally, Minkowski's work. We discuss some important applications of convexity in the calculus of variations, linear programming, and non-linear programming.
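In modern notation (a sketch on my part; Jensen's own notation differs), the midpoint condition Jensen takes as the definition of convexity, and the general inequality between mean values it yields for continuous f, read:

\[
f\!\left(\frac{x+y}{2}\right) \le \frac{f(x)+f(y)}{2},
\qquad
f\!\left(\sum_{i=1}^{n}\lambda_i x_i\right) \le \sum_{i=1}^{n}\lambda_i f(x_i)
\quad \left(\lambda_i \ge 0,\ \sum_{i=1}^{n}\lambda_i = 1\right).
\]

For the gamma function the relevant property is in fact logarithmic convexity: log Γ(x) is convex on (0, ∞), a feature that Bohr and Mollerup (1922) showed characterizes Γ among solutions of f(x+1) = x f(x) with f(1) = 1.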
The application of mathematics to the physical world was rather more problematic for the Greeks than it is for us. I shall here be concerned not with the achievements of the Greeks in this line, which are generally well known, but with the underlying conditions, the philosophical and cultural assumptions which tended to encourage or to inhibit the endeavour.
Relevant issues include the compartmentalization of knowledge, debate over the desirability and possibility of abstraction, and the widespread doubt that we can have true knowledge of the changeable.
Just over a hundred years ago, a Greek stone inscription was discovered on the island of Rhodes, at a site near Lindos called Keskinto. The inscription, of which only the last fifteen lines are partially preserved, dates from about 100 B.C. and contains a set of periodicities for the motions of the planets. The relations between some of the numbers were explained by Paul Tannery shortly after the text of the inscription was published in 1895, but little progress has been made since in making sense of the astronomical and mathematical principles underlying the inscription. The present talk will discuss the problems and present some new tentative solutions.
It is commonly accepted that the statistics of characteristics in large populations are described by a normal distribution. Thus, for instance, Herrnstein and Murray in their widely disseminated book The Bell Curve describe the Normal Distribution as:
"A common way in which natural phenomena arrange themselves approximately."
In fact their model for the distribution of IQs is mathematically way off the mark in satisfying the criteria for a normal distribution.
A more precise, yet still off-the-mark, definition appears in Jim Holt's article in the January 4, 2005, issue of The New Yorker:
"As a matter of mathematics the Bell Curve is guaranteed to arise whenever some variable is determined by lots of little causes (like human height, health, diet) operating more or less independently."
In writing his classic 1925 Calcul des Probabilités, the great French mathematician Paul Lévy intended to develop systematically and to use the method of characteristic functions in order to simplify proofs about limit laws, even though the eminent mathematicians Borel and Deltheil felt it unnecessary to make the notion of probability more mathematically precise, preferring to rely on common-sense reasoning.
A sufficient condition for the sums of a large number of "individually small", independent random variables to approach the normal distribution, i.e., for the Central Limit Theorem to hold, was first given by Liapounoff in 1901, and a more general one by Lindeberg in 1922. Paul Lévy used the simpler method of characteristic functions to derive Lindeberg's condition. In so doing he put his finger on the essential necessary condition, and its meaning, for the Central Limit Theorem to hold. This condition states that the components of the sum must be not only "individually negligible" (small) with respect to their total sum, but also "uniformly negligible" (we might use the term "collectively negligible"); that is, the probability that even the largest component random variable is of the order of magnitude of the sum must be negligible.
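In modern notation (a sketch; the abstract itself states the condition only in words), for independent, centered random variables X_1, ..., X_n with s_n^2 = Var(X_1) + ... + Var(X_n), Lindeberg's condition reads:

\[
\frac{1}{s_n^{2}} \sum_{k=1}^{n} \mathbb{E}\!\left[ X_k^{2}\, \mathbf{1}\{|X_k| > \varepsilon s_n\} \right] \longrightarrow 0
\qquad \text{for every } \varepsilon > 0,
\]

and it guarantees that the normalized sum S_n/s_n converges in distribution to the standard normal law; Feller later showed that, under the uniform negligibility described above, the condition is also necessary.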
Lévy showed that when this condition is not satisfied, there exist other families of limiting distributions for sums of independent random variables, among them the so-called Stable Laws of index α, where 0 < α < 2. Here the approach to the limiting distribution is determined by the contribution of the few largest components in the sum. Consequently the probability of values of the sum which deviate from the mean (or the median, in case α < 1) by a large amount is considerable. The "tail" of the probability distribution of the largest component, in the case where the sum converges to a stable distribution of index α, as well as the "tail" of the limiting stable distribution, decreases as x^(-α). The limiting distribution of the sum reflects that of its largest component terms. This dichotomy defines the "domains of attraction" of the normal versus those of the other stable distributions.
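To illustrate this dichotomy concretely (a sketch of my own devising, not from the talk: the choice of uniform and Pareto summands and all sample sizes are assumptions), one can compare the share of a sum contributed by its single largest term:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10_000, 200

# Light tails (uniform summands): normal domain of attraction.
light = rng.uniform(size=(trials, n))
# Heavy tails (classical Pareto, alpha = 1.5): stable domain of attraction.
heavy = rng.pareto(1.5, size=(trials, n)) + 1.0

for name, x in (("uniform", light), ("Pareto alpha=1.5", heavy)):
    share = (x.max(axis=1) / x.sum(axis=1)).mean()
    print(f"{name}: mean share of largest term = {share:.4f}")
```

For the uniform summands the largest term is a vanishing fraction of the sum, which is Lévy's uniform negligibility; for the Pareto summands it remains a sizable fraction no matter how large n is taken, so the sum's behavior is dominated by its few largest components.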
The statistics of social phenomena in which stable distributions prevail, such as wealth, power, batting averages, intellectual accomplishments, physical beauty, etc., are hardly ever discussed in the popular culture, where they most emphatically deserve more attention.
It is well known that Lewis Carroll, the famous author of the Alice books, was a mathematician. His works include manuals of Euclidean geometry, a treatise on determinants, popular textbooks on logic, and collections of problems and puzzles. The majority of these works were signed with his real name: Charles L. Dodgson. The logical works are an exception: Carroll signed his two textbooks The Game of Logic (1886) and Symbolic Logic (1896), and his two articles in Mind, "A Logical Paradox" (1894) and "What the Tortoise Said to Achilles" (1895), with his "literary" pseudonym.
This state of affairs led to a number of prejudices and misunderstandings which influenced the reception of the work. People thought that Carroll's work was intended for children, that he considered logic a game, and that his logical work completed and concluded his literary work. Even when his writings revealed accurate ideas and discoveries, commentators claimed that Carroll was not fully "conscious" of their depth.
In this paper, I will try to contextualise Lewis Carroll's logical work according to three themes: first historically, in relation to the development of the new logic in the nineteenth century; then geographically, by focusing on the British academic background; and finally personally, by situating Carroll's logical works in relation to the rest of his output. From this, we can correct certain received ideas which hinder the understanding of the work. We will also be able to suggest new ways to explain the use Carroll made of his pseudonym, his growing interest in logic, and, finally, the status he gave it.
One of the teaching fathers at the Séminaire de Québec observed the solar eclipse of October 27, 1780. An unsigned manuscript attributed to Thomas-Laurent Bédard, superior of the Séminaire, shows that he followed the methods of Lalande's Astronomie, using the copy from the seminary's library, still on site in the archive. From this detailed observation he calculated the longitude of Quebec relative to Paris, not Greenwich, England, despite the British Conquest.
Euler's four-volume, 2500-page calculus text is often described as the origin or foundation of the modern calculus curriculum. This idea should not be accepted without some reflection. For example, few modern mathematics curricula include properties of elliptic integrals, as Euler's Integral Calculus does. We describe the content and intent of Euler's calculus course, and make some comparisons with the modern curriculum.
H. B. Curry is known for a philosophy of mathematics which he called "formalism". However, most people who know anything of his philosophy identify it with an early version dating to 1939. In this early version, Curry proposed that mathematics be defined as the "science of formal systems", where he had in mind a definition of formal system somewhat different from the usual notion. Among the criticisms of this proposed definition is the suggestion that under it there could have been no mathematics before there were formal systems, which first appeared a little more than a century ago.
In this paper, I will quote from Curry's later work to show that this criticism does not apply to his mature philosophy, and that his mature version of formalism is a form of structuralism.
Curricula and textbooks used in colleges, academies, and private schools in the newly formed United States between 1776 and 1826 show a gradual evolution from a vague exposure to whatever mathematical works, primarily of British origin, were in the possession of tutors, teachers, and faculty, to a more structured plan in which students progressed through arithmetic, algebra, geometry, and trigonometry, culminating in an application of these areas of mathematics to navigation and surveying, supported by texts specifically written for that purpose.
While these navigational topics can be found in documents dating from the earliest years of the 18th century, reflecting the study of navigation with private tutors and almanac makers, by the second decade of the 19th century the teaching of navigation had divided into at least two paths: one for professional mariners, typified by the work of Nathaniel Bowditch, and a second as a capstone liberal arts experience (intended to teach scientific and mathematical principles) at colleges and universities, exemplified by texts written by Jeremiah Day, Professor of Mathematics and eventually President at Yale University.
This talk will present the details of these curricula and the ways in which navigation was used to illuminate the principles of Geometry and Trigonometry for the students of early America.
Arthur Buchheim was a short-lived mathematician of great promise. He attended the City of London School when Edwin A. Abbott (author of Flatland: A Romance of Many Dimensions) was headmaster. Buchheim received his undergraduate degree from Oxford. He left England for a time to study under Felix Klein in Leipzig. Upon his return, he accepted a position as mathematical master at the Manchester Grammar School. In the short span of seven years, and in deteriorating health, he published twenty-four papers on a wide variety of mathematical topics. Sylvester claimed that "had his life been spared, I think we may safely say of him what Newton said of Cotes, that if he had lived, we should have known something." We focus our attention on Buchheim's work and accomplishments.
Mathematics can be regarded both as a science and as an art, and it benefits from the tension between those two motivations for practising it; yet philosophies of mathematics often do not take either of these frequently defended views very seriously. While I am unable to offer a philosophical elaboration of the art view (though I do not wish to disparage it), I present how I see my science view of mathematics as connected to similar views of other sciences.
Cohen's method of forcing allows one to show that many concrete mathematical statements are undecidable from the axioms of ZFC (Zermelo-Fraenkel set theory with the Axiom of Choice). A natural question arises: Can one repair the weaknesses in ZFC exposed by forcing, and if so, to what extent? The investigation of this question involves the study of higher axioms of infinity (the so-called "large cardinal axioms"), their canonical models, and the determinacy of infinite games. Recently, Woodin has developed Ω-logic, a strong extension of first-order logic that is the natural logic given by the method of forcing. He has also developed a transfinite proof system for Ω-logic, and isolated the Ω Conjecture, which is essentially completeness for this logic and its corresponding proof system. A proof of the Ω Conjecture would quantify the limits of forcing, provide a possible solution to the Continuum Hypothesis, show that those large cardinal axioms which admit an inner model theory of the kind that we know today are "cofinal" amongst all large cardinal axioms, and challenge the popular conception that there is no need for additional axioms of set theory.