{"title":"《权衡胜算》书评,D. Williams著","authors":"W. J. Braun, Hao Yu","doi":"10.1145/565145.565151","DOIUrl":null,"url":null,"abstract":"Weighing the Odds, by David Williams, is published by Cambridge University Press (publishing date 2001, ISBN 0-521-00618-X). 556 pp. £ 24.95 The author's motivation in writing this book might be inferred from a statement made at the end of the eighth chapter: 'that I have not been more actively involved in Statistics throughout my career and my wish to persuade others not to follow my example in that regard'. Weighing the Odds is a book on Probabili ty and Statistics, written from the perspective of a probabilist. Its intended audience is mathematics students who have not yet been exposed to these subjects. The author's objective is to entice these students by introducing the more mathematical elements of the subjects. The book is highly idiosyncratic, and the writer's personal views are never far from the surface. Thus, the book is perhaps the most lively account of these two subjects that we are aware of. The book contains a relatively small, but interesting treatment of traditional problems and some newer-looking problems. Some excellent hints are provided for some of the more challenging problems. A lot of statistics (all?) is based on conditioning. We cannot write down a model or compute a probability without conditioning on something. At the same time, conditioning is often one of the beginning student's greatest difficulties with the subject of probability and statistics. Therefore, it is refreshing to see a book which opens by addressing conditioning so boldly with an attempt at an intuitive look. Many of the examples are very good: the 'Monty Hall problem' (referred to as the 'Car and Goats Problem') , the 'Two Envelopes' problem, and the \"Birthday problem' are all described clearly, and analyzed carefully either in Chapter 1 or later on in the book. The example referencing system seems confusing, at first. 
The author warns that the preface should be read first; this warning should be heeded, if for no other reason than to discover that problem 19A is really problem A on page 19.

Chapter Two contains, among other things, a collection of measure-theoretic results. We are scratching our heads a bit, wondering why the author seems so insistent on avoiding measure theory when he has gone almost halfway there. Some of the results, such as the monotone convergence theorem, are useful as 'Facts', but the description of the π-system lemma seems deficient. We are not sure that anyone without a prior background in measure theory would really know what the author is talking about. Similarly, the Banach-Tarski paradox may be bewildering to many readers. On the plus side, we think that the hat-matching problem is a good nontrivial application of the inclusion-exclusion principle.

Chapter Three gives a concise discussion of random variables, density functions, mass functions, and expectation, and Chapter Four begins with a discussion of conditional probability and independence before moving into the laws of large numbers. The author claims that the proof of the Kolmogorov strong law is much too difficult to give. We are a bit perplexed about this, since all of the ingredients are given, and the proof really is not that much more difficult than, say, the proof of Stirling's formula (admittedly optional, but on page 13!). The inclusion of the Bernstein polynomial proof of the Weierstrass approximation theorem is a nice idea.

The only real look at Markov chains occurs in the section on random walks. The reflection principle is described briefly. The section on the strong Markov property is rightly indicated as optional reading.

The treatment of simulation is excellent. It is unusual for pseudorandom number generation to feature in a text on probability, but it fits into the context naturally.
The presentation is careful, and it is nice to see the Wichmann and Hill (1982) pseudorandom number generator featured.
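For readers unfamiliar with it, the Wichmann and Hill (1982) generator (published as algorithm AS 183) combines three small linear congruential generators and returns the fractional part of the sum of their normalized states, which is approximately uniform on [0, 1). A minimal Python sketch (the function name and the seed choices below are ours, for illustration only):

```python
def wichmann_hill(s1, s2, s3):
    """Yield an endless stream of pseudorandom uniforms on [0, 1)
    from three integer seeds, each in the range 1..30000."""
    while True:
        # Three independent linear congruential generators
        # with prime moduli close to 30000.
        s1 = (171 * s1) % 30269
        s2 = (172 * s2) % 30307
        s3 = (170 * s3) % 30323
        # Fractional part of the sum of the normalized states.
        yield (s1 / 30269 + s2 / 30307 + s3 / 30323) % 1.0

gen = wichmann_hill(1, 2, 3)
u = [next(gen) for _ in range(5)]  # five uniforms on [0, 1)
```

The stream is fully determined by the three seeds, so a simulation can be reproduced exactly by restarting from the same triple.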