Everyone knows that abuse of statistics is rampant in popular media. Politicians and marketers present shoddy evidence for dubious claims all the time. But smart people make mistakes too, and when it comes to statistics, plenty of otherwise great scientists (yes, even those published in peer-reviewed journals) are doing statistics wrong. Statistics Done Wrong comes to the rescue with cautionary tales of all-too-common statistical fallacies. It'll help you see where and why researchers often go wrong and teach you the best practices for avoiding their mistakes.
In Statistics Done Wrong, you'll learn:
- Why "statistically significant" doesn't necessarily imply practical significance
- Ideas behind hypothesis testing and regression analysis, and common misinterpretations of those ideas
- How and how not to ask questions, design experiments, and work with data
- Why many studies have too little data to detect what they're looking for – and, surprisingly, why this means published results are often overestimates
- Why false positives are much more common than "significant at the 5% level" would suggest
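The last two points can be made concrete with a back-of-envelope calculation in the spirit of the book. The numbers below are hypothetical assumptions for illustration: suppose only 10% of the effects scientists test for are real, and each study has 50% power (optimistic for many fields). Then among the results that come out "significant at the 5% level," nearly half are false positives:

```python
# Back-of-envelope false discovery calculation (hypothetical numbers).
n_tests = 1000          # imagine 1000 hypothesis tests run at the 5% level
frac_real = 0.10        # assumption: only 10% of tested effects are real
power = 0.50            # assumption: each study has 50% power
alpha = 0.05            # the conventional significance threshold

n_real = n_tests * frac_real            # 100 real effects
n_null = n_tests - n_real               # 900 true nulls

true_positives = n_real * power         # 50 real effects detected
false_positives = n_null * alpha        # 45 nulls flagged "significant"

significant = true_positives + false_positives
false_discovery_rate = false_positives / significant
print(f"{false_discovery_rate:.0%} of 'significant' results are false positives")
# -> 47% of 'significant' results are false positives
```

The 5% significance level bounds how often a *true null* yields a false alarm, not how often a *significant result* is wrong; with few real effects and low power, the two numbers diverge sharply.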
By walking through colorful examples of statistics gone awry, Statistics Done Wrong offers approachable lessons on proper methodology, and each chapter ends with pro tips for practicing scientists and statisticians. No matter what your level of experience, Statistics Done Wrong will teach you how to be a better analyst, data scientist, or researcher.
Alex Reinhart is a statistics Ph.D. student at Carnegie Mellon University who received his B.S. in physics at the University of Texas at Austin. He teaches introductory statistics at Carnegie Mellon.