Highlights from “The Signal and The Noise”

A few things to ponder; more so if you deal with data.

  1. If it has become cheaper to produce a new invention, this suggests that we are using our information wisely and are forging it into knowledge. If it is becoming more expensive, this suggests that we are seeing signals in the noise and wasting our time on false leads.
  2. Data-driven predictions can succeed — and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.
  3. In the paper, he [Akerlof] demonstrated that in a market plagued by asymmetries of information, the quality of goods will decrease and the market will come to be dominated by crooked sellers and gullible or desperate buyers.
  4. In complex systems, however, mistakes are not measured in degrees but in whole orders of magnitude.
  5. The litmus test for whether you are a competent forecaster is whether more information makes your predictions better.
  6. Accuracy is the best policy for a forecaster.
  7. Statistical inferences are much stronger when backed up by theory or at least some deeper thinking about their root causes.
  8. A model is a tool to help us understand the complexities of the universe, and never a substitute for the universe itself.
  9. Finding patterns is easy in any kind of data-rich environment; that’s what mediocre gamblers do. The key is in determining whether the patterns represent noise or signal.
  10. Elite chess players tend to be good at metacognition — thinking about the way they think — and correcting themselves if they don’t seem to be striking the right balance.
  11. a) While an aggregate forecast will essentially always be better than the typical individual’s forecast, that doesn’t necessarily mean it will be good. b) The wisdom-of-crowds principle holds when forecasts are made independently before being averaged together. c) Although the aggregate forecast is better than the typical individual’s forecast, it does not necessarily hold that it is better than the best individual’s forecast.
  12. This is another of those Information Age risks: We share so much information that our independence is reduced.
  13. There is a tendency in our planning to confuse the unfamiliar with the improbable. The contingency we have not considered seriously looks strange; what looks strange is thought improbable; what is improbable need not be considered seriously.
  14. When a possibility is unfamiliar to us, we do not even think about it.
  15. Whatever range of abilities we have acquired, there will always be tasks sitting right at the edge of them. If we judge ourselves by what is hardest for us, we may take for granted those things that we do easily and routinely.
  16. Information becomes knowledge only when it’s placed in context.

Programming, experimenting, writing | Past: SWE, Researcher, Professor | Present: SWE