Considering how hard it is to predict the future, why do we think we can say anything useful about AGI today?

  • This objection seems to come from a philosophy holding that it’s possible to say or believe nothing. That is false: you inevitably hold some predictions and take some actions either way.

  • These debates often end up being fights over what the priors should be.

  • Agency cuts the other way: for a goal-directed system, you can know something about where it ends up even if you can’t predict much about the path it takes (you can’t predict a grandmaster’s moves, but you can predict who wins the game).

  • Extrapolating the current economy’s growth trends points to an economic singularity within our lifetime (see the sketch after this list).

  • The question often disguises “shouldn’t we just do nothing, since we don’t know?” The answer to that is “no.”

  • A better framing: things will get completely crazy, and we haven’t thought about it anything like hard enough; collectively, probably less than we’ve thought about “how did Sherlock fake his death in that TV show.”

  • You can extrapolate and try to figure out different scenarios using the normal methods of careful thought.

  • Past attempts at extrapolation have seen some success (science fiction anticipating submarines and space travel).

  • Reasoning from first principles can work. Not perfectly, but what the hell else do we have?

  • In general, how well this works depends on the complexity of your model.
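
A minimal sketch of the economic-extrapolation point above, using the standard hyperbolic-growth toy model (an illustration, not the author’s method): if the growth rate itself rises with output, dY/dt = a·Y^(1+ε), then output diverges in finite time. Every parameter value below is an illustrative assumption, not an estimate.

```python
# Toy hyperbolic-growth model: if the growth rate rises with output,
#     dY/dt = a * Y**(1 + eps),
# then output diverges at a *finite* time -- a mathematical singularity.
# All numbers here are illustrative assumptions, not estimates.

def blowup_time(Y0: float, a: float, eps: float) -> float:
    """Finite blow-up time of dY/dt = a * Y**(1 + eps), with Y(0) = Y0.

    Separation of variables gives Y(t) = (Y0**-eps - a*eps*t)**(-1/eps),
    which diverges when the base hits zero, at t* = Y0**(-eps) / (a * eps).
    """
    return Y0 ** (-eps) / (a * eps)

# Illustrative numbers: output normalized to 1, a 3%/yr base growth rate,
# and a hypothetical superexponential exponent eps = 0.5.
print(f"Years until divergence: {blowup_time(1.0, 0.03, 0.5):.0f}")  # ~67
```

The point is not the particular number, which is entirely parameter-dependent, but that any superexponential trend reaches infinity in finite time; that finite-time divergence is what “economic singularity” means here.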