That list was aimed at entering PhD students at Berkeley,
who I assume are going to devote many decades of their lives to the field, and who want to get to the research frontier fairly quickly. I would have prepared a rather different list if the target population were (say) someone in industry who needs enough of the basics to get something working in a few months.
That particular version of the list seems to be one from a few years ago; I now tend to add some books that dig still further into foundational topics. In particular, I recommend A. Tsybakov's book "Introduction to Nonparametric Estimation" as a very readable source for the tools for obtaining lower bounds on estimators, and Y. Nesterov's very readable "Introductory Lectures on Convex Optimization" as a way to start to understand lower bounds in optimization. I also recommend A. van der Vaart's "Asymptotic Statistics", a book that we often teach from at Berkeley, as a book that shows how many ideas in inference (M-estimation, which includes maximum likelihood and empirical risk minimization; the bootstrap; semiparametrics; etc.) repose on top of empirical process theory. I'd also include B. Efron's "Large-Scale Inference: Empirical Bayes Methods for Estimation, Testing, and Prediction" as a thought-provoking book.
I don't expect anyone to come to Berkeley having read any of these books in their entirety, but I do hope that they've done some sampling and spent some quality time with at least some parts of most of them. Moreover, not only do I think that you should eventually read all of these books (or some similar list that reflects your own view of foundations), but I think that you should read all of them three times: the first time you barely understand them, the second time you start to get it, and the third time it all seems obvious.
I'm in it for the long run: three decades so far, and hopefully a few more. I think that's true of my students as well. Hence the focus on foundational ideas.