My experience with academia is that it's full of people who couldn't get jobs after undergrad, so they stayed in school. Their parents are at least middle class and support this route so they can harvest some of the status of having a PhD or professor in the family.
I'd be less bitter if so much of the 'science' that came out in the last 100 years hadn't failed to replicate... Yet we were expected to learn it like it was fact, while singing the praises of the authors. A literal waste of resources.
Meanwhile, industry is pushing the world forward with life-changing improvements. I think academia would like to believe it had some impact on that, but I think it's wishful thinking.
Sure, nowadays this whole field is pushed forward by industry. But I would argue that for most technological advancements the foundations are laid in traditional academia.
Of course, if you are not in academia, you will only ever come into contact with the things that work out and get picked up by industry, reinforcing the impression that only industry is doing valuable stuff. Insert survivorship bias meme here.
Backpropagation has been reinvented multiple times, because it is a basic application of the chain rule. The earliest recognizable usage of it is in control theory at NASA during the Apollo program.
It's a mistake to be dismissive of academic work which has been very important, but it's equally a mistake to think that academia is the sole source of foundational work.
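The chain-rule point above is easy to make concrete. Here is a minimal sketch (the function and weight values are made up for illustration, not taken from any paper): record the intermediate values on a forward pass, then multiply the local derivatives together on the way back, which is all backpropagation does.

```python
import math

def forward_backward(x, w1, w2):
    """Compute y = sin(w2 * relu(w1 * x)) and dy/dx via the chain rule."""
    # Forward pass: save every intermediate value.
    a = w1 * x
    h = max(a, 0.0)                    # ReLU
    z = w2 * h
    y = math.sin(z)
    # Backward pass: multiply local derivatives, outermost first.
    dy_dz = math.cos(z)                # d sin(z) / dz
    dz_dh = w2                         # d (w2 * h) / dh
    dh_da = 1.0 if a > 0 else 0.0      # ReLU derivative
    da_dx = w1                         # d (w1 * x) / dx
    dy_dx = dy_dz * dz_dh * dh_da * da_dx
    return y, dy_dx
```

The result can be checked against a finite-difference estimate of the derivative; the two agree to numerical precision, which is the whole trick, rediscovered each time someone needs gradients of a composed function.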
- What status do you think people with PhDs or who teach and do research have? (And do you think that having those beliefs is necessary to remain in academia?)
- Who are we learning that we’re supposed to be treating like fact, and singing the praises of? (I remember learning about, say, Hindley-Milner type inference in undergrad, but I didn’t think of it as “fact”, nor did I have to sing its praises to pass my classes.)
- I wanted to become a mathematician, and my colleagues are currently statisticians who work on medical claims data from entire health systems. What does “replicate” mean in this case? Similarly, what does “replicate” mean in the context of industry, if a comparison is being set up here?
This might also be of interest: https://karpathy.github.io/2016/09/07/phd/
How does one approach collaborators in this situation? Like, hey, I have this idea that solves the problem you have been trying to solve in a fundamentally different way that invalidates all the legacy approaches you have invested in, BTW. My emails that follow this spirit tend to get ghosted.
Also, a collaborator is usually not a stranger on the internet; it's often someone you know and have already worked with, so it is not that awkward to float a new idea and propose working together.
It takes time and social skills to build long-lasting collaborations; the two parties must trust each other in order to collaborate. In that context, exchanging ideas is not really an issue.
Or, more concretely, that famous story where a student solved certain open problems in statistics thinking they were homework problems.
I’ll admit that I may just be immature at research as almost all my experience has either been attempting to replicate research or to put it into practice in production systems.
I tend to associate it with folks who are prepared to victim-blame researchers for not adapting to the "new economy", dismissing them as having "bad taste" or "low agency", maybe as a way to rationalize/justify the inequality that AI will create.
Basically a recycling of the way "IQ"/smarts/hard work has historically been used to justify disproportionate rewards for the upper class.
(Obviously a gigantic stretch on my part, and not saying the author is in this camp, but just wanted to vent somewhere)
(If we're venting about words, I'll bring up "opinionated", which has somehow become a positive.)
Links: https://paulgraham.com/taste.html https://www.paulgraham.com/goodart.html https://paulgraham.com/goodtaste.html
Taste is mostly about having good intuition on the topics where your intuition is worth following. It tends to develop with experience. But if you want the kind of taste that helps you pick good research topics, you need the right kind of experience for that field of research: experience that turns out, in retrospect, to have been the right kind. If your experiences and interests align (again, in retrospect), you will probably develop good taste for research problems in your field of interest. But that requires some amount of luck, in addition to everything else.
Research is all about studying topics of uncertain value. You have to commit to a project long before you can say if it's actually worth doing.
Taste comes with deliberate effort and experience. It doesn't tell you that a topic is definitely worth studying, but it increases the likelihood that you will guess right.
Either the reader already has it, in which case there’s no point in being told that. Or the reader doesn’t, in which case you have declared that good taste cannot be taught.
Perhaps the author’s next article should be “How to win the lottery: be lucky”, which is just about as actionable.
> But if I had to summarize it in one sentence, it would be that taste comes from practicing the skill of research, keeping your focus always on identifying what works and what doesn't.
Instead of following general guidelines, focus on figuring out what works and what doesn't in each specific situation. Keep doing that for many years, and your taste will develop. Remember that you are training your intuition, not developing a set of exact rules.
I recall reading an interview with a legendary developer, and the majority of it focused not on his coding decisions or the structures he built, but on a notebook he kept with voluminous notes about what was good and what wasn't. That notebook is a materialized version of 'taste', and it's something almost anyone could put together with enough effort and time.
If I wrote about “how to paint great art” or “how to cook great meals” or “how to build great things” then it would be silly to say “have good taste”—even if that’s part of the answer. It won’t help anyone else to improve in any of those endeavors.
Most of this list is about how to dress for senpai, figuratively speaking. A pretty depressing take on "how to do important research that matters".
I would hope that would be the most unimportant part of science, totally irrelevant to what's important and what matters. But maybe that's not true today.
Step 2. ???
Step 3. Receive award
How is that the case? The tips seem to aim for impactful research: picking good ideas and executing well on them. There's a tacit assumption that such impactful research will win best paper awards, but that's actually not substantiated and isn't obviously correct, since best paper selection committees can't see the future. For example, many (maybe most?) winners of retrospective awards (test-of-time / influential paper) aren't papers that won a best paper award when originally published.
Most of the papers the author cites in the post, including the membership inference paper, one of the papers he is "most proud of," didn't win best paper awards.
Put in an unreasonable amount of effort
> Earlier I made an analogy to being an explorer; here's another I like even more. Think of yourself as a wildlife photographer. Obviously you need to be in the right place (you won't get a great picture of anything from your couch) and you need to be skilled at your craft. But once you've met those preconditions, the way to get the best picture is to just spend an unreasonable amount of time waiting for exactly the right circumstances to arise.
The point is, he is a superb writer in my book; it is fun to read, and there are similarly good quotes that say the same thing in different ways. It is not enough to hone your technical skill alone, because "the right circumstances" do not appear if you only change one variable.
When I read this, I see someone having fun, and being able to convey that is a good trait.
But the comments have proven me wrong.
If you asked a bunch of researchers working on the “boring” stuff to predict what the hot papers of the year will be about, do we really think they’ll be that far off base? I’m not talking about groundbreaking or truly novel ideas that seem to come out of nowhere, but rather the high impact research that’s more typical of a field.
Even in big tech companies, it’s quite obvious what the interesting stuff to work on is. But there are limited spots and many more people who want those spots than are available.
There are a few kinds of important research. One is solving a well-defined, well-known problem everyone wants to solve but nobody knows how. Another is proposing a new problem, or a new formulation of it, that people didn't realize was important.
There is also highly-cited research that isn't necessarily important, such as being the next paper to slightly lower a benchmark through some tweaks (you get cited by all the subsequent papers that slightly lower the benchmark even further).
In a way it was a sidetrack in the book, but for me the attitude speaking through that text was interesting and inspiring. When I could pull it off, it tended to work.