Rich Sutton's essay "The Bitter Lesson" is widely quoted. The most frequently quoted sentence is the following:
The biggest lesson that can be read from 70 years of AI research is that general methods that leverage computation are ultimately the most effective, and by a large margin.
Citing this statement, many researchers spend most of their time applying old learning methods with more computation to old problems. That is true: with more computation, we can now solve problems that we could not solve in the past.
However, I find that the most valuable idea in "The Bitter Lesson" is not "add more computation!" but "we should spend our effort on developing meta-methods that can find and capture more complexity". The exact quote is as follows:
They (the contents of minds) are not what should be built in, as their complexity is endless; instead we should build in only the meta-methods that can find and capture this arbitrary complexity. Essential to these methods is that they can find good approximations, but the search for them should be by our methods, not by us.
Thus, the research projects with long-run impact are those that try to increase the capability of methods, not those that try to add more human knowledge to existing methods in pursuit of "SOTA performance".