
Career advice given AGI

"Imagine you have a 17-year-old brother/nephew just starting college. What would you recommend he study, given your AGI timelines?"

That's so tough, right? I don't know, become a podcaster? I feel like that job's still gonna be around.

It's funny, because I studied computer science, and in retrospect- at the time, you could've become a software engineer or something. Instead, you became a podcaster. It's kind of an irresponsible career move, but in retrospect, it's like… It kinda worked out. Just as these guys are getting automated.

I get asked this question all the time, and one answer that I like to give is that you should think about the next couple of
years as increasing your individual leverage by a huge factor every year. So already software engineers will come up and say, "You know, I'm two times faster," or,
"In new languages, I'm five times faster than I was last year." I expect that trend line to continue, basically, as you go from this model of, "Well, I'm working with some
model that's assisting me on my computer, and it's basically a pairing session," to,
"I'm managing a small team," through to, "I'm managing a division or a company". Basically, that is targeting a task. And so I think that deep technical knowledge in fields will still matter
in four years. It absolutely will. Because you will be in the position of managing dozens- or,
your individual management bandwidth will be maxed out by trying to manage teams of AIs. And maybe we end up in a true singularity world where you have AIs managing AIs and this kinda
stuff. But I think in a very wide part of the possibility spectrum you are managing enormous resources, vastly more than an individual could command today, and you should be able to solve so many more things with that.

That's right, and I think I would emphasize that this is not just cope. Like, it genuinely is the case that these models lack the kind of
long-term coherence which is absolutely necessary for making a successful company or… Just, getting
a fucking office is kinda complicated, right? So you can just imagine that for sector after sector-

The economy is really big, right? And really complex.

Exactly, and so, I don't know the details, but I assume if it's a data-sparse thing where you gotta know the context of what's happening in the sector or something, I feel like you'd be in a good position.

Maybe the other thought I have is that it's
really hard to plan your career in general. And I don't know what advice that implies,
because I remember being super frustrated. I was in college, and the reason I was doing the podcast was to figure out what it is I wanted to do. It wasn't the podcast itself. And I would go on 80,000 Hours or whatever career advice, and in retrospect it was all mostly useless; just try doing things. I mean, especially with AI, it's so hard to forecast what kind of transformations there will be, so try things, do things. I mean, it's such banal, vague advice, but I am quite skeptical of career advice in general.

Well, the piece of career advice that I'm not skeptical of is: put yourself close to the frontier, because you have a much better
vantage point from there. Right? You can study deep technical things, whether it's computer
science or biology, and get to the point where you can see what the issues are because it's
actually remarkably obvious at the frontier what the problems are. It's very difficult to see…

Actually, do you think there is an opportunity? Because one of the things people bring up is, maybe the people who are advanced in their career and have all this tacit knowledge will be in a position to be accelerated by AI. But for you guys four years ago or two years ago, when you were getting discovered or something, that kind of thing where you have an open GitHub issue and you try to solve it: is that just done, and so the onboarding is much harder?

That's still what we look for in hiring. So, you know?

Yeah, I'm in favor of the "learn fundamentals, gain useful mental models" approach, but it feels like
everything should be done in an AI-native way, or top-down instead of bottom-up learning. So
first of all, learn things more efficiently by using the AI models, and then just know where their capabilities are and aren't. And I would be worried and skeptical about
any subject which prioritizes rote memorization of lots of facts or information instead of ways of thinking. But if you're always using the AI tools to help you,
then you'll naturally just have a good sense for the things that it is and isn't good at.

Okay, next one. What is your strategy, method, or criteria for choosing guests?