I really enjoyed diving into Seb Ruder's latest NLP Newsletter, which focuses on all the areas of NLP that are still in desperate need of attention in a post-LLM world.

In an era where running state-of-the-art models requires a garrison of expensive GPUs, what research is left for academics, PhD students, and newcomers to NLP without such deep pockets?

As the newsletter puts it: "...while massive compute often achieves breakthrough results, its usage is often inefficient. Over time, improved hardware, new techniques, and novel insights provide opportunities for dramatic compute reduction..."

I wrote about some of the same issues earlier this year in my post NLP is more than just LLMs, and I recently speculated that the current industry AI darlings, sexy scale-up companies very much in "growth" mode as opposed to incumbents in "cost-saving" mode, simply aren't incentivised to be compute-efficient.

If you are just starting out in this space, there are plenty of opportunities and lots of problems to solve, particularly around trust, reliability, and energy efficiency.