Surveys such as these provide a summary of popularity: https://www.tiobe.com/tiobe-index/ , https://madnight.github.io/githut/#/pull_requests/2021/3 .
You will find that academic courses focus on languages that help teach computer science topics such as object-oriented development, data structures, and common/useful design patterns. Java and Python are probably popular, and you will find that the disciplines you learn in one language are portable to another.
Let's take sorting numbers as an example. Each language has its own way of storing a list of numbers in something like an array, iterating through the list, and reordering it. You have to be precise with the syntax of each language, but the common discipline is what matters.
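As a minimal sketch of that discipline, here is insertion sort in Python (the list of numbers here is made up for illustration). The same steps, storing the numbers, walking the list, and shifting elements into order, translate almost directly into Java, C, or any other language; only the syntax changes.

```python
def insertion_sort(numbers):
    """Sort a list of numbers in place and return it."""
    for i in range(1, len(numbers)):
        current = numbers[i]
        j = i - 1
        # Shift each larger element one slot to the right
        # until we find where the current value belongs.
        while j >= 0 and numbers[j] > current:
            numbers[j + 1] = numbers[j]
            j -= 1
        numbers[j + 1] = current
    return numbers

print(insertion_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```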
In my experience, software developers are always learning new things, whether about the language they are using or about a language that is new to them because a new assignment requires it. My best recommendation is to solve some interesting problems, whether in a class or of your own creation, and be ready and open to carry that learning into your next project.
The short version is that if you make computer science your career, you're going to do a fair amount of re-learning periodically. That is the job.
The main exception to this is SQL. Databases are generally undertaught in computer science programs relative to how often they come up in real work, and SQL hasn't changed much in decades. Learn SQL and learn it well. It will pay off. Virtually all databases use some minor dialect of the core SQL language.
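To show what that core looks like, here is a small sketch using Python's built-in sqlite3 module (the table and data are made up for illustration). The SQL statements themselves, CREATE TABLE, INSERT, SELECT with GROUP BY, would run with little or no change on most other database systems.

```python
import sqlite3

# An in-memory database so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Ada", "Eng", 120), ("Grace", "Eng", 130), ("Edgar", "Sales", 90)],
)

# Standard SQL aggregation: average salary per department, highest first.
rows = conn.execute(
    "SELECT dept, AVG(salary) FROM employees GROUP BY dept ORDER BY 2 DESC"
).fetchall()
print(rows)  # [('Eng', 125.0), ('Sales', 90.0)]
```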