No, it isn't an absolute necessity. I only spent one year at a university and my career as a professional writer has endured for decades across many forms of media: tv, animation, comics, games, and books.
That said, a lot depends on what type of writer you want to be. It's a lot harder to break into many forms of writing now than it was when I started out. Having a college degree can help to open doors. It won't make or break your chances. It doesn't guarantee you anything. It's simply a good thing to have.
I would consider it more vital for some forms of writing than others. If you went into journalism, for example, I suspect it would be more important. If you want to write novels, it's irrelevant. No one is going to judge your novel on whether or not you've gone to college. But if you want to write novels, I strongly advise you to have some other occupation that will allow you to earn a living, because it's highly unlikely you'll do it from novels alone.
Decide on what type of writing you want to pursue. There are schools that have courses on writing for games or writing for comics or animation. There are schools that have degrees for journalism. Research jobs in the field that interests you and see what sort of job requirements are listed (leaving out something like writing novels, which doesn't apply here). Most will probably list at least a BA.
No matter what type of writing interests you, one thing will be as important as college courses, or more so: living life. Have varied and challenging life experiences. Learn about people and why they behave the way they do. Live the kind of life that gives you experience to draw from, and that gives you something to write about.
I particularly agree with the last part of her answer, and I'll add something my daughter gave me: a paperweight with this profound inscription:
"Either write something worth reading or do something worth writing."
- Ben Franklin.