humans of reddit, do you think school and college are important/worth it and why?

It depends on the field. Sometimes plain work experience matters more than a generic degree, and other times the degree is genuinely required. For example, sales and management tend to reward experience, while healthcare requires the formal education.