Though it now feels like a distant memory, there was a time when colleges and universities across the United States actually educated young people. For generations, parents have sent their kids to these institutions, trusting their promise to provide the learning, networking, and other opportunities essential to a good life.
Sadly, our nation’s institutions of higher learning aren’t what they used to be. Democrats have taken over colleges and universities across the country. Instead of equipping young adults with the skills and knowledge they need, these institutions are brainwashing them with radical woke agendas.