Colleges Have Damaged Education

One of the most profound changes in United States culture during my lifetime is the role of higher education. By and large, I think it has not been a change for the better. In many ways, colleges and universities have damaged education and had a number of deleterious impacts on society. In 1950 29.7% of high…