Not exactly about books, but PG would bet that over 80% of those who read the books written by regular visitors to TPV (excepting authors of children’s and YA books) are college graduates.
From The Wall Street Journal:
We are at the end of an era in American higher education. It is an era that began in the decades after the Civil War, when colleges and universities gradually stopped being preparatory schools for ministers and lawyers and embraced the ideals of research and academic professionalism. It reached full bloom after World War II, when the spigots of public funding were opened in full, and eventually became an overpriced caricature of itself, bloated by a mix of irrelevance and complacency and facing declining enrollments and a contracting market. No one has better explained the economics of this decline—and its broad cultural effects—than Richard Vedder.
Mr. Vedder is an academic lifer—a Ph.D. from the University of Illinois and a long career teaching economic history at Ohio University. In 2004 he brought out “Going Broke by Degree: Why College Costs Too Much,” and in 2005 he was appointed to the Commission on the Future of Higher Education, a group convened by Margaret Spellings, the U.S. education secretary. “Restoring the Promise: Higher Education in America” is a summary of the arguments he has been making since then as the Cassandra of American colleges and universities. Despite the optimistic tilt of the book’s title, Mr. Vedder has little to offer in the way of comfort.
As late as 1940, American higher education was a modest affair. Less than 5% of adults held a college degree, and the collegiate population amounted to about 1.5 million students. This scale changed with the first federal subsidies, Mr. Vedder notes, beginning in 1944 with the Servicemen’s Readjustment Act (the “GI Bill”). Within three years, veterans accounted for 49% of all undergraduate enrollment—some 2.2 million students. Having earned degrees themselves, the veterans expected their own children to do likewise.
Such expectations were supported by still further subsidies, through the National Defense Education Act (1958) and the Higher Education Act of 1965. By the 1970s, there would be 12 million students in the American college and university system; by 2017, there would be 20 million. Meanwhile, more and more federal research dollars poured into campus budgets—reaching some $50 billion in direct funding by 2016—and set off infrastructure binges. To pay for them, as Mr. Vedder documents, tuition and fees vaulted upward, while the federal programs that were intended to ease the financial burden—especially low-interest student loans—only enticed institutions to jack up their prices still higher and spend the increased revenue on useless but attention-getting flimflam (from lavish facilities to outsize athletic programs). At Mr. Vedder’s alma mater, Northwestern, tuition rose from 16% of median family income in 1958 to almost 70% in 2016. Over time, armies of administrators wrested the direction of their institutions away from the hands of faculties and trustees.
Today a college degree has become so common that 30% of adult Americans hold one. Its role as a bridge to middle-class success is assumed—though bourgeois comfort is rather hard to achieve these days with a B.A. in English literature or a degree in, say, sociology. The modern economy, says Mr. Vedder, simply doesn’t possess the number of jobs commensurate with the expectations of all the degree-holders.
The over-educated barista is one of the standing jokes of American society, but the laughter hasn’t eased the loan burden that the barista probably took on to get his degree. Mr. Vedder says that student loans have injected a kind of social acid into a generation of young adults who, over time, manifest a “decline in household formation, birth rates, and . . . the purchase of homes.” Pajama Boy was born, and took up residence in his parents’ basement.
Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)
And a quote from economist Herbert Stein:
What economists know seems to consist entirely of a list of things that cannot go on forever . . . . But if it can’t go on forever it will stop.
PG suspects that this practice may since have become impolite or illegal, but when he was interviewing for his first job out of college (before he went to law school), one of the last questions he was asked by the final interviewer, the head of the department in which the job opening existed, was, “What were your SAT scores?”
Evidently PG’s answer was satisfactory, because he was hired for the position despite having absolutely no training or education that might lead a reasonable person to conclude he was prepared for the specific tasks involved in carrying out his job responsibilities.
What the interviewer was trying to ascertain was whether PG might be smart enough to learn how to do the job if he was hired. (PG was, and received a promotion after about a year, but left the company when a better job beckoned.)
PG has read that the SAT and ACT tests (for visitors to TPV from outside the United States, these are standardized tests required for admission to virtually any college or university in the country) are effectively proxies for IQ tests.
IQ tests were first developed in the early part of the 20th century to identify retardation in schoolchildren. During World War I, an intelligence test was devised to help quickly screen soldiers entering the US Army for assignment to either basic training or officer training. (At the start of the war, the US ground forces included about 9,000 officers; by the end of the war, there were over 200,000.)
After World War I, IQ testing became much more widespread in both education and business. Unfortunately, it also became entangled with the eugenics movement during the 1920s and 1930s.
As a general matter, there is a correlation between educational attainment and IQ: MDs, JDs, and PhDs have higher IQs on average than college graduates, who in turn have higher IQs than those who attended college but did not graduate, who in turn have higher average IQs than those who finished high school but received no further education.
In this as in countless other things, correlation is not causation. There are plenty of people who possess the inherent intelligence and ability to become MDs, JDs and PhDs who choose not to pursue that educational/occupational path. Such individuals do not, of course, become less intelligent if they go in another direction. From personal experience, PG can attest that there is no shortage of attorneys who do stupid things.
A US Supreme Court case titled Griggs v. Duke Power Co., decided in 1971, effectively forbade employers from using arbitrary tests—such as those for measuring IQ or literacy—to evaluate an employee or a potential employee, a practice that some companies at the time were using as a way to get around rules that prohibited outright racial discrimination.
Griggs began when African American workers at the Duke Power Company in North Carolina sued the company because of a rule that required employees who wished to transfer between different departments to have a high-school diploma or pass an intelligence test.
In a unanimous decision, the Supreme Court held that the tests given by Duke Power were artificial and unnecessary and that the requirements for transfer had a disparate impact on African Americans. Furthermore, the court ruled that, even if the motive for the requirements had nothing to do with racial discrimination, they were nonetheless discriminatory and therefore illegal. In its ruling, the Supreme Court held that employment tests must be “related to job performance.”
Griggs and resulting civil rights laws notwithstanding, prospective employers still want the best evidence available that a job applicant possesses the abilities (including intelligence) to succeed in the position that needs to be filled.
Given the regulatory environment in which employers operate, post-high-school education is a common (and legal) requirement specified in a great many job descriptions. In the US business world, a bachelor’s or advanced degree is often a hard-and-fast must-have. Written or online job applications almost always include a section for the applicant to list undergraduate and post-graduate degrees and the institutions that granted them.
In addition to a degree, the identity of the college/university the applicant attended is often regarded as a proxy for the applicant’s intelligence and ability. The holder of a bachelor’s degree from Harvard will generally be assumed to be more intelligent than someone who graduated from Northeast Hickville State College and Welding School regardless of the personal abilities, talents, work ethic and raw intelligence of the latter.
So, back to the OP,
- A college degree from an institution generally known for its selective nature is becoming more and more expensive because there is no indication that increased tuition and other costs will have any adverse impact on the number and general qualifications (intelligence) of its applicants; and
- A college degree from some institution, high-quality or not, serves as a proxy for intelligence regardless of the field of study and is a requirement for obtaining a job with a reasonable salary, or even getting a foot in the door, at a very large number of employers; and
- Government and other loans are available to any student who wishes to attend almost any college, regardless of a student’s field of study or ability to pay; and
- As a general proposition under US bankruptcy law, it is difficult or impossible to discharge the obligation to repay student loans, especially for recent college graduates or graduates who have obtained jobs, regardless of the amount of their current income.
PG wonders whether one way to address this problem would be to permit employers to receive the results of an IQ test, or of a quasi-IQ test like the SAT or ACT, from a job applicant without risking litigation or other penalties for doing so.