Saturday, February 10, 2007

Role of Universities

One very interesting lecture was Jim Duderstadt’s presentation on the history of the institution of “the university” – the university in Britain, in the early United States, and at the University of Michigan. His concluding remarks about the future of higher education, as well as the Spellings Commission, were also very illuminating.

A few points of interest that came up for me were:

· the continued desire of higher education to require a liberal arts education;

· research universities teaching basic classes; and

· an (apparent) lack of connection between secondary and higher education.

The first point – the continued desire to require a liberal arts education – is, I believe, one of the major weaknesses of the US higher education system. While this is usually seen as a major strength of the US system, I am cognizant of the continually quickening pace of the world’s economies. Students in US higher education wait two years – half of their potential time of training at university – to decide their major field of study. Compare this to students in the UK, who are admitted directly into their majors and take courses almost exclusively within them. In Taiwan – as far as I understand their curriculum – all students are required to take a series of “core” courses (“colloquia”), but otherwise stay pretty confined to their disciplines.

The idea that the US-educated university graduate knows a little about everything, and more about their own topic, is a good thing to have, but I believe there are enough students in the system who have little desire to learn outside their field. This leaves me at an intellectual fork in the road: should students be forced to take courses they do not want (and professors and lecturers teach courses to students who do not all wish to learn) in order to have a liberal education, or should the requirements of education be changed to be more in line with a different goal?

I feel that students should not be forced to take courses merely for the sake of broadening their education – that is what personal development (for which everyone has a whole lifetime) is for. Instead, the US university system should use the first two years of each student’s academic tenure to try to instill the traits it desires in its graduates. Hopefully, these would include a strong basis in ethics, training in thoughtful reasoning, leadership, and cooperation, and the basic tools each student would need to continue within their chosen field.

This last point brings me to my second point of interest: research universities teaching basic classes. Should UofM be offering classes in basic algebra, biology, chemistry, etc.? What difference does it make for a student to take such courses at an expensive research university, as opposed to a community college? A lecturer at a community college should have as much command of such foundational material as any professor, lecturer, or graduate student at a research university. Indeed, I would argue that a professor at a research university who is interested in the cutting edge of science might be less inclined to re-hash the basics of his or her field every four months. If that is the case, it makes little sense for such a professor to teach such a class, and he or she will leave it (or as much of the running of it as possible) to graduate students – which is what happens in many of these introductory 100-level courses around campus. So why should a student (or the parents of a student) pay thousands or tens of thousands of dollars to learn material from a graduate student when they could pay a fraction of the cost to learn the same material at a community college? Although Jim didn’t go into it very much in his presentation, I subsequently learned from him that the University of California system was originally set up to focus only on upper-level and graduate courses, but that plan was eventually scrapped due to recruitment concerns.

Of course, this could potentially all be solved by a much-improved secondary education system in the United States. This gap between secondary and higher education was one of the major findings of the Spellings Commission (setting aside the meta-findings of political “tampering”). In the UK, there is a Department for Education and Skills which can set teaching (and presumably learning) requirements nationwide, and although many debates have taken place in the UK about the benefits and pitfalls of teaching to tests, during my time living there I felt that the students coming into universities showed a greater level of academic maturity than many of the Junior Year Abroad (JYA) students from the United States. While there are definite problems with the British education system, the United States should place greater emphasis on connecting the skills learned in secondary education with those required at university. Upper-tier universities could, for example, require certain prerequisites for entry, or offer a “catch-up” year or semester for students who need one before starting in on their actual degree.

And this leads me back, full circle, to the questionable insistence of United States universities on sticking to a liberal arts education. I feel that the problems with holding onto this commodity – decreased specialization, recalcitrant students being taught by uninterested lecturers and professors, high financial costs to learn basic subject matter, etc. – will outweigh the benefits of a broader education. A breadth of education should be encouraged throughout a person’s schooling and life, not forced upon those individuals who choose to be specialists. At the same time, universities should strive to teach all their students a set of tools – thoughtful reasoning, leadership, cooperation, etc. – to help them study and solve the field-spanning problems we are beginning to see today.

1 comment:

Anonymous said...

Some very interesting thoughts here, and I'd love to respond in a more detailed fashion – maybe I'll do so in a blog post one of these days.
Anyway, here are a few quick points in response to your post:

1) Early specialisation in higher ed. Well, although it's true that in Britain they tend to be more specialised (starting in high school with A-levels), you and I know full well that the Scottish system admits you to a College rather than to a department, leading to far more flexibility in the choice of major than other systems. While there are differences due to the age at admission in Scotland, etc., some degree of flexibility seems desirable, and this approach has been adopted in many other countries (universities not only in Wales, but also in France, etc.). Part of the problem, which I think is very much related to your point about the missing link between high school and higher ed, is that students coming out of high school generally have little to no idea of the breadth of the disciplines one may elect to study at university. As a matter of fact, this may be why students in countries where university education isn't as "general" as in the US (as opposed to elitist and/or research-oriented, say) do not go into university in the first place.
2) I think your point that the liberal arts requirement makes no sense, because breadth should be left up to the individual, is beautiful, and it is furthermore vindicated by experiences such as that of the language departments at Brown University (for example), where the absence of a foreign-language requirement is no obstacle to very high enrollment figures in those departments – the difference being that the students they get are actually motivated to learn the subject.
3) The risk in giving high school students courses across the spectrum of the disciplines – and we must ask whether this happens in universities as well – is that they get very little of something and yet think they possess the subject. This is how, e.g., the philosophy class (the pride of the French school system), a one-year course taken in the final year of high school, leads all French people to believe that they can think better than the rest of the world.