Friday, May 29, 2009

Should we attack the principles of the institution of academia, or our view of academia as an institution?


A recent New York Times Op-Ed by Mark C. Taylor criticized higher education, especially graduate programs:

"End the University as We Know It"

This article attracted a lot of attention, and one response looked at exactly whom it attracted: mostly academics, as its author points out.


I am taking up Anne Trubek's call in that piece for a more engaging discussion from people "excited about the exchange of ideas" to mention a few things.

I, too, am dissatisfied with the bulk of graduate-level research, since it normally leads to what I like to call a "cornering" of oneself: as you become increasingly specialized in your field, you trap yourself in an ever-tighter corner, in the sense that the group of people with whom you can intelligibly converse about your research keeps shrinking. This happens in many fields; from my own experience, it is true of economics and certainly of math. Hence one of Taylor's main policy proposals: interdisciplinary work, in order to expand the scope of academic research.

For Taylor, the question is whether such work needs to be regulated into existence or whether the market for research responds to incentives for these "problem task forces" more quickly on its own. I have several problems with his call for regulation. There is already a great deal of interdisciplinary work, and when society demonstrates a need for some area of research, funding usually springs up for it. In addition, any list someone draws up of "important" areas is bound to be arbitrary, both because there are so many pressing matters that could be on researchers' agendas and because we never know when the next problem will come to our attention.

On a somewhat related note: since education generates positive externalities, the market already underproduces it relative to the social optimum -- that is a given, and no insight Taylor can call his own.
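For readers who want that claim spelled out, here is a minimal sketch of the standard externality argument, in my own notation rather than anything from Taylor's piece. Let $MPB(q)$ be the marginal private benefit of a quantity $q$ of education, $MEB(q) > 0$ the marginal external benefit to the rest of society, and $MC(q)$ the marginal cost. The market and the social planner then solve different problems:

\[
MPB(q_m) = MC(q_m) \quad \text{(market equilibrium)}, \qquad MPB(q^*) + MEB(q^*) = MC(q^*) \quad \text{(social optimum)}.
\]

With $MEB > 0$ and diminishing marginal benefits, $q_m < q^*$: left to itself, the market supplies less education than society would want.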

Considering an alternative explanation for Taylor's problem

If we argue that we must restructure the system of academia, we are saying there is something wrong with the institution as it is currently designed -- an institution which, incidentally, is based on principles that are very old and have produced much benefit to society. I'm no conservative, socially, politically, or economically, but I'm also no radical anarchist who believes the institution of higher ed needs to be brought down -- I'm in the academy for many reasons.

On the other hand, if we argue that we must consider how society views academia and its perceived "benefits", we challenge the current paradigm surrounding higher ed. This is the route that I take in the following analysis.

1st point: considering different institutions of knowledge production

One of the most fruitful paths we can explore in order to reform our views of education, I think, is to see the university as one of several institutions of "knowledge production" in society. If we restrict ourselves to the view that knowledge is produced within the ivory tower and then disseminated throughout society to those deemed not cerebral enough to contribute, only to utilize, then we have erected several artificial barriers in education that limit the effects of the university itself. Consider an example of these limiting effects. Without an arrow running from the real world back to the ivory tower, theories will be largely uninformed by reality and will lose their potency as explanatory tools. This weakens theory, and subsequent attempts to apply those weakened theories in real life expose further weaknesses, and so on. In other words, there are institutions within the real world itself that can produce knowledge, and thus strengthen the effectiveness of old theories and generate new ones (without the help of some professor!).

If we take steps to solve this problem, then Taylor's concern about the lack of interdisciplinary work dissolves: with ivory-tower theories constantly informed by reality, we transform the nature of those theories, and therefore the "toolkit" needed to solve the corresponding problems, which requires analysis from a wider range of fields.

Also, by considering the university as one of many institutions of knowledge production, we give it competition from other sources of knowledge in society: private research, the many resources on the internet, even the workplace itself. Experience-based and on-the-job training is already an important part of many jobs, and it lessens the need for standard university-produced education.

2nd (stronger?) point: considering how society views higher education

There is an important point here. Higher education -- bachelor's, master's, law, MBA, PhD, etc. -- is increasingly seen as compulsory in our society. It leads people to spend many years accruing debt to gain an education that often has nothing to do with the job they go on to afterwards. In a sense, it is "overeducation": we press upon our young adults the need for more degrees to a, well, unnecessary degree, because we feel it is almost a "right" for every high school graduate to be able to go to college.

This is not an elitist call for the purging of universities. The point I am making is that, as a society that values education for its children, we are pushing the need for education onto people who may not be interested in getting a degree in the first place. (As an aside, this is not to imply that those people have any less "value" in society for not spending four years at college. I purposely said nothing above about the value of certain jobs over others, because I do not think it is fruitful to judge jobs on how "valuable" they are -- whether, for example, some professor's work is more valuable to society than a carpenter's.)

Of course, Taylor is talking about graduate school, but it all boils down to the same principle: not only in the sense that "the MA is the new BA" (and maybe soon "the PhD is the new MA"), but also in the sense that as more people go to college, someone needs to teach them. This creates jobs for graduate students, who are cheaper to hire than full professors, and so the whole mindset reproduces itself: more undergrads create more demand for graduate students, which pushes in more undergrads as society raises its standard of what it means to be "educated" (hah), and so on.

Synthesizing points 1 and 2: implications

The implications of this analysis are as follows: Taylor's argument addresses the institution of higher education rather than how society views the institution of higher education. The latter, I argue, is more to the point, for two reasons. First, we place too much emphasis on kids getting their BA, master's, etc. before they are allowed to go out into the real world and be productive members of society. Second, if we view the university as one of many institutions of knowledge production -- and, god forbid, not necessarily the most important one! -- alongside the workplace itself and the internet (with its message boards, group discussions, and email listservs), then what we should reconsider is how we allocate time and resources among these institutions, not the university itself.

Bottom line: should we attack the principles of academia, or our own understanding of academia?

The principles of academia have been tried and tested for millennia. They are a tradition that has produced some outstanding results over the course of human history. Our own understanding of academia, of education, on the other hand, is constantly on the move. Perhaps we've veered a bit off track, and it's time to get back on course. That is what I am arguing for.
