Sunday, May 31, 2009
Friday, May 29, 2009
Should we attack the principles of the institution of academia, or our view of academia as an institution?
A recent New York Times Op-Ed by Mark Taylor criticized higher education, especially graduate programs:
"End the University as We Know It"
This article attracted a lot of attention, and one piece looked at exactly whose attention it attracted (mostly academics', she points out):
I am taking up Anne Trubek's call in that piece for a more engaging discussion among people "excited about the exchange of ideas" to mention a few things.
I, too, am unsatisfied with the bulk of graduate-level research, since it normally leads to what I like to call a "cornering" of oneself: as you become increasingly specialized in your field, you trap yourself in an ever-tighter corner, in the sense that you can intelligibly converse about your research with an ever-smaller group of people. This happens in many different fields, including economics and definitely math, in my personal experience of those two. Hence one of Taylor's main policy proposals: interdisciplinary work to expand the scope of academic research.
For Taylor, the question is whether such work needs to be mandated or whether the market for research responds more quickly to the incentives behind these "problem task forces." I have several problems with his call for regulation. There is already a lot of interdisciplinary work, and as society demonstrates a need for some area of research, funding usually springs up for it. In addition, any list of "important" areas is essentially arbitrary, both because there are so many pressing matters that could be on researchers' agendas and because we never know when the next problem will come to our attention.
On a somewhat related note, since the education market is characterized by positive externalities, we are already underproducing education -- that is a given, not an insight Taylor can claim as his own.
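The externality point is textbook material, but it can be made concrete with a toy linear model (all numbers here are illustrative, not estimates):

```python
# Toy linear market for education (all numbers are illustrative).
# Private demand (private marginal benefit): P = 100 - Q
# Supply (marginal cost):                    P = 20 + Q
# Positive externality: each unit of education confers an extra
# benefit of 20 on society, so social marginal benefit = 120 - Q.

def equilibrium_quantity(demand_intercept, supply_intercept, slope=1.0):
    """Quantity where marginal benefit equals marginal cost."""
    return (demand_intercept - supply_intercept) / (2 * slope)

externality = 20
q_private = equilibrium_quantity(100, 20)                # market outcome
q_social = equilibrium_quantity(100 + externality, 20)   # social optimum

print(q_private, q_social)  # 40.0 50.0 -> the market underproduces by 10 units
```

Because buyers only count their private benefit, the market stops short of the quantity where social marginal benefit equals marginal cost.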
Considering an alternative explanation for Taylor's problem
If we argue that we must restructure the system of academia, we're saying there is something wrong with the institution as it is currently designed -- which, incidentally, is based on principles that are very old and have produced much benefit to society. I'm no conservative, socially or politically or economically, but I'm also no radical anarchist who believes that the institution of higher ed needs to be brought down -- I'm in the academy for many reasons.
On the other hand, if we argue that we must consider how society views academia and its perceived "benefits", we challenge the current paradigm surrounding higher ed. This is the route that I take in the following analysis.
1st point: considering different institutions of knowledge production
One of the most fruitful paths we can explore in order to reform our views of education is to see the university as one of several institutions of "production of knowledge" in society. If we restrict ourselves to the view that knowledge is produced within the ivory tower and then disseminated throughout society to those deemed not cerebral enough to contribute, only to utilize, then we have erected several artificial barriers in education that limit the effects of the university itself. Consider an example of these limiting effects. Without an arrow pointing from the real world back to the ivory tower, theories will be largely uninformed by reality and will lose their potency as explanatory tools. This weakens theory, and any subsequent attempt to apply those theories in real life exposes further weaknesses, and so on. In other words, there are institutions within the real world itself which can generate knowledge, strengthening the effectiveness of old theories and creating new ones (without the help of some professor!).
If we take steps to solve this problem, Taylor's concern about the lack of interdisciplinary work takes care of itself: with ivory tower theories constantly informed by reality, we transform the nature of those theories and therefore the "toolkit" needed to solve the problems, requiring analysis from a wider range of fields.
Also, by considering the university as one of many institutions of knowledge production, we expose it to competition from other sources in society, including private research, the many resources on the internet, and even the workplace. Experience-based and on-the-job training are already an important part of many jobs, and they lessen the need for standard university-produced education.
2nd (stronger?) point: considering how society views higher education
There is an important point here. Higher education -- bachelor's, master's, law, MBA, PhD, etc. -- is increasingly being seen as compulsory in our society. It leads to people spending many years accruing debt to gain an education which has nothing to do with the job they go on to afterwards. In a sense, it is "overeducation": we press upon our young adults a need for more degrees to a, well, unnecessary degree because we feel it is almost a "right" for every high school graduate to be able to go to college.
This is not an elitist call for the purging of universities. The point I am making is, as a society that values education for our children, we are pushing the need for education onto those who may not be interested in getting a degree in the first place. (As an aside, this point is not to imply that those people will have any less "value" in society for not spending 4 years at college. I purposely never mentioned anything about the value of certain jobs over others in the above discussion because I do not think it is very fruitful to judge various jobs in society on whether they are more "valuable" -- for example, whether some professor's work is more valuable to society than what a carpenter does.)
Of course, Taylor is talking about graduate school, but it all boils down to the same principle: not only in the sense that "the MA is the new BA" (and maybe soon "the PhD is the new MA"), but also in the sense that as more people go to college, someone needs to teach them. This creates jobs for graduate students, since they are cheaper to hire than full professors, and so the whole mindset reproduces itself: more undergrads create more demand for graduate students, which pushes in more undergrads as society raises its standards of what it means to be "educated" (hah), and so on.
Synthesizing points 1 and 2: implications
The implications of this analysis are the following: Taylor's argument addresses the institution of higher education rather than how society views that institution. The latter, I argue, is more to the point for a few reasons: 1. we place too much emphasis on kids getting their BA, master's, etc. before they are allowed to go out into the real world and be productive; 2. if we view the university as one of many institutions of knowledge production (and, god forbid, not the most important one!), including the workplace itself and the internet (with its message boards, group discussions, and email listservs), then it is our allocation of time and resources we should reconsider, not the university itself.
Bottom line: should we attack the principles of academia, or our own understanding of academia?
The principles of academia have been tried and tested for millennia. They are a tradition that has produced some outstanding results over the course of human history. Our own understanding of academia, of education, on the other hand, is constantly on the move. Perhaps we've veered a bit off track and it's time to regain our position on the course. That is what I am arguing for.
Thursday, May 28, 2009
Rather than beginning with some posts on economic history, I've gone a different route by opening this blog with subjects closer to teaching/education, and I continue that with this post, which is something that has been on my mind all semester and still haunts me as I enter my final grades into Spire: what are the most effective methods/styles of teaching? Is there one method or style that works better than all others, or should the teacher find one that suits him or her best?
The answer will be obvious to anyone who has spent hours talking with students outside of the classroom and, after some time, feels a certain level of companionship with his students.
Once I recognized this feeling of companionship, I slowly came to understand that there is no inherent hierarchical teacher-student relationship in the classroom; if either the teacher or the student walks in with that view, they immediately set the stage for attitudes and dispositions that are barriers to learning, for both parties.
Perhaps this is common sense, but I found it very enlightening, and it has helped me formalize my style and method. I thought I had more to say on this, but perhaps it's best left for a separate, more focused post.
Wednesday, May 27, 2009
I found this link via marginalrevolution.
The main point of the article: uncertainty is useful in monetary policy because it can force people to be more cautious during economic upturns, since there may be some (albeit small) probability that interest rates will increase during a boom. Investors will factor this risk of costlier borrowing into their decisions, making them less speculative.
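The mechanism here is just an expected-value calculation. A rough sketch, with numbers of my own choosing (not from the article): even a small chance of a sharp rate hike lowers the expected return on a marginal project, nudging investors toward caution.

```python
# How a small chance of a rate hike dampens speculative investment:
# compare the expected net present value (NPV) of a one-year project
# under a certain policy rate versus a randomized one.
# All numbers are illustrative.

def npv(payoff, rate, cost):
    """Net present value of a payoff received one year from now."""
    return payoff / (1 + rate) - cost

payoff, cost = 110.0, 100.0
base_rate = 0.05
hike_rate = 0.20   # a sharp hike the central bank might spring
p_hike = 0.10      # small probability of the hike

npv_certain = npv(payoff, base_rate, cost)
npv_random = (1 - p_hike) * npv(payoff, base_rate, cost) \
             + p_hike * npv(payoff, hike_rate, cost)

print(round(npv_certain, 2), round(npv_random, 2))  # 4.76 3.45
```

The project is clearly worthwhile at the certain rate, but the mere possibility of the hike shaves off enough expected value that a marginal investor thinks twice.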
It is clear that Salmon's argument is purely in the context of boom periods:
"The resulting uncertainty would force people to take a more defensive stance at all times, just in case rates went sharply upwards — even if the probability of such a rate hike was quite low."
But does it work in a recession? Not really. Recessions, I would argue, are very different from booms. In a recession you need people to be the opposite of cautious -- you need them to take risks. And you cannot promote risk-taking by saying there is some probability that rates may actually increase at the next FOMC meeting. Moreover, since there is a lower limit on interest rates, expectations (in addition to easy money) play a central role in getting an economy out of a recession. Keynes said as much in the General Theory's chapter on the trade cycle: the initial drop into a recession may be caused by a crisis in either of two phenomena (possibly both, but not necessarily), credit or expectations; to get out of a recession, however, you need to solve both the credit problem and the expectations problem.
Shoddy analysis by Salmon aside, there is, of course, a larger political and ontological battle drummed up here.
The ontology: Salmon (and I, in fact) believes monetary policy has little real effect on an economy. We agree that policies like this are just one very small part of the workings of an economic system, and that in some sense (which I confess I do not completely understand myself) the system is overdetermined: there are so many variables interacting in the economy that it is debatable whether the interest rate is the key determining factor, or even a largely influential one.
The politics: Salmon (not I) believes monetary policy, or government intervention more generally, is largely trivial and therefore may do more harm than good. (This appears to be the same as the ontology, but it is very different. I hope I need not explain why, but let me know if clarification is needed.) Thus, by randomizing policy you in essence get rid of any possible room for government policy -- as Cowen suggests in his own comments on the article on his blog, "assume the return on gold follows a random path..." and we're back to debates over commodity-based currencies and laissez-faire political economy.
I believe that some fiscal and monetary policy is optimal, but that is based on my own views: investment is inherently unstable and does not follow the lines of the uncertainty discussed in Salmon's article, because fundamental uncertainty is not probabilistic in nature. Investment is therefore guided by animal spirits, which are only partly a function of rational calculation (not completely determined by it). So government can help pick the economy back up in times of recession through direct investment in the economy.
You could say (a la Einstein) that while God may play dice with the universe, Bernanke shouldn't play dice with the economy.
Nevertheless, philosophical and political backdrops aside, Salmon argues the case poorly because he addresses only one side of the business cycle: the peaks, not the troughs.
Tuesday, May 26, 2009
I'm debating with myself over how to structure my syllabus for Econ 362 (American Economic History) which will be offered Fall 2009.
In the process, I've been asking myself a couple of related questions: 1. What are the broad themes of U.S. economic history, as I see them? 2. What are the most important parts of U.S. economic history, to me?
As you can tell, I've quickly realized that syllabus writing is no objective science. There simply is too much material out there to assign all of it, so the teacher is compelled to mix what he thinks is the best and most comprehensive material out there at the suitable level of difficulty.
For example, the concept of class and its historical implications are, for me, a central part of the story of U.S. economic history. So, I want to include class analysis in as many periods of my syllabus as possible. Examples include:
-pre-Revolutionary colonial governments and the political economy which gave rise to them
-class in the American Revolution
-legal history and class, late 18th and early 19th century
-proletarianization as we move through the 19th century
-trade union politics and welfare states in the 20th century
Since I am interested in race and gender in history and how they relate to economics, I would also want to talk about:
-slavery from the 17th to 19th century, its effects on post-bellum labor markets
-women and work in the 18th, 19th, and 20th centuries
I am also interested in macro history, so I would want to include:
-Chandler-esque industrial history of the 19th and 20th century
-the Great Depression and Keynes
-studies of more recent trends in U.S. history including post-WWII "golden age" and financialization
I encounter some problems as soon as I try to include this third set. When I took U.S. economic history with Gerald Friedman, he concentrated solely on class analysis and integrated slavery and gender issues into that theme. This approach would leave little room for Chandler, serious analysis of the Great Depression, and financialization.
My current dilemma is: can I reconcile my strong interests in class (particularly with regard to the first part of the course) with my equally strong interests in macro histories of competition and investment? (Perhaps if I were familiar with vols. 2 and 3 of Capital, some theoretical framework could be developed? -- just an idea I am throwing out there.)
When I discussed this dilemma with my friend Zhun, he said that if I could find such a unifying framework for U.S. economic history, I'd probably have a dissertation on my hands. But even if I don't exactly come up with dissertation material, it helps to know that by thinking through the problem in this way, I am addressing some important and exciting questions in U.S. economic history.
This post is therefore just a beginning. There is much more to come.
Monday, May 25, 2009
My name is Daniel MacDonald, and I recently finished my second year in the UMass econ PhD program (fields: macroeconomics and economic history). I'm starting up a blog with the intent of working through some ideas for a dissertation topic in an open atmosphere, as well as creating a forum for some of my other ideas and interests. These include videogames (Wii, DS, PS2), books (Murakami, Kafka, poetry), as well as my thoughts and philosophies regarding teaching.
I look forward to growing as a writer through maintaining this blog daily or at least every few days, so I promise to keep the content coming!
Welcome again and I look forward to any comments you may have on my writing,