Fireside this week! I know we’re all anxious to get to the last part of our look at the Roman Republic – a discussion of Roman courts and the legal system – but academic job season is upon us and I needed to take a week to focus on getting some of those applications out. On the flipside, that makes this a good opportunity to talk about how professors in academic fields are generally hired, something I’ve discussed more with Patrons but not on the main blog (worry not, Patrons, for you shall soon have the inside scoop on another, different topic).

Academic hiring is seasonal, based around the academic calendar, because every department wants their new hires to start at the beginning of the academic year in August/September. In practice this means the process works in two rounds, the main affair in the fall (through winter) and then the consolation round in the spring.
The academic hiring calendar starts for applicants with the posting of new jobs, usually in August or September. For hiring committees, putting up the job posting is actually the culmination of a bunch of work getting a job description put together and a tenure line approved, work that started at the latest in the previous spring. In any event, the fall season is mostly for tenure line jobs, which is to say, permanent positions as full members of the department. Generally speaking, jobs are posted in August and September with applications due in September, October or early November, with first round (‘long list’) interviews – usually these days via Zoom1 – in November or December. Then three finalists are selected (there may be a second round of interviews) for the ‘campus visit’ and the ‘job talk,’ which usually take place between January and March.
The ‘campus visit’ generally consists of flying out to the campus in question for what is effectively a multi-day interview. The core of this process is the ‘job talk,’ a presentation of the candidate’s research (usually 45 minutes plus a Q&A), but that ‘main event’ is in turn surrounded by a host of interviews with members of the department, the department chair, the dean or other administrators and so on. There are a lot of stakeholders in an academic job search and because the expectation is that the new hire will be part of the department for the rest of their career (‘forever’), all of those stakeholders want a voice in the selection. Consequently every part of the experience, from the drive from the airport (it’ll usually be a faculty member picking you up) to the meals (generally also hosted by faculty members) to the actual formal interviews, is, in fact, an interview.
By this point the second round has begun: postings for non-permanent positions – adjuncts, post-docs and the like – usually go up early in the spring semester and then move through an abbreviated form of the process. The idea here is, of course, that everyone in the second round is also applying in the first round, since the first round’s jobs are objectively better; you don’t want to interview a whole bunch of people for your post-doc only to find out they all got tenure-track offers (and thus will most certainly not take your post-doc).
Note that effectively all permanent, tenure-line searches have to be full, national searches going through the whole process.2 The days of getting hired to a tenure-line post via the ‘old boys’ network are long gone. However, such informal hiring is often used to fill out adjunct positions (but not post-docs, which at least ought always to have competitive applications), especially since few people will move for an adjunct position, meaning that if you know someone local, you tend to rely on that someone local.
Applications have a set of standard elements, with the full list being: cover letter, CV (curriculum vitae), teaching portfolio, diversity statement, research statement, a writing sample, letters of recommendation (usually three) and sometimes transcripts. The academic cover letter is a pretty rigid genre, with standard expectations on length (exactly two pages, no more and no less) and content.3 While you’ll generally have a ‘standard’ cover letter, these do need to be tailored to each individual job, so the application process involves editing that standard cover letter each time. Academic CVs are also rather different than resumes; I keep a copy of my CV in the ‘about the Pedant’ section (often not quite up to date, I admit), so you can see what they look like; it is much more of a list than a traditional resume, focused on publications and achievements, not skills.
Then you have the three statements: teaching, diversity and research. These are not always required and every application will ask for some different set of them. A full teaching portfolio consists of a ‘statement of teaching philosophy,’ evidence of teaching effectiveness (awards, student feedback, etc.), sample syllabi and sample assignments. Many applications will ask for some combination of these parts rather than the full portfolio. A research statement is an opportunity, over a few pages, to expand on the research described in the cover letter.
Diversity statements are…controversial…and not all applications require them. The statement is supposed to both demonstrate how a candidate handles diversity in their classrooms and also how they address it in their research. They are often mandated by university administrators (often as an excuse to be ‘doing something’ for diversity without actually doing anything for diversity), which means some committees care deeply about them and others consign them instantly to the circular file. In at least some cases (but not all) it seems clear they function as political litmus tests, which would be impermissible for public universities, something I suspect will end up tested in court before too long. On the other hand, given the diversity of college students today, “convince me you can handle a class of people who do not have your background” is a pretty reasonable request when hiring a teacher. My own standard version of this statement – which then gets tailored to each job – tries to emphasize my ability to appreciate and handle different kinds of diversity in my classroom, including not just socioeconomic and ethnic background, but also things like previous military service (an important thing to consider, teaching military history).
That leaves transcripts (‘unofficial’ usually fine, unless you are applying to an institution run by the federal government), letters of recommendation (usually three, from fellow scholars in your field) and the writing sample. The writing sample is typically something article or chapter length and is usually something you’ve published or are otherwise working on; you aren’t creating bespoke writing samples for each job. An article or chapter that has gone through peer review and is published is generally the strongest thing to use.
Naturally, that’s quite a lot of materials. I think everyone has ‘standard’ or ‘blank’ versions of each document, but each still needs to be tailored to each job, which can eat a lot of time. And you will be applying a lot in this system, at least in history: on the one hand, placement rates are horribly low, so you need to apply to many jobs – in practice, every job in your subfield – but that of course means that every job gets a huge pool of applicants. To give you a sense of that, in my field (ancient history) it is not uncommon to see 6-12 jobs in a cycle (not counting teaching for the Department of Defense) and to know that the best ones have 100-200 applicants.
Who do committees pick? Every committee is its own unique creature, a product of the incentives and views of its members, but when you aggregate them all together a few basic statistical patterns emerge. There are two things that clearly really matter: pedigree and time-from-the-PhD. It seems insane for pedigree to matter as much as it does, given that low placement rates mean there are excellent candidates from all sorts of schools not being picked up, but most committees are old fashioned and it isn’t hard to see in who they hire that ‘candidate quality’ is often just a very baroque way of saying ‘received their PhD from an Ivy League school.’
At the same time, it is really clear that even though low placement rates mean excellent candidates do not always get picked up in their first year through the system, your chances of being hired nevertheless decline every year after you get your PhD. Committees prefer the fresh and exciting, and even the best candidate who has been on the market for three years isn’t ‘new.’ Once again, this is old fashioned, a holdover from when placement rates were 60+% and thus it really was the case that most very solid candidates were picked up in the first couple of years. That is no longer true, but hiring committees are generally composed of academics who were hired in the much better days of yesteryear, so they may not have fully internalized how things have changed. There are other incentives that push toward this result too: a fresh PhD doesn’t have as much of a record and so can more easily be sold as everything to everyone, making them a more natural compromise candidate (which matters more as tenure lines become more scarce and thus more precious), and of course it dovetails with pedigree, since the available candidates from the highest-prestige programs are going to be the ones fresh out.

What does all that mean for my job hunt? Well, in practice, it makes things tricky: in a world where most committees are old fashioned, I am looking for a committee willing to get with the times, both in considering someone with a ‘stale’ PhD (thanks, COVID) and in considering someone with an unusual scholarly profile. I have my share of peer-reviewed research, of course, but a lot of old fashioned committees will not consider things like ACOUP – heck, even things like writing for Foreign Policy – as valuable or appropriate scholarly activity; this project can, for some committees, be an active negative (they don’t understand it, it’s scary, they think it isn’t serious, etc.). I think I have a decently strong application all things considered, with a wide range of peer-reviewed and traditional media publications, a large outreach footprint and a lot of teaching experience (with very good teaching feedback), but I need a bunch of things to line up to catch a break here: an open-minded committee whose department needs exactly the sort of historian I am and which appreciates the value of the sort of work I do, comfortable with both military and public history in particular.
That’s a hard square to circle, but there’s no way to find that unicorn of a department except to apply to all of them and hope for the best. Though the upshot is that the departments that won’t bother to give me a second look are probably, on balance, not departments I’d want to be at long-term anyway.
I should note, before we move to recommendations, that I have left out one group of places I apply to: professional military education (PME) institutions, which is to say the war colleges and their equivalents in the other services. Those are their own topic and worth discussing on their own at some point.

On to recommendations!
First off, our brave narrator has created audio versions for another set of posts, in this case the series on generalship!
Next, let me note that my interview with noted naval history YouTuber Drachinifel has finally gone live; we talk about the First Punic War and ancient naval warfare generally. I will admit, I find it kind of funny that of all of the things I have done – peer-reviewed articles, writing in the New York Times, writing for The Atlantic, appearing on EconTalk – this is the thing my students have been most impressed by. I think that says something about my students, and it is hardly a bad thing. Drach is one of the better history YouTubers, being quite careful and source-oriented with his videos.
Meanwhile, I want to highlight this article on retiring chairman of the Joint Chiefs Mark Milley at The Atlantic (in theory that ‘gift link’ should get you past the paywall), not for what it says about him, but for what it says about the nature of the civil-military relationship in the United States. It’s quite a ‘rich’ text on the topic, both in Milley’s own musings on the civ-mil relationship, the American tradition of an apolitical officer corps and what that means for his own actions – some of which he clearly regrets and some of which he is clearly proud of – and in its examination of Milley’s effort to walk the tightrope between loyalty to the Constitution and civilian control of the military when the civilian leader in question appeared to be attempting to overthrow the Constitution.
Also from The Atlantic, I wanted to highlight “Bakhmut, Before It Vanished” (also should be a ‘gift link’). We talk a lot here about conflict and historical wars and it is easy, in the clinical discussions of foraging or cohesion or leadership, to lose sight of the very real human cost these wars impose. So as much as it can be emotionally unpleasant to read, I think there is great value in reading the stories of survivors whose plans, lives and futures were transformed or simply wrecked by war, and in reaching out with empathy to imagine how we would feel in that situation. Because while we can rationally know that ‘war is bad, we should do less of it,’ humans are emotional beings, and so there is value, every so often, in driving home the emotional lesson, that “War is an ugly thing.”4
Finally, for this week’s book recommendation, I am going with a recent release, Wayne E. Lee, The Cutting-Off Way: Indigenous Warfare in Eastern North America, 1500-1800 (2023). This is one of those books I have been waiting to come out for quite some time, as I studied under the author at UNC Chapel Hill and so had heard parts of this argument laid out for years; it is a delight to see the whole thing all together now in one place.
Fundamentally, Lee aims in the book to lay out a complete model of Native American warfare in eastern North America (so the East Coast, but also the Great Lakes region and the Appalachian Mountains), covering both the pre-European-contact system of warfare and how that system changed as a result of contact. In presenting this model of a ‘cutting-off’ way of war, Lee is explicitly looking to supplant the older scholarly model, called the ‘skulking way of war,’ which he argues has been fatally overtaken by developments in history, archaeology and anthropology. As a description of a whole system of war, Lee discusses tactics, the movement of war parties, logistics and also the strategic aims of this kind of warfare. The book also details change within that model, with chapters covering the mechanisms by which European contact seems to have escalated the violence in an already violent system, the impact of European technologies and finally the way that European powers – particularly the English/British – created, maintained and used relationships with Native American nations (as compared, quite interestingly, to similar strategies of use and control in contemporary English/British-occupied Ireland).
The overall model of the ‘cutting-off’ way of war (named because it aimed to ‘cut off’ individual enemy settlements, individuals or raiding parties by surprise or ambush; the phrase was used by contemporary English-language sources describing this form of warfare) is, I think, extremely useful. It is, among other things, one of the main mental models I had in mind when thinking about what I call the ‘First System‘ of war.5 Crucially it is not ‘unconventional’ warfare: it has its own well-defined conventions which shape, promote or restrict the escalation of violence in the system. At its core, the ‘cutting-off’ way is a system focused on using surprise, raids and ambushes to inflict damage on an enemy, often with the strategic goal of forcing that enemy group to move further away and thus vindicating a nation’s claim to disputed territory (generally hunting grounds) and their resources, though of course as with any warfare among humans, these basic descriptions become immensely more complicated in practice. Ambushes get spotted and become battles, while enmities that may have begun as territorial disputes (and continue to include those disputes) are also motivated by cycles of revenge strikes, internal politics, diplomatic decisions and so on.
The book itself is remarkably accessible and should pose few problems for the non-specialist reader. Lee establishes a helpful pattern of describing a given activity or interaction (say, raids or the logistics system to support them) by leading with a narrative of a single event (often woven from multiple sources), then following that with a description of the system that event exemplifies, which is in turn buttressed with more historical examples. The advantage of those leading spots of narrative is that they serve to ground the more theoretical system in the concrete realia of the historical warfare itself, keeping the whole analysis firmly on the ground. At the same time, Lee has made a conscious decision to employ a fair bit of ‘modernizing’ language – strategy, operations, tactics, logistics, ways, ends, means and so on – in order to de-exoticize Native American warfare. In this case, I think the approach is valuable in letting the reader see through differences in language and idiom to the hard calculations being made and, perhaps most importantly, to see the very human mix of rationalism and emotion motivating those calculations.
The book also comes with a number of maps, all of which are well-designed to be very readable on the page, and a few diagrams. Some of these are remarkably well chosen: an initial diagram of a pair of model Native American polities, with settlements occupying core zones with hunting-ground peripheries and a territorial dispute between them, is in turn followed by maps of the distribution of actual Native American settlements, making the connection between the model and the actual pattern of settlement clear. Good use is also made of period drawings and maps of fortified Native American settlements, in one case paired with the modern excavation plan. For a kind of warfare that is still more often the subject of popular myth-making than history, this book is extremely valuable and I hope it will find a wide readership.
- In the olden days, these interviews always took place at the discipline’s Big Conference, which always happens in early January. The shift to Zoom has enabled a lot more flexibility and has led to the end of conference interviews.
- As an aside, this means your academic friend or relative has absolutely no control over where in the country they end up living, so do not ask them where they would like to teach. If there are, say, 10 openings in your field in a given year, you apply to all of them and you go to whichever one takes you, if any do.
- A ‘research first’ cover letter in history has the following paragraphs in order: salutation, then a sketch of the book project, then the project’s contribution and impact, then a paragraph for other scholarship, then a sense of what you can teach, then evidence of teaching effectiveness, then a valediction. A ‘teaching first’ cover letter inverts this order.
- “…but not the ugliest of things: the decayed and degraded state of moral and patriotic feeling which thinks that nothing is worth a war, is much worse.” – John Stuart Mill.
- Itself an ultra-broad category with many exceptions and caveats.
The hardback of the Cutting-Off Way of War is substantially more expensive than the paperback. Is it a library edition or something?
The link to the Milley article doesn’t actually get me past the paywall, FYI.
Ditto.
Double Ditto
Check if it is fixed now.
No, possibly your free article only gives one person a free read?
Both articles came up for me just now.
The Milley article came up for me first try.
Also works for me.
> GIFT ARTICLE
> LINK EXPIRES IN 13 DAYS
> This article was a gift from an Atlantic subscriber.
I listened to Professor Lee discuss the book on https://podcasts.apple.com/us/podcast/ep-87-wayne-lee-on-native-american-warfare/id1589160645?i=1000626064218 and it was very interesting. I hope to read the book soon.
I’ll have to check that one out. It seems like it would go well with Pekka Hämäläinen’s latest book about Native Americans, which covers similar ground (particularly the importance of the trade with Europeans in guns and iron tools, which gave a major military advantage to any groups that had them versus others that did not, and which consequently became a major source of conflict).
This post omits such a wide swath of academic hiring that it feels misleading to a non-academic audience, and it leads me to a question about your application process.
At small liberal arts colleges and community colleges (the kinds of teaching-oriented institutions many of us prefer to work at), the concern about research is often less important or even non-existent (the interview for my current job at a mostly associates-granting private school did not even ask for a research statement). SLACs tend to be mostly but not always teaching-oriented, and the smaller regional state schools run the gamut, though they tend to be more research-oriented when they think they can move up some ranking, so that’s a lot of them. I personally have had several more interviews where I gave a teaching demonstration than where I gave a research talk.
I guess to sum up: this looks like an accurate description of applying for jobs at University of [Some State] schools, and many Regional University of [Some State] or [Some City] State College jobs, but not community colleges, liberal arts colleges, and the like.
The question: Are you looking at community colleges/two-year schools, less prestigious SLACs, and the like? Or are you focusing on places that can support your research interests? I guess another way to ask: are you applying to “ancient historian” jobs, “anything before 1500 (or in Europe or some similarly large category) historian” jobs, or “any historian” jobs?
Although you could say the same in the opposite direction as well. Myself, I stopped at an MS and ended up being a programmer and IT person for researchers… But my wife and close friends are research academics. And, umm, teaching presentations have to my knowledge rarely entered into their hiring. Number of papers – yes. Size of the grant money you are running – very much so. From where – multiple sources or one? Are you bringing in a needed research area, a group we don’t collaborate with, or are you just disposable – 3 years and out if not self-funded?
At least it sounds like you don’t have to pay for the flights for all those campus visits. That’s something.
It seems like they think that, as long as a candidate is from the “right” university, there isn’t a lot of downside to picking the “wrong” one if they’re a fresh PhD recipient. And the upside is that they can fantasize about getting a secret superstar, something which is less likely from someone who has an existing record.
I’ll have to check out the Lee book, since it seems to fit well with Pekka Hämäläinen’s latest book about indigenous warfare and history in the Americas. Big emphasis on this:
Access to guns and iron tools gave native societies something new to fight over (access to trade in this stuff), but also a serious military advantage to any societies that had them in conflict with others.
It also gave them new interests. If you can grow wealthy from furs, you want your neighbors out of the way to enable the flourishing of fur bearing animals.
The Iroquois didn’t care whether they committed genocide or ethnic cleansing as they cleared neighbors.
“At least it sounds like you don’t have to pay for the flights for all those campus visits. That’s something.”
Yes and no. At my university the search committee has to ask the candidate to pay for their own airfare, any ground transportation they need to get to campus, meals and their hotel. As part of the campus visit the candidates go to HR and complete a bunch of reimbursement paperwork. If that paperwork gets screwed up, reimbursement can be delayed by a couple of weeks, which is super anxiety-inducing for recent PhDs or current grad students who may be floating the whole thing on their credit card.
This was even worse in the bad old days (1970s-2010s) when grad students were also expected to schlep themselves to the American Historical Association’s (AHA) annual meeting to interview at the conference as one of twelve “semi-finalists.” You had to pay for that yourself. As a grad student. If you were lucky, maybe you got on a panel for a presentation and your home department would flip you $500 towards hotel and airfare. But that does not go far in getting from the Midwest to New York, Washington DC, or Atlanta. The interviews were done in hotel rooms, sometimes with the candidate sitting on the edge of one bed and the committee sitting in chairs or on the other bed. You can imagine how demeaning this was for the female job candidates being interviewed by three men. The Zoom/phone interview has been a great equalizer on many levels, but the AHA took way too long to get rid of the conference interviews. They should have banned them in the 1990s when long distance calls got cheaper.
If you want I can also tell you horror stories about the “Open Call” interviews that were held in booths in a hotel ballroom at the AHA as recently as 2006.
Any modern history grad student would trade “paying for their own plane tickets and hotels with in person interviews in said hotels” for “More than 10 jobs available in the entire country.”
John Stuart Mill, needless to say, never actually fought in a war in his life.
That doesn’t make him wrong. He wrote those lines about the American Civil War and in particular the justness and necessity of the cause of emancipation. Is he wrong? Is ending the brutal enslavement of millions of one’s fellow countrymen not worth a war?
And for Ukraine, is keeping the Russian army – which has manifestly shown a predilection towards murder, rape and pillage (see: https://foreignpolicy.com/2022/04/06/russia-ukraine-atrocities-war-crimes/) – away from Ukrainian civilians, away from their children (whom the Russian state abducts, see: https://www.atlanticcouncil.org/blogs/ukrainealert/russias-mass-abduction-of-ukrainian-children-may-qualify-as-genocide/) not worth a war?
You must forgive me, but the assumption that war could never be necessary, that no one who fought in one could ever consider it justified – this is a boy’s philosophy, unfit for serious adults. War is always terrible and sometimes necessary. An ugly thing, but not the ugliest of things.
While you’re right on all counts, I am rather curious about your way of arguing this point. You as an educator – despite being in adult education, you’re still someone who wants to teach young people to form (better) opinions on their own – should know better than to harp on age as a factor in a well-formed thought and argument. After all, a child would have a pretty strong excuse for not getting why people can’t just get along.
And if last year and this one should have driven one thing home, it’s that this assumption that war = bad – from which, for some, it miraculously follows that “defending oneself” is still bad (especially coming from individuals in countries in which sanctions interact with interests in cheap resources from Russia) – is not just an attitude of “boys”.
Again, you are right, but I find it ugly not just to unnecessarily gender things here (is the discussion here only for men, so we can safely ignore the “girl” side?), but also to completely equate not just lack of knowledge but absolute ignorance with childhood. Children aren’t stupid, and certainly not ignorant; they are still learning. It’s too often “serious” adults who close their eyes and do not want to see anything that doesn’t benefit them.
In fact, I think a huge issue in today’s world is serious-sounding adults who have an answer to everything, but only answers that benefit them one way or another. That’s what concerns me when people bring up some way of maturity the way you just did. It’s missing the point.
The gender point is fair enough – it is a child’s philosophy, unfit for serious adults.
But I find I am quite comfortable with the age distinction. As children, we, all of us, viewed the world through simple lenses, because that is all we were capable of. Most of us then grow up, both physically but also mentally and emotionally, and it is unworthy of us to keep to childish ways of viewing the world. Those ways of seeing were not wrong for us when we were children, because we were children. We are now grown-ups and ought to act like it.
“When I was a child, I talked like a child, I thought like a child, I reasoned like a child. When I became a man, I put away childish things.” (1 Corinthians, 13:11 – you will have to forgive the gendered language for it is original to the text, but the Greek in this case (ἀνήρ) is ‘man’ both as distinct from ‘child/baby’ (νήπιος) and as distinct from ‘woman’ (γυνή))
I think the important distinction for me comes down to willful ignorance compared to a child’s simplistic view which they have due to inexperience. One thing can be averted by teaching, and the other is still a big issue to solve.
Taking your chosen Bible verse: I see English translations more or less chose “childish”, but when I give German translations a quick look, the same is more often than not translated to “kindlich” (childlike).
Here for example 1. Korinther 13, 11 in the Lutherbibel of 2017: “Als ich ein Kind war, da redete ich wie ein Kind und dachte wie ein Kind und war klug wie ein Kind; als ich aber ein Mann wurde, tat ich ab, was kindlich war.”
Here, I see the same distinction: Not knowing better due to a child’s natural naivety compared to an adult’s chosen ignorance.
At the end of the day, all of that is more about minuscule differences than about correctness, but it is an important distinction to me because I think we often overlook how much of a choice ignorance is when we call it childish and end it there.
The original Greek actually does not use an adjective: it reads literally “the things of the child.”
Or, to admittedly somewhat oversimplify it, there’s a great deal of difference between a war of pure aggression (like said invasion of Ukraine) and a war where you’re defending yourself. Like the Ukrainians in said war.
It’s ugly, it’s terrible…but not fighting would be worse, since it would require accepting the yoke of a conqueror. A particularly brutal conqueror in this case.
The “yoke of conqueror” is pretty language, but a bit too abstract. In the particular case of Ukraine, that “yoke” would entail:
*physical destruction of everyone involved in politics or national defence (as evidenced in Bucha)
*thorough robbing of all movable property by the enemy troops
*organised looting of all non-movable property
*random killing and raping on a large scale (as, again, evidenced in areas where Russians have been evicted)
*eradication of Ukrainian language and national culture (as evidenced by Putin’s speeches)
This is a result which is, and should be, to a patriotic Ukrainian, a fate worse than death. It is a result that means the end of the existence of the nation, and immense suffering for the people. Stopping that result by any means possible is quite acceptable.
I don’t disagree, but I kept it abstracted to “yoke of a conqueror” because you are justified in defending yourself against *any* conqueror- there is no obligation to allow an invader to conquer you.
Hence “a particularly brutal conqueror in this case”.
You support the Intifada, then?
This reply completely misses the point. The issue is not about whether Mill is “right” or “wrong”. The objection is that Mill has no experience with war to justify his opinion being worth more than anyone else’s. So quoting him does not give valid support to any argument. And doing so looks like an attempt to convey some aura of authority by dropping famous historical names.
Man, I cannot believe this historian occasionally quotes the famous sayings of well-known historical thinkers. Surely he must mean this as empirical evidence to prove the point, rather than merely producing the famously eloquent and well-stated articulation of the point he wishes to make precisely because it is eloquent and well-stated (and thus famous).
Yup, that must be it.
Man, I cannot believe my audience is analyzing context and not just dazzled by rhetoric. What do they want from me? Well-reasoned arguments? Substance over style? The ability to understand objections without veering off-topic?
Nah, that can’t be it.
“Man needs more to be reminded than to be instructed.” And most often, the best way to do that is “what oft was thought, but never so well expressed.”
I’m not making a point about morality, I’m making a point about psychology.
In your “test case” for a justified war, the American Civil War, the majority of those drafted did not serve, or hired stand-ins to go in their place. (Mill’s pen pal William James and his little brother Henry both hired stand-ins. James, like Mill, made plenty of grandstanding statements about the nobility of war.)
Hell, World War II, which is most people’s go-to case for a justified war, saw more draft dodging than Vietnam, the corresponding stand-in for unjustified war.
There are others who have expressed similar sentiments, including but not limited to J.R.R. Tolkien, who went through the Somme and lost many of his friends in the trenches. For a rather less sympathetic example, I would also throw in Ernst Jünger.
If ability to speak intelligently about anything requires having personal experience with it, we need to throw out most fields of academic inquiry. It’s not like people who *do* have military experience are all of one mind about the costs and benefits of war, either.
A comment from the STEM side: usually the stronger a department is (especially in the area of the search) the more confident they will be in their own judgement and thus the less they will care about easily legible proxies for quality like ‘pedigree’ and ‘time since PhD.’ Not sure if it works the same way in history.
I realize that law is different from history, and also that I may be biased by my own Ivy/T14 (a legal academic category) pedigree, but I have to say that, as a law firm partner, my own observations of young associates tend to confirm a strong correlation between the institutional quality/prestige of a candidate’s educational institutions and his or her performance. In fact, my own observations also confirm a strong correlation among more senior partners between “pedigree” and intellectual caliber as a professional (though that is far from the only determinant of professional success). So I don’t blame hiring committees for using the same metric.
I am retired from that trade, but in my experience, performance was often anti-correlated with the prestige of the associate’s school. My hires are now at the peaks of their careers and I know who has been successful and who has been left behind.
Remember, you are not a better lawyer than Abraham Lincoln, and he didn’t even finish elementary school.
Statistics beat anecdotes. At Cravath, which is certainly in the running for the top law firm in the country, 26.8% of the associates went to law school at Harvard, Yale, or Stanford, and 24.8% of the partners. So the top law school graduates are about as successful as anyone else in their practice, though not more so. At my firm, which is in the Amlaw 100 but a long way below Cravath, 0.4% of the associates went to those three top law schools, and 2.3% of the partners. So graduates of top law schools somewhat outperform the others at this lower level.
Maybe if I were at Cravath, I would say that it doesn’t matter so much what school you went to–as long as it’s T14 and you were law review.
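To make the arithmetic in those percentages explicit – a rough back-of-the-envelope reading, assuming the associate and partner pools are otherwise comparable – the relevant quantity is the ratio of the partner share to the associate share:

Cravath: 24.8% ÷ 26.8% ≈ 0.93, i.e. H/Y/S graduates make partner at roughly the base rate.
AmLaw 100 firm: 2.3% ÷ 0.4% ≈ 5.8, i.e. H/Y/S graduates are roughly six times overrepresented among partners.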
While pedigree certainly matters for success in partnership and law as a business, I would argue it matters far less in terms of legal acumen and the actual practice of law, especially in court. I am of course biased, as my practice is almost entirely a barrister’s, but at least in my experience, pedigree and law review are essentially useless in assessing ability.
I would add that solicitors’ work is just as important as barristers’, and the business side is important but, I would argue, also an unfortunate requirement. Whether one makes partner should not, in my view, be the judge of legal acumen.
Evidently, the partners at Cravath, Wachtell, Skadden, etc. disagree with you. Since they command the top fees of any lawyers in America, it appears that their clients also disagree with you.
Can’t reply to ey81’s reply, so replying here: fees do not seem like the best indicator of talent and achievement.
Consider two students in medical school. One becomes a neurosurgeon, the other a general practitioner. The neurosurgeon, if competent, will certainly command a higher salary and make more money over a successful career. But the GP could well be a better doctor, except in the one area of conducting neurosurgery.
If you use fees or prestige of position to assess talent or success, you will necessarily select in favor of people who attended the top schools/programs, because you are selecting for people who prioritize money or prestige over other career factors. If the opportunity to then earn the highest fees or gain the most prestige is largely limited to those graduating from the top schools/programs, then those people will have an advantage in success because they have greater opportunity, even with the same or lesser talent.
If (as I suspect they would) the numbers show that a greater percentage of partners than of associates are men, that doesn’t prove men are better lawyers. It suggests that other factors (including gender bias, priorities in life decisions, the possibility of factors internal to the firm which drive away the best women) may be at play, although it does not rule out the possibility of men being better lawyers. You’d first have to rule out the other elements.
If the top law firms, who presumably get the candidates they want, are hiring the best of the Harvard/Yale/Stanford cohort and setting them against the best cohort from other law schools, and the results at partner level show that the “other” cohort slightly outperforms the H/Y/S cohort, it’s odd to conclude that the H/Y/S cohort is better. You might conclude that Cravath hires so many H/Y/S candidates that they’re getting folks in the top 20 from those programs, and folks in the top 3 from other programs. In any event, the statistics you cite do not, on their own, prove what you claim they do.
Related to the Milley piece, the concern about instability at the top of the nuclear chain of command has a history, and a martyr – Harold Hering, who was discharged for asking the question. It’s not promising for what the officers would do in the event of an illegal order – official policy is that whether an order is illegal is ‘need to know’, and the officers turning keys don’t need it.
Much like Stanislav Petrov (happy belated Petrov Day), I’ve thought for years we should honor Hering.
https://thepdv.wordpress.com/2015/08/09/heroes-harold-l-hering-the-loyal-contrarian/
I can’t read the article, but I wouldn’t be surprised if there was a personality mismatch between Trump and Milley. Many successful businessmen are given to ranting and shooting off their mouths (e.g., Elon Musk, who does it more publicly than most), and expect their subordinates to understand when they are serious and when they are not. The military–especially today’s thoroughgoingly risk-averse, careerist military–doesn’t welcome that personality type.
On the other hand, I would note, first, that Trump was about the least militarily adventurist president of recent times, and spared us the total disaster of situations like Libya, where Obama destroyed a passably functioning government, in the process forfeiting our credibility with the Russian government and overturning established government legal procedures, in order to produce total chaos and warlordism. The blood of the recent flood victims is on his hands. Second, the recent record of the US military is one of unrelieved failure, at home and abroad, so Milley has a lot to answer for.
There are a vast number of businessmen who quietly run their businesses without running their mouths. You obviously don’t hear about them, as they stay out of the media except for statements carefully run by lawyers first, but they are a majority.
At the time NATO intervened (France and the UK dragging the US into it), there was already civil war, in which Qaddafi was committing massacres. What you’re describing – passably functioning government, forfeiting credibility, etc. – is what the US did in Iraq, in 2003. 2011 was different, and it’s notable that while most US allies were pissed as all hell about 2003, no such reaction happened in the 2010s.
Iraq? What does that have to do with Trump? You might as well talk about Vietnam. Fortunately, Trump avoided all such foreign misadventures, unlike most of his predecessors.
But I agree, having dragged us into Libya, our European allies have been tactfully silent about its negative consequences. Good to know they have some shame. What they don’t have is good geopolitical judgment: Nordstream 2, anyone?
Neither Iraq nor Libya is about Trump. I’m not talking about Trump; I’m talking about how Libya was already in a state of civil war when NATO intervened. There was about to be a massacre in Benghazi, just as there were massacres everywhere Assad got to reconquer (Homs, Aleppo, etc.).
I suggest you look at “Libya: Examination of intervention and collapse and the UK’s future policy options”
We’ll just have to agree to disagree. My firm view is that foreign military intervention really never benefits the foreigners on whom it is practiced. Libya is no exception, even if the Europeans are not complaining. YMMV.
An intervention in an already violent civil war. This wasn’t 2003, this was 1990-1, or, more relevant to the news of the last few days, 1999 in Serbia.
“My firm view is that foreign military intervention really never benefits the foreigners on whom it is practiced.”
Are you claiming that it is applicable also in Kosovo? Applicable to WW II?
I should be a little clearer: attacks by a foreign military on the government of a country, or a faction within a country, purportedly for the benefit of some or all of the other residents of that country, never do the residents as a whole any good. Otherwise Haiti would be a paradise by now. On the other hand, clearly our actions in World War II benefited the Filipinos, the Koreans, the French, the Belgians, etc. But it would be tough to claim that our actions benefited the Germans or the Japanese, and in any case we never purported to be acting for their benefit.
Trump was / is no less of a hawk than his predecessors – his rhetoric (and in some cases actions) regarding Syria, Iran, Venezuela, and most recently Mexico (?!) was as belligerent as the most hawkish of the hawks. He was just lucky none of his endeavors ended in a war.
Biden on the other hand successfully withdrew from Afghanistan (at significant political cost to himself) and has not started any wars either. (The Ukraine war was not started by the US, and for those who blame NATO expansion, NATO actually expanded under Trump’s term in office as well). I’m no fan of his foreign policy bluster about councils of democracies, “autocracy” etc.., but I sort of expect that of most US presidents and thus far he’s been smart enough to keep it at the level of bluster.
As bad as Democrats are regarding foreign policy adventurism and US hegemony, Republicans are worse.
The burden of proof is on you to establish that it was “luck.” Hawkish rhetoric has a long history of preserving the peace.
Trump committed an act of war against Iran when he had Qasem Soleimani killed. And 110 American soldiers were wounded in the Iranian counter-strike. That things didn’t escalate into a proper war from there looks like luck to me.
“Trump committed an act of war against Iran when he had Qasem Soleimani killed.”
Yes, I was thinking exactly of that, and of things like it.
Fortunately for Trump, within a couple of months of Soleimani’s death Iran had much bigger problems to worry about (they were one of the first countries to be hit by the first wave of COVID, and they were hit very hard, IIRC a bunch of their political and religious elites were killed by it).
“Hawkish rhetoric has a long history of preserving the peace.”
can you give me examples of what you mean?
“At the time NATO intervened (France and the UK dragging the US into it), there was already civil war, in which Qaddafi was committing massacres.”
Without US intervention, Gaddafi would probably have won, there’d have been a few massacres, and a passably functioning government would have returned. Instead we have a Libyan Ulcer in a state of near anarchy destabilising the entire region.
And I don’t think “France and the UK dragging the US into it” is really much of an excuse. The US is the world’s only superpower, they could easily have declined to get involved.
No? Look at Syria as a counterfactual of what happens when the dictator stays in power. There are more than just a few massacres, even after he’s consolidated power; the death toll in Syria, per capita, is several times that of Libya. There’s a lot of global destabilization coming from the continuation of low-intensity warfare by the state against its own people. Libya has some ISIS affiliates, but Syria is where ISIS started as a global force (originally it’s from Iraq, but until its takeover of Raqqa it was not globally significant).
Syria has spent the last twelve years in a state of civil war, in part because the US has been sending aid to anti-Assad forces. Granted if the US had spent over a decade aiding Libyan rebels without overthrowing Gaddafi, that might have been worse than what we actually got, but that’s not the most relevant counterfactual.
The US sent aid to rebel groups fighting ISIS, and at one point was sabotaging anti-Assad efforts by insisting those groups fight ISIS and not Assad even though Assad was doing most of the killing in Syria. At no point was it making a concerted effort to be the rebels’ air support the way NATO was in Libya.
There’s a big difference between Libya and Syria (prior to 2011), which is that Libya is (or rather, was) a comparatively wealthy petrostate, and their statistical indicators seem to suggest that, at least in terms of gross measures like life expectancy, nutritional levels etc., the oil wealth had benefited a broad slice of the population. Libyans had a lot more to lose by continuing to resist or rebel against the government than Syrians did (as they learned – Libyan GDP per capita decreased about 60% in a single year). So I don’t think the Libyan civil war would have lasted as long as the Syrian one did. Without foreign intervention I think Qadhafi would have crushed the rebellion pretty quickly. Yes, there would have been a few massacres and a lot of executions, but then it would be over and stability restored.
Again with “a few” executions. Assad executed 120,000 prisoners (link).
“Again with “a few” executions. Assad executed 120,000 prisoners”
I said “a few massacres and a lot of executions”.
I would bet that most Libyans today would say that 40k political executions (scaling down to Libya’s population size) is well worth it to have the pre-2011 economy and political stability back, but maybe I’m wrong.
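To make the scaling explicit (using rough pre-war populations – about 22 million for Syria and about 6.5 million for Libya, which are approximations):

120,000 × (6.5M ÷ 22M) ≈ 35,000

or roughly the 40k figure above.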
Some hints that you’re wrong and Libyans don’t want Qaddafi back:
1. There’s very little refugee migration originating in Libya. The refugees crossing the Mediterranean are not Libyan – they come to Libya via poorer places in Africa. Compare that with Syria.
2. None of the warlords in Libya is claiming continuity with Qaddafi. They don’t use the all-green flag. They don’t use pan-African socialist anti-Western rhetoric. Haftar has not made any moves to rehabilitate Qaddafi, in contrast with what Sisi has done with rehabilitating Mubarak. Maybe they’re stupid and don’t realize there’s some groundswell of support for restorationism, but unless you have very strong knowledge of the area, which you don’t, you should not assume that an ideology that multiple warlords in a civil war reject is locally popular.
“2. None of the warlords in Libya is claiming continuity with Qaddafi. They don’t use the all-green flag. They don’t use pan-African socialist anti-Western rhetoric. Haftar has not made any moves to rehabilitate Qaddafi, in contrast with what Sisi has done with rehabilitating Mubarak. Maybe they’re stupid and don’t realize there’s some groundswell of support for restorationism, but unless you have very strong knowledge of the area, which you don’t, you should not assume that an ideology that multiple warlords in a civil war reject is locally popular.”
Fair point, and yes, you certainly know more about it than me.
On a separate note, re Trump and the military, there’s a specific US military culture element that doesn’t work well with populist politics. The military hierarchy has many layers of subordination, just like any other hierarchy of that size. A key element of it is that commanders at each level are trained to reinforce the authority of their immediate subordinates over their own subordinates. For example, in the navy, the captain of a ship will not barge into the control room to bark orders directly at the sailors – if the captain’s presence is necessary, the orders will still go via the officer of the watch, in order to reinforce the officer of the watch’s authority.
Civilian hierarchies, whether political or corporate, differ in how much top people reinforce the authority of the people immediately below them. In politics, it’s common enough for politicians to publicly undermine their own instruments of governance and campaigning – “my consultants told me to do X, but we all know they don’t understand Real America so I’m doing Y” is a common line; Beto O’Rourke said that in the Texas Senate race in 2018 to justify campaigning in colleges. The US encourages this behavior in politicians more than most other rich democracies; populist politicians do this more than non-populists; Trump has done this to an extreme extent. A Trump-like ship captain, and probably also a Beto-like ship captain, would be barking orders, teaching every sailor within earshot that the intermediate officers are only good for small stuff, and for anything important, they should wait to hear from the captain.
Note also that this isn’t even “let’s shrink the size of government” – politicians can cut taxes and reduce regulation without publicly undermining the civil service. Nor is this a tendency to have party governance and political appointees replace state governance and a permanent civil service, because such politicians undermine party actors and not just state actors. It’s just a style of bad management in large hierarchical organizations.
So citizens should regard the civil service as their superiors, and themselves as the subordinates?
That’s exactly the attitude that elects people like Trump.
Way to miss the point.
Say you’re building an infrastructure project. The mayor has a habit of canceling plans made by the city’s DOT at the last moment over some backroom deal because some neighborhood notable objected. Is that going to make the voters feel “yep, mayor’s got my back”? Are voters going to be less likely to vote for an anti-system populist? Or is the DOT going to internalize that it can’t make any long-term plans and, over time, hemorrhage every ambitious planner to the private sector?
(Okay, in this case – Eric Adams – this kind of behavior means that his reelection is not guaranteed, which is rare for a New York City mayor. But Adams is not anti-system – he’s machine – and “the mayor is so repulsive he might be successfully primaried in a city where this never happens” is not really the point you make.)
Way to miss the point.
Notice that this anecdote is about the right of the DOT to run roads where it pleases, without regard to those who live where they want them or who will use them.
“Ambitious planner” sounds exactly like the person whom anyone sane wants out of the DOT or any government post.
It’s actually about bus lanes.
And okay, you don’t like having a government. That’s fine. As Bret documents for the Ancient Mediterranean, people who are used to systems of slavery tend to reject government, because it makes them feel like slaves. The Romans had a killer app that bypassed that, through the client-patron system; thus the Macedonians ended up being colonized by the Romans. Likewise, the Southern US underdeveloped state institutions while the Northern states were building canals and railroads with which they could invade the South and destroy its system of slavery.
If you can’t imagine any other defense of your particular governmental overreach than to accuse the other side of not wanting a government, your particular governmental overreach is indefensible.
“It’s actually about bus lanes.”
So much for ambitious planner! Bus lanes can easily be painted over and buses sent to new locations because people bought their homes without regard for the plan. REAL ambitious planners put in subways because they can’t be moved and other people will have to live their lives in accordance with the plan.
No. (If this counts as an argument: it’s been named civil service purposefully, exactly to answer this question.)
Just as in material production, it is more efficient for people to specialize into jobs rather than doing a little bit of everything, in the field of creating coordinated action (a.k.a. governance) it is more efficient if some people do it full-time (and further specialize into subfields) rather than everyone trying to do a little bit of it on the side. Hence a civil service.
Within any individual organization, there is also a question of how much delegation of initiative is taking place. To use military terminology, such delegation is based on a clear communication of commander’s intent. I think the problem with this management style could be described in terms of creating an entirely unnecessary muddle around commander’s intent. Civil servants are hired at public expense, to serve the public and in the example given above they are, in effect, kept uncertain about what their job is.
Famously, in Switzerland the effectively complete plans (with cost estimates attached) are submitted to a yes/no vote by politicians, or in the case of the largest projects, a plebiscite. This produces clear feedback, from which the civil service can learn if it did a bad job and if so in what way, and improve. On the other hand, if the entire feedback is “the mayor thinks (s)he is doing a favor to some unidentified interest group”, that is much less helpful, to put it mildly.
Is the above completely missing the point? Does it still imply government overreach?
Does it try to answer the question?
The question was why Alon Levy thinks the civil servants are entitled to so much deference. Down to and including their being able to implement “ambitious plans” that skip the formality of usefulness.
Delegation is not an answer, because anyone with delegated authority ought to know that it can be revoked at any time. If civil servants are angered at having the delegation revoked, it isn’t being revoked enough.
I don’t think Alon thinks any of that?
– I’m very certain he cares about the usefulness of public projects.
– With less certainty, I think by “ambitious” Alon means projects that are large in scope (big benefits for big costs) and cannot usefully be done piecemeal. Planning such is, as far as I’m concerned, exactly the job that the public hired (these particular) civil servants to do. Then the public (or the politicians representing it) should get to say whether to go ahead with the plans or to put them aside.
– If you mean something outside that by “deference”, I’m afraid I haven’t caught what you’re referring to. Which part(s) of Alon’s comment is/are the offending bit(s)?
Yes, delegated authority can be revoked. The way to do that is to fire the person in question, or at least assign them a task of different scope (presumably not involving authority). For the military version, see our host’s analysis of The Last Jedi. If the admiral wants to revoke the authority of an officer, then remove the officer from their post and put someone else in it.
Alon precisely objected to the idea that the public gets a say in whether the plan gets implemented.
“Alon precisely objected to the idea that the public gets a say in whether the plan gets implemented.”
Er, could you quote the relevant sentence(s)?
The closest I see at this time is that if someone were to assume that any “neighborhood notables” should be taken as rightfully speaking on behalf of the entire population of the area, then
“The mayor has a habit of canceling plans […] because some neighborhood notable objected.”
indeed would mean that. However, the context is the very next sentence being:
“Is it going to make the voters feel “yep, mayor’s got my back”?”
which in my reading implies that this rhetorical question is answered with a “no”, because the interests of the “neighborhood notables” have very little to do with the interests of the general public even in the area, never mind outside it.
“…by claiming it must be some voter who doesn’t count”
They do count – as exactly one voter each. According to our (post-Ind.Rev.) mores, the observed fact that some people have the ear of the mayor in ways that most residents don’t counts as an aberration to be rectified rather than as the proper order of politics. I’m personally open to some ideas for creating legible channels to turn excess economic power into excess political power, to prevent/ameliorate stasis, but the keyword there is legible, and in this case Alon explicitly spoke of a “backroom deal”, an illicit way.
“If you wish to back up your claims, please describe how you would differentiate between the voice of the voters and what some voters say. […] Given that bureaucrats are notorious for [committing injustices].”
I have several ideas, actually.
– First, I keep mentioning that the public (or the politicians representing it) should get to have a yes/no vote on whether to go ahead with a plan.
– Second, if we are on the topic of transportation, it bears mention that most German transportation agencies (Verkehrsverbünde) keep an official Fahrgastbeirat (literally “passenger council”; a “jury” or a “focus group”, to use legal and market-research analogies; or compare the boards of minor magistrates from classical poleis) of 10-20 citizens, otherwise unaffiliated with the agency, whom they ask a few times a year for in-depth feedback. Because the whole process is official and legible (including that the agency straight-up publishes the feedback, even if it sums to “the trains are late, get your act together”), if e.g. the group is nonrepresentative, that is immediately obvious.
– More or less equivalently to the previous point, polling representative subsamples of the population with short questionnaires is a known thing. You could mandate that the agency do that once a year (i.e. hire a market research company to do it; this is a commodity service), then publish the questions and results (so that sabotaging the process by asking stupid/leading questions reliably results in calls for heads to roll at the agency).
“Besides, why shouldn’t a single voter be heard if he’s the victim of monstrous injustice?”
He’s welcome to say it. If people widely agree with him (even without this being a majority opinion), it will make its way into the regular feedback channels and, should it become necessary (because the agency plays deaf, and the reigning politicians also play deaf), be escalated to the level of elections. But where, by the mores of the society in question, society does not agree that he is the victim of injustice – because by those mores he is spuriously objecting to a decision duly made by society (or made on its behalf by its magistrates) – then by the mores of that society he ought to be overruled. This is not a society of idiots: it knows that its decisions can sometimes unduly burden individuals, and it therefore has processes such as suing the government, which people not uncommonly win. But given the textual context (i.e. the very next sentence implying that the voters do not welcome the mayor acting on such objections), this is obviously not the case here.
“Trump was about the least militarily adventurist president of recent times”
This claim doesn’t really stand up to scrutiny in any meaningful way, sorry. Trump made multiple highly risky, aggressive and/or destabilising military decisions.
You have to look at the alternatives, sorry. He was the first president in a long time not to start a war. You can say he got lucky about the geopolitical situation compared to his predecessors – but in 2016, Hillary was promising to shoot down Russian planes in Syria if she got elected.
To give credit where it’s due, Biden has arguably continued the streak of not starting wars. Good for all of us.
He didn’t start wars. That’s unique.
On the other hand, given the diversity of college students today, “convince me you can handle a class of people who do not have your background” is a pretty reasonable request when hiring a teacher.
Only if the statement concentrates on those differences that would actually affect how you would handle them – different political points of view, say. In particular, it would not concentrate on differences that are only regarded as relevant by those who stereotype students by such traits as race and sex.
Out of curiosity: how well do the famous foreign-to-Americans schools count?
Oxford, Cambridge (England, not Massachusetts!), Berlin, etc.?
In math, pedigree will get you the postdoc, but not the tenure-track job (and getting a tenure-track job without three years of postdoc is practically unheard of). By the time you apply for a TT position, committees look at pub lists and rec letters, and count time from Ph.D., and in practice, it’s uncommon for people to get a job at a more prestigious institution than where they got their Ph.D., but the pedigree itself no longer matters at this point. If you have an unimpressive pub list from a prestigious postdoc, they’ll say you got a great opportunity and wasted it.
Looking at the Roman Republic, it seems not to have had academic hiring either to or from Rome – and this is notable because the Republic came close, but not all the way, on both counts.
The Academy itself was destroyed by Romans – namely, Sulla. The last scholarch of the Academy, Philo of Larissa, moved to Rome and taught there for four years… but not as a school (I will explain).
Of the other schools of Athens, Panaetius of Rhodes taught for years in Rome as a member of the Scipionic circle, but after Scipio’s death went back to Athens and became the scholarch of the Stoa.
The conspicuous contrast: the four schools of Athens (Academy, Lyceum, Stoa, Garden) were continuing institutions. They had several teachers affiliated with each other, someone designated as scholarch (though the Academy for a time had the joint scholarchy of Telecles and Euander), and they replaced the scholarch when the post fell vacant. The account of the Academy’s scholarch election in 339 BC (Xenocrates won by a few votes over two other candidates) suggests a sizable formal voting membership – there must have been a way to be formally admitted as a member of the Academy entitled to vote in a future election.
And now contrast Rome. Greek philosophers of high profile visited and taught, but there was no school, no affiliation of teachers, no academic hiring. When the teacher went back to Greece, like Panaetius, or died, like Philo, the students just dispersed – there were no hired replacements.
Oh, and there was no academic hiring from Rome either… which is odd, because Phoenicians did get hired. The predecessor of Philo was called “Clitomachus”, but he was actually Hasdrubal of Carthage. The Lyceum was last headed by Diodorus… of Tyre. The Stoa was founded by Zeno of Citium… expressly a Phoenician. And one of the last scholarchs of the Garden – the one who, unlike the others, survived the Mithridatic War – was Zeno… of Sidon.
Phoenicians were well attested at the top level of the Athenian schools. Now, can you name Romans who taught in Athens during the Roman Republic?
This style of warfare seems like it’s pretty well understood today, except that it’s called “terrorism”. (And not considered a legitimate form of warfare.)
How would the book’s approach deal with the Irish Troubles, or the Israeli-Palestinian conflict?
Regarding legitimacy, as they say, one man’s terrorist is another man’s freedom fighter. I think the legitimacy of “first system” warfare is very much contested today, and what you think of it depends (like with many things) on which group you identify with.
I’d say it’s very easy to tell a freedom fighter from a terrorist by the targets they pick. Soft civilian targets are the sign of a terrorist.
The Marine barracks in Lebanon and the Pentagon are not civilian targets, though the former was evidently softer than it should have been. And every military looks for soft enemy targets.
I think it is more accurate to say that the “cutting off” way of war simply doesn’t comply with modern laws of war. Modern laws do not permit torturing prisoners, raping female civilians, and various other elements of “first system” warfare. Therefore, modern states justly punish practitioners of the “first system,” as all societies punish those who do not conform to current norms in ways that harm others.
I think that works as a nice ideal, but in practice things get gray very fast, especially when you consider government officials, police officers, etc.
If someone, say a disgruntled Syrian Islamist, were to take out Vladimir Putin tomorrow (which would be good for the world, in my opinion), I think the Russian government and media would absolutely consider that an act of terrorism (as they did with the waves of political assassinations in the late 19th/early 20th centuries), even though Putin isn’t a soft civilian target in any morally meaningful sense.
The Palestinian way of warfare is… not that. At all. The most obvious difference is that the Palestinians do not have a unitary polity. There are autonomous actors and have been since the First Intifada, and both their goals and their tactics differ. Those actors often choose tactics based on domestic power struggles – for example, terrorism out of the West Bank nowadays comes not from Fatah (which believes in negotiations to effect a two-state solution) but from armed clans that commit attacks to show their independence from Fatah. It’s not really possible to analyze Palestinian warfare from a realist point of view – one has to be constructivist.
Relatedly, there’s the question of “which territory do the Palestinians want to kick the Israelis out of?”. Is it just the West Bank, or the entirety of the historic Mandate of Palestine? Again, this is not really a realist question but a constructivist one. Different factions want different things, and there is no unitary state that can tell the from-the-river-to-the-sea dead-enders to stuff it.
How is that a difference? The natives raiding the early United States were not known for belonging to a unitary polity. The opposite is true.
“Committees prefer the fresh and exciting and even the best candidate who has been on the market for three years isn’t ‘new.’”
Sounds a lot like the debutante system – if you didn’t have an offer of marriage within about three years of “coming out” your likelihood of an offer went way down.
It’s mostly to do with the relative numbers of positions and applicants. Due to primogeniture and military service, there were always more women than men around during the “season”.
I agree wholeheartedly with John Stuart Mill.
Mr Devereaux,
Does this blog help or hinder your academic career?
Genuinely curious , I can see a case for either.
On the balance, probably hinder. But it also pays for my food and housing, so that’s a plus.
That a project like this one could actually harm your academic career is a stronger indictment of academia than anything else you’ve written.
Yes, but things are changing. Blogging is twenty-odd years old now, but the American Historical Association has come up with an official policy for how university and college history departments _may_ evaluate digital scholarship and activities like blogging.
On the balance, probably hinder.
I thought that might be the case.
FWIW I do appreciate this blog and the effort it takes to produce it. Hopefully times are changing and it will pay off for you in the long run.
….also, ACOUP is of a piece with your broader vocation as an historian, no? Utilizing your expertise and writing ability* to edify those of us who choose to read your public work seems worthwhile to me, in a way distinct from career considerations…. FWIW.
* proofreading not included (:P)
The graph you posted does not show “the strong preference for fresh PhDs over more experienced teacher-scholars being very clear.” It shows the absolute numbers of hired APs, and unless we know the size of the applicant population we cannot know whether fresh PhDs are more or less likely to be hired.
I am a tenured history professor at a regional state university in the upper Midwest (an RII, for those in the lingo). I was hired nineteen years ago as a fixed-term visiting assistant professor because the tenure-track person they had originally hired received a much more prestigious postdoc back East which would give that person another bite at the apple for a research-university job. I talked with the chairperson a few years after I was hired as a tenure-track assistant professor, and they basically said I was the first CV out of the recycling bin that called them back. It helped that I was literally an hour-and-a-half drive from campus, so I could do an in-person interview in May. All this is to say, I am ridiculously lucky – basically I’m a unicorn. So what I have to say about hiring in history and the job market should be understood from that atypical perspective.
I have served on multiple hiring committees, and a lot of what Dr. Devereaux says here resonates with those experiences. As he notes, the trajectory of the job search for the hiring department starts well in advance. It sometimes takes years to get the funding for a full-time tenure-track faculty member, although before 2016 that was a little easier. Since then, our department has had three retirements but no replacements. Even when enrollments were good, convincing the provost to let us hire was an uphill battle. Academic administrators tend not to be from the liberal arts or history. They don’t understand why our US historians can’t also double up and teach ancient China, for example. One academic VP said they wanted to merge our department with Global Studies, because “History, that’s like Ancient Greek, nobody teaches that anymore.” This particular VP was a chemist. Sometimes getting the budget line and approval to hire is a years-long process that involves a lot of conversations with administration. So every hire feels high-stakes and hard-fought before we even begin the search.
Once we had permission, we had to craft the job description in ways that made sense to our discipline of history but also complied with university policies as well as state and federal hiring and employment laws. Every interview question, for example, had to be vetted by the lawyers. Everything went through legal: the job description, the questions for the phone and on-campus interviews, the rating forms, the job talk questionnaires. We especially had to compromise among ourselves, because every new hire nudges the department in a slightly different direction for the next twenty years. In a department of ten people, there were ten perspectives on what that direction should be.
Overall, I think my colleagues were pretty good about not falling into some of the pitfalls Dr. Devereaux outlines here. I would say that some were interested in pedigree, but most were not. We also understood that pedigree meant different things for different subspecialties. For example, UT Austin has a reputation for being a great school for Latin American History, but not so much for Russia or Eastern Europe. So pedigree meant different things depending on what subfield we were trying to hire.
In terms of “stale” PhDs, we were very different from most schools, or from the statistics Dr. Devereaux presented in the original post. I was on three committees, and we all agreed that we wanted candidates who were ready to teach. We threw out applications where the candidate had no prior teaching experience. We did not count TA-ships as teaching experience. Every candidate had to have taught their own class at least as a grad instructor. We gave bonus points on the rating form to people who had one to three years of teaching experience outside their PhD-granting institution.
I think my Americanist colleagues put more weight on research potential, maybe too much. We have a 4/4 teaching load that makes it difficult to have an active research agenda if you need to travel to archives outside North America. But people do have successful research programs at our university and in the department.
I feel really grateful that we were able to hire great colleagues who have been successful at our institution. They have all earned tenure and promotion. I think my university is an anomaly in a lot of ways. We have strong union protections in our contract that have kept Administration from drowning us in a tide of adjuncts. Pay is not necessarily that great (compared to our colleagues in Nursing and Business), but we have the same pension and healthcare as other state employees, which has been a boon to me and my family.
I would also say that politics, in the Republicans-and-Democrats sense of that term, has very little to do with hiring in universities. My scholarly work is in the field of Modern Eastern and Central European History. There are plenty of conservative historians in that subfield and they are well regarded. Same with openly liberal scholars. Everyone is judged on the quality of their work. I also have to say that in my own experience, the “politics” that matters inside a history department is not about Republicans vs. Democrats or conservatives vs. liberals; it’s about course releases, advising, service work, whether the Americanist colleagues think your Renaissance historian colleague’s book being “in press” vs. “forthcoming” is enough to qualify them for tenure. It’s also about different visions of what a history department should be for its students. In that respect I got on a lot better with my conservative Republican colleague than I did with the colleagues who were supposedly more aligned with me politically.
One other thing: when Dr. Devereaux said that there are 100 applicants for almost every position, he is pretty much on the money. I teach at a not very prestigious school, and we never had fewer than sixty people applying for a position. The last hire we did was looking for someone who had a main specialty in modern East Asia (any country) but could also complement our strengths in gender history, legal history, or environmental history. We had 100 applicants for that position. We could throw out 20 of the applications right away as not meeting the basic job description. “We are sorry, your research on forest management in Enlightenment-era East Prussia is fascinating, but it is not Far East enough to count as Asia.” Getting down to ten for phone interviews was tough. Same with bringing the final three to campus. We hired a wonderful colleague who has been a great teacher, scholar, and active participant in the service work of the university. But to be honest, when we got down to the last twenty, or even sixty, every candidate was qualified and could have done the job. There isn’t some “fair” or “impartial” measure to get the right person. It’s three or five people on a committee doing the best they can to evaluate a lot of material quickly and consistently, using the same criteria for everyone. If this process were fair, those sixty people would have jobs.
I want to end by saying good luck to Dr. Devereaux. I wish we could hire you, but I can’t get any traction in my department, college, or university for an ancient history hire for any region of the world. Dr. Devereaux, your accomplishments are outstanding. Your results on the job market are in no way a reflection on you as a scholar. The miserable state of the job market is a reflection of the sorry state of higher education and a graduate school system that promises much and delivers very little. Dr. Devereaux, you’re doing a great job with your public-facing history writing and your regular scholarship. If a committee can’t see that, then you don’t want to work for them.
Good luck with the job search! If I could do anything to help you get a job at my own school I totally would, but I’m unfortunately just a random undergrad.
The article linked says that the person suing stated that they favor “colorblind inclusivity,” “viewpoint diversity,” and “merit-based evaluation.” As at least two of these are common dog-whistles, I would not want to hire this person either.
If putting in the effort to at least *sound* like you aren’t going to be racist in the classroom is too much effort, then who can trust you to put in effort for anything else?
Yep. Having been on a number of hiring committees with diversity statements, I do *not* want to hear about candidates’ values or passions or sentiments or commitments or beliefs. That’s all inside their head. I want to hear about their track record with student diversity. They’re being hired to do a job, not to have good thoughts.
Be more specific.
The theory that people who disagree with you have secret, hidden messages (“dogwhistles”) in their words that reveal their secret evil is clinical paranoia.
Stating you are not going to treat people according to their race is stating that you are not going to be racist.
As someone who used to be libertarian, while coming from a family on food stamps, I have to push back on “has never been about innate rights.” At least some of the time, there is a genuine and non-selfish moral fervor involved: the Non-Coercion Principle, “just leave people alone”, etc. Plus the intellectual appeal of apparently deducing a whole political platform from that single axiom. (In reality, there were unstated axioms about the justice of private property and its current distribution, and of course denial of, or avoidance of thinking about, externalities.)
While the political positions are different, I’ve come to see a lot of similarity in how such libertarians, and many socialists/leftists, think. Similar dogmatism and purity, elevated over practicality.
Cf. Charles Stross’ line: “Libertarianism is like Leninism: a fascinating, internally consistent political theory with some good underlying points that, regrettably, makes prescriptions about how to run human society that can only work if we replace real messy human beings with frictionless spherical humanoids of uniform density”
Wilde’s “The Soul Of Man Under Socialism” seems to me a better answer to a lot of the professed liberty-wanters; “liberty to” rather than “liberty from” is an illuminating dichotomy. (Of course Wilde is not so much not being practical as disdaining the idea of being practical entirely in that essay, but that’s a different problem.)
Personally, I wouldn’t call those dogwhistles, though dogwhistles do tangibly exist – look at the famous Lee Atwater quote, where he explains the use of several. But often, coming from a white person (and as a white person), a claim of colorblindness comes off as, and often is due to, not having done the research. Racism these days is, after all, mostly unconscious prejudice, I think. You can only know you’re not doing it by paying attention to what you are doing and why, which colorblindness is insufficient for; how can you know whether you tend to give (say) one student more time in the classroom than another without paying attention to where that time is going?
Hope this helps.
That’s irrelevant to the claim. You have to judge the behavior by the activity, not the claim. You certainly cannot exonerate anyone who doesn’t claim to be colorblind, and you cannot condemn someone for claiming it.
I have pruned a rather long and winding argument which didn’t seem to me to be productive and in any case involved too much ad hominem.
Mary Catelli – this is becoming a frequent feature of your contributions to the discussion and my patience for it is about at its end. At the very least, I expect you (and your rotating list of interlocutors) to show more civility and intellectual charity. I would hate to have to wield the banhammer, but it does feel like we are getting to that point when every thread devolves into the same pointless arguments.
The source thread got removed, but in it I was asked how I stopped being libertarian; hopefully it’s okay to respond to that.
Though I don’t have a very satisfactory answer; it’s like something simply flipped inside my brain from valuing ideology to valuing outcomes. I’d been in a libertarian-dominated argument over universal health care, and I didn’t really learn any new facts, but had the thought “you know, Sweden sounds like a nice place to live, not some hellhole”, and I slid from there. Even as a fervent young anarcho-capitalist, I had had _doubts_ about certain things: defense, welfare (poor people not starving), the environment, but hoped that we would have figured out good answers by the time we were relevant. After the epiphany, I became more “maybe there aren’t good libertarian solutions for those because there cannot be. Externalities and all, like the _other_ parts of Econ 101.”
So yeah, I pretty much changed, rather than getting convinced by some specific argument. Valuing outcomes and accepting a messy world, over a moral purity that, on deeper thought, wasn’t that pure anyway. (If there’s lots of government corruption and market interference, why should we treat the existing property distribution as a moral starting point, anyway?)
I don’t intend to start an argument about the merits of (US-style) libertarianism here. But I’d defended at least some libertarians (like my younger self) being in it for the moral clarity, not selfishness, and got asked why I stopped, so I wanted to answer that.
I do think pedigree matters too much in the hiring process at most schools, but having now read through that study twice, I am unconvinced that they controlled for the right factors. The most glaring problem with their methodology is that they do not appear to have taken graduate program size into account. I know UW-Madison, Cal-Berkeley, U Mich, and Stanford have humongous graduate schools, and having gone to UW, I can attest that at least one program only cut down to the number of students it could fully fund in the late 90s. The study does not consider how many PhDs each school graduated over the period being studied: if these five programs produced 20% of all PhDs in the job market across the period surveyed, you’d expect them to have 20% of all hired faculty, all other factors being equal. Finding that they account for 13.8% of all employed faculty is only meaningful if you know how many of the potential faculty in the pool were their graduates. That information appears nowhere in their dataset; they never accounted for it.
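To make that base-rate point concrete, here is a minimal sketch in Python with invented counts – the real per-program PhD numbers are exactly what the study doesn’t report – reusing the 13.8% figure above:

    # Invented counts: the study reports the observed share of hires,
    # but not the denominator (how many PhDs each group of programs produced).
    phds_produced = {"top five programs": 2000, "everyone else": 8000}
    faculty_hired = {"top five programs": 138, "everyone else": 862}  # of 1000 hires

    total_phds = sum(phds_produced.values())
    total_hires = sum(faculty_hired.values())
    for group in phds_produced:
        phd_share = phds_produced[group] / total_phds
        hire_share = faculty_hired[group] / total_hires
        # If hiring were blind to pedigree, these two shares would match.
        print(f"{group}: {phd_share:.1%} of PhDs, {hire_share:.1%} of hires")

With these invented numbers, the top five produce 20% of the PhDs but land only 13.8% of the hires – underrepresentation; cut their PhD output in half and the very same 13.8% becomes overrepresentation. The observed share by itself is uninterpretable.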
I know that my own hire was in part because I had a string of visiting AP positions and they were “climbing the ladder” of prestige: from a school ranked 78th to a school ranked in the 50s to a school ranked 13th. The committee chair saw that as an ascending career arc, and thus a positive sign. But more importantly, I was hired because the candidate receiving the initial offer turned the school down because of low salary, and one other candidate had accepted a job elsewhere. (I believe they simply rejected the third and flew me out as a late hire at number four.)
Conversely, I know unquestionably that other departments hire overwhelmingly from the Ivies, and you can see it too by looking at where the faculty of a program advertising a position got their PhDs.
One final factor: letters of recommendation are often pretty important in getting to the campus-visit stage, because generally speaking the interviews might weed out one or two candidates and advance one or two to the visit stage, but most schools invite three. You obviously can’t get a feel for a person from a 30-minute to 1-hour conversation, so committees have to rely on those letters. If you have strong letters either from top scholars in your field or from recommenders known personally to members of the committee, those matter more, because committee members have a better basis to judge what the letter writer is saying. So pedigree comes up again in this regard, but with odd knock-on effects. My first year on the market, I had a “reach” interview with a school because one of my recommenders had been on the faculty there. This sometimes gets read as “doing a colleague a favor,” but I didn’t get past the first stage and I never told that recommender about the interview; I think it was more “we all know this guy, and if he’s being so positive about this candidate, we should take a look even if the CV isn’t competitive.” Those with better academic pedigrees are more likely to be able to secure strong letters from prestigious scholars, and that’s going to be a clear advantage at the initial stage and possibly at the final stage, where everything else is equal but, of the two top candidates, one has strong letters from the best scholars in the world and the other doesn’t.