Monday, February 18, 2019

My Battery Is Low and It's Getting Dark



"There's a little black spot on the sun today,
that's my soul up there
"
- The Police, "King of Pain."


"My battery is low and it's getting dark."

Of course, these were not the actual last words of the Opportunity rover, which sent its last transmission on February 13th -- a routine status report that was not quite as poetic or existentially charged as its anthropomorphic translation. What set it apart was only that it was the last report Opportunity would ever send.

When I wrote Posthuman Suffering, I was thinking of exactly this kind of relationship between human beings and machines. And the momentary poignancy as this phrase virally flashes across the social media landscape shows us exactly the dynamic I tried to elucidate: we want our machines -- our technological systems -- to legitimize and validate our own pain, in this instance the pain of existential dread.

This object -- an only semi-autonomous planetary rover -- was designed to last 90 Martian days (a Martian day is about 40 minutes longer than one on Earth). It dutifully lasted over 5,000, spending its final moments in a valley, enshrouded by the dark of a major planetary dust storm. Its "dedication," coupled with the finality of its message, affects us on a deep emotional level. It "dies" alone. Its last status message is transformed into a last fulfillment of duty -- calling out to Earth, noting the encroaching darkness and its own dwindling power supply. We are often fascinated by these real and fictional moments: whether it is the HAL 9000's halting rendition of "Daisy, Daisy" or Roy Batty's "tears in rain" speech from Blade Runner, we feel a certain empathy as these fictional and real machines sputter and die.

Where most believed that we were simply projecting ourselves (and our fears) onto our machines, I took it a step further. This wasn't mere projection; it was a characteristic of a deeper, more ontological relationship we had with these machines. Yes, we are sad and lonely because we see our own existential loneliness in the dust-covered rover now sitting, dead, in a distant valley of Mars. But, more importantly, we're satisfied by it. Satisfied not due to any inherent sadism or misanthropy; quite the opposite: we're satisfied because it keeps us company in that solitude.

If you've ever pulled out your smart phone to take a picture in low light, and it gave you a low-battery warning, you received a message pretty much analogous to the one Opportunity sent back to NASA. Yet, in that moment, you're more likely to be angry with your phone than to want to cradle it in your arms and serenade it with David Bowie or Imogen Heap.

But this -- this object that was 54.6 million kilometers away.

And it was alone.

And it was dying.

Of course, there are all sorts of reasons why NASA would "translate" Opportunity's final transmission in such a way (as a way to "humanize" science, or perhaps even out of authentic, heartfelt emotion for a fifteen-year mission that was incredibly successful and coming to an end). Regardless, the reaction on social media, however fleeting it may be (or may have been), falls somewhere between empathy and solidarity.

The object sitting alone on Mars, made by human hands, the product of human ingenuity, partakes in a broader, deeper loneliness that humans also share. Yet, there is no way to share such loneliness except metaphorically. And in this case, it's the humans who make the metaphors. If anything is being extended here, it's not humanity; it's metaphor. The mistake many cultural theorists make is to present this dynamic as simple anthropomorphization: we're personifying "Oppy" (interestingly enough, quite often as female: "she's sent her last message"). But that's not exactly what's happening. We're re-creating Opportunity into something else: through metaphor we are making it into a unique, autonomous, metaphorical entity that can and does feel.

In this posthuman suffering we were extending our autonomy, and all the suffering that goes along with that autonomy. We imagine ourselves sitting alone, reaching out, texting into the dark, hoping for some kind of response; posting on Facebook or Twitter or Instagram because it's not socially acceptable to say "I'm lonely and need someone to speak to and also I know someday I will die and that makes me feel even more lonely and I need some kind of contact."

So we post or text, and wait for authentication and validation.

In many ways, the Opportunity rover is us, alone, in the dark, posting on social media and hoping for some kind of response to tell us we're not alone.

I've often said in my classes that every social media post -- no matter what the content -- is simply a Cartesian expression and can be translated into "I exist."

I say less often in my classes that there's always an existential codicil to these posts:

"I exist and I'm afraid of death."

But now, as I make a turn in my philosophy, I realize that the existentialist in me was too dazzled by the idea of our own consciousness-based fear of death: a survival instinct complexified by a cerebral cortex which weaves narratives as a means of information processing. And when I thought about this in light of technological artifacts and systems of their use, I was too focused on the relationship between human and object rather than on the human and the objects in and of themselves. In other words, I was being a good cultural theorist, but a middling philosopher.

The Opportunity rover is "up there," alone, amid rocks and dust. On the same planet are the non-functional husks of its predecessors and distant relatives. It was unique, the last of its kind. We imagine it in the desolation. We weave its narrative as one of solitary but dedicated duty, amid rocks and dust. When we think about Opportunity, or any of the other human-made objects sitting on the moon, other planets, asteroids, and now hurtling through interstellar space (alone), the affect that occurs isn't a simple projection of human-like qualities onto an object. In the apprehension of the object, we become a new object, an Opportunity/human aggregate that is also constituted by the layers of sense-data, memories, emotions, experiences, and platforms through which many of those phenomena are brought into awareness. Metaphor isn't a thing we create or project; it is the phenomenon of a distributed awareness.

To paraphrase "King of Pain," the speaker's soul is many things:
A little black spot on the sun today.
A black hat caught in a high tree top.
A flag pole rag and the wind won't stop.
A fossil that's trapped in a high cliff wall.
A dead salmon frozen in a waterfall.
A blue whale beached by a spring tide's ebb.
A butterfly trapped in a spider's web.
A red fox torn by a huntsman's pack.
A black winged gull with a broken back.
And, in the context of the song, there are other objects existing that aren't necessarily in the awareness of the speaker:
There's a king on a throne with his eyes torn out
There's a blind man looking for a shadow of doubt
There's a rich man sleeping on a golden bed
There's a skeleton choking on a crust of bread
The first group of objects (black spot, black hat, rag, etc.) is directly equated with the speaker's soul. But the second is not. They are just objects that frame the broader existence of the speaker, embedding them and all other objects in a broader world of objects, distributing the "pain" via the images invoked. The poignancy of the song comes with the extensive and Apollonian list of things, things that aren't necessarily solitary, sad, or tragic in and of themselves, but come to be so when folded into a broader aggregate that just happens to include a human being who is capable of understanding the above lyrics.

Whereas most would say that it's the reader that is lending the affective qualities to these objects, we need to look at the objects themselves and how -- as solitary objects embedded in a given situation, whether "real," "sensed," "imagined," "called to mind," etc. -- these objects create the "reader."

Getting back to our solitary rover, the pathos we feel for it comes from the images we see, our broader knowledge of Mars, our basic understanding of distance, the objects on the desks around us or the bed we're sitting on, the lack of any messages (or of a particular message) on our phone, the dissonance between the expected number of likes, loves, retweets, and comments on our last social media posts and the actual number of those interactions, the memories of when some caregiver may have forgotten to pick us up after karate practice, the dying valentine flower on our nightstand, the dreaming dog at our feet, etc., etc.

We feel for it not as a separate subjectivity witnessing something; we feel for it as an aggregate of the "objects" (loosely defined) which constitute our broader awareness. This is, perhaps, why on some level, for some particular people, at some particular moments, we are more moved by this object on a distant planet than we are by witnessing a stranger's suffering first-hand, or by the larger tragedy of our own dying planet. Certain aspects of this object, plus the objects around us, plus the "objects" of our thoughts, come together in a particular way, creating a particular emotional response.

It feels like the world is "turning circles, running 'round [our] brain[s]," because our brains are constituted by the "world" itself, even if that world includes a planet that we've only actually seen via pictures on the internet ...

... and a small robot, dying alone in the dark.








Monday, January 21, 2019

The Narratives of Things

Each of us lives according to our own narratives of self. Various traditions of philosophy treat those narratives differently. Some celebrate them, holding that if we think positive thoughts or visualize what we want, it will come to fruition. These traditions put thought front and center as the way to progress, all stemming from a Cartesian way of looking at the world in which we literally are (that is, we exist) because we think. The most watered-down version of this comes in the pop philosophy/psychology we often see celebrated by celebrities and talk show hosts: "The Secret," which basically says (spoilers) that if you visualize something hard enough and long enough (i.e., think about it enough), you will achieve it. Adherents will say that it's much more complex than that; but it really isn't.

Other traditions see thought as more of a peripheral aspect of existence. Thinking is a result of the specific structures of our brains, and the stimuli that our embodied brains perceive and process. All thoughts have causes; those causes are materially based. The thoughts we have are determined by those causes. That's not as bleak as it sounds, however, when one thinks of the myriad stimuli to which we are exposed on a daily basis. We have enough of those, in fact, to make us think that we have free will. Our narratives of self have causes, and are not self-generated. In other words, we are not the prime movers of our actions, per se. But just because our thoughts and actions have causes, that doesn't mean we can't and don't make choices. Those choices are determined by causes, but that doesn't negate volition (the ability to choose between the myriad options we have).

For those who have been following my blog and/or my research, you know that I take a philosophical approach that expands the above to include the physical environments in which the embodied mind finds itself, as well as the artifacts we use to negotiate and mediate that environment.

As I've been thinking a lot about, well, thinking, I find that I often fall back on my old training in the field of literature and literary theory. In fact, the first bridge I built from literary theory into philosophy was that we use narratives to understand the world and our place in it. We create narratives not just to explain the unknown, but to integrate ourselves into that world. Even if our narrative is one of a solitary, lupine nature (i.e., the lone wolf), it is still a story of solitude. It becomes a narrative through which we understand our place. And, if we're not careful, these narratives can dictate how we will behave. I find it ironic that people who flinch at the aforementioned implications of determinism are often themselves enmeshed in their own deterministic narratives. They feel themselves "destined" or "cursed" to be [insert emotional/financial/psychological/academic state here].

I think, however, that it is just as philosophically valid to look at "things" -- that is to say, actual physical objects -- with the same, if not more weight than the thoughts that define our narratives. What stories are we creating and telling ourselves through the things that we both passively find ourselves surrounded by and the things we actively surround ourselves with?

The temptation is to think about these objects as "traces" of ourselves; as markers of past achievements; mementos to remind us of events or periods in our lives. And yes, that is true, but in emphasizing that view, we don't think of the effect those objects have on us in the present. I do think, however, that we see glimmers of that when -- usually after a trauma of some kind, either the loss of a relationship, the death of a loved one, or the failure of some major project -- we suddenly decide it's time to redecorate our personal or professional spaces in some way. But we get an interesting shift in perspective if we ask ourselves -- in moments of calm (or at least non-trauma/panic) -- "how are these objects defining and supporting my current narrative of self? What story of self is this object, this space, this environment making possible?"

In a more philosophical mode, we can ask "How does my being supervene upon these physical objects?" Or, "How is my being brought about by the objects around me?"

Most would think that was a psychological issue: objects affect emotional responses. Of course they do. But oftentimes when we think that way, we are looking at the self as a static object. An existence rather than an exist-ing.

If we want to dig deeper into the idea of distributed cognition and the object-oriented ontology I'm getting at here, we need to think of the self as a dynamic, ongoing process.

How do these objects constitute, intervene upon, determine, or otherwise affect the process by which my "existence" unfolds or manifests itself?

So, I'm not asking "what stories are we telling through the artifacts we use and the environments in which we use them?" I'm asking: "How do these artifacts and environments constitute the meaning-making process through which these stories are told?"

At least I think I am ... or maybe it's just time to redecorate.


Monday, January 14, 2019

Academic Work and Mental Health

I've always said to my students -- especially those thinking of doing Masters or Ph.D. programs -- that graduate work (and academic work in general) can psychologically take you apart and put you back together again. It will often bring up deeper issues that have been at play in our day-to-day lives for years.

As I was annotating a book the other day, I felt a familiar, dull ache start to radiate from my neck, to my shoulders, shoulder blades, and eventually lower back. I took a moment to think about how I was sitting and oriented in space: I was hunched over -- my shoulders were high up in an incredibly unnatural position close to my ears. I thought about what my current acupuncturist, ortho-bionomist, and past 3 physical therapists would say. I stretched, straightened myself out, and paused to figure out why I hunch the way I do when I write.

It’s like I’m under siege, I thought to myself.

And then I realized there was something to that.

If there’s one refrain from my childhood that still haunts me when I work it’s “You’re lazy.”

My parents had this interesting pretzel logic: The reason I was smart was because I was lazy. I didn’t want to spend as much time on homework as the other kids because I just wanted to watch TV and do nothing. So I’d finish my homework fast and get A’s so “I didn’t have to work.”

No, that doesn’t make sense. But it was what I was told repeatedly when I was in grade school. Then in high school, on top of all of the above, I was accused of being lazy because I didn’t have a job at 14, like my father did.

And then in college, despite being on a full academic scholarship, getting 4.0s most semesters, making the dean's list (and eventually graduating summa cum laude), I was perpetually admonished by my parents for not getting a job during the 4-week winter break, or getting a “temporary job” in the two or three weeks between the last day of classes and the first day of my summer jobs (lab assistant for a couple of years, and then day camp counselor). Again, according to them, it was because I was “lazy.” My work study jobs during the school year as an undergraduate didn’t count because they weren’t “real jobs.”

And even though I was doing schoolwork on evenings and weekends, my parents often maintained that I should be working some part-time job on the weekends.

So doing schoolwork (that is to say, doing the work to maintain my GPA, scholarships, etc.,) wasn’t “real work.” In retrospect, the biggest mistake of my undergrad days was living at home. But I did so because I got a good scholarship at a good undergrad institution close to home. It was how I afforded college without loans.

But just about every weekend, every break, or every moment I was trying to do work, I was at risk of having to field passive aggressive questions or comments from my mother and father regarding my avoidance of work.

My choice to go to grad school because I wanted to teach was, of course, because I didn’t want a “real job.”

Most confusing, though, was how my parents (my mother in particular) would tout my achievements to family and friends, even telling them “how hard [I] worked.” But when relatives or friends were gone, the criticism, passive aggressive comments, and negativity always came back. It’s no wonder I hunch when I do work. I am in siege mode. It also explains why my dissertation took me so long to write, and why that period of my life was the most difficult in terms of my mental health: the more I achieved, the lazier I thought I was actually being.

Even though I have generally come to terms with the complete irrationality of that logic, I do have to take pains (often literally) to be mindful of how I work, and not build a narrative out of the negative thoughts that do arise as I submerge into extended research. I went back into counseling last summer, mainly because I was starting to feel a sense of dread and depression about my sabbatical, which I knew made no sense. I'm so glad I did.

The things we achieve -- whether academic, professional, personal, etc. -- are things of which we should be proud. Sometimes we have to be a little proactive in reminding ourselves of how to accept our own accomplishments.

And maybe every 30 or 60 minutes, stand up and stretch.






Friday, January 4, 2019

Excavations and Turns

"To take embodiment seriously is simply to embrace a more balanced view of our cognitive (indeed, our human) nature.  We are thinking beings whose nature qua thinking beings is not accidentally but profoundly and continuously informed by our existence as physically embodied, and socially and technologically embedded organisms."
 -- Andy Clark, Supersizing the Mind: Embodiment, Action, and Cognitive Extension, (217).  

I've reached a point in my field-related research that I've internalized certain ideas to the extent that they have become the conceptual bedrock of my current project. However, as I dug up my annotations of Andy Clark's Supersizing the Mind, I realized that I have taken certain assumptions for granted ... and had briefly forgotten that I didn't always think the way that I do about phenomenology, materialism, and particularly distributed cognition. Apparently, as little as seven years ago, I wasn't convinced of Clark's hypothesis regarding the ways in which our cognition is functionally and essentially contingent upon our phenomenal environments. Now, of course, I am. But reading my sometimes-snarky comments and my critiques/questions about his work gave me valuable insight into my own intellectual development, and pointed at ways to sharpen my arguments in my current project.

Seven years ago, I was still thinking that language was the mediating factor in the qualia of our experience. In fact, I had written a chapter for an anthology around that time, working under the aforementioned idea. Now I realize why that chapter was rejected and left to literally collect dust in my office. The rejection of that chapter really affected me, because it was an anthology in which I really wanted to be included. I knew that something was off with it. It never felt quite "right."

Then, filed next to those notes, was a different set of notes written around eight months later. Those notes represented a complete 180-degree turn in my thinking. Unsurprisingly, that chapter was accepted into a different anthology ("Thinking Through the Hoard," which appeared in Design, Mediation, and the Posthuman). That piece was really the beginning of my current journey. I suppose Clark's ideas had "sunk in" with the help of other authors who pointed out some of the broader implications of his work (like Jane Bennett and Peter-Paul Verbeek).

There are a couple of takeaways from this anecdote: 1) As academics/researchers, our ideas are always evolving. Several philosophers, including Heidegger, experienced "turns" in their thinking, marked by a letting go of what seemed to be foundational concepts of their work. My own work in posthumanism has made a couple of turns from its original literary theory roots, to an emo existential phase, to its current post-phenomenological flavor. 2) Embrace the turns for what they are. There are reasons why we move on intellectually. Remembering why we moved on is helpful when anticipating critiques of our current thinking.







Monday, December 31, 2018

Sabbatical: The True Meaning of Time

So, no apologies or grand statements about the fact that my quiet academic blog is now alive and awake again. No promises as to what it will become, or how often I will update. I'm going to let this evolve on its own. Like all the best things I do, I have a sketch in my head as to what I'd like this blog to be while I'm on sabbatical -- as I research and write what will hopefully be another book. But things happen and unfold in interesting and unpredictable ways. I have been doing a great deal of research in the past several months, all in preparation for what will be several months of concentrated work.

For those who may not be familiar with what a sabbatical is or how it works, it's basically a paid leave from one's usual responsibilities on campus in order to do intensive research or writing. Most universities grant year-long sabbaticals, but since Western isn't the most cash-flush or research-oriented university, our sabbaticals are one semester long ... we can take a year if we'd like, but at half-pay. Since I can't afford to live on half of my salary, I opted for the semester-long sabbatical. Sabbaticals are something for which faculty have to apply and be approved. It's a multi-step process that requires proposals and evidence that one has actually done research while away. Once you are on the tenure track, you can apply for a sabbatical once every 7 years.

This is my first sabbatical. So I have no idea what to expect nor can I wax philosophical on what it's like. 

I can say, however, that this will be the first time I'm not on an academic schedule since I first started going to school. And I don't mean grad school. I mean Pre-K. My years have been portioned by the academic calendar since I was 4. Elementary school. High school. College. Grad school. Teaching. There were no breaks. I have always either been in a classroom as a student or as an instructor since I was 4 years old. I am now 46. You do the math. Sure, there are semester breaks, but this was the first time I entered a semester break without having to think about the next semester's classes. It was less disorienting than I thought it would be. 

Not many people who aren't teachers understand exactly how much time and energy teaching requires. I normally have a teaching load of 4 classes per semester. I'm physically in the classroom for 3 hours per course per week (spread out over a Monday/Wednesday/Friday or Tuesday/Thursday schedule for each). I am also required to have at least 5 hours per week of "office hours" for students. So that's 17 hours per week of teaching/office hours. That doesn't include class preps, grading, committee work, meetings, and the administrative side of directing the philosophy program. Most days, I arrive on campus by 8:30am and leave after 5pm. Most days before or after that I'm prepping/reading for classes, grading, or doing paperwork. Weekends are the same. When I leave for the day, I bring work with me. 

I get up at 5am on weekdays in order to have a little under 60 minutes to do my own research. Semester breaks are also times when I've been able to do my own research. But 1/3rd to 1/2 of those breaks are filled with writing recommendations for students, prepping for the next semester's classes, and dealing with the inevitable committee work that brings me to campus during those breaks. 

With a sabbatical, 85%-90% of the above work goes away. 

This is why sabbaticals are precious ... because they give us time.

Time to let the big thoughts develop. Time to sit down and THINK. Time to actually read something that isn't a student paper or a committee report. Time to write through a problem without looking at the clock and thinking about how you're going to make Kant into a remotely interesting class. Time to focus on your own work instead of the at-risk student who has been looking really tired in class and probably isn't eating because they just got dumped by their fiancee or their dog is sick or they flipped their car over for the 3rd time in 2 years. Time to sit in quiet instead of dealing with yet another new directive from administration to fund raise or recruit even though you have zero experience or expertise in doing so. Time to read relevant writing in your field instead of being asked to justify the importance of your field or to report back as to exactly where your students from 7 years ago are working now and how your classes got them that particular job. 

There is time. 

Time to recharge myself so that when I do return, Kant will be an interesting class. Time to become re-invested in my field and feel legitimate as an academic again so that I can pay better attention to my students and reach out when I know they're at risk. Time to research so that when I return I have evidence of exactly how important my field is, and exactly why studying it isn't just important, but imperative to making students marketable to employers. 

There is time for me to focus on me, so that I can eventually focus better on my job and doing it well. 

That's what sabbatical is all about, Charlie Brown. 






Wednesday, August 29, 2018

Posthuman Determinism: Possibility through Boundaries

In my "Posthuman Topologies: Thinking Through The Hoard," I end on a somewhat cryptic note about "posthuman determinism." In all honesty, that was one of those terms that just came out as I was writing that I hadn't thought about before. For me, it was a concept that served as a good point of departure for more writing.

As I'm deep into a new project (and heading toward a sabbatical for the Spring semester), the idea has come to the forefront, with the help of a wonderful book called The Incorporeal: Ontology, Ethics, and the Limits of Materialism, by Elizabeth Grosz. As she questions and re-frames the relationship between the ideal and the material in the works of the Stoics, Spinoza, Nietzsche, Deleuze, Simondon, and Ruyer, she provides a thoughtful critique of materialism (and, consequently, "new materialism" -- my own sub-specialty) that reinvigorates certain points of idealism while maintaining the importance of the material substrate of existence. It's a similar maneuver to Kant's critique of the rational/empirical dichotomy in the 1700s.

Thankfully, as I started taking notes on Grosz's book, the idea of "posthuman determinism" kept coming back, and with it, a journey back to the core of my philosophical worldview: how do the artifacts which we use -- and which surround us -- contribute to the self? Note here, I'm not saying "contribute to the idea of the self." While we may have ideas of who we are, my position -- as a posthumanist, post-phenomenologist, and new materialist -- is that the objects which surround us and their systems of use are essential and intrinsic parts of the very mechanisms that allow ideas themselves to arise. Ideas may be representations of phenomena or mental processes, but the material of which we are made and that surrounds us makes representation itself possible. This means that -- unlike a Cartesian worldview that puts mind over matter, and privileges thought over the material body which supports it -- I place my emphasis on the material that supports thought. That includes the body as well as the physical environments that body occupies.

In that context, a "posthuman determinism" is a way of saying that the combination of our physical bodies and physical spaces those bodies occupy create the boundaries and parameters of experience; and, to a certain extent, create boundaries and parameters of the choices we have and our capacity to make those choices. Our experiences are determined -- not predetermined -- by the material of which we are composed. The trick is to think about the difference between "determinism" and "predeterminism." In relation to the human, the former states only that all events are determined by causes which are external (read, material) to the will; while the latter implies that all human action is established in advance. Determinism emphasizes causality while predeterminism emphasizes result. That is to say, ascribing to a deterministic philosophy implies only that human action always has a cause: that specific factors guide how human beings express their will. Predeterminism implies that the specific choices that humans make are somehow established in advance and that each of us is moving toward a specific, fixed point. That would mean that our choices are themselves illusory, and that regardless of what we choose, we will arrive at a specific end.

Ascribing to a deterministic worldview does not mean -- despite what people critical of philosophy may tell you -- that nothing matters and that we are not responsible for our choices. In fact, quite the opposite: in a deterministic philosophy everything literally matters. We are responsible for our actions by understanding the causes and conditions that supervene on our decisions. What factors affect the choices I have, and how do those factors contribute to my own decision-making processes? That is to say, what factors instantiate the mechanisms through which I make my choices? From my materialist point of view, I believe that our ability to think and our ability to choose are bounded by the material properties of our bodies and the world around us.*

So although I may ascribe to a certain posthuman determinism, I still believe in "free will," but one that has specific limits and boundaries. To us, there may seem to be infinite choices we can make in any given situation; and, indeed, there may be many choices we can make, but those choices are not unlimited. A person can't imagine a color that isn't a shade, variation, or combination of a color (or colors) that person has already seen. We can't imagine an object that isn't some component, combination, or variation of an object that we've experienced before.

None of the above is new. Both Hume and Descartes say similar things, although Descartes's (and to some extent, Kant's) valorization of the mind's ability to conceive of things like infinity and perfection is meant to prove that the mind can move beyond its physical limitations. For me, however, that's the mind moving within them. Infinity is a concept born of one's learned awareness of time and space.

All in all, there are limits and boundaries to free will. But those boundaries are what make volition itself possible. We can only think and act through our physical bodies and the physical world those bodies occupy. Boundaries are not necessarily prohibitive; they make things possible, and give shape to the specific qualia of experience itself.







*Someday, will our computers be powerful enough to calculate the myriad physical properties around us and predict our behavior based solely on our brain chemistries coupled with the properties of the physical world around us? I think if humans survive long enough to develop that technology, then, yes. At that point, I do think the machines will literally think FOR us, transforming the human species into something very different than it is now -- something beyond the realm of our imagining ... literally. We can't think of what that thinking would be like because we literally do not have the biological capacity or the material support to allow us to think that way.




Tuesday, March 6, 2018

Research, Sabbaticals, and the Reality of Higher Ed

It has been quite a while since I've posted, and -- for once -- it's for a good reason. I've been working on some new research which is very timely and somewhat sensitive, in that I'm hoping it's the start of a larger, hopefully book-length, piece. I was recently granted a sabbatical for the Spring semester of 2019. While a year's sabbatical would be more conducive to research, my university only grants year-long sabbaticals at half-pay, which wasn't feasible financially.

I won't get into the details of my current project here, but I hope to be posting more often, writing what I envision to be "parallel" pieces that indirectly relate to what I'm working on. Apologies for the intrigue, but sometimes when you've got a really good project that you think has legs, you want to keep it under wraps for fear of being distracted or getting "scooped." It's an aspect of posthumanism that hasn't really been explored in any meaningful way, and I'm hoping to be one of the first to do so.

It's an interesting feeling now, post-promotion to full professor, to establish a research agenda that -- while tempered by demands of my own field -- is my own. As academics, we often find ourselves driven by the desire to land positions that offer some kind of security amid various market pressures and political attacks. And even when we do find those positions, we're faced with internal pressures to engage in research that will ensure tenure and promotion. In most cases, academic freedom allows us to research what we'd like, but we also know that it has to be something publishable. And even then, as economic pressures on higher ed tempt universities to re-create themselves according to certain "identities" (i.e. we are a "destination" or "technical" or "public service" university etc), we find that rushed and panicked marketing campaigns begin to trickle down into discussions of liberal arts and general education: "perhaps if we taught more of [insert fundraising magnet field here], then we'd get more money."

It's especially frustrating for me when the perspective and knowledge I've gained from posthuman studies shows that the competing and popular fields pushing these discussions forward are doomed given the demands of the coming decades. You can see the paths ahead to create curricula and programs that could make an institution a real force, but you're told -- directly -- that there have to be donors to support those changes. "Show us a donor with eight million dollars and we can talk about it." When those words can be spoken aloud -- to faculty -- at a university, it's hard to engage in a research agenda not affected by those forces (whether it's to try to attract money or to purposely entrench in one's own research agenda out of classic academic spite).

Both extremes are destructive.

I'm not going to stand on the perspective of tenure or promotion to justify my position, because tenure and promotion mean nothing when your program is eliminated. But I can and will speak from the perspective of two decades' worth of experience. I know that to be an effective instructor and researcher, I need to engage in the research that speaks to my own passions and interests. I also know professionally that I have to adapt and shape those results into something that is marketable. And if it doesn't fit into the newest identity one's university is trying on for size, it has to be marketable enough to be published, and perhaps get a little attention. Even if a professor isn't publishing in the most popular majors, universities will still plaster their pictures up on website splash pages to tout their faculty's achievements.

My own research has taken a turn into something that is both meaningful and important to me but could also be timely and popular (well, as popular as academic writing can get). And my upcoming sabbatical is a chance for me to lose myself in it without dealing with the institutional noise and growing list of tasks that are being heaped upon faculty on a daily basis: write the copy for your program for our marketing materials for the 6th time in five years because we've fired the last five marketing people and have no idea where any of that information is; come to this campus discussion about how we're going to revolutionize our curriculum to the point where we're "encouraging" you to add certain content into your own classes; call prospective students to convince them to come.

At a teaching university, all of those are things that take me out of the classroom and interfere with my primary duties as an instructor. All of those are things that directly interfere with my face-time with students. All of those are things that contribute to the fatigue that makes me pass on sitting on committees that could actually make a difference. Some instructors make the transition from professor to fundraiser, although the titles they are given mask that fact: "Director" or "Dean" of something seems much more palatable than "chief fundraiser." The one token course they might teach a year becomes a peg upon which whatever pedagogical integrity they had is precariously hung.

I do, however, understand the need for people who can chase millionaires and billionaires for funds which are desperately needed to keep universities afloat. It's become a sad reality. And I have no problem speaking to parents and prospective students when they visit campus; I do see that as an aspect of what I need to do in order to actually remain employed. But my old mantra which I've said to the multiple marketing people who have come and gone has been "you get them into the classroom and I'll keep them here." That, sadly, is no longer enough.

It's ironic that sabbatical will take me out of the classroom which I so enjoy -- and have always enjoyed. It's not the classroom or the students from which I need a break, it's literally everything else. I am, in fact, very nervous to be without that classroom energy for a semester, because my students have always sustained and inspired me. But, in the bigger picture, losing myself in research will be a way for me to re-charge my classes and give the students the experience they all deserve.

"Your sabbatical isn't a break," I was told by an administrator at my university, who weeks before had told me that despite my "excellent proposal" I had "about a 50/50 shot" at getting sabbatical due to budget cuts.

But it is a break. A break from the things that distract me from what I do best. When the burdens of non-teaching duties and increased pressure to do the jobs of others encroach on my class preps and time with students, then stepping away from that for even a semester IS a break. And during that time, I'll tap into the excitement of research that was the core of what allowed me to become a professor in the first place. As I said to a student recently, I knew early on that I wanted to be a professor, but my initial problem was that I saw research and the dissertation as a hurdle or impediment to that goal rather than the path to it. That research was a foundation upon which to build a career; a springboard for my passion to teach.

So, after twenty years, it's time to revisit my foundation, inspect it, and shore it up where necessary. I know I'll be a better professor for it.