3.4.26

The Wounded Christ and the Doubting Self: A Humanist’s Approach

I mean, I think anyone who grew up in the Christian Church—especially in a church that displayed the crucified Christ, the corpus, the body of Christ on the cross, either as a corpse or as an agonized, suffering figure—knows exactly what I am talking about. And of course, there are Christian traditions that do not depict the crucified Christ in that way. I have also seen images of Christ on the cross, radiant with resurrection, as though he has passed through suffering yet still bears it. I have seen the risen corpus with the wounds intact. And that, really, is one of the strange things about Christianity: resurrection does not erase the wounds. In John’s Gospel, the risen Jesus still says to Thomas, “Put your finger here and see my hands,” and, more intensely still, “bring your hand and put it into my side” (John 20:27).

Crucified Christ
Credit: Mosan or Rhenish, third quarter of the 12th century (The Metropolitan Museum of Art)


That scene, to me, has always been one of the most psychologically compelling moments in the Gospels, even if it does not always get the same theological attention as the Passion itself. Caravaggio understood that. In The Incredulity of Saint Thomas—an oil painting usually dated to 1601–1602 and now in Potsdam—he compresses the whole drama into four bodies and one wound. Thomas does not merely look. He probes. Christ guides the hand. The whole painting is built around that terrible intimacy: doubt made tactile, faith forced through flesh. 

And I think that is why Thomas matters. He is not Judas, the catastrophe. He is not Peter, the blustering future leader. He is not Paul, the brilliant convert and architect. Thomas is smaller than that, closer to the rest of us. He wants proof. He cannot quite get there on charisma alone, or testimony alone, or rumor alone. And honestly, that is why I have always felt that, in some deep way, we are all Thomas. Jesus says, “Do not be unbelieving, but believe,” and Thomas responds with one of the great cries of the New Testament: “My Lord and my God!” (John 20:28). But he only gets there by way of the wound. 

For me, the Christian message was always bound up with that: Christ undergoes horrific suffering in order to save humanity from sin, from fault, from the catastrophe of itself. The old Easter proclamation, the Exsultet, says it with outrageous daring: “O happy fault / that earned so great, so glorious a Redeemer!” It is one of the strangest ideas in religion—that the wound is not incidental to salvation but somehow constitutive of it, that the disaster becomes the site of redemption. 

And then there is kenosis, that great Christian word for divine self-emptying. In Philippians, Christ is described as one who “emptied himself, / taking the form of a slave,” and who “became obedient to the point of death— / even death on a cross.” That is not just humility. That is divine abasement pushed to its outer limit. Von Balthasar, in a line that gets right to the heart of the matter, says, “no path of redemption can make a detour around it.” (von Balthasar 57). He means the Cross. He means that Christianity does not let you skip over humiliation, abandonment, flesh, blood, or death and still call it redemption. 

That, to me, is the twist in Christianity. Plenty of mythologies allow the gods to visit human beings in disguise or assume human form for a time. But Christianity says something more radical and, frankly, more disturbing: God does not merely visit. God becomes human and then submits to degradation, torture, and execution. The Paschal Mystery is not just resurrection. It is suffering, death, and resurrection. And I think sometimes Christians rush too quickly to the third term because the second one is so brutal. But if you linger there—if you really linger there—it is a terrifying religion. 

Jesus himself, at least in the Gospels, is not mild in the way people often want him to be. He is magnetic, cutting, aphoristic, often severe. He speaks in parables that are simple on the surface and devastating underneath. He tells the rich young man, “go, sell what you have and give to [the] poor,” and then follow him (Matthew 19:21, Luke 12:33). But the rich young man cannot do it. Jesus says it is easier for a camel to pass through the eye of a needle than for a rich person to enter the kingdom of God (Matthew 19:24). That is not a soft religion. That is not spiritual accessorizing. That is a demand for transformation so total that it feels, at times, almost impossible. 

So yes, Christianity can look like the Olympics of the religions: not because it is better, but because it often frames discipleship as complete surrender, the remaking of the self through identification with Christ. There is a rhetoric of totality in it—die to self, take up your cross, be conformed to the suffering Christ—that can be profound, beautiful, and for some people utterly life-giving. But it can also become psychologically dangerous when it turns suffering into a moral requirement or self-erasure into a virtue in itself. That is where I have had to part ways with certain forms of it. I do not need any theology that tells me I must make a sacrament out of my own diminishment. 

That is part of why Kierkegaard’s meditation on Lazarus has always haunted me. In The Sickness Unto Death, reflecting on John 11, he writes: “because He lives, therefore this sickness is not unto death.” He is trying to think through the difference between ordinary mortality and spiritual annihilation. Lazarus dies, yes, but for Kierkegaard the real horror is not bodily death alone; it is despair, the self lost to itself, the death beneath death. That is a powerful insight. It names something real. But it also intensifies the Christian drama to an almost unbearable degree: not only must you die, you must fear the wrong kind of death. 

And I think that is where my own resistance enters. I am not mocking belief. I have known extraordinary Christians—Catholic, Orthodox, Protestant—whose faith made them more humane, more generous, less petty, less cruel. I have also known Christians who were insufferable, controlling, and spiritually colonizing. What I cannot stand is the assumption that one path must become everybody’s path, that one religious imagination must swallow all the others whole. Leave me out of it. I do not need to be assimilated into your metaphysics to have a soul, a conscience, or a meaningful life. A tree has many roots. Water finds more than one way. That is not relativism. That is just spiritual adulthood.

And maybe that is where I am right now, personally. I have been going through it a bit—a low-grade depression, a small crisis, whatever you want to call it. Some mornings it is hard to get up and go to work. And one thing I keep circling back to is this: I do not need to become a servant to self-abnegation. I do not need to imitate the suffering Christ in order to justify my own suffering. I do not need to hurt more than I already hurt. If there is any wisdom I want right now, it is not the sanctification of pain. It is the permission to remain here, in the ordinary, and not despise the ordinary.

That is why Epicureanism, however badly people caricature it, sometimes feels like a healthier corrective than Christian masochism. Not because it says, gorge yourself and forget the world, but because it reminds you that pleasure can be modest, local, embodied, fleeting, and still real. Eat the ice cream. Enjoy the afternoon. Let the warmth of a stupid, passing thing be enough for the moment in which it is given. That is not gluttony. That is not moral collapse. That is simply refusing to turn human life into one long rehearsal for annihilation.

And maybe that is the final thing I mean. Christianity, at its most compelling, stares directly at mortality and says that death does not get the last word. I understand the force of that. I understand why the image of the crucified Christ seized me as a teenager, why the holy cards, the saints, the wounds, the mysteries, the whole theater of it all felt so aesthetically and spiritually charged. But I also think there comes a point when one has to ask whether one is being saved by that vision or crushed beneath it. That is where I am. Still thinking. Still arguing. Still moved by it. Still unconvinced. Still, in some deep and irritating way, Thomas.



22.2.26

Theater Review: Every Brilliant Thing (Hudson Theatre, Broadway)

On Saturday afternoon, I caught Daniel Radcliffe in Every Brilliant Thing—a one-person play (with a whole lot of “you might get drafted” audience participation) now in previews at the Hudson Theatre. I happened to be at the 2:00 p.m. matinee on February 21, 2026, which was the first preview performance (and, yes, there was also a performance later that night).

Daniel Radcliffe stars in a new one-man show on Broadway at the Hudson Theatre.
photo credit: Mary Ellen Matthews

Radcliffe is already onstage when you enter, moving through the space and quietly setting the evening’s social contract: this will be intimate, slightly unpredictable, and collaborative — even if you never lift a finger. Before the show officially begins, audience members are handed cards and assigned small roles. You can feel the play “building its cast” in real time, like a classroom activity that somehow stays tender instead of corny. And then suddenly Radcliffe is in character. The show begins. I loved the magic of the show’s start.

The staging at the Hudson helps. The setup is simple but warm: Radcliffe has a central playing area, surrounded by those old-school filament-style bulbs — big glass lights that make the whole stage glow like a human memory. There’s also seating placed onstage for audience members, reinforcing the show’s main idea: this story isn’t delivered to you so much as assembled with you.

If audience participation gives you hives, here’s the honest truth: you can absolutely attend as a watcher. I did. I wasn’t chosen, and I didn’t volunteer. I sat in the orchestra, content to be a kind of emotional voyeur, letting the story wash over me while other people occasionally stepped into small supporting parts.

And those parts matter because Every Brilliant Thing is not just a monologue. It’s a carefully structured act of communal storytelling about depression, suicide, love, and the strange ways we try to keep each other alive. The premise is deceptively simple: a child begins a list of “brilliant things” (small pleasures, sensory joys, reasons to stay) after their mother’s suicide attempt. That list grows across the narrator’s life, reappearing when hope disappears, returning when it’s needed, vanishing when it can’t do the job anymore.

That last part is key: the play refuses the easy inspirational arc. It doesn’t claim the list “fixes” depression. It doesn’t pretend love is a cure. Instead, it shows what it’s like to keep trying: awkwardly, imperfectly, sometimes hilariously. I resonated with the protagonist’s messy persistence. You love someone and you don’t know what else to do. You love your job but feel empty. But I’m sounding preachy.

What surprised me most is how the play avoids two major traps: sentimentality and preachiness. For a story built around hope, it stays unsentimental—partly because the writing is genuinely funny, and partly because Radcliffe’s timing keeps the piece buoyant even when it turns dark. This Broadway production is co-directed by Jeremy Herrin and Duncan Macmillan (one of the writers), and you can feel the craft: the show is loose enough to breathe but tight enough to land its punches. I noticed a couple of tiny hiccups (the kind you’d expect in a first preview), but what I mostly felt was awe: he keeps the tone safe for the volunteers, keeps the story coherent for the rest of us, and makes the whole room feel like it’s doing something together. 

Radcliffe, for his part, is exactly the kind of performer the piece requires: quick, emotionally nimble, and able to pivot from clowning to rawness without breaking the spell. That skill matters in a show where strangers are asked to become your dad, your girlfriend, your counselor, your librarian, your witness. I felt the vulnerable parts. Radcliffe’s narrator talks about needing therapy even though he’s British. “And it was group,” he says, gesturing at the audience in front of him. The show as therapy. Therapy as the show. Art acts as a salve.

The show also has a broader life beyond Broadway. It first became a phenomenon at the Edinburgh Fringe in the 2010s, has been performed widely around the world, and was even adapted into a filmed stage version for HBO starring co-creator Jonny Donahoe. That matters, because it explains why the play feels so lived-in: it’s been tested, reshaped, and performed in many different contexts, and it’s built to flex. Apparently the script is updated with footnotes so the actor can stay attuned to the fluctuations of the improvisational sections.

Back to the list. By the time Radcliffe reaches the later sections of the story — when adulthood complicates what childhood tried to solve — the list stops being a gimmick and becomes something more like a philosophy. Not “gratitude” in the Instagram sense. Not a bumper-sticker cure. More like this: attention is a moral act. Naming what is good doesn’t erase the bad. But it does carve out a space where the bad doesn’t get total control.

Walking out of the Hudson, I found myself making my own private additions—things I’d forgotten to notice lately:

  • hearing the silence inside a snowstorm
  • watching a film in French without subtitles and realizing you’re following it
  • that weirdly perfect moment of anticipation on a subway platform when the air changes and the express train barrels into the station

The play isn’t really asking you to adopt its list. It’s asking you to remember you can make one—and that making it is, sometimes, a way of staying.

Did the show fix me? No. But it did what art is supposed to do: it widened the room inside my head. It reminded me that a life doesn’t have to be heroic to be worth continuing. Sometimes it just has to be noticed.

Every Brilliant Thing is in a limited 13-week Broadway run at the Hudson Theatre, with an official opening on March 12, 2026, and it’s currently scheduled to run through May 24, 2026. Runtime is about 85 minutes, no intermission. 

17.2.26

Student manifesto: Love of Learning = Desire for Something Bigger than Yourself

When I was a Benedictine monk, we read The Love of Learning and the Desire for God, Jean Leclercq’s masterwork on monastic culture and spirituality. The book is a defining text by a twentieth-century monk of Clervaux Abbey. Its central insight has stayed with me: reading is, at its best, a kind of devotion. We read to go beyond ourselves, not only to gather information, but to pursue spiritual perception. That’s why, in the monastery, we were trained to take a short passage, often from Scripture, and internalize it, returning to the same lines until they began to live in us. This is the practice monks call lectio divina: slow-burn reading that forms the soul as much as the mind.

Take a walk. Ponder a passage.

Here's the universal truth. Learning isn’t just collecting facts. It’s a way of turning your attention outward: toward truth, beauty, justice, craft, community, and the mysteries you can’t solve in one sitting.

When you really love learning, you stop asking only: “Will this be on the test?” You start asking: “What kind of person does this help me become?” and “What does this let me see that I couldn’t see before?”

In older traditions of reading and study — especially the slow, careful reading practiced in monasteries—books weren’t treated like vending machines for information. They were treated like teachers: something you listen to, wrestle with, return to, and let reshape you. Leclercq describes that tradition in The Love of Learning and the Desire for God, where study is meant to lead not just to knowledge, but to wisdom: a life ordered toward what matters most.

So here’s a better definition for our room:

Love of learning = the desire to join something larger than your own ego.
It’s curiosity with a backbone. It’s attention that refuses to be lazy. It’s the choice to be changed by what you read, what you hear, and what you discover.

A sentence to live by

If it makes me more awake, more honest, more capable of wonder, and more responsible to others—then it counts as real learning.


Three simple practices that build this mindset

  1. The “Bigger-Than-Me” Question (daily, 2 minutes)

    • After any text, discussion, or video, write:

      • What’s the biggest question this raises — one that matters beyond my life?

  2. Commonplace + Commentary (weekly)

    • Keep a page with:

      • 1 powerful line (quote)

      • 3 sentences of commentary: Why this matters. What it challenges. What it asks of me.

  3. Slow Reading (Leclercq-style, 10–12 minutes)

    • Read a short passage twice.

    • Circle a phrase that “glows.”

    • Write: What might this be telling a human being about how to live?


One 45-minute lesson:

“Desire for Something Bigger”

1) Hook (5 min):
On the board: “Learning is ______.” Students fill it in. Then add:
“Love of Learning = Desire for Something Bigger than Yourself.” Quick reactions.

2) Text encounter (10 min):
Give a rich paragraph (poem, aphorism, philosophical excerpt, or a key speech). Slow Reading routine.

3) Discussion (15 min):
Use these prompts:

  • What does this text ask you to value?

  • What would change if you took that seriously?

  • What’s the “bigger-than-me” concern here (truth, justice, beauty, freedom, belonging, etc.)?

4) Writing (12 min):
Students write a “Learning Vow” that begins:

  • This year, I want my learning to make me…

  • I will practice attention by…

  • The ‘bigger thing’ I want to serve is…

5) Close (3 min):
Pair-share one line from the vow. Collect for portfolios.



5.2.26

Adjuncting in Graduate School: Money Stress, Meaningful Teaching

Let's get teaching!

In the following post, I write about my time as an adjunct professor. The title might sound tidy and professional, but the reality of adjunct teaching is much less polished: you’re paid by the hour, which usually means you’re paid only for the hours your classes meet each semester. It kept me afloat, but it also made finances feel precarious.

I became an adjunct out of necessity while I was in graduate school in New York City. Between tuition, rent, and basic living expenses, my scholarship and support weren’t enough. I needed side income—plain and simple. But looking back, that time ended up being more rewarding than I expected. It wasn’t just about getting by; it became a formative stretch that taught me how to communicate academic ideas to students whose lives and goals weren’t necessarily built around the humanities.

My own educational background was classic liberal arts: humanities, philosophy, the kind of learning you pursue because you love it, surrounded by teachers and classmates who felt the same. Adjunct teaching was different. Many of my students were pursuing concrete credentials—health-related programs, business tracks, accounting certificates. They didn’t come to class thinking, How can philosophy change my life? They came thinking, How can I finish this requirement, get the degree, and improve my situation? That difference mattered. It forced me to ask a hard, useful question: How do you teach ethics, philosophy, epistemology—material that can feel “extra”—to people who are rightly focused on work, family, and survival?

A lot of my students were adults—some older than me—returning to school to earn a certification or an associate’s degree so they could get a promotion, switch careers, or rebuild momentum. Many of them also carried bruises from earlier schooling. You could feel it: skepticism, anxiety, a kind of guardedness around classrooms and teachers. Navigating that—respectfully, patiently, without condescension—became one of the defining challenges of the job.

And then there was the student body itself. Teaching in New York City introduced me to a level of heterogeneity I hadn’t experienced before: students from Eastern Europe, South Asia, China, and a wide range of Black and brown New Yorkers—many of them immigrants or children of immigrants—sitting in the same classroom, bringing different histories, languages, and relationships to school itself.

That contrast made me look back at my own education with new eyes. I grew up in Louisiana in a lower-middle-class family. We weren’t wealthy, but my parents worked full time, we had stability, and we owned our home. I also had grants and scholarships that opened doors. And yet, when I think back, my schooling was overwhelmingly white. In secondary school, I had one teacher who wasn’t white—Ms. Washington—and in college and graduate school I remember very few Black classmates, and no Black professors that I can recall. I’m not saying that for dramatic effect; I’m saying it because it’s clarifying. It points to something structural—how segregation and opportunity still shape who ends up where in the United States.

That’s part of why “diversity initiatives” matter now. They’re trying to correct a longstanding absence, not merely add a decorative layer of representation. But the work is hard because it challenges established norms—who gets access, who gets supported, who belongs in the room.

My time adjuncting in New York City gave me a lived perspective on all of this. It showed me what a truly mixed classroom can look like—and it also showed me how much effort it takes, institutionally and culturally, to make that kind of classroom feel normal everywhere.


4.2.26

The Fate of the Novel: A Reading of Ian Watt’s Formal Realism


What follows is a long-form reading of Ian Watt’s idea of “formal realism”: the narrative method by which the modern novel embodies the contingencies of lived experience. Starting with Defoe, Richardson, and Fielding, this essay traces how private reading, proper names, and a new sense of time reshape what fiction can claim about reality—and how those claims intersect (and sometimes clash) with philosophy, from Plato’s quarrel with poetry to modern debates about knowledge and selfhood.
How modern is the novel?

Formal Realism

To call the novel “new” is to recognise that the modern sense of the novel crystallised in the early eighteenth century, when writers such as Daniel Defoe, Samuel Richardson and Henry Fielding developed long fictional narratives that departed from romance and epic conventions[5]. Ian Watt credits these authors with inaugurating a literary form that we still call the novel, but his interest lies less in their social circumstances than in the philosophical implications of their work. Watt argues that the fate of the novel hinges on its association with formal realism, a term he coins to describe the narrative method by which the novel embodies the circumstantial contingencies of life. At the heart of this essay are three questions: what can novels say about reality, how do they shape our reading experience, and are they compatible with philosophical inquiry? Watt’s distinction between literary form and philosophy is often overstated—he never writes, as has been claimed, that “philosophy is one thing and literature is another.” Nevertheless, his analysis invites reflection on Plato’s banishment of poets from the Republic and the struggle to reintegrate imaginative literature into philosophical discourse.

The novel cannot be a direct observation of the world; it cannot mirror Kant’s noumenal thing‑in‑itself. Instead, it constructs a claim on reality through narrative. Like the lyric or the play, it is bound to storytelling, yet it is a modern invention that asserts the autonomy of the subject over the epic’s reliance on divine decree. For Watt, what distinguishes the novel is not its subject matter but the way it presents reality. He notes that the novel raises “the problem of the correspondence between the literary work and the reality which it imitates,” an epistemological question that philosophers are well suited to analyse[2]. By focusing on how novels organise words to evoke a world, Watt shifts attention from mimetic accuracy to the form’s underlying logic. This emphasis aligns the novel with modern thought that emphasises individual access to truth and the correspondence between words and things.

The Experience of Reading Novels

According to Watt, the novel promises the closest correspondence between life and art; its formal realism overwhelms earlier narrative forms. Homer’s epics contain flashes of everyday detail, but such realism is rare, whereas the novel devotes itself to the circumstantial. This shift matters because it signals a new reading experience. The epic was part of an oral tradition: in ancient Greece, bards and rhapsodes performed poems like the Iliad and the Odyssey aloud, sometimes with musical accompaniment[6]. By contrast, the novel is read in solitude. While prose fiction long predates the eighteenth century—Satyricon was written centuries before Moll Flanders—the rise of the novel is tied to the emergence of silent, private reading. Scholars debate which work counts as the first novel—some cite Cervantes’s Don Quixote (1605), others Murasaki Shikibu’s The Tale of Genji (c. 1010), Richardson’s Pamela (1740) or Defoe’s Robinson Crusoe (1719)—but the crucial shift is from public storytelling to introspective reading.

The Use of the Proper

One of the novel’s most notable innovations is its use of proper names. Watt observes that eighteenth‑century novelists began naming characters as individuals rather than types. Proper names are paradoxical: they designate a particular person yet remain arbitrary and potentially shared by others. Hobbes explains the distinction succinctly: a proper name “bringeth to mind one thing only,” whereas universal names recall any one of many[7]. Earlier literature used descriptive or symbolic names—Odysseus (“wrathful”) and Oedipus (“swollen foot”)—that situated characters within mythic archetypes. Novels, however, favour combinations of first and last names that sound realistic and subtly suggest character: Pamela Andrews, Clarissa Harlowe, Robert Lovelace, Mrs. Sinclair and Sir Charles Grandison. Even when an alias such as “Moll Flanders” appears, it carries the weight of a full name. By individualising characters, novelists anticipate Lockean and Humean theories of personal identity, which locate identity in consciousness and memory rather than in fixed essences[3].

Reading and Individuality

Theatre‑goers who attended Sophocles’ Oedipus Rex or Shakespeare’s A Midsummer Night’s Dream already knew the plots; the dramatic form, like the epic, is meant to be performed. The novel, by contrast, invites each reader into a private world. Augustine’s Confessions records his surprise at seeing Ambrose read silently: “when Ambrose read, his eyes ran over the columns of writing and his heart searched out the meaning, but his voice and his tongue were at rest”[1]. Silent reading was not unknown, but it was notable enough for Augustine to comment on it. In medieval and early modern Europe, reading often involved vocalisation; only gradually did silent, introspective reading become common. The novel’s introspection builds on this shift. Novels immerse readers in the particulars of everyday life—bathing, laundry, eating a sour grape, making love on an unmade bed—and linger on the mundane. Charles Dickens’s Great Expectations illustrates this attention to detail when Pip traces his fingers over the raised letters on his parents’ tombstone and imagines their physical presence. Such scenes exemplify the novel’s repudiation of epic universals and its commitment to particularity.

What Realism Is Not

Watt famously contends that the novel’s realism does not reside in the kind of life it presents but in the way it presents it[2]. Historians have sometimes defined realism as fiction depicting the “seamy side” of life—Moll Flanders is a thief, Pamela a hypocrite, Tom Jones a fornicator—but Watt argues that this definition obscures the novel’s originality. Realism, in his sense, is not naturalism, scientific pragmatism or a mere truism that novels are slices of life. Rather, it is a narrative convention that treats the world of the novel as if it were based on evidence given by an eyewitness, emphasising verisimilitude in description, time and space. The novel thus distances itself from both idealised romance and confessional rhetoric; it seeks authenticity through form.

Philosophical Realism, a False Step

Medieval scholastic “realism” held that universals—classes, forms or abstractions—are the true realities, independent of sensory perception. Nominalists challenged this view, arguing that only particulars exist and that universals are names. This scholastic debate seems far removed from the novel’s aesthetic concerns. Watt nevertheless attempts to connect the novel’s rise to modern philosophical realism, suggesting that thinkers such as Locke, Descartes, Aristotle and even Plato share a commitment to truth discovered by the individual through his senses. This grouping is strained. Locke certainly emphasises sensory knowledge and argues that personal identity consists in the continuity of consciousness[3]. Aristotle distinguished between universals and particulars but did not adopt a modern empiricist position. Descartes, however, prioritises rational introspection over sense experience. His famous cogito—“I think, therefore I am”—comes after methodic doubt that suspends reliance on the senses. Aligning Descartes with empiricist realism mischaracterises his dualism and overlooks the idealist elements of his thought. Watt’s invocation of Plato and Aristotle may gesture toward a longer history of debates about universals and particulars, but the connection to the novel remains tenuous.

Why Descartes?

Watt sees in Descartes’ prose style a precursor to the novel’s narrative techniques. The Meditations and Discourse on Method are written in the first person and invite readers to follow an individual’s reasoning. Yet this does not make Descartes a realist in Watt’s sense. Cartesian philosophy predates the novel by a century; its sceptical method locates certainty in the mind rather than in the external world. While Descartes describes his environment—a warm room near a fire, the wax that changes shape—these narrative touches serve philosophical argument rather than imitation of everyday life. Kant’s transcendental philosophy, which mediates between empiricism and rationalism, may align more closely with the novel’s concern for how the mind organises experience. Watt’s attempt to find a direct genealogy from Descartes to Defoe obscures the novel’s more complex intellectual inheritance.

Locke’s theory of personal identity offers a more convincing link between philosophy and the novel. In Book 2, Chapter 27 of An Essay Concerning Human Understanding, Locke defines a person as “a thinking intelligent Being… which can consider itself as the same thinking thing in different times and places” and asserts that “consciousness always accompanies thinking, and ’tis that, that makes everyone to be, what he calls self”[3]. Identity persists as far as consciousness can be extended backwards to past actions and thoughts[3]. Novelists literalise this notion by tracing characters’ memories across time; Hume and later philosophers would complicate this further. Such psychological continuity undergirds the novel’s interest in character development.

The Novel’s Sense of Time

Watt notes that novels conceive time differently from earlier genres: they use past experiences as causes of present actions and discriminate time more minutely. Letters in Richardson’s Pamela and the date headings in Clarissa locate events precisely. Fielding satirises Richardson yet still constructs a coherent time scheme. Novels such as Joyce’s Ulysses or Woolf’s Mrs Dalloway compress a day into a stream of consciousness that evokes the flux of mental life. This differs from the “unity of time” developed by neoclassical critics from Aristotle’s Poetics; the unity of time holds that the action of a play should take place within a single revolution of the sun, roughly twenty‑four hours[4]. The novel, by contrast, is historical by nature; it spans years, even lifetimes, and dwells on memory. Ortega y Gasset calls the novel “sluggish and long” because it imitates the languorous passage of time. Later works such as Marcel Proust’s In Search of Lost Time and W. G. Sebald’s The Emigrants foreground memory and temporality. Sebald intersperses his narratives with photographs; in The Emigrants a train‑track image appears alongside the account of Paul Bereyter’s suicide. These images are not simply illustrations but evoke the punctum of memory—what Roland Barthes describes as the piercing detail. The novel thus integrates temporal flux into its very form.

Conclusion

Space, time, plot and character in the novel work together to create an authentic account of individual experience. Watt shows that eighteenth‑century novelists abandoned traditional plots, epic characters and rhetorical flourishes in favour of detailed description, psychological development and causal coherence[8]. Philosophers likewise turned to the individual—Locke’s consciousness, Hume’s bundle of perceptions, Kant’s transcendental subject. Yet aligning the novel directly with philosophical realism risks oversimplifying both domains. Nominalist scepticism about universals encouraged attention to particulars, but the novel’s realism also stems from commercial print culture, the rise of a reading public and a secular interest in private life. Before the novel, fiction was often praised for its rhetorical beauty rather than its reference to reality; the novel claims verisimilitude by imitating human experience while acknowledging the mediation of language. Plato’s allegory of the cave reminds us that all knowledge is mediated: the novel sits between idealism and realism, neither claiming direct access to reality nor retreating into pure mind. Its fate lies in continuing to explore this middle ground, giving form to the flux of life.

PDF Copy for Printing

30.1.26

Story Time: Emotional Support Pickles and Chickens in the Classroom

What if classroom management didn’t start with charts and systems — but with something soft, weird, and surprisingly effective? Meet the emotional-support pickle: a small, sensory tool that helps students reset, refocus, and get back to learning. Sometimes the simplest solutions really do work.

You can purchase emotional-support pickles online—just search for “emotional support pickles” and “plushie”.

For my school's Secret Elf gift exchange (everyone buys a gift for a “secret” person), I received these ridiculous plush emotional-support pickles and chickens. They were a gift, drawn by lot, from the sweet school office lady, "Ms. Lia". They’re oddly perfect for managing the emotional weather of a high-strung middle and high school classroom. I love my school, but some days I just need to hug my emotional-support pickles.

Everyone’s out here talking about fancy classroom-management systems and color-coded behavior charts and the newest acronym-of-the-week. And I’m like: listen. Get some emotional-support pickles. Put them in your classroom. Especially if you teach sixth or seventh grade like I do.

Kids love sensory stuff. They love something tangible. And if a plush pickle helps a kid settle their nervous system and get back to learning, then fine. Call it “emotional regulation.” I call it: the pickle works.

First, you’ll have your Velcro students—the ones who will attach themselves to that pickle like it’s a life raft. They will want it all day. Forever. In perpetuity.

Second, you’ll have… let’s call them the tiny chaos scientists. One or two. The ones who look at an emotional-support chicken and think, What if I took this apart and learned what’s inside?

So yes: you are the therapist in this situation. You are also the bodyguard. You have to protect the emotional-support pickle at all costs.

Note: I don’t make any profit from the sale of these plushies. This post is simply based on my own experience.

And honestly, you can substitute any school-appropriate plush toy and get the same effect: an axolotl, reindeer, oyster—whatever works for your kids.