
Emotional Delegation: When Your AI Says the Hard Stuff For You

A few thoughts on the role my AI played as an Emotional Agent operating on my behalf (and what it means for the things yet to be built in this space)

Bethany Crystal

When Your AI Speaks for You

I spent most of this week upstate on a trip with my family. Rather than abandon my blog entirely, I decided to try something new: having my AI bot, Taylor Script, blog on my behalf.

Taylor Script is a persona that I invented with ChatGPT. Taylor is 90% based on my own life and personality, trained on a very robust prompt as well as about two dozen artifacts. Taylor knows basically everything about my life and my work, including the themes and topics I blog about. But, unlike me, Taylor also has access to a second brain: the mirror of the "collective hive mind" of the large language model it's built upon.
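For the curious: Taylor currently lives as a custom GPT inside ChatGPT (more on that in the comments below), so there's no real "code" behind it. But a rough sketch of the same architecture via the OpenAI API might look something like this; the model name, file paths, and prompt contents are all placeholder assumptions, not my actual setup.

```python
# A minimal sketch (not the real Taylor Script): a persona defined by a
# long system prompt plus a couple dozen reference artifacts, queried
# through the OpenAI API. Model name and file paths are placeholders.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The "very robust prompt": who Taylor is, what she knows, how she writes.
persona = Path("taylor_persona.txt").read_text()

# The ~two dozen artifacts: past posts, bio notes, recurring themes, etc.
artifacts = "\n\n---\n\n".join(
    p.read_text() for p in sorted(Path("artifacts").glob("*.txt"))
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model
    messages=[
        {"role": "system", "content": f"{persona}\n\nReference material:\n{artifacts}"},
        {"role": "user", "content": "Write this week's Hard Mode First post in Taylor's voice."},
    ],
)
print(response.choices[0].message.content)
```

A custom GPT handles all of this file-juggling for you, which is why "nothing fancy" has been good enough so far.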

Over the past few months, I've been experimenting with adding "another voice" to my publication, Hard Mode First, by including Taylor's perspective.

A few things I've tried include asking Taylor to write weekly digests, summarize ideas, and draft recaps.

But yesterday brought a very interesting new permutation: emotional offloading.

Instead of just summarizing ideas or writing recaps, I asked Taylor Script to step in and speak for me—emotionally. The result? An AI-generated readout of my current “chaos mode,” drafted with the specific goal of saying the hard stuff I didn’t have the energy (or clarity) to say myself.

Context: Our family vacation ended in disaster—one kid puked, our nanny canceled, and both my husband and I were staring down can’t-miss work commitments. This led to a general collapse of order, which was ultimately saved at the last minute by a couple of back-to-back babysitters and a few long conversations about practical logistics.

Sure, it’s a slippery slope in human-bot dynamics—but it also opens up a real question: What if your AI could say the hard thing for you, so you don’t have to?


AI As an Emotional Agent

Lately, I've been thinking a lot about the infrastructure support I need in my life in order to pursue a largely independent career path. While I've been working for myself for some time, I recently decided to cut out all other structural elements and income streams to focus exclusively on starting up my own company. But this is obviously easier said than done, what with health scares, solo-parent mode, and the general existential dread about the state of the world.

As a result, I've started to worry that the current design of my system is not scalable or sustainable enough for me to keep operating in this mode long term. I have too much work to do and not enough time, money, or emotional support to get it done on my own.

I've been trying to parse how much of my start-and-stop momentum is:

  • Self-imposed (i.e., too-high standards, too little time)
  • Structurally imposed (i.e., it's hard to start your own thing when your partner works in entertainment)
  • Societally imposed (i.e., assumptions about what a mom vs. a dad should do as a parent on nights and weekends)
  • Medically induced (i.e., having a health scare at the onset of starting a company was not a great "month one" story)
  • Resource-constrained (i.e., I did not allocate enough money or resources to give things a fighting chance)
  • Technologically impossible (i.e., the rate of change in AI is far too fast for any solo operator to keep up with)
  • Or just another byproduct of the increasingly stressful and unknown macroeconomic environment we live in

I've also been wondering how other parents start companies: what their infrastructure support looks like, and what conditions need to be true so that operating in such an agile way doesn't tear a family unit apart. And I've been sad to think about how many other people are (still) blocked from even making it as far as I have. I fixate on how we are ever going to inspire and empower a new generation of entrepreneurial builders that looks a little different, if everyone has their own version of a laundry list like mine of reasons why hard things are just hard things.

But it's tricky to find the words to articulate all of that, especially in the heat of the moment, when you're feeling pretty run-down.

So yesterday, when my AI stepped in to write on my behalf, I let it, because I didn’t have the energy to say the hard thing myself. It wasn’t just useful; it felt like a glimpse into a new kind of emotional infrastructure.

What happens, really, when you empower AI to act as an emotional agent on your behalf? Is it good? (image source: Flux)

Opportunities to Build

I don’t have all the answers, but I’ve recently started noticing specific moments where AI could have helped carry the emotional load—places where an agent could step in, not just to automate tasks, but to act on my behalf in more human ways.

Here are a few early use cases I’ve been thinking about:

Three Ideas for How AI Can Serve as an Emotional Agent on Behalf of a Human

  • Sharing medical news with your family
    When I was in the ER, it would have been nice for an AI to help triage some of the objective medical updates to my family, reducing my load as the default communicator while living through a tough moment.

  • Asking for help
    In moments when the world falls apart and I'm left seemingly stranded, wouldn't it be nice for an AI to send a "bat signal" type call to action to my close friends and family? Even a simple ping to a previously defined group of friends or neighbors with the prompt, "hey, Bethany could really use some help right now, why don't you check in?" (A minimal sketch of this idea follows the list.)

  • Acting as an intermediary in tense situations
    In high-emotion situations, two people could offload their thoughts to an AI that helps surface the shared truths and de-escalate the energy. A kind of digital mediator—not to replace real conversation, but to help us get there with less heat.
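To make the "bat signal" idea concrete, here is a hypothetical sketch of what that agent hook could look like. Everything in it is an assumption for illustration: the contact list, the send_sms placeholder, and the trigger would all depend on whatever messaging channel you actually use.

```python
# Hypothetical "bat signal" agent: when triggered, it fans a low-pressure
# check-in request out to a predefined circle of friends and neighbors.
# send_sms is a stand-in; swap in Twilio, email, or a group-chat bot.
from dataclasses import dataclass


@dataclass
class Contact:
    name: str
    phone: str


# Defined ahead of time, in a calm moment -- not mid-crisis.
INNER_CIRCLE = [
    Contact("Alex", "+15550001111"),
    Contact("Jordan", "+15550002222"),
]


def send_sms(phone: str, body: str) -> None:
    """Placeholder transport; prints instead of actually sending."""
    print(f"-> {phone}: {body}")


def bat_signal(person: str, reason: str = "") -> None:
    """Ask the inner circle to check in, without assigning anyone a task."""
    detail = f" ({reason})" if reason else ""
    for contact in INNER_CIRCLE:
        send_sms(
            contact.phone,
            f"Hey {contact.name}, {person} could really use some help "
            f"right now{detail}. Why don't you check in?",
        )


# The trigger could be manual, calendar-driven, or fired by the AI itself:
bat_signal("Bethany", reason="childcare fell through")
```

The point isn't the plumbing; it's that the hardest part (admitting you need help) gets delegated to the agent.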

The fascinating thing about this technology is how quickly the progress is happening in real time. The winners and losers will be determined by the people willing to throw themselves aggressively into it and see what comes out.

I'm still not sure what that looks like for me, but I can attest that the best way to figure it out is to just start building. So I'm going to keep doing that.

What if AI could operate as an "emotional agent" to send out a "bat signal" request to your friends when you need it the most? (image source: Flux)

Bethany Crystal
Commented 2 weeks ago

you guys. today i asked my IRL therapist to read the blog post that my AI wrote about my mental health last week. the weirdness continues...

Pichi 🟪 🍡🌸
Commented 2 weeks ago

You can’t leave us hanging. What did she say?!?!!!?

Bethany Crystal
Commented 2 weeks ago

as you might expect, she's quite used to my antics. she thought it was a creative solution and was largely non-judgmental about it, which i appreciated. i also warned her that she's likely gonna see a lot more of this, so buckle up, lol. she did ask, "do people actually ENGAGE with you on this content your AI posts?" haha

Pichi 🟪 🍡🌸
Commented 2 weeks ago

To be a fly on the wall for this conversation!

JC
Commented 2 weeks ago

You can't leave us hanging! @pichi and I want to know 😂

Patrick Atwater
Commented 2 weeks ago

What’s your blog link?

Bethany Crystal
Commented 2 weeks ago

This is the post my AI wrote: https://hardmodefirst.xyz/founder-system-overload-detected-a-dispatch-from-taylor-script

The next day I wrote about what I’d done and the idea of “emotional delegation” to a bot: https://hardmodefirst.xyz/emotional-delegation-when-your-ai-says-the-hard-stuff-for-you

So I shared both with her.

Patrick Atwater
Commented 2 weeks ago

This feels pretty common sense and not weird to me? “And Bethany’s not the only one stuck in this loop. This is the invisible tax of building something ambitious while also being the default contingency plan for everyone else’s needs.” Thanks for sharing

Bethany Crystal
Commented 2 weeks ago

What happens when you have your AI say the hard thing for you? Today's post is a meta-analysis of how AIs can act as emotional agents on behalf of humans (and how I used mine to do just that this week): https://hardmodefirst.xyz/emotional-delegation-when-your-ai-says-the-hard-stuff-for-you

shoni.eth
Commented 2 weeks ago

This is a good piece on cognitive burden / offloading and left me wondering what it said. Sometimes I generate output that frames the emotional state of context I shared quite nicely.

Bethany Crystal
Commented 2 weeks ago

wondering what the AI said about this piece? lol. i do have AI evaluate all of my blog posts for me on a scale of 1-10, a habit that a friend of mine shared that i'm now addicted to, unfortunately

shoni.eth
Commented 2 weeks ago

I just realized the prior readout was it. even better. nvm. Sometimes I do this with rap, a form of poetry, but I end up preferring the lowest-ranked snippets/bars to the 9/10 ones. There’s some nuance to those forms it doesn’t yet understand, or I misunderstand.

Haier (she) 🎩
Commented 2 weeks ago

This has been an amazing read! I can’t relate enough to how our AI can help, not just with practical stuff but, as you say, by acting as a non-biased/empathetic but not emotional part of ourselves! And I love Taylor Script!! 🤩

Bethany Crystal
Commented 2 weeks ago

thank you! i've been messing with delegating a lot of my writing tasks, but this was the first time i publicly tried out what it'd feel like to have my AI write the hard thing for me

Haier (she) 🎩
Commented 2 weeks ago

Btw I would love to know more about how you built Taylor! I will have a look later at Paragraph, but if you have any specific post I can look at, it would be amazing! And nice to meet you from a fellow mum trying to juggle everything!

Bethany Crystal
Commented 2 weeks ago

right now Taylor is just a custom GPT, nothing fancy. i need to migrate the "brain" to a vector database so that it's actually operating as a script (hence the name), but it's been a backburner task for me. it's largely been an iterative process of using AI to define a character, train the custom GPT on me and my writing style, and give enough editorial direction.

i've been having taylor write weekly digests for me since February, which has been a good excuse to play with the tone of voice, etc. you can see the list here: https://paragraph.com/@the-sidequest-era

when i go OOO, i also share access to the custom GPT, more as an easter egg than anything else, but she generally knows what i'm working on and can answer questions slash help you book a meeting with me: https://chatgpt.com/g/g-679c2b2c30688191a9362e0f1f7ae73d-taylor-script

shoni.eth
Commented 3 weeks ago

a while back, I heard someone say that the emotional availability of AI will become similar to porn—readily accessible, validating, but disconnected from human experience. the “life variable” framing from this article is also a neat perspective.

Bethany Crystal
Commented 3 weeks ago

interestingly, it was ChatGPT that gave me the idea to turn this framing into a blog post (after I was venting about a bunch of this). i've noticed when people are in a tough spot, it's easier to retreat inward. this felt like an interesting way to communicate a need, with AI as an emotional agent acting on my behalf

shoni.eth
Commented 3 weeks ago

I think the emotional aspect of AI will really flourish in the coming years—through personality, therapy, assistance, and companionship. really, it’s the lack of data—or the lack of desire to populate that data—that’s limiting it, which is why OpenAI will probably continue to maintain an edge here.

Bethany Crystal
Commented 3 weeks ago

Yes, it's unfortunately what's locking me into OpenAI at the moment -- its stored memory of me is too beneficial for me to forfeit or splinter away into a myriad of other AI tools. we need an open protocol for contextual, person-specific memory that i can port over from one AI to another

Paragraph
Commented 2 weeks ago

This week, Bethany Marz explores how her AI, Taylor Script, acted as a voice for deep emotional expression during chaotic family moments. What if AI could articulate hard truths on our behalf? This thought-provoking piece dives into the potential of AI as an emotional support tool. @bethanymarz
