Shared Links (weekly) Dec 8, 2024
For more like this, subscribe to the newsletter and get these links and more in your email.
Words about your workplace’s great culture ring hollow when team members regularly find themselves putting up with jerks. That’s not a great culture. That’s extra emotional labor, and it’s labor their pay likely doesn’t come close to covering.
We don’t talk about this in terms of emotional labor. We talk about being resilient, staying composed, and so on. We don’t talk about how exhausting it is to know that someone is likely to yell at you every day at work, or that when it happens, nothing will be done to prevent it from happening again. If they take the time to complain and ask for a solution, they’ll be told it’s “just part of the job.”
If the number quoted in the article, gaining four hours per week, is accurate, that’s a massive change. In the last few months, many people in the industry have mentioned fixed-rate billing. It’s been an option for law firms for a long time, but it has gained only slight traction so far.
That might be about to change. It will be hard to justify the current economic model while investing in AI tools that save this much time. The risks involved with “finding” more billable time are too high. I know; I’ve worked as a consultant. Finishing projects ahead of schedule sounds great because it frees you up for other projects, but when those other projects don’t come in, you wind up short on your billable-hour requirements. Four hours a week is roughly 200 hours a year, and suddenly the tools that were supposed to make your work life better make it a lot worse.
You can say that users aren’t allowed to use AI without approval, but as the link above points out, they’re going to anyway. And why wouldn’t they? Most of them are stressed and overworked, and you’re telling them not to use a tool that could cut some of the time spent completing assigned tasks.
Good luck with that.
I’ve been saying this to anyone who will listen to me rant about it. Your IT or Training and Development teams can only go so far when it comes to training for AI. We can do some training around prompting, show people where to click to enter a prompt, and even show them how to integrate AI responses into their work.
What we can’t do is help them judge the results and iterate on their prompts based on what they get back. That requires expertise in your practice area, and it can only come from other lawyers. It’s also the much larger part of the learning curve for working with AI.