JORDAN COPELAND
I've been editing videos for twenty years. I've been directing creative people for ten. And I've been obsessed with the creative applications of generative computing since reading those leaked LaMDA transcripts in the summer of '22.
These three things are converging.
THE EDITOR
Twenty years at the timeline. Still the favourite part.
THE DIRECTOR
Assembling comrades. Finding the vibe. Keeping the chaos productive.
THE SLOPWRANGLER
Generative video for productions that care about the result.
Watch my work on the phones I edited it on.
iPhone
2007
Final Cut Pro, a Kensington trackball, and the BBC on line one. Everything was Standard Definition and nobody had opinions about aspect ratios yet. The phone in my pocket was for calls. The work was on tape, then FireWire, then a hard drive the size of a brick. Simpler tools, but the cutting was just as hard.
iPhone 4
2010
The first phone with a screen worth watching video on. By now I had a studio in Hackney, clients walking in off the street, and a growing suspicion that the edit suite was becoming portable. Retina display changed what "good enough on mobile" meant. It meant good.
Galaxy S III
2012
The Android years. Bigger screen, worse colour. But phones were eating broadcast by this point and the work had to look right on both. I spent more time testing exports on handsets than I’d ever expected to. The Galaxy lived on my desk next to the client monitor, a reality check for every grade.
Galaxy S8
2017
Edge-to-edge. This is when the phone stopped being a preview device and started being the destination. Tentacle was running by now, and half the deliverables we produced were built portrait-first. The S8 sat in a mount on my second desk, playing loops while we cut.
iPhone X
2017
The notch. The gesture bar. The death of the home button. By this point I’d directed two documentaries, run a company for three years, and started wondering what would happen when the tools got smarter than the process. The X was the phone I was holding when I read those LaMDA transcripts five years later. The work on its screen looks nothing like the work on the original 2007 iPhone’s.
I work with people to make sure their videos are incredible, and credible, whatever that means in each context.
The Work
01 What do you actually do?
I edit videos. I direct them sometimes. And increasingly I build the generative pipelines that let productions do things their budget says they can't.
The editing is the foundation. Twenty years of it. Short-form, mostly: brand films, music videos, concert films, livestreams, social content, the occasional documentary. Everything from thirty-second Instagram cuts to four-hour live broadcasts.
The directing grew out of the editing. When you spend enough years fixing other people's rushes, you develop strong opinions about what should have been shot differently. Eventually someone lets you prove it.
The generative work is the newest layer. Two years of intensive tooling across sixty-odd platforms, plus a research project called Guava and regular lectures for young filmmakers. The pitch is simple: with the same money, people get to make better things.
02 What's the difference between editing and directing for you?
The edit is where the work lives. Directing is assembling the conditions for the edit to be good.
When I direct, I'm thinking about what the timeline needs. Which angles give me options, which performances have the energy, whether the coverage allows the cut I can already feel. Most directors think about the shoot. I think about what comes after it.
That's not better or worse. It's just what happens when you've spent two decades inside timelines before you step onto a set.
03 What's a Slopwrangler?
Generative video tools produce beautiful, unpredictable, frequently wrong output. The industry calls it "slop." I call it raw material.
A Slopwrangler takes generative output and makes it do what the project needs. That means knowing which model produces which kind of thing, how to art-direct a machine that doesn't understand art direction, and when to throw away three hours of generations because the fourth one has something alive in it.
It's editing, really. Just with different inputs.
04 What does "incredible, and credible" mean?
Incredible: it has to make you feel something. Credible: it has to feel true to whoever's behind it.
A music video for an artist nobody's heard of needs to feel like a discovery, not a marketing exercise. For a global brand, the film has to feel like they believe their own message. And a concert film? It needs to feel like being there, not like watching a recording of being there.
The comma between "incredible" and "credible" is doing a lot of work. They're almost opposites and you need both.
The Career
05 How did you get started?
Film school, then a runner job I was terrible at, then six months of pretending I could use Final Cut Pro until I actually could. First real client was the BBC. First real lesson was that nobody cares how clever your edit is if the deadline's passed.
I freelanced for eighteen years out of various East London studios, worked my way through CNN, Nike, Samsung, YouTube. Made two documentaries. Played in several bands at the same time, which is excellent training for the logistics of creative collaboration if nothing else.
06 What was Tentacle?
A creative editing and video engineering company I ran for four years. Started as a way to formalise the team I'd been assembling informally for a decade.
Tentacle did concert films, fashion films, livestreams, stadium visuals, SaaS product videos, the occasional art project. We pivoted hard into live streaming and remote production during 2020 for obvious reasons, and came out the other side with a speciality in high-stakes live broadcasts.
I closed it to retool for generative work. The company did what it was supposed to do. Four years of managing a team taught me how to brief creative people, how to argue about budgets, and how to ship complicated projects with too many stakeholders. Those skills turned out to be more useful than the company itself.
07 Who have you worked with?
The list I'm supposed to recite: Adidas, Nike, YouTube, Red Bull, BBC, CNN, Samsung, Channel 4, Universal Music.
The list I actually care about: Alabaster dePlume, GoGo Penguin, The Comet Is Coming, Shabaka, Super Best Friends Club, Primus, Beth Orton, Alewya. Musicians who trust me with their work and let me take risks with it.
Also: two seasons of being embedded with The Police during their reunion tour, which is a documentary story I'll tell you over a drink sometime.
08 What's the most complicated project you've done?
Swedish House Mafia's livestream, probably. Four hours, live to air, five cameras, a warehouse in Hackney dressed to look like a Stockholm basement, and a director (me) making cut decisions in real time while the biggest DJs in the world played to a global audience.
The closest runner-up is three seasons at London Stadium for West Ham. Not technically difficult in the same way, but politically complicated. Every screen in the ground, sixty thousand people, a club with strong opinions, and ninety seconds to land the energy before kick-off.
Generative AI
09 When did you start using AI?
Summer of 2022. I read the leaked LaMDA transcripts and spent the rest of that week unable to think about anything else. By September I had accounts on every tool I could find. By early 2023 I was using Midjourney to pre-vis a music video. By mid-2024 I was running multi-model pipelines with agentic code managing the prompt strategy.
The acceleration is real. What takes me an afternoon now would have taken a team of three a week in early 2024. What took that team a week was impossible in 2023.
10 Does AI replace people?
No. This is the wrong question and I'm tired of hearing it.
AI replaces specific tasks that were previously expensive or slow or both. A Slopwrangler doesn't replace a VFX artist. A Slopwrangler does the things a production couldn't afford to hire a VFX artist for. The budget was going to be spent either way. The question is whether it buys twenty minutes of generic stock footage or sixty seconds of something made specifically for this project.
Every technology in the history of filmmaking has triggered this conversation. Editing software didn't replace editors. Digital cameras didn't replace cinematographers. The tools change. The need for someone who knows what to do with them doesn't.
11 What's the pitch? Same money, better things?
Yes. That's literally it.
A music video with a treatment that requires twelve locations and a budget for four. Generative tools handle the eight locations the production can't physically reach. The director gets the film they wrote. The artist gets visuals that match the song. The budget doesn't move.
A brand campaign that needs character consistency across nine scenes. Previously that's a casting call, a wardrobe department, two shoot days. Now it's one shoot day and a pipeline that extends the results into every scene the budget couldn't cover.
The money doesn't go down. The work gets closer to what it was supposed to be.
12 Which tools do you actually use?
Sixty-odd accounts at last count, but the ones that earn their keep:
IMAGE: Recraft V3, Flux, Midjourney v7.
VIDEO: Kling, Runway Gen-4, Seedream.
AUDIO: ElevenLabs, Udio.
EDIT: Premiere, Resolve, TouchDesigner.
3D: Blender, Notch, Unreal.
LANGUAGE: Claude, GPT, Gemini.
The landscape shifts every few weeks. I'm not loyal to platforms. I'm loyal to results.
13 What is Guava?
An HCI research project. The short version: I'm studying how creative professionals actually interact with generative tools, what works, what fails, and where the friction is.
The longer version involves a Hetzner VPS, several Telegram bots, a personal knowledge system called Wisdom, and more Python than I expected to write in my forties. If you're interested in the methodology, ask me at a lecture or buy me a coffee.
The Philosophy
14 Is pain and struggle necessary for creation?
Every atom in the universe is heading towards entropy, which is to say a nice lie down. Making things is the opposite of that. So yes, it's work. It has to be.
But "struggle" gets romanticised by people who've confused suffering with seriousness. The best work I've made happened when the conditions were right and the tools were good and the people involved trusted each other. That's not painless. It's just not painful for the sake of it.
15 What do you think about when you're editing?
Rhythm, mostly. Whether this cut lands on the beat or just after it. Whether three seconds of silence will make the next shot hit harder. Whether the viewer needs a breath here or whether we should keep pushing.
It's closer to playing music than to writing. You feel when it's right. The intellectual justification comes afterwards, if anyone asks.
16 What's the future of creative work?
Fewer people doing more. Not because people become unnecessary, but because the tools let a small team match the output that used to need a department.
The difficult bit isn't the technology. It's the business model. A freelance editor with generative tools can produce work that would have required a studio. But the studio charged studio rates. The freelancer charges day rates. How does the freelancer capture the value of the output rather than billing for the hours? That's the question I spend most of my time on.
17 Why "Slopwrangler" and not something more professional?
Because the job description doesn't exist yet and the serious-sounding alternatives are all lies.
"AI Creative Director" implies you're directing the AI. You're not. You're negotiating with it. "Generative Video Specialist" sounds like a job at a consultancy. "Prompt Engineer" is already a punchline.
Slopwrangler is honest. The raw output is slop. The job is wrangling it into something that serves the project. And it's memorable, which is what a job title is actually for.
Personal
18 What instruments do you play?
Any musical instrument except a flute. Mainly guitar, piano, and bass, but the list also includes drums, mandolin, ukulele, harmonium, and a brief, ill-advised period on trumpet.
I played in several bands through my twenties and thirties. Still do, when schedules align. The musical training is more relevant to editing than people expect. Rhythm is rhythm.
19 What do you do when you're not working?
I have two daughters, so the honest answer is: parenting. Beyond that, playing music, arguing about films, cooking badly, and reading about whatever I'm currently obsessed with, which at the moment is the history of printing and how every previous leap in reproduction technology followed the same pattern we're seeing now.
20 What's Total Wealth?
A state where my whole family is wealthy enough for me to spend all my time making things for fun and playing music with my friends.
There's no way to get there on a day rate. The gap between what I earn per day and what "wealthy" means is too large for linear accumulation. I need to find a seam where old skills and new tools combine to create disproportionate value.
Most of what I do now is oriented around finding that seam. The generative work, the lectures, the research, the tools. All of it is exploration with a commercial purpose.
AI for Creative People
If someone in your team feels either threatened or underwhelmed by AI, they're probably using it wrong. This talk is practical, honest, and built on two years of daily use across sixty tools. Principles anyone can apply to their creative work, with examples from real productions.
Available for teams, conferences, and film schools.
Three projects where generative tools changed what the budget could buy.
OpenAI Super Bowl Pre-Vis
SLOPWRANGLER
The director had a treatment. Budget covered maybe a third of it. Everything else needed to exist as pre-vis, convincing enough to get approval for the shoot, detailed enough that every department could plan from it.
It started with World Labs, building spatial definitions of each environment: camera positions, light sources, depth relationships. Those fed into Recraft for keyframe generation, iterating until the compositions matched the director's references. Then the video models. Sora for architectural interiors (it handles straight lines). Kling for character movement. Seedream for anything that needed to feel organic.
Fifty-nine assets. Pre-vis that the production team could actually block a shoot from. The kind of thing that would have been a separate line item on the budget, with a dedicated pre-vis studio, three weeks of lead time, and a conversation about whether it was worth the investment. This was one person and a pipeline.
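The model-routing step in that pipeline can be sketched roughly as follows. The model names are the real platforms mentioned above, but the scene categories and the selection rules are illustrative assumptions of mine, not the production code:

```python
# Illustrative sketch: route a scene description to a video model.
# The mapping mirrors the division of labour described in the text;
# the category names are hypothetical.

SCENE_MODEL_MAP = {
    "architectural": "Sora",     # interiors; handles straight lines
    "character": "Kling",        # character movement
    "organic": "Seedream",       # anything that needs to feel organic
}

def route_scene(scene_type: str) -> str:
    """Pick a video model for a scene type; fail loudly on unknown types
    rather than silently defaulting, so a human reviews the gap."""
    if scene_type not in SCENE_MODEL_MAP:
        raise ValueError(f"No model assigned for scene type: {scene_type!r}")
    return SCENE_MODEL_MAP[scene_type]
```

The failing-loudly choice matters in practice: an unmapped scene type is a creative decision, not a fallback.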
Kettama — Sort It Out
DIRECTOR / SLOPWRANGLER
The treatment called for a liquid metal ball that appears in seventeen shots, tracking with the camera, interacting with light and reflection in each environment. Plus a melting monster sequence in the final act. The director wanted it to feel physical, not composited.
The live action was shot on Alexa. Clean plates, tracking markers, the usual VFX prep. Then the generative work: frame-matched AI generation for each shot, using Recraft for the initial look development, Kling for motion, and Magnific for resolution. Each shot went through fifteen to thirty iterations. The ball had to sit in the frame correctly, which meant generating from the plate and masking precisely, not dropping in a generic effect.
Seventeen shots of consistent VFX, plus the monster sequence. On a music video budget. Without the generative tools, those seventeen shots would have been a visual effects house, a supervisor, a four-week render queue, and a conversation about whether the video could afford to be the thing the director actually wanted it to be.
Chase Sapphire Reserve
SLOPWRANGLER
Nine scenes. A consistent character across all of them. Premium brand, premium expectations. The kind of brief that usually means a two-day shoot with a large cast and a wardrobe department.
This was the first properly agentic pipeline. Claude Code managing the prompt strategy, maintaining character consistency across four image models, then pushing the best outputs through three video models for motion. 120 images, 22 video clips, and a character who looks like the same person in every frame, which turns out to be the hardest single problem in multi-model generative production.
The system worked because the agent understood the brief at a level beyond prompt engineering. It knew which model handled which kind of scene, when to regenerate versus refine, and how to maintain the thread of the character's appearance across completely different environments. The client saw the finals and asked which agency produced the shoot.
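The regenerate-versus-refine decision that pipeline made can be sketched as a simple policy. The thresholds, the `consistency_score` input, and the iteration cap here are illustrative assumptions; the production pipeline used an agent reasoning over the brief, not fixed cutoffs:

```python
# Hypothetical sketch of a regenerate-vs-refine policy for character
# consistency. consistency_score is an assumed 0.0-1.0 measure of how
# closely a generated frame matches the reference character.

def next_action(consistency_score: float,
                iterations: int,
                max_iterations: int = 8) -> str:
    """Decide what to do with a generated frame."""
    if consistency_score >= 0.9:
        return "accept"       # close enough to the reference character
    if iterations >= max_iterations:
        return "escalate"     # hand to a human rather than loop forever
    if consistency_score >= 0.6:
        return "refine"       # small drift: adjust, keep the generation
    return "regenerate"       # large drift: cheaper to start again
```

The interesting design question, which the agent handled better than fixed thresholds could, is where the refine/regenerate boundary sits for a given scene: a refinement loop on a hopeless generation burns budget, and a regeneration of a nearly-right one throws away work.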