TL;DR Summary
Cory Choy, creator and animator, uses local computing to deliver seamless and emotionally resonant animations that cloud-based AI platforms can’t achieve. By refusing to accept artistic compromises, Choy proves that technology should serve creativity—not limit it.
Cory Choy has spent two decades building an animation company that refuses to cut corners. When artificial intelligence promised to revolutionize filmmaking, he watched colleagues rush to embrace tools that seemed to offer effortless production. But where others saw speed, Choy saw a series of constraints disguised as an advantage.
“I don’t want the tools to dictate what I can make,” Choy says. “I want to dictate to the program.”
That philosophy puts him at odds with an industry willing to accept whatever algorithms offer. For Choy, creative integrity isn’t negotiable. “I’m seeing directors make concessions to these tools,” he explains. “They change their vision to accommodate what the tools can do. That’s just not how we think it should work.”
Weighing local computing and cloud platforms
Most AI video generation follows a predictable pattern. Directors start with ambitious ideas, then scale down to fit within eight-second clips. Character designs become generic because that’s what the models are trained on. Stories get simplified because complex narratives don’t fit the tools’ abilities.
Cloud-based platforms promise instant results. Consumer hardware manufacturers tout convenience. Many talk about democratizing filmmaking but rarely mention the creative costs, a point Choy raises often. Running models locally gives him control over a consistent model that can't be updated overnight in ways that remove or degrade features.
Choy’s background in traditional animation taught him what gets lost in standardized tooling. Years of motion capture work, formal training from NYU, and hands-on experience with complex productions gave him perspective that newer creators lack. He understands the difference between tools that serve his vision and tools that constrain it.
A Broadway star’s personal story
The real test came through James Brown Orleans, a Broadway veteran who has played Banzai the hyena in The Lion King for 22 years. Orleans approached Choy with an intensely personal concept: what if a man confronted his past self, questioning the life choices that kept him in the same role for over two decades?
“We said, ‘What if a man was confronted with his old headshots? They would have this stern conversation in the dressing room,'” Choy recalls.
The emotional weight of Orleans’ story demanded technical capabilities that standard AI tools couldn’t deliver. Real character consistency across long sequences. Genuine facial expressions that captured a Broadway performer’s nuanced emotions. Animation that could sustain dramatic tension for more than a few seconds.
Most platforms would force concessions: short clips instead of extended rendered scenes, or generic, repetitive character models instead of ones that could portray Orleans' specific emotional journey. In short, cloud platforms would help Choy quickly produce a compromised artistic vision.
Choy refused. His collaboration with Dell Pro Max workstations and NVIDIA accelerated computing opened a different path entirely. Access to professional-grade local processing power meant the opportunity to generate expressive AI models without dependencies on private platforms. “Cloud platforms say that truncation is supposed to only affect resolution, but we have found that it has actually affected us being able to get what we want from the model,” Choy says. “When you’re working with a truncated model everything starts looking the same. What we’re excited about is not things not looking the same.”
Silver Shorts and Omega Darling Present: Headshots – Lions and Hyenas
The breakthrough nobody sees
What his team achieved challenged every assumption about AI-generated content. While other creators struggled with short clips, Choy’s workflow produced seamless sequences that lasted over a minute. Character consistency held throughout. The emotional core of Orleans’ Broadway story came through in every frame.
“When you have normal video editors, they’re going to get hung up on the cuts,” Choy says. “What we did is we worked with actual animators who knew that we had to generate characters and sequences that could end on the exact same frame, so that when we continued our shots, it looks like a completely seamless transition.”
Viewers watching the finished Headshots film assume it emerged from a single, sophisticated prompt. The reality involves something far more deliberate. “People are like, oh, you just hit Generate and it works,” Choy says. “I’d like it to work that way, but that’s not how it actually works.”
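The frame-matching approach Choy describes, ending one generated clip on the exact frame where the next begins, can be verified programmatically. The sketch below is a hypothetical illustration, not Choy's actual pipeline: it checks whether each clip's final frame matches the next clip's first frame within a small pixel tolerance, the condition for an invisible cut. Frame data is synthetic; in practice the arrays would come from decoded video.

```python
import numpy as np

def frames_match(frame_a, frame_b, tolerance=2.0):
    """Return True if two frames are visually identical within a
    mean-absolute-difference tolerance (0-255 pixel scale)."""
    diff = np.abs(frame_a.astype(np.float64) - frame_b.astype(np.float64))
    return float(diff.mean()) <= tolerance

def seamless_cut_points(clips, tolerance=2.0):
    """For consecutive clips, check that each clip's last frame matches
    the next clip's first frame -- the condition for an invisible cut."""
    return [frames_match(a[-1], b[0], tolerance)
            for a, b in zip(clips, clips[1:])]

# Synthetic demo: clip_b deliberately starts on clip_a's exact final
# frame (a seamless continuation), while clip_c jumps to new imagery.
rng = np.random.default_rng(0)
clip_a = rng.integers(0, 256, size=(8, 64, 64, 3), dtype=np.uint8)
clip_b = np.concatenate(
    [clip_a[-1:], rng.integers(0, 256, size=(7, 64, 64, 3), dtype=np.uint8)]
)
clip_c = rng.integers(0, 256, size=(8, 64, 64, 3), dtype=np.uint8)

print(seamless_cut_points([clip_a, clip_b, clip_c]))  # [True, False]
```

A real workflow would also need to match motion and lighting across the cut, not just the boundary pixels, which is why Choy credits trained animators rather than automation for the result.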
Using AI as a tool to enhance human creativity
Local processing became essential for reasons beyond pure performance. “These are private companies and we’re at their mercy,” Choy notes about cloud services. “If they decide to turn off the features, the features are gone.”
In contrast, professional workstations offer complete control over the creative process. When Choy needed to capture the subtle emotions of a Broadway performer questioning career choices, the team could iterate without the bandwidth limits or usage constraints of online platforms.
“You need substantial processing capability to run complete models locally,” Choy explains. “Most people end up using versions that inhibit what’s possible.” Headshots showcases animation quality that rivals traditional studios, with emotional depth that comes from refusing to accept algorithmic limitations. Orleans’ story gets told on the director’s terms, not the technology’s.
“We put a lot of work into this,” Choy says about the final film, “the effect is that it’s a seamless movie. And it’s a good movie that’s fun to watch.”
Vision driving technology
The broader principle extends far beyond any single project. Choy sees the current limitations of AI as temporary obstacles, not permanent boundaries. “What sets us apart is our vision,” he emphasizes. “The tools will always just be tools.”
The choice between accommodation and integrity defines every project. You can accept what third-party algorithms offer, or build workflows that serve the artist. For Choy, that choice determines whether AI becomes a partner or a dictator.
“AI is supposed to be a tool that assists us,” he says. The future belongs to creators willing to remain authentic. Technology should serve art, not the other way around. Orleans’ Broadway story proves that principle works.
For directors ready to stop accepting what they’re given and start demanding what they need, the path forward requires only one refusal: don’t let the algorithm direct the director.
By combining his creative direction with local processing on Dell Pro Max and NVIDIA RTX GPUs, Choy produces work that reflects his artistic intent and technical control without relying on cloud limitations.
The ultimate professional-grade PCs that take performance to the max. To learn more about the Dell Pro Max portfolio, click here.
*Dell Pro Max, previously referred to as Dell Precision workstations