Newsletter Issue #5 - 26th Mar 2025
Federal cuts, mass layoffs, vibe coding and other implications of AI
Editor’s Note
Dear readers,
I would need reams and reams of paper to cover all that has happened in the past two weeks. From Abundance to Adolescence, critiques and theories emerge to articulate a better world.
In this edition of The Political Technologist, we cover two important issues - the abstraction of engineering into vibe coding and the impact of AI on the creative arts and entertainment (visual effects) industry.
What happens when we forgo nuanced decision-making for efficiency?
For the answers, you only need to look around.
Wishing you warm days and blue skies (despite the technological dreariness),
Jyo on behalf of the editorial board
P.S. We welcome any feedback, suggestions or additional expressions of love in the form of messages and comments on Substack. Or better yet, send us an email at thepoliticaltechnologist@proton.me
On the horizon
Looking for the chance to find your tribe, your next adventure, or that dream job? We recommend the following events and opportunities for you to keep track of.
Career opportunities
Kew Research Fellow (Digital Revolution) | Deadline: 20/04/2025
Ada Lovelace - Associate Director | Deadline: 28/04/2025
Greater London Assembly - Front End Developer | Deadline: 13/04/2025
London School of Economics and Political Science - Innovation Policy Fellow | Deadline: 30/04/2025
Journalist/Campaigner: Project “No One Left Behind” (Good Law Project) | Deadline: 31/03/2025
AI Fellowship (Summer 2025) (Cambridge ERA) | Deadline: 08/04/2025
Research Fellow, Formal Methods for Safe AI (University of Birmingham - School of Computer Science) | Deadline: 27/03/2025
Research Fellow (Pivotal Research) | Deadline: 09/04/2025
Other opportunities
Call for Participation: The fifth ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (EAAMO ’25) | Abstract Submission Deadline: 17/04/2025, Paper Submission Deadline: 24/04/2025
Winter Institute on “Urban, Ecological and Infrastructural Politics in SWANA” | Deadline: 28/04/2025
On everyone’s lips
Trendy topics, juicy gossip, hot takes.
In this issue, our editorial board brings you two bite-sized (no full meals!) stories arising from AI, specifically AI-generated code and art.
Vibe coding or vibe chaos?
by Melissa Tranfield (Software Developer)
Lately, we’ve seen many people on social media, those in the AI "hype" communities, venture capitalists, and NYT journalists all proclaim the wonders of "vibe coding". Vibe coding, as I understand it, is when someone creates a tool by asking AI tools to build it, with very little input of their own and often with a disregard for software engineering standards such as testing, code maintainability, and adequate security. These tools are typically created to automate daily tasks: generating ideas for a child’s lunch, planning weekly chores, organising social media bookmarks, and so forth. I am going to rebut the claim that vibe coding is a sign that software engineering will soon be automated by defining what engineering actually is. I’d also dispute the claim that everything is over for junior developers, or for those who are willing to teach themselves to code the "slow" way.
I would first like to state that creating small tools using Claude or ChatGPT as a hobby is perfectly fine. The caveat is that many of the tools a hobbyist creates already exist, and you will not learn much by building a new "local" version for your own use solely through AI. If you are repeatedly copying and pasting error messages into ChatGPT until the code eventually starts working, then you aren’t actually programming, or rather, you are not problem-solving on your own. These tools also become very hard to maintain over time if they are built without any intention to write clean code or to adhere to software engineering standards.
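To make that last point a little more concrete, here is a minimal, purely illustrative sketch in Python (the function, the names, and the test are ours, not taken from any real tool) of the kind of small "daily task" script people vibe-code, written instead with the basics the piece is arguing for: a clear, single-purpose function and a small test that pins down its behaviour.

# A hypothetical example of a small "plan the weekly chores" tool,
# structured so it can be read, changed, and tested later.

def plan_weekly_chores(chores: list[str], people: list[str]) -> dict[str, list[str]]:
    """Assign chores to people in round-robin order."""
    if not people:
        raise ValueError("At least one person is required")
    assignments: dict[str, list[str]] = {person: [] for person in people}
    for i, chore in enumerate(chores):
        assignments[people[i % len(people)]].append(chore)
    return assignments

# A small test like this is the safety net that vibe-coded tools usually skip:
# it documents the intended behaviour and catches regressions when the code changes.
def test_round_robin_assignment():
    result = plan_weekly_chores(["dishes", "laundry", "bins"], ["Ana", "Ben"])
    assert result == {"Ana": ["dishes", "bins"], "Ben": ["laundry"]}

Running pytest against a file like this checks the behaviour automatically; a script that only "works" after rounds of pasting error messages into ChatGPT has no equivalent guarantee.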
What exactly is "engineering"? Ultimately, engineering is the ability to solve problems (with tools). In the case of software engineering, people solve problems with tools like programming languages (e.g. Java, C#, Python), infrastructure as code (e.g. Hashicorp’s Terraform), testing, or simply a conversation with others about why an error might be happening. I was recently speaking to a friend who is an engineer at a big tech company. They were sad to see that a junior developer they were mentoring was executing their entire project by generating lines of code in Gemini rather than programming on their own, and thereby not gaining the skills necessary to become an engineer. It’s one thing for hobbyists to use AI tools to generate code, since what they build likely won’t be shipped as part of a product sold by a company, but it is quite concerning for junior developers in large companies to rely heavily on code generated by a statistical model. I don’t blame them - this type of coding is driven by the AI hype cycle, with little thought given to the damage it might be causing.
In software engineering interviews, people care about how you got to the code, not the code itself. Problem-solving skills are more valuable than knowledge of a language’s syntax - anyone can learn that. Those problem-solving skills won’t improve if you are just pasting the problem into ChatGPT and letting it attempt a solution (which, depending on the complexity of what you are doing, it most likely won’t manage). If you copy and paste from ChatGPT, yes, you might learn something - but why not try it yourself first?
Do you want a tool to be easily maintainable and to keep working for a long time? I’m sure you do, and so do companies. Making a tool is not about writing lines of code; it’s about making something useful for customers and users while maintaining and expanding it (as and when necessary) so that it lasts. That outcome is very unlikely when a project is built solely with AI tools.
So yes - "vibe code" if you want to. But you aren’t doing engineering, you aren’t problem-solving, and you aren’t creating a maintainable product or personal project. You’re training the models for free (no shoutouts for you from OpenAI or Anthropic). I think we’d all rather have a well-built tool enriching the open source community than yet another overvalued AI startup. To those who are continuing to learn to program without AI tools, keep going - you’re building the right skills to become an engineer, and certainly a successful problem solver.
The VFX Industry is in a Cocoon Stage — What Comes Next?
by Yash Srivastava, Former Technical Artist at Technicolor
The VFX industry is at a crossroads. Things are slowing down, and it’s not just because of external economic factors. AI-generated content is flooding the market, prioritizing quantity over quality, and studios are chasing profit margins with reckless disregard for sustainability. The result? A glut of uninspired, low-effort content that clogs the internet, reinforcing what some call the Dead Internet Theory: the idea that much of what we consume online is algorithmically generated filler rather than genuine, human-created art.
At the same time, capitalism’s obsession with short-term profits has forced many skilled VFX artists out of work. Studios are underbidding to win projects, only to realize too late that they lack the capital to sustain their businesses. They operate for-profit companies as if they are non-profits — burning through talent with unrealistic demands and then wondering why investors aren’t interested. This mismanagement isn’t just bad for artists; it’s killing the industry itself.
Even industry giants aren’t immune. Technicolor, once a powerhouse in the VFX space, has crumbled under the weight of corporate greed and financial miscalculations. Other studios follow the same dangerous pattern: squeezing artists, cutting corners, and racing to the bottom in pricing—all while failing to see that this model is unsustainable.
The Power of Artist-Driven Creativity
If history has taught us anything, it’s that when artists are given the freedom to innovate, the results speak for themselves. We’ve seen it in films like Into the Spider-Verse (2018) and Puss in Boots: The Last Wish (2022), both of which pushed the boundaries of animation with unique visual styles, proving that audiences crave fresh, artist-led storytelling. Loving Vincent (2017) was another passion project, bringing Van Gogh’s art to life in a way no corporate-driven film ever could.
These films weren’t just commercially successful — they were artistic milestones. They weren’t created by executives obsessed with profit margins but by passionate teams given the space to create, which is why it’s so infuriating to see studios actively destroy this kind of artistry. Sony’s Spider-Verse franchise is a prime example: the first film was groundbreaking, but as the sequels have entered production, reports of extreme crunch, unpaid overtime, and impossible deadlines have surfaced.
Instead of nurturing the artists who made the first film a success, the studio is squeezing its workers without offering them any real incentive or support. This isn’t an isolated case. It’s an industry-wide pattern.
Flow just won Best Animated Feature at the Oscars, beating Pixar’s Inside Out 2 and DreamWorks’ The Wild Robot. Flow is an interesting case study that breaks the run of Oscar wins claimed by Pixar: the size of a studio no longer dictates the quality of the film. Dream Well, the studio behind Flow, used Blender (a powerful open source 3D package with real-time render capability) instead of the industry standard, Autodesk Maya. The win shows that a radical shift away from industry standards and corporate-backed funding models can yield good results for a film and its artists. It might look radical, but the story is familiar to those who have worked in the industry. During the production of DreamWorks’ The Prince of Egypt, artists unable to meet their targets were reassigned as ‘punishment’ to work on Shrek. This made Shrek a much more artist-driven project, and its legacy (an Oscar win, two Academy Award nominations, and multiple other awards) is now well known.
Giving freedom and power back to the artists is the only way forward.
The industry’s future depends on its ability to value creativity over corporate greed. Without artists, there is no VFX industry — only algorithms churning out soulless content to maximise profits for CEOs who don’t care about the craft. If the industry doesn’t change, it risks collapsing under its own weight. The cocoon stage won’t last forever; what emerges next depends on whether studios prioritize short-term profits or long-term creative sustainability. It’s time to make the right choice.
Bookmarked
From podcasts queued up for the gym to articles saved for bus rides, our editorial board shares some of the bookmarked items we have been trying to wrap our minds around.
The Zeitgeist (catch up on the last two weeks)
[Talk] Bluesky's CEO Jay Graber on the Future of Social Media
[Talk] Signal’s President Meredith Whittaker on The State of Personal Online Security and Confidentiality
[Podcast] Is an Anti-Fascist Approach to Artificial Intelligence Possible?
[Article] What’s the Matter with Abundance?
[Article] DNA testing firm 23andMe files for bankruptcy as CEO steps down
[Article] Meta settles UK ‘right to object to ad-tracking’ lawsuit by agreeing not to track plaintiff
[Article] OpenAI and Meta Seek AI Alliance With India’s Reliance
[Article] Inside the manosphere luring young Indian men and boys
[Article] Federal agencies plan for mass layoffs as Trump's workforce cuts continue
[Article] Federal job cuts shake the Capital Region
[Article] Most Researchers Do Not Believe AGI Is Imminent. Why Do Policymakers Act Otherwise?
[Article] The Trump Administration Accidentally Texted Me Its War Plans
[Article] The 30-year quest to catch a national records thief
[Article] Scientists Respond to FTC Inquiry into Tech Censorship
[Article] Bubble Trouble
[Article] Trump Order Threatens University Libraries, Museums
[Article] What DOGE is Getting Wrong About Privatizing USPS
[Book] The Neoliberal Roots of the Populist Right by Quinn Slobodian
What have you bookmarked recently that you can’t wait to get to? Share it with us via this form!
Presented Without Comment

Ways to engage with us
This year’s Newspeak House cohort is exploring a range of possible services we could be offering to our peers and colleagues in the civic tech space. We’re always open to new ideas here, but for the time being, here's what we’ve got on offer:
Open coworking days: If you’d like to join some of the current and previous generations of fellows for a blend of quiet, focused work and accidental socialising, get in touch with us. Our code of conduct at the space follows.
Stay overnight: We have a guest room, and we’re always up for hosting people who wish to get the most out of the community. Book your Newspeak stay right now!
Coffee roulette: It’s a non-romantic and political-tech-focused Tinder thing. Sign up to be matched with someone with complementary knowledge, skills, and experience to share!
What’s cooking?
Every Wednesday (almost) for the past 10 years, Newspeak House has been hosting Ration Club dinners, where the community of political technologists comes together to eat, meet, and greet. It takes place in the common lounge and terrace and is open to anyone who’d like to learn about the college, its work, and its people.
Last week, Chef Claddagh crafted some fresh falafels (not pictured, but they remain embedded in our souls).
As this newsletter goes out today, Tristan will be cooking up something special for us this evening. Join us today (6:30 pm onwards), and you might have a chance to bump into one of us.
If you wish to join us for future dinners, sign up here.
If you wish to volunteer as a chef, sign up here.
Remember, they™ don’t want you to forget that the work is important and mysterious.
Absolutely loving this newsletter! It is literally the only one I actually read, although many come to my inbox.
Thank you for the comments on "vibe coding", which does seem to be suddenly everywhere. I'm a big believer in the value of actually understanding how code works, and if "engineering is the ability to solve problems using tools", then wouldn't using ChatGPT (a tool) to produce software and solve problems fit the definition of "software engineering"? We might make the argument that it's an unsafe way to do engineering, but that's a bit different, I think?