The Devil's Corner
Welcome to The Devil's Corner! Open Mind has retained the services of the original Devil's Advocate. (His hourly rates are hell, but worth it.) His burning questions about AI in journalism mirror those we hear from the public, just with more brimstone. Think of this as our supernatural FAQ, where the Open Mind team faces its demons head-on, and where no question is too hot to handle:

Tor (CEO): The news industry has cut its workforce by more than half in the past 25 years, so I understand your concern. This might seem like an opportunity to continue that trend. But that’s not what we’re seeing, and here’s why: These massive editorial layoffs weren’t driven by productivity improvements, but because the industry lost two-thirds of its revenue during the same period.
The rise of the internet disrupted traditional business models, slashing both subscription and ad revenues. With people’s attention increasingly divided among various media and entertainment options, news organizations simply didn’t have the money to maintain their previous staffing levels.
The journalists and editors who remain in the industry today are struggling under the weight of ever-increasing productivity demands – a response to reduced revenue and constant budget pressure. Many are disillusioned by the reality of modern newsrooms, which often feels far from what drew them to journalism in the first place.
The world needs keen-eyed journalists more than ever, and to ensure that, we need tools that enhance the work journalists do, not replace them. We need meaningful productivity improvements that allow quality to flourish rather than forcing compromises. Tools that give journalists more time to interview sources, investigate stories, and hold power to account.
Increased productivity is our best defense against the rising tide of content that entertains but doesn’t inform – or worse. It’s how we keep editorial institutions strong and relevant.
Our AI handles the time-consuming tasks like data processing, background research, and initial drafts – freeing up journalists to do what only humans can do: conduct meaningful interviews, build relationships, and uncover stories that matter. Technology should amplify human journalism, not replace it.
Nicolai (DBD): This question touches on both legal and ethical considerations. Copyright law protects specific expressions of ideas, not the underlying facts or quotes themselves. So while directly copying someone’s article would be infringement, reporting the same facts is perfectly legal – and has been standard journalistic practice throughout history.
However, the more important question is about journalistic integrity and value creation. Our system helps journalists combine multiple sources, add their own analysis, and adapt stories for their specific audiences.
Nicolai (DBD): This question highlights important distinctions in how AI technology is used in journalism. Our approach differs fundamentally from foundation model development in two key ways:
First, we’re not training AI on copyrighted content or reselling that content. Instead, we provide tools that help journalists discover, break down and analyze publicly available information – similar to how search engines index online content. Our customers access and reference this content following traditional journalistic practices for attribution and fair use.
Second, while we do use AI services from companies like OpenAI in our technology stack – as do most modern digital companies – we recognize the ongoing industry discussions about AI training data and copyright. Services like Tollbit have emerged to facilitate licensing agreements between AI companies and publishers.
Beyond such licensing schemes, we believe the future lies in publisher-to-publisher markets that strengthen original reporting and editorial judgment, allowing news organizations to maintain direct relationships with their audiences rather than seeing their work diluted through AI-powered aggregators. We welcome and will collaborate with any such initiatives.
Tor (CEO): They certainly are. This is a crucial concern when introducing AI in journalism, where transparency and editorial control are fundamental. We address this challenge in several ways:
First, we give newsrooms complete control over which AI models power our tools. They can choose from all the leading commercial models and several open-source alternatives, allowing them to select providers whose values and transparency standards align with their editorial principles.
We also have an instant feedback system to identify and correct any unwanted political biases.
Most importantly, our tools are designed to assist journalists in gathering, analyzing, and drafting initial content. We firmly oppose AI autonomy in journalism, as it contradicts fundamental editorial principles.
All editorial decisions remain firmly in the hands of human journalists who understand their publication’s standards and values.
This commitment to editorial integrity is reflected in the SaaS contract we sign with every customer: all AI-assisted content must carry a human journalist's byline and be clearly marked as AI-assisted.