

Trump II is already proving to be a more serious threat to an independent, robust news media than Trump I.

Trump’s direct power over news media is limited, but the threat comes from an unprecedented politicisation of federal regulators, enforcement and procurement—to favour friends and punish enemies.

Opposition to Trump II is weaker and more divided than the broad ‘resistance’ to Trump I. Big tech companies are going for a close embrace, hoping to steer policy to their advantage—while others bend the knee to avoid punishment.

Use of publisher content to train AI models is hotly contested. Unacknowledged scraping, licensing deals, and lawsuits all characterise the publisher-AI company relationship.

However, model training is not the whole story. More and more products rely on up-to-date access to content, and some are direct competitors to publisher offerings.

Publishers can’t depend on copyright to capture the value of their IP. They need to track which products are catching on with users for licensing deals to make sense for them, and to ensure their own products keep up with the competition.

UK news publishers are experimenting with generative AI to realise newsroom efficiencies. Different businesses see a different balance of risk and reward: some eager locals are already using it for newsgathering and content creation, while quality nationals hold back from reader-facing uses.

Publishers must protect the integrity of their content. Beyond hallucinations, overuse of generative AI carries the longer-term commercial and reputational risk of losing what makes a news product distinctive.

Far less certain is the role of generative AI in delivering the holy grail of higher revenues. New product offerings could be more of an opportunity for businesses that rely on subscribers than those that are ad-supported.

The UK’s choice of policy for rebalancing the relationships between news publishers and tech platforms is on the agenda of the CMA’s Digital Markets Unit for 2025. The UK is expected to steer clear of the pitfalls of Canada’s news bargaining regime, which led Meta to block news, crashing referrals.

In the UK, Google’s relationships with news publishers are much deeper than referrals, including advertising and market-specific voluntary arrangements that support a robust supply of journalism, and dovetail with the industry’s focus on technology (including AI) and distribution.

The rise of generative AI has also ignited the news industry’s focus on monetising the use of its content in LLMs. AI products could threaten the prominence, usage and positive public perceptions of journalism—this might require progress in journalism’s online infrastructure, supported by public policy.

TikTok has been dealt a devastating blow: a US bill signed into law forces owner ByteDance to sell within a year or face removal from app stores.

The stakes are higher than in 2020—China’s opposition to a divestment will make an optimal sale harder to conclude, so all sides must be prepared for a ban.

The TikTok bill introduces extraordinary new powers in the context of the US and China’s broad systemic rivalry, though online consumer benefits will be limited.

Big news publishers are pursuing licensing deals with AI companies, chiefly OpenAI. Not all publishers will see a substantial return; while some news may be important for training AI models, not all publisher content will be.

Litigation is a threat point when negotiations stall (see the New York Times), but the copyright status of large language models (LLMs) is uncertain. In the UK, there has been no government intervention (on copyright or otherwise) that could facilitate licensing.

Publishers’ bargaining position is strongest when it comes to up-to-date material that could be important in powering some AI consumer products. They should seek deals to support their journalism, while bearing in mind the risk that new products may get between them and their readers.


The metaverse is a radical expansion of online experiences—sparking a host of new safety challenges on harmful content, economic activity, and privacy.

Building safety into the metaverse will take a village: platforms and communities will set policies and carry out moderation. Regulators could struggle to future-proof their tools, especially with decentralised platforms.

AI age verification and moderation are in a race against AI hazards: disinformation, deepfakes and dynamic user content all intensify harms in immersive settings.

We forecast broadcaster viewing to shrink to below half of total video viewing by 2028 (48%)—down from 64% today—as streaming services gain share of long-form viewing time.

On the key advertising battleground of the TV set, broadcasters will still retain scale with a 63% viewing share by 2028, even as SVOD and YouTube double their impact.

Short-form video will continue to displace long-form as video-first apps (e.g. YouTube, Twitch, TikTok) gain further popularity and others (e.g. Facebook, Instagram) continue a relentless pivot to video. This will expand the amount of video watched and transition habits—even amongst older demographics.

Recent developments in AI have ignited a frenzy in the tech world and wider society. Though some predictions are closer to sci-fi, this new phase is a real advance.

We view AI as a ‘supercharger’, boosting productivity of workers. The impact is already being felt across media sectors, including advertising and publishing.

Firms thinking about using AI should assess which tasks can be augmented and what data is required. Be prepared for unpredictable outputs and a changing legal and tech landscape.

The amended Online Safety Bill contains sensibly scaled back provisions for “legal but harmful” content for adults, retaining the objectives of removing harms to children and giving users more choice. However, this comes at the expense of enhanced transparency from platforms.

News publishers have won further protections: their content will have a temporary ‘must-carry’ requirement pending review when flagged under the Bill’s content rules. Ofcom must keep track of how regulation affects the distribution of news.

The Bill could be further strengthened: private communications should be protected. Regulators will need to keep up with children’s changing habits, as they are spending more time on live, interactive social gaming.