
...organization is not building and training its own AI models, but leveraging third-party AI models through APIs and exercising those models on its own datasets or content. Having a clear understanding of the range of potential outcomes, using easier-to-assess metrics such as visible quality, time taken to deliver assets, or even factors like bandwidth utilization, will help qualify valuable use cases and avoid disappointment.

An example evaluation model for AI integration

A number of factors need to be considered when evaluating an AI solution for your organization. To illustrate the point, below is an example evaluation model for assessing whether or not to leverage AI in an encoding workflow. This evaluation model looks at five key factors:

1. Total cost of ownership (TCO)

AI incorporated into encoding will save distribution bandwidth, but it will come with the cost of additional compute resources and potential software licensing costs. A good evaluation model will consider not only the savings in distribution bandwidth cost, but also the additional cost of infrastructure to manage the increased computational load. Having a benchmark of costs for the existing process, based on a few simple metrics such as 'time taken' or 'assets processed', to compare against the AI workflow can help with TCO calculations. But always remember: AI processes go through both model and workflow improvements that tend to provide incremental benefits through subsequent versions, so TCO is an evolving calculation.
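To make that benchmark comparison concrete, here is a minimal sketch of a per-month TCO calculation contrasting a baseline encoding workflow with an AI-assisted one. All figures, rates and the simple cost model (compute plus distribution bandwidth plus licensing) are illustrative assumptions, not measurements from any real deployment.

# Hypothetical monthly TCO comparison: baseline encoder vs. AI-assisted encoder.
# Every number below is an assumed, illustrative value.

def monthly_tco(assets_per_month, compute_cost_per_asset, egress_gb_per_asset,
                egress_cost_per_gb, monthly_license_cost):
    """Very simplified TCO: compute + distribution bandwidth + licensing."""
    compute = assets_per_month * compute_cost_per_asset
    egress = assets_per_month * egress_gb_per_asset * egress_cost_per_gb
    return compute + egress + monthly_license_cost

ASSETS = 10_000        # assets processed per month (assumed)
EGRESS_COST = 0.02     # $ per GB delivered (assumed)

# Baseline workflow: cheaper compute, no extra licensing, higher bitrate.
baseline = monthly_tco(ASSETS, compute_cost_per_asset=0.40,
                       egress_gb_per_asset=400,   # GB delivered per asset, all viewers
                       egress_cost_per_gb=EGRESS_COST,
                       monthly_license_cost=0)

# AI-assisted workflow: more compute and a license fee, roughly 20% less bandwidth.
ai_assisted = monthly_tco(ASSETS, compute_cost_per_asset=0.65,
                          egress_gb_per_asset=320,
                          egress_cost_per_gb=EGRESS_COST,
                          monthly_license_cost=1_500)

print(f"baseline: ${baseline:,.0f}/month   ai-assisted: ${ai_assisted:,.0f}/month"
      f"   saving: ${baseline - ai_assisted:,.0f}/month")

In practice, the per-asset inputs would come from the existing benchmark, and the calculation should be revisited as newer model and workflow versions change the compute profile.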

2. End user impact

Additional AI-based processing may introduce latency into the encoding workflow. If the AI solution adds multiple seconds of latency to a live stream, the degradation in viewer experience may materially impact the business. In some cases, the AI solution may also have impacts on the client side, which may not be acceptable. A good evaluation model will consider all end-user impacts of implementing the solution and have a clear threshold for acceptable performance.
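One way to make that threshold explicit is to encode it as an automated gate that a pilot must pass before the AI stage is allowed into the live chain. The budgets and measurements in the sketch below are assumed values chosen purely for illustration.

# Illustrative end-user impact gate: compare measured added latency and
# rebuffering change against agreed budgets. Thresholds are assumptions.

LATENCY_BUDGET_MS = 500        # max extra latency accepted for live (assumed)
REBUFFER_BUDGET_PCT = 0.5      # max acceptable rebuffering increase, % (assumed)

def end_user_gate(added_latency_ms: float, rebuffer_delta_pct: float) -> bool:
    """Return True only if the AI stage stays within the agreed end-user budget."""
    if added_latency_ms > LATENCY_BUDGET_MS:
        print(f"FAIL: adds {added_latency_ms:.0f} ms (> {LATENCY_BUDGET_MS} ms budget)")
        return False
    if rebuffer_delta_pct > REBUFFER_BUDGET_PCT:
        print(f"FAIL: rebuffering up {rebuffer_delta_pct:.2f}% (> {REBUFFER_BUDGET_PCT}%)")
        return False
    print("PASS: within end-user impact budget")
    return True

# Example measurements from a pilot run (assumed numbers).
end_user_gate(added_latency_ms=320, rebuffer_delta_pct=0.2)

The same pattern extends to any other end-user metric the business cares about, such as startup time or client-side resource usage.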

3. Operational impact

Any impacts on day-to-day operations should be well understood. Is additional monitoring required to ensure sustained performance in terms of bandwidth savings and/or picture quality? Do staff need to be retrained to understand any new performance metrics, configurations and settings? Are there sustainability implications, due to increased power consumption, that need to be evaluated against the organization's ESG initiatives?
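The monitoring question can often be answered with a small recurring check that the promised bandwidth saving and picture quality are still being delivered. The sketch below assumes VMAF as the quality metric and uses made-up thresholds and sample data; a real check would pull these values from the organization's own monitoring stack.

# Illustrative operational check: verify that an AI-assisted encoder is still
# delivering the expected bitrate saving and picture quality on recent assets.
# Thresholds and sample data are assumptions for this sketch.

MIN_BITRATE_SAVING_PCT = 15.0   # expected saving vs. the baseline ladder (assumed)
MIN_VMAF = 93.0                 # minimum acceptable quality score (assumed)

def check_recent_outputs(samples):
    """samples: list of dicts with 'asset', 'bitrate_saving_pct', 'vmaf'."""
    alerts = []
    for s in samples:
        if s["bitrate_saving_pct"] < MIN_BITRATE_SAVING_PCT:
            alerts.append(f"{s['asset']}: saving {s['bitrate_saving_pct']:.1f}% below target")
        if s["vmaf"] < MIN_VMAF:
            alerts.append(f"{s['asset']}: VMAF {s['vmaf']:.1f} below {MIN_VMAF}")
    return alerts

recent = [
    {"asset": "news-0412", "bitrate_saving_pct": 18.2, "vmaf": 95.1},
    {"asset": "sport-0413", "bitrate_saving_pct": 11.4, "vmaf": 92.3},
]
for alert in check_recent_outputs(recent):
    print("ALERT:", alert)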

4. Systemic risks

Are there other systems in the video encoding and distribution workflow that also use automation and/or AI? Are the end-to-end system risks well enough understood to mitigate any business-impacting events? Could a malfunctioning system feeding into another AI-enabled system have cascading effects, and are the current failsafes and redundancies sufficient? Running workflows initially in test and development environments, and simulating failures, is a great way to understand how failsafes and redundancy fare ahead of production deployment.
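Failure simulation can start very simply: inject a fault into the AI stage in a test environment and confirm that the chain falls back cleanly instead of propagating the failure downstream. The component names and fallback behaviour in this sketch are assumptions used only to illustrate the pattern.

# Illustrative failsafe test: if a (hypothetical) AI enhancement stage fails,
# the pipeline should fall back to the plain encoder rather than propagate
# the failure to downstream systems. Names and behaviour are assumptions.

class AIStageError(Exception):
    pass

def ai_enhanced_encode(segment):
    # Simulated malfunction injected for the test.
    raise AIStageError("model returned malformed parameters")

def baseline_encode(segment):
    return f"encoded:{segment}"

def encode_with_failsafe(segment):
    """Try the AI path, fall back to the baseline path, and record the event."""
    try:
        return ai_enhanced_encode(segment)
    except AIStageError as err:
        print(f"failsafe engaged for {segment}: {err}")
        return baseline_encode(segment)

# In a test environment, confirm every segment still comes out encoded.
outputs = [encode_with_failsafe(f"seg-{i}") for i in range(3)]
assert all(o.startswith("encoded:") for o in outputs)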

5. Ethical and privacy considerations

Ethical and privacy considerations should be part of every evaluation model. Can the system alter the content in any way? Is there any possibility that the AI-powered system could touch customer data? For example, there could be AI-enabled encoding systems with built-in mechanisms for automated language dubbing, or for in-frame brand detection and replacement for monetization purposes. Ensuring appropriate controls and permissions to preserve content owner and creator rights is critical.
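One practical form those controls can take is an explicit, per-content-owner policy that any content-altering feature must check before it runs. The feature names and policy fields below are hypothetical, intended only to show a default-deny pattern.

# Illustrative rights/permissions gate for content-altering AI features.
# Feature names and policy fields are assumptions for this sketch.

DEFAULT_DENY = {"auto_dubbing": False, "brand_replacement": False}

owner_policies = {
    # Permissions granted explicitly by each content owner (assumed examples).
    "studio-a": {"auto_dubbing": True,  "brand_replacement": False},
    "studio-b": {"auto_dubbing": False, "brand_replacement": False},
}

def feature_allowed(owner_id: str, feature: str) -> bool:
    """Content-altering features stay off unless the owner has opted in."""
    policy = owner_policies.get(owner_id, DEFAULT_DENY)
    return policy.get(feature, False)

if feature_allowed("studio-a", "brand_replacement"):
    print("run brand replacement")
else:
    print("skip brand replacement: no permission on record")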

Piloting the use case

Once a use case is selected, plan a gradual introduction into the organization. Constrain the initial implementation so that its implications for the organization, and its potential for achieving the desired outcomes, are well understood.

Media companies like the BBC have successfully adopted this approach, piloting multiple AI-driven initiatives in limited internal settings. For example, content personalization features were launched in controlled settings before being deployed to a wider audience. The BBC also ensures that all initiatives are governed by core principles, which inform its own internal evaluation models.

It is also useful to consider scenarios where the system performs very well as a pilot but runs into significant problems at scale. Define potential issues that may affect scaling your AI-enabled solutions as part of the evaluation model, and consider whether rollback mechanisms may be needed.
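If rollback may be needed, it helps to decide in advance what triggers it. The sketch below shows one hypothetical shape for that decision: a staged rollout that advances only while scale-sensitive metrics stay within agreed limits and reverts otherwise. The stages, metric names and limits are all assumptions.

# Illustrative canary-style rollout with a rollback trigger for an AI-enabled
# workflow. Stages, metrics and limits are assumed values for the sketch.

ROLLOUT_STAGES_PCT = [1, 5, 25, 100]      # share of traffic on the AI path
MAX_ERROR_RATE_PCT = 0.1
MAX_P95_LATENCY_MS = 800

def should_roll_back(metrics: dict) -> bool:
    """Roll back if the pilot breaches either guardrail at the current stage."""
    return (metrics["error_rate_pct"] > MAX_ERROR_RATE_PCT
            or metrics["p95_latency_ms"] > MAX_P95_LATENCY_MS)

def next_step(current_pct: int, metrics: dict) -> int:
    if should_roll_back(metrics):
        return 0                           # revert all traffic to the old path
    later = [p for p in ROLLOUT_STAGES_PCT if p > current_pct]
    return later[0] if later else current_pct

# Example: metrics observed while 5% of traffic is on the AI path (assumed).
print(next_step(5, {"error_rate_pct": 0.03, "p95_latency_ms": 610}))   # advances to 25
print(next_step(5, {"error_rate_pct": 0.40, "p95_latency_ms": 610}))   # rolls back to 0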

Positioning for success in a changing landscape

AI is not just a tool; it is fast becoming a strategic imperative for the media and entertainment industry. By adopting a methodical approach, starting with clearly defined use cases, supporting them with robust evaluation frameworks, and conducting thoroughly tested pilots in controlled environments, media companies can leverage AI to drive both efficiency and innovation.

The experience of early adopters makes it clear that AI is not a one-size-fits-all solution. The companies that excel in harnessing AI are those with a deep understanding of media workflows, technical applications, and industry pain points. These pioneers are best equipped to utilize AI effectively, customizing its capabilities to their specific needs.

Another key takeaway is the low cost of experimentation. By running pilots in parallel or within non-production environments, companies can explore AI's potential without disrupting ongoing operations. Crucially, this trial-and-error process not only fine-tunes AI implementations but also develops critical internal AI literacy that will drive long-term value.

Make no mistake: AI is already transforming the industry. A 2023 Gartner poll of more than 1,400 executive leaders revealed that 45% are piloting generative AI solutions and 10% have already deployed them in production. That is a sharp rise from the 15% piloting and 4% in production reported in Gartner's earlier polling, underscoring the urgency with which companies are embracing AI to stay competitive.

As the digital landscape rapidly evolves, those who act now to explore AI's possibilities, while building the foundational skills and strategies, will be best positioned to unlock new growth opportunities and deepen audience engagement. AI isn't just the future; it's the key to staying ahead in a fast-changing world.

With over 20 years of executive experience in the media and telecoms space, including roles at TandbergTV, Ericsson, and Mediakind, Narayanan Rajan has led transformation and integration initiatives in engineering and operations across multiple organizations. As CEO of Media Excel, he now leads an organization developing cutting-edge technology for encoding and transcoding, including AI-based enhancements to improve encoding performance.
