TV Channel Recreates Diddy Trial Using AI Video Generator, Raising Legal Red Flags


A television channel’s use of artificial intelligence to recreate courtroom proceedings in the ongoing case involving Sean “Diddy” Combs is stirring debate in legal circles over ethics, accuracy, and privacy concerns — particularly regarding the potential misrepresentation of jurors.

For nearly 200 years, courtroom sketch artists have played a vital role in visually representing trials where cameras are prohibited.

Now, with rapid advances in AI technology, networks are using generative tools to animate transcripts, simulate emotions, and visually depict courtroom scenes — sometimes with stunning realism. In the case of Combs, the recreated footage includes AI-generated images of not only the defendant but purported jurors and other courtroom participants.

While the network claims the video is based solely on court transcripts, critics warn that AI reconstructions may cross a legal line.

“It’s the problem of showing the jurors,” a legal analyst commented. “Even if the faces are AI-generated, if they resemble real people — or are mistaken for real people — this could put individuals at risk and potentially undermine the fairness of a trial.”

Jurors, who are traditionally shielded from public exposure to protect their safety and impartiality, could find themselves inadvertently depicted or misrepresented by generative models. In this case, the generated video included realistic human features that some worry may resemble members of the actual jury.

Generative AI’s courtroom role is not limited to reenactments. In a separate proceeding involving a fatal road rage incident, the family of the victim used prior video samples and his known personal values to generate a synthetic victim impact statement for sentencing. The digital recreation of the deceased, Christopher Pelkey — complete with his voice and facial expressions — was shown to the man convicted of his killing.

While the video reportedly moved the defendant to tears, legal experts are divided over the implications of using AI to create posthumous statements.

“AI testimony raises complex evidentiary questions,” one expert noted. “Is it truly representative? Who authenticates it? Can it be cross-examined?”

There is also growing concern over AI being used to automate trial reporting and even simulate case outcomes at scale, using data across thousands of federal cases. While potentially beneficial for legal analytics and public understanding, such use poses serious risks of misinformation, bias, and unintended consequences — especially if juries, judges, or the public begin to rely on dramatized versions of legal proceedings rather than verified facts.

As AI reshapes how courtroom narratives are conveyed to the public, legal professionals are urging caution. The portrayal of trials — especially high-profile ones like that of Sean Combs — must strike a balance between innovation and the preservation of judicial integrity.