Identity Is Becoming Licensed Infrastructure

Phillip Shoemaker
March 23, 2026

What Is Identity Licensing in AI and Why It Matters Now

A new pattern is starting to take shape across sports, media, and technology. Identity is beginning to appear in systems that define how it can be used, who can access it, and how value is assigned to it.

Identity licensing in AI refers to the process of granting permission for systems to use a person’s likeness, voice, or other identifiable attributes. It can cover whether those attributes may be used in training, generation, or distribution, often with terms that define control and compensation.

Recent developments show how quickly this is moving into practice. A newly launched AI registry for professional athletes allows players to formally register their likeness, including motion data and biometric traits, and set conditions for how that identity can be used.

To understand why that matters, consider what was captured at the 2024 Olympic Games in Paris. Computer vision systems produced detailed 3D models of each athlete’s performance, tracking data such as the angles of a gymnast’s feet during a rotation, the distance a volleyball player covered on court, and the precise position of a tennis racket at any moment. That level of biometric detail does not end with the broadcast. It can be stored, analyzed, and reused across different systems.

As more of this data is collected and retained, identity begins to function less as a moment of performance and more as a persistent input into digital systems. This creates a need for clear structures that define how it is accessed and used.

When Your Likeness Becomes a System Input

To understand why that structure is becoming necessary, it helps to look at what identity actually does inside these systems.

AI models depend on large volumes of human data. This includes visual likeness, voice patterns, movement, and behavioral signals. These inputs shape how models are trained and how outputs are generated. Identity, in this context, is part of the system’s foundation.

Consider what happens when a voice actor records a line for a video game. Traditionally, that recording existed as a single performance tied to a specific product. Today, that same recording can be used to build a voice model that generates unlimited new lines of dialogue. The original performance becomes an instruction set, not just a piece of content. The actor’s voice is now powering outputs they never recorded, in contexts they never agreed to, for products that did not exist when they signed the contract.

A single dataset can contribute to many variations, extending an individual’s presence far beyond the original use. As identity moves into this role, it requires defined access. Systems are no longer just displaying it. They are using it to produce new material.

Why Control Has to Start Before the Contract Is Signed

Once identity becomes a system input rather than a surface appearance, the question of where control should be applied changes too.

Licensing frameworks have historically focused on distribution. Agreements defined where and how identity appeared, such as in films, advertisements, or recordings.

AI systems introduce a different structure. Identity can be incorporated during training, reused across multiple models, and surface later in outputs that are difficult to trace. By the time content appears, the connection to the original identity may no longer be visible.

The music industry ran into this directly. When AI music platforms began generating content that sounded like specific artists, the question was not just about the output. It was about what those systems were trained on and whether permission was ever granted at that stage. In June 2024, all three major record labels, Universal Music Group, Sony Music Entertainment, and Warner Music Group, along with the RIAA, sued AI music platforms Suno and Udio, alleging the platforms trained their models on copyrighted recordings without authorization. The recordings had been publicly available, but being publicly available and being free to use as training data are two different things. That distinction is now at the center of most disputes in this space.

This has led to a rethinking of where control is applied. Licensing is moving closer to the point where identity enters the system. Legal disputes are addressing whether systems can be trained on creative work and personal likeness without permission, and contracts are being rewritten to close the gap before it opens.

Licensing is becoming a mechanism for governing access rather than managing individual uses after the fact.

Early Infrastructure Is Starting to Take Shape

With licensing moving upstream, the next question is what the systems that support it actually look like in practice.

The athlete AI registry is one example of this starting to take form. It introduces a structure where identity can be registered, permissions can be defined, and usage can be tracked across different applications. Compensation can then be tied directly to how that identity is used. Instead of negotiating each use independently, identity is managed through a system that standardizes access and records activity over time.
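None of the registries described here publish their internal schema, but the core mechanics the paragraph above describes, registering an identity, defining permissions, logging usage, and tying compensation to it, can be sketched as a minimal data structure. Everything here is hypothetical: the names (`IdentityRecord`, `record_use`), the scope strings, and the flat per-use fee are illustrative assumptions, not how any real registry works.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IdentityLicense:
    """One permission grant: what a licensee may do with a registered identity."""
    licensee: str          # who is granted access
    scope: str             # e.g. "training", "generation", "distribution"
    rate_per_use: float    # compensation tied to each recorded use

@dataclass
class IdentityRecord:
    """A registered identity with its permissions and a usage log."""
    owner: str
    attributes: list[str]  # e.g. ["likeness", "voice", "motion_data"]
    licenses: list[IdentityLicense] = field(default_factory=list)
    usage_log: list[dict] = field(default_factory=list)

    def record_use(self, licensee: str, scope: str) -> float:
        """Allow a use only if a matching license exists; log it and return the fee owed."""
        for lic in self.licenses:
            if lic.licensee == licensee and lic.scope == scope:
                self.usage_log.append({
                    "licensee": licensee,
                    "scope": scope,
                    "at": datetime.now(timezone.utc).isoformat(),
                })
                return lic.rate_per_use
        raise PermissionError(f"{licensee} has no '{scope}' license for {self.owner}")
```

The point of the sketch is the shape of the transaction: a use either matches a standing grant, in which case it is logged and priced, or it is refused outright. There is no "negotiate after the fact" path.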

In entertainment, talent agencies have begun moving in a similar direction. Creative Artists Agency partnered with technology vendor Veritone to launch CAA Vault, a system that scans and securely stores digital replicas of a client’s face, body, and voice, allowing talent to maintain ownership and control over how their likeness is used. Any permitted use flows through a controlled channel rather than being negotiated case by case.

On the music side, the move from litigation to licensing has started. Warner Music Group settled with Suno and went further, partnering with the platform on licensed AI music, with Suno implementing a strict opt-in mechanism for WMG artists. That is not just the end of a lawsuit. It is one sign of what a permission structure could begin to look like.

These are early signals, not a settled system. But they point toward repeatable mechanisms for defining identity, setting permissions, and linking usage to value.

Why Identity Rights Still Don’t Follow You Across Platforms

What each of these examples also has in common is that they operate in isolation. And that is the core problem.

Registries, contracts, and platform-level controls do not talk to each other. Permissions defined within one system do not transfer across environments. An actor covered under a union agreement has certain protections on one production. A background performer on a smaller project may have none. A college athlete in one state operates under different rules than one in another, something state-level laws like the ELVIS Act have started to address, though only within their own borders.

Attribution remains difficult as well. When a system is trained on thousands of recordings and performances, tracing which specific inputs shaped a given output becomes a genuine technical problem, not just a legal one. A voice model does not store recordings the way a hard drive stores files. It contains patterns extracted from those recordings, compressed into representations that no longer resemble anything a person could identify or trace back to a specific source.

Without a shared standard, identity licensing relies on manual processes and isolated agreements, each covering a narrow slice of the problem. A more complete system would require interoperable formats for identity, permissions, and usage tracking that persist across platforms and are recognized by different systems regardless of where content ends up.
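No such interoperable standard exists yet, but what it would need to do can be illustrated in a few lines: a permission decision expressed as a self-describing, serializable document that any platform can parse and enforce identically. The field names and `"0.1"` version string below are invented for illustration; they are not a published format.

```python
import json

def make_grant(subject: str, attribute: str, action: str, allowed: bool) -> str:
    """Serialize one permission decision into a portable JSON document."""
    return json.dumps({
        "version": "0.1",
        "subject": subject,       # whose identity this covers
        "attribute": attribute,   # e.g. "voice", "likeness"
        "action": action,         # e.g. "train", "generate"
        "allowed": allowed,
    })

def is_permitted(grant_json: str, attribute: str, action: str) -> bool:
    """Any platform parsing the same document reaches the same answer."""
    grant = json.loads(grant_json)
    return bool(grant["allowed"]) and grant["attribute"] == attribute \
        and grant["action"] == action
```

What makes this a sketch of interoperability rather than another silo is that the grant travels as data: the decision is made once by the identity holder, and enforcement does not depend on which platform happens to be reading it.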

The absence of this layer is becoming more visible as adoption increases.

Identity Is Now Something Systems Need Permission to Use

Taken together, what these developments point to is a consistent underlying logic forming across industries.

Identity is moving into a new role within digital systems. It is no longer limited to expression or representation. It is becoming something that systems must account for when generating value. Whether through registries, union contracts, or licensing partnerships, the pattern is the same: identity is defined, permissions are established, and usage is connected to control or compensation.

This introduces identity as a permission layer within AI systems. Access is determined in advance, and usage is governed by structured rules.
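What "permission layer" means in practice can be reduced to a single gate in a data pipeline: a training step that refuses to consume a sample unless an advance grant covers it. This is a deliberately minimal sketch; the in-memory whitelist and the sample format are hypothetical stand-ins for whatever registry or contract system actually holds the grants.

```python
# Pairs of (subject, attribute) cleared for training in advance.
# A hypothetical stand-in for a real registry lookup.
APPROVED_FOR_TRAINING = {("artist-42", "voice"), ("athlete-7", "motion_data")}

def gate(samples):
    """Yield only samples whose (subject, attribute) pair was cleared in advance."""
    for sample in samples:
        if (sample["subject"], sample["attribute"]) in APPROVED_FOR_TRAINING:
            yield sample
        # anything without an advance grant never reaches the model

samples = [
    {"subject": "artist-42", "attribute": "voice"},
    {"subject": "artist-99", "attribute": "voice"},  # no grant on file
]
cleared = list(gate(samples))  # only artist-42's sample passes the gate
```

The structural point is where the check sits: before training, not after an output appears. Once a sample is inside the model, as the attribution discussion above notes, the connection back to its source may no longer be recoverable.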

The people whose identities feed these systems are beginning to understand what that means. The next step is building the shared infrastructure that makes those permissions durable, portable, and enforceable across the systems that increasingly depend on them.

Identity is becoming something that systems need authorization to use.

FAQs

What is identity licensing in AI?

It is the process of granting permission for a system to use a person’s likeness, voice, or other identifiable attributes, with defined terms around how it is used, who can access it, and how value is assigned.

Why does it matter where in the process identity is licensed?

Because by the time content appears, the connection to the original identity may no longer be visible. Licensing needs to happen before identity enters a system, not after it has already been used to produce something.

What does early identity licensing infrastructure look like?

It centers on giving individuals control before their identity enters a system. That means setting permissions in advance, choosing to opt in or out of specific uses, and having those decisions recorded and enforced across the platforms that access them.

Why don’t identity rights carry across platforms?

Each registry, contract, and platform operates independently. Permissions set in one system do not transfer to another. Without a shared standard, rights do not travel with identity the way they need to.
