Under the Framework, each member company must limit its processing of sensitive personal data to disclosed purposes (and purposes consented to by the consumer as required by applicable laws) and provide additional safeguards when processing such data. During reviews, the NAI staff asked questions about the internal policies members have in place for assessing when personal information may be sensitive. This included closely examining the various types of health-adjacent data members may collect and process, as well as any information that is derived or inferred from nonhealth information. Where it was determined that a member does process sensitive personal data, the NAI staff reviewed the relevant safeguards and disclosures, as well as how and when the member obtains the relevant consent or ensures such consent is obtained on its behalf.
The Challenge of Classifying “Sensitive Data”
Members have overwhelmingly developed internal policies for classifying sensitive personal data. However, given the evolving and somewhat varied frameworks for determining when personal information meets a higher threshold of sensitivity that requires heightened protections, the NAI staff observed a lack of uniformity among members in how sensitive data is classified. Concerning health data specifically, the NAI staff observed a wide variety of approaches taken at the state level to define health-related sensitive personal data, and the challenges these different approaches present to members. For example, some states define health-related sensitive personal data narrowly, limiting the classification only to data revealing a mental or physical health diagnosis, whereas other states take a broader approach that envelops all data identifying a consumer’s past, present, or future physical or mental health status.27
Beyond statutory definitions, the NAI staff also paid close attention to enforcement actions that provide additional context on when personal data should be classified as sensitive, and often raised such matters for members’ consideration during Privacy Reviews. For example, in the settlement between the California Attorney General and Healthline Media, strong emphasis was placed on the CCPA’s provisions that limit the use of personal information to only those purposes that are consistent with the reasonable expectations of the consumer.28 Known as the “purpose limitation principle,” its application in the Healthline matter suggests that there may be varying degrees of data sensitivity, and that the sharing of personal data “of a more intimate nature” with third parties may be unlawful when consumers would not expect that to happen.
In the face of this complexity, the NAI staff observed a range of compliance approaches during Privacy Reviews: (1) members attempting to force simplicity by over-classifying any information related to health as sensitive, thereby subjecting the data processing to heightened requirements; (2) members under-classifying what constitutes sensitive personal data, or failing to recognize cases where a broader category of data could meet various legal definitions, creating a compliance gap; or (3) members choosing to withdraw altogether from jurisdictions where they perceive a risk that almost any data processing could trigger broad definitions of what is sensitive.
To support consistent, well-reasoned sensitivity classifications for health data, the NAI staff developed a Factor Analysis for Health-Related Sensitive Personal Information as a tool to assist members with the growing challenge of classifying health-related sensitive data.29 The tool acknowledges that health data sensitivity often emerges from a confluence of factors: how data is collected, what it contains, how it is used, and what risks it creates. For those members struggling with the complexity of reasoning through sensitivity classifications, the NAI recommended the adoption of this tool, and encouraged members to pay close attention to the concerns of regulators regarding the types of data that should be classified as sensitive under state and federal law.
Disclosure, Consent, and the “Limit the Use” Choice Mechanism
Those members that process sensitive personal data for Network Advertising must adhere to a stricter—and in some ways more fragmented—regulatory framework. However, despite the disjointed approach to regulating sensitive personal data at the state level, there is a universal understanding that the processing of sensitive personal data carries a heightened risk of harm to consumers that warrants increased transparency and choice. For example, some states have explicit requirements to disclose the categories of sensitive personal data collected and the specific purposes for the processing.30 As part of the privacy review, the NAI staff evaluated privacy notices and other disclosures to understand how members currently approach these heightened disclosure requirements. Overall, members that are knowingly processing sensitive personal data have sufficiently disclosed the categories of sensitive personal data they collect and the purposes for the processing.31
As the vast majority of state comprehensive privacy laws require opt-in consent to process sensitive personal data,32 members that are knowingly processing sensitive personal data have taken varying approaches for either obtaining consent, or verifying that proper consent has been obtained when the data was initially collected. Indeed, consent obligations are generally tied to the processing of sensitive personal data, not merely its collection.33 This means that a controller receiving sensitive personal data from a partner to use for advertising likely needs valid consent for its own processing and cannot simply rely on the fact that the partner obtained consent for its separate use of the data. As such, consent language needs to be transparent and encompass all intended uses of the data—including downstream uses for advertising.
For these reasons, the NAI encouraged members that are processing sensitive personal data to go beyond contractual provisions concerning consent and to conduct due diligence by reviewing the consent language presented to consumers. For members that partner with a small number of publishers, this task is relatively easy and manageable, and can be done by reviewing each individual publisher or asking each publisher to provide screenshots of the relevant consent prompt. For members that partner with a large number of publishers providing sensitive data, the NAI encouraged the same due diligence, while recognizing that it may need to be approached in a more scalable manner.
California is unique in that its comprehensive privacy law requires providing a conspicuous link titled “Limit the Use of My Sensitive Personal Information” when a business is processing sensitive personal data for purposes outside of what is strictly necessary.34 Known as a Limit the Use mechanism, it must be offered alongside California’s other distinct “Do Not Sell or Share My Personal Information” mechanism. With few exceptions, NAI members are well attuned to these requirements, and often bundle these mechanisms under a single “Your Privacy Choices” alternative opt-out link.35
Precise Geolocation Data
During the 2025 Privacy Review cycle, the NAI staff again emphasized to member companies that precise geolocation data remains one of the most sensitive and heavily scrutinized categories of data in the digital advertising ecosystem. Recent state laws in Maryland and Oregon introduced strict limitations on the sale of precise geolocation data, representing a meaningful shift from prior state law frameworks that primarily relied on notice and choice. Rather than conditioning processing on consumer consent alone, these laws impose categorical restrictions that directly constrain how location data may be used and monetized.
These restrictions have had significant and, in some cases, immediate operational implications for companies engaged in location-based advertising. In practice, some companies scaled back or discontinued certain location-based activities, while others reassessed whether to operate in particular state markets at all. These responses reflect the extent to which the new legal requirements are reshaping core business practices, rather than simply requiring incremental compliance adjustments.
In response to these two state laws, NAI members and their partners have made significant adjustments to their standard business models. Some companies have chosen to withdraw certain location-based services from these jurisdictions altogether, rather than attempt to redesign those offerings within the new constraints. As a result, certain products and use cases that depend on precise geolocation (such as in-store visitation measurement or attribution of advertising to real-world outcomes) may no longer be available in those states. Other companies have continued operating in these jurisdictions, but with more limited functionality. In practice, this has generally involved restricting or eliminating the use of precise geolocation data in those contexts and limiting services to those that can be supported without relying on that data. While these approaches allow for continued operation, they can materially reduce the effectiveness and precision of advertising and analytics capabilities, highlighting the tradeoffs companies are making to navigate evolving legal requirements.
Compliance through Technical Measures
As there has been greater scrutiny of the use of precise geolocation information, many companies are exploring and adopting technical measures to reduce the sensitivity of location data while preserving utility. One method observed in the reviews is the truncation of IP-derived location data, such as limiting precision to two decimal places. While these approaches may reduce granularity, they also raise important questions about whether such transformations meaningfully mitigate sensitivity under emerging legal standards, particularly where re-identification risks may remain, or where the data can still be used (alone or in combination with other data) to approximate a user’s precise location or visits to sensitive points of interest. Regulators are increasingly focused not just on how data is labeled, but on whether it can still reveal precise or sensitive insights about individuals in practice.36
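To make the truncation technique described above concrete, the sketch below truncates coordinates to two decimal places. The helper name and sample values are assumptions for illustration, not a method any member or regulator has prescribed; the point is that the residual grid cell is still roughly a kilometer on a side.

```python
import math

def truncate_coord(value: float, places: int = 2) -> float:
    """Truncate (not round) a latitude/longitude to a fixed number of decimals."""
    factor = 10 ** places
    return math.trunc(value * factor) / factor

# One degree of latitude spans roughly 111 km, so a 0.01-degree grid
# cell is still only about 1.1 km on a side -- reduced granularity,
# but not necessarily enough to rule out re-identification.
truncated_lat = truncate_coord(38.889484)   # 38.88
cell_size_m = 111_000 * 10 ** -2            # ~1,110 meters per 0.01 degree
```

This is why, as noted above, regulators may ask whether such a transformation meaningfully mitigates sensitivity: a kilometer-scale cell can still place a user near a sensitive point of interest, particularly when combined with other data.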
In speaking with members that process precise geolocation data, the NAI staff found that robust filtering of sensitive points of interest remained a key compliance challenge for some. Across reviews, companies demonstrated varying levels of maturity in identifying and excluding locations that may reveal sensitive characteristics, such as healthcare facilities, places of worship, or other locations tied to intimate aspects of individuals’ lives. This issue has taken on heightened importance in light of recent legal developments, including state laws that specifically call out health-related location data as requiring enhanced protections. Notably, New York’s geofencing law prohibits the establishment of a geofence of 1,850 feet or less around a healthcare facility for the purpose of delivering digital advertisements, building consumer profiles, or inferring an individual’s health status or treatment. Since developing its Voluntary Enhanced Standards for Location Information Solution Providers in 2022, the NAI has been encouraging members to identify sensitive locations and take steps not to process or share precise geolocation data related to these locations. In privacy reviews with members, the NAI staff highlighted the consistent focus of regulators on data related to sensitive locations and promoted these Voluntary Enhanced Standards.
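As a hedged illustration of how the 1,850-foot rule might be operationalized, the sketch below checks whether a location event falls inside a restricted radius around a healthcare facility. The haversine distance, the function names, and the coordinates are assumptions for the example, not a compliance method endorsed by the NAI or specified by the New York statute.

```python
from math import radians, sin, cos, asin, sqrt

NY_GEOFENCE_LIMIT_FT = 1850        # prohibited radius under the NY geofencing law
EARTH_RADIUS_M = 6_371_000
FEET_PER_METER = 3.28084

def haversine_ft(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance between two lat/lon points, in feet."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a)) * FEET_PER_METER

def inside_prohibited_geofence(point, facility) -> bool:
    """True if an ad-request location falls within the restricted radius."""
    return haversine_ft(*point, *facility) <= NY_GEOFENCE_LIMIT_FT
```

A location pipeline could run this check against a directory of known healthcare facilities and suppress ad delivery, profiling, and health-status inference whenever it returns true.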
New state regulations and increased scrutiny of the use of geolocation information, particularly data that can approximate visits to sensitive points of interest, such as health facilities and places of worship, reinforce that compliance in this space must be an ongoing function requiring continuous refinement and validation.
Meaningful & Informed Consent
Regulators and enforcement actions continue to emphasize that the collection and use of precise geolocation data must be grounded in meaningful, informed consent. This expectation extends beyond the initial point of collection to the broader data supply chain, meaning that companies that receive or rely on location data are increasingly expected to conduct due diligence and implement contractual safeguards to ensure that upstream partners, such as app publishers and SDK providers, are obtaining valid consent.37 As a result, accountability for consent is no longer limited to the first-party interface; instead, it is distributed across all participants in the ecosystem.
Recent enforcement actions further illustrate this regulatory focus. State and federal regulators have demonstrated a willingness to bring cases where location data practices are perceived to reveal sensitive information or where consumer expectations are not met.38 Regulators are increasingly evaluating these practices holistically, examining not only technical compliance, but also considering the sensitivity of the data and whether the overall use of the data aligns with consumer disclosures and expectations.
Precise Location Information Solution Provider Voluntary Enhanced Standards
The NAI’s Precise Location Information Solution Provider Voluntary Enhanced Standards (“VES”) represent the NAI’s accountability mechanism for the subset of member companies (known as “signatories”) that collect precise location data and use it to provide location-based audience segments or analytical services. Signatories voluntarily sign on to the VES, which prohibit the use, sale, and transfer of U.S. consumer precise location information related to Sensitive Points of Interest (SPOIs), and separately prohibit the use, sale, or sharing of any U.S. consumer precise location information for law enforcement or national security purposes, except where needed to comply with a valid legal requirement. In 2024, the NAI updated the VES to clarify how nationally recognized industry classification systems, such as the North American Industry Classification System (NAICS), can be used to identify sensitive locations and operationalize a process that aligns with the standards, a practical update aimed at improving consistency and administrability across signatories.
The annual accountability reviews of VES signatories surfaced important lessons about how companies are implementing the standards in practice and where the most persistent operational challenges lie. One of the most significant findings concerns the methods signatories use to comprehensively identify SPOIs. Across signatories, the NAI staff observed a range of approaches, including licensing third-party point-of-interest databases, conducting manual location audits, and applying keyword-based search methodologies to classify location data. No single approach has emerged as definitive. Signatories expressed a shared desire for more resources on this front to strengthen the collective reliability of sensitive location filtering. At the same time, they are acutely aware of the inherent limitations of any SPOI dataset, as it represents a best effort at a moment in time; its coverage is constrained by the availability of sources to quality-check against and its durability over time cannot be fully guaranteed. The NAI staff is actively considering what additional resources and coordination mechanisms could support signatories in this work.
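A simplified sketch of how the NAICS-based and keyword-based methods described above might be combined is shown below. The specific prefixes, keywords, and record fields are hypothetical, and any real SPOI filter would rely on far more comprehensive taxonomies and licensed point-of-interest data.

```python
# Hypothetical NAICS prefixes and keywords, for illustration only.
SENSITIVE_NAICS_PREFIXES = (
    "621",   # ambulatory health care services
    "622",   # hospitals
    "8131",  # religious organizations
)
SENSITIVE_KEYWORDS = ("clinic", "hospital", "church", "mosque", "synagogue")

def is_spoi(poi: dict) -> bool:
    """Classify a point-of-interest record as a Sensitive Point of Interest."""
    if poi.get("naics_code", "").startswith(SENSITIVE_NAICS_PREFIXES):
        return True
    name = poi.get("name", "").lower()
    return any(keyword in name for keyword in SENSITIVE_KEYWORDS)

# Location events tied to an SPOI would be dropped before any
# audience-segment or analytics processing.
sample = [
    {"name": "City Hospital", "naics_code": "622110"},
    {"name": "Main St Coffee", "naics_code": "722515"},
]
retained = [p for p in sample if not is_spoi(p)]   # keeps only the coffee shop
```

Layering the two methods reflects the limitation signatories themselves noted: no single source is definitive, so classification by industry code and by name act as checks on each other.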
Temporary sensitive locations were consistently identified as a challenge in VES privacy reviews, given that these geographic locations are sensitive only during specific events or timeframes, such as a convention center hosting a health-related conference or a park used for a religious gathering. Identifying these dynamically sensitive locations is significantly more resource-intensive than maintaining a static SPOI directory, and current methodologies are limited in their ability to capture this category. The NAI is continually evaluating what practices or tools could help address these challenges, and encourages members to adopt contractual limitations and to innovate techniques that more comprehensively capture SPOIs.
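One way to represent dynamically sensitive locations is to attach a time window to each entry, so a venue is filtered only while the sensitive event is underway. The data model below is a sketch of that idea under assumed field names; it is not a structure the VES prescribes.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TemporarySPOI:
    """A venue that is sensitive only during a specific event window."""
    name: str
    start: datetime
    end: datetime

    def sensitive_at(self, ts: datetime) -> bool:
        return self.start <= ts <= self.end

# A convention center hosting a health-related conference, per the
# example above; the dates are illustrative.
expo = TemporarySPOI(
    name="Convention Center (health conference)",
    start=datetime(2025, 3, 10),
    end=datetime(2025, 3, 13),
)
```

A location event at this venue would then be suppressed only when its timestamp falls inside the window, leaving the venue unrestricted the rest of the year; the hard part, as noted above, is sourcing the event calendar itself.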
Two additional findings from the VES reviews merit recognition. First, the NAI staff observed that several signatories have developed SPOI categories that go beyond what the VES expressly requires, classifying additional location types as sensitive. Examples include the classification of parts of Native American reservations and firearms-related locations—such as gun stores and shooting ranges—as SPOIs. This kind of voluntary expansion of protections reflects a mature privacy culture and a proactive posture toward consumer protection that aligns with the spirit of the NAI Framework. Second, in discussions about business practices, signatories reported that they routinely decline business opportunities they assess as carrying unacceptable privacy risk, which the NAI sees as an indication that the values embedded in the VES are integrated into commercial decision-making rather than treated as a separate compliance exercise.
Finally, the NAI staff asked signatories about the volume of law enforcement requests they have received for consumer location data—a topic that has drawn significant public and regulatory scrutiny. The responses indicated that the volume of such requests has not been overwhelming. Importantly, signatories reported that they evaluate each such request carefully, asking whether it is properly propounded and narrowly tailored before any response is provided. These processes reflect the kind of principled gatekeeping that the VES were designed to encourage, and they offer a model for how industry can maintain meaningful accountability over government access to sensitive data.
Children’s Privacy
During Privacy Reviews, the NAI staff discussed the topic of children’s data with member companies, focusing on strong compliance and increased activity by both regulators and legislators. The NAI staff provided member companies with updates on developments in the space among enforcement agencies, and often provided links to resources about new and unfolding efforts among industry, as well as state and federal regulators, to ensure members are forward-thinking in compliance efforts related to children’s data.39 The enforcement of these laws is somewhat in flux, as there have been several constitutional challenges that are being actively litigated.40
Amidst the evolving landscape of privacy laws in force, the NAI staff highlighted the importance of members continuing to evaluate their entire data ecosystem, particularly regarding app-level data sources. Specifically, members should consider whether and how contracts with partners and data source vendors address app-level data sources. It is essential for members to continue assessing the presence of children and app-sourced data within their ecosystems. Members whose business model does not include processing children’s data should confirm that their partners both conduct due diligence (for example, by using appropriate age-verification APIs) and follow through by not sharing children’s information with members.41
During Privacy Review meetings, the NAI staff also encouraged members to consider reviewing and updating their privacy policies to clarify the company’s approach to minors’ data in light of the evolving state privacy laws. There is a growing trend among states placing restrictions on the collection and processing of personal data of consumers under the age of 18—rather than 16, the threshold used in earlier laws. This can be seen with the Maryland Online Data Privacy Act (MODPA), which went into effect on October 1, 2025, and prohibits controllers from processing the personal data of a consumer for targeted advertising or selling the personal data of a consumer if the controller knew or should have known that the consumer is under the age of 18. Additionally, Colorado’s Children’s Privacy Amendment, which also went into effect on October 1, 2025, prohibits controllers from knowingly processing the personal data of minors (under age 18) for the purposes of targeted advertising, sale, and profiling without consent. The NAI staff encouraged members to review their privacy disclosures and to consider raising the age threshold to 18, if they have not done so already.