When the European Union’s General Data Protection Regulation (GDPR) took effect on May 25, 2018, it reshaped the internet. The sweeping data privacy law set strict rules for how companies collect, store and use personal data — and threatened violators with steep fines. Suddenly, websites had to rethink everything, from how they asked users for their information to how they let people delete or correct that information.
U.S. Data Privacy Laws to Know
- Health Insurance Portability and Accountability Act
- Fair Credit Reporting Act
- Children’s Online Privacy Protection Act
- California Consumer Privacy Act and California Privacy Rights Act
- Virginia Consumer Data Protection Act
- Colorado Privacy Act
- Maryland Online Data Privacy Act
- Texas Data Privacy and Security Act
Unlike the EU, the United States does not have a single, comprehensive data privacy law at the federal level (yet). But GDPR’s influence has trickled across the Atlantic, creating a patchwork of state-level legislation that aims to limit what types of data businesses can collect and establish penalties for misuse.
California led the way with its landmark Consumer Privacy Act, which took effect in 2020, and since then some 20 states have followed suit. As more states join in and concern about data privacy continues to grow, the U.S. may one day see a shift that rivals GDPR’s impact on the ways companies do business online. Here’s where data privacy laws stand today, and where they may be headed next.
When It Comes to Privacy, Federal Law Is Very Sectoral
Even without something like the GDPR at the federal level, the United States still has some nationwide laws dealing with information privacy, though they tend to be very sectoral.
For example, the Health Insurance Portability and Accountability Act, or HIPAA, governs how consumers’ health data can be used and shared, and by whom. The Fair Credit Reporting Act and the Gramm-Leach-Bliley Act do the same in the financial sector, while the Family Educational Rights and Privacy Act safeguards students’ data.
While these laws do apply to the online space, they aren’t specific to online data privacy concerns the way the GDPR is. Online data privacy can involve the same kinds of personally identifiable information that exist offline, but it usually means details like IP addresses, search behavior, site visits and online purchasing activity — namely, the behavioral information about consumers that can be tracked through cookies.
The closest thing to a comprehensive online data privacy law at the federal level is the Children’s Online Privacy Protection Act. COPPA sets standards for how companies can interact with children and their data online. The law makes it unlawful for websites and online services to collect personal information from children under the age of 13 without first obtaining verifiable consent from a parent. It is enforced by the Federal Trade Commission, and breaking it can cost a company up to $43,792 per violation in civil penalties.
Aside from the narrowly focused federal laws, all states — and U.S. territories and protectorates like Guam, Puerto Rico and the Virgin Islands, as well as the District of Columbia — have data breach laws. These laws generally require companies to notify users if their data may have been exposed or compromised. Around 20 states have also passed some form of comprehensive data privacy law in recent years.
Navigating America’s Data Privacy Laws
California, Virginia and Colorado were the first states to enact comprehensive data privacy laws: California’s took effect in 2020, and Virginia’s and Colorado’s passed in 2021. Since then, more states have joined them. Below are some of the most influential data privacy laws to know.
California Consumer Privacy Act and California Privacy Rights Act
California has been at the forefront of comprehensive data privacy laws in the United States with the California Consumer Privacy Act (CCPA), which went into effect in 2020. Later that same year, voters approved the California Privacy Rights Act (CPRA).
The CCPA gives California residents more control over their online data and restricts what companies can do with it. Under the law, Californians have the right to know what information companies collect about them. They have the right to request companies change or delete the information already collected about them. And the CCPA ensures consumers the ability to opt out of the sale of their information.
The law applies to certain for-profit companies — nonprofits and government sites are exempt. Covered companies are those that either gross over $25 million annually, make half or more of their annual revenue from selling Californians’ information, or buy, receive or sell the personal information of 50,000 or more Californians.
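That coverage test amounts to a simple three-way disjunction. Here is a minimal sketch of the logic in Python — the Business fields and the ccpa_applies helper are hypothetical names for illustration, using the thresholds described above:

```python
# Minimal sketch of the CCPA's applicability test as described above.
# The Business dataclass and helper name are hypothetical, for illustration.
from dataclasses import dataclass

@dataclass
class Business:
    annual_gross_revenue_usd: float
    share_of_revenue_from_selling_ca_data: float  # 0.0 to 1.0
    ca_consumers_whose_data_is_handled: int       # bought, received or sold
    is_for_profit: bool

def ccpa_applies(b: Business) -> bool:
    """Return True if any of the CCPA's three coverage thresholds is met."""
    if not b.is_for_profit:  # nonprofits and government sites are exempt
        return False
    return (
        b.annual_gross_revenue_usd > 25_000_000
        or b.share_of_revenue_from_selling_ca_data >= 0.5
        or b.ca_consumers_whose_data_is_handled >= 50_000
    )

# Example: a modest-revenue site that handles 60,000 Californians' data is covered.
print(ccpa_applies(Business(5_000_000, 0.2, 60_000, True)))  # True
```

Note that meeting any one threshold is enough, which is why smaller data-heavy businesses can be covered even when large data-light ones are not.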
Companies cannot condition website access on Californians exercising their rights under CCPA. Companies must also include specific information about their privacy policies. This includes the type of information collected and what the company plans to do with it.
Californians can file a complaint with the attorney general’s office, which enforces the CCPA, but cannot take action against a company directly except in very specific circumstances. Only when a consumer’s data is exposed through company negligence, for instance, can they seek damages (of no more than $750 per incident).
Shortly after the CCPA took effect, California voters approved the CPRA, which replaced and amended several elements of the CCPA. In general, the CPRA expanded the CCPA’s privacy protections and increased the regulations on businesses.
For example, the CPRA requires companies to allow Californians to opt out of third-party sharing of their information for advertising purposes. It also requires companies to forward consumer requests to third parties, such as data brokers, with which they have sold or shared that information.
The CPRA also expanded covered data to include “sensitive personal information.” This includes Social Security numbers, bank account numbers, exact geolocation, detailed demographic information, sexual orientation, political and religious affiliation, biometrics and the actual content of personal communications like texts and emails.
Most notably, the CPRA created a new enforcement group, the California Privacy Protection Agency. Its five-person board can update existing online privacy regulations and adopt new ones. The CPRA also adjusted civil penalties for violations: fines remain up to $2,500 per standard violation, but the enhanced maximum of $7,500 per violation now applies both to intentional violations and to violations involving Californians a company knew were under 16 years old.
Virginia Consumer Data Protection Act
Virginia’s Consumer Data Protection Act (VCDPA) restricts what “data controllers” can do with personal and sensitive data.
Data controllers can be any group that conducts business in Virginia or delivers goods or services “targeted to residents of the Commonwealth.” They must also either control or process the personal information of more than 100,000 consumers per year, or derive revenue or other benefits from the sale of the personal data of at least 25,000 consumers. State and local governments and Virginian institutions of higher education are exempt from the law.
The law requires data controllers to be transparent with Virginians about what data they collect, and why and how they will use it (including sharing with or selling to third parties). Virginians have the right to opt out of having their data collected and to request companies change or delete existing data.
The VCDPA covers “information that is linked or reasonably linkable to an identified or identifiable individual.” This does not include anonymized data or publicly available information. Groups seeking to collect sensitive data — racial or ethnic origin, religious beliefs, mental or physical conditions, biometric data, sexual orientation, citizenship or exact geolocation — from Virginians must get their direct permission first. In the case of children under 13, groups seeking to collect sensitive data must get permission from a legal guardian.
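That consent rule is essentially a small decision tree. Below is an illustrative Python sketch — the category strings and the required_consent helper are invented for illustration, not terms from the statute:

```python
# Illustrative sketch of the VCDPA consent rule described above: sensitive
# data requires the consumer's direct consent, and a legal guardian's
# consent when the consumer is a child under 13.
SENSITIVE_CATEGORIES = {
    "racial_or_ethnic_origin", "religious_beliefs",
    "mental_or_physical_condition", "biometric_data",
    "sexual_orientation", "citizenship", "exact_geolocation",
}

def required_consent(data_category: str, consumer_age: int) -> str:
    """Return who must consent before this data can be collected."""
    if data_category not in SENSITIVE_CATEGORIES:
        return "none"  # ordinary personal data: no opt-in consent required
    if consumer_age < 13:
        return "legal_guardian"
    return "consumer"

print(required_consent("exact_geolocation", 12))  # legal_guardian
print(required_consent("exact_geolocation", 30))  # consumer
```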
As in California, the VCDPA vests the state’s attorney general with enforcement powers; it does not, however, set up a new enforcement agency. Civil penalties of up to $7,500 per violation can apply, and fine money goes into a Consumer Privacy Fund used to pay for enforcement efforts.
Colorado Privacy Act
About three weeks after Virginia’s law passed, Colorado’s Privacy Act was introduced (and passed in July 2021) with nearly identical language.
Key differences between the VCDPA and the CPA involve definitions and how fines are handled. Unlike the VCDPA, the CPA does not include exact geolocation in its definition of sensitive data. And violations of the CPA are treated as deceptive trade practices, which can carry civil penalties of up to $500,000 per violation and up to $3 million per series of violations, depending on the nature of the violation.
Maryland Online Data Privacy Act (MODPA)
Maryland’s Online Data Privacy Act (MODPA) was signed in 2024 and took effect on October 1, 2025. The law limits what kind of data companies can collect from Maryland residents by emphasizing data minimization: companies can only collect and use data that is necessary to provide a requested service.
MODPA applies to companies that conduct business in Maryland or target Maryland residents and that handle the personal data of at least 35,000 consumers — or at least 20,000 consumers if the company derives a certain share of its revenue from selling personal data. State and local governments, nonprofits and some financial institutions are exempt.
Like other state privacy laws, MODPA grants Maryland residents several rights over their personal data. They can confirm whether a company is processing their information, correct inaccuracies, access that information, delete it and request a copy. Consumers can also opt out of targeted advertising and data sales. While individuals cannot take legal action against companies violating MODPA, the Maryland attorney general’s office can issue fines of up to $10,000 per violation and $25,000 for repeat violations.
Texas Data Privacy and Security Act (TXDPSA)
Texas’ Data Privacy and Security Act (TXDPSA) took full effect on January 1, 2025. It grants Texans greater control over their personal data and requires businesses to be transparent about how data is collected, used and sold.
Unlike other state laws, TXDPSA applies to any company doing business in Texas, regardless of revenue, unless it meets the U.S. Small Business Administration’s definition of a small business.
Under the law, Texas residents have the right to know whether companies are collecting their personal data, correct inaccuracies, access that information, delete it and request a copy. Additionally, companies looking to use sensitive data, such as geolocation or biometric identifiers, must get prior consent from users.
TXDPSA is enforced by the Texas attorney general, and companies have 30 days to cure a violation before penalties are issued. Fines can reach $7,500 per violation.
More Comprehensive Online Data Privacy Laws May Be Coming
Clearly, there are a lot of approaches to online data privacy right now. According to Julie Rubash, chief privacy counsel at Sourcepoint, a privacy software company for digital marketers, staying compliant with the variety is manageable — for now.
There’s been an “exponential increase” in the online privacy legislation recently, Rubash said.
According to the National Conference of State Legislatures, more than two dozen states have introduced comprehensive data privacy legislation since 2021. Not all of those proposed bills survived their states’ committees, however. As of early 2022, 18 were still under consideration in their legislatures, according to the International Association of Privacy Professionals.
There is also a proposed federal-level data privacy law that could have a major impact on businesses: the TLDR Act. While “TL;DR” is internet slang for “too long; didn’t read,” in the TLDR Act it stands for “terms-of-service labeling, design and readability.” Instead of a lot of legal boilerplate in a website’s terms of service, the TLDR Act would require increased clarity and common-language explanations in website privacy policies.
Companies would have to include a short, easy-to-read summary at the top of the terms of service, according to Rubash. These summaries would include total word count and approximate reading time, what sort of sensitive information the site collects, what correction or deletion rights a user has and how to use them, historical versions of the terms of service and a list of data breaches the site has had in the past three years. Anyone operating a website for commercial purposes, except small businesses, would be required to provide these summaries.
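To make those required contents concrete, here is one hypothetical, machine-readable shape such a summary could take. The bill describes what a summary must contain, not any particular format, and every field name below is invented for illustration:

```python
# Hypothetical structured form of a TLDR Act-style terms-of-service summary.
# Field names are invented; the bill specifies content, not a data format.
from dataclasses import dataclass

@dataclass
class TermsOfServiceSummary:
    total_word_count: int
    approx_reading_time_minutes: int
    sensitive_data_collected: list[str]
    user_rights: dict[str, str]           # right -> how to exercise it
    prior_versions: list[str]             # URLs of historical terms
    breaches_past_three_years: list[str]  # short descriptions

summary = TermsOfServiceSummary(
    total_word_count=14_200,
    approx_reading_time_minutes=57,
    sensitive_data_collected=["precise geolocation", "browsing history"],
    user_rights={"delete": "Submit a request at /privacy/delete"},
    prior_versions=["https://example.com/terms/2024-01-01"],
    breaches_past_three_years=[],
)
```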
The bill has been repeatedly introduced in the U.S. Congress — most recently in the 119th Congress in 2025 — and was referred to the House Committee on Energy and Commerce and the Senate Committee on Commerce, Science and Transportation in March 2025. However, there has been no hearing or further action since.
A Path Forward for Data-Collecting Companies
With so many approaches to online data privacy already in force, and more coming in the next few years, staying compliant with them all is manageable — for now.
While there is growing demand for a GDPR-like law in the United States, the patchwork of state laws remains the only protection for consumers. These data privacy laws have forced companies to take a much closer look at their data policies — and that’s good, according to Rubash. But the onslaught of new data privacy laws means companies that rely on consumer data will need to prepare for a more privacy-focused future.
Depending on how existing and future laws are enforced, the United States could quickly become even more of a patchwork of data privacy approaches than it already is. That could be very cumbersome for businesses, Rubash said. Even though comprehensive online data privacy laws are few and far between, she said companies need to take a comprehensive approach to compliance.
“Rather than trying to find the loopholes and make workarounds, it’s important to take a holistic look at all of your data processes and really start to approach it from an ethical and responsible standpoint. And insist that all of your partners do the same,” she said.
Aphrodite Brinsmead, senior product manager at Permutive, a company that provides privacy-safe infrastructure for publishers and advertisers, also said that companies looking for workarounds will only add to the issues the advertising industry faces with consumer trust. Instead, companies need to find ad tech solutions that “align with a privacy-safe future.” Ultimately, it means having a solid data strategy.
“A good data strategy means redefining data practices across every department, taking into consideration user consent, privacy and security systems,” she said. For advertisers, for instance, that means using “only consented and owned (first-party) data signals for targeting and reducing the use of personal identifiers and third-party trackers.”
A bad strategy, on the other hand, is running business as usual, Brinsmead said. For companies that don’t adapt to the coming changes, campaigns won’t reach their desired audiences and ad dollars will be wasted at best — and fines for violations await at worst. Not to mention the potential to further erode consumer trust and invite more regulation.
“Brands, agencies and publishers need to create advertising strategies that respect user preferences and stop using personalized information for one-to-one targeting,” Brinsmead said. “Those that have adopted a wait-and-see approach need to catch up, as more privacy restrictions are inevitable.”
AI Is Now the Target of Data Regulation
Companies are not only facing stricter rules around personal information, they’re also contending with a new technology: artificial intelligence. AI systems rely heavily on data collected online and uploaded by users themselves — in essence, the more data available to these systems, the more they can do. This has once again caught the attention of lawmakers and regulators, who have grown concerned about data privacy, bias and accountability as they relate to AI systems. Just as GDPR reshaped online data practices, emerging AI regulations are poised to redefine how businesses collect, use and manage data for machine learning systems.
As with data privacy, the EU was quick to act on this front as well, passing the Artificial Intelligence Act in March 2024. The law regulates AI systems by classifying them based on the amount of risk they pose, setting strict rules for high-risk systems such as those used in healthcare, employment, law enforcement and critical infrastructure. It also bans specific uses of AI outright, including certain targeted biometric surveillance and systems that evaluate and rank individuals based on their online activity. Companies that provide general-purpose AI must also share information about how their systems were trained and what steps were taken to limit risks. The EU’s AI Act will be fully applicable by August 2, 2026.
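The law’s tiered structure can be sketched as a simple classifier. The following Python sketch is illustrative only — the tier names mirror the categories described above, but the example mappings are assumptions, not legal guidance:

```python
# Rough sketch of the AI Act's risk-tier logic as described above.
# The use-case and domain strings here are illustrative assumptions.
PROHIBITED_USES = {"untargeted_biometric_surveillance", "social_scoring"}
HIGH_RISK_DOMAINS = {"healthcare", "employment", "law_enforcement",
                     "critical_infrastructure"}

def ai_act_tier(use_case: str, domain: str, general_purpose: bool) -> str:
    if use_case in PROHIBITED_USES:
        return "prohibited"       # banned outright
    if domain in HIGH_RISK_DOMAINS:
        return "high-risk"        # strict obligations apply
    if general_purpose:
        return "general-purpose"  # must disclose training and risk mitigation
    return "minimal"              # few or no new obligations

print(ai_act_tier("resume_screening", "employment", False))  # high-risk
```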
In the United States, there is no comprehensive federal AI law yet. Instead, companies face a combination of existing laws like the Federal Trade Commission Act, along with newly proposed bills. For instance, the Generative AI Copyright Disclosure Act would require companies to disclose copyrighted works used to train their AI. Other proposals, such as the Algorithmic Accountability Act, would force businesses to assess how automated systems handle personal data to make decisions.
States like California, Colorado, New York and Texas are also exploring rules that would require bias audits, transparency in automated decision making, and user consent for the processing of sensitive data. In California, SB 53, which was signed in September 2025, mandates that AI companies disclose how they assess and mitigate catastrophic risks. And in Texas, HB 149 forbids AI systems that manipulate users or that process biometric data without consent. But even without a nationwide law, the FTC has warned that AI companies could face penalties if their systems misuse personal data, discriminate against users or fail to maintain adequate safeguards.
Frequently Asked Questions
What are data privacy laws?
Data privacy laws govern how organizations collect, store and use personal information. They are enacted to protect individuals’ rights over, and control of, their data.
Does the U.S. have a comprehensive federal data privacy law?
No. Unlike the EU’s GDPR, the United States does not have a comprehensive data privacy law at the federal level yet. Federal laws are instead sectoral, like HIPAA or the Fair Credit Reporting Act; otherwise, data privacy legislation is largely handled at the state level.
What is the closest thing the United States has to a federal online privacy law?
The Children’s Online Privacy Protection Act (COPPA) is the most comprehensive data privacy law in the United States at the federal level. It regulates how companies can collect, use and share data from children under the age of 13.
