Massachusetts Data Privacy Act Approved by Legislative Committee

The Chairs of the Joint Committee on Advanced Information Technology, the Internet, and Cybersecurity of the Massachusetts General Court, Representative Tricia Farley-Bouvier and Senator Michael Moore, announced today that the Massachusetts Data Privacy Act (“MDPA”) had been reported favorably out of Committee. If passed, the MDPA would be the strongest state privacy law in the country.

Last year, EPIC crafted the State Data Privacy and Protection Act, modeled on the American Data Privacy and Protection Act (“ADPPA”), to give state legislators the opportunity to use the bipartisan consensus language from the ADPPA to strengthen state bills. The MDPA closely follows this model, including key data minimization principles, strong civil rights protections, and robust enforcement mechanisms. The Act limits the collection of personal data to what is reasonably necessary for the product or service a consumer requests, prohibits the sale of sensitive personal data, bans targeted advertising to minors, and prohibits the processing of personal data in ways that discriminate. The Location Shield Act, which bans the sale of precise geolocation data, was combined with the privacy bill.

The MDPA includes strong enforcement provisions, which are critical to ensuring compliance with privacy laws. The Attorney General is empowered to enforce the MDPA both under its own terms and as a violation of the Massachusetts consumer protection law, Chapter 93A. Consumers are also able to bring claims on their own behalf through a private right of action.

EPIC had previously testified in favor of the bill.

Having been approved by the Joint Committee on Advanced Information Technology, the Internet, and Cybersecurity, the bill will now move to the Senate and House Committees on Ways and Means for further review.

FTC Finalizes Health Breach Notification Rule Modifications, Improving Health Privacy Safeguards for Consumers

The Federal Trade Commission recently finalized changes to modernize the Health Breach Notification Rule (HBNR), expanding its scope and improving its efficacy to address health privacy and data security risks that fall outside of HIPAA. In the event of a breach of security, the HBNR lays out requirements for covered entities to notify consumers, the Commission, and the public based on the nature of the breach. In addition to modifying the HBNR, the Commission has increased its enforcement activity to protect the privacy of consumer health information, bringing its first enforcement actions against entities for failing to comply with the HBNR last year.

In our June 2023 comments to the Commission, EPIC highlighted several proposed changes that have now been finalized in the HBNR. First, the Commission critically expanded the scope of covered entities to include mobile apps and other digital service providers, more accurately reflecting how consumers create and share identifiable health information in today’s digital ecosystem. Second, the Commission importantly clarified that a breach of security includes unauthorized access to identifiable health information. In other words, beyond a traditional data breach, a breach of security under the HBNR also includes a scenario in which an entity acquires identifiable health data without the individual’s authorization.

EPIC regularly files comments in response to proposed FTC rulemakings regarding business practices that violate privacy rights. Additionally, EPIC has long advocated for health privacy safeguards, including comments to the Department of Health and Human Services supporting its efforts to update the HIPAA Privacy Rule to protect reproductive privacy.

EPIC Urges OMB to Rein In Unchecked Expansion of CBP One App

In comments to U.S. Customs and Border Protection and the White House Office of Management and Budget, EPIC urged the agencies to stop expanding the flawed CBP One app. EPIC highlighted the risks of a proposal that would allow travelers departing from the U.S. to record their exit through CBP One, keeping the agency in compliance with the Biometric Exit requirement. Using CBP One for Biometric Exit effectively shifts the responsibility for collecting exit information from CBP to individuals leaving the U.S., collects unnecessary precise geolocation information from them, and opens the door to mandating that travelers report their exits through the app in the future.

EPIC regularly advocates for the privacy of travelers and immigrants both at the border and within the U.S. EPIC has long opposed the use of facial recognition for Biometric Entry/Exit, including in comments, in FOIA litigation revealing CBP’s facial recognition procedures, and on opt-out practices. Last year, EPIC urged CBP and DHS not to make CBP One mandatory for asylum seekers.

The Record: Congress picked a direct fight with ByteDance and TikTok. The privacy implications are less clear. 

Privacy advocates also say the government’s action is unconstitutional because it threatens First Amendment rights to freedom of expression as well as U.S. citizens’ autonomy to use the technology they want as they see fit. 

The Supreme Court weighed in on this idea in the 1969 obscenity case Stanley v. Georgia, John Davisson, director of litigation at the Electronic Privacy Information Center, said via email. 

He cited a section of the court’s opinion — ruling that it is unconstitutional to stop Americans from keeping obscene materials in their homes — that seems to undercut the TikTok legislation. 

“Whatever may be the justifications for other statutes regulating obscenity, we do not think they reach into the privacy of one’s own home,” the opinion said. “If the First Amendment means anything, it means that a State has no business telling a man, sitting alone in his own house, what books he may read or what films he may watch.”  

Davisson said that decision doubtlessly applies to the TikTok law because of how it interferes with individuals’ “private consumption of information (not to mention your personal choices of who to associate with).” 

He called the TikTok law and the government’s focus on China seizing Americans’ data “privacy scapegoating.” 

“We should absolutely be concerned about the misuse of TikTok users’ data, but the problem is so much bigger than one company,” Davisson said. “Without robust privacy rules that apply industry-wide, there’s little stopping businesses and governments from buying the same kinds of information from data brokers that TikTok collects from consumers.” 

Read more here.

EPIC Applauds FTC Prohibition Against Avast’s Disclosure of Sensitive Information for Advertising Purposes

In comments filed earlier this month, EPIC praised the Federal Trade Commission’s consent decree against Avast Limited, one in a series of recent FTC orders against large data controllers for disclosing sensitive consumer information. As detailed in the Commission’s complaint, Avast marketed software to consumers that the company promised would limit tracking online by blocking invasive third-party cookies. But even as it promised privacy protection, Avast collected and sold consumers’ sensitive browsing information to third parties.

In its comments, EPIC applauded the FTC’s proposed prohibition against Avast’s sale or disclosure of browsing data for advertising purposes. EPIC commended the FTC’s ongoing work to prevent such unfair and deceptive data practices—especially in cases where consumers take proactive steps to protect their privacy yet are harmed by a business’s bait-and-switch. EPIC offered two suggestions to make the final order stronger: (1) extend the prohibition to cover the sale or disclosure of browsing information for other purposes, including sales of personal data to national security authorities, and (2) incorporate a comprehensive data minimization framework to limit the collection, disclosure, and retention of personal data.

EPIC has previously called on the FTC to impose data minimization requirements to protect consumers’ privacy and routinely advocates for protections against the sale of personal information to national security and law enforcement agencies.

Dear Diary, It’s Me, Jessica: Part 9


Dear Diary,

It’s me, Jessica.

The weather has turned, and the highs and lows are now much more consistent with warmer weather.  HAM guy said we should be past the last frost date.

I helped Mom for three days, planting in every container, every possible space where the chickens could not get to the herbs and veggie seeds.  We mixed in compost and a little chicken manure and then watered.  I was digging up the front lawn when Mom commented on my arms and shoulders.  It never occurred to me until then, but my upper arms, shoulders, and even some of my back were what we would have called ‘built’ before the power went out.  All the manual labor at the Millers’ and around our home, carrying firewood and hauling water, all the walking, and Mom and Dad making sure I had enough to eat, I was kinda ‘built.’  

I think I lost a bra size.

I laughed and said, “Sure Mom, that is what a man looks for in a woman.”

“I am sure Billy doesn’t mind,” Mom said with a sly look.  

I blushed furiously and returned to digging up the front yard for root veggies and corn.  Mom and I helped Joanne with her garden, too.  I moved wheelbarrows of compost and some of our chicken manure back and forth into her garden.  We all should have a good harvest this year. 

The next day was the weekly neighborhood ‘militia’ training.  

Jack and others with military training put the rest of us through basic drills.  Jack said he did not expect us to march a military campaign on Moscow or be a part of some ‘high speed, low drag’ SpecOps team, but we should know at least the basics of advance, tactical retreat, covering fire, cover, and concealment, and flanking maneuvers.  We needed to learn how to work as a team.  Everyone took turns acting as the team leader in a maneuver, even me.

Jack and Rae pulled me aside after ‘militia’ training was over and trained me further in hand-to-hand combat, using my rifle as a physical ‘blunt’ object weapon and the ‘snap’ shot, which did include some live fire.  

Diary, I was tired after militia training.  As I practiced various hand-to-hand techniques and what Jack described as “the mighty butt stroke” with my rifle in hand, I was getting a bit annoyed.  

And it showed.

Then Rae said, “Young lady, this training may save your life one day.  That attitude will not.”

Sheepishly, I nodded and attacked the…

Continue reading here

GAO Reports More Negatives Than Positives to Police Use of Facial Recognition, Highlights Need for Comprehensive Data Privacy Law 

In a report published by the Government Accountability Office (GAO) on April 22, 2024, the agency surveyed federal agencies, biometric tech vendors, academics, and advocacy groups to understand the impact of biometric technologies. EPIC was one of the groups consulted. The GAO focused primarily on facial recognition, finding that even the best algorithms retain racial and gender biases in controlled laboratory testing. The agency also concluded that the reported negative impacts outnumbered the positive impacts for law enforcement use of facial recognition. GAO noted serious privacy, civil rights, effectiveness, and transparency concerns across all uses of biometrics. The agency recorded numerous examples of biometric technology exacerbating systemic inequities when used for policing and border enforcement, access to government benefits, and commercial settings.

In response to these identified harms, the GAO laid out key considerations for preventing harm from biometric technologies, many of which align with the National Academy of Sciences recommendations for facial recognition. Those considerations were: 1) conducting much more rigorous performance testing across biometric technologies; 2) promoting transparency into how the government and vendors design and operate biometric systems, as well as the limitations of those systems; 3) taking a risk-based approach to rules, guidance, and use of biometric technologies; 4) instituting comprehensive data privacy laws; and 5) improving training and guidance. In particular, the GAO recognized that when a system’s “risks to rights or safety exceed an acceptable level and where mitigation strategies do not sufficiently reduce risk, agencies must stop using the AI system as soon as is practicable.”

Earlier this year, EPIC submitted comments in response to DOJ and DHS’s Request for Written Submissions on Sec. 13(e) of Executive Order 14074, urging DOJ and DHS to center vulnerable communities in their crafting of new guidance on the use of facial recognition, predictive policing technologies, social media surveillance tools, and DNA analysis tools. EPIC also submitted comments on facial recognition to the U.S. Commission on Civil Rights. EPIC’s recent comments to the White House Office of Management and Budget (OMB) on improving Privacy Impact Assessments highlighted the ways agencies currently fail to account for the harms of biometric technologies, including facial recognition.

FCC Fines Largest Carriers for Misuse of Location Data—Many Years Later

The FCC issued fines yesterday against the country’s four largest mobile carriers for their roles in selling phone subscriber location data in violation of Section 222 of the Communications Act and of Customer Proprietary Network Information (CPNI) rules. EPIC applauds the FCC for taking this action, but notes that the agency needs to strengthen its protection of subscriber location data, especially in the absence of a federal privacy law.

Before the FCC issues a forfeiture order to fine a company, the FCC first issues a notice of apparent liability (NAL). The NALs in this case were issued against mobile carriers more than four years ago, in February 2020—a delay that is likely due in part to the Senate failing to confirm a fifth FCC Commissioner until last year. In 2020, Commissioner Starks described the investigation leading to the NALs as too slow moving, incomplete, and inadequate. He highlighted issues with the FCC’s procedures for handling company confidentiality requests and argued that these shortcomings would make it harder for other agencies (e.g., the FTC) to follow up with enforcement actions against the bad actors who were beyond the FCC’s jurisdiction.

FCC Chairwoman Rosenworcel has signaled that subscriber location data is a priority for the Commission. She issued letters of inquiry to carriers about their location data practices in 2022 and, at the beginning of 2024, to car companies and again to carriers in response to disturbing news stories. During her tenure, Chair Rosenworcel has also created a Privacy and Data Protection Task Force. This Task Force played a role in enacting the FCC’s data breach reporting requirements, which highlight mobile location data in the first paragraph. While these are all valuable steps in the right direction, in the absence of comprehensive federal privacy protections, the FCC needs to pick up the pace; rules with meaningful penalties must be supported by prompt and robust enforcement actions to protect subscribers from misuse of their location data.

“We commend the FCC for finally penalizing wireless carriers for illegally disclosing their customers’ location information, but it should not have taken four years. The FCC needs to strengthen its protection of subscriber location data, particularly in the absence of a federal comprehensive privacy law.”

EPIC regularly advocates for policies that strengthen data security for consumer information, protecting data from unauthorized access and other misuse.

State Lawmakers Testify about Industry Influence on Privacy Bills in Vermont Hearing 

State lawmakers from across the country testified Friday to Vermont’s House Committee on Commerce and Economic Development about how industry lobbying affected their experiences championing privacy bills in their states. 

The state lawmakers who testified were: Maine Rep. Maggie O’Neil, Maryland Del. Sara Love, former Oklahoma Rep. Collin Walke, Kentucky Sen. Whitney Westerfield, and Montana Sen. Daniel Zolnikov.

All five lawmakers encouraged Vermont to pass a strong privacy law this session, in many cases urging the Vermont legislators to adopt provisions stronger than what they were able to include in their own state bills. The lawmakers all testified that they experienced heavy lobbying to weaken their bills. 

Kentucky Sen. Westerfield highlighted how national industry lobbying groups like the State Privacy and Security Coalition, NetChoice, and TechNet are working with local chambers of commerce and their members to lobby against strong state privacy laws. 

“The Kentucky Chamber of Commerce, informed by a lot of the biggest players in the country and in the world who are parts of the State Privacy and Security Coalition, the AT&Ts and others, Amazon, and the likes,” Sen. Westerfield said. “You had a lot of people demanding that my bill not advance, or something like my bill not advance.”

Del. Love of Maryland also noted just how much money and effort national lobbyists spent trying to weaken her privacy bill. 

“I will tell you, I have not in my six years, in my second term, seen as hard a lobbying job as these folks did. They put so much money into pushing and lobbying,” Del. Love said. “The more they pushed, the more we said — and not just me, also other legislators said — ‘Enough. We’re going to pass something that matters.’”

Del. Love and Maine Rep. O’Neil—both of whom sponsored privacy bills this year—talked about how industry lobbyists used small and local businesses to help fight strong privacy laws. 

“At the very end, when they realized this bill was actually going through and it was going through with data minimization and it was going through the way it was, that’s when they pulled out small business: ‘Oh, this is going to hurt small businesses,’” Del. Love said. “I think they’re learning that that’s something that sways legislators because we all care about our small businesses. And it was fascinating to watch the evolution because that was not an issue early on, and then you could see certain legislators have that talking point near the end.”

Rep. O’Neil pointed to a similar strategy in Maine. 

“Something we saw that was really hard to combat was that these companies organized local businesses and well-known Maine companies to speak on their behalf. In Maine, LL Bean is a well-known big company, and they became a big spokesperson for the interests of the bigger lobbying groups,” Rep. O’Neil said. “In Maine, if LL Bean says something, people are going to respect that.”  

Vermont Rep. Monique Priestley, a lead sponsor of a 

Data Brokers Threaten National Security. The Consumer Financial Protection Bureau’s Fair Credit Reporting Act Rulemaking Can Reduce the Threat.

Data brokers—companies that aggregate and sell our personal data—are a well-known threat to privacy. But data brokers do more than degrade our privacy; they also pose a serious threat to national security. Data brokers routinely build extensive dossiers of information on Americans, including members of the armed forces and others in national security posts. Much of that personal data is collected as we browse the internet and use online services, including sensitive location records, health information, biometric data, and financial information. Many companies extract and sell that information to data brokers, who in turn compile records, draw inferences, and resell datasets and dossiers to willing buyers—whoever they may be.

This lawless data broker ecosystem allows foreign adversaries to purchase detailed records about individuals, which puts service members and other government officials at risk and can reveal sensitive national security information, such as patrol routes around military bases. Further, bad actors can use information obtained from data brokers to blackmail service members or government officials or to extract state secrets from them through phishing. Data brokers pose a serious threat to national security that must be addressed.

The Consumer Financial Protection Bureau (CFPB) has outlined proposed rules pursuant to its authority under the Fair Credit Reporting Act (FCRA), which would limit how data brokers collect and share information. These rules would reduce the capability of foreign adversaries and bad actors to obtain sensitive personal information with national security implications from data brokers, cutting off the flow of personal data at the source.

The Bureau announced its interest in an FCRA rulemaking in March 2023 and released an overview of the regulations it is considering in September. EPIC filed comments at both stages. The CFPB is expected to issue a proposed rule, or Notice of Proposed Rulemaking (NPRM), later this year. Members of the public will have an opportunity to submit comments, and the CFPB will consider the comments it receives while formulating a final rule.

How data brokers threaten national security

Researchers have highlighted several troubling ways in which data brokers threaten national security. For example, Duke University researchers found that data brokers are selling sensitive data, including names, addresses, geolocation records, religion, net worth, and health information, about active-duty military members, veterans, and their families. The researchers contacted U.S. data brokers and were able to purchase American service members’ records for as little as $0.12 per record. The Irish Council for Civil Liberties found that foreign adversaries can obtain sensitive data about U.S. service members, politicians, and other high-profile figures through the real-time bidding system used by data brokers to target online advertisements. In 2018, Strava released a global heat map showing user activity records, and researchers and activists found that the heat map could be used to identify the locations of military bases and patrol routes, as well as identifying information for the service members who used Strava in those locations.

What is government doing to address the threat data brokers…