Third State Privacy Act Enacted
EDPB further develops GDPR; DOD cancels JEDI; New CISA head confirmed
Colorado has enacted the United States’ (U.S.) third state privacy law, further complicating Congress’s task of passing a federal privacy statute. The “Colorado Privacy Act” (SB 21-190) bears greater resemblance to Virginia’s “Consumer Data Protection Act” (SB 1392/HB 2307) (see here for more detail and analysis) than to the “California Privacy Rights Act” (Proposition 24). The bill uses an enhanced notice and consent model that puts the onus on Colorado residents to opt out of certain personal data collection and processing, subject to broad exceptions. Moreover, there is no private right of action, and controllers must be given notice and 60 days to cure violations before the attorney general or district attorneys may enforce the new law.
In a sign that the bill is industry friendly, some privacy and consumer rights advocates voiced displeasure with it. For example, the Colorado Public Interest Research Group argued:
§ Here in Colorado, our legislature is considering an “opt-out” bill, meaning that companies get access to our personal data unless we tell them to stop. Furthermore, the bill, SB21-190 titled Protect Personal Data Privacy, only lets consumers opt-out of data collection, tracking, selling and sharing for a few purposes.
§ This means that few things will actually change around data privacy in Colorado. We’re enshrining the wrong foundation into law, and the bill will make future data privacy gains harder to achieve.
With privacy legislation currently at a standstill in Congress, it is very likely industry will continue to work state legislatures to block laws it sees as unfavorable and to get favorable laws enacted. Hence, the U.S. may wind up with piecemeal privacy statutes, much like the current landscape of data breach and notification laws.
As always, let’s begin with some of the key definitions. Children are defined as those 12 years of age and under, which matches the federal privacy statute, the “Children’s Online Privacy Protection Act” (COPPA), which the Federal Trade Commission (FTC) enforces.
Consent is defined as:
A clear, affirmative act signifying a consumer's freely given, specific, informed, and unambiguous agreement, such as by a written statement, including by electronic means, or other clear, affirmative action by which the consumer signifies agreement to the processing of personal data.
This is a fairly strong definition, strengthened by subsequent language making clear what does not constitute consent:
(a) Acceptance of a general or broad terms of use or similar document that contains descriptions of personal data processing along with other, unrelated information;
(b) hovering over, muting, pausing, or closing a given piece of content; and
(c) agreement obtained through dark patterns.
The bill uses the General Data Protection Regulation’s (GDPR) phraseology of controllers and processors, with the former being an entity “that, alone or jointly with others, determines the purposes for and means of processing personal data” and the latter being an entity that processes personal data on behalf of a controller. Determining which entities are controllers and which are processors will be a fact-specific determination, meaning there is no bright line test or rule in the bill to distinguish between the two.
Personal data is defined widely; it shall be “information that is linked or reasonably linkable to an identified or identifiable individual” that is neither de-identified data nor publicly available information. The strength of the definition will ultimately hinge on how “linked” and “reasonably linkable” are read, for if Colorado’s courts read these terms broadly, then more of a person’s data would meet the definition and qualify for protections and rights under the bill. If the terms are read narrowly, and undoubtedly any entities fighting enforcement will seek to do just that, then the aperture of personal data rights tightens, leaving entities to do what they will with information outside this definition.
Like many recent bills, there is a subclass of personal data, “sensitive data” that is subject to heightened rights. This term is defined as:
(a) personal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status;
(b) genetic or biometric data that may be processed for the purpose of uniquely identifying an individual; or
(c) personal data from a known child
The first element is fairly broad and even standard by now in many of the privacy bills. The second is also fairly standard but would seem to exclude the processing of biometric or genetic data for purposes other than “uniquely identifying an individual.” Would the processing of these sorts of data for anything short of uniquely identifying a person fall outside the definition of sensitive data? It appears so. Consequently, it may be legal under Colorado’s statute to process genetic or biometric data as regular personal data if the purpose is not to uniquely identify people. Regarding the third element, if my elementary-school-age son makes a secret TikTok account with a fake birthday, then the platform might arguably claim it did not know my son was a child and therefore was not responsible for meeting the higher standard for sensitive information.
The definition of sale, sell, or sold requires the exchange of money or something of value, a potentially significant loophole, for much of the data collection and processing world thrives on trading data. Conceivably, trading personal data would qualify as “other valuable consideration,” but this remains to be seen once implementation and enforcement of the bill begin. Moreover, sell, sale, and sold do not include:
(i) the disclosure of personal data to a processor that processes the personal data on behalf of a controller;
(ii) the disclosure of personal data to a third party for purposes of providing a product or service requested by the consumer;
(iii) the disclosure or transfer of personal data to an affiliate of the controller;
(iv) the disclosure or transfer to a third party of personal data as an asset that is part of a proposed or actual merger, acquisition, bankruptcy, or other transaction in which the third party assumes control of all or part of the controller's assets; or
(v) the disclosure of personal data:
(a) that a consumer directs the controller to disclose or intentionally discloses by using the controller to interact with a third party; or
(b) intentionally made available by a consumer to the general public via a channel of mass media.
Some of these exceptions bear note. A number of companies use Amazon Web Services (AWS). Under the above exceptions, if company A provides me a streaming service that uses AWS, then the company may provide my personal data to Amazon, and I would not be able to opt out of such a disclosure (more on this later). This may prove to be a significant loophole.
Additionally, the last exception under which a person “directs” a controller to disclose information may also be an avenue by which controllers can collect and share personal data outside the confines of the new regulatory system. It will almost certainly prove the case that people will continue to see voluminous privacy notices and disclosures, and it will likely become de rigueur to slip language into these requiring Colorado residents to “direct” controllers to disclose information.
The Colorado Privacy Act would apply to many companies operating in the state. The new bill applies to any controller that:
(a) conducts business in Colorado or produces or delivers commercial products or services that are intentionally targeted to residents of Colorado; and
(b) satisfies one or both of the following thresholds:
(i) controls or processes the personal data of one hundred thousand consumers or more during a calendar year; or
(ii) derives revenue or receives a discount on the price of goods or services from the sale of personal data and processes or controls the personal data of twenty-five thousand consumers or more.
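The thresholds above reduce to a simple boolean test. The following is an illustrative sketch only; the function and parameter names are mine, not the statute’s, and it omits the nuances of counting “consumers” and the bill’s many exemptions:

```python
def cpa_applies(conducts_business_in_colorado: bool,
                consumers_per_year: int,
                derives_revenue_from_sales: bool) -> bool:
    """Illustrative sketch of the Colorado Privacy Act applicability test.

    A controller must satisfy the conduct-of-business prong AND at least
    one of the two volume thresholds. This simplification ignores how
    "consumer" is counted and all statutory exemptions.
    """
    if not conducts_business_in_colorado:
        return False
    large_scale = consumers_per_year >= 100_000
    data_seller = derives_revenue_from_sales and consumers_per_year >= 25_000
    return large_scale or data_seller

# A controller processing 30,000 consumers' data that profits from selling it:
print(cpa_applies(True, 30_000, True))   # True
# The same controller without revenue from sales falls below both thresholds:
print(cpa_applies(True, 30_000, False))  # False
```

Note how the second threshold does double work: a relatively small data broker is covered at 25,000 consumers, while a company that never sells data is covered only at 100,000.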
This definition would encompass most large companies doing business in the state with significant data collection and processing operations. But many small businesses would be exempt, and they may be a conduit through which a significant amount of personal data passes to the larger data processing world. Additionally, entities subject to a number of federal privacy statutes would not be subject to the Colorado Privacy Act, including but not limited to those regulated under:
§ The Health Insurance Portability and Accountability Act/Health Information Technology for Economic and Clinical Health Act
§ Fair Credit Reporting Act
§ Financial Services Modernization Act of 1999 (aka Gramm-Leach-Bliley)
§ Family Educational Rights and Privacy Act
Consequently, most entities in the following fields would not need to adhere to the bill: healthcare, financial services, credit reporting, and education. This is fairly standard in many of the more industry-friendly privacy bills that have been introduced in states and Congress, but this recurring set of carveouts seems contrary to the purported motivation for U.S. privacy legislation that would create one standard across the country.
Moreover, the Colorado Privacy Act’s requirements do not apply to a controller’s or processor’s ability to do a number of things, including meeting federal and state government and judicial requirements. These exceptions, which will likely be construed as liberally as possible, may allow controllers and processors to argue that some of their current activities are not subject to the bill, such as the ability to:
§ conduct internal research to improve, repair, or develop products, services, or technology;
§ identify and repair technical errors that impair existing or intended functionality;
§ perform internal operations that are reasonably aligned with the expectations of the consumer based on the consumer's existing relationship with the controller;
§ protect the vital interests of the consumer or of another individual;
§ prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, or malicious, deceptive, or illegal activity; preserve the integrity or security of systems; or investigate, report, or prosecute those responsible for any such action;
It would not prove surprising if many of these exceptions are used in ways that prove contrary to the legislative intent of the bill.
In terms of those activities that are subject to the Colorado Privacy Act, processors must follow the direction of controllers and must help controllers in a number of ways, including helping them meet their responsibilities to Colorado residents. There must be a binding contract between controllers and processors that sets out the processing instructions, including the purpose and nature of these activities and the type of personal data to be processed. Another key passage in any such contract is what a processor is to do with the personal data when the processing is finished. A processor must return or delete these data unless retention is required by law. Additionally, controllers and processors must “implement appropriate technical and organizational measures to ensure a level of security appropriate to the risk and establish a clear allocation of the responsibilities between them to implement the measures.” Moreover, like many privacy bills, controllers or processors that disclose personal data to other controllers or processors are not liable for any of the latter’s violations unless the former had actual knowledge the recipient of the information intended to violate the new privacy regime.
Colorado residents will receive the by now usual set of rights regarding their personal data. One may opt out of the processing of her personal data if the purpose is for:
§ targeted advertising
§ the sale of personal data; and
§ “profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer”
Taking the second activity first, Colorado residents can opt out of having their personal data sold. Of course, privacy and consumer rights advocates would prefer that the mechanism be opt in, because in many situations people will choose whatever the default setting may be. Hence, if the default setting is that one’s personal data may be sold, then more people will allow that than under an opt-in regime, which would result in fewer people agreeing, as evidenced by the opt-in rates under Apple’s new iPhone OS.
Apart from this issue, however, granting people the right to opt out of sales makes the definition of what constitutes selling all the more important; hence the discussion above about the limits of the term and its exceptions. If a controller is giving away personal data and sharing it freely, this probably does not constitute a sale under the Colorado Privacy Act. Such a controller may face liability for not securing and safeguarding the information, but that is a different issue. The central issue is that people will only be able to opt out of the selling of their personal data and not the collection and processing of it. Moreover, it is not clear whether trading information qualifies as a sale, although it may, for presumably the personal data have value. But some larger entities may claim the data alone lack value until their proprietary processing methods give them value. In any event, this strikes me as a big loophole.
Going back to the first practice people can opt out of, “targeted advertising,” the practice, as defined under the bill, encompasses ads based on personal data gleaned from websites or apps not affiliated with a controller. In other words, if I happen to be logged into Facebook, and I have opted out of targeted advertising, and I see an advertisement for German luxury cars based on searches and research not performed on Facebook, this practice would violate the new law. But, if I searched on Facebook (of course, why would I, but bear with me) for German luxury cars, then Facebook could show me targeted ads based on this interest. Or, more likely, if I read about German cars on Facebook, Facebook could start targeting me with BMW and Mercedes ads. The definition of “targeted advertising” is
(a) displaying to a consumer an advertisement that is selected based on personal data obtained or inferred over time from the consumer's activities across nonaffiliated websites, applications, or online services to predict consumer preferences or interests; and
(b) does not include:
(i) advertising to a consumer in response to the consumer's request for information or feedback;
(ii) advertisements based on activities within a controller's own websites or online applications;
(iii) advertisements based on the context of a consumer's current search query, visit to a website, or online application; or
(iv) processing personal data solely for measuring or reporting advertising performance, reach, or frequency.
This would allow much of the current targeted advertising world to continue apace. Moreover, even if I did see targeted advertising contrary to a clear and affirmative opt out, how would I prove a violation? Enforcement of this proscription may prove tricky.
Finally, the Colorado Privacy Act seems to be making the default setting for the collection and processing of personal data that controllers and processors may engage in “profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer.” Before we can digest what this means, we need to review two key definitions. First, profiling is defined as
Any form of automated processing of personal data to evaluate, analyze, or predict personal aspects concerning an identified or identifiable individual's economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.
Consequently, profiling can be my auto insurer using a range of personal data to project the likelihood of my getting into an accident and then setting my rates. Or a potential employer generating a profile on my likely salary request. Or a life insurer digging into my personal data to determine how much I drink, smoke, use illicit drugs, skydive, or whatever to write me a policy.
The other definition spells out the rest of the right a person may opt out of. The bill explains:
"decisions that produce legal or similarly significant effects concerning a consumer" means a decision that results in the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment opportunities, health-care services, or access to essential goods or services.
In layman’s terms, the default setting under the Colorado Privacy Act is to allow controllers to use automated means to make decisions about Colorado residents that could result in the provision or denial of financial services, housing, insurance, education, employment, and other things. If a person does not opt out, a controller is well within its legal rights to do any of the above, bounded only by Colorado’s civil rights laws or U.S. civil rights laws, as the bill makes clear.
The bill permits the use of what has been deemed a universal opt out, under which one downloads and uses, say, a browser extension that signals to every website she visits that she wants to opt out of targeted advertising, the sale of personal data, and profiling. There are other described means by which a person could achieve the same. Moreover, every controller must post clear and conspicuous opt out means on its website. There is a potential catch, however. Controllers must comply with all such requests if they can authenticate who the person is. Undoubtedly, it would accrue to a controller’s financial interests to authenticate as few of the people requesting to opt out as possible. Depending on how strictly the Colorado Privacy Act is enforced, this may be another off ramp controllers use to avoid regulation.
Nevertheless, before 1 July 2024, controllers may allow people to use a universal opt out mechanism, but after this date, they have to allow people to do so. Again, the battle may be waged over authentication. Regardless, the Colorado Attorney General must establish technical specifications for universal opt out mechanisms.
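For a sense of what honoring a universal opt out might look like in practice, here is a minimal server-side sketch using the Global Privacy Control (GPC) signal, one existing browser-based mechanism of this kind. GPC appears here purely as an illustration of my own choosing; the bill itself names no mechanism, and the actual technical specifications will come from the Colorado Attorney General:

```python
def wants_universal_opt_out(headers: dict) -> bool:
    """Check a request's headers for the Global Privacy Control signal.

    Under the GPC proposal, a browser or extension sends "Sec-GPC: 1"
    with each request when the user has turned the signal on.
    """
    return headers.get("Sec-GPC") == "1"

request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}
if wants_universal_opt_out(request_headers):
    # A controller would treat this as an opt out of targeted advertising,
    # the sale of personal data, and profiling for this visitor.
    print("universal opt-out signal received")
```

Note that a signal like this identifies a browser, not an authenticated person, which is precisely where the authentication question discussed above may give controllers room to resist.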
Finally, there is a provision allowing controllers to ask people to consent to targeted advertising or the sale of personal data even if a person has already exercised a universal opt out for these practices. This seems contrary to the entire notion of a universal opt out.
And yet, controllers must obtain opt-in consent to process sensitive personal data, or consent from the parents of a child (defined as 12 and under) for the processing of a child’s personal data.
Colorado residents will also have the following rights:
§ The right to determine whether a controller is processing (but not merely collecting and holding) one’s personal data and to access these data
§ A potential ability to correct inaccuracies in personal data depending on the nature of these data and the type of processing
§ The right to demand a deletion of personal data
§ The right to port personal data “in a portable and, to the extent technically feasible, readily usable format.”
Controllers have 45 days to respond to people’s requests and may unilaterally extend the period by 45 days “where reasonably necessary.”
Controllers may also refuse requests but must furnish the reasons for the denial. A Colorado resident may appeal such a refusal, and each controller must have an internal appeals process. If the denial is “upheld,” then a person may contact the attorney general’s office. Moreover, there is language allowing controllers to request additional information from a person to authenticate her identity. It seems fairly predictable that a significant number of controllers will, as a matter of course, require additional information before honoring requests to exercise data rights in order to limit the number of requests they must honor.
Controllers must provide people “with a reasonably accessible, clear, and meaningful privacy notice that includes:
(i) the categories of personal data collected or processed by the controller or a processor;
(ii) the purposes for which the categories of personal data are processed;
(iii) how and where consumers may exercise the rights pursuant to section 6-1-1306, including the controller's contact information and how a consumer may appeal a controller's action with regard to the consumer's request;
(iv) the categories of personal data that the controller shares with third parties, if any; and
(v) the categories of third parties, if any, with whom the controller shares personal data
Moreover, if controllers sell personal data to third parties or process personal data for targeted advertising, they must “conspicuously disclose the sale or processing, as well as the manner in which a consumer may exercise the right to opt out of the sale or processing.”
Controllers must also specify the “express purposes” for which personal data are collected and processed. Likewise, there is a duty to collect only personal data that are “adequate, relevant, and limited to what is reasonably necessary in relation to the specified purposes for which the data are processed.” Moreover, controllers are banned from secondary uses of personal data unless they obtain consent from people. While “secondary use” is not explicitly defined, the bar against secondary use seems to involve use beyond purposes that are reasonably necessary for or compatible with the specified purposes for which the data were collected. Controllers also have a duty of care to take “reasonable measures” to protect personal data in “storage” and from “unauthorized acquisition.” Does this duty extend to personal data in transit? It is not clear. Moreover, who authorizes the acquisition of personal data? Might a controller authorize the disclosure of data to a third party without valuable consideration regardless of what a person wants? It seems this is permissible under Colorado’s new privacy law.
Controllers also have a duty to conduct a data protection assessment if their processing presents a “heightened risk of harm” to people. This concept is defined as including:
(a) processing personal data for purposes of targeted advertising or for profiling if the profiling presents a reasonably foreseeable risk of:
(i) unfair or deceptive treatment of, or unlawful disparate impact on, consumers;
(ii) financial or physical injury to consumers;
(iii) a physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers if the intrusion would be offensive to a reasonable person; or
(iv) other substantial injury to consumers;
(b) selling personal data; and
(c) processing sensitive data
Data protection assessments would be made available only to the attorney general and could not be obtained through freedom of information requests. However, data protection assessments are only required after 1 July 2023 and need not be retrospective, meaning they will almost certainly only cover activities after that date.
There is no private right of action, and only the attorney general or district attorneys could enforce the Colorado Privacy Act. Under existing authority, they could seek civil penalties of $20,000 per violation and injunctive relief. However, they could not seek damages for willful or egregious violations. Finally, before they can go to court to ask for an injunction, the controller must be given notice of the violation and 60 days to cure the conduct.
The statute will come into force on July 1, 2023 unless a referendum petition is filed within 90 days of the end of the legislative session to put the legislation to a vote on the 2022 ballot in Colorado.
Other Developments
§ The European Data Protection Board (EDPB) summarized the results of its most recent plenary session even though not all of the documents have yet been made available:
o During its plenary session, the EDPB adopted Guidelines on Codes of Conduct (CoCs) as a tool for transfers. The main purpose of the guidelines is to clarify the application of articles 40 (3) and 46 (2) (e) of the GDPR. These provisions stipulate that once approved by a competent SA and after having been granted general validity within the EEA by the Commission, a CoC may also be adhered to and used by controllers and processors not subject to the GDPR to provide appropriate safeguards to transfers of data outside of the EU. The guidelines complement the EDPB Guidelines 1/2019 on codes of conduct which establish the general framework for the adoption of codes of conduct.
o The EDPB adopted a final version of the Guidelines on Virtual Voice Assistants (VVA). The Guidelines aim to provide recommendations to relevant stakeholders on how to address some of the most relevant compliance challenges for VVAs. Following public consultation, the Guidelines were updated to reflect comments received.
o Also following public consultation, the EDPB adopted a final version of the Guidelines on the concepts of Controller and Processor. These Guidelines aim to provide clarifications concerning fundamental concepts such as (joint) controller and processor. The final version integrates updated wording and further clarifications in order to address comments and feedback received during the public consultation.
o Following the establishment of TikTok in the EU and the identification of its main establishment in Ireland for the ongoing cases related to the TikTok app, the EDPB decided to disband its TikTok Taskforce. This Taskforce was created to coordinate potential actions from the EEA supervisory authorities (SAs) and to acquire a more comprehensive overview of TikTok’s processing and practices across the EU. At the time the Taskforce was created, there was no main establishment for TikTok in the EU and the Taskforce aimed to facilitate the exchange of information between SAs. Now, the One-Stop-Shop procedure applies and the Irish SA (DPC) was designated as the lead authority in charge of the files.
o Consequently, the SAs involved in the Taskforce will use the designated tools under the cooperation mechanism, while also taking into account article 64(2) GDPR and EDPB opinion 8/2019 on the competence of a supervisory authority in case of a change in circumstances relating to the main or single establishment. Several SAs have already transferred their investigations to the DPC.
o The SAs will have the opportunity to hold discussions on the matter, within the EDPB, and in particular within its Enforcement Expert Subgroup.
o It is important to note that the EDPB can only take action in case the consistency mechanism in article 63 GDPR is triggered. After having issued provisional measures pursuant to article 66(1) GDPR, and having received assurances from TikTok on their application, including commitments that the latter undertook regarding its processing activities, the Italian SA decided that it no longer requires an urgent decision from the EDPB.
o Finally, the EDPB discussed possible topics for its first coordinated enforcement action, following the EDPB’s decision to set up a Coordinated Enforcement Framework on 20 October 2020. The EDPB decided that the first action will concern the use of cloud-based services by public sector bodies and further work will now be carried out to specify the details and the scope in the upcoming months.
§ A “senior administration official” provided a background briefing to media “On President Biden’s Call With President Putin of Russia.” This official claimed:
o President Biden also spoke with President Putin, obviously, about the ongoing ransomware attacks by criminals based in Russia that have impacted the United States and other countries around the world. President Biden underscored the need for Russia to take action to disrupt ransomware groups operating on Russian territory, and emphasized that he’s committed to continued engagements on the broader threat posed by ransomware.
o President Biden reiterated that the United States will take necessary action to defend its people and its critical infrastructure in the face of this continuing challenge. And the President has also called on governments and agencies to modernize their defenses to meet this threat, building on the President’s executive order on cybersecurity that was released in May.
o I want to say a few other things about this: The President really meant what he said just after concluding the summit meeting in Geneva, when he said that our assessment of this process and our evaluation of Russia’s actions would take time and play out over time. The President said six months or more.
o It’s about addressing the challenges posed by cryptocurrency, which provides fuel for these sorts of transactions.
o It’s about ensuring that our allies and our partners are working with us, collaboratively, and upping their own game when it comes to resilience and these broader issues.
o So, this is a broad campaign and won’t have an immediate on-off effect like a light switch, but we’re going to have to stay on top of this over a period of time and remain focused on it.
§ The Senate confirmed Jen Easterly by voice vote to be the second Director of the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA). Last month, the Senate confirmed former National Security Agency Deputy Director Chris Inglis to be the United States’ (U.S.) first National Cyber Director. Now the Biden Administration has filled two of the most critical cybersecurity roles in the government. It remains to be seen how Inglis and Easterly will coordinate with Deputy National Security Advisor for Cyber and Emerging Technology Anne Neuberger, who has thus far been the Biden Administration’s point person on cyber and many technology issues. When she was nominated, one of the entities with which Easterly was affiliated posted a bio:
o Jen Easterly is a Managing Director of Morgan Stanley and Global Head of the Firm’s Cybersecurity Fusion Center. She joined the firm in February 2017 after nearly three decades in U.S. Government service. Prior to joining Morgan Stanley, Jen served on the National Security Council as Special Assistant to President Obama and Senior Director for Counterterrorism, where she led the development and coordination of U.S. counterterrorism and hostage policy. Prior to that, she was the Deputy for Counterterrorism at the National Security Agency. A two-time recipient of the Bronze Star, Jen retired from the U.S. Army after more than twenty years of service in intelligence and cyber operations, including tours of duty in Haiti, the Balkans, Iraq, and Afghanistan. Responsible for standing up the Army’s first cyber battalion, Jen was also instrumental in the creation of United States Cyber Command. A member of the Council on Foreign Relations and a French-American Foundation Young Leader, Jen is a Fellow of the 2018 class of the Aspen Finance Leaders Fellowship and a member of the Aspen Global Leadership Network. She is also a Senior International Security Fellow at the New America Foundation, as well as the past recipient of the Council on Foreign Relations International Affairs Fellowship and the Director, National Security Agency Fellowship. A distinguished graduate of the United States Military Academy at West Point, Jen holds a master’s degree in Philosophy, Politics, and Economics from the University of Oxford, where she studied as a Rhodes Scholar. A Trustee of the Morgan Stanley Foundation, Jen serves on the Board of Nuru International, a non-profit dedicated to the eradication of extreme poverty, and on the Advisory Council for Hostage US, which supports the families of Americans taken hostage abroad and hostages when they return home. She is the 2018 recipient of the James W. Foley American Hostage Freedom Award.
§ Republicans on the House Judiciary Committee published their agenda for “Taking on Big Tech.” In their press release, they argued:
o Big Tech is out to get conservatives. Last week, Leader McCarthy put forth a Republican framework to stop Big Tech. Today, House Judiciary Republicans released their agenda to hold Big Tech accountable. This agenda presents specific proposals that will speed up and strengthen antitrust enforcement, hold Big Tech accountable for its censorship, and increase transparency around Big Tech’s decisions.
o Republicans offered the following:
§ -Speed-
§ Our plan accelerates overdue antitrust scrutiny. The laws currently on the books can and should be used to break up Big Tech. The problem has been, however, that these actions take too long and they allow companies years of legal maneuvering. An important step is to speed up this process and incentivize robust challenges to the dominance of the tech platforms. The conservative response should include the following:
· Expedited trial court consideration. In the early Twentieth Century, Congress required courts to aggressively apply antitrust laws. Consistent with the Expediting Act of 1903, this proposal would require faster treatment of antitrust cases against Big Tech companies at the trial court, create an adequate record for any appeal, and ultimately ensure speedy justice in the fight against Big Tech.
· Direct appeal to the Supreme Court. Antitrust cases take so long to litigate in part because of the length of the appellate process. Borrowing again from aspects of the Expediting Act of 1903, this proposal would speed up consideration of these cases by providing for a direct appeal to the Supreme Court and requiring the Supreme Court to act quickly when these cases get there.
· Empower state attorneys general. State attorneys general are crucial partners in enforcing our nation’s antitrust laws. Several states have started or joined cases targeting Big Tech. This proposal would allow state attorneys general to utilize the same fast-track procedures available to the Federal government so that they will be on equal footing in their cases.
§ -Accountability-
§ Our plan subjects Big Tech to legal accountability for its censorship. Platforms like Twitter, Facebook, and YouTube are functionally the public square of the digital age. It is wrong that these platforms control and censor speech with impunity. But it is nearly impossible for Americans to seek a remedy against Big Tech’s censorship decisions in court. In addition, the current regulatory regime divides enforcement between two unrelated agencies, creating an inefficient and unaccountable process. The conservative response to holding Big Tech accountable should include:
· A cause of action to empower Americans. For far too long, Big Tech has been able to censor the views of conservatives with effectively no recourse available to those affected. This proposal would create a statutory basis for Americans to directly challenge Big Tech in court for its censorship and silencing of conservatives.
· Overhauling Big Tech’s liability shield. Congress passed Section 230 of the Communications Decency Act to allow internet platforms to moderate unlawful or offensive content on their platforms. Big Tech has exploited this protection to make subjective content moderation decisions, often in a manner harmful to conservative voices. This proposal will ensure that any content moderation decisions are done in good faith, based on objectively reasonable criteria, and in accord with particularized rules.
· Consolidated antitrust enforcement authority. The current system of splitting antitrust enforcement between the Department of Justice and the Federal Trade Commission is inefficient and counterproductive. The arbitrary division of labor empowers radical Biden bureaucrats at the expense of Americans. This proposal will consolidate antitrust enforcement within the Department of Justice so that it is more effective and accountable.
§ -Transparency-
§ Our plan brings transparency to Big Tech’s content moderation decisions. Not only are the platforms currently immunized from lawsuits regarding their censorship, but all of their decisions about who to censor are made in secret. Recognizing that these platforms function as the main vessels for speech in the modern era, this plan for bringing transparency to Big Tech should include:
· Content moderation transparency. Big Tech’s content moderation decisions can be imposed summarily and with little justification. This proposal will require that for large platforms, content moderation decisions and censorship must be listed, with specificity and particularity, on a publicly available website. A platform’s failure to do so would result in a massive fine.
§ The United States (U.S.) Department of Defense (DOD) “canceled the Joint Enterprise Defense Infrastructure (JEDI) Cloud solicitation and initiated contract termination procedures” per its press release. The DOD has faced protracted litigation from Amazon Web Services (AWS) regarding the legality of the JEDI award to Microsoft and its partners and subcontractors worth as much as $10 billion. The DOD continued:
o The Department has determined that, due to evolving requirements, increased cloud conversancy, and industry advances, the JEDI Cloud contract no longer meets its needs. The Department continues to have unmet cloud capability gaps for enterprise-wide, commercial cloud services at all three classification levels that work at the tactical edge, at scale -- these needs have only advanced in recent years with efforts such as Joint All Domain Command and Control (JADC2) and the Artificial Intelligence and Data Acceleration (ADA) initiative.
o Concurrent with the cancellation of the JEDI Request for Proposals (RFP), the DOD announced its intent for new cloud efforts. The Joint Warfighter Cloud Capability (JWCC) will be a multi-cloud/multi-vendor Indefinite Delivery-Indefinite Quantity (IDIQ) contract. The Department intends to seek proposals from a limited number of sources, namely the Microsoft Corporation (Microsoft) and Amazon Web Services (AWS), as available market research indicates that these two vendors are the only Cloud Service Providers (CSPs) capable of meeting the Department’s requirements. However, as noted in its Pre-Solicitation Notice, the Department will immediately engage with industry and continue its market research to determine whether any other U.S.-based hyperscale CSPs can also meet the DOD’s requirements. If so, the Department will also negotiate with those companies.
§ United States (U.S.) Privacy and Civil Liberties Oversight Board (PCLOB) Member Travis LeBlanc released his reasons for opposing a report earlier this year on a key executive order the United States (U.S.) government has used for nearly 40 years to conduct surveillance. Earlier this year, PCLOB issued its long-awaited report on Executive Order (EO) 12333, its “capstone to its more than six-year examination of the government’s use” of the EO, a “foundational document for the United States’ (U.S.) foreign intelligence efforts.” However, as pointed out by one expert, this PCLOB report is much shorter than its reports on the Foreign Intelligence Surveillance Act provisions in Sections 215 and 702 and the USA Freedom Act Telephone Call Records Program. Moreover, the EO 12333 report lacks recommendations. This likely displeased more than privacy and civil liberties advocates because the European Data Protection Board (EDPB), among other European Union (EU) bodies, was awaiting the PCLOB report to provide more insight into the use of EO 12333 for collecting the data of EU residents. LeBlanc stated:
o It is with deep regret that I must write in opposition to the release of a report that the former majority of the Board in 2020 ("former Board" or "former Board majority") rushed last year to approve without adequate investigation, analysis, review, or process. While I remain grateful to our Board staff for the many years of effort they have devoted to XKEYSCORE's oversight, I had hoped that the former majority of the Board would have conducted a more thorough investigation of this highly-classified surveillance program that is unlikely to be scrutinized by another independent oversight authority in the near future.
o First, I voted against the XKEYSCORE report because the former Board majority failed to use its investigation into Executive Order (EO) 12333 activities to delve into important technological and modern electronic surveillance issues dominating the public discourse, like the use of algorithmic decision-making.
o Second, the former Board majority failed to adequately investigate or evaluate the National Security Agency's ("NSA") EO 12333 collection activities. Obviously, NSA can process and query communications through XKEYSCORE only once it has access to those communications. While collection and querying are separate activities, they are intertwined and both are worthy of review for separate legal analysis, training, compliance, and audit processes. This is true whether the collection and querying activities are performed by humans or machines. What may be a reasonable amount of "incidental" collection in one program or activity may well be unreasonable in others. Similarly, protections that are designed to mitigate incidental collection may be reasonable in one program or activity and unreasonable in other contexts. On these points and others, the former Board's report unfortunately reads more like a book report summary of the XKEYSCORE program than an independent oversight analysis grappling with key concerns in this evolving technological and legal landscape.
o Third, the former Board majority had the opportunity to engage in evidence-based policy making; however, it concluded a report lacking analysis of the efficacy, costs, and benefits of XKEYSCORE.
o Fourth, the former Board majority failed to adequately investigate the compliance program in place for XKEYSCORE. Unfortunately, when the former Board requested any legal analysis by the NSA or the Department of Justice ("DOJ") regarding the use of XKEYSCORE's functions in 2015, the NSA provided a 13-page memo prepared by the NSA Office of General Counsel ("OGC") in 2016. The response made it appear as if NSA had not prepared a written analysis of the legality of XKEYSCORE until prompted by PCLOB.
§ The United Kingdom’s Department for Digital, Culture, Media & Sport (DCMS) announced that “[a] new plan to boost economic growth and help the country seize the potential of digital technology will be launched.” DCMS claimed:
o The government’s Plan for Digital Regulation aims to reduce red tape and cut down on cumbersome and confusing policy so businesses are freed to come up with new ideas, grow their firms and create new jobs and prosperity.
o The vision is to drive prosperity through pro-innovation regulation of digital technologies while minimising serious harms to the country’s economy, security and society.
o The digital sector is one of the UK’s most dynamic and important industries. It contributed £151 billion to the economy in 2019, attracted more venture capital funding (£11.2 billion) than Germany and France combined in 2020, and employs more than two million people.
o The new plan sets out three guiding principles policymakers must follow and states that the government should only regulate when absolutely necessary and do so in a proportionate way. They should:
§ Actively promote innovation. Policymakers must back innovation wherever they can by removing unnecessary regulation and burdens and considering non-regulatory measures such as technical standards first.
§ Achieve ‘forward-looking and coherent outcomes’. Digital technologies are evolving fast and transforming traditional sectors across the economy, so policymakers must make sure new regulation complements, rather than contradicts, existing and planned legislation.
§ Exploit opportunities and address challenges in the international arena. Digital technologies are borderless and policymakers must take a global view. They must always consider the international dynamics of proposed regulation - from our existing international obligations including trade deals, expected future agreements, and the impact of regulations developed by other nations.
§ The United States (U.S.) Department of Defense (DOD) issued its customary overview of its annual budget request for information technology and cyberspace activities. For FY 2022, the DOD explained:
o The Department of Defense (DoD) Fiscal Year (FY) 2022 Information Technology/Cyberspace Activities (IT/CA) Budget Request is $50.6B, including $12B in cyber/classified IT/CA investments and $38.6B in unclassified IT investments. The FY 2022 request reflects an overall 4% increase from the DoD FY 2021 enacted IT/CA Budget.
o The DOD included a chart breaking down the request in its overview.
§ Australia’s Treasury “has released exposure draft amendments to the Consumer Data Right (CDR) rules and explanatory materials for consultation.” The agency added:
o The exposure draft amendments include changes to the CDR Rules to accelerate the benefits of the CDR for consumers by reducing barriers to participate in open banking and by allowing more Australians to leverage their data in common banking scenarios. They will support growth of the CDR ecosystem and increase participation in the CDR by data recipients and consumers by:
§ Introducing a sponsored tier of accreditation and a CDR representative model
§ Allowing consumers to share their data with trusted professional advisers
§ Allowing participants to share CDR insights with consumer consent for specific purposes
§ Creating a single consent data sharing model for joint accounts.
§ The Norwegian Consumer Authority has issued a report titled “Time To Ban Surveillance-Based Advertising: The case against commercial surveillance online” and asserted:
o A ban on surveillance-based practices should be complemented by stronger enforcement of existing legislation, including the General Data Protection Regulation, competition regulation, and the Unfair Commercial Practices Directive. However, enforcement currently consumes significant time and resources, and usually happens after the damage has already been done. Banning surveillance-based advertising in general will force structural changes to the advertising industry and alleviate a number of significant harms to consumers and to society at large.
o A ban on surveillance-based advertising does not mean that one can no longer finance digital content using advertising. To illustrate this, we describe some possible ways forward for advertising-funded digital content, and point to alternative advertising technologies that may contribute to a safer and healthier digital economy for both consumers and businesses.
o As pervasive commercial surveillance seeps into all aspects of our daily lives, it becomes clear that there is a need for a systemic reform of the online advertising industry. Discussions are currently under way in the European Union about how to handle surveillance-based advertising as a part of the Digital Services Act. At the same time, discussions are going on about enacting federal privacy legislation and legislative initiatives to curb surveillance-based advertising in the United States, where many of the companies engaged in surveillance-based advertising are headquartered. We therefore stand before a unique legislative opportunity to solve many pressing issues.
o The result of these discussions could have significant consequences for the business model of the majority of online content, and consumers could stand to benefit from a new preventive approach. This document provides an overview of the challenges of surveillance-based advertising, and can thus be considered a part of these ongoing policy discussions.
§ The Department of Veterans Affairs’ (VA) Office of the Inspector General (OIG) released another in a string of reports on the VA’s troubled acquisition of technology for electronic health records (EHR). In “Unreliable Information Technology Infrastructure Cost Estimates for the Electronic Health Record Modernization Program,” the OIG found:
o This report identified weaknesses in how the Office of Electronic Health Record Modernization (OEHRM) developed and reported the cost estimates for IT infrastructure upgrades needed to support the new electronic health record system. The OIG found that the estimate of about $4.3 billion was not reliable, and a lack of complete documentation made it difficult to determine the accuracy of estimates, in itself a problem. Furthermore, the OIG determined that VA did not report to Congress other, critical program-related IT infrastructure upgrade costs totaling about $2.5 billion, thus underreporting the program life-cycle costs by a significant amount. This lapse in reliable reporting occurred because certain IT infrastructure upgrade costs are assumed by VA’s Office of Information and Technology (OIT) and the Veterans Health Administration (VHA), and there were inadequate procedures for determining if a cost-estimate update is needed in the office’s congressionally mandated reports and, if so, when this update should occur.
Further Reading
Photo by Luis Quintero from Pexels
§ “How Washington got back into trustbusting” By Ron Knox — The Washington Post. Most folks who follow the ebb and flow of federal regulation expected the Senate to easily confirm Columbia University law professor Lina Khan to join the Federal Trade Commission. And, on June 15, it did, by a 69-to-28 vote. Few expected what happened next. Just hours after that vote, Sen. Amy Klobuchar (D-Minn.) leaked the news during another Senate hearing: President Biden had chosen Khan, a 32-year-old tech critic and anti-monopoly crusader, to lead the FTC as its chair. The startling decision put one of the most prominent critics of corporate power in charge of the agency best able to combat big businesses on behalf of workers, small businesses and consumers. “Congress created the FTC to safeguard fair competition and protect consumers, workers, and honest businesses from unfair & deceptive practices,” Khan tweeted. “I look forward to upholding this mission with vigor and serving the American public.”
§ “The Apple-Microsoft Tech War Reignites for a New Era” By Tim Higgins and Aaron Tilley — The Wall Street Journal. A new clash of tech titans is taking shape as Apple Inc. and Microsoft Corp. reignite a feud that dates back to the formative days of the personal computer era. The companies’ co-founders, Apple’s Steve Jobs and Bill Gates at Microsoft, battled early in their history before largely burying the hatchet. In recent months, both companies have taken up arms again in a skirmish that is roiling other tech companies and their customers.
§ “Fast, reliable broadband … it’s now a key selling point for house hunters” By Shane Hickey — The Guardian. It used to be that demand for homes centred on the proximity to good schools, or how close they were to a nice restaurant or pub. Now, before they sign on the dotted line, homebuyers want to ensure they can download a film quickly, or check their work emails without interruption. Access to reliable and fast broadband is one of the key priorities as working from home looks set to become a more permanent arrangement for many. And a surge of interest in people wanting to move to the country has been coupled with demand for good internet in areas that might otherwise have weak connections.
§ “Jack Cable, Stanford student and cyber whiz, aims to crowdsource ransomware details” By Tim Starks — cyberscoop. Ransomware has never been more of a national security concern after a string of hacks against the fuel supplier Colonial Pipeline, meat giant JBS and perhaps thousands of others compromised after a breach at a large IT firm. Few people, if any, seem to grasp the breadth and cost of the scourge, as there are no legal requirements for victims to disclose when they pay hackers to unlock their network. That, combined with the suspicion that most victims don’t report their digital extortion payments, makes it harder for law enforcement and security firms to combat attacks, or even understand how to fight them.
§ “Software Firm at Center of Ransomware Attack Was Warned of Cyber Flaw in April” By Dustin Volz and Robert McMillan — The Wall Street Journal. The software company linked to a massive ransomware spree that began last week and has impacted hundreds of organizations across the globe was notified in early April of a cybersecurity vulnerability used in the attack, according to the Dutch security researcher group that discovered the issue. Kaseya Ltd., a Miami-based software supplier that helps technology-service providers manage computer networks, was told of a serious cybersecurity hole in its Kaseya VSA software on April 6, Victor Gevers, chairman of the Dutch Institute for Vulnerability Disclosure, said Wednesday. Mr. Gevers’s organization, which is a volunteer-run security group, discovered the flaw.
§ “CISA Starts Cataloging Bad Practices in Cybersecurity” By Mariam Baksh — Nextgov. The Cybersecurity and Infrastructure Security Agency released a list of two bad practices Tuesday in an effort to help critical infrastructure providers prioritize their cybersecurity responsibilities. The bad practices are using unsupported or “end-of-life” software, and using known/fixed/default passwords and credentials, according to a blog post published by CISA Executive Assistant Director Eric Goldstein. He said the list is deliberately focused and that the dangerous practices listed are exceptionally egregious in internet-accessible technologies.
§ “It’s 2021, why are dating app algorithms still so bad?” By Emma Hughes — WIRED. It is a truth universally acknowledged that lockdown was a boom time for dating apps. And now that the world is finally opening up again, single people are stampeding towards them in even greater numbers – Hinge has reported a 63 per cent spike in downloads since 2019 and a tripling of revenue in 2020, while May 2021 alone saw more than 6.5 million people downloading Tinder. But while this level of interest might be new, actually being on a dating app seems, anecdotally, to be the same old story: a largely fruitless cycle of swiping, matching, initial interest and near-inevitable disappointment. Nobody who’s spent any amount of time on them would be surprised to hear that Tinder and Grindr rank in the top 10 of apps most likely to leave users feeling sad (meanwhile, not a single dating or hook-up app made a parallel list of the 15 apps most conducive to happiness).
§ “Debit Card Apps for Kids Are Collecting a Shocking Amount of Personal Data” By Todd Feathers — Vice. The fintech company Greenlight says that its app and debit card for kids is a financial literacy tool that gives parents “superpowers” to set strict controls on their children’s spending. Parents can use the app to pay allowances, choose which stores the connected debit cards work at, set spending limits, and receive instant notifications whenever their child makes a purchase. But there’s one thing Greenlight makes it very hard for parents to control: What the company does with the mountains of sensitive data it collects about children.
§ “How to Shop Online and Not Get Ripped Off” By Nicole Nguyen — The Wall Street Journal. Shopping online can feel like magic. A couple of clicks and a few days later, a box is on your doorstep. But it can also feel like playing a game of digital dodgeball. Shoppers must tread carefully to avoid the fake reviews, unsafe or mislabeled products, or counterfeit goods hiding behind legitimate-seeming listings. That’s because we are living in the Marketplace Era. Third-party sellers peddle their wares via the digital bazaars of Amazon, Walmart, Target—and now even Urban Outfitters and J.Crew. The deal could be seen as a win-win: Customers get a larger selection of products and retailers get a cut of more sales. The problem? It adds a layer of mystery for shoppers. Instead of buying from that site you know, you’re buying from a seller you don’t know, and often it’s a product whose brand you’ve never heard of.
§ “Republicans' new plan to tax Big Tech” By Margaret Harding McGill — Axios. Key Republicans are warming to an idea that was once anathema to the party — leveling taxes on big American companies to pay for internet subsidy programs.
§ “The broadband gap's dirty secret: Redlining still exists in digital form” By Shara Tibken — c/net. When Christina Wilson moved into Los Angeles public housing with her husband and teenage daughter four years ago, she tried to transfer her internet service plan to her new home. But, as is the case with many low-income communities in the US, the ISP didn't serve the Housing Authority of Los Angeles' Imperial Courts. In fact, no internet service providers offered speedy plans for any of LA's public housing facilities. Instead, they only offered pricey, slow plans insufficient for today's needs. So the 45-year-old relied on her smartphone's T-Mobile connection for anything she wanted to do online, while her daughter used her phone as a hotspot to attend her virtual film school classes. The mobile devices had unlimited data but came with caveats.
§ “Deputy Defense Secretary Says US Needs to Defend Its Artificial Intelligence Better” By Patrick Tucker — Nextgov. As the Pentagon rapidly builds and adopts artificial intelligence tools, Deputy Defense Secretary Kathleen Hicks said military leaders increasingly are worried about a second-hand problem: AI safety. AI safety broadly refers to making sure that artificial intelligence programs don’t wind up causing problems, no matter whether they were based on corrupted or incomplete data, were poorly designed, or were hacked by attackers.
§ “Rep. Ken Buck is trying to convince the GOP to hold tech companies accountable” By Cat Zakrzewski — The Washington Post. Rep. Ken Buck wants to dismantle the giants of Silicon Valley's power. But first he'll have to win over members of his own party. In interviews and on Twitter, the top Republican on the House antitrust panel has been trying to convince fellow conservatives to back a sweeping package of bills that would strengthen regulators, make it harder for tech companies to buy up rivals, and in the most severe instances, even break them up. The bills cleared their first major hurdle in Congress last week, but a marathon 29-hour markup session highlighted the rifts with Republicans over the legislation.
§ “Lawmakers urge DOJ review of T-Mobile-Dish spat” By Margaret Harding McGill — Axios. A bipartisan pair of Colorado lawmakers want the Justice Department to investigate T-Mobile's plans to shut down a network used by Dish customers — and take quick action if necessary.
§ “Schools and libraries can apply for FCC broadband relief funds starting Tuesday” By Marguerite Reardon — c/net. The Emergency Connectivity Fund, the program run through the Federal Communications Commission to subsidize broadband connectivity and devices for schools and libraries in response to the coronavirus pandemic, will begin accepting applications for funding starting Tuesday. This program is designed to help narrow the digital divide and homework gap that has left out millions of Americans, including school-age children and other vulnerable populations, who have traditionally relied on public libraries for internet access. The $7.17 billion Emergency Connectivity Fund Program, created through President Joe Biden's $1.9 trillion American Rescue Plan, will provide funding for schools and libraries across the country to buy laptops, tablets, Wi-Fi hotspots and broadband connections to help students and teachers access the internet for distance learning. Unlike traditional federal E-rate dollars, which are provided to schools and libraries through the FCC's Universal Service Fund, to help get those physical locations connected to the internet, the Emergency Connectivity Fund money can be used to serve students, school staff or library patrons who are off-campus.
Coming Events
§ On 14 July, the Senate Energy and Natural Resources Committee will “consider an original bill to invest in the energy and outdoor infrastructure of the United States to deploy new and innovative technologies, update existing infrastructure to be reliable and resilient, and secure energy infrastructure against physical and cyber threats, and for other purposes.”
§ The Senate Finance Committee’s Fiscal Responsibility and Economic Growth Subcommittee will hold a hearing titled “Defending and Investing in U.S. Competitiveness” on 14 July.
§ The Joint Economic Committee will hold a 14 July hearing titled “A Second Gilded Age: How Concentrated Corporate Power Undermines Shared Prosperity.”
§ On 14 July, the Senate Veterans Affairs Committee will hold a hearing titled “VA Electronic Health Records: Modernization and the Path Ahead.”
§ On 15 July, the Senate Commerce, Science, and Transportation Committee will convene a hearing titled “Implementing Supply Chain Resiliency.”
§ The House Homeland Security Committee will hold a 15 July hearing titled “Securing the Homeland: Reforming DHS to Meet Today's Threats.”
§ On 21 July, the Federal Trade Commission (FTC) will open its monthly open meeting with this agenda:
o Care Labeling Rule: In July 2011, the Commission initiated a regulatory review proceeding of the Care Labeling Rule. As part of the proceeding, the Commission has solicited public comments on multiple proposals to change the rule, including a proposal to repeal the Rule entirely. The Commission will vote on whether to rescind the proposal to repeal the Care Labeling Rule.
o Proposed Policy Statement on Repair Restrictions Imposed by Manufacturers and Sellers: The FTC Act authorizes the Commission to adopt policy statements. The Commission will vote on whether to issue a new policy statement, following the Commission's “Nixing the Fix” report which was unanimously agreed to and announced on May 6, 2021.
o Policy Statement on Prior Approval and Prior Notice Provisions in Merger Cases: In 1995, the Commission adopted a policy statement regarding “prior approval” and “prior notice” remedies in merger cases. The Commission will vote on whether to rescind this policy statement.
§ On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.