Washington State Privacy Act Advances
The top court in the U.S. hands down a pair of tech cases; the FTC opts against appealing Qualcomm's antitrust victory; the FBI issues its 2020 cyber crime figures
Data Privacy Bill Changed As It Moves Through House In Washington State
Facing the end of its legislative session, Washington State House lawmakers alter a bill that passed the Senate. It is not clear if these changes will be acceptable.
The Washington State House of Representatives has moved the “Washington Privacy Act” (SB 5062) through two committees, changing the bill in ways that may make it unacceptable to the Senate. The bill is poised to move through a third committee and then quite possibly to the floor of the House where it may be changed even more. However, there is a deadline of 11 April for the House to pass a bill, and if the House does so, the legislature has until 25 April to enact a final bill.
Is the third year the charm? Will Washington join California and Virginia in enacting data privacy laws?
Beyond whether Washington state enacts a data privacy law, one must also consider how each additional state legislating on privacy changes the calculus in Congress. Will a third privacy regime, one admittedly less onerous to industry than the two California bills (i.e. the “California Consumer Privacy Act” (CCPA) (AB 375) and the “California Privacy Rights Act” (CPRA) (Proposition 24)), prompt Democratic and Republican stakeholders to reach agreement on the remaining issues blocking privacy legislation? Will the Democrats in Olympia fail to come to agreement again this year?
The House Civil Rights & Judiciary Committee held a pair of hearings (here and here) on SB 5062 after the Senate passed the bill, the latter of which included changes to the bill. Thereafter, the House Rules and Appropriations Committees took up the bill (the Appropriations hearing is here), and the Appropriations Committee reported it out. The bill was modified in the House Civil Rights & Judiciary Committee, and the other committees have thus far signed off on those changes. Notably, Washington residents would be able to sue for privacy violations but only for injunctive relief and not for civil penalties. This limited private right to sue could serve as a model for federal legislation, since a private right of action has long been favored by most Democrats and opposed by most Republicans, and this compromise would likely shield companies from large class actions seeking millions of dollars in damages. There are other changes industry likely disfavors, such as expanding the right of access to cover the actual personal data a controller holds as opposed to categories of personal data.
Finally, two of the Congressional stakeholders on privacy and data security hail from Washington state, and consideration and possible passage of a state law may limit their latitude on a federal bill they could support. Senator Maria Cantwell (D-WA) and Representative Cathy McMorris Rodgers (R-WA), the chair of the Senate Commerce, Science, and Transportation Committee and the ranking member of the House Energy and Commerce Committee respectively, are expected to be involved in drafting their committees' privacy bills, and a Washington state statute may affect their positions in much the same way the CCPA and CPRA have informed a number of California Members' positions on privacy legislation, especially with respect to bills seen as weaker than California's privacy regime.
On 3 March, the Washington State Senate passed the “Washington Privacy Act” (SB 5062) by a 48-1 vote. SB 5062 tracks closely with the two bills the Washington Senate and House produced last year that lawmakers could not ultimately reconcile. However, there are no provisions on facial recognition technology, which was largely responsible for sinking a privacy bill in Washington State two years ago (see here for analysis.) There is a rival bill in the House, perhaps the first among others: the “People’s Privacy Act” (HB 1433), among the strongest privacy bills introduced in the United States (U.S.) (see here for more analysis.) Reaching agreement on privacy legislation in Washington will likely not prove easy.
Nonetheless, the bill as passed differs in a few ways from the legislation as introduced. First, the passed bill carves out the state’s judicial branch and airlines. Second, it makes clear that controllers and processors are responsible only for the obligations the bill assigns them, suggesting there was concern that ambiguities may have made controllers responsible for obligations processors are to meet and/or vice versa. Third, the language barring the sale of personal information to third parties under loyalty and rewards programs was weakened. Previously, these sales could not occur unless three conditions were met; in the revised bill, controllers still cannot sell a person’s information unless the three conditions are met and the person has exercised her right to opt out of the sale of her information. Fourth, the Joint Legislative Audit and Review Committee (a body consisting of four Representatives and four Senators) “must review the efficacy of the attorney general providing controllers and processors with warning letters and 30 days to cure alleged violations in the warning letters…and report its findings to the governor and the appropriate committees of the legislature” by 1 December 2025.
As mentioned, the House Civil Rights & Judiciary Committee took up the second substitute of SB 5062 and made changes. At the end of the bill as modified by the House Civil Rights & Judiciary Committee, an accurate summary of the changes in the bill is provided:
EFFECT: Makes the following changes in Part I of the bill relating to consumer personal data privacy:
(1) Modifies the definition of "deidentified data" to require that controllers take reasonable measures to ensure that the data cannot be associated not only with a natural person, but also with a household or device.
(2) Specifies that personal data includes pseudonymous data.
(3) Adds the definition of "minor" to mean an individual who is at least 13 and under 16 years of age under circumstances where a controller has actual knowledge of, or willfully disregards, the minor's age.
(4) Modifies the definition of "targeted advertising" to mean displaying advertisements selected on the basis of a consumer's activities across one or more distinctly branded websites, rather than across nonaffiliated websites. Specifies that targeted advertising does not include advertising based on activities within a controller's own commonly branded websites, rather than a controller's own websites.
(5) Exempts from the bill nonprofit organizations that are registered with the Secretary of State under the Charities Program, collect personal data during legitimate activities related to the organization's tax-exempt purpose, and do not sell personal data collected by the organization.
(6) Provides that a consumer has the right to access the personal data a controller is processing, rather than the right to access the categories of personal data a controller is processing.
(7) Provides that, beginning July 31, 2023, a consumer may exercise the right to opt out of sale and targeted advertising by designating an authorized agent or via user-enabled global privacy controls, such as a browser plug-in or privacy setting, device setting, or other mechanism, that communicates or signals the consumer's choice to opt out.
(8) Provides that a controller must respond to a request to exercise the right to access personal data within 45 days of receiving the request.
(9) Allows a consumer to appeal within a reasonable period of time after a controller refuses to take action on the consumer's rights request, rather than after the consumer's receipt of the controller's notice that the controller did not take action on the consumer's request.
(10) Requires the mandatory privacy notice to use clear and plain language and be understandable to the least sophisticated consumer, as well as be in English and any other language in which a controller communicates with the consumer to whom the information pertains.
(11) Requires controllers to obtain a minor's consent prior to processing the minor's personal data for the purposes of targeted advertising or the sale of personal data.
(12) Adds a private right of action for consumers alleging a violation of the consumer data rights. Limits remedies to appropriate injunctive relief and requires the court to award reasonable attorneys' fees and costs to any prevailing plaintiff.
(13) Expires the right to cure violations one year after the effective date of the bill. Removes the statutory penalties from the provisions related to enforcement by the Attorney General and instead provides that after the expiration of the right to cure, when determining a civil penalty, the court must consider a controller's or processor's good faith efforts to cure as mitigating factors.
(14) Provides that the bill does not create any independent causes of action, except for the actions brought by the Attorney General. Specifies that nothing in the bill limits any other causes of action and that the rights and protections in the bill are not exclusive.
(15) Requires the Joint Legislative Audit and Review Committee study on the efficacy of the Attorney General providing controllers and processors with warning letters to be completed by December 1, 2023, rather than December 1, 2025.
Makes the following changes to Part 2 of the bill relating to data privacy and public health emergency (private sector):
(1) Modifies the definition of "consent" to align with the same definition in Part 1 of the bill relating to consumer personal data privacy.
(2) Modifies the definition of "deidentified data" to require that controllers take reasonable measures to ensure that the data cannot be associated not only with a natural person, but also with a household or device.
(3) Adds a private right of action for consumers alleging a violation of the consumer data rights. Limits remedies to appropriate injunctive relief and requires the court to award reasonable attorneys' fees and costs to any prevailing plaintiff.
(4) Expires the right to cure violations one year after the effective date of the bill. Removes the statutory penalties from the provisions related to enforcement by the Attorney General and instead provides that after the expiration of the right to cure, when determining a civil penalty, the court must consider a controller's or processor's good faith efforts to cure as mitigating factors.
(5) Provides that the bill does not create any independent causes of action, except for the actions brought by the Attorney General. Specifies that nothing in the bill limits any other causes of action and that the rights and protections in the bill are not exclusive.
Makes the following changes to Part 3 of the bill relating to data privacy and public health emergency (public sector):
(1) Modifies the definition of "consent" to align with the same definition in Part 1 of the bill relating to consumer personal data privacy.
(2) Modifies the definition of "deidentified data" to require that controllers take reasonable measures to ensure that the data cannot be associated not only with a natural person, but also with a household or device.
Makes nonsubstantive technical corrections, such as correcting "if" to "is" in the definition of "technology-assisted contact tracing" in Part 3 of the bill.
A number of these provisions bear note or explication. First, the private right of action is the highest profile change, but it is, as noted above, a limited right, for Washington state residents could sue only for “appropriate injunctive relief,” a vague term that does not spell out the range of injunctive relief available to plaintiffs. Would this allow people to ask for and receive restitution (i.e., the return or restoration of ill-gotten funds or a benefit)? If not, the relief may extend no further than stopping a controller or processor from engaging in activity that violates the new statute.
Moreover, the right to sue is limited to only some violations. People could sue only if one of their rights in Section 103 is violated (the right to access, correct, delete, and port one’s data, and the right to opt out of targeted advertising, the sale of personal data, and profiling people in ways that have legal effects) or one of the protections in Section 107 (the bar on processing personal data in ways that violate civil rights, processing sensitive data without consent, or processing the personal data of a minor for targeted advertising or selling her personal data without consent.)
People who sue would be able to ask the court to recover their costs and attorney’s fees, which could prove to be an incentive necessary to entice attorneys to take these cases even though they would not be able to recover civil damages.
It must also be noted that the House’s bill makes clear existing causes of action are not prohibited, and so, if a person could currently sue alleging a privacy violation under contract or tort law, she would still be able to do so after enactment. The Senate’s bill would not have allowed this.
The House’s bill also phases out the mechanism by which the attorney general must inform controllers and processors of violations, starting a 30-day period during which they can cure the violation and thus avoid an enforcement action. On 31 July 2023, this process would end, and the attorney general would no longer need to give entities notice and a chance to correct the wrongdoing before bringing an enforcement action. Industry stakeholders opposed this change in the hearings held on the revised bill.
The right of access is expanded. Now, in response to a request from a resident, controllers must provide the actual personal data they have on the resident. In the Senate-passed SB 5062, controllers only needed to provide the categories of personal information. Moreover, the House’s bill would include requests to access in the same 45-day clock controllers would have to respond to the other requests to exercise rights (i.e., right to delete, right to port, right to correct, and rights to opt out.)
Also of interest, the House’s bill expands the means by which one may exercise his rights to include the use of a designated agent or “user-enabled global privacy controls” (e.g., a browser plug-in or some other technological means) to convey the person’s wishes. And yet, as there is no rulemaking authority provided under either version of SB 5062, it is unclear what would constitute user-enabled global privacy controls. Based on research about the difficulty California residents have faced in seeking to exercise their rights, one can safely assume that Washington state residents will encounter resistance from businesses if they use such controls. What suffices as a user-enabled global privacy control may well fall to a court to hammer out.
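For a sense of what such a control could look like in practice, the Global Privacy Control proposal (one real-world candidate for a "user-enabled global privacy control") has participating browsers send a `Sec-GPC: 1` HTTP header with each request. A minimal server-side sketch of honoring that signal might look like the following; everything beyond the header name itself (the function name, the handling logic) is an illustrative assumption, not anything SB 5062 prescribes:

```python
# Sketch: honoring a user-enabled opt-out preference signal such as the
# proposed Global Privacy Control (GPC) header. The header name follows
# the GPC proposal; the rest is hypothetical illustration.

def wants_opt_out(headers: dict) -> bool:
    """Return True if the request carries an opt-out preference signal.

    A controller honoring GPC would treat `Sec-GPC: 1` as the consumer
    exercising her right to opt out of the sale of personal data and
    targeted advertising.
    """
    return headers.get("Sec-GPC", "").strip() == "1"

# Example: a request from a browser with the control enabled.
request_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
if wants_opt_out(request_headers):
    # Record the opt-out for this user rather than selling their data.
    print("opt-out signal received")
```

Because the bill provides no rulemaking authority, whether a signal like this one would count as a valid exercise of the opt-out rights would, as noted above, likely be left to the courts.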
The House added a new category of persons to the bill: minors. This class is defined as those at least 13 and under 16 years of age, and controllers and processors would not be able to sell their personal data or use it for targeted advertising unless the minor opts in.
Finally, privacy and civil liberties groups would like to see stronger provisions added to SB 5062. During testimony before the Appropriations Committee, an attorney working with the American Civil Liberties Union questioned whether the attorney general would have the resources necessary to enforce the act in a meaningful way, considering that committee staff stated the attorney general’s office is requesting funding for only 3.5 full-time positions. This attorney also questioned the wisdom of not allowing plaintiffs to recover civil penalties, for, in his experience, very few attorneys would take such cases, especially in light of the tendency of courts not to allow litigants to recover all their attorney’s fees and costs.
Moreover, in the view of many of these organizations, the bill, even as amended, would allow much of current data collection and processing to continue. Washington state residents could be faced with opting out with every single controller if they do not want their personal data sold or do not want to be targeted by advertising. Meanwhile, controllers would be able to use personal data for a wide range of purposes regardless of whether a person objects. For example, the bill would allow controllers to collect personal data and then use it for their own targeted advertising because the definition of the practice exempts activities “[b]ased on activities within a controller's own commonly branded websites or online applications.” And so, a Google or a Facebook could bombard a user with targeted ads so long as they are based only on the personal data the company itself collected and processed.
The Supreme Court of the United States (SCOTUS) reversed a decision by the United States Court of Appeals for the Ninth Circuit that Facebook’s login notification text system, especially its database of phone numbers, violated a federal law barring abusive telemarketing practices. The case turned on whether Facebook’s system is an “automatic telephone dialing system” as defined by the statute, and SCOTUS found that the company’s system did not qualify, for such a system must be able to store or produce phone numbers using a random or sequential number generator. In relevant part, SCOTUS found:
The Telephone Consumer Protection Act of 1991 (TCPA) proscribes abusive telemarketing practices by, among other things, imposing restrictions on making calls with an “automatic telephone dialing system.” As defined by the TCPA, an “automatic telephone dialing system” is a piece of equipment with the capacity both “to store or produce telephone numbers to be called, using a random or sequential number generator,” and to dial those numbers. 47 U. S. C. §227(a)(1). The question before the Court is whether that definition encompasses equipment that can “store” and dial telephone numbers, even if the device does not “us[e] a random or sequential number generator.” It does not. To qualify as an “automatic telephone dialing system,” a device must have the capacity either to store a telephone number using a random or sequential generator or to produce a telephone number using a random or sequential number generator.
Petitioner Facebook, Inc., maintains a social media platform with an optional security feature that sends users “login notification” text messages when an attempt is made to access their Facebook account from an unknown device or browser. If necessary, the user can then log into Facebook and take action to secure the account. To opt into this service, the user must provide and verify a cell phone number to which Facebook can send messages.
In 2014, respondent Noah Duguid received several login-notification text messages from Facebook, alerting him that someone had attempted to access the Facebook account associated with his phone number from an unknown browser. But Duguid has never had a Facebook account and never gave Facebook his phone number. Unable to stop the notifications, Duguid brought a putative class action against Facebook. He alleged that Facebook violated the TCPA by maintaining a database that stored phone numbers and programming its equipment to send automated text messages to those numbers each time the associated account was accessed by an unrecognized device or web browser.
Facebook moved to dismiss the suit, arguing primarily that Duguid failed to allege that Facebook used an auto-dialer because he did not claim Facebook sent text messages to numbers that were randomly or sequentially generated. Rather, Facebook argued, Duguid alleged that Facebook sent targeted, individualized texts to numbers linked to specific accounts. The U. S. District Court for the Northern District of California agreed and dismissed Duguid’s amended complaint with prejudice. 2017 WL 635117, *4–*5 (Feb. 16, 2017).
The United States Court of Appeals for the Ninth Circuit reversed. As relevant here, the Ninth Circuit held that Duguid had stated a claim under the TCPA by alleging that Facebook’s notification system automatically dialed stored numbers. An auto dialer, the Court of Appeals held, need not be able to use a random or sequential generator to store numbers; it need only have the capacity to “‘store numbers to be called’” and “‘to dial such numbers automatically.’” 926 F. 3d 1146, 1151 (2019) (quoting Marks v. Crunch San Diego, LLC, 904 F. 3d 1041, 1053 (CA9 2018)).
We granted certiorari to resolve a conflict among the Courts of Appeals regarding whether an auto dialer must have the capacity to generate random or sequential phone numbers.4 591 U. S. ___ (2020). We now reverse the Ninth Circuit’s judgment.
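The statutory distinction the Court drew can be made concrete with a toy sketch. Everything here (function names, phone numbers, account identifiers) is invented for illustration and is not a depiction of Facebook's actual system:

```python
import random

# Equipment that *generates* phone numbers randomly or sequentially is
# the kind the TCPA's autodialer definition covers.

def random_number_generator(n: int) -> list:
    """Produce n phone numbers using a random generator."""
    return [str(random.randint(2_000_000_000, 9_999_999_999)) for _ in range(n)]

def sequential_number_generator(start: int, n: int) -> list:
    """Produce n phone numbers sequentially, starting from `start`."""
    return [str(start + i) for i in range(n)]

# By contrast, a system that merely *stores and dials* numbers tied to
# specific accounts (as the login-notification system at issue did) uses
# no random or sequential generator and so, per the Court, is not an
# "automatic telephone dialing system."
stored_numbers = {"account_42": "2065550100"}  # hypothetical account data

def login_notification_target(account: str) -> str:
    """Look up the stored number for a specific account."""
    return stored_numbers[account]
```

The Ninth Circuit's reading would have swept the second, targeted pattern into the definition as well; the Court's holding confines the statute to the first.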
The Supreme Court of the United States (SCOTUS) also upheld the Federal Communications Commission’s (FCC) 2017 revision of media ownership rules, which had been challenged on the grounds that the agency did not properly account for the likely effect on minority and female ownership of media. SCOTUS disagreed and found the FCC acted legally in promulgating those rules. SCOTUS explained:
Under the Communications Act of 1934, the Federal Communications Commission possesses broad authority to regulate broadcast media in the public interest. Exercising that statutory authority, the FCC has long maintained strict ownership rules. The rules limit the number of radio stations, television stations, and newspapers that a single entity may own in a given market. Under Section 202(h) of the Telecommunications Act of 1996, the FCC must review the ownership rules every four years, and must repeal or modify any ownership rules that the agency determines are no longer in the public interest.
In a 2017 order, the FCC concluded that three of its ownership rules no longer served the public interest. The FCC therefore repealed two of those rules—the Newspaper/Broadcast Cross-Ownership Rule and the Radio/Television Cross-Ownership Rule. And the Commission modified the third—the Local Television Ownership Rule. In conducting its public interest analysis under Section 202(h), the FCC considered the effects of the rules on competition, localism, viewpoint diversity, and minority and female ownership of broadcast media outlets. The FCC concluded that the three rules were no longer necessary to promote competition, localism, and viewpoint diversity, and that changing the rules was not likely to harm minority and female ownership.
A non-profit advocacy group known as Prometheus Radio Project, along with several other public interest and consumer advocacy groups, petitioned for review, arguing that the FCC’s decision was arbitrary and capricious under the Administrative Procedure Act. In particular, Prometheus contended that the record evidence did not support the FCC’s predictive judgment regarding minority and female ownership. Over Judge Scirica’s dissent, the U. S. Court of Appeals for the Third Circuit agreed with Prometheus and vacated the FCC’s 2017 order.
On this record, we conclude that the FCC’s 2017 order was reasonable and reasonably explained for purposes of the APA’s deferential arbitrary-and-capricious standard. We therefore reverse the judgment of the Third Circuit.
In a blog posting, the United Kingdom’s Information Commissioner’s Office (ICO) detailed its ambitions to update its guidance on anonymisation and pseudonymisation, and “to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing.” The agency will solicit input before drafting this guidance, however. The ICO explained:
The recent ICO Data Sharing Code of Practice provides organisations with a practical guide on how to share personal data in line with data protection law. However, we recognise there are other dimensions to data sharing. The code is not a conclusion, but a milestone in this ongoing work. We will continue to provide clarity and advice in how data can be shared in line with the law.
Building on this promise, we are now outlining our plans to update our guidance on anonymisation and pseudonymisation, and to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing. We recognise that questions about when data is personal data or anonymous information are some of the most challenging issues organisations face.
Our refreshed guidance will assist organisations in meeting these challenges. We will set out our views on approaches like the spectrum of identifiability, and how these can be practically applied. We will provide advice on how to assess the appropriate controls that need to be in place and we will be grounding our guidance in practical steps organisations can take.
The key topics we will be exploring include:
Anonymisation and the legal framework – legal, policy and governance issues around the application of anonymisation in the context of data protection law;
Identifiability – outlining approaches such as the spectrum of identifiability and their application in data sharing scenarios, including guidance on managing re-identification risk, covering concepts such as the ‘reasonably likely’ and ‘motivated intruder’ tests;
Guidance on pseudonymisation techniques and best practices;
Accountability and governance requirements in the context of anonymisation and pseudonymisation, including data protection by design and DPIAs;
Anonymisation and research - how anonymisation and pseudonymisation apply in the context of research;
Guidance on privacy enhancing technologies (PETs) and their role in safe data sharing;
Technological solutions – exploring possible options and best practices for implementation; and
Data sharing options and case studies – supporting organisations to choose the right data sharing measures in a number of contexts including sharing between different organisations and open data release. Developed with key stakeholders, our case studies will demonstrate best practice.
In a statement, acting Federal Trade Commission (FTC) Chair Rebecca Kelly Slaughter announced the agency would not appeal the United States Court of Appeals for the Ninth Circuit’s decision in favor of Qualcomm in the FTC’s antitrust action. Last summer, the Ninth Circuit reversed a lower court’s ruling that the FTC had proven Qualcomm’s chip licensing practices violated United States antitrust law (see here for more detail and analysis.) According to one account, the FTC faced unfavorable timing: it was up against a filing deadline while the agency lacked a full complement of members and the Department of Justice’s (DOJ) Office of the Solicitor General was also not fully staffed. Moreover, it was reported the DOJ did not want to appeal the case because it thought the FTC did not have a strong position. This announcement comes a few weeks after Slaughter went before a House committee and argued the FTC should bring hard cases in an attempt to more vigorously enforce antitrust law. Slaughter stated:
Given the significant headwinds facing the Commission in this matter, the FTC will not petition the Supreme Court to review the decision of the Court of Appeals for the Ninth Circuit in FTC v. Qualcomm. The FTC’s staff did an exceptional job presenting the case, and I continue to believe that the district court’s conclusion that Qualcomm violated the antitrust laws was entirely correct and that the court of appeals erred in concluding otherwise. Now more than ever, the FTC and other law enforcement agencies need to boldly enforce the antitrust laws to guard against abusive behavior by dominant firms, including in high-technology markets and those that involve intellectual property. I am particularly concerned about the potential for anticompetitive or unfair behavior in the context of standard setting and the FTC will closely monitor conduct in this arena.
The European Data Protection Board (EDPB) adopted a statement on the draft ePrivacy Regulation in which it welcomed “the agreed negotiation mandate adopted by the Council on the protection of privacy and confidentiality in the use of electronic communication services (’the Council’s position’), as a positive step towards a new ePrivacy Regulation.” However, the EDPB offered observations and critiques aimed at ensuring the European Union’s (EU) rewrite of its rules on privacy in electronic communications heeds existing EU law, especially the General Data Protection Regulation (GDPR). The EDPB pointed to recent Court of Justice of the European Union case law barring the indiscriminate collection and retention of location data and metadata in criminal cases save for the most dangerous threats. The Board stressed that the ePrivacy Regulation must ensure the confidentiality of communications, said it would like to see a number of exceptions to the bar on data processing tightened, emphasized the need for strong encryption, called for better language on consent in line with the GDPR, and advocated for a better system of cooperation among national authorities in enforcing the new regime.
The Government Accountability Office (GAO) issued its most recent assessment of the Office of Management and Budget’s (OMB) Data Center Optimization Initiative (DCOI) at the request of the Armed Services Committees, the Senate Homeland Security and Governmental Affairs Committee, and the House Oversight and Reform Committee. The GAO found that agencies reported they would achieve $1.1 billion in savings by shutting down or consolidating data centers. However, the GAO found that Trump Administration guidance let agencies measure success on a key metric according to their own standards, which has undermined efficiency. The GAO noted that 53 of the 125 recommendations it has made on the DCOI since 2016 remain open. The GAO called on OMB to “reexamine its DCOI guidance regarding how to measure server utilization and revise it to better and more consistently address server efficiency.” The GAO concluded:
Agencies continue to report progress toward meeting their goals for data center closures and achieving the related savings. Specifically, almost all of the 24 DCOI agencies met their goals for data center closures in fiscal year 2019 and also planned to meet their closure goals for 2020. Additionally, in fiscal year 2019, almost all of the agencies met their savings goals and all planned to meet their 2020 cost savings goals for a total of $1.1 billion in savings over the 2 years. While agencies’ efforts in both respects have made an important contribution to achieving the overall goals of DCOI, taking action to address our prior recommendations could help those agencies that did not meet their goals to achieve even more benefits from DCOI.
Agencies reported mixed progress against OMB’s optimization metrics for both fiscal years 2019 and 2020. While most agencies have not met all of their optimization targets, taking action to address our prior recommendations could help those agencies to realize fully the expected benefits of DCOI.
While OMB developed an effective server utilization metric in 2016, the agency’s 2019 DCOI guidance revisions resulted in a metric that no longer reported on actual server utilization, resulting in an incomplete picture of utilization. Without better guidance on how to report on server utilization, the server-related optimization metrics will lack meaningful information about agencies’ DCOI performance. Absent complete information, OMB and Congress may be hindered in providing oversight and making appropriate decisions about budgeting for data center utilization.
The Federal Bureau of Investigation (FBI) issued its “2020 Internet Crime Report,” in which the agency reported a record number of claims submitted to its Internet Crime Complaint Center (IC3). The FBI explained:
IC3 received a record number of complaints from the American public in 2020: 791,790, with reported losses exceeding $4.1 billion. This represents a 69% increase in total complaints from 2019. Business E-mail Compromise (BEC) schemes continued to be the costliest: 19,369 complaints with an adjusted loss of approximately $1.8 billion. Phishing scams were also prominent: 241,342 complaints, with adjusted losses of over $54 million. The number of ransomware incidents also continues to rise, with 2,474 incidents reported in 2020.
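As a rough consistency check on those figures (a back-of-the-envelope sketch; the 2019 baseline below is derived from the stated 69% increase, not taken directly from the report):

```python
# Back-of-the-envelope check of the IC3 figures cited above.
complaints_2020 = 791_790
increase_from_2019 = 0.69  # "a 69% increase in total complaints from 2019"

# Implied 2019 complaint volume: roughly 468,500 complaints.
complaints_2019 = complaints_2020 / (1 + increase_from_2019)
print(round(complaints_2019))

# BEC accounted for under 3% of complaints but the largest share of losses.
bec_loss = 1.8e9    # "adjusted loss of approximately $1.8 billion"
total_loss = 4.1e9  # "reported losses exceeding $4.1 billion"
print(f"BEC share of losses: {bec_loss / total_loss:.0%}")
```

The derived 2019 figure lines up with the order of magnitude of IC3's historical complaint volumes, and the share calculation underscores why the FBI singled out BEC as the costliest category.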
Senators Josh Hawley (R-MO), Mike Lee (R-UT), and Marsha Blackburn (R-TN) wrote the acting chair of the Federal Trade Commission (FTC) and the chair of the Senate Judiciary Committee regarding Politico’s blockbuster story that the Obama Administration’s FTC overruled staff recommendations that Google be sued for antitrust violations. The then-FTC, led by Chair Jon Leibowitz, declined to move forward with essentially the same case the United States (U.S.) Department of Justice (DOJ) is now bringing against Google. The agency’s lawyers said sue, while the agency’s economists said do not. In their letter to acting FTC Chair Rebecca Kelly Slaughter, Hawley, Lee, and Blackburn asked that she “dispatch relevant officials to Congress as soon as possible,” and they asked Senate Judiciary Committee Chair Dick Durbin (D-IL) “to call Google executives and relevant FTC officials to publicly testify, including the former Commissioners who now represent big tech firms.” The Republican Senators are not calling for any legislation to correct the monopolies they decry in technology markets, however.
A coalition of groups has banded together to ban what they call “surveillance advertising” through a variety of policy changes, “from comprehensive privacy legislation to reforming our antitrust laws and liability standards.” They asserted:
Surveillance advertising – the core profit-driver for gatekeepers like Facebook and Google, as well as adtech middlemen – is the practice of extensively tracking and profiling individuals and groups, and then microtargeting ads at them based on their behavioral history, relationships, and identity.
These dominant firms curate the content each person sees on their platforms using those dossiers – not just the ads, but newsfeeds, recommendations, trends, and so forth – to keep each user hooked, so they can be served more ads and mined for more data.
Big Tech platforms amplify hate, illegal activities, and conspiracism – and feed users increasingly extreme content – because that’s what generates the most engagement and profit. Their own algorithmic tools have boosted everything from white supremacist groups and Holocaust denialism to COVID-19 hoaxes, counterfeit opioids and fake cancer cures. Echo chambers, radicalization, and viral lies are features of these platforms, not bugs—central to the business model.
And surveillance advertising is further damaging the information ecosystem by starving the traditional news industry, especially local journalism. Facebook and Google’s monopoly power and data harvesting practices have given them an unfair advantage, allowing them to dominate the digital advertising market, siphoning up revenue that once kept local newspapers afloat. So while Big Tech CEOs get richer, journalists get laid off.
“China has brought its repressive surveillance tools to Hong Kong” By Dan McDevitt — Nikkei Asia. In this piece, a “grants and communications manager at GreatFire.org, a group focused on monitoring and challenging Chinese internet censorship,” argues that the People’s Republic of China (PRC) is implementing new measures to restrict the internet access and technological freedom of those living in Hong Kong. The author calls on tech multinationals not to knuckle under to Beijing’s demands as they have in the past, for without cooperation from these companies, the government will have a harder time cracking down on free speech and dissidents. However, those companies with a presence in the PRC may be reluctant to jeopardize their access to that market.
“Ransomware Gang Fully Doxes Bank Employees in Extortion Attempt” by Lorenzo Franceschi-Bicchierai — Vice’s Motherboard. Is doxing the next form ransomware extortion will take? Or perhaps strategic leaks of embarrassing or compromising information?
“Massive Facebook study on users’ doubt in vaccines finds a small group appears to play a big role in pushing the skepticism” By Elizabeth Dwoskin — The Washington Post. It seems that, just as with COVID-19 and other viruses, the spread of anti-vaccine misinformation can be traced to a small number of super-spreaders. Facebook is currently trying to work through the posts on its platform that may contribute to vaccine hesitancy, which range from accounts of symptoms after getting dosed to outright lies about the vaccines. In a perfect world, Facebook could find a way to rejigger its algorithms to stop amplifying harmful content and promote content that would encourage people to get vaccinated.
“China Punishes Microsoft’s LinkedIn Over Lax Censorship” By Paul Mozur, Raymond Zhong and Steve Lohr — The New York Times. The Cyberspace Administration of China, Beijing’s online censor, informed LinkedIn’s subsidiary in the People’s Republic of China (PRC) it ran afoul of the country’s laws banning certain online content. According to this article, the agency did not bother to tell LinkedIn which posts were offensive. The Microsoft-owned company is a small player in the PRC, paling in comparison to WeChat, but it remains the only big American platform operating there.
“China emerges as quantum tech leader while Biden vows to catch up” By Akira Oikawa, Yuki Okoshi and Yuki Misumi — Nikkei Asia. By one measure, the United States and Japan lead the People’s Republic of China (PRC) in patents for quantum computing, specifically in quantum computing hardware. But in another regard, and overall, the PRC is ahead, especially in what experts foresee as the crucial field of quantum encryption and communications (i.e., the ability to both secure and read communications in a world with vastly more powerful computing). The Biden Administration and Tokyo are keen to work together to catch up and negate the PRC’s lead.
The Federal Communications Commission (FCC) will hold an open meeting on 22 April. No agenda has been announced yet.
The Federal Trade Commission (FTC) will hold a workshop titled “Bringing Dark Patterns to Light” on 29 April.
On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.