A U.S. Privacy Bill Is Altered
Anti-TikTok bill is back, and Members are skeptical about Facebook's plans for an under-13 Instagram.
The Revised Information Transparency and Personal Data Control Act
A bill that may be supported by Democrats aligned with business and industry will likely not find much support among more liberal Democrats.
A centrist Democratic privacy bill is back; chances of enactment are low.
Cocktail Party
Representative Suzan DelBene (D-WA) has revised her data privacy bill, beating the other sponsors of such legislation from the last Congress to reintroduction. Even though industry stakeholders lauded the bill, it is not likely to advance in this Congress. Nonetheless, it may be a good indication of what industry stakeholders and centrist Democrats would like in federal privacy and data protection legislation.
Meeting
DelBene has revised an already industry-friendly bill and loosened it even further. This bill likely will not pass muster with fellow Democrats, for it still lacks a private right of action and would still preempt state privacy laws. It is still based on the notice and consent model, but the loopholes that allow for the collection and use of personal information without consent seem to have been widened. And while the Federal Trade Commission (FTC) would seemingly be in line for a doubling of its funding, this is not a certainty even if the bill is enacted.
Geek Out
Last month, DelBene introduced the first major privacy bill of the 117th Congress, one that shares a name with the bill she and cosponsors had introduced in 2019: the “Information Transparency and Personal Data Control Act” (H.R.1816), “legislation that would create a national data privacy standard to protect our most personal information and bring our laws into the 21st Century,” according to her press release (see here for more detail and analysis).
Despite sharing the same name and having many of the same features, DelBene’s bill has been changed significantly, mostly in ways industry stakeholders will find desirable and privacy and civil liberties advocates will likely find unfortunate. Additionally, there seem to be some provisions that Democrats more liberal than DelBene may find unpalatable, and so this bill, as written, seems not to thread the political needles necessary for passage of a federal privacy bill.
The Sense of Congress section that apparently functions as something akin to a Findings section has been altered. First, the aim of the bill is modified from establishing global standards in the creation of a U.S. digital privacy framework to complementing global standards. This may be a nod to the fact the General Data Protection Regulation is the closest to a global standard, and moreover may be a concession that even if DelBene’s bill is enacted as is, the GDPR would remain the world’s standard on data protection. Second, in a signal that runs throughout the bill, another “Sense of Congress” is expanded with respect to the types of activities the new federal privacy framework would encompass. In the last version, “federal guidance” was needed merely for the “collection and storage of sensitive data.” In the new version, “federal guidance” is required for the “collection, processing, disclosure, transmission and storage of sensitive data.” This change is present in numerous places throughout the bill, explicitly expanding its scope. Third, there is an entirely new subsection that serves as a succinct summary of the bill:
(6) individuals have a right to—
(A) exercise control over the personal data companies collect from them and how they use it;
(B) easily understandable and accessible information about privacy and security practices;
(C) expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data;
(D) secure and responsible handling of sensitive personal information;
(E) access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data is inaccurate; and
(F) reasonable limits on the personal data that companies collect and retain.
This serves less as a statement of changes in the revised version of the “Information Transparency and Personal Data Control Act” and more as a nice sketch of what the bill would do. Of course, this bill relies heavily on the notice and consent model, using the opt-in approach at times and the opt-out approach at others. And, like other bills, there is data collection and processing that residents of the U.S. would not be able to avoid. For example, the bill focuses on notice and consent for “sensitive personal information” but not “personal information,” which can fairly be read to include all data outside the definition of “sensitive personal data.” Moreover, there are four classes of data specifically identified as being outside this definition, one being “publicly available information,” which may be reasonably understood to mean everything one posts online from an account not set to private. And so, one’s every utterance on Twitter, Facebook, Instagram, etc. could be collected, processed, and used without consent. But the universe of data outside “sensitive personal data” is vast, for it apparently includes one’s name, address, place of employment, employment history, credit history, and any location data that is not precise geolocation data. On this last point, there is no definition of “precise geolocation information,” and so it is not clear what location data does not qualify as such and may be collected and used as any entity sees fit. Is it 1,000 feet? Half a mile? A mile? The bill isn’t clear.
Turning back to the bill, the Federal Trade Commission (FTC) will still need to conduct a notice and comment rulemaking (i.e., the normal process by which regulations are drafted that the FTC is usually barred from using) to govern data collection and usage. However, the scope and extent of the rulemaking has been both expanded and narrowed in places, and the timeline changed from 12 months after enactment to 18 months after enactment. In terms of the first notable scope change, the rulemaking is expanded from just controllers to including processors and third parties. Given the fungibility of personal data, this would encompass more of the data ecosystem. In a similar vein, “transmission” of “sensitive personal data” is added to the activities the rulemaking must cover. In a place where a narrowing of the scope has likely occurred, the first bill would have directed the FTC to conduct a rulemaking on the collection and usage of “sensitive personal information” “from United States persons or persons located in the United States when the data is collected.” This condition is changed to “when the sensitive personal information is collected, transmitted, stored, processed, sold or shared.” This alteration both expands and narrows the rulemaking. The expansion is to obviously cover more of the activities of controllers, processors, and third parties. And yet, the narrowing is because it is no longer “data” but instead “sensitive personal data,” with the latter being a subset of the former.
The contours of the FTC’s rulemaking are changed. Reflecting the expanded group of personal data activities the FTC is charged with bringing into the rulemaking, the notice controllers must give people is expanded. Hence, people must be informed through a controller’s privacy and data use policy of the controller’s proposed collection, transmission, selling, sharing, or disclosing of a person’s sensitive personal information. Additionally, they must affirmatively consent (aka opt in) to “any functionality” involved with the aforementioned activities, and the term functionality is not defined in the bill, leaving this job possibly to the FTC. It would seem most controllers will want it read as narrowly as possible so that their disclosure obligations are lighter.
In any event, the new version expands this portion of the FTC’s rulemaking remit to mandate that the controller’s documented instructions to processors and third parties must adhere to the notice provided to people. But another new provision immunizes controllers when processors and third parties violate the opt-in consent model, which would seem to remove a powerful incentive for controllers to monitor and oversee their processors’ and third parties’ activities regarding data collection, usage, and processing. Incidentally, there is a drafting error in DelBene’s bill text in that a subsection (C) was obviously deleted without the remaining subsections being relettered. One wonders what was removed.
The new “Information Transparency and Personal Data Control Act” clarifies the provisions spelling out the requirements for privacy and data use policies. First, it makes explicit what may have been implicit: the FTC will draft and issue guidelines on the “general requirements” of each controller, processor, and third party’s “up-to-date, transparent privacy, security, and data use policy.” However, are these guidelines in the way the National Institute of Standards and Technology’s (NIST) guidelines are respected but not binding? Or are they guidelines in the sense of regulations that controllers, processors, and third parties must heed? This is not clear.
Another feature of the privacy and data use policies was loosened in the revision process. In the first bill, these policies would disclose the third parties “with whom the sensitive personal information will be shared and for what purposes.” The new bill requires disclosure only of categories of third parties. Two other provisions were eliminated that would have given people more insight into how their sensitive personal information is being used. First, controllers, processors, and third parties would no longer have to disclose how long such data would be stored. Second, these entities would no longer need to say “[w]hether the sensitive personal information will be used to create profiles about users and whether they will be integrated across platforms.”
As noted above, mere personal information is subject to an opt-out regime, and DelBene’s revisions may remove incentives for processors and third parties to honor one’s wishes to opt out. In the previous bill, one could opt out of the range of data activities, including sharing with third parties. In the new bill, one may still opt out at any time, but even though controllers must pass along a person’s desire to opt out, the controller cannot be held liable for the processor or third party’s failure to heed the opt out. Again, as with the controller being legally protected from a processor or third party’s violations of the opt-in regime for sensitive personal data, the violations of the latter two entities cannot be imputed to the former. So long as the controller communicates the opt out request, it is in the clear. Again, the incentive structure would seem to allow for violations of a person’s wishes to opt out of the use and sharing of her personal data.
This section contains new language governing the contracts that controllers must have with their processors, limiting processing to the instructions of the controller. Moreover, no contract can negate the provisions of the bill, so there is no possibility of a controller circumventing the new law through an otherwise binding legal instrument.
The privacy audit interval is lengthened from one year to two, but the requirement is expanded to include processors and third parties. The small business exemption to the audit requirement is dramatically expanded, from entities engaged in the collection, usage, and disclosure of the sensitive personal information of fewer than 5,000 people a year to fewer than 250,000 a year.
DelBene’s revised bill contains a curious section tasking the FTC with issuing regulations to block auditors from selling information “under the guise of a potential violation by the controller products or services when there is not a violation of the Act.” This is novel language in Congressional privacy and data protection bills, aimed at a problem I’m having trouble seeing. It could be that some entity in the auditing world clued DelBene’s office in to a problem in their world with auditors using the personal data of entities they are auditing. Or is it a nefarious addition someone with an interest in the auditing business persuaded DelBene’s staff was a problem in need of a remedy, but that is actually a play to eliminate or hamstring competition? Or perhaps this is aimed at the consulting firms one would expect to enter the auditing field, who are themselves players in the data brokering world. Hard to say, as the language borders on the cryptic.
The new “Information Transparency and Personal Data Control Act” expands the exceptions to consent one finds in all the privacy and data protection bills. The revised bill adds non-sensitive personal information to all the exceptions, whereas only sensitive personal information had been covered in the first bill. Another expansion of the circumstances in which a person’s consent is not needed for data collection, usage, and disclosure lies in the crimes a controller, processor, or third party may seek to prevent. In the first bill, it was just fraud, identity theft, and criminal activity. Now it is broadened to include unauthorized transactions, theft, shoplifting, financial crimes, and money laundering. This strikes me as overkill, as any of the added crimes would presumably be covered by “criminal activity.” Other exemptions were added: activities authorized under the Fair Credit Reporting Act; completing a transaction after personal information has been collected that is part of an ongoing relationship between the controller and person; complying with federal, state, or local law; or conducting product recalls or servicing warranties.
Then the revised bill dramatically fleshes out the instances where a controller need not obtain opt-in consent from a person to collect, process, sell, share, or disclose his or her sensitive personal information so long as these practices do “not deviate from purposes consistent with a controller’s relationship with users as understood by the reasonable user.” Of course, if notice and consent is the model for DelBene’s bill, why not disclose these practices to the person and let him or her decide whether to engage with the controller? The answer is probably something along the lines that a reasonable person would understand these activities are simply part and parcel of modern life, and so forcing controllers to disclose them would lengthen notices and is anyway unnecessary. I would suggest true notice would necessarily require disclosure of all purposes. Nonetheless, the bill makes clear these, and possibly other, activities are outside the opt-in consent requirement:
(A) carrying out the term of a contract or service agreement, including elements of a customer loyalty program, with a user;
(B) accepting and processing a payment from a user;
(C) completing a transaction with a user such as through delivering a good or service even if such delivery is made by a processor or third party;
(D) marketing goods or services to a user as long as the user is provided with the ability to opt out of such marketing;
(E) taking steps to continue or extend an existing business relationship with a user, or inviting a new user to participate in a customer promotion, benefit or loyalty program, as long as the user is provided with the ability to opt out;
(F) conduct internal research to improve, repair, or develop products, services, or technology; or
(G) municipal governments.
The new “Information Transparency and Personal Data Control Act” clarifies the FTC’s authority over common carriers by explicitly stating only the FTC will have the authority to police their privacy practices. Without such language, the Federal Communications Commission (FCC) would possibly have had jurisdiction over them, but the provision also limits the FCC’s ability to regulate common carriers (i.e., telecommunications companies and broadband providers) more stringently.
DelBene’s bill incorporates a wrinkle found in most federal privacy and data protection legislation: an ability for violators to cure violations. The FTC would need to alert entities if they have non-willfully violated the act and give them 30 days to fix the violation. So this would cover negligent violations and perhaps reckless violations, but not willful violations where a party knew it was breaking the law and proceeded anyway. Knowledge is often a difficult element to prove in enforcement actions, and so this fact may persuade the FTC to treat some arguably willful violations as if they are non-willful in the hope the violator will correct the wrongdoing. Despite the proposed increase in FTC resources, this limit would seem to hamstring the agency’s enforcement of the act. Moreover, state attorneys general looking to use the new privacy law would also have to give offenders 30 days to comply.
As noted, in one sense, DelBene’s revised bill significantly increases the FTC’s resources, but the appropriations committees would have to follow through and actually provide these funds in legislation. Technically, the bill increases the authorization of appropriations for the FTC, reflecting the historic division between authorizing committees, which set spending ceilings, and appropriators, who hold the purse strings and actually give agencies money. These days, authorization levels play little role in what gets appropriated, so the doubling the FTC would see in its authorization level may be seen as mere window dressing. Likewise, the directive for the FTC to hire 500 new employees to enforce privacy is also at the mercy of the appropriators. And so, one should view any claims of increasing the FTC’s funding with a grain of salt unless it is in an actual appropriations bill, a process over which Senate Republicans hold influence, and they are likely not on board for a dramatic enlarging of the FTC.
Turning to the definitions section, the revised bill adds one for collection (“buying, renting, gathering, obtaining, receiving, or accessing any sensitive data of an individual by any means”) and one for de-identified data. Likewise, there is a new definition for employee data. More significantly, the definition of processor is dramatically expanded to reflect some of the changes in the revised bill, notably language clarifying that a processor becomes a controller when it processes data on its own behalf. The definition of sensitive personal information is expanded to include gender identity; intersex status; citizenship or immigration status; and mental or physical health diagnosis. But it is also narrowed to exclude
personal information reflecting a written or verbal communication or a transaction between a controller and the user, where the user is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, non-profit, or government agency and whose communications or transaction with the controller occur solely within the context of the controller conducting due diligence regarding, or providing or receiving a product or service to or from such company, partnership, sole proprietorship, non-profit, or government agency;
The revised “Information Transparency and Personal Data Control Act” expands the existing federal statutes exempted from its requirements to include the Gramm-Leach-Bliley Act, the Health Insurance Portability and Accountability Act of 1996, the Family Educational Rights and Privacy Act of 1974, and the Fair Credit Reporting Act. Consequently, the financial services firms, healthcare entities, educational institutions, and credit reporting agencies regulated under those federal statutes would be exempted from this new regime.
The revised bill cuts the language that tailored its preemption of state privacy laws so as not to preempt the following:
State constitutional, trespass, contract, data breach notification, or tort law, other than to the degree such law is substantially intended to govern the collection of sensitive personal information and the collection, storage, processing, sale, sharing with third parties, or other use of such information.
Now, it appears upon enactment of this revised bill, one could no longer sue for privacy violations on any of those grounds.
Other Developments
A bipartisan group of Senators and Representatives wrote the President asking for funding to stimulate semiconductor production in the United States (U.S.) per provisions in the FY 2021 National Defense Authorization Act (NDAA) (P.L. 116-283). In a press release issued by one of the signatories, Senator John Cornyn (R-TX), the Members stated:
We write today to encourage you to prioritize securing funding to implement the initiatives authorized in the CHIPS for America Act that were enacted into law as part of the fiscal year 2021 National Defense Authorization Act (referred to as the ‘CHIPS provisions’). We would specifically request you consider joining us in support of funding levels that are at least the authorized amounts proposed in the original bill as you work with Congress on a package of policies to better compete with China and how best to strengthen our country’s economic competitiveness and resiliency as well as national security.
In addition to enabling sustainable economic growth today, funding the CHIPS provisions is a top national security priority. The Chinese Communist Party (CCP) has aggressive plans to reorient and dominate the semiconductor supply chain, pouring over $150 billion in semiconductor manufacturing subsidies and investing $1.4 trillion in their efforts to become the dominate global technological power. Even full funding of the originally filed CHIPS provisions pales in comparison to the investments being made by the CCP, which speaks to why consideration of an even higher level of funding is worthwhile.
The United States must also work with our allies and strategic partners to out-scale the CCP in manufacturing capabilities for advanced semiconductors. If we lose these highly-skilled jobs and know-how to China, the United States will never recapture them. Further, we risk dependence on a strategic competitor for the advanced semiconductors that power our economy, military, and critical infrastructure.
As you develop your FY 2022 budget request, we encourage you to include some initial investments to support semiconductor R&D and manufacturing at agencies like Commerce, DOD, DOE, and NSF as intended by CHIPS.
Finally, should you explore executive actions to address this urgent semiconductor matter, we encourage you to continue pursuing a technology neutral approach.
Senator Josh Hawley (R-MO) has reintroduced his bill, the “No TikTok on Government Devices Act” (S.1143) that would ban United States (U.S.) government employees from downloading TikTok on their government-issued phones. This bill passed the Senate unanimously last year (S.3455), but the House never took up the bill. Representative Ken Buck (R-CO) reintroduced companion legislation (H.R.2566), and Senators Rick Scott (R-FL), Marco Rubio (R-FL), and Tom Cotton (R-AR) cosponsored the Senate version. In his press release, Hawley asserted:
My bill is a straightforward plan to protect American government data from a hostile foreign power, which, less than a year ago, passed the Senate unanimously. TikTok has repeatedly proven itself to be a malicious actor but Joe Biden and Big Tech refuse to take the threat of Chinese espionage seriously. It’s time for Congress to act.
The State Department, the Department of Homeland Security, the Department of Defense, and TSA have already banned TikTok on federal government devices.
Federal Communications Commission (FCC) acting Chair Jessica Rosenworcel “announced that the Commission will re-establish the Communications Security, Reliability, and Interoperability Council (CSRIC), with a primary focus on improving 5G network security” per the agency’s statement. The FCC added:
In addition, following security breaches that have impacted the communications sector, she will ask the Committee to review software and cloud services vulnerabilities and to develop mitigation strategies.
Finally, Acting Chairwoman Rosenworcel will seek to diversify the group’s membership to include a broad variety of stakeholders, including representation from the FCC’s federal partners with similar interests.
In the public notice, the FCC explained:
CSRIC VIII will provide advice and recommendations to the Commission to improve the security, reliability, and interoperability of the nation’s communications systems. Among other issues, Acting Chairwoman Jessica Rosenworcel will ask CSRIC VIII to identify 5G security as a primary focus. In addition, following security breaches that have impacted the communications sector, she will ask CSRIC VIII to review risks to service provider operations from attacks in software and cloud services stacks and to develop mitigation strategies. Finally, in seeking nominations for CSRIC VIII, Acting Chairwoman Rosenworcel will seek to diversify the group's membership to include a broad variety of stakeholders, including representation from the FCC's federal government partners with similar interests.
Senators Ed Markey (D-MA), Richard Blumenthal (D-CT) and Representatives Lori Trahan (D-MA) and Kathy Castor (D-FL) wrote Facebook CEO Mark Zuckerberg “regarding Facebook’s recent announcement that it is ‘exploring’ plans to launch a version of Instagram, which Facebook owns, for users under the age of 13.” They posed a series of lengthy questions aimed at obtaining more information about Facebook’s plans and the safeguards it proposes to implement to protect users under 13. They added:
Given Facebook’s past failures to protect children and in light of evidence that using Instagram may pose a threat to young users’ wellbeing, we have serious concerns about this proposal. Children are a uniquely vulnerable population online, and images of kids are highly sensitive data. Facebook has an obligation to ensure that any new platforms or projects targeting children put those users’ welfare first, and we are skeptical that Facebook is prepared to fulfil this obligation.
If Facebook’s objective is to decrease the number of users under the age of 13 on its current Instagram platform, it should invest in efforts to do that directly. The alternative approach that Facebook appears poised to take—specifically, pushing kids to sign up for a new platform that may itself pose threats to young users’ privacy and wellbeing—involves serious challenges and may do more harm than good.
Google’s YouTube announced it “will release a new metric called Violative View Rate (VVR) as part of our Community Guidelines Enforcement Report.” YouTube stated:
The report will have a separate section called ‘Views’ which will lay out historical and the Q4 (Oct-Dec 2020) VVR data, along with details on its methodology. Going forward, we will be updating this data quarterly. VVR helps us estimate the percentage of the views on YouTube that come from violative content.
Put simply, the Violative View Rate (VVR) helps us determine what percentage of views on YouTube comes from content that violates our policies. Our teams started tracking this back in 2017, and across the company it’s the primary metric used to measure our responsibility work. As we’ve expanded our investment in people and technology, we’ve seen the VVR fall. The most recent VVR is at 0.16-0.18% which means that out of every 10,000 views on YouTube, 16-18 come from violative content. This is down by over 70% when compared to the same quarter of 2017, in large part thanks to our investments in machine learning. Going forward, we will update the VVR quarterly in our Community Guidelines Enforcement Report.
VVR data gives critical context around how we're protecting our community. Other metrics like the turnaround time to remove a violative video, are important. But they don't fully capture the actual impact of violative content on the viewer. For example, compare a violative video that got 100 views but stayed on our platform for more than 24 hours with content that reached thousands of views in the first few hours before removal. Which ultimately has more impact? We believe the VVR is the best way for us to understand how harmful content impacts viewers, and to identify where we need to make improvements.
We calculate VVR by taking a sample of videos on YouTube and sending it to our content reviewers who tell us which videos violate our policies and which do not. By sampling, we gain a more comprehensive view of the violative content we might not be catching with our systems. However, the VVR will fluctuate, both up and down. For example, immediately after we update a policy, you might see this number temporarily go up as our systems ramp up to catch content that is newly classified as violative.
The National Association of Attorneys General (NAAG) wrote Twitter, eBay, and Shopify, urging them “to act immediately to prevent people from selling fraudulent Centers for Disease Control and Prevention (CDC) vaccination cards on their platforms.” The attorneys general stated:
We are deeply concerned about this use of your platforms to spread false and misleading information regarding COVID vaccines. The false and deceptive marketing and sales of fake COVID vaccine cards threatens the health of our communities, slows progress in getting our residents protected from the virus, and are a violation of the laws of many states. Multiple states’ laws provide for injunctive relief, damages, penalties, and other remedies for such conduct.
The use of your platforms to disseminate the deceptive marketing and sales of fake vaccine cards is a threat to residents of our states. As a result, we are asking you to take immediate action to prevent your platforms from being used as a vehicle to commit these fraudulent and deceptive acts that harm our communities. Such action should include, without limitation: (1) monitoring your platforms for ads or links marketing or selling, or otherwise indicating the availability of, blank or fraudulently completed vaccine cards; (2) promptly taking down ads or links identified through that monitoring; and (3) preserving records, such as the content, username, and actual user identity, pertaining to any such ads or links.
Apple and Epic Games have each filed the findings of fact and conclusions of law they hope the federal trial court will adopt when their trial starts next month. The dispute began when Epic Games allowed Fortnite players using the iPhone or iPad to pay less for in-app purchases by buying directly from Epic Games instead of using Apple’s App Store, which takes 30% of all sales. Apple and Google responded by kicking Fortnite out of their app stores, and Epic Games filed suit. Epic Games was partially successful in seeking an injunction against Apple but failed to get Fortnite back in the App Store. Epic Games has also filed claims in other jurisdictions or sought to get antitrust regulators on the field in the United Kingdom, the European Union, and Australia.
Apple claimed in its filing:
Epic objects to paying Apple a 30% commission—even though it pays the same commission to many other platforms on which Epic distributes Fortnite. When Apple refused Epic’s request for a special deal, Epic included secret code in a Fortnite update and triggered it, using a server-side “hotfix,” to allow iOS customers to purchase V-Bucks without paying Apple’s commission. This was a breach of the DPLA (as Epic concedes), so Apple terminated Epic’s developer privileges and removed Fortnite from the App Store.
This was all part of a pre-planned media strategy called “Project Liberty.” Epic retained Cravath, Swaine & Moore LLP and a public relations firm in 2019, and this lawsuit is the culmination of that effort. Epic seeks to portray Apple as the “bad guy” so that it can revive flagging interest in Fortnite. Yet, ironically, when Epic got kicked off the iOS platform, it told players that they could continue playing on consoles, PCs, and other devices—demonstrating the existence of competition and the absence of monopoly.
Apple is not a monopolist in any relevant market. Apple does not have market power over digital game transactions. Whether measured in apps or in-app purchases, output has increased while prices have stayed constant or fallen. The restrictions in Apple’s license agreements protect its intellectual property and serve a variety of procompetitive benefits including reliability, security, and privacy. Epic just wants to free-ride on Apple’s innovation.
Epic argued in its filing:
Epic Games, Inc. (“Epic”) has suffered harm from Apple’s conduct. (See Section IX below.)
Epic has an app store, the Epic Games Store, that it has launched on personal computers (“PCs”) and Macs. Epic would launch the Epic Games Store on iOS if it could, but Apple will not allow it to do so. If the Epic Games Store were on iOS, it would provide consumers with the benefits of competition in iOS app distribution.
Absent Apple’s rules, Epic would not distribute its apps through the App Store. (Sweeney.) Instead, Epic would distribute its apps through other means, including from its website and through EGS. (Sweeney.) By distributing its apps through the App Store, Epic has paid supra-competitive commissions and been deprived of the benefits that would flow from a competitive market. (Evans.)
Epic has its own payment processing functionality—Epic direct payment. Apple forbids Epic from using Epic direct payment on iOS, depriving Epic’s customers of payment choices that would allow Epic to offer lower prices and comprehensive customer service.
Many other app developers are harmed by Apple’s practices. (See Part VIII below.)
There are app developers that are unable to innovate and produce products because of Apple’s practices.
There are app developers that cannot offer safety features to users because of Apple’s practices.
There are app developers that cannot financially survive because of Apple’s practices, depriving customers of their offerings altogether.
Muslim Advocates is suing Facebook, arguing that the platform has violated the District of Columbia’s consumer protection laws, a theory framed in an attempt to circumvent the liability shield of 47 U.S.C. § 230.
In its complaint, Muslim Advocates argued:
Facebook’s executives might believe that they are legally entitled to operate a social media platform that acts as a cesspool for hate. But what its executives certainly cannot do is misrepresent to Congress, national civil rights leaders, and its users in the District of Columbia that Facebook does, in fact, remove or take down content that violates its own standards and policies while routinely refusing to do so. Facebook has no free license to make false or deceptive statements in the District of Columbia as part of its longstanding campaign to make Congress and the American people believe that Facebook is a safe product and to discourage increased regulation by Washington. Just as car manufacturers cannot make false statements about the risk of their vehicles to drivers and pedestrians to drive up sales, Facebook and its executives cannot make false statements about the content and groups that they permit to flourish on Facebook in order to convince the public to keep using its platform and drive-up Facebook’s massive profits.
Facebook routinely refuses to remove hateful and harmful content because, at least in part, doing so is financially lucrative. But making false and deceptive statements about removing hateful and harmful content is illegal in the District of Columbia.
In this case, Muslim Advocates asks Facebook’s leaders to make a clear and simple choice: stop misrepresenting that you will remove content that violates your policies or conform your deeds to your words. Every business that operates in the District of Columbia must follow the same rules, and those rules offer no exception for publicly-traded tech companies.
Further Reading
“A massive Facebook breach underscores limits to current data breach notification laws” by Tonya Riley — The Washington Post. Facebook does not appear to have alerted users of the massive breach of data, and there is no federal data breach law that would require it to. Perhaps states will allege violations of their data breach laws. As a number of experts point out in this piece, despite the size of the breach, Members of Congress are not clamoring for hearings, which may be a commentary on the regularity of breaches or on their focus on other tech issues.
“How Facebook’s Ad System Lets Companies Talk Out of Both Sides of Their Mouths” by Jeremy Merrill — The Markup. No one should be surprised that advertisers are using microtargeting to show ads to people based on their political persuasion.
“How the far-right group ‘Oath Enforcers’ plans to harass political enemies” by Jason Wilson — The Guardian. Sounds like right-wing groups may escalate their harassment activities.
“YouTube says it’s getting better at taking down videos that break its rules. They still number in the millions.” by Gerrit De Vynck — The Washington Post.
“China researchers face abuse, sanctions as Beijing looks to silence critics” by Lily Kuo and Gerry Shih — The Washington Post.
Coming Events
On 20 April, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “Strengthening the Federal Trade Commission’s (FTC) Authority to Protect Consumers” with the four FTC commissioners.
The House Agriculture Committee will hold a hearing titled “Rural Broadband - Examining Internet Connectivity Needs and Opportunities in Rural America” on 20 April.
On 21 April, the Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing titled “Improving Access and Inclusivity in the Patent System: Unleashing America’s Economic Engine.”
The Senate Judiciary Committee’s Competition Policy, Antitrust, and Consumer Rights Subcommittee will hold a hearing titled “Antitrust Applied: Examining Competition in App Stores” on 21 April.
The House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Leading the Wireless Future: Securing American Network Technology” on 21 April.
On 21 April, the Senate Armed Services Committee’s Personnel Subcommittee will hold a hearing “on the current and future cyber workforce of the Department of Defense and the military services.”
On 21 April, the Senate Armed Services Committee’s Emerging Threats and Capabilities Subcommittee will hold a hearing “on science and technology, technology maturation, and technology transition activities.”
The Federal Communications Commission (FCC) will hold an open meeting on 22 April with this draft agenda:
Text-to-988. The Commission will consider a Further Notice of Proposed Rulemaking to increase the effectiveness of the National Suicide Prevention Lifeline by proposing to require covered text providers to support text messaging to 988. (WC Docket No. 18-336)
Commercial Space Launch Operations. The Commission will consider a Report and Order and Further Notice of Proposed Rulemaking that would adopt a new spectrum allocation for commercial space launch operations and seek comment on additional allocations and service rules. (ET Docket No. 13-115)
Wireless Microphones. The Commission will consider a Notice of Proposed Rulemaking that proposes to revise the technical rules for Part 74 low-power auxiliary station (LPAS) devices to permit a recently developed, and more efficient, type of wireless microphone system. (RM-11821; ET Docket No. 21-115)
Improving 911 Reliability. The Commission will consider a Third Notice of Proposed Rulemaking to promote public safety by ensuring that 911 call centers and consumers receive timely and useful notifications of disruptions to 911 service. (PS Docket Nos. 13-75, 15-80; ET Docket No. 04-35)
Concluding the 800 MHz Band Reconfiguration. The Commission will consider an Order to conclude its 800 MHz rebanding program due to the successful fulfillment of this public safety mandate. (WT Docket No. 02-55)
Enhancing Transparency of Foreign Government-Sponsored Programming. The Commission will consider a Report and Order to require clear disclosures for broadcast programming that is sponsored, paid for, or furnished by a foreign government or its representative. (MB Docket No. 20-299)
Imposing Application Cap in Upcoming NCE FM Filing Window. The Commission will consider a Public Notice to impose a limit of ten applications filed by any party in the upcoming 2021 filing window for new noncommercial educational FM stations. (MB Docket No. 20-343)
Enforcement Bureau Action. The Commission will consider an enforcement action.
On 29 April, the Senate Commerce, Science, and Transportation Committee will consider the nomination of Eric Lander to be Director of the Office of Science and Technology Policy (OSTP).
The Federal Trade Commission (FTC) will hold a workshop titled “Bringing Dark Patterns to Light” on 29 April.
The Department of Commerce’s National Telecommunications and Information Administration (NTIA) will hold “a virtual meeting of a multistakeholder process on promoting software component transparency” on 29 April.
On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.