EDPB Explains Reasoning in Directing DPC To Strengthen Initial WhatsApp Decision
The SEC fines firms for security failures; Committee investigating 6 January attacks wants technology records; noyb sues over cookie paywalls
In the last issue, we looked at the €225 million punishment the Data Protection Commission (DPC) levied on WhatsApp for failing to meet the General Data Protection Regulation’s (GDPR) requirements on transparency in its data processing (see here for more detail and analysis). Today, let’s look at the European Data Protection Board’s (EDPB) binding decision that directed the DPC to revise and stiffen its proposed punishment of WhatsApp, which is more extensive than just a fine, for the company has a significant list of items on which it must act or face future regulatory action. But first, a little background.
The DPC started investigating WhatsApp in December 2018 after complaints from people using WhatsApp and people not using the messaging app. The DPC received 88 complaints from other DPAs, some of which ultimately objected to the DPC’s draft decision. Per the GDPR, the DPC circulated its draft decision in December 2020, and after assessing the objections it received from the other DPAs (aka supervisory authorities (SAs)) and considering further WhatsApp input, the DPC issued a compromise decision. However, this compromise satisfied some of the DPAs but not others, and in May 2021 the DPC referred the matter to the EDPB for resolution.
In late July 2021, the EDPB announced its adoption of an Article 65 dispute resolution decision that “seeks to address the lack of consensus on certain aspects of a draft decision issued by the Irish (IE) SA as LSA regarding WhatsApp Ireland Ltd. (WhatsApp IE) and the subsequent objections expressed by a number of concerned supervisory authorities (CSAs).” At the time, however, this press release was the only public information offered about the decision, which has now been released. The EDPB has already issued one Article 65 decision regarding the DPC that functionally overruled the agency and revised upward its proposed punishment of Twitter for data breaches. In a recent related development, the EDPB also turned down the Hamburg DPA’s request for an urgent binding order on WhatsApp and Facebook’s new privacy policy and terms of service (see here for more detail and analysis). The EDPB did urge the DPC to investigate, which appears to overlap with this Article 65 proceeding. In the press release, the EDPB further explained:
§ The LSA issued the draft decision following an own-volition inquiry into WhatsApp IE, concerning whether WhatsApp IE complied with its transparency obligations pursuant to Art. 12, 13 & 14 GDPR. On 24 December 2020, the LSA shared its draft decision with the CSAs in accordance with Art. 60 (3) GDPR.
§ The CSAs issued objections pursuant to Art. 60 (4) GDPR concerning, among others, the identified infringements of the GDPR, whether specific data at stake were to be considered personal data and the consequences thereof, and the appropriateness of the envisaged corrective measures.
§ The IE SA was unable to reach consensus, having considered the objections of the CSAs, and consequently indicated to the Board it would not follow the objections. Accordingly, the IE SA referred them to the EDPB for determination pursuant to Art. 65 (1) (a) GDPR, thereby initiating the dispute resolution procedure.
§ Today, the EDPB adopted its binding decision. The decision addresses the merits of the objections found to be “relevant and reasoned” in line with the requirements of Art. 4 (24) GDPR. The EDPB will shortly notify its decision formally to the concerned supervisory authorities.
§ The IE SA shall adopt its final decision, addressed to the controller, on the basis of the EDPB decision, without undue delay and at the latest one month after the EDPB has notified its decision. The EDPB will publish its decision on its website without undue delay after the IE SA has notified their national decision to the controller.
Coming to the present, in tandem with the publication of the DPC’s revised decision, the EDPB issued “Binding decision 1/2021 on the dispute arisen on the draft decision of the Irish Supervisory Authority regarding WhatsApp Ireland under Article 65(1)(a) GDPR,” in which it assessed the DPC’s draft decision and the “relevant and reasoned objections” made by the DPC’s fellow DPAs. Generally speaking, the other regulators saw WhatsApp’s conduct as far graver and deserving of a more significant punishment. One criticized the DPC for not also investigating whether WhatsApp was sharing personal data with Facebook and whether this data was being transferred out of the EU. The EDPB ended up in the middle on the appropriate punishment but often sided with the CSAs on the messaging app’s GDPR infringements. One could conclude the EDPB was not impressed with the conclusions the DPC drew from its investigation, particularly since the Board repeatedly returned to the DPC investigator’s findings on WhatsApp.
The EDPB summarized its decision:
Not much needs to be said about the EDPB’s summary other than the fact that the German, French, Hungarian, Italian, Polish, and Dutch data protection authorities (DPA) objected to the DPC’s draft decision and punishment.
The EDPB summarized the objections of the DPAs to the DPC’s draft decision:
The EDPB explained the DPC tried to address some of these objections in a compromise draft, but “the responses received from the concerned supervisory authorities (CSA) in relation to the remaining objections showed that there was no single proposed compromise position that was agreeable to all of the relevant CSAs.” And so, the DPC could not please all the CSAs, and the matter was kicked over to the EDPB to decide, the second time a proposed DPC decision has had to be resolved there.
The EDPB states at the outset of its analysis of the CSAs’ objections that it will only analyze those that are “relevant and reasoned” (i.e., the threshold for the EDPB to intervene under the GDPR). Moreover, the Board takes no position on objections that do not meet this threshold, leaving them open for potential future action.
The first objection the EDPB examines concerns the DPC’s analysis of whether WhatsApp was transparent enough in explaining to users its alleged legitimate interests in processing data per Article 13(1)(d) (i.e., the requirement that a controller disclose, at the time personal data is collected, the legitimate interests the controller or a third party is pursuing under the GDPR). The EDPB found the objections the CSAs raised to be relevant and reasoned despite the DPC’s arguments to the contrary; specifically, the Board agreed that WhatsApp was not transparent enough in explaining its legitimate interests for people to have sufficient information to exercise their other rights. Accordingly, the EDPB agreed with the CSAs (and incidentally the DPC’s investigator) that WhatsApp’s disclosures about the purposes of and legitimate interests in its data processing did not meet the requirements of Article 13(1)(d) and ordered the DPC to revise its decision accordingly.
The relevant portions of this analysis are:
As noted, the EDPB concluded:
Presumably, WhatsApp would be wise to consult the Article 29 Working Party’s guidance on transparency, which the EDPB ratified shortly after its establishment. In a footnote, the EDPB quoted from these guidelines an example of what not to do that sounds close to how WhatsApp described its “legitimate interests” and the purposes for processing:
Next the EDPB turns to WhatsApp’s “Lossy Hashing” procedure, which supposedly turns the phone numbers of a user’s contacts into a hash within seconds. This analysis implicates non-users of WhatsApp and a different but related article of the GDPR. The DPC initially found this processing did not pertain to personal data, and consequently that the company’s infringement of the GDPR’s Article 14 was not as serious, meaning a significant reduction in the proposed fine. Again, the CSAs sided with the DPC’s investigator in arguing that phone numbers remain personal data despite the alleged transformation. Things get very technical hereafter, and I struggled to follow the arguments of the CSAs as explained by the EDPB. However, the gist seems to be that the lossy hashing procedure does not lead to the anonymization of non-users’ personal data, at which point the GDPR would no longer apply. Rather, the CSAs argued, the process leads only to pseudonymization, and WhatsApp is able to easily reidentify users. Moreover, the 16 numbers associated with each lossy hashed non-user are likely the maximum number of numbers and not the minimum. The DPC came around to some of these arguments, especially in light of the risk of establishing an untenable precedent, but worried about losing in court. The DPC tried to split the baby by proposing:
The EDPB ultimately found all the objections raised to be relevant and reasoned and further explained:
On this point, the EDPB sided with the CSAs again:
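To make the pseudonymization argument concrete, here is a purely illustrative Python sketch; the function name, the choice of SHA-256, and the truncation length are my own assumptions, not WhatsApp’s actual procedure. Truncating a hash makes several phone numbers collide in the same bucket, but anyone who can enumerate the phone-number space can still brute-force a small candidate set for any bucket, which is why the CSAs regarded the output as pseudonymized rather than anonymized.

```python
import hashlib

def lossy_hash(phone: str, bits: int = 16) -> int:
    """Hash a phone number and keep only the top `bits` bits,
    so many numbers collide in one bucket (the "lossy" part)."""
    digest = hashlib.sha256(phone.encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

# A phone-number space an attacker could plausibly enumerate
# (hypothetical numbers for illustration only).
numbers = [f"+3531000{i:04d}" for i in range(10_000)]

# "Re-identification": given only a bucket value, brute-force
# the space and collect every number landing in that bucket.
target = lossy_hash(numbers[42])
candidates = [n for n in numbers if lossy_hash(n) == target]
print(candidates)  # a small candidate set, not anonymity
```

Because valid phone numbers form a small, enumerable space, each bucket maps back to only a handful of candidates; the shorter the truncated hash, the bigger the bucket and the weaker the link to any one person, which is essentially what the disputed “16 numbers per lossy hash” figure measures.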
Germany’s DPA leveled additional objections, including whether the DPC got the scope of the investigation right and whether the agency should have consulted with other DPAs in making this determination at the outset of the investigation. Clearly, the German DPA favored a wider-reaching investigation, especially into whether WhatsApp was sharing personal data with its corporate parent, Facebook. The DPC did not agree and referenced other, ongoing investigations into both WhatsApp and Facebook with the latter pertaining to international transfers. This is most likely a reference to the long-running litigation brought by privacy advocate Max Schrems that has resulted in two EU adequacy decisions regarding the U.S. being struck down. Earlier this year, Schrems and the DPC reached a settlement that could expedite the agency’s determination of whether Facebook must stop transferring personal data to the U.S. on the basis of the court decisions. In any event, the EDPB did not find the German DPA’s objections relevant and reasoned.
The EDPB did find the Italian DPA’s arguments on whether WhatsApp violated the transparency provisions in Article 5 to be relevant and reasoned. The EDPB found:
The EDPB agreed with CSAs that took issue with the DPC declining to find violations of Article 13(2)(e), which controls some of the information controllers must give to people at the point of data collection. In this case, this provision pertains to “whether the provision of personal data is a statutory or contractual requirement, or a requirement necessary to enter into a contract, as well as whether the data subject is obliged to provide the personal data and of the possible consequences of failure to provide such data.” The EDPB found this to be a relevant and reasoned objection:
On a different infringement the CSAs found under Article 5(1)(c), the EDPB stated the file of the case lacks the elements necessary to make this determination.
The EDPB analyzes at length how the DPC calculated the annual turnover for WhatsApp and Facebook, which serves as the basis for any administrative fine, and other facets of how to properly calculate fines. Ultimately, the EDPB reveals the DPC wanted to fine WhatsApp €30-50 million, to which the CSAs objected. For example, the German DPA advocated for a fine in the “upper range,” closer to 4% of Facebook’s annual worldwide turnover (over $80 billion), which would suggest a fine of roughly $2 billion or more, given that 4% of $80 billion is about $3.2 billion. The other DPAs lodged similar objections, arguing that a low fine does nothing to dissuade WhatsApp and other companies from similar future conduct. The EDPB found:
The EDPB directed the DPC to reassess the proposed fine:
One general point I didn’t make in the last post about the DPC’s decision, though it is relevant there as well, is how long it takes some DPAs to move from investigation to a final decision. In the timeline here, the DPC started investigating WhatsApp in December 2018, and even if no other DPA had leveled relevant and reasoned objections to the draft decision, the matter would have been settled in December 2020 at the earliest. In those two years, it appears WhatsApp was free to continue collecting and processing personal data as it wished, for there was no sort of order or injunction blocking these activities. Perhaps this is a matter of resources, for in Brave’s 2020 report on DPAs, the browser company alleged that “the EU Member States have not given data protection authorities (DPAs) the tools they need to enforce the GDPR.” And the DPC has complained that its recent funding increases may be inadequate for its workload, which is significant because it is the LSA for Facebook, WhatsApp, Twitter, Apple, Microsoft, and others in the EU.
Other Developments
§ The United Kingdom’s National Cyber Security Centre (NCSC) CEO Lindy Cameron and the United States’ Cybersecurity and Infrastructure Security Agency (CISA) Director Jen Easterly held a face to face meeting in London and “affirmed their commitment to tackling ransomware.” Cameron and Easterly also “discuss[ed] their organisations’ priorities, including combatting ransomware” and “reflected on the impact of ransomware attacks this year and the need for industry collaboration to complement government’s operational efforts against ransomware.” The NCSC and CISA continued:
o The issue of gender diversity was also on the agenda, with both agreeing that more needed to be done to remove barriers to entry into the profession for women and girls.
o They discussed the NCSC’s CyberFirst Girls Competition, which aims to get more girls interested in cyber through fun but challenging team events for teenagers, and CISA’s ongoing commitment to expanding opportunities for young women and girls to pursue careers in cyber security and technology and closing the gender gap that exists in these fields.
o The two leaders also discussed government collaboration with industry, including the NCSC’s Industry 100 scheme and CISA’s Joint Cyber Defense Collaborative.
o The Industry 100 scheme has integrated public and private sector talent in the UK to pool their knowledge to tackle key cyber security issues. The Joint Cyber Defense Collaborative has similarly brought American public and private sector entities together to unify crisis action planning and defend against threats to U.S. critical infrastructure.
§ The United States (U.S.) Securities and Exchange Commission (SEC) “sanctioned eight firms in three actions for failures in their cybersecurity policies and procedures that resulted in email account takeovers exposing the personal information of thousands of customers and clients at each firm.” The SEC stated:
o The eight firms, which have agreed to settle the charges, are: Cetera Advisor Networks LLC, Cetera Investment Services LLC, Cetera Financial Specialists LLC, Cetera Advisors LLC, and Cetera Investment Advisers LLC (collectively, the Cetera Entities); Cambridge Investment Research Inc. and Cambridge Investment Research Advisors Inc. (collectively, Cambridge); and KMS Financial Services Inc. (KMS). All were Commission-registered as broker dealers, investment advisory firms, or both.
o According to the SEC's order against the Cetera Entities, between November 2017 and June 2020, cloud-based email accounts of over 60 Cetera Entities' personnel were taken over by unauthorized third parties, resulting in the exposure of personally identifying information (PII) of at least 4,388 customers and clients. None of the taken over accounts were protected in a manner consistent with the Cetera Entities' policies. The SEC's order also finds that Cetera Advisors LLC and Cetera Investment Advisers LLC sent breach notifications to the firms' clients that included misleading language suggesting that the notifications were issued much sooner than they actually were after discovery of the incidents.
o According to the SEC's order against Cambridge, between January 2018 and July 2021, cloud-based email accounts of over 121 Cambridge representatives were taken over by unauthorized third parties, resulting in the PII exposure of at least 2,177 Cambridge customers and clients. The SEC's order finds that although Cambridge discovered the first email account takeover in January 2018, it failed to adopt and implement firm-wide enhanced security measures for cloud-based email accounts of its representatives until 2021, resulting in the exposure and potential exposure of additional customer and client records and information.
o According to the SEC's order against KMS, between September 2018 and December 2019, cloud-based email accounts of 15 KMS financial advisers or their assistants were taken over by unauthorized third parties, resulting in the PII exposure of approximately 4,900 KMS customers and clients. The SEC's order further finds that KMS failed to adopt written policies and procedures requiring additional firm-wide security measures until May 2020, and did not fully implement those additional security measures firm-wide until August 2020, placing additional customer and client records and information at risk.
o The SEC's orders against each of the firms finds that they violated Rule 30(a) of Regulation S-P, also known as the Safeguards Rule, which is designed to protect confidential customer information. The SEC's order against the Cetera Entities also finds that Cetera Advisors LLC and Cetera Investment Advisers LLC violated Section 206(4) of the Advisers Act and Rule 206(4)-7 in connection with their breach notifications to clients. Without admitting or denying the SEC's findings, each firm agreed to cease and desist from future violations of the charged provisions, to be censured and to pay a penalty. The Cetera Entities will pay a $300,000 penalty, Cambridge will pay a $250,000 penalty, and KMS will pay a $200,000 penalty.
§ The United Kingdom’s NHSX published “new digital guidelines to support local National Health Service (NHS) leaders and organisations to transform services for patients.” NHSX explained:
o The guidance calls for patients to be able to digitally access their care plans and test results, for trusts to explore new ways of delivering care such as remote monitoring and consultations, and to improve care through the use of electronic prescribing systems.
o The What Good Looks Like framework gives NHS managers clear, easy-to-use instructions on what they should be doing to use digital better in their service, and how they should be paying for it.
o It describes the common foundation that should be in place across the NHS, from using a secure digital infrastructure to ensuring that digital systems are designed to meet the needs of their staff and patients.
o NHSX say the resources are an important step in continuing to digitise NHS services and build on the progress made in adopting digital tools during the pandemic.
o The guidelines will be followed up with an assessment process to be outlined by NHSX later this year, so NHS services can identify their gaps and prioritise areas for investment and improvement.
§ The National Institute of Standards and Technology (NIST) issued “the final version of NISTIR 8259B, IoT Non-Technical Supporting Capability Core Baseline…[that] complements the technical abilities defined in NISTIR 8259A, IoT Device Cybersecurity Capability Core Baseline (May 2020) by defining a baseline set of activities that manufacturers should undertake during the planning, development, and operational life of Internet of Things (IoT) devices to address the cybersecurity needs and goals of their customers.” NIST stated:
o The baseline contained in NISTIR 8259B has been refined over more than a year, starting with the initial publication of an online catalog of technical and non-technical capabilities in June 2020. The need for non-technical supporting capabilities was the subject of a panel discussion during the July 2020 workshop on building the Federal Profile, and a draft baseline was published in December 2020 as one of a set of four draft documents released for public comment. Since publishing the drafts, NIST has presented a webinar (January 2021), accepted public comments, held stakeholder outreach meetings (January – April 2021), and conducted a series of four roundtable discussions (June 2021) to gather feedback and refine the contents of NISTIR 8259B and its companion documents. Feedback on each document was also considered for its impact on the companion documents in the set.
o NIST believes that NISTIRS 8259A and 8259B provide a well-founded baseline of capabilities that manufacturers can apply to enhance the cybersecurity of their IoT devices.
§ France’s Commission Nationale de l'Informatique et des Libertés (CNIL) created and published a map detailing “the level of data protection” so companies may understand when they may transfer personal data legally.
§ The Select Committee to Investigate the January 6th Attack on the United States Capitol wrote a number of technology companies and platforms requesting information related to the insurrection attempt on 6 January 2021. The committee asserted:
o This expansion of the Select Committee’s probe comes on the heels of Wednesday’s demands for records from eight Executive Branch agencies. It also follows the July 27 hearing at which four police officers testified about their experiences on January 6th defending the U.S. Capitol in the face of a violent mob aiming to derail the peaceful transfer of power. The officers’ call to action underscored the importance of the Select Committee’s mandate to uncover the facts about January 6th and its causes and to help ensure such an attack on American democracy cannot happen again.
o The letters to the social media companies seek a range of records, including data, reports, analyses, and communications stretching back to spring of 2020. The Select Committee is also seeking information on policy changes social media companies adopted—or failed to adopt—to address the spread of false information, violent extremism, and foreign malign influence, including decisions on banning material from platforms and contacts with law enforcement and other government entities.
o The committee said the following entities received requests:
§ 4chan
§ 8kun (formerly known as 8chan)
§ Gab
§ Parler
§ Snapchat
§ Telegram
§ TikTok
§ Twitch
§ YouTube
§ Zello
§ none of your business (noyb) “filed complaints against the cookie paywalls of seven major German and Austrian news websites: SPIEGEL.de, Zeit.de, heise.de, FAZ.net, derStandard.at, krone.at and t-online.de.” noyb provided this background:
o An increasing amount of websites asks their users to either agree to data being passed on to hundreds of tracking companies (which generates a few cents of revenue for the website) or take out a subscription (for up to € 80 per year). Can consent be considered “freely given” if the alternative is to pay 10, 20 or 100 times the market price of your data to keep it to yourself?
o Personalized advertisement necessary for survival? Media outlets are financed by advertisements, subscriptions, subsidies or donations. Online, too, a large part of advertising is not personalized, just like with ads in print media, radio or television. While media companies make good money with directly booked and mostly non-personalized advertising, the "leftover space" on websites is sold off to Ad-Tech companies for a few cents. Valuable user data is then sent to these competitors, which thereby take the lion’s share for themselves.
o Profit and data remain with Google & Co. According to a U.S study, media outlets only get about 4% of the additional revenue from passing on data. In the Netherlands, public broadcasters make more money online with non-personalized advertising than by passing on data to tracking companies.
o Readers to "buy back" data at an exorbitant price? Only about 3% of all users want to agree to the processing of their data. Starting with derStandard.at in Austria, an increasing number of media outlets have implemented a "pay or okay" solution. Users have no free choice whether they want to consent (as provided for in the GDPR), but have to take out a subscription if they do not want to give their consent.
o Saying "no" to tracking is not only time-consuming (you have to enter your name, address and credit card data), but users also have to dig deep into their pockets: while the media companies only get a few cents per user for passing on data, SPIEGEL and FAZ currently charge € 59.88 per year for a tracking-free subscription. Die ZEIT charges € 62.40 and derStandard.at even € 84 per year for its "PUR subscription" without any form of advertising. As soon as these figures are compared to the total revenues of media companies, the extent of the profiteering is obvious: If all readers of spiegel.de took out a "PUR subscription," the company would generate a revenue of around €1.2 billion. Right now, their current digital revenue is around €76.9 million. In other words, the costs for these subscriptions go far beyond making up for lost ad revenue when users do not agree to tracking.
§ Senators Amy Klobuchar (D-MN), Bill Cassidy (R-LA), and Jon Ossoff (D-GA) wrote Amazon CEO Andy Jassy “requesting information about Amazon’s data collection practices involving biometrics.” They “expressed concerns about the company’s use of data gathered by Amazon One, the company’s palm-print recognition and payment system, noting that this data could be used to ‘further cement its competitive power and suppress competition across various markets.’” Klobuchar, Cassidy, and Ossoff stated:
o Amazon One appears to be a biometric data recognition system that allows consumers to pay for their purchases in grocery stores, book stores, and other retail settings using their palm print. Consumers can enroll in the program at any location with an Amazon One device by scanning one or both palms and entering their phone and credit card information. Amazon One devices are currently in use in more than 50 retail locations throughout the United States, including in Minnesota. Locations with the technology currently include Amazon Go stores, Whole Foods locations, and other Amazon stores.
o Recent reports indicate that Amazon is incentivizing consumers to share their biometric information with Amazon One by offering a $10 promotional credit for Amazon.com products. Amazon has also announced that they have plans to expand Amazon One, which may include introducing the technology in other Amazon stores as well as selling it to third-party stores. Amazon’s expansion of biometric data collection through Amazon One raises serious questions about Amazon’s plans for this data and its respect for user privacy, including about how Amazon may use the data for advertising and tracking purposes.
o Offering products from home devices to health services, Amazon possesses a tremendous amount of user data on the activities of hundreds of millions of Americans. Our concerns about user privacy are heightened by evidence that Amazon shared voice data with third-party contractors and allegations that Amazon has violated biometric privacy laws. We are also concerned that Amazon may use data from Amazon One, including data from third-party customers that may purchase and use Amazon One devices, to further cement its competitive power and suppress competition across various markets.
o Amazon One users may experience harms if their data is not kept secure. In contrast with biometric systems like Apple’s Face ID and Touch ID or Samsung Pass, which store biometric information on a user’s device, Amazon One reportedly uploads biometric information to the cloud, raising unique security risks. Like many companies, Amazon has been affected by hacks and vulnerabilities that have exposed sensitive information, such as user emails. Amazon’s various home device systems have leaked information or been hacked, as highlighted in a recent letter to the Federal Trade Commission (FTC) from 48 advocacy organizations. Company whistleblowers earlier this year also raised concerns about Amazon’s security practices. Data security is particularly important when it comes to immutable customer data, like palm prints.
§ The United Kingdom’s National Cyber Security Centre (NCSC) published “guidance for IT administrators on a new reporting tool that can be added to their organisation’s Microsoft Office 365 accounts.” The NCSC added:
o By clicking the new button, employees will report potential scams directly to the NCSC’s Suspicious Email Reporting Service (SERS) as well as their organisation’s IT team. The automated NCSC service will process emails and take down previously unseen malicious content where found.
o Since its launch in April 2020, the Suspicious Email Reporting Service has received over 6,500,000 reports from the public – resulting in the removal of more than 97,000 scam URLs. This July, it took just four hours on average to remove malicious URLs in phishing emails reported to the SERS.
§ The American Bankers Association, Bank Policy Institute, and Consumer Bankers Association wrote Senate Intelligence Committee Chair Mark Warner (D-VA) and Ranking Member Marco Rubio (R-FL) regarding the “Cyber Incident Notification Act of 2021” (S.2407) and detailed their proposed changes to the bill:
o Extend the timeline for reporting to 72 hours after confirmation an incident has occurred. As drafted, the legislation requires the filing of a report within 24 hours of a cybersecurity incident. The initial stages of an incident response require “all hands on deck” to focus immediately on understanding the incident and implementing mitigation and response measures. Filing government reports would not only distract from that work but also result in reports that are premature and likely erroneous. Here it is important to distinguish between notification and a formal report. The European Union’s NIS Directive as well as the recent Notice of Proposed Rulemaking on Computer-Security Incident Notification Requirements from U.S. financial regulators recognize that within the first 24-36 hours, firms will have limited information on an event and thus call for a simple notification that a cyber incident of a sufficient materiality has occurred, with more detailed reporting to follow. Extending the reporting timeline in the legislation to 72 hours after confirmation an incident has occurred would also be more consistent with the bill’s definition of a “cybersecurity intrusion” which includes incidents involving nation-states or advanced persistent threats – both of which firms would be unable to determine within a 24-hour period given the need for assistance and confirmation of attribution from federal agencies.
o Narrow the scope of reporting to incidents causing actual harm. The bill requires reporting of “potential incidents,” which would create near-constant reporting to CISA by financial services firms based on the number of incidents those firms see on a daily basis. Collecting information on potential incidents would add noise to the signal of material incidents, and thus overwhelm rather than enhance CISA’s analytical efforts. We recommend that the legislation require reporting of incidents that cause actual harm.
o Ensure alignment with existing regulations and avoid duplication with Sector Risk Management Agencies (SRMA). As you are aware, financial services firms are already subject to significant cyber reporting requirements. As drafted, the legislation requires reporting to both CISA and the SRMA. For the financial sector, U.S. Treasury serves as SRMA, but not as regulator as implied in the legislation. Primary regulators that would receive additional reporting include the Federal Reserve Board, the Office of the Comptroller of the Currency, the Federal Deposit Insurance Corporation, and the Securities and Exchange Commission, among others. We have no objection to reporting to CISA; however, we recommend that the legislation include a mandate for CISA to work with all the other regulatory agencies to develop a common reporting form and streamlined process that would be good for one and good for all. Otherwise, still more time will be spent by first responders working with firms’ legal and compliance teams to ensure that each agency’s requirement is met rather than focusing those efforts on protecting critical infrastructure.
o Ensure the rulemaking process allows for meaningful dialogue with critical infrastructure. The rulemaking process should include greater coordination and discussion with critical infrastructure, as many of the details around definitions, the scope of reporting, and specific requirements will be determined through this process. Getting these details right is essential, and the process would benefit from an initial 90-day consultation period with industry followed by a 90-day comment period.
o Harmonize financial penalties for non-compliance with the existing regulatory framework. The legislation includes penalties for firms that fail to report, and we agree that any requirement must come with an enforcement mechanism. Our concern is that financial services firms could be subject to multiple enforcement actions and multiple penalties for the same reporting violation. Here again, we would recommend that the legislation mandate that CISA coordinate any enforcement action and ensure that there are not duplicative penalties for the same conduct.
o Develop mechanisms to notify a critical infrastructure entity when an incident affects a federal system holding the entity’s sensitive data. Many government agencies and regulatory authorities hold sensitive financial institution data that, if breached, could pose risks to national security. Legislation should encourage bi-directional information sharing and greater collaboration between government and critical infrastructure. Should a federal agency experience a cyber incident affecting the operations and security of systems holding sensitive private sector data, notifying the private entity would allow institutions to take proactive measures to mitigate potential attacks.
§ The United States (U.S.) Federal Financial Institutions Examination Council (FFIEC) issued guidance “that provides financial institutions with examples of effective authentication and access risk management principles and practices for customers, employees, and third parties accessing digital banking services and information systems.” FFIEC said “[t]he guidance:
o Highlights the current cybersecurity threat environment including increased remote access by customers and users, and attacks that leverage compromised credentials; and mentions the risks arising from push payment capabilities.
o Recognizes the importance of the financial institution’s risk assessment to determine appropriate access and authentication practices for the wide range of users accessing financial institution systems and services.
o Supports a financial institution’s adoption of layered security and underscores weaknesses in single-factor authentication.
o Discusses how multi-factor authentication or controls of equivalent strength can more effectively mitigate risks.
o Includes examples of authentication controls, and a list of government and industry resources and references to assist financial institutions with authentication and access management.
o The new guidance replaces previous documents issued in 2005 and 2011.
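The FFIEC’s point about multi-factor authentication being stronger than single-factor login can be illustrated with a minimal sketch of one common second factor: a time-based one-time password (TOTP) per RFC 6238. This example is illustrative only and is not drawn from the FFIEC guidance, which does not prescribe any particular implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Return an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    # Number of 30-second time steps since the Unix epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: shared secret "12345678901234567890", time = 59s
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59))  # → 287082
```

In a layered-security design of the kind the guidance describes, a code like this is verified server-side in addition to the password, typically with a small tolerance window (e.g. ±1 time step) for clock drift.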
§ The European Data Protection Board (EDPB) responded to Member of the European Parliament Sophie in’t Veld “regarding the use of Automatic Image Recognition System on migrants in Italy.” The EDPB stated:
o Concerning the use of facial recognition technologies to monitor disembarkation operations in Italy by police authorities, the Italian Data Protection Authority, according to the information shared, examined the so-called "Sari Real Time System", on the basis of a data protection impact assessment carried out by the Ministry of Interior, in accordance with the national legislation implementing the Law Enforcement Directive (EU) 2016/680, prior to the activation of the said system. In particular, this system was not designed to be used specifically for migration, asylum and border control activities, but in general to operate in support of investigative activities.
o In this regard, it is worth mentioning the Joint Opinion 5/2021 recently adopted by the EDPB and the EDPS on the Commission’s proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act), which includes specific restrictions on the use of AI systems for ‘real-time’ remote biometric identification for the purpose of law enforcement.
o In this Joint Opinion, the EDPB and the EDPS underlined that the use of AI in the area of police and law enforcement requires area-specific, precise, foreseeable and proportionate rules that need to consider the interests of the persons concerned and the effects on the functioning of a democratic society.
o More specifically, they called for a general ban on any use of AI for an automated recognition of human features, such as faces, in publicly accessible spaces, in any context. A ban was equally recommended on AI systems categorising individuals from biometrics, including face recognition, into clusters according to ethnicity or other grounds for discrimination under Article 21 of the Charter. Furthermore, the EDPB and the EDPS considered that the use of AI systems intended to be used by competent public authorities, such as polygraphs and similar tools to detect the emotional state of a natural person, is highly undesirable and should be prohibited.
o In addition, as mentioned in its two-year work programme for 2021-2022, the EDPB is currently working on Guidelines on the use of facial recognition technologies in the area of law enforcement.
o Therefore, the EDPB is fully aware of the importance of ensuring that the fundamental rights and freedoms of individuals, including the right to privacy and data protection, are adequately safeguarded when individuals’ biometric data are subject to automatic processing through facial recognition technologies for purposes of border management, and will follow closely the developments on this matter.
§ The Consumer Financial Protection Bureau (CFPB) “filed a proposed settlement to resolve a lawsuit against a debt collection enterprise and its owner…[for] failing to establish or implement reasonable written policies and procedures regarding the accuracy and integrity of the information it furnished to credit reporting agencies and failed to conduct reasonable investigations of indirect consumer disputes, resulting in inaccurate information remaining on consumers’ credit reports.” The agency stated:
o The CFPB alleges that Fair Collections & Outsourcing (FCO) violated the Consumer Financial Protection Act of 2010, the Fair Credit Reporting Act (FCRA), and Regulation V (the Furnisher Rule). Specifically, the CFPB alleges that FCO:
§ Failed to implement proper policies and procedures: The CFPB alleges that FCO lacked reasonable policies and procedures regarding the accuracy and integrity of the information it furnished to consumer reporting companies, specifically with respect to handling of indirect disputes, which are disputes submitted by consumers to credit reporting agencies.
§ Failed to conduct reasonable investigations of disputes, including disputes related to identity theft: The CFPB alleges that FCO failed to conduct reasonable investigations of indirect disputes. The CFPB also alleges that FCO continued to furnish information about accounts for which consumers submitted identity theft reports directly to FCO, without FCO conducting an investigation.
o The CFPB also alleges that FCO and its owner, Sobota, violated the Fair Debt Collection Practices Act when FCO told consumers they owed certain debts when they did not have a reasonable basis for the assertion.
§ The Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation, and the Office of the Comptroller of the Currency extended the comment period for proposed guidance on managing third-party risk. The agencies explained:
o On July 19, 2021, the agencies published in the Federal Register an invitation to comment on proposed guidance on managing risks associated with third-party relationships. The proposed guidance would offer a framework based on sound risk management principles for banking organizations to consider in developing risk management practices for third-party relationships that takes into account the level of risk, complexity, and size of the banking organization and the nature of the third-party relationship. The proposed guidance would replace each agency's existing guidance on this topic and would be directed to all banking organizations supervised by the agencies. The notice solicited respondents' views on all aspects of the proposed guidance, including to what extent the guidance provides sufficient utility, relevance, comprehensiveness, and clarity for banking organizations with different risk profiles and organizational structures.
o The proposed guidance provided for a comment period ending on September 17, 2021. Since the publication of the proposal, the agencies have received comments requesting a 30-day extension of the comment period. An extension of the comment period will provide additional opportunity for interested parties to analyze the proposed guidance and prepare and submit comments. Therefore, the agencies are extending the end of the comment period for the proposal from September 17, 2021 to October 18, 2021.
§ The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency’s (CISA) National Risk Management Center (NRMC) published “a new CISA Insights titled, Risk Considerations for Managed Service Provider Customers…[that] provides a framework that government and private sector organizations (to include small and medium-sized businesses) outsourcing some level of information technology (IT) support to managed service providers (MSPs) can use to better mitigate against third-party risk.” The NRMC continued:
o The framework includes best practices and considerations from the National Institute of Standards and Technology and other authoritative sources with guidance geared to the three main organizational levels that play a role in reducing overall risk: 1) senior executives and boards of directors; 2) procurement professionals; and 3) network administrators, systems administrators, and front-line cybersecurity staff.
o The bottom line is that outsourcing IT services provides both increased benefits and risk to an organization. Key responsible individuals should take a step back to look at the security practices in place across their enterprise to answer:
§ Who is responsible for security and operations when outsourcing IT services to an MSP?
§ What are the most critical assets that we must protect and how do we protect them?
§ What should an MSP provide to an organization in advance of a contract award to demonstrate security controls in place?
§ What network and system access levels are appropriate for third-party service providers?
o It will require effort and time upfront for an organization to review their security practices and answer these types of questions. But, in the long run, it will help them spot pockets of risk from third-party vendors and improve their overall security and resilience.
§ The United States (U.S.) Department of Commerce’s National Telecommunications and Information Administration (NTIA) announced “the establishment of the Office of Internet Connectivity and Growth (OICG) and the Office of Minority Broadband Initiatives (OMBI).” The NTIA added:
o The establishment of the Office of Internet Connectivity and Growth fulfills requirements of the ACCESS BROADBAND Act, enacted into law as part of the Consolidated Appropriations Act, 2021. The OICG will be led by Douglas Kinkoph, who has served as the head of NTIA’s broadband program since 2015. The OICG will house all broadband activities at NTIA, including three active broadband grant programs: the Broadband Infrastructure Program, the Tribal Broadband Connectivity Program and the Connecting Minority Communities Pilot Program. The office will also house BroadbandUSA, which provides community outreach, support for state leaders, technical assistance, and helps coordinate federal broadband resources and programs.
o The establishment of the Office of Minority Broadband Initiatives codifies NTIA’s work on its Minority Broadband Initiative since 2018. The OMBI, established within the OICG, fulfills requirements of the Connecting Minority Communities provisions enacted into law as part of the Consolidated Appropriations Act, 2021. The office will continue NTIA’s efforts to collaborate with federal agencies; state, local and tribal governments; Historically Black Colleges and Universities, Tribal Colleges and Universities and Minority-Serving Institutions; and any interested stakeholders to promote initiatives related to expanding connectivity and digital opportunities for anchor communities.
§ The Consumer Federation of America (CFA) issued “a series of fact sheets about surveillance advertising.” The CFA asserted:
o Surveillance advertising, also known as targeted advertising or behavioral advertising, is the practice of showing individual consumers different advertisements based on inferences about their interests, demographics, and other characteristics drawn from tracking their activities over time and space. It raises many concerns for consumers, including the potential for discrimination, manipulation, unfair treatment, identity theft, and law enforcement access to personal information. It is extremely difficult for consumers to avoid the tracking and profiling that underpins the surveillance advertising ecosystem. Furthermore, surveillance advertising does not necessarily deliver the benefits that businesses are promised. These fact sheets created by Consumer Federation of America explain how surveillance advertising works, how the tracking is done, how discrimination and other adverse impacts may occur, who really benefits from surveillance advertising, and why contextual advertising is a good alternative.
o CFA made these fact sheets available:
§ Surveillance Advertising: What is it?
§ Surveillance Advertising: What About Discrimination?
§ Surveillance Advertising: Does it Benefit Small Business?
§ Surveillance Advertising: Good for Consumers or the Ad Tech Industry?
§ Surveillance Advertising: How Does the Tracking Work?
§ Surveillance Advertising: Is Contextual Advertising a Better Alternative?
§ Mina Hsiang was named the new Administrator of the United States Digital Service (USDS). The agency said of Hsiang:
o Mina brings deep technical experience to the position, having spent over a dozen years in private industry building products, teams, and organizations to deliver analytics and digital services, and having also served in government in roles spanning both policy and implementation.
o During the Obama-Biden Administration, she was a vital member of the HealthCare.gov rescue team before the founding of USDS. She became an early team member of USDS, serving as founding Executive Director of the Digital Service at the Department of Health and Human Services (HHS) and leading work from MACRA implementation to precision medicine. In the past year, she served on the Biden-Harris Transition as part of the HHS agency review team and working on the Transition’s COVID-19 Response team. She carried that work back into USDS as a Senior Advisor for Delivery. In that capacity, she successfully led the delivery team to roll out the Vaccines.gov integrated consumer experience.
o Most recently, in the private sector, Hsiang was Vice President of Technology Products and Vice President of Policy at Devoted Health, an integrated insurance and services startup for seniors. She helped start and grow Devoted into a 700+ person company. Before that role, she led new product development for the analytics division of the $60B healthcare tech & services company, Optum. As an investor and advisor on company boards, she also lent her insight and expertise to help start and build innovative and high-performing organizations.
Further Reading
Photo by Phillip Larking on Unsplash
§ “Apple cares about privacy, unless you work at Apple” By Zoe Schiffer — The Verge. Jacob Preston was sitting down with his manager during his first week at Apple when he was told, with little fanfare, that he needed to link his personal Apple ID and work account. The request struck him as odd. Like anyone who owns an Apple product, Preston’s Apple ID was intimately tied to his personal data — it connected his devices to the company’s various services, including his iCloud backups. How could he be sure his personal messages and documents wouldn’t land on his work laptop? Still, he was too giddy about his new job as a firmware engineer to care. He went ahead and linked the accounts.
§ “#AppleToo: employees organize and allege harassment and discrimination” By Dani Anguiano — The Guardian. A group of Apple workers is organizing to fight against what it says are patterns of discrimination, racism and sexism within the company and management’s failure to address them, in a rare public display of dissent within the notoriously secretive company. Last week, a group of employees launched #AppleToo, a campaign to gather and share current and past employees’ experiences of inequity, intimidation and abuse. The group hopes to mobilize workers at a time when workers across the tech industry are calling for greater accountability from their employers, and to push Apple to more effectively address such complaints.
§ “Bitcoin Will Soon be ‘Legal Tender’ in El Salvador – Here’s What that Means” By Jay L. Zagorsky — Nextgov. On Sept. 7, 2021, El Salvador will become the first country to make bitcoin legal tender. The government even went a step further in promoting the cryptocurrency’s use by giving US$30 in free bitcoins to citizens who sign up for its national digital wallet, known as “Chivo,” or “cool” in English. Foreigners who invest three bitcoins in the country – currently about $140,000 – will be granted residency. Panama is considering following El Salvador’s lead.
§ “What Does It Actually Mean When a Company Says, “We Do Not Sell Your Data”?” By Alfred Ng — The Markup. You’ve likely run into this claim from tech giants before: “We do not sell your personal data.” Companies from Facebook to Google to Twitter repeat versions of this statement in their privacy policies, public statements, and congressional testimony. And when taken very literally, the promise is true: Despite gathering masses of personal data on their users and converting that data into billions of dollars in profits, these tech giants do not directly sell their users’ information the same way data brokers directly sell data in bulk to advertisers.
§ “Juniper Breach Mystery Starts to Clear With New Details on Hackers and U.S. Role” By Jordan Robertson — Bloomberg. Days before Christmas in 2015, Juniper Networks Inc. alerted users that it had been breached. In a brief statement, the company said it had discovered “unauthorized code” in one of its network security products, allowing hackers to decipher encrypted communications and gain high-level access to customers’ computer systems. Further details were scant, but Juniper made clear the implications were serious: It urged users to download a software update “with the highest priority.” More than five years later, the breach of Juniper’s network remains an enduring mystery in computer security, an attack on America’s software supply chain that potentially exposed highly sensitive customers including telecommunications companies and U.S. military agencies to years of spying before the company issued a patch.
§ “McDonald’s McFlurry Machine Is Broken (Again). Now the FTC Is On It.” By Heather Haddon — The Wall Street Journal. Can the FTC help get you your McFlurry? As many customers of McDonald’s know all too well, the fast-food chain has struggled for years to keep its ice cream machines working. Without them, people can’t get a milkshake, soft cone or above all a McFlurry, a cup of soft ice cream with candy and cookies that is whipped about in a blender with a specially designed hollow spoon. Late-night TV comics joke about the problem. Rivals Jack in the Box Inc. and Wendy’s Co. have roasted McDonald’s for it on social media. An online tracker called McBroken monitors McDonald’s ice cream machine outages across cities.
§ “House Jan. 6 committee seeks information from tech giants regarding attack on Capitol, attempts to overturn election” By Dave Clarke and Tom Hamburger — The Washington Post. The House select committee investigating the Jan. 6 insurrection asked technology giants Friday for information that could be helpful to its probe as it prepares to tell telecommunications companies early next week to retain relevant phone and text records, including for some members of Congress. The letters sent Friday went out to Facebook, Twitter, Google and several other tech titans. The committee requested “all reviews, studies, reports, data, analyses, and communications” regarding misinformation generated by foreign and U.S. actors, “domestic violent extremists” associated with the attack, and other efforts to overturn the election results.
§ “How Data Brokers Sell Access to the Backbone of the Internet” By Joseph Cox — Vice. There's something of an open secret in the cybersecurity world: internet service providers quietly give away detailed information about which computer is communicating with another to private businesses, which then sell access to that data to a range of third parties, according to multiple sources in the threat intelligence industry. The information, known as netflow data, is a useful tool for digital investigators. They can use it to identify servers being used by hackers, or to follow data as it is stolen. But the sale of this information still makes some people nervous because they are concerned about whose hands it may fall into.
§ “Huawei can prosper despite US sanctions, says board member” By Patrick Wintour — The Guardian. Huawei has been forced to adopt the mentality of a startup partly because of US government sanctions, Catherine Chen, a board member for the Chinese telecommunications company, has said. Helping to run probably the most scrutinised company in the world, she said Huawei would survive and eventually break free of the attempted US shackles by using its technical expertise to forge a path into new markets less dependent on the US, such as energy conservation, artificial intelligence and electric cars.
§ “The Silent Partner Cleaning Up Facebook for $500 Million a Year” By Adam Satariano and Mike Isaac — The New York Times. In 2019, Julie Sweet, the newly appointed chief executive of the global consulting firm Accenture, held a meeting with top managers. She had a question: Should Accenture get out of some of the work it was doing for a leading client, Facebook? For years, tensions had mounted within Accenture over a certain task that it performed for the social network. In eight-hour shifts, thousands of its full-time employees and contractors were sorting through Facebook’s most noxious posts, including images, videos and messages about suicides, beheadings and sexual acts, trying to prevent them from spreading online.
§ “Lookalike tech policies in China, Europe and the U.S.” By Ina Fried — Axios. For all their differences, Europe, China and the U.S. are making remarkably similar moves in tech policy. The big picture: Nations and regions with wildly differing political systems and cultures have converged on a shared set of responses to the power of big tech firms: rein in the companies, avoid dependencies and subsidize critical networks and technologies.
§ “World’s Largest Chip Maker to Raise Prices, Threatening Costlier Electronics” By Yang Jie, Stephanie Yang, and Yoko Kubota — The Wall Street Journal. The world’s largest contract chip maker is raising prices by as much as 20%, according to people familiar with the matter, a move that could result in consumers paying more for electronics. Taiwan Semiconductor Manufacturing Co. plans to increase the prices of its most advanced chips by roughly 10%, while less advanced chips used by customers like auto makers will cost about 20% more, these people said. The higher prices will generally take effect late this year or next year, the people said.
Coming Events
Photo by Pineapple Supply Co. on Unsplash
§ 13 September
o The House Agriculture Committee will mark up its portion of the FY 2022 budget reconciliation package.
o The House Energy and Commerce Committee will mark up its portion of the FY 2022 budget reconciliation package. The committee made available a memorandum and the bill text:
§ Subtitle A: Budget Reconciliation Legislative Recommendations Relating to Air Pollution
§ Subtitle B: Budget Reconciliation Legislative Recommendations Relating to Hazardous Materials
§ Subtitle C: Budget Reconciliation Legislative Recommendations Relating to Drinking Water
§ Subtitle D: Budget Reconciliation Legislative Recommendations Relating to Energy
§ Subtitle E: Budget Reconciliation Legislative Recommendations Relating to Drug Pricing
§ Subtitle F: Budget Reconciliation Legislative Recommendations Relating to the Affordable Care Act
§ Subtitle G: Budget Reconciliation Legislative Recommendations Relating to Medicaid
§ Subtitle H: Budget Reconciliation Legislative Recommendations Relating to CHIP
§ Subtitle I: Budget Reconciliation Legislative Recommendations Relating to Medicare
§ Subtitle J: Budget Reconciliation Legislative Recommendations Relating to Public Health
§ Subtitle K: Budget Reconciliation Legislative Recommendations Relating to Next Generation 9-1-1
§ Subtitle L: Budget Reconciliation Legislative Recommendations Relating to Wireless Connectivity
§ Subtitle M: Budget Reconciliation Legislative Recommendations Relating to Distance Learning
§ Subtitle N: Budget Reconciliation Legislative Recommendations Relating to Manufacturing Supply Chain
§ Subtitle O: Budget Reconciliation Legislative Recommendations Relating to FTC Privacy Enforcement
§ Subtitle P: Budget Reconciliation Legislative Recommendations Relating to Department of Commerce Inspector General
o The United Kingdom’s Joint Select Committee will hold a hearing on the government’s draft “Online Safety Bill.”
§ 14 September
o The European Data Protection Board (EDPB) will hold a plenary meeting.
o The United Kingdom’s House of Commons’ Digital, Culture, Media and Sport Committee will have a hearing in its “Influencer culture” inquiry.
§ 15 September
o The Federal Trade Commission (FTC) will hold an open meeting with the following tentative agenda:
§ Proposed Policy Statement on Privacy Breaches by Health Apps and Connected Devices: The Commission will vote on whether to issue a policy statement on the importance of protecting the public from privacy breaches by health apps and other connected devices.
§ Non-HSR Reported Acquisitions by Select Technology Platforms, 2010-2019: An FTC Study: Staff will present some findings from the Commission’s inquiry into large technology platforms’ unreported acquisitions, including an analysis of the structure of deals that customarily fly under enforcers’ radar. The public release of the report is subject to commission vote.
§ Proposed Revisions to FTC Procedural Rules Concerning Petitions for Rulemaking: The Commission will vote on putting in place a process to receive public input on rulemaking petitions by external stakeholders.
§ Proposed Withdrawal of 2020 Vertical Merger Guidelines: The Commission will vote on whether to rescind the Vertical Merger Guidelines adopted in June 2020 and the Commentary on Vertical Merger Enforcement issued in December 2020.
§ 23 September
o The United Kingdom’s Joint Select Committee will hold a hearing on the government’s draft “Online Safety Bill.”
§ 28 September
o The Information Security and Privacy Advisory Board (ISPAB) will hold an open meeting, and “[t]he agenda is expected to include the following items:
§ Board Discussion on Executive Order 14028, Improving the Nation's Cybersecurity (May 12, 2021) deliverables and impacts to date,
§ Presentation by NIST, the Department of Homeland Security, and the General Services Administration on upcoming work specified in Executive Order 14028,
§ Presentation by the Office of Management and Budget on Executive Order 14028 directions and memoranda to U.S. Federal Agencies,
§ Board Discussion on recommendations and issues related to Executive Order 14028.
§ 30 September
o The Federal Communications Commission (FCC) will hold an open meeting with this tentative agenda:
§ Promoting More Resilient Networks. The Commission will consider a Notice of Proposed Rulemaking to examine the Wireless Network Resiliency Cooperative Framework, the FCC’s network outage reporting rules, and strategies to address the effect of power outages on communications networks. (PS Docket Nos. 21-346, 15-80; ET Docket No. 04-35)
§ Reassessing 4.9 GHz Band for Public Safety. The Commission will consider an Order on Reconsideration that would vacate the 2020 Sixth Report and Order, which adopted a state-by-state leasing framework for the 4.9 GHz (4940-4990 MHz) band. The Commission also will consider an Eighth Further Notice of Proposed Rulemaking that would seek comment on a nationwide framework for the 4.9 GHz band, ways to foster greater public safety use, and ways to facilitate compatible non-public safety access to the band. (WP Docket No. 07-100)
§ Authorizing 6 GHz Band Automated Frequency Coordination Systems. The Commission will consider a Public Notice beginning the process for authorizing Automated Frequency Coordination Systems to govern the operation of standard-power devices in the 6 GHz band (5.925-7.125 GHz). (ET Docket No. 21-352)
§ Spectrum Requirements for the Internet of Things. The Commission will consider a Notice of Inquiry seeking comment on current and future spectrum needs to enable better connectivity relating to the Internet of Things (IoT). (ET Docket No. 21-353)
§ Shielding 911 Call Centers from Robocalls. The Commission will consider a Further Notice of Proposed Rulemaking to update the Commission's rules regarding the implementation of the Public Safety Answering Point (PSAP) Do-Not-Call registry in order to protect PSAPs from unwanted robocalls. (CG Docket No. 12-129; PS Docket No. 21-343)
§ Stopping Illegal Robocalls From Entering American Phone Networks. The Commission will consider a Further Notice of Proposed Rulemaking that proposes to impose obligations on gateway providers to help stop illegal robocalls originating abroad from reaching U.S. consumers and businesses. (CG Docket No. 17-59; WC Docket No. 17-97)
§ Supporting Broadband for Tribal Libraries Through E-Rate. The Commission will consider a Notice of Proposed Rulemaking that proposes to update sections 54.500 and 54.501(b)(1) of the Commission’s rules to amend the definition of library and to clarify Tribal libraries are eligible for support through the E-Rate Program. (CC Docket No. 02-6)
§ Strengthening Security Review of Companies with Foreign Ownership. The Commission will consider a Second Report and Order that would adopt Standard Questions – a baseline set of national security and law enforcement questions – that certain applicants with reportable foreign ownership must provide to the Executive Branch prior to or at the same time they file their applications with the Commission, thus expediting the Executive Branch’s review for national security and law enforcement concerns. (IB Docket No. 16-155)