

Committee Renders Its Opinion On UK Online Safety Bill
U.S. government warns about Russian hacking; Hong Kong DPA explains PRC's PIPL; Canada charts ambitious course on tech
First, a bit of news. The Wavelength will transition to a paid product, but there will still be a free version available. The scope and shape of this change are still in the making but should be realized by January 2022.
The United Kingdom’s (UK) joint parliamentary committee has completed its work on the “Online Safety Bill” and is recommending extensive revisions that would remake how the UK regulates online content. The draft Online Safety Bill was unveiled last spring, and Parliament has since been consulting with stakeholders and interested parties (see here for more detail and analysis of the bill as published in May 2021). And while the committee, which consists of Members of the House of Commons and the House of Lords, went over the government’s bill thoroughly, there is no guarantee that Prime Minister Boris Johnson and his ruling Conservative Party will heed its recommendations.
Last year, the joint committee convened to examine and recommend changes to the Online Safety Bill issued its final report and recommendations. The committee has also been posting transcripts of its hearings and responses from stakeholders.
The committee explained the history of the British government’s efforts to address online harms:
The draft Online Safety Bill is the result of an extensive public policy and parliamentary process, going back nearly half a decade. The draft Bill was published by the Government on 12 May 2021. It followed the Online Harms White Paper, published in April 2019 and the Government’s interim (February 2020) and full (December 2020) responses to the consultation on it. The White Paper itself was the result of a commitment made in the Internet Safety Strategy Green Paper, published in October 2017.
The chair of the committee, Damian Collins MP, sounded an anti-Big Tech note in the committee’s press release:
The Committee were unanimous in their conclusion that we need to call time on the Wild West online. What’s illegal offline should be regulated online. For too long, big tech has gotten away with being the land of the lawless. A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life.
Collins declared that “[t]he era of self-regulation for big tech has come to an end.” He continued, “[t]he companies are clearly responsible for services they have designed and profit from, and need to be held to account for the decisions they make.” This is not the type of rhetoric one expects from the UK’s center-right party, but so widespread is the disenchantment with the big tech companies, most of which are United States (U.S.) multinationals, that even members of a political party more inclined to let markets discipline private parties have run out of patience with, and belief in, Silicon Valley.
Turning to the report, the committee is urging the government to strengthen the enforcement powers of the Office of Communications (Ofcom) over online platforms, restructure the duties these companies will need to fulfill, expand the acts that will be illegal online, such as cyberflashing (which will require the passage of companion legislation), and establish a right for people to sue platforms for violations. The recommendations also focus on how platforms are designed to promote some content over other content; accordingly, platforms’ algorithms would be subject to duties to foresee and address the risks of extreme content and of the delivery systems that promote such material.
The committee conceded that many online platforms have programs in place and are making efforts to monitor and moderate content:
All the service providers we heard from were taking measures to reduce activity that creates a risk of harm and illegal activity on their platforms. These measures were wide-ranging and included explicit content filters on search results; manual curation of content on public-facing areas of the service; user voting which affects the visibility of content; and, following the introduction of the Age Appropriate Design Code, default privacy settings for children who use their platforms.
Like many other legislators the world over, the committee concluded these efforts are not enough:
Nevertheless, we heard that illegal and harmful activity remain prevalent online. Throughout our inquiry, we have heard about the failures of self-regulation by online service providers. Witnesses have told us that the current system of self-regulation is akin to allowing service providers to mark their own homework, and that this has made the online world more dangerous. This has real-world implications—during the short timescale of our inquiry, illegal and harmful activity online has been linked to the suicide of 15 year old Frankie Thomas and the kidnap, rape, and murder of Sarah Everard. To give just a few examples of events that occurred in the months and years immediately preceding our inquiry:
§ The Internet Watch Foundation (IWF) Annual Report 2020 reported record increases in self-generated child sexual abuse material.
§ 5Rights’ “Pathways” research showed how the design and operation of major social media services led to children being exposed to extreme pro-suicide, eating disorder and pornographic content.
§ 2000 abusive tweets were directed at four Black players following the England national football team’s loss at the Euro 2020 final.
§ A record number of antisemitic incidents were reported in the UK in May-June 2021, such that the Community Security Trust termed this period “the month of hate”. Many of these incidents took place online.
§ Facebook was implicated in the mass murder of Rohingya Muslims in Myanmar.
§ The House of Commons Department for Culture, Media, and Sport (DCMS) Committee inquiry into Disinformation and “fake news” and the Intelligence and Security Committee of UK Parliament both concluded that Russian agents had used social media to attempt to influence UK elections. The United States Senate Select Committee on Intelligence found similar attempts to influence the 2016 US Election.
§ Twitter was implicated in the 6 January 2021 riot at the US Capitol, after which Twitter permanently suspended former US President Donald Trump from the platform for violating policies around inciting violence.
The committee noted that many online platforms support the British government’s effort to regulate the online world:
Online service providers are broadly supportive of the Government introducing regulation that aims to enhance online safety. Facebook themselves have said that they feel they are currently making societal decisions that are better made by Government and regulators.
However, many of the platforms and other stakeholders took issue with the bill as introduced, finding it overly complex, unclear, a threat to free expression, and lacking an adequate grant of power to Ofcom.
The committee ultimately agreed with the original intent of the bill and, broadly, with the Johnson government’s goals. The committee stated:
We came away from those discussions certain that people’s rights must be protected against an ever-growing onslaught of online harms. This can be achieved through the shaping of a Bill which is practical, implementable, and which will empower the Regulator to take decisive action against platforms which neglect their safety duties. We wholeheartedly support the ambition of the Online Safety Bill—to make the United Kingdom the safest place in the world to be online—and we trust that our recommendations will bring the final Act closer to achieving that aim.
The report includes an infographic explaining how the revised bill would work.
The committee added:
We recommend that the Bill be restructured to contain a clear statement of its core safety objectives—as recommended in paragraph 52. Everything flows from these: the requirement for Ofcom to meet those objectives, its power to produce mandatory codes of practice and minimum quality standards for risk assessments in order to do so, and the requirements on service providers to address and mitigate reasonably foreseeable risks, follow those codes of practice and meet those minimum standards. Together, these measures amount to a robust framework of enforceable measures that can leave no doubt that the intentions of the Bill will be secured.
We believe there is a need to clarify that providers are required to comply with all mandatory Codes of Practice as well as the requirement to include reasonably foreseeable risks in their risk assessments. Combined with the requirements for system design we discuss in the next chapter, these measures will ensure that regulated services continue to comply with the overall objectives of the Bill—and that the Regulator is afforded maximum flexibility to respond to a rapidly changing online world.
The committee warned the government that picking and choosing recommendations would result in the goals of the Online Safety Act not being met because the delicate balance that was achieved in crafting the recommendations would be disrupted:
This Report must be understood as a whole document, comprising a cohesive set of recommendations working in tandem to produce a new vision of the Online Safety Act. The Government should not seek to isolate single recommendations without understanding how they fit into the wider manifesto laid out by the Committee. Taken as a whole, our recommendations will ensure that the Bill holds platforms to account for the risks of harm which arise on them and will achieve the Government’s ultimate aim of making the United Kingdom the safest place in the world to be online.
The committee essentially calls on the UK government to rewrite the bill to clarify and strengthen it:
We recommend the Bill is restructured. It should set out its core objectives clearly at the beginning. This will ensure clarity to users and regulators about what the Bill is trying to achieve and inform the detailed duties set out later in the legislation. These objectives should be that Ofcom should aim to improve online safety for UK citizens by ensuring that service providers:
a) comply with UK law and do not endanger public health or national security;
b) provide a higher level of protection for children than for adults;
c) identify and mitigate the risk of reasonably foreseeable harm arising from the operation and design of their platforms;
d) recognise and respond to the disproportionate level of harms experienced by people on the basis of protected characteristics;
e) apply the overarching principle that systems should be safe by design whilst complying with the Bill;
f) safeguard freedom of expression and privacy; and
g) operate with transparency and accountability in respect of online safety.
Without touching on every element of the committee’s report and recommendations, the committee’s call for stronger Ofcom enforcement powers bears mention:
Robust regulatory oversight is critical to ensuring the ambition of the Online Safety Bill is fully met. Tech companies must not be allowed to snub the regulator, to act with impunity, to continue to rely on self-regulation, or to abdicate responsibility for the harms which occur through the operation of their services or because of their governance structures. In turn, Ofcom must be able to move at pace to hold providers to account authoritatively to issue substantial fines, and assist the appropriate authorities with criminal prosecutions. The Bill extends substantial powers to the Regulator, but there are improvements to be made if the Government is to ensure the Bill is enforced effectively.
The committee is also calling for a revamp of the redress measures users could avail themselves of after a platform’s internal adjudication process has run its course:
Our proposed external redress process would not replace service providers’ internal processes or run concurrently to them, nor would it address individual complaints about individual pieces of content or interactions. Rather, for a victim of sustained and significant online harm, someone who has been banned from a service or who had their posts repeatedly and systematically removed, this new redress mechanism would give them an additional body to appeal those decisions after they had come to the end of a service provider’s internal process.
In order for an external redress process to work, clear direction is needed in the Bill about Ofcom’s responsibility to set quality standards for service provider’s internal complaints procedures, and in relation to complaints about failures to meet those standards. We hope that the Government will consider our recommendations in this area, and that by improving the quality of service providers’ internal complaints procedures, any system of external redress will be needed only rarely and for the most serious cases.
The committee came down foursquare on the side of allowing people to sue:
We believe that this Bill is an opportunity to reset the relationship between service providers and users. While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.
The committee heeded recommendations from witnesses and others that the bill should regulate more than just “content”:
We recommend that references to harmful “content” in the Bill should be amended to “regulated content and activity”. This would better reflect the range of online risks people face and cover new forms of interaction that may emerge as technology advances. It also better reflects the fact that online safety is not just about moderating content. It is also about the design of platforms and the ways people interact with content and features on services and with one another online.
Finally, the committee’s recommendations place the onus on platforms to identify and minimize the risks users face:
We recommend that the Bill includes a specific responsibility on service providers to have in place systems and processes to identify reasonably foreseeable risks of harm arising from the design of their platforms and take proportionate steps to mitigate those risks of harm. The Bill should set out a non-exhaustive list of design features and risks associated with them to provide clarity to service providers and the regulator which could be amended by Parliament in response to the development of new technologies. Ofcom should be required to produce a mandatory Safety by Design Code of Practice, setting out the steps providers will need to take to properly consider and mitigate these risks. We envisage that the risks, features and mitigations might include (but not be limited to):
a) Risks created by algorithms to create “rabbit holes”, with possible mitigations including transparent information about the nature of recommendation algorithms and user control over the priorities they set, measures to introduce diversity of content and approach into recommendations and to allow people to deactivate recommendations from users they have not chosen to engage with;
b) Risks created by auto playing content, mitigated through limits on auto-play and auto-recommendation;
c) Risks created by frictionless cross-platform activity, with mitigations including warnings before following a link to another platform and ensuring consistent minimum standards for age assurance;
d) Risks created through data collection and the microtargeting of adverts, mitigated through minimum requirements for transparency around the placement and content of such adverts;
e) Risks created by virality and the frictionless sharing of content at scale, mitigated by measures to create friction, slow down sharing whilst viral content is moderated, require active moderation in groups over a certain size, limit the number of times content can be shared on a “one click” basis, especially on encrypted platforms, have in place special arrangements during periods of heightened risk (such as elections, major sporting events or terrorist attacks); and
f) Risks created by default settings on geolocation, photo identification/sharing and other functionality leading to victims of domestic violence or violence against women and girls (VAWG) being locatable by their abusers, mitigated through default strong privacy settings and accessible guidance to victims of abuse on how to secure their devices and online services.
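To make the committee’s “friction” recommendation in item (e) a bit more concrete, the following is a minimal, purely hypothetical sketch in Python. It is not drawn from the bill or the report; the threshold, class, and field names are invented for illustration. It shows how a platform might cap frictionless “one click” re-shares, flag a post for moderation once it crosses an assumed limit, and require an explicit confirmation step before further sharing.

# Purely illustrative sketch: a toy "share gate" approximating the kind of
# one-click sharing limit the committee describes. The threshold and names
# below are assumptions, not anything specified in the draft bill or report.
from dataclasses import dataclass

MAX_ONE_CLICK_SHARES = 100  # assumed threshold; the report sets no figure

@dataclass
class Post:
    post_id: str
    one_click_shares: int = 0
    flagged_for_review: bool = False

class ShareGate:
    """Adds friction once a post spreads beyond the one-click share limit."""
    def request_share(self, post: Post, confirmed: bool = False) -> bool:
        if post.one_click_shares < MAX_ONE_CLICK_SHARES:
            # Below the threshold, sharing stays frictionless.
            post.one_click_shares += 1
            return True
        # Above the threshold, queue the post for human moderation and
        # require an explicit confirmation step before the share proceeds.
        post.flagged_for_review = True
        if confirmed:
            post.one_click_shares += 1
            return True
        return False

# Usage: friction kicks in only once the post has spread widely.
gate = ShareGate()
post = Post("example-post")
for _ in range(MAX_ONE_CLICK_SHARES):
    gate.request_share(post)
assert gate.request_share(post) is False              # confirmation now required
assert gate.request_share(post, confirmed=True) is True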
Other Developments
§ The United States (U.S.) Cybersecurity and Infrastructure Security Agency (CISA), Federal Bureau of Investigation (FBI), and National Security Agency (NSA) issued a joint Cybersecurity Advisory (CSA) titled “Understanding and Mitigating Russian State-Sponsored Cyber Threats to U.S. Critical Infrastructure.” The agencies stated:
o This CSA provides an overview of Russian state-sponsored cyber operations; commonly observed tactics, techniques, and procedures (TTPs); detection actions; incident response guidance; and mitigations. This overview is intended to help the cybersecurity community reduce the risk presented by these threats.
o CISA, the FBI, and NSA encourage the cybersecurity community—especially critical infrastructure network defenders—to adopt a heightened state of awareness and to conduct proactive threat hunting, as outlined in the Detection section. Additionally, CISA, the FBI, and NSA strongly urge network defenders to implement the recommendations listed below and detailed in the Mitigations section. These mitigations will help organizations improve their functional resilience by reducing the risk of compromise or severe business degradation.
o Be prepared. Confirm reporting processes and minimize personnel gaps in IT/OT security coverage. Create, maintain, and exercise a cyber incident response plan, resilience plan, and continuity of operations plan so that critical functions and operations can be kept running if technology systems are disrupted or need to be taken offline.
o Enhance your organization’s cyber posture. Follow best practices for identity and access management, protective controls and architecture, and vulnerability and configuration management.
o Increase organizational vigilance. Stay current on reporting on this threat. Subscribe to CISA’s mailing list and feeds to receive notifications when CISA releases information about a security topic or threat.
§ The United States (U.S.) Consumer Financial Protection Bureau (CFPB) published its annual report on credit and consumer reporting complaints, which found that “[i]n 2021, Equifax, Experian, and TransUnion together reported relief in response to less than 2% of covered complaints, down from nearly 25% of covered complaints in 2019.”
§ Hong Kong’s Privacy Commissioner for Personal Data, Ada Chung Lai-ling, published an article on "Cross-border Transfer of Data under the Personal Information Protection Law of the Mainland," addressing the People’s Republic of China’s (PRC) recently passed data protection law, the Personal Information Protection Law (PIPL). She explained that “[a]s the PIPL imposes requirements on the transfer of personal information from the Mainland to other jurisdictions, this article attempts to highlight the rules and the more salient requirements for businesses in Hong Kong.”
§ The United States (U.S.) Department of Health and Human Services (HHS) issued guidance on “the requirements of the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule for covered health care providers in relation to” “extreme risk protection order” (ERPO) laws. HHS stated:
o Does the Privacy Rule permit a covered health care provider to disclose protected health information (PHI) about an individual, without the individual's authorization, to support an application for an ERPO against the individual?
o In limited circumstances, yes. The Privacy Rule permits a covered health care provider to disclose PHI to support an ERPO application by the provider or another person in certain circumstances, including the following:
o When the disclosure is required by law. A covered health care provider may disclose PHI when the disclosure is required by law (e.g., statute, regulation, court order, subpoena) and the disclosure complies with and is limited to the relevant requirements of such law.
o When the disclosure is in response to an order of a court or administrative tribunal, subpoena, discovery request, or other lawful process in the course of a judicial or administrative proceeding. The Privacy Rule places conditions on disclosures for these purposes, including when such disclosures are required by other law.
§ Canada’s Prime Minister Justin Trudeau issued a mandate letter to his Minister of Innovation, Science and Industry tasking him with achieving a number of technology policy changes:
o Establish a digital policy task force to integrate efforts across government and position Canada as a leader in the digital economy and in shaping global governance of emerging technologies.
o Introduce legislation to advance the Digital Charter, strengthen privacy protections for consumers and provide a clear set of rules that ensure fair competition in the online marketplace.
o Accelerate broadband delivery by implementing a “use it or lose it” approach to require those that have purchased rights to build broadband to meet broadband access milestones or risk losing their spectrum rights.
o Working with the Minister of Justice and Attorney General of Canada, Minister of National Defence and Minister of Public Safety, and with the support of the Minister of Foreign Affairs, continue to advance the National Cyber Security Action Plan, ensuring Canada is well positioned to adapt to and combat cyber risks, and ensure the security and integrity of Canada’s critical systems.
o Work with the Minister of Public Safety, the Minister of Foreign Affairs, and the Minister of National Defence, and in collaboration with implicated ministers, to develop and implement a renewed National Cyber Security Strategy, which will articulate Canada’s long-term strategy to protect our national security and economy, deter cyber threat actors, and promote norms-based international behavior in cyberspace.
o Advance the Pan-Canadian Artificial Intelligence Strategy and additional measures, such as advancing standards and continuing to lead international efforts around coordination, to support artificial intelligence innovations and research in Canada.
o Launch a National Quantum Strategy to amplify Canada’s strength in quantum research and grow our quantum-ready technologies, companies and talent.
§ The United States (U.S.) Federal Trade Commission (FTC) settled claims that a company violated the Gramm-Leach-Bliley Act’s Safeguards Rule. The FTC explained in its press statement:
o In a complaint first announced in December 2020, the FTC alleged that Texas-based Ascension Data & Analytics, LLC violated the Gramm-Leach-Bliley Act’s Safeguards Rule, which requires financial institutions to develop, implement, and maintain a comprehensive information security program and ensure third-party vendors are capable of implementing and maintaining appropriate safeguards for customer information. The FTC alleged that Ascension failed to do this.
o The FTC alleged that a vendor Ascension hired to perform text recognition scanning on mortgage documents stored the contents of the documents—which included names, dates of birth, Social Security numbers and other personal information—on a cloud-based server in plain text, without any protections to block unauthorized access, such as requiring a password. As a result, the server with the mortgage information was accessed dozens of times.
o After receiving one comment on the settlement, which also was announced in December 2020, the Commission voted 2-1-1 to finalize the settlement and to send a response to the commenter.
o Chair Lina M. Khan did not participate. Commissioner Rebecca Kelly Slaughter voted no and issued a dissenting statement.
§ Ireland’s Data Protection Commission (DPC) issued its “Regulatory Strategy for 2022-2027,” and the agency laid out its vision for the next five years:
o The Data Protection Commission is committed to being an independent, internationally influential and publicly dependable regulator of EU data protection law; regulating with clear purpose, trusted by the public, respected by our peers and effective in our regulation. The DPC will play a leadership role in bringing legal clarity to the early years of the General Data Protection Regulation. The DPC will apply a risk-based regulatory approach to its work, so that its resources are always prioritised on the basis of delivering the greatest benefit to the maximum number of people. The DPC will also be a rewarding and challenging place to work, with a focus on retaining, attracting and allocating the most appropriate people to deliver on its mandate, recognising the value and capacities of its staff as its most critical asset.
§ The United States (U.S.) Financial Crimes Enforcement Network (FinCEN) issued “a request for information (RFI) seeking comments on ways to streamline, modernize, and update the anti-money laundering and countering the financing of terrorism (AML/CFT) regime of the United States.” FinCEN stated:
o FinCEN is particularly interested in comments on ways to modernize risk-based AML/CFT regulations and guidance, issued pursuant to the Bank Secrecy Act (BSA) so that they, on a continuing basis, protect U.S. national security in a cost-effective and efficient manner. Today’s RFI also supports FinCEN’s efforts to conduct a formal review of BSA regulations and related guidance, which is required by Section 6216 of the Anti-Money Laundering Act of 2020. FinCEN will report to Congress the findings of the review, including administrative and legislative recommendations.
§ The United Kingdom’s Information Commissioner’s Office (ICO) opened a consultation “seeking feedback on their:
o Regulatory Action Policy;
o Statutory Guidance on our Regulatory Action; and
o Statutory Guidance on our PECR Powers.”
o The agency added “[t]aken together, these three documents set out how the ICO aims to carry out its mission to uphold information rights for the UK public in the digital age.”
§ The United States (U.S.) Government Accountability Office (GAO) published a report titled “Information Technology: Digital Service Programs Need to Consistently Coordinate on Developing Guidance for Agencies.” The GAO concluded:
o Given the magnitude of the federal government’s investment in IT, it is important that federal agencies responsible for developing and acquiring systems avoid overlapping or duplicating IT work. Agencies have made progress addressing recommendations we have made to address duplicative IT and improve management roles and responsibilities. However, as of October 2021, 102 of 392 recommendations had not been implemented. Until agencies fully implement these recommendations, they will not be positioned to fully oversee and effectively manage their IT acquisitions. Moreover, they will continue to risk wasting federal funds and other resources on duplicative IT investments. USDS and 18F help agencies develop and acquire IT, and they have successfully coordinated to avoid overlapping or duplicating efforts on agency projects. In addition, they have coordinated recruiting efforts to address their need to hire IT experts. However, they do not consistently coordinate their plans to develop and issue IT acquisition and development guidance for agencies, which risks overlapping or duplicating work or presenting conflicting information. Further, by not coordinating in a more strategic manner on their guidance development efforts, USDS and 18F diminish their opportunities to leverage each other’s resources and achieve greater outcomes.
Further Reading
§ “Amazon workers in Alabama will vote again on unionization in February” By Rachel Lerman — The Washington Post. Amazon warehouse workers in Bessemer, Ala., will soon begin voting on whether to form a union, a year after the large unionization effort failed amid controversy over the e-commerce giant’s tactics. Ballots will be mailed Feb. 4 and votes will begin to be counted on March 28, the National Labor Relations Board announced Tuesday. Amazon workers previously overwhelmingly rejected a unionization effort at the warehouse last year, but the NLRB called for a revote after finding that Amazon improperly interfered in that election. An NLRB official specifically cited Amazon’s placing of an unmarked U.S. Postal Service mailbox in front of the warehouse just after voting started, writing that Amazon “essentially highjacked the process and gave a strong impression that it controlled the process.”
§ “As Beijing Takes Control, Chinese Tech Companies Lose Jobs and Hope” By Li Yuan — The New York Times. Like many ambitious young Chinese, Zhao Junfeng studied hard in college and graduate school so he could land a coveted job as a programmer at a big Chinese internet company. After finishing graduate school in 2019, he joined an e-commerce company in the eastern Chinese city of Nanjing, got married and adopted a cat named Mango. In November of 2021, he moved to Shanghai to join one of China’s biggest video platforms, iQiyi. He was on track to achieve a much-desired middle-class life, documenting his rise on his social media account. Then barely a month into his new job, he was let go when iQiyi laid off more than 20 percent of its staff.
§ “A Missouri Reporter Is Getting Blamed For the Security Flaw He Exposed” By Jack Gillum — Bloomberg. St. Louis Post-Dispatch reporter Josh Renaud discovered a major security flaw on a Missouri state website last October. With a simple right-click and some elementary decoding, he found that anyone with a web browser could view thousands of educators’ Social Security numbers. Soon after, the governor of Missouri vowed to prosecute him for “hacking.”
§ “Facebook contractors threaten to stop work over missing paychecks” By Russell Brandom — The Verge. Facebook moderators at an Accenture site in Austin are facing a payroll disaster that has left many without their holiday paychecks. Workers at the site handle moderation, customer service, and other tasks for Facebook and WhatsApp — and a work stoppage has already been threatened if the situation is not resolved.
§ “Federal Judge Steps Aside From High-Profile Amazon Case, Citing Financial Conflict” By Joe Palazzolo — The Wall Street Journal. A federal judge removed himself from a nearly two-year-old Amazon.com Inc. case, citing a financial conflict, after a Wall Street Journal report about his family’s Amazon stockholdings. U.S. District Judge Liam O’Grady had ruled in Amazon’s favor during the 20 months he oversaw the civil case, in which the online retailer accuses two former employees of taking kickbacks from a real-estate developer and violating Amazon’s conflict-of-interest policies.
§ “Economists Pin More Blame on Tech for Rising Inequality” By Steve Lohr — The New York Times. Daron Acemoglu, an influential economist at the Massachusetts Institute of Technology, has been making the case against what he describes as “excessive automation.” The economywide payoff of investing in machines and software has been stubbornly elusive. But he says the rising inequality resulting from those investments, and from the public policy that encourages them, is crystal clear.
§ “Hackers Can Cut the Lights With Rogue Code, Researchers Show” By Jack Gillum — Bloomberg. As Ang Cui added more juice to the power grid, overhead electric lines began to glow bright orange. Then, within seconds, the power lines evaporated in a flash of smoke, leaving an entire section of Manhattan in the dark. No actual buildings or people lost power because, luckily, this was just a simulation — a tabletop diorama of Manhattan complete with tiny copper power lines and the Statue of Liberty relocated to a pared-down Central Park. Cui’s colleagues at Red Balloon Security Inc. had unleashed a few lines of malicious code that knocked out a computer designed to protect electrical lines.
§ “Covid Test Misinformation Spikes Along With Spread of Omicron” By Davey Alba — The New York Times. On Dec. 29, The Gateway Pundit, a far-right website that often spreads conspiracy theories, published an article falsely implying that the Centers for Disease Control and Prevention had withdrawn authorization of all P.C.R. tests for detecting Covid-19. The article collected 22,000 likes, comments and shares on Facebook and Twitter.
§ “CISA director: Log4Shell has not resulted in ‘significant’ government intrusions yet” By Adam Janofsky — The Record. Top officials at the US Cybersecurity and Infrastructure Security Agency on Monday said the Log4Shell vulnerability has mostly resulted in cryptomining and other minor incidents at federal agencies, but warned that threat actors may soon start actively exploiting the vulnerability to disrupt critical infrastructure and other assets.
§ “Huawei Pours Money Into China’s Chipmaking Ambitions” By Dan Strumpf — The Wall Street Journal. Blocked by the U.S. from buying many of the chips it needs, Huawei Technologies Co. is stepping up investments in companies that are racing to build China’s semiconductor supply chain. The Chinese technology giant is investing in the companies through a fund that it launched in 2019, around the time Washington began putting export bans on Huawei. The fund, Hubble Technology Investment Co., has backed 56 companies since its founding, according to data compiled by PitchBook, a capital markets research firm.
Coming Events
§ 11 January
o The United States (U.S.) House Oversight and Reform Committee will hold a hearing titled “Cybersecurity for the New Frontier: Reforming the Federal Information Security Management Act” as explained in this background memorandum prepared by staff.
o The United Kingdom’s House of Commons’ Foreign Affairs Committee will hold a “Formal meeting (oral evidence session): Tech and the future of UK foreign policy” as part of its inquiry.
§ 12 January
o The United States (U.S.) Senate Indian Affairs Committee will hold a roundtable discussion titled “Closing the Digital Divide in Native Communities through Infrastructure Investment.”
o The United States (U.S.) House Agriculture Committee will hold a hearing titled “Implications of Electric Vehicle Investments for Agriculture and Rural America.”
o The United Kingdom’s House of Commons’ Science and Technology Committee will hold a “Formal meeting (oral evidence session): UK space strategy and UK satellite infrastructure” as part of its inquiry.
o The United Kingdom’s House of Lords’ Justice and Home Affairs Committee will hold a “Formal meeting (oral evidence session): New technologies and the application of the law” as part of its inquiry.
§ 17-28 January
o The United Nations (UN) Ad hoc committee established by General Assembly resolution 74/247 will meet. The UN explained:
§ Through its resolution 74/247, the General Assembly decided to establish an open-ended ad hoc intergovernmental committee of experts, representative of all regions, to elaborate a comprehensive international convention on countering the use of information and communications technologies for criminal purposes, taking into full consideration existing international instruments and efforts at the national, regional and international levels on combating the use of information and communications technologies for criminal purposes, in particular the work and outcomes of the open-ended intergovernmental Expert Group to Conduct a Comprehensive Study on Cybercrime.
§ 18 January
o The European Data Protection Board will hold a plenary meeting.
§ 27 January
o The United States (U.S.) Federal Communications Commission (FCC) will hold an open meeting with this agenda:
§ Empowering Broadband Consumers Through Transparency. The Commission will consider a Notice of Proposed Rulemaking that would propose to require that broadband internet access service providers display, at the point of sale, labels to disclose to consumers certain information about their prices, introductory rates, data allowances, broadband speeds, and management practices, among other things. (CG Docket No. 22-2)
§ Connecting Tribal Libraries. The Commission will consider a Report and Order that would amend the definition of library in the Commission’s rules to clarify that Tribal libraries are eligible for support through the E-Rate Program. (CC Docket No. 02-6)
§ Updating Outmoded Political Programming and Record-Keeping Rules. The Commission will consider a Report and Order to update outmoded political programming rules. (MB Docket No. 21-293)
§ Facilitating Better Use of ‘White Space’ Spectrum. The Commission will consider a Second Order on Reconsideration and Order resolving pending issues associated with white space devices and the white spaces databases, enabling unlicensed white space devices to continue operating efficiently while protecting other spectrum users. (ET Docket Nos. 04-186, 14-165)
§ Updating Equipment Authorization Rules. The Commission will consider a Notice of Proposed Rulemaking that would propose to update existing equipment authorization rules to reflect more recent versions of the technical standards that are incorporated by reference and incorporate by reference a new technical standard so that our equipment authorization system can continue to keep pace with technology developments. (ET Docket Nos. 21-363, 19-48)
§ Restricted Adjudicatory Matter. The Commission will consider a restricted adjudicatory matter.
§ National Security Matter. The Commission will consider a national security matter.
§ Enforcement Bureau Action. The Commission will consider an enforcement action.
§ 22 February
o The European Data Protection Board will hold a plenary meeting.
§ 16-17 June
o The European Data Protection Supervisor will hold a conference titled “The future of data protection: effective enforcement in the digital world.”