chevron-down Created with Sketch Beta.
December 14, 2023 Feature

Proceedings of the 16th Annual ABA Science & Technology Law Section Information Security Committee “Pre-RSA” Conference

Jennifer Adney-Schell

Since 2007, the Information Security Committee (InfoSec or ISC) of the ABA Science & Technology Law Section has held a conference in San Francisco bringing together its members (both attorney/law student and technical SMEs), government officials, industry leaders, and others to attend the annual RSA Conference, one of the largest information technology/cybersecurity events in the world.

As in years past, the ABA InfoSec has benefitted from the presence at RSA in San Francisco of a significant fraction of the global cybersecurity law and policy community, providing the opportunity to develop a rich and varied agenda, with panels and keynotes devoted to a range of current and emerging IT-related topics of interest to the legal community and those who rely on us. InfoSec Co-chair Steve Millendorf’s firm Foley Lardner LLC again graciously served as host at their offices at 555 California Street for the InfoSec’s conference April 22–23, 2023, preceding the RSA Conference.

Attendees received keynote addresses and participated in panel dialogues that included the new U.S. National Cybersecurity Strategy, the challenges posed by the explosive evolution of Generative AI both in the legal community and in global critical infrastructures, and the continuously morphing international cybersecurity environment, including challenges posed by Russia’s use of cyber tools in its war against Ukraine.

After the InfoSec sessions, including our annual dinner at the Maltese Falcon restaurant, John’s Grill, committee members, past SciTech Section leaders, and other attendees delivered several Law Track panels at the RSA Conference, including the annual “Hot Topics in Cyber Law” panel moderated by InfoSec Co-chair Michael Aisenberg, and presented by past Section Chairs Cindy Cwick, Ruth Bro-Hill, and Lucy Thomson. Co-chair Hoyt Kesterson again organized a Mock Trial, this year addressing the potential liability associated with storing fabricated information to deceive attackers. Vice Chair Steve Teppler played the part of one of the attorneys. A number of InfoSec members had roles in other RSAC panel and Birds-of-a-Feather presentations, many of which are available online at RSAC23’s website by selecting the Law Topic.

Here are the highlight presentations and panels from this year’s Information Security Committee sessions:

National Cybersecurity Strategy

Keynote Speaker: Jim Halpert, General Counsel, Office of the National Cyber Director; Co-chair, Partner, DLA Piper—U.S. Data Protection, Privacy & Security

[The Office of the National Cyber Director was established by Congress in 2021 as a component of the Executive Office of the President in the White House, designed to advise the president and vice president of the United States on cybersecurity policy and strategies. It served as a hub organization in the development of the administration’s new National Cybersecurity Strategy.1]

Mr. Halpert began his keynote stating the need for cybersecurity policies to guide the implementation of safe and resilient cybersecurity strategies. But the administration recognizes that the users of such policies are very diverse, ranging from small companies to large state actors, from developers to industry sector regulators, and from businesses that use the technology to the businesses that produce the technology. The success of a realistic system of interconnectivity where every American can thrive and results are sustainable relies on the diversity of perspectives. He commended the value of advancing collaborative opportunities to gain multiple viewpoints across the broad range of contributors.

He described decades of learning and sharing the vast evolution of ideas shared across the dynamic disciplines including information technology, cybersecurity, agency law, contract law, and technology law. All these contribute to both focused conversations in the room, as well as educational opportunities and innovative advancements fundamental to the collective preservation of knowledge and the richly diverse understanding that is necessary to preserve the integrity and safety of every American—all goals promoted by the Biden administration with the NSC.

Halpert insightfully encouraged the concept of “Regulatory Harmony,” which:

  1. Adopts risk-focused baseline regulations that are advanced across critical infrastructures;
  2. Utilizes contractors, consultants, and lawyers best situated to translate industry-specific standards and requirements; and
  3. Implements regulatory reciprocity rationalizing enforcement and oversight compliance that aligns with those structured baseline requirements.

He asked attendees: Who should bear the burden of software security liability? Which government entity holds the authority to enforce compliance regulations over all software companies without unfairly disadvantaging smaller business entities by only favoring compliance solutions that favor cost structures generally only attainable by larger entities? Are there equitable solutions the government can promote to equitably support secure software development practices? These are all areas where answers initially appear intuitive and seem logically clear; but without the ability, understanding, or authority to enforce equitable solutions, the burden of cybersecurity prevention and failures invariably falls upon small businesses, individuals, and other entities least capable of affording the necessary investments.

As alternatives, Halpert proposed some of the following human capital cyber workforce and educational approaches:

  • There are over 760,000 vacancies in cyber positions across the United States, which constitute a national security risk that must be tackled aggressively. These high-paying jobs improve the economy, secure critical infrastructure, and advance our digital way of life.
  • Solutions to filling these critical cybersecurity vacancies should be inclusive to underserved populations and neurodivergent populations by remaining within reach for any American who wishes to pursue this field and not unreasonably limiting candidates.
  • Skill-based training programs and inclusive hiring standards, while promoting equitable opportunities—such as utilizing community colleges, apprenticeship programs, and other nontraditional training programs—will encourage and broaden strong opportunities across the socioeconomic spectrum.
  • Cybersecurity risks and decisions affect the regular decisions of every American—every individual requires a basic understanding and technological skillset in order to equitably prosper in an interconnected civilization.

Halpert concluded with the following questions: How do we avoid infringing on civil liberties and encumbering individual autonomy while maintaining national and economic security from virtual criminals and adversarial foes? Past congressional strategies have left a legacy of simply shifting and rebalancing responsibilities across cyberspace defense, creating a burden of risk that is too often placed on individual consumers and/or small businesses/organizations, which are least capable and least likely situated to defend themselves. Or the risks and cost of defending against malicious cyber attackers may shift to the rest of society by empowering governments, publicly funded entities, and private sponsors—the entities best situated to respond to these destructive risks.

Cryptocurrency: FTX, SEC Regulations, BlockChain Updates

Hoyt L. Kesterson II, Co-chair, ABA InfoSec Committee; Security & Risk Architect

Lucy Thomson, Founding Principle of Livingston PLLC in Washington, D.C., CISSP

These two veteran authorities presented a brief but informative understanding of how cryptographic hashing operates and forms a core tool in Block Chain implementations. Mr. Kesterson explained the concerns surrounding cryptocurrency mining operations. Advancements in technology have made the energy-consuming, powerful computing equipment necessary for successful crypto mining operations fairly inexpensive and accessible. But all of these advances to respond to crypto mining have increased vulnerabilities to passwords. Kesterson described the big problems with the desire to avoid a centralized authority, the lack of energy efficiency, and what seemed to be the “Wild, Wild West” rules governing energy contracts.

Ms. Thomson illustrated how technological advancements, left unregulated, can have devastating effects using the “Crypto Winter” of 2022–23 as a prime example. After soaring U.S. inflation caused the Federal Reserve to raise interest rates aggressively in 2022, investors responded by rapidly selling off riskier assets like cryptocurrency. The drop in crypto prices exposed the deceptive business practices and mismanagement across crypto lenders, exchanges, and hedge funds, leading to a chain reaction of bankruptcies filed by many of the largest crypto lenders. Not only were the investors financially injured by these swindlers, the process of selling off the bankrupt institution’s assets means investor’s private, personal information is sold without their consent and absent any regulatory protections.

A New Due Diligence? Trust, Transactions, Data-Driven Decision Making, UCC/ULC Revision

Michael Aisenberg, Co-chair, ABA InfoSec Committee; Cyber Law & Policy Consultant

Dr. Andrea Little Limbago, SVP, Research & Analysis, Interos, Inc.

Robert Metzger, Chair, Cybersecurity and Privacy Practice; Rogers Joseph O’Donnell

Steven Teppler, Chair, Cybersecurity and Privacy Practice; Partner, Mendelbaum Barrett P.C.

Contract law under the UCC follows the traditional principle of caveat emptor, “buyer beware,” placing the principal burden of evaluating vendor trustworthiness and product quality prior to purchase on the buyer.

However, when it comes to sophisticated ICT purchases (microelectronics, software), transactional evaluations, and the parties’ relative capacity to assess the security of the developer’s supply chain, is it fair to expect an end-user, even one that employs an IT department, to bear the sole burden of determining the credibility of vendor representations regarding integrity and product quality? If the software carries a product warranty, which part of the warranty covers security vulnerabilities caused by developer’s use of third party and open-source software and tools? What about software and applications the U.S. government uses taxpayer funds to purchase—especially those products used in critical infrastructures like aviation and health care, and in national security and intelligence applications.

Mr. Metzger stated that in commercial transactions, it shouldn’t be the responsibility of the buyer to determine which or whether open-source products were used by the seller or its suppliers to create an end-user product. However, he agreed that a much higher standard should be required by government customers. Another area of concern is high-risk sources, especially when it came to Chinese technology products where the final product is often not only from a high-risk source but often several degrees of separation from any remedy.

Questions for the panel included the enormous volume of product source and pedigree data, and whether it is realistic to expect technology producers/sellers to track the origin of data, software code, and generic algorithms. Is there a specific duty on vendors to identify and disclose source codes, and algorithms, or the identity of increasingly generic component hardware producers/designers selling computer boards and commodity semiconductor chips? What about tech companies that sell to entities in adversary countries that are under effective control of their governments, like China, North Korea, or Russia?

This discussion illustrated the unique diversity of the InfoSec Committee membership and of the attendees at this Pre-RSA conference, driving home the vital importance of collaborating across the industries to better understand how the various viewpoints are integral to asking and answering the right questions. Here were government experts, legal advocates, technology experts, and legal and technology academics, with many attendees claiming multiple sources of expertise.

Attendees from both the technology and legal communities voiced concerns that the legal burden of due diligence responsibility should continue to shift from the “exclusively on the buyer” model to more of a shared responsibility model with enlarged seller burden; however, others disagreed, pointing out difficulties in assessing the scope of product pedigree and provenance requirements reasonable to place on sellers, as well as the limitations on effectively enforcing these requirements.

Mr. Aisenberg pointed out that the seller is the one who is in the best position to verify the authenticity of the products they are selling to the consumer. There are safeguards for fractional and component suppliers and sellers who do their due diligence, permitting the responsibility for disclosure to fairly shift down the chain of supply to the entities with the resources, data access, and knowledge to verify the safety, security, and authenticity of the products and resources they use. Structures can protect intellectual property and other proprietary interests of sellers and their upstream suppliers and still provide buyers with more transparent bases for trust and acquisition decisions. Technology supply chains should not be held to any lesser standard.

Keynote: Generative AI and the Law

Daniel “Dazza” Greenwood, Esq., Executive Director, Computational Law, MIT Computer Lab; Founder & CEO Civics.com

[Note that this conference was held prior to the breaking news of New York lawyers sanctioned for using a generative AI tool to draft a trial brief submitted to the court citing six nonexistent legal cases.]

Mr. Greenwood opened the panel by addressing the “elephant in the room”: the anxiety and stress levels that have been building throughout the legal sector since the news earlier this year reporting that ChatGPT 4.0 had successfully passed the Multistate Bar Exam. The truth is, Greenwood stated, that ChatGPT needs significant due diligence and legal assurances before it should ever be relied on independent of human intervention in business or government.

“AI/ML tools must offer assurances of data integrity, data accuracy, and the absence of prejudice and bias; users must be prepared to assure relying parties of avoiding an overreliance on the results by providing consistent evidence that generative AI tools meet these tests in application. . . . Today, this doesn’t exist . . . yet,” Greenwood said. “But the natural fear arises that courses and training across the legal communities should be reassessed.” Greenwood focused this statement on an important understanding: “AI is not purely based on programming logic. . . . AI is based in language, data, and vectors—algebraic values, matrixes of vectors. Because of this, language-based technology is taking to law like a duck to water because law is so incredibly language-based,” said Greenwood.

However, he warned that AI/ML use acceleration is real, and it isn’t going away. It isn’t a matter of “if” the next AI winter will start but “when.” Yes, it may be a while as it will take time for generative AI to blend with machine learning AI, but Greenwood predicted a long tail of acceleration to keep us busy for the next year or more.

In the meantime, increasing use will impact adjudication, rulings, trends, legal leanings—and mistakes. “We need to stay aware.” Greenwood also predicted we would start seeing an uptick in law practice applications centered around legal drafting, editing, research, communications, and organizing large summary documents. “Unfortunately, with the sugar comes the spice.”

Greenwood provided extensive warning of “Prompt Engineering.” The machine learning model being trained through the “prompt process” to follow human-given instructions can be manipulated by malicious users inserting hidden or ambiguous prompts. The structuring of the tool training process through “prompts” thus can be the source of bias injection, inaccuracy, and outcome determination. False output, known as “hallucinations” can emerge. Prompt injections include methods where the user can manipulate the process, injecting hidden prompts into

Sunday Keynote: International Cyber: Recent Developments from Ukraine/Russia & Customary Law of Cyber Conflict

Christopher Painter, Esq., President of The Global Forum on Cyber Expertise Foundation

Mr. Painter, a frequent contributor to the work of the Information Security Committee and the SciTech Section has had a distinguished career leading international aspects of ICT/cyber policy issues. One of the first leaders of the Justice Department’s Computer Law and Cyber Crimes capability within the Criminal Division, he has been a federal prosecutor and founder and convener of a U.S.-led multilateral forty-nation “cyber G-40” cybercrime/policy forum following the 9/11 attacks, and has served as the principal deputy for ICT/cyber policy to Secretary of State Hillary Clinton during the Obama administration.

Painter covered a range of international issues, including the urgent concerns surrounding Russia’s use of cyber exploits as an element of its war on Ukraine. He also provided an update on pending work in a variety of multilateral venues on AI policy, continuing efforts at privacy harmonization and implementation of the Budapest Cyber Crime Convention.

The Ukraine discussion consumed a significant portion of the session with strong attendee interest on the issues of available remedies, whether in kind or kinetic; measures of proportionality; and speculation regarding targets and impact of future Russian malign behavior.

Cyber Case Law Update

Professor Rick Aldrich, Cybersecurity and Compliance Analyst with Booz Allen Hamilton

[Professor Aldrich supports the Cyber Security and Information Systems Information Analysis Center (CSIAC), a component of the U.S. Department of Defense’s (DOD) Information Analysis Center (IAC) with the purpose of providing the DOD, federal government users, supporting academia, and industry partners with numerous resources and services focused on research and expert assistance that is unbiased both scientifically and technically. These services include research databases, newsletters, technical topic publications, training, knowledgebase management, and other national security IT policy matters.]

As in prior years, Aldrich focused on recently decided and pending appellate cases addressing policy issues of interest to the ICT community, engaging attendees in forecasting the decision of the court.

Aldrich began by discussing two pending Supreme Court Cases: Gonzalez v. Google, No. 21-1333, and Twitter v. Taamneh, No. 21-1496, both addressing responsibility of social media providers. The focus of the discussion was understanding whether, and under what laws, the Court should hold internet platforms liable for injuries—especially whether they may have “knowingly” assisted in fatalities. Were the platforms responsible for “aiding and abetting” criminals? With over 500 pages of amici briefs citing various political issues, Aldrich asked attendees a key question before the Court in both cases: Is the platform “neutral” when Google makes the decision to display certain videos before others? Aldrich opined that the Twitter case seemed more like a “negligence” case instead of a “substantial assistance” case. But he identified as a key issue whether the platform operator met the required elements in either case. Aldrich didn’t believe so.

Turning to a widely discussed legislative policy issue, Aldrich introduced section 702 of the Foreign Intelligence Surveillance Act (FISA), which will expire on December 31, 2023. Aldrich said the current administration supports reauthorization based on its successes, though there have been some concerns. He described a pending amendment (S. 1265 and H.R. 2738) introduced to protect the Fourth Amendment privacy considerations by preventing law enforcement and intelligence agencies from obtaining subscriber/customer records through exchanges for anything of value, thereby allowing records to be obtained without warrants or court orders needed to obtain similar information. Aldrich said it had bipartisan support, but it wasn’t clear if this bill would be reintroduced or not.

Aldrich addressed a cluster of Fourth Amendment privacy cases regarding geofencing where people deleted their data, and after a week or two, Google still maintained the data and was able to offer it to law enforcement. There is no discussion of why Google had kept the data; the question is: Is this a violation of general warrant requirements? The Court in United States v. Rhine2 said “no” under the “Good Faith” exception favoring the government. There was no violation of the Fourth Amendment because the government met the particularized probable cause standard. Additionally, Aldrich pointed out that keyword warrants (police asking Google for IP addresses that searched for a keyword in a particular time), according to the Court, are not violations of the Fourth Amendment if they are specific, procedurally sound, and supported by probable cause. However, he differentiated the situation if the searches were conducted by Google, and not the police.

Overall, the panel presenters, the attendees, and guests of the 2023 InfoSec Pre-RSA Conference went on to the RSA Conference invigorated with new thoughts and ideas to consider when serving clients, as well as understanding our own roles in these “interesting times.”

Endnotes

1. White House, National Cybersecurity Strategy (Mar. 2023).

2. No. 21-0687 (RC) (D.D.C. Jan. 24, 2023).

Entity:
Topic:
The material in all ABA publications is copyrighted and may be reprinted by permission only. Request reprint permission here.

Jennifer Adney-Schell

pring 2023 Law Student Intern on the Information Security Committee of the ABA Science & Technology Law Section

Jennifer Adney-Schelle is the Spring 2023 Law Student Intern on the Information Security Committee of the ABA Science & Technology Law Section. A 2023 graduate of Saint Louis University School of Law, Jennifer is a 2024 Missouri Bar examinee and legal intern SWMW Law, St. Louis, Missouri. She holds a B.S. in CIS, Information Security and Assurance and a minor in Business Continuity from St. Louis University. She has more than 25 years of technology and database administration across various industry sectors, including manufacturing, higher education, and law firm organizations.