Cracking the Code: Navigating Regulations Governing Emerging Technologies in Government Contracts

Footnotes for this article are available at the end of this page.

As the federal government seeks to procure more artificial intelligence (“AI”), machine learning (“ML”), synthetic content[1] and other emerging technologies in the coming years, government contractors must be prepared to comply with numerous new regulatory requirements. Unfortunately, the rules and regulations governing these compliance obligations are issued by multiple federal agencies, each of which updates its regulations on its own timeline. Tracking compliance obligations is therefore complicated and requires constant monitoring.

An Update on FedRAMP

The Federal Risk and Authorization Management Program (“FedRAMP”), established in 2011 and codified into law in 2022, is a government-wide program that provides a framework for federal agencies to secure cloud services and products that comply with White House and National Institute of Standards and Technology (“NIST”) requirements. Although FedRAMP empowers agencies to use modern cloud technologies, FedRAMP authorization is both a costly and time-consuming process for government contractors, and the federal agency requiring a contractor to obtain it will not pay for it. Pursuant to President Biden’s Executive Order 14110 on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (“EO 14110”), issued on October 30, 2023, FedRAMP is tasked with creating a framework to prioritize emerging technologies, beginning with generative AI, in the security authorization process for all federal agencies. FedRAMP has released a draft Emerging Technology Prioritization Framework that moves three generative AI capabilities to the front of the authorization line:

  1. chat interfaces;
  2. code generators; and
  3. debugging tools.

In theory, then, government contractors selling these emerging technologies should move to the front of the FedRAMP authorization line.

The Importance of Executive Order 14110

EO 14110 has played an important role in many other aspects of these new regulations, establishing a comprehensive framework for the development, deployment and regulation of AI technologies. Under EO 14110, NIST was tasked with establishing guidelines relating to AI. At present, the NIST guidelines consist of voluntary frameworks to assist companies with managing risk in AI development, but they could be adopted as binding laws or regulations in the future.

EO 14110 also directed federal agencies to complete a myriad of duties within months (270 days at most) of its issuance. It tasked the heads of agencies with authority over the 16 critical infrastructure sectors[2] with working alongside the U.S. Department of Homeland Security’s (“DHS”) Cybersecurity and Infrastructure Security Agency (“CISA”) to evaluate potential risks related to the use of AI.

The Role of DHS’s CISA and Other Federal Agencies

CISA also released a proposed rule in April 2024 pursuant to the Cyber Incident Reporting for Critical Infrastructure Act of 2022 (“CIRCIA”) that would require covered entities in each of the 16 critical infrastructure sectors to report ransom payments to CISA within 24 hours and substantial cyber incidents within 72 hours. But CISA is not acting in a vacuum; several other agencies are pursuing their own initiatives:

  • The U.S. Department of Defense is developing a pilot program for AI tools that identify and address software vulnerabilities affecting the military and national security.
  • The DHS is working on facilitating the Safe and Responsible Deployment and Use of Artificial Intelligence in the Federal Government, Critical Infrastructure, and U.S. Economy.
  • The U.S. Department of Commerce is proposing regulations for Infrastructure as a Service (“IaaS”) providers to develop Know Your Customer programs to verify the identity of any foreign person that obtains an IaaS account from a foreign reseller.
  • The U.S. General Services Administration (“GSA”) released its Generative AI and Specialized Computing Infrastructure Acquisition Resource Guide in April 2024, outlining potential uses of generative AI for government agencies and related procurement strategies. GSA does not currently plan a separate schedule contract for AI (although that could change); instead, it encourages the use of Multiple Award Schedule Information Technology or Government-Wide Acquisition Contracts for such purchases.

Conclusion

Staying current on compliance obligations for government contractors selling AI, ML, synthetic content and other emerging technologies will require disciplined diligence. It’s imperative that government contractors understand the new regulations not only to avoid penalties, but because such regulations serve to ensure U.S. national security. Maybe AI will play a role in helping us monitor these important developments.


The author thanks AGG intern, Katherine B. Kline, for her contribution to this article.


[1] Synthetic content is a catch-all term for the artificial production, manipulation and modification of data and media by automated means, especially through the use of AI algorithms, often for the purpose of misleading people or changing an original meaning.

[2] The 16 critical infrastructure sectors include Chemical; Commercial Facilities; Communications; Critical Manufacturing; Dams; Defense Industrial Base; Emergency Services; Energy; Financial Services; Food and Agriculture; Government Facilities; Healthcare and Public Health; Information Technology; Nuclear Reactors, Materials, and Waste; Transportation; and Water and Wastewater Systems.