Future Tech

Defense AI models 'a risk to life' alleges spurned tech firm

Tan KW
Publish date: Fri, 06 Sep 2024, 09:53 PM

In-depth Chatterbox Lab CEO Danny Coleman alleges that after three and a half years of uncompensated work to provide the US Defense Department with tools for "Responsible AI," he found himself accused of trying to blackmail the government.

"It's the worst thing that's ever been leveled at me in my 32-year career," Coleman told The Register. "The legal ramifications for that individual, once I get to that stage, are going to be around defamation."

Coleman's supposed crime was to object to the reversal of the publicly announced award of a five-year contract worth $24 million to his company and strategic partner Deloitte. The deal called for Chatterbox to provide the Pentagon's recently formed Chief Digital and Artificial Intelligence Office (CDAO) with "Responsible AI" software.

The proposal to use Chatterbox Labs' AIMI software to evaluate the safety of DoD AI Algorithmic Warfare models - an action called for in President Biden's October 2023 Executive Order 14110 - had progressed successfully through preliminary evaluation since it was solicited in 2020.

Coleman said Chatterbox Labs had been working through the DoD production contract obligations since September 7, 2023, and that each procurement request had been accepted. "We were told by a DoD acquisition executive on Friday, December 1st, that the deal is done and will be concluded on 5th December with Stephanie Wilson. The contract was agreed including all final terms and conditions on the 5th December 2023 via Stephanie Wilson, head of contracting," he said.

And on December 5, 2023, in a Spotify-hosted podcast titled "Accelerating & Scaling Defense Acquisition" hosted by Bonnie Evangelista, the Acting Deputy of the CDAO Acquisition Directorate, CDAO acquisition executive Sharothi Pikar described the project as "the responsible AI thing we just awarded."

Huge

"We were elated and excited," said Coleman, who explained that his company had been working with the Pentagon since November 2020 to provide the capabilities of its software. "To be the first company to roll out Responsible AI across DoD was a huge achievement in that we worked through three separate leadership changes and were still successful."

This was to be just the beginning: at least 15 other major government agencies planned to deploy the same project under Executive Order 14110, according to Coleman. The value of the US government relationship, he said, would exceed $100 million.

But then the CDAO backtracked. In January 2024, Coleman was told there had been a change in strategic direction and the government was never going to buy Chatterbox Labs' AIMI software.

Deloitte have tried to throw me under the bus by protecting their six billion of revenue stream from the government

And Deloitte, Chatterbox Labs' strategic partner dating back eight years, refused to support Coleman's objections, he claimed. "To cut a long story short, Deloitte have tried to throw me under the bus by protecting their six billion of revenue stream from the government, saying, 'Oh we could never dispute that,'" he alleged.

Coleman reckons Deloitte still owes Chatterbox Labs $1 million from its participation in a $3 million prototype stage of the deal, for which Deloitte was the prime supplier to the government and Chatterbox Labs served as the so-called non-traditional vendor. Government contracting rules [PDF] state that non-traditional vendors should generally receive at least a third of the contract value.

A spokesperson for Deloitte told The Register, "Our work with Chatterbox is complete and all invoices have been paid. Chatterbox has provided no details to substantiate additional claims for payment, nor have they accepted our offer to speak with their legal counsel."

Our work with Chatterbox is complete and all invoices have been paid

Coleman responded: "Without Chatterbox Labs being a non-traditional vendor, this contract would never have been awarded to Deloitte. We even introduced Deloitte to the project. It's a condition of the contract. Under government contracting law Chatterbox Labs should be paid a third."

"Deloitte even refused to share the DoD contract we are a material part of, citing confidentiality clauses ... We have provided Deloitte with the OUSD T&Cs, which clearly highlight that one third is payable to Chatterbox Labs and sent through an invoice in May 2024 to be paid," he told us, referring to the Office of the Under Secretary of Defense (OUSD) for Acquisition and Sustainment, which supervises procurement among other things.

"Deloitte legal counsel has been provided with OUSD T&Cs, therefore, payment is still outstanding and requires payment."

Two months later, in a phone call with CDAO officials, Coleman said his request for clarity was characterized as blackmail.

On April 24, 2024, Coleman said he reached out to the Defense Department's Inspector General Randolph Stone to assist with an investigation [PDF] of the CDAO acquisition process that was announced in January. But he has not heard back.

Since the CDAO abruptly abandoned the project, it has been re-competed three times, on February 24, June 24, and July 24. Federal acquisition rules require competition for contracts, with limited exemptions. But once a bid has survived the competition process, it's not supposed to be re-competed without justification.

Coleman said he would like the CDAO to explain the supposed change in strategic direction. "I've been asking them for months to provide it," he said. "They refuse. They won't engage. Why not? It's a legitimate question to ask. You can't just stonewall someone."

Yet you can. The Register asked Bonnie Evangelista to comment on Coleman's allegations via phone and email several times and received no reply. We also asked Sharothi Pikar to comment and received no response - Pikar announced a few days ago that she had accepted a position as a director at Google Public Sector.

Radha Plumb, sworn in as Chief Digital and Artificial Intelligence Officer (the CDAO of the CDAO) in April, declined The Register's request for an interview.

According to DefenseScoop, Plumb since her appointment "has appointed multiple investigators to review and respond to reports of wrongdoing and alleged unethical conduct within the maturing AI hub."

Asked to confirm this, a Department of Defense official said: "The CDAO advances data, analytics and AI-enabled capabilities of the Department of Defense. As it is a comparatively new organization, an important CDAO goal is to institutionalize processes that enable the organization to support digital transformation, while ensuring the organization can recruit, retain, and develop a talented, professional workforce to accomplish our mission.

"To that end, CDAO is currently conducting internal inquiries to ensure efficiency and effectiveness in CDAO programs and operations and to ensure the organization creates and maintains the highest standards of professionalism across its workforce. Although we do not have additional information to provide at this time, the CDAO team will continue to perform its vital mission to accelerate DoD adoption of data, analytics, and artificial intelligence."

In response to The Register's specific request to explain the claimed change of strategic direction that led to the abandonment of the Chatterbox Labs contract, a Defense official offered nothing beyond the statement above.

Coleman said he could understand if a valid reason was provided, such as lack of budget or if the project wasn't going to be re-competed - subjected to the federally required competitive bidding process again. "But to say there was a change of strategic direction and then go and re-compete the project three times, it doesn't make sense," said Coleman, adding that he thinks the Defense Department has now set its AI mission back by more than two years.

Coleman says he is blowing the whistle not just as a matter of professional integrity and financial interest but as a matter of public interest.

US DoD algorithmic warfare models are not fit for purpose and pose a risk to life

"Simply put, US DoD algorithmic warfare models are not fit for purpose and pose a risk to life," he claimed, citing seven separate meetings - in person and over Microsoft Teams - last year and this year when concerns about DoD AI models were raised. The sort of safety concerns one would hope could be mitigated or fully addressed using "responsible AI" software, safeguards, and practices.

"This is about integrity and professional business conduct," Coleman said, alleging: "The lies, deceit, and contract cover-ups are endless at the US DoD even when the subject is AI safety. It's in the public interest to understand why AI safety is non-existent."

Asked whether he could provide more detail about the shortcomings of the DoD models at issue, Coleman said he's not allowed to go into too much detail. But, generally, he said the problem has to do with these models not being trained on data that's relevant to the battlefield environment. The concern is that these models will not be able to recognize the objects they're supposed to be looking for, such as military assets.

He also judged the models as brittle and vulnerable to jail-breaking and other adversarial challenges. "The security is minimal," he claimed. Worse still, he alleged, these models perform poorly in non-optimal weather conditions.

"More importantly, the Algorithmic Warfare mission wants to be number one in the world. They're worried about the PRC [People's Republic of China] and other adversaries, but they're years behind."

The Register contacted the Defense Department's Office of Inspector General (OIG) and inquired about Coleman's allegations.

Mollie F Halpern, spokesperson for the Office of Legislative Affairs and Communications, replied: "It is the policy of the DoD OIG not to confirm nor deny our investigations to protect privacy and the integrity of the process. Your sources may identify themselves to you, but the DoD OIG will not violate privacy. I'd also like to reiterate that the DoD OIG takes whistleblower reprisal complaints seriously and allegations that we do otherwise would be contrary to the DoD OIG's mission."

In January, the DoD OIG announced it had begun an investigation of CDAO. The inquiry aims "to assess the effectiveness of the Chief Digital and Artificial Intelligence Office's (CDAO) development of artificial intelligence (AI) strategy and policy for the DoD, and the CDAO's acquisition and development of AI products and services."

Then, in May, the defense watchdog said it had begun an investigation "to determine whether Defense Digital Service [DDS] engagements achieved their intended purpose and were executed in accordance with DoD and Federal policies." The DoD IG said a complaint received in January claimed "that the Chief Digital and Artificial Intelligence Office (CDAO), particularly DDS officials, relied on waivers they granted themselves to use unauthorized information technology tools and services in violation of DoD policy."

CDAO, particularly DDS officials, relied on waivers they granted themselves to use unauthorized information technology tools and services

The DoD IG expects to issue a report on its January investigation this fall. The defense watchdog earlier this year received poor marks from US Senator Chuck Grassley (R-IA), who in March slammed the DoD OIG's review of the $10 billion Joint Enterprise Defense Infrastructure (JEDI) contract as "a disgraceful example of government oversight."

Bureaucratic shifts and unfulfilled promises

The CDAO was formed from four organizations: the Joint Artificial Intelligence Center (JAIC), the Defense Digital Service (DDS), the Office of the Chief Data Officer, and the Advana program. It was set up in 2022 to steer the DoD's strategy on data analytics and AI under the leadership of Craig Martell, who left his position as head of machine learning at Lyft to take the role; he stepped down in April and was replaced by Radha Plumb.

The office's operations have been the subject of ongoing controversy, which perhaps explains the recent leadership change. A US Government Accountability Office report from December 2023 found that the DoD in general had failed to define its AI workforce, resulting in an inability to assess those working on AI projects or their needs.

The story here, said John Weiler, head of the nonprofit Information Technology Acquisition Advisory Council (IT-AAC), involves "institutional fraud that carried over from the dysfunction of the Defense Digital Service. So DDS did not get removed. It got merged and rebranded [as CDAO]. But the DNA, one of disregard for the rule of law and personality-based contracting, still persists."

The DNA, one of disregard for the rule of law and personality-based contracting, still persists

Weiler's whistleblowing contributed to the cancellation of the Pentagon's JEDI contract.

Weiler, who has been embroiled in a dispute with the DoD over a single source contract, said other private sector companies trying to do business with the DoD have received similar treatment.

"We have all these laws over the past 40 years that have been put in place to protect the integrity of the taxpayer and of the mission from malfeasance, fraud, waste and abuse, and they have ignored all of it," claimed Weiler. "And I can verify based on their own statements, they are not performing."

He went on to allege: "They are out of compliance with the Clinger Cohen Act. They're out of compliance with the Competition In Contracting Act. They have violated USC §4023. They have violated the restrictions on use of OTAs [Other Transaction Authorities, PDF] and they have directed contracts based on personal relationships."

A private sector executive familiar with these matters, who asked not to be identified, claimed other companies involved with the CDAO acquisition process have found themselves in a situation similar to Chatterbox Labs.

"Uniformed people at the CDAO, and there are not that many of them, are fantastic," our source said. "They've very mission-focused. It's the civilians who aren't any good. They've had very, very poor leadership there and they continue to have poor leadership."

The civilians, we're told, are primarily focused on their own legacy, rather than on the national security concerns of the DoD. "They're worried about their legacy and what they're trying to do in their little pet projects," our source alleged. "And their pet projects are usually supported by Booz Allen or a SETA [PDF] (systems engineering and technical assistance) contractor."

Software

Our source explained that at the Defense Department, leaders tend to be judged by how many people they manage. And software projects tend not to have lots of people to manage. So there's less interest in investing in software.

Chatterbox's software was perhaps dropped so the funds could be used instead on full-time equivalents, or FTEs [PDF], workers directly employed by the government or by favored contractors, our source speculated.

The CDAO was set up to support US military services. But of the civilians at the office, our source alleged: "They're building their own empire there. They got $1.2 billion last year. I don't know where this money is going, except for SETA contractors."

"So what they were trying to do in Chatterbox - which was a really, really good idea - is set up a service that they could pilot," our source said.

What Chatterbox Labs was doing could have allowed the various service branches to assess their AI models.

"That's a good idea," our source said. "But they've gotten away from those kinds of things. They're doing their own thing. They're not talking to the services. They're their own little empire. They're not meeting with any customers, for the most part."

Beyond the claim that CDAO's civilian leaders have focused on their own interests at the expense of the national interest, our source opined that personnel churn and lack of leadership has hobbled the organization.

"They need someone like General [Leslie] Groves, who oversaw the atomic bomb," our source said. "They don't have anyone like that. And they need someone like that, someone who knows the establishment, not an academic. Somebody who gets stuff done." ®

 

https://www.theregister.com//2024/09/06/defense_ai_models_risk/
