LOS ANGELES — MITRE, the nonprofit that operates federally funded research and development centers for the U.S. government, is developing a space cyber lab where real satellite hardware and software can be tested for security. It’s just one of a host of new measures that space companies are adopting to harden their systems against hackers, panelists at the CyberLEO conference said May 12.
The lab will explore how vulnerabilities discovered in software and hardware components could be exploited by hackers in real space systems, said Jeff Finke, principal engineer and group leader at MITRE’s National Cybersecurity Center of Excellence.
“We have a 3U cubesat in the lab; except for the camera being different and the solar arrays not having solar panels, we could put it on a rocket and launch it into space,” he said. That authenticity is important because satellite software and firmware run on exotic systems unlike those used in conventional IT, which can make it harder for both attackers and defenders to gauge the impact of vulnerabilities.
Initiatives like the space cyber lab are also needed in part because of the enormous complexity of satellite supply chains, added fellow panelist Phil Robinson, chief security officer for space data relay company SpaceLink. A great deal can be achieved through careful contract drafting, Robinson said, but there are limits.
“It comes down to negotiating with our prime suppliers, our subcontractors, our satellite manufacturers. … Do we have our contracts appropriately written in a way that covers risk?” Robinson asked.
Covering risks might mean insuring against them, or it might mean securing guarantees from the manufacturer or other parties, Robinson continued. “Trust, but verify, right? We’re glad you put it in the contract. But I want to verify that you’re actually doing it as well.”
Relationships with vendors require trust, yet operators need to ask themselves: “What kind of processes are you putting in place to verify and validate that trusted relationship? Are you actually looking at their practice? Are you talking to their coders that are pulling down code libraries from Lord knows where?” Finke added.
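Part of that verification can be mechanized. As a purely illustrative sketch (the artifact name and contents below are invented), an operator can pin each third-party library to the SHA-256 digest recorded when the code was vetted, so a silently swapped upstream artifact fails the build:

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Digest of a library artifact exactly as fetched."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Vetting time: record the digest of the library as it was reviewed.
with tempfile.NamedTemporaryFile(delete=False, suffix=".tar.gz") as f:
    f.write(b"vetted library contents")
    artifact = f.name
pinned = sha256_of(artifact)

# Build time: the artifact must still match the vetted digest.
print(sha256_of(artifact) == pinned)  # True -> safe to build

# A silent upstream swap changes the digest and fails the check.
with open(artifact, "wb") as f:
    f.write(b"tampered contents")
print(sha256_of(artifact) == pinned)  # False -> fail the build
os.remove(artifact)
```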
The point, Finke said, is that risks don’t diminish as they recede beyond first-tier vendors. “What are you doing, satellite operators, to trust that relationship from your vendors? How far back are you willing to go? It’s one thing to check out your first level of third party partners. Okay, that’s great. But who are they in business with? Are you willing to spend the resources to then go to that next level, and the next level beyond that, all the way into the chip foundry, all the way to whoever wrote that first [code] library?”
Yet for companies working to turn a profit, the cost of peeling back the onion layers of the satellite supply chain can quickly become unsustainable, Finke warned. “How much, as a commercial entity, where I have to increase shareholder equity or make money — which is a good thing — how much am I willing to invest to mitigate some of this? … How much risk am I willing just to accept knowing it’s out there, versus where I’m going to put resources to mitigate?”
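One way to picture that tradeoff is to model the supply chain as a graph and bound how many tiers an audit walks, since each extra tier multiplies the nodes to vet. The sketch below is hypothetical; the supplier names and the vetted flag are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Supplier:
    """One node in a supply-chain graph: a vendor and its own upstream vendors."""
    name: str
    vetted: bool = False
    upstream: list["Supplier"] = field(default_factory=list)

def unvetted_suppliers(root: Supplier, max_depth: int) -> list[tuple[int, str]]:
    """Walk the supplier graph down to max_depth tiers, flagging unvetted nodes."""
    findings = []
    stack = [(1, s) for s in root.upstream]
    while stack:
        depth, node = stack.pop()
        if not node.vetted:
            findings.append((depth, node.name))
        if depth < max_depth:
            stack.extend((depth + 1, up) for up in node.upstream)
    return findings

# An operator vets its prime and board maker, but the tier-3 chip foundry
# has never been examined.
foundry = Supplier("chip-foundry")
board_maker = Supplier("board-maker", vetted=True, upstream=[foundry])
prime = Supplier("satellite-prime", vetted=True, upstream=[board_maker])
operator = Supplier("operator", vetted=True, upstream=[prime])

print(unvetted_suppliers(operator, max_depth=2))  # [] -- the foundry is never reached
print(unvetted_suppliers(operator, max_depth=3))  # [(3, 'chip-foundry')]
```

Raising max_depth surfaces the unvetted foundry, but only at the kind of audit cost Finke describes.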
The dangers of vulnerable components are made worse by the fact that comparatively little research has been done on the unique architectures and embedded systems used in satellites, according to Ang Cui, CEO of Red Balloon Security. Embedded devices are specialized pieces of equipment very different from the general-purpose computers of conventional IT: they typically serve a single purpose and must run reliably for a dozen or more years. Cui compared satellite embedded devices to those used in industrial control systems, or ICS, the specialized computing systems that run factories, oil refineries and power stations.
“I would say the security posture of the firmware inside those [ICS embedded] devices is about five to eight years behind general purpose computing security. Having looked at quite a bit of aerospace products … I would say a lot of the firmware inside aerospace things are about five to eight years behind ICS.”
Such a mountainous security debt puts satellite companies in an impossible position, he added. “If I went to anyone here and said, ‘Build a company, but you can only do it with an unpatched Windows 95 laptop, and you can’t make any modifications to any of the code because that’s not your property.’ You would say, ‘That’s a bad idea. That’s a crazy thing to do.’ But in a lot of these situations, that is exactly how we’re operating. We’re using these devices that we can’t change the firmware of because it has [outdated] security [requirements]. It has liability insurance, legal obligations. We’re stuck in that situation.”
As is often the case, security debt hurts defenders far more than attackers.
“From what I’ve seen over the last decade, that offensive capability is so much more advanced than defensive capability, in all things embedded. And that gap is growing,” Cui said.
Classified conversations tend to focus on the extraordinary capabilities of government hackers, but the real danger is that those capabilities are quickly proliferating into the hands of criminal groups as well. “Those capabilities will spill over. And it’s not just in the hands of nation states. I think that’s the thing that we’re starting to see,” Cui noted.
Not everyone agreed. In a subsequent panel, retired Air Force Maj. Gen. Brett Williams, a co-founder of IronNet Cybersecurity, dismissed the idea that it is possible to secure components through testing — especially against deliberate insiders bent on mischief.
“The thinking you’re going to inspect everything, whether it’s hardware or software, and validate that it’s safe is a non-starter,” Williams said. Instead, he argued, a better approach is to validate the behavior of components, ensuring they do what they are supposed to do.
“The real market opportunity is finding ways to understand that this stuff is doing what it’s supposed to do,” Williams said. “Even though you and I are using the same component, we’re using it a little bit differently, it’s connected to different things, it does different things. There’s got to be an understanding, is it doing what I need it to do?”
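In code terms, that approach looks less like static inspection and more like checking runtime behavior against a per-deployment profile. A minimal sketch, with all names and thresholds invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Envelope:
    """Expected-behavior profile for one component in one specific deployment."""
    max_msg_rate_hz: float          # how often the component should emit messages
    allowed_destinations: set[str]  # where its traffic is allowed to go

def check_sample(env: Envelope, msg_rate_hz: float, destination: str) -> list[str]:
    """Compare one observed sample against the envelope; return any anomalies."""
    anomalies = []
    if msg_rate_hz > env.max_msg_rate_hz:
        anomalies.append(f"rate {msg_rate_hz} Hz exceeds {env.max_msg_rate_hz} Hz")
    if destination not in env.allowed_destinations:
        anomalies.append(f"unexpected destination {destination!r}")
    return anomalies

# A star-tracker component, profiled for one particular mission:
mission_a = Envelope(max_msg_rate_hz=1.0, allowed_destinations={"obc"})
print(check_sample(mission_a, msg_rate_hz=0.5, destination="obc"))       # []
print(check_sample(mission_a, msg_rate_hz=0.5, destination="downlink"))  # flagged
```

The same component would get a different envelope in a different deployment, which matches Williams’s point that identical parts behave differently depending on what they are connected to.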
Unlike governments, commercial enterprises can’t put absolute restrictions on their vendor relationships. “The government can say … we aren’t buying any more Lenovo computers. We aren’t using Kaspersky antivirus. But [in the private sector] you don’t necessarily have that option,” he said.
For instance, one government requirement held that only U.S. nationals could work on software and other components, Williams said. “You couldn’t have any foreign nationals touch your software. How many people build software today that doesn’t have a foreign national touch it?”
Government regulations can easily become too burdensome, he noted. “I think the nuclear power industry is a good example of that. Right now, the nuclear power plants are run by commercial companies, but they’re so heavily regulated that the cost is humongous. It’s a really hard problem.”