I recently skimmed this GAO report on maritime security, and I have to conclude that it totally misses the mark. That didn't surprise me in the least. I would have been surprised by an insightful, intelligently written document that prioritized the real issues and strategies that would make a big difference.
The list of threats in the document ("Table 1: Sources of Cyber-based Threats") seems totally out of line. Their threats are:
- Bot-network operators
- Business competitors
- Criminal groups
- Spyware or malware authors
While all of those groups are real, their categories are somewhat nonsensical. I can't figure out what criteria they used for the categories. For example, a nation (e.g. North Korea) may employ, or buy from, an author of malicious software (e.g. Hacking Team), but does that make two separate sources of threats?
And without trying to untangle the ontology issues, there are a couple of changes to that list that I would make right off. First, my number one source of threats is software developers. I've been working on auditing and fixing the Generic Sensor Format (GSF) library that is used for sonar mapping, and I'll use that as an example. This is C code developed by professional programmers at SAIC for the US Navy, and it has been around since the early 1990s. I took the code (note that it is open sourced under the LGPL 2.1 license) and threw it into Coverity. Right off the bat, I got a whole pile of coding issues, including multiple buffer overflows and all sorts of uses of unsanitized data read from files. Many of these issues have been in the code for more than 25 years. If this is what's in open code that many companies have used for ages, what is hiding in all the closed source code in the maritime industry?

There wasn't a good testing strategy for the GSF C code. Does your ECDIS have decent automated testing? The situation there is likely far worse. I talked to a maritime professor teaching ECDIS about 10 years ago. His number one lesson to students was to make sure that the ECDIS computer had not stopped updating, by watching the seconds tick on the on-screen clock, and the students were supposed to do this on every sweep of their watch (so multiple times per minute).

In addition to bad code, there is also bad design: things like inventing your own encryption, or not validating the data or patches that go into a system. A nice example of this is digital charts. The rules say that a US chart (e.g. an S-57 file) is valid only if you got it directly from NOAA or an authorized retailer. That really doesn't mean anything. What if someone man-in-the-middled the download, or the file got corrupted somewhere along the way? A cryptographically signed file is worth more than a trusted source.
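To make the "unsanitized data from files" problem concrete, here is a minimal sketch of the pattern a static analyzer like Coverity flags in file-parsing C code. The function names and the record layout (a 32-bit length prefix followed by a payload) are hypothetical, not taken from the actual GSF sources; the point is that the unsafe version trusts a length field read straight from the file, while the checked version validates it first.

```c
#include <stdint.h>
#include <stdio.h>

#define MAX_RECORD 4096

/* UNSAFE: trusts a length field read from an untrusted file.
   A corrupt or hostile file can make len exceed the caller's
   buffer, and fread then scribbles past the end of buf. */
int read_record_unsafe(FILE *fp, unsigned char *buf)
{
    uint32_t len;
    if (fread(&len, sizeof len, 1, fp) != 1)
        return -1;
    if (fread(buf, 1, len, fp) != len)   /* potential overflow */
        return -1;
    return (int)len;
}

/* SAFER: rejects the untrusted length before using it. */
int read_record_checked(FILE *fp, unsigned char *buf, size_t bufsize)
{
    uint32_t len;
    if (fread(&len, sizeof len, 1, fp) != 1)
        return -1;
    if (len == 0 || len > bufsize)       /* reject nonsense lengths */
        return -1;
    if (fread(buf, 1, len, fp) != len)   /* short read = bad file */
        return -1;
    return (int)len;
}
```

The fix costs one comparison, which is why 25-year-old bugs of this shape are so frustrating: they are cheap to prevent and cheap to find with automated tooling, but only if someone actually runs the tools.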
The next change is with hacking. I'd call this category cracking, and I'd split it into two groups. The first is the smart ones doing things themselves. They are doing real work and really discovering things. The second group is the "script kiddies". These folks have no real idea what they are doing and just blindly apply tools that are available on the internet. They often have no idea what they are breaking into or what the consequences are.
Another change to that list would be to add the lack of reasonable support for mariners from the world's "competent authorities". If the Hydrographic Offices (HOs) and Coast Guards (CGs) around the world can't give reasonable guidance to software developers and to the mariners using the gear, then all is lost. This boils down to people making decisions they shouldn't (e.g. decisions they are not trained for: electrical engineers and lawyers defining software) and/or closed specs that have no way to be audited by professionals, such as the IEC specs for AIS gear.