IoT Security: Confusing And Fragmented

Regulations and compliance are inconsistent and often inadequate, but adding better security boosts cost and impacts performance and power.


Security regulations for Internet-of-Things (IoT) devices are evolving around the world, but there is no consistent set of requirements that can be applied globally — and there may never be.

What exists today is a patchwork of certification labs and logos. That makes it difficult for IoT-device designers to know where to get their security blessed. Unlike in data centers, where there is a range of ISO and SAE standards, as well as a long list of best practices, there is nothing comparable in the IoT world.

To some extent, this is a function of the diversity of IoT devices and systems, as well as a broad range of applications and use cases. But it also reflects some ambivalence about security for a broad spectrum of typically inexpensive devices, despite the fact that they are connected to much larger systems, many containing highly sensitive or valuable data.

“Designers have big challenges making decisions on what certification and what technology to use,” said Dana Neustadter, senior product marketing manager for security IP at Synopsys. “It’s very complex, and it’s a soup of things.”

Attitudes are starting to change, particularly as more devices are connected to the Internet and to each other, and as more established companies develop strategies that incorporate good security practices.

“The core competencies you need for building up an IoT system are sense, control, actuate, and connect, and it all has to be done securely,” said Shawn Slusser, senior vice president of sales, marketing, and distribution at Infineon Technologies. “You need to make sure the people using the system are the right people, and that the information flowing is the right information.”

Building this into an IoT device or system isn’t always easy, though. “You have to have security at a very small level all the way up the stack, but the more you add the more expensive it is and the more power it takes,” Slusser said. “So you have to find the right balance. What are you trying to achieve? How sophisticated do you want it? And how do you make it affordable so it doesn’t drive the end solution out of whack in terms of price and resources?”

This is where standards efforts run into trouble. While there are a number of possible security certifications, there is little consistency. Some operate at the chip level, while others focus on the entire system, including components and development process. And most of them are compliance checks, not independent attestations of security. This makes for a confusing landscape, and it deprives consumers of a consistent way to judge the security on a device they’re buying.

Scaling down for IoT security
While significant effort has been made toward securing large-scale computers and networks, those systems have resources for heavyweight security implementations. By contrast, the IoT consists largely of small systems, most of which have limited resources, and many of which rely on batteries for power. That calls for a lighter-weight approach — and yet that can’t be permission for lower security.

“The big challenges you have with IoT devices are that they’re very small devices, they’re deeply embedded, they operate at low power, and they are low-cost devices,” noted Mike Borza, security IP architect at Synopsys. “But that means that they’re also widely available to people who want to attack them.”

Some technologies are highly regulated when it comes both to security and to other considerations, like radiation. Examples are cell phones and financial systems.

“With mobile phones, the SIM cards have very specific manufacturing requirements,” noted Haydn Povey, general manager, embedded security solutions at IAR Systems. “Similarly, the amount of radiation and things along those lines are all defined in the 4G and 5G specifications. With consumer electronics, none of that exists.”

A rough organization of the IoT would include consumer devices, industrial devices, health care electronics, and even future automobiles, which some see as an extension of the IoT. The needs of each of these groups are different. Of them, only the health care and automotive groups are highly regulated. When it comes to consumer or general industrial devices, there is little guidance to follow.

“You have different drivers for certifications,” explained Neustadter. “First, you have the laws and regulations that can mandate that data shall be private to users, and you cannot share it. In addition, you have standards bodies and associations that drive standards and certification.”

Some of these groups originate out of governments, others out of industrial consortia. But exactly how they interrelate is often not clear.

Kimberly Dinsmore, MCU security solutions manager at Renesas, pointed to a project underway in the United Kingdom to enumerate all of the various security specifications globally. It was still counting as it approached 450. Part of that arises from the different needs of different applications, but it highlights the potential challenges for designers and consumers.

Scaling up for IoT security
Not all IoT devices are simple. Large corporations and governments also use IoT devices to collect many types of data from many different sources, anywhere from an industrial plant to an insurance company. And many employees use their personal devices at work.

IoT devices have been responsible for everything from distributed denial of service attacks to ransomware attacks. And given the diversity of the systems under attack, the resources and sophistication of the attackers — as well as the growing value of data — it’s difficult to come up with a single solution that will work for very long. Security standards are based upon known attack vectors, but new ones are being created all the time.

“Security is an arms race,” said Jason Moore, senior director of engineering at Xilinx. “The adversary is never going to stand still, and our requirements are always going to be changing. Another important piece that has to be considered is the capability of the adversary. Who is your adversary? This is a common question that we have when we deal with customers. Security is very similar to insurance. How much do you want to pay for your insurance? How much is your IP worth? Who is the adversary that’s attacking it?”

One need look no further than the constant stream of software updates required to patch security holes in devices. “It’s been very undisciplined in terms of how these rollouts have happened,” said John Hallman, product manager for trust and security at OneSpin Solutions. “We’ve just started to blindly accept all the new terms that come with an update because nobody wants to read the fine print of all the disclaimers for an upcoming release or the release notes with the software patch or firmware update. We’ve lost the sense of discipline that catches these updates as they come out, and the attackers are just playing on that lack of discipline and can slip into these updates.”

No security technology is bulletproof. With enough resources and determination, even the most secure devices can be hacked, as the recent SolarWinds attack showed. The challenge is balancing an acceptable level of risk against cost and preparing for a relatively painless recovery in case of a breach.

Governments set the basics
Picking the right security level is challenging in its own right, but as governments begin setting minimum standards, this adds a whole new level of uncertainty because security regulations can vary significantly from one region to the next. Some regulations may be extremely vague, while others are more specific. Still, together they at least form the baseline for chipmakers and systems developers.

Historically, much of what drives U.S. policy stems from the National Institute of Standards and Technology (NIST). Its rules largely apply to requirements for equipment that federal organizations may purchase, but that also affects any commercial device if the feds might be a future customer.

NIST has a couple of specific IoT security documents, NISTIR 8259A and 8259D. The former is relatively general, while the latter is more specific. But Mike Dow, senior product manager, IoT security at Silicon Labs, notes that 8259D is targeted at large computing systems more than the IoT.

NIST also is known for FIPS 140, a list of basic cryptographic requirements that affect both hardware and software. But it’s usually considered too heavyweight for IoT devices, with the exception of select aspects like random-number generation.
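
To make the random-number aspect concrete, entropy sources in certified devices typically run continuous health tests that flag a stuck or failing hardware RNG. Below is a simplified sketch loosely modeled on the repetition count health test described in NIST SP 800-90B; the cutoff value here is illustrative only (the standard derives it from the source's claimed entropy), and the function name is ours, not NIST's.

```python
def repetition_count_test(samples, cutoff=8):
    """Flag an entropy source if any value repeats 'cutoff' times in a row.

    A stuck or failing hardware RNG tends to emit long runs of the same
    value; catching that run is the intuition behind the SP 800-90B
    repetition count health test. Returns False if the source fails.
    """
    run_value, run_length = None, 0
    for s in samples:
        if s == run_value:
            run_length += 1
            if run_length >= cutoff:
                return False  # source considered failed
        else:
            run_value, run_length = s, 1
    return True
```

A healthy stream like `[1, 2, 3] * 10` passes, while a stuck source emitting `[7] * 8` is rejected. Real implementations run this continuously in hardware or firmware, alongside a second test (adaptive proportion) that catches heavily biased but non-constant output.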

What might be better attuned to the needs of IoT devices is NIST’s Cryptographic Algorithm Validation Program, or CAVP. “You can pick which certifications you want and make sure that the algorithms that you’re going to be implementing have been proven to work,” said Renesas’ Dinsmore.

The idea here is that you select which of the NIST algorithms to use, and then you verify that your implementation works correctly. It costs $2,000 per algorithm (and there will likely be more than one), but it’s more affordable than going the full FIPS 140 route. That could cost 100 times more, Dinsmore said.
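The spirit of that validation can be illustrated with a known-answer test: feed the implementation a published test vector and check that it reproduces the expected output. A minimal sketch in Python follows; this is not an actual CAVP harness (real validation runs large suites of vectors through an accredited process), but the two SHA-256 digests shown are the standard published known-answer values.

```python
import hashlib

# Published known-answer vectors for SHA-256: each message must hash
# to exactly this digest. Algorithm validation runs many vectors like
# these against the implementation under test.
KNOWN_VECTORS = [
    (b"abc",
     "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"),
    (b"",  # the empty message is another standard vector
     "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"),
]

def run_known_answer_tests(hash_fn):
    """Return True only if hash_fn reproduces every expected digest."""
    return all(hash_fn(msg).hexdigest() == digest
               for msg, digest in KNOWN_VECTORS)

if __name__ == "__main__":
    print(run_known_answer_tests(hashlib.sha256))
```

Passing a known-answer suite shows the implementation computes the algorithm correctly; it says nothing about side channels or key handling, which is why CAVP is far cheaper than a full FIPS 140 validation.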

Hardware can carry its certification forward, whether it’s a physical module or IP that gets used in multiple designs. Formal verification is used to validate that the IP remains unchanged from what was certified in a prior design.

Software, by contrast, must always be re-certified. And a full device with both hardware and software will be certified revision by revision. “Certification will be against a specific hardware/software version,” added Dinsmore. “As soon as you change the version, you break certification.”

Meanwhile, other laws are being passed in the U.S. Congress, such as the IoT Cybersecurity Improvement Act of 2020. In addition, the Cyber Shield Act was re-introduced this year. According to Dow, it’s likely to follow 8259A, with a focus on interfaces, software and firmware updates, and software attestation. From there, it moves toward public signature checks and secure boot.
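
The secure-boot idea those requirements point toward can be sketched at its simplest: before executing firmware, the boot ROM measures the image and compares it against a reference stored in protected memory. The Python below is a rough illustration only; the firmware strings and function names are hypothetical, and a real secure boot verifies a public-key signature chained to a hardware root of trust rather than a bare stored hash.

```python
import hashlib
import hmac

def verify_firmware(image: bytes, expected_digest: str) -> bool:
    """Boot-time measurement check: hash the firmware image and compare
    against a digest held in protected storage. Production secure boot
    instead verifies a public-key signature over the digest, so firmware
    can be updated without reflashing the root of trust."""
    measured = hashlib.sha256(image).hexdigest()
    # Constant-time comparison: a timing side channel here could leak
    # how many digest bytes an attacker's image has matched so far.
    return hmac.compare_digest(measured, expected_digest)

# Hypothetical "golden" digest provisioned at manufacturing time.
GOLDEN = hashlib.sha256(b"firmware-v1.0").hexdigest()
print(verify_firmware(b"firmware-v1.0", GOLDEN))
```

Any single-bit change to the image changes the SHA-256 digest, so a tampered update fails the check and the device refuses to boot it.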

Some U.S. states have their own regulations. For example, both California and Oregon have a requirement for “reasonable security,” which is difficult to define and which will likely have to be sorted out in future lawsuits.

Europe, meanwhile, has ETSI, which is technically a non-profit organization, but which generates standards recognized throughout the European Union. It has an IoT cybersecurity standard called EN 303 645.

From a legal standpoint, the EU also passed its own European Cybersecurity Act. It strengthens ENISA, the organization that has the authority to set basic technical standards for the EU certification system.

The United States and Europe tend to set the direction for other countries, such as Singapore, Finland, and Japan. The U.K. has added some lightweight requirements of its own. ISO is likely to unify the U.S. and European approaches via ISO/IEC 27402, according to Dow. China, meanwhile, has its own OSCCA organization.

Industry consortia pick up from there
Several other groups are building on the government basics to set context-specific guidelines, and some of them come with a certification process. The challenge with many of these groups is they don’t clearly define why they’re needed.

Their “about us” descriptions tend to focus generically on the importance of security, but they don’t position themselves amongst other groups. In fact, if you read them, you might conclude that each group is the only one that exists. This can make it extremely confusing when trying to figure out which organization to follow.

The most recognizable set of guidelines is the Common Criteria. But it is a heavyweight system typically used for smart cards and trusted execution environments (TEEs).

“Common Criteria would have been the default organization to do a lot of this,” said Synopsys’ Borza. “But they’ve been very slow to adapt to the requirements of the IoT market.”

Like FIPS 140, it’s a hard standard to require for lightweight devices. So other groups, while possibly following the Common Criteria model, target small IoT devices more directly. And they go beyond simply validating cryptography implementations. “A lot of the other organizations are filling in voids that have been left, because getting through a Common Criteria certification is just not feasible for so many people,” Borza said.

A variety of standards groups
Starting at a low level, the IoT Security Foundation (IoTSF) establishes basic guidelines and an approach for self-certification. “The approach that the IoT Security Foundation has taken is to prove that you’ve been through a process,” said Povey, which is intended to ensure that the appropriate thought and technology have been brought to bear for a given device.

Moving up to the chip level, the most prominent organization was started by Arm and its ecosystem partners. Called PSA, for Platform Security Architecture, it carries a “PSA Certified” imprimatur. Labs collaborate to validate the security claims made for a device being certified. The focus here is largely on MCU- and MPU-based systems.

“PSA introduces things like roots of trusts and isolations and separations and device-lifecycle management — things that are much more holistic,” said Dinsmore.

That said, PSA may be perceived as being focused on Arm structures. “PSA is relevant for people developing Arm-based devices, but no other processor vendors have really gotten on board with that,” noted Borza.

Arm, however, emphasizes that the use of Arm technology is not a PSA requirement. “PSA Certified is an architecture-agnostic security initiative that is formed of six security labs along with security consultants Prove&Run and Arm,” said David Maidment, senior director of the secure device ecosystem, automotive, and IoT line of business at Arm. “It is run in an independent way and uses TrustCB as its certification body overseeing consistency and impartiality. The labs review security implementations against defined criteria for the specific level of certification… and also align with NIST and ETSI cyber-security baselines. As a co-founder, Arm provides resources and guidance on how to build a PSA Certified product.”

While PSA Certified appears to be widely viewed as chip-focused, Maidment notes that they also include system-level certification. “PSA Certified exists to lobby the entire ecosystem to integrate security best practices,” he said. “The scheme covers chips, software platforms, RoT components, boards, platforms and devices.”

In Europe, the GlobalPlatform organization has developed SESIP (Security Evaluation Standard for IoT Platforms), which is still silicon-focused, but which is more extensive than PSA. It’s like a lightweight version of Common Criteria. A device certified against the SESIP PSA profile can achieve both SESIP and PSA certification. The reverse is not true, however: PSA certification alone does not qualify a device for SESIP certification.

“Certifications like SESIP make you, for better or worse, pay attention to process and technology,” said Scott Best, director of anti-tamper security technology at Rambus. “There are insider attacks, there are design tool threats that can come up, and there are supply chain threats, and a lot of the certification process goes into, ‘How did you build this product? Where was it manufactured? How much control did you actually put on people who had their fingers in the design at any point?’ You may have 1,000 people with access to an SoC design before it ever hits the market.”

While far from perfect, Best said these kinds of certifications at least force companies to examine their design and engineering processes and their facilities. “I don’t put a great deal of credence in those certifications, but they do fix some otherwise glaring flaws,” said Best. “Certification is not necessarily a penetration test, and there are few requirements to do more, unless it’s going into some safety application. But not every device that will be used in safety applications can be penetration-tested, either.”

Next we move up to the system level. “There’s one standard specifically for IoT that borrows a little bit from Common Criteria, and that’s the ioXt (Internet of Secure Things),” said Mark Stafford, director of applications engineering at Infineon Technologies. “There are accredited labs that have a formal methodology specifically for IoT.”

The ioXt focuses on the IoT at scale. It’s a “composite” approach that accepts component and module certifications when certifying the entire device. For example, a chip that’s been certified by PSA can “import” that certification and use it for ioXt certification.

“Another way to help smaller entrants to the market that don’t have security expertise is to utilize an already pre-built secure system, like a secure enclave,” noted Neustadter. That makes it beneficial for chipmakers to have their chips pre-certified. It is then easier to sell to system builders, since some of the certification burden will have been removed. Likewise, using pre-certified IP can make chip certification easier.

Meanwhile, several other efforts are underway:

  • Eurosmart is developing approaches based on the European Cybersecurity Certification Framework
  • GSMA’s IoT SAFE proposes an architecture leveraging SIM cards as roots of trust
  • ISO 21434 contains requirements for road vehicles, and
  • CTIA enforces its Global Certification Forum (GCF) rules for devices that communicate over the cellular system.

On top of all of that, cloud providers may have their own rules that must be adhered to for devices that communicate with them.

And, not to be outdone, both IEEE and Underwriters Laboratories (UL) are expected to get into the game as well. One benefit of UL involvement would be consumer recognition, since consumers are used to seeing its mark attesting to overall device safety on familiar home appliances.

Industry groups, meanwhile, are defining application-specific security profiles that the industry can rally around. While these efforts are often dominated by large players, they are open organizations, and it’s useful for smaller companies to participate as well to ensure that the rules don’t unduly favor deep-pocketed incumbents.

Through all of this fragmentation, the only sign of any collaboration or merging appears to be the ability to get a PSA certification out of SESIP validation. No other consolidation appears imminent.

Rigor and metrics — or not
Different approaches come with differing levels of rigor. Clearly, the least rigorous and easiest approach is self-certification. You document what your security strategy is, and trust in that document is what carries you.

Because other organizations use labs to verify security claims, they’re considered more rigorous. But even there, it’s the developer that states what security features a device has. The role of the labs is to validate that you did what you said you did. “Certification schemes standardize only how you test,” said Dow.

Some of the programs come with levels, such that, if you meet a certain set of criteria, you qualify for that level. There’s no uniform definition of those levels.

What you won’t see are metrics. Such metrics are extremely difficult to define — and even harder to get agreement on.

“Devising metrics that tell you something about how secure you are, on a scale of one to five, turns out to be very, very difficult,” noted Borza. “It depends a lot on who’s doing the test. If you’re talking about penetration testing, it depends on the skill of the penetration tester.”

In addition, one of the major challenges of security is that it is constantly moving. “When, at a point in time, your score on a certain metric is a 3.7, the only thing you can say reliably about security is that it will never be better than 3.7,” continued Borza. “And at some future point in time, it may be a lot worse.”

Setting expectations
Getting certification takes time and effort. That process must be built into the overall development plan so that the schedule accurately sets expectations.

An extreme example is secure elements (SEs). Those can take two years to certify, and, according to Dow, even a change to a single line of code can add months to that schedule. PSA certification, by contrast, can be issued in a couple of months.

Terminology can also vary. For instance, NIST uses “validated” for what we’re calling certification, while Common Criteria and PSA use “certified.”

The word “compliant” is more complicated. “Companies that don’t want to go through the time, effort, and expense of doing a full-on validation of their products will often say that they are ‘compliant’ as a kind of weasel-word to avoid admitting that they haven’t been tested,” said Borza. “Especially in the case of FIPS 140, NIST will find products that claim compliance and that have not been validated, and they will send cease-and-desist notices.”

What are developers to do?
Given the complex landscape, it’s hard for developers to know where to turn for certification. This becomes a role for the product manager: They must decide early on what the certification strategy should be.

“It’s non-trivial for the product manager to decide the standard for which to implement the verification processes or get certification,” said Frank Schirrmeister, senior group director, solutions and ecosystem at Cadence. “They need to find the region for the market that defines the standards, and then it trickles down which standardization committees you need to go to.”

Schirrmeister proposed an escalating scale for certification. Beyond government-mandated security, it starts with IoTSF’s self-certification. From there, you can add the next level with PSA and then with SESIP. Other organizations take a system-wide look, so the appropriate one there will vary by application.

If the highest level of blessing is desired, one can go for FIPS 140 or Common Criteria certification, but that’s a significant process that should not be embarked on until one is sure that one will pass.

“There are significant costs in going that route,” cautioned Povey. “You want to go through it only when you know you’re going to be successful. You have to be pretty certain that you’ve done your homework and you’ve implemented it correctly.”


Fig. 1: One possible way of organizing a hierarchy of certifications. The bottom illustrates the most basic sets of rules, while the topmost box requires significant effort and resources to achieve. Note that, while many see PSA as chip-level, Arm says it also does system-level certification, and that ioXt is application-specific. Source: Bryon Moyer/Semiconductor Engineering

According to Dinsmore, if there were one thing to implement that could combat many of the threats, it’s the use of identity and the public key infrastructure (PKI) — that is, public/private key pairs that can be used to authenticate a device. Even that, however, can be challenging in a small device run by a small microcontroller.

“I’ve got people running little 32-bit micros with 256 KB of flash at 48 MHz, and I tell them, ‘You need to dedicate 80 KB for the cryptographic algorithms that you need just to prove who you say you are,’” she said.
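
The identity scheme Dinsmore describes boils down to a challenge-response exchange: a server sends a fresh random challenge, the device signs it with its private key, and the server verifies the signature with the device's public key. The sketch below uses deliberately tiny textbook-RSA numbers to keep the arithmetic visible; these parameters offer no security whatsoever, and real IoT identity keys are 2048-bit RSA or elliptic-curve keys handled by the 80 KB of crypto code Dinsmore mentions.

```python
import hashlib
import secrets

# Toy textbook-RSA key pair (illustrative only -- these tiny primes
# offer zero security; real devices use 2048-bit RSA or ECC).
P, Q = 61, 53
N = P * Q        # modulus, part of the public key (3233)
E = 17           # public exponent
D = 2753         # private exponent: (E * D) % ((P-1)*(Q-1)) == 1

def sign(message: bytes) -> int:
    """Device side: hash the challenge, then sign with the private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(digest, D, N)

def verify(message: bytes, signature: int) -> bool:
    """Server side: recover the digest with the public key and compare."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == digest

challenge = secrets.token_bytes(16)        # server picks a fresh nonce
print(verify(challenge, sign(challenge)))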

What about IoT consumers?
As for consumers, there’s realistically nothing for them at present. The security certification world is far too complex for them to navigate. They’re going to want to see a label that they trust before a purchase. No such label exists now, although, if UL comes through with well-accepted certification processes, that would be a name consumers could look to.

Today, with no labels, consumers purchase devices assuming they’re secure. That’s usually a bad assumption. Security is still not accepted as a must-have, but rather as a cost. Povey noted that, per the U.K.’s DCMS (Department for Digital, Culture, Media and Sport), only 5% of devices had security. So consumers who haven’t been made aware are making purchases with a false sense of security.

“There’s education to be done on the implementation side, and then there are questions around how you then communicate that to consumers,” noted Schirrmeister.

“Consumers don’t have any real assurance, and they can’t really be expected to sort out what is going to be good enough,” said Borza. “So this is an area that’s crying out for some kind of protection. And I hesitate to say that it needs regulation by governments.”

Leaving it up to the industry can work, but only up to a point. “Apple has taken the position that they are the company of privacy, and that’s a great position,” Borza continued. “But that’s one board meeting away from being turned on its ear.”

If regulation is a step too far, then at least a disinterested third-party consumer group could help to referee. “There’s got to be some more serious reporting by consumer groups or some watchdog that can monitor device security,” said Neustadter.

Even if secure devices are available alongside non-secure ones, the secure ones are going to cost more — potentially measurably more. Here again, regulation could set a security baseline that all participants have to achieve so that security ceases to be a point of price competition. Education also will help consumers understand why it can be worthwhile to spend extra on security.

“I haven’t seen anything to explain the value of security well enough that customers will be willing to spend 20 pounds more on their webcam because it’s proved to be secure,” said Dinsmore.

For industrial devices, it’s easy to assume that such companies will employ security specialists that can navigate certification. But that’s often not true. “Most of them don’t have the expertise in house,” said Borza. “And they’re hard pressed to hire.”

Until things settle down into a simpler scheme, it will remain confusing and difficult to figure out the best approach to getting a device certified for security. That makes it even more important that the process be planned up front, along with all of the other requirements for the chip or system design.
