Chip Security Needs A New Language

SystemVerilog assertions can nicely capture many hardware requirements. However, more is needed for security verification.


By Sven Beyer and Sergio Marchese

Safety- and security-critical systems, such as connected autonomous vehicles, require high-integrity integrated circuits (ICs). Functional correctness and safety are necessary to establish IC integrity, but not sufficient. Security is another critical pillar of IC integrity. Systems and products using ICs with security vulnerabilities ultimately undermine the safety and privacy of people. However, hardware security is still in its infancy. A recent survey, focusing on the security of the automotive supply chain, found that only 47% of companies assess security vulnerabilities during the early stages of the product release process, namely the requirements and design phase and the development and testing phase (see Fig. 1). As stated in the survey report, “this process is contrary to the guidance of SAE J3061 Cybersecurity Guidebook for Cyber-Physical Vehicle Systems, which advocates for a risk-based, process-driven approach to cybersecurity throughout the entire product development life cycle.”


Figure 1: Percentage of companies that assess security vulnerabilities of automotive software/technology/components at a specific stage in the product release process. Source: Securing the Modern Vehicle: A Study of Automotive Industry Cybersecurity Practices, Ponemon Institute.

Automotive safety, on the other hand, is in a more mature state, although still challenging. To a large extent, that is thanks to the ISO 26262 functional safety standard. The standard, which has enjoyed widespread adoption over the last decade, covers the entire life cycle of automotive electrical and electronic (E/E) systems. It also includes provisions to establish and foster a safety culture in engineering organizations. The ISO/SAE 21434 “Road vehicles – Cybersecurity Engineering” standard, currently under development and expected to be published in 2020, promises a similar approach to automotive security.

Another important initiative, which goes beyond the scope of automotive security, is the Accellera working group on Intellectual Property Security Assurance (IPSA); OneSpin is a member of this group. The goal of the working group, established in September 2018, is to provide a security assurance standard for hardware IPs that reduces and manages security risks when integrating IPs into embedded systems.

After this high-level overview, let’s take a closer look at some of the nitty-gritty aspects of hardware security.

Information CIA
Nowadays, most chips include features that are leveraged by software layers to implement security functions. Examples include authentication, handling of signatures for secure over-the-air software updates, and fast encryption and decryption of private data. Certain hardware memory regions may be reserved and accessible only to applications with high privilege level. Some registers may contain secret data, for example an encryption key. In more general terms, hardware must ensure that information security is maintained. This includes ensuring information confidentiality, integrity, and availability (CIA). Attackers may try to extract a secret key, for example, thus breaking information confidentiality, sometimes referred to as data leakage. They could also try to overwrite the secret key, replacing the lock rather than stealing the key, thus breaking information integrity. This is sometimes referred to as data sanctity. Both information confidentiality and integrity are critical aspects of hardware security that need rigorous pre-silicon verification.
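To make the integrity side of this more concrete, the sketch below shows a minimal key register in SystemVerilog. The module and signal names are invented for illustration, not taken from any particular design. The register may only be updated by privileged accesses, and the embedded assertion checks that an unprivileged write attempt leaves the key unchanged.

module key_reg (
  input  logic         clk,
  input  logic         rst_n,
  input  logic         wr_en,       // write request
  input  logic         privileged,  // high-privilege access indication
  input  logic [127:0] wr_data,
  output logic [127:0] secret_key
);
  // Integrity: only privileged writes may update the secret key
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n)
      secret_key <= '0;
    else if (wr_en && privileged)
      secret_key <= wr_data;
  end

  // Data sanctity check: an unprivileged write attempt must not change the key
  a_key_integrity: assert property (@(posedge clk) disable iff (!rst_n)
    (wr_en && !privileged) |=> $stable(secret_key));
endmodule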

Security requirements
Verifying confidentiality in hardware often includes checking that the content of certain registers or protected memory regions cannot reach certain outputs, no matter how the hardware is used. In practice, requirements might be more complicated as confidential data might be allowed to reach an output but only when requested by an authorized agent. Using pseudo code, one could express this requirement as “not_propagate(secret_x, output_y)”. The SystemVerilog hardware description language (HDL) offers many constructs to capture hardware requirements in assertions. These can be used during functional verification through design simulation, emulation, and formal analysis. Security requirements, however, are not easy to express in SystemVerilog assertions.
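For comparison, a conventional functional requirement is easy to state as an assertion. The fragment below, with hypothetical signal names, checks that every request is acknowledged within four clock cycles. The confidentiality requirement, by contrast, can only be written as pseudo code, because no equivalent SystemVerilog construct exists.

// Functional requirement: every request is acknowledged within four cycles
a_req_ack: assert property (@(posedge clk) disable iff (!rst_n)
  req |-> ##[1:4] ack);

// Security requirement: no standard construct exists, so it stays pseudo code
//   not_propagate(secret_x, output_y)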

Naively, one could write “output_y != secret_x” as an assertion. Even ignoring timing issues (it could take several clock cycles for the secret to reach the output), this expression has two fundamental problems. Firstly, the output could, at some point in time, accidentally have the same value as the secret while operating correctly, leading to false failures during security testing. The problem here is that the SystemVerilog assertion talks about equality of values, not about what causes the output value to be there. Secondly, the assertion only checks for equality of values. Unfortunately, confidentiality would also be violated if the secret is leaked through more complex schemes. The secret bits, for example, could be inverted or go through more complex transformations. Moreover, they could be split into several chunks and streamed out in a number of steps rather than all at once. The simple assertion would not cover these scenarios.
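Written out as an assertion, again with hypothetical signal names, the naive check looks as follows. It fires whenever the output happens to equal the secret, even legitimately, and it says nothing about leaks through inverted, re-encoded, or serialized versions of the secret.

// Naive confidentiality check: compares values only, not information flow
a_naive_leak: assert property (@(posedge clk) disable iff (!rst_n)
  output_y != secret_x);

// False failure: output_y may legitimately equal secret_x by coincidence
// Missed leak:   output_y == ~secret_x, or secret_x streamed out byte by byte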

Knowing the full design structure, one could write numerous assertions to check all possible paths between secret and output, one section at a time. While this approach is possible in theory, it is infeasible in practice: even for a simple but real design, it would be too effort-intensive and error-prone. Engineers need a SystemVerilog extension, or some other standardized language, to express this type of hardware security requirement in a concise way. Electronic design automation (EDA) tools can then read these requirements and compile them into the appropriate checks.
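As an illustration only, a concise information-flow requirement might look like the sketch below. This syntax is invented for this article; it is not part of SystemVerilog or of any published standard, and today each EDA tool uses its own notation for the same idea.

// Hypothetical declarative information-flow requirement (illustrative syntax only)
secure_flow: not_propagate (
  .source      (key_reg.secret_key),      // confidential data
  .destination (chip_top.data_out),       // observable output
  .unless      (access_ctrl.authorized)   // flow allowed for authorized requests
);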

At present, commercial EDA tools offer proprietary methods to capture information flow requirements. A standardized method would enable tool interoperability, improve the reusability of requirements across design iterations, and allow providers of semiconductor IPs to deliver executable security specifications that could be independently checked by SoC integrators and reused to ensure chip-level security.

Conclusion
The modern world relies more than ever on complex electronic systems. From autonomous vehicles to medical devices, from planes to critical infrastructure, from defense systems to 5G networks, the safety and privacy of people depend on the security of the chips used to implement these systems. Security needs to become an integral part of the hardware development life cycle. While security specialists are needed more than ever, there is also a need for more widespread awareness, expertise, and capabilities. Engineers at all development stages need the appropriate knowledge, processes, and tools to detect and address security vulnerabilities as early as possible in the flow. A standardized approach to concisely capture secure information flow requirements at the hardware level would enable more robust processes and better interoperability among tools, while also simplifying the sharing of expertise. For more information about IC security and trust assurance, read the OneSpin Trust and Security Solution flyer.

Sergio Marchese is technical marketing manager at OneSpin Solutions. He has 20 years of experience in electronic chip design, and deployment of advanced hardware development solutions across Europe, North America, and Asia. His expertise covers IC design, functional verification, safety standards, including ISO 26262 and DO-254, and detection of hardware Trojans and security vulnerabilities. He is passionate about enabling the next generation of high-integrity chips that underpin the Internet of Things, 5G, artificial intelligence, and autonomous vehicles.


