Co-design package planning; PCIe 6.0 considerations; diffractive optics; remote attestation; processor design skills.
Siemens’ Keith Felton argues that co-design-driven semiconductor package planning and prototyping is critical for design success, pointing to how interchange formats enable designers to make trade-off decisions for both the package and the board and to communicate those recommendations back to the other design team in formats native to their tools.
Cadence’s Xin Mu explains precoding in PCIe 6.0, a technique that helps reduce the number of errors in a burst at high data rates by converting a continuous multi-bit burst error into two errors, one at the start of the burst and one at the end.
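For readers unfamiliar with the mechanism, the sketch below illustrates the idea behind this style of precoding. It is a simplified 1/(1+D)-style toy model in Python, not the actual PCIe 6.0 PAM4 precoder, and the function names and bit patterns are made up for illustration.

```python
# Toy 1/(1+D)-style precoding sketch (illustrative only; not the
# PCIe 6.0 PAM4 precoder).

def precode(bits):
    """Precode: y[i] = x[i] XOR y[i-1]."""
    out, prev = [], 0
    for b in bits:
        prev ^= b
        out.append(prev)
    return out

def decode(bits):
    """Decode (inverse): x[i] = y[i] XOR y[i-1]."""
    out, prev = [], 0
    for b in bits:
        out.append(b ^ prev)
        prev = b
    return out

data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
tx = precode(data)

# Simulate a contiguous burst error on the channel (bits 3..6 flipped).
rx = tx[:]
for i in range(3, 7):
    rx[i] ^= 1

recovered = decode(rx)
errors = [i for i, (a, b) in enumerate(zip(data, recovered)) if a != b]
print(errors)  # -> [3, 7]: only two errors survive, at the burst boundaries
```

Because each decoded bit depends only on two adjacent received bits, flips inside the burst cancel out, leaving errors only at the edges of the burst, which keeps the error count within what the FEC can handle.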
Synopsys’ Gary Ruggles and Madhumita Sanyal point to key things to consider when deciding whether to move from PCIe 5.0 to 6.0, including more complicated network switch design, verification challenges, and a specification ecosystem that is still a couple of years from maturity.
Ansys’ Csilla Timar-Fulep and David Vega check out the role that diffractive optics, which scatter light based on their index of refraction and physical geometry, play in healthcare, including improved confocal laser scanning microscopy, optical coherence tomography, and artificial lenses.
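As a rough, textbook-level illustration of how refractive index and geometry together determine the behavior of a diffractive element (these are standard optics relations, not formulas from the Ansys post): a thin surface-relief element with thickness profile t(x) and index n imparts a phase delay, and a grating of period d steers light into discrete diffraction orders.

```latex
\phi(x) \;=\; \frac{2\pi}{\lambda}\,(n - 1)\,t(x),
\qquad
d\,\sin\theta_m \;=\; m\,\lambda, \quad m = 0, \pm 1, \pm 2, \dots
```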
Infineon’s Sneha Prahalad explores how attestation protocols in a Trusted Platform Module work to prove to a remote party that the operating system and application software running on a system are intact and trustworthy.
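A conceptual sketch of the challenge-response flow is shown below. It uses only Python’s standard library, with an HMAC standing in for the TPM’s asymmetric attestation key, so the names and structures are illustrative and do not reflect the real TPM 2.0 quote format.

```python
# Conceptual remote-attestation sketch (not the actual TPM 2.0 structures
# or APIs; the HMAC key stands in for the TPM's attestation identity key).
import hashlib, hmac, os, secrets

ATTESTATION_KEY = secrets.token_bytes(32)   # stand-in for the TPM's AIK

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """PCR extend: new = H(old || H(measurement))."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def quote(pcr: bytes, nonce: bytes) -> bytes:
    """Device signs the PCR value together with the verifier's nonce."""
    return hmac.new(ATTESTATION_KEY, pcr + nonce, hashlib.sha256).digest()

# Device side: measure boot components into a PCR, then answer a challenge.
pcr = bytes(32)
for component in (b"bootloader-v2", b"kernel-6.1", b"app-1.4"):
    pcr = extend(pcr, component)
nonce = os.urandom(16)                      # freshness challenge from verifier
signature = quote(pcr, nonce)

# Verifier side: recompute the expected PCR from known-good measurements
# and check the signed quote (freshness + integrity).
expected = bytes(32)
for component in (b"bootloader-v2", b"kernel-6.1", b"app-1.4"):
    expected = extend(expected, component)
print("platform trustworthy?",
      hmac.compare_digest(quote(expected, nonce), signature))
```

In a real deployment the quote is signed with an asymmetric attestation key, so the verifier needs only the public key plus a database of known-good measurements rather than any shared secret.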
Keysight’s Mike Hodge points to the difference between simulations and digital twins and argues that the latter could provide a path to first-pass success by enabling engineers to analyze variables and scenarios to accurately predict how a design will perform under test.
Codasip’s Roddy Urquhart notes a scarcity of processor design skills just as the demand for custom compute is emerging in many applications and points to the modularity of RISC-V as part of the solution.
Arm’s David Whaley explains why all new Arm Cortex-A CPU cores, both ‘big’ and ‘little’, are now 64-bit only, including how the move enables security features that detect common programmer errors when dealing with memory and prevent the most common memory safety vulnerabilities afflicting the mobile ecosystem.
SEMI’s Jaegwan Shim expects that a memory market rebound will help spark a global semiconductor industry recovery, but warns of supply chain challenges and an anticipated bottleneck in supplies of semiconductor materials.
Nvidia’s Brian Caulfield checks out how archaeologists are using AI to identify new geoglyphs in the Nazca region of Peru, giant line drawings created in or on the terrain by ancient civilizations.
And don’t miss the blogs featured in the latest Test, Measurement & Analytics newsletter:
Onto Innovation’s Keith Best shows how to boost yield and improve package performance without adding more RDL layers.
National Instruments’ Jake Azbell examines the challenges of meeting aggressive time-to-market requirements and of evaluating potential solutions.
Synopsys’ Ramsay Allen digs into different types of glitches and how design-for-test (DFT) logic must evolve to ensure greater levels of test robustness and silicon health.
Advantest’s Quaid Joher Furniturewala looks at the importance of thermal and power integrity analysis as power ratings increase.
proteanTecs’ Nir Sever explains how quality risks can be averted with 100% lane coverage, allowing engineers to detect defects under real-life conditions.