Author's Latest Posts


System Bits: June 19


ML algorithm makes 3D scan comparison up to 1,000 times faster
Medical image registration typically takes two hours or more to meticulously align each of potentially a million pixels in the combined scans. To address this, MIT researchers have created a machine-learning algorithm they say can register brain scans and other 3D images more than 1,000 times more quickly using novel learning... » read more
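
The kind of speedup described usually comes from amortization: classical registration re-solves an optimization problem from scratch for every scan pair, while a learned model pays that cost once at training time and then maps a new pair to a transform in a single pass. Below is a minimal 1-D sketch of that contrast, not MIT's method; the toy images, the exhaustive shift search, and the cross-correlation "model" standing in for a trained network are all illustrative assumptions.

    import numpy as np

    def ssd(fixed, moving, shift):
        """Sum-of-squared-differences between the fixed image and the
        moving image translated by an integer shift (a 1-D toy stand-in
        for full 3-D deformable warps)."""
        return np.sum((fixed - np.roll(moving, shift)) ** 2)

    def register_iterative(fixed, moving, max_shift=20):
        """Classical registration: search/optimize the transform per pair.
        Real tools run thousands of optimization steps per scan pair,
        which is where the hours go."""
        costs = [ssd(fixed, moving, s) for s in range(-max_shift, max_shift + 1)]
        return int(np.argmin(costs)) - max_shift

    def register_learned(fixed, moving):
        """Learned registration: one forward pass that maps an image pair
        directly to a transform. A cross-correlation peak stands in here
        for the trained network; the point is that the per-pair cost is a
        single evaluation, not an optimization loop."""
        spectrum = np.fft.fft(fixed) * np.conj(np.fft.fft(moving))
        peak = int(np.argmax(np.real(np.fft.ifft(spectrum))))
        n = len(fixed)
        return peak if peak <= n // 2 else peak - n

    rng = np.random.default_rng(0)
    moving = rng.standard_normal(256)
    fixed = np.roll(moving, 7)                # ground-truth shift of 7
    print(register_iterative(fixed, moving))  # 7, after a full search
    print(register_learned(fixed, moving))    # 7, in a single pass

The same division of labor holds in 3-D: the expensive part moves into training, so registering a new pair of scans costs one network evaluation instead of a fresh per-pair optimization.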

Near-Threshold Issues Deepen


Complex issues stemming from near-threshold computing, where the operating voltage and threshold voltage are very close together, are becoming more common at each new node. In fact, there are reports that the top five mobile chip companies, all with chips at 10/7nm, have had performance failures traced back to process variation and timing issues. Once a rather esoteric design technique, near... » read more
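
Why variation bites harder near threshold is visible in the standard alpha-power delay model, delay ~ Vdd / (Vdd - Vth)^alpha: far above threshold, a few tens of millivolts of Vth spread barely move the denominator, but near threshold the same spread swings gate delay by large factors. A rough numerical illustration follows; alpha = 1.3 and the voltage values are generic assumptions, not foundry data.

    ALPHA = 1.3          # velocity-saturation exponent (typical modern value)
    VTH_NOM = 0.30       # nominal threshold voltage, volts (assumed)
    SIGMA_VTH = 0.03     # +/- 30 mV of process-induced Vth spread (assumed)

    def relative_delay(vdd, vth, alpha=ALPHA):
        """Alpha-power-law gate delay, up to a constant factor:
        delay ~ Vdd / (Vdd - Vth)**alpha."""
        return vdd / (vdd - vth) ** alpha

    for vdd in (0.9, 0.5, 0.4):   # nominal vs near-threshold supplies
        slow = relative_delay(vdd, VTH_NOM + SIGMA_VTH)
        fast = relative_delay(vdd, VTH_NOM - SIGMA_VTH)
        print(f"Vdd={vdd:.1f} V: slow/fast delay ratio = {slow / fast:.2f}")

Even this toy model turns the same +/- 30 mV of Vth variation from a roughly 14% timing skew at 0.9 V into more than a 2x delay spread at 0.4 V, which is the variation-driven timing failure mode described above.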

Farming Goes High-Tech


Data from dirt — literally — is enabling farmers to perform detailed analysis to make their farming practices smarter, more efficient, and significantly more productive. Companies in every market are leveraging data to their business advantage, and the agricultural sector is no different. Even the venture capital community has taken note. According to ABI Research, some sizeable venture ... » read more

System Bits: June 12


Writing complex ML/DL analytics algorithms
Rice University researchers in the DARPA-funded Pliny Project believe they have the answer for every stressed-out systems programmer who has struggled to implement complex objects and workflows on ‘big data’ platforms like Spark and thought: “Isn’t there a better way?” Their answer: yes, with PlinyCompute, which the team describes as “a sys... » read more

System Bits: June 5


The right squeeze for quantum computing
In an effort to bring quantum computers closer to reality, Hokkaido University and Kyoto University researchers have developed a theoretical approach to quantum computing that is 10 billion times more tolerant of errors than current theoretical models. The team said their method may lead to quantum computers that use the diverse properties of sub... » read more

FPGAs Becoming More SoC-Like


FPGAs are blinged-out rockstars compared to their former selves. No longer just a collection of look-up tables (LUTs) and registers, FPGAs have evolved into architectures for system exploration and vehicles for proving out a design architecture for future ASICs. This family of devices now includes everything from basic programmable logic all the way up to complex SoC devices... » read more

Ensuring Chip Reliability From The Inside


Monitoring activity and traffic is emerging as an essential ingredient in complex, heterogeneous chips used in automotive, industrial, and data center applications. This is particularly true in safety-critical applications such as automotive, where much depends on the system operating exactly right at all times. To make autonomous and assisted driving possible, a mechanism to ensure systems ... » read more
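
In practice, monitoring "from the inside" means embedded sensors and traffic probes whose readings are checked continuously against expected envelopes, so drift is flagged before it becomes a functional failure. A minimal sketch of that polling pattern follows, assuming hypothetical monitor names, limits, and a synthetic read_sensor hook rather than any vendor's monitoring IP.

    import random
    import time

    # Hypothetical acceptable envelopes for a few on-chip monitors.
    LIMITS = {
        "core_temp_c":  (0.0, 105.0),    # junction temperature
        "vdd_mv":       (720.0, 880.0),  # supply rail at the sense point
        "noc_util_pct": (0.0, 90.0),     # interconnect traffic load
    }

    def read_sensor(name):
        """Stand-in for reading a real on-chip monitor register; here it
        just synthesizes plausible values so the sketch is runnable."""
        lo, hi = LIMITS[name]
        return random.uniform(lo - 5.0, hi + 5.0)

    def health_check():
        """Poll every monitor once and return any out-of-envelope
        readings; a safety-critical system would escalate these to a
        degraded-mode or fail-safe response."""
        faults = []
        for name, (lo, hi) in LIMITS.items():
            value = read_sensor(name)
            if not (lo <= value <= hi):
                faults.append((name, value))
        return faults

    for _ in range(3):            # a real monitor loop would run forever
        print(health_check() or "all monitors in range")
        time.sleep(0.1)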

System Bits: May 29


Ultra-low-power sensors carrying genetically engineered bacteria to detect gastric bleeding
To diagnose bleeding in the stomach and other gastrointestinal problems, MIT researchers have built an ingestible sensor equipped with genetically engineered bacteria... » read more

Analog Migration Equals Redesign


Analog design has never been easy. Engineers can spend their entire careers focused just on phase-locked loops (PLLs), because getting them right requires understanding the circuit's functionality in depth, including how it responds across different process corners and different manufacturing processes. In the finFET era, those challenges have only intensified for analog circuits. Reuse, fo... » read more
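
One concrete reason migration amounts to redesign: a PLL's lock behavior depends on loop parameters such as VCO gain and charge-pump current, which shift with the process, so lock time and stability have to be re-verified at every corner instead of reusing the old design. A toy first-order behavioral model makes the corner dependence visible; the lumped loop gain and corner values below are illustrative assumptions, not a real PDK.

    # First-order linearized PLL phase model: each step the loop corrects
    # a fraction k_loop*dt of the remaining phase error, where k_loop
    # lumps together VCO gain, charge-pump current, and divider ratio.
    def lock_time(k_loop, dt=1e-9, tol=0.01, phase_err=1.0, max_steps=100_000):
        """Time until the initial phase error decays below tol (1%)."""
        for step in range(max_steps):
            if abs(phase_err) < tol:
                return step * dt
            phase_err -= k_loop * dt * phase_err
        return float("inf")

    # Hypothetical process corners: the lumped loop gain (rad/s) shifts
    # with device parameters, so lock time must be re-checked at each one.
    CORNERS = {"slow": 0.7e7, "typical": 1.0e7, "fast": 1.4e7}

    for corner, k_loop in CORNERS.items():
        print(f"{corner:8s} corner: lock time ~ {lock_time(k_loop) * 1e6:.2f} us")

In a real flow this re-verification runs in SPICE across many corners and Monte Carlo samples, which is why analog "migration" to a new node is effectively a redesign.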

System Bits: May 22


AI disruptions and benefits in the workplace
According to Stanford University researchers, artificial intelligence offers both promise and peril as it revolutionizes the workplace, the economy, and personal lives. Visiting scholar James Timbie of the Hoover Institution, who studies artificial intelligence and other technologies, said that in the workplace of tomorrow, many routine jobs now p... » read more
