
Using ML In EDA


Machine learning is becoming essential for designing chips due to the growing volume of data stemming from increasing density and complexity. Nick Ni, director of product marketing for AI at Xilinx, examines why machine learning is gaining traction at advanced nodes, where it’s being used today and how it will be used in the future, how quality of results compares with and without ML, and what... » read more

Tradeoffs Between Edge Vs. Cloud


Increasing amounts of processing are being done on the edge, but how the balance between what's computed in the cloud and what's computed at the edge will change remains unclear. The answer may depend as much on the value of data and other commercial considerations as on technical limitations. The pendulum has been swinging from doing all processing in the cloud to doing increasing amounts of processing at the ... » read more

Education Vs. Training


While writing my recent articles on the subject of training, a number of people pointed out that training and education are not the same thing. In a very simple sense, training is defined to be learning a skill or behavior that enables you to 'do' something, whereas education is the acquisition of knowledge from study or training. These definitions leave me cold and, in my mind, miss a very ... » read more

What Is Intern Reading Club?


As the summer winds down, interns are busy completing their assigned projects and preparing their end-of-summer presentations. These presentations have been a rite of passage for interns on the Pointwise team for many years and give each intern a chance to show off what they learned and accomplished. And the rest of the team gets to hear all the details of what they've been working on. Anothe... » read more

Continuous Education For Engineers


Continuous education is essential for engineers, but many companies don't recognize the value or are unwilling to provide the necessary resources. This should be a line of questioning before every new hire decides where they want to work, because it not only affects their future career, but also impacts the value they can provide to that company during the course of the... » read more

RaPiD: AI Accelerator for Ultra-low Precision Training and Inference


Abstract—"The growing prevalence and computational demands of Artificial Intelligence (AI) workloads has led to widespread use of hardware accelerators in their execution. Scaling the performance of AI accelerators across generations is pivotal to their success in commercial deployments. The intrinsic error-resilient nature of AI workloads present a unique opportunity for performance/energy i... » read more

Challenges Of Edge AI Inference


Bringing convolutional neural networks (CNNs) to your industry—whether it be medical imaging, robotics, or some other vision application entirely—has the potential to enable new functionalities and reduce the compute requirements for existing workloads. This is because a single CNN can replace more computationally expensive image processing, denoising, and object detection algorithms. Howev... » read more
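
As a rough sketch of why one network can stand in for several hand-built stages, the hypothetical snippet below defines a tiny residual denoising CNN in PyTorch. The TinyDenoiser class, its layer sizes, and the random input are all illustrative rather than taken from the article, and the model is untrained, so it only demonstrates the structure and tensor shapes.

    import torch
    import torch.nn as nn

    class TinyDenoiser(nn.Module):
        # A few stacked convolutions predict the noise in an image; subtracting
        # that prediction from the input (a residual connection) yields the
        # denoised estimate, the same idea used by DnCNN-style denoisers.
        def __init__(self, channels=1, features=16):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(features, channels, 3, padding=1),
            )

        def forward(self, x):
            return x - self.body(x)

    net = TinyDenoiser()
    noisy = torch.randn(1, 1, 64, 64)      # batch of one 64x64 grayscale image
    print(net(noisy).shape)                # torch.Size([1, 1, 64, 64])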

Architectural Considerations For AI


Custom chips, labeled as artificial intelligence (AI) or machine learning (ML), are appearing on a weekly basis, each claiming to be 10X faster than existing devices or to consume 1/10 the power. Whether that is enough to dethrone existing architectures, such as GPUs and FPGAs, or whether they will survive alongside those architectures isn't clear yet. The problem, or the opportunity, is that t... » read more

Applications, Challenges For Using AI In Fabs


Experts at the Table: Semiconductor Engineering sat down to discuss chip scaling, transistors, new architectures, and packaging with Jerry Chen, head of global business development for manufacturing & industrials at Nvidia; David Fried, vice president of computational products at Lam Research; Mark Shirey, vice president of marketing and applications at KLA; and Aki Fujimura, CEO of D2S. Wh... » read more

HBM2E Raises The Bar For AI/ML Training


The largest AI/ML neural network training models now exceed an enormous 100 billion parameters. With growth running at a 10X annual pace over the last decade, we’re headed toward trillion-parameter models in the not-too-distant future. Given the tremendous value that can be derived from AI/ML (it is mission critical to five of the six top market cap companies in the world), there has been ... » read more
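
The growth claim is simple to sanity-check with back-of-the-envelope arithmetic. The short Python loop below starts from the roughly 100-billion-parameter figure cited above and applies the 10X annual rate; the year offsets are purely illustrative, not projections from the post.

    # Starting point (~100B parameters) and the 10X/year rate come from the post above.
    params = 100e9
    for year in range(1, 4):
        params *= 10
        print(f"year +{year}: ~{params:.0e} parameters")
    # A single year at 10X already crosses the trillion-parameter mark.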
