Scaling AI/ML Training Performance With HBM2E Memory


In my April SemiEngineering Low Power-High Performance blog, I wrote: “Today, AI/ML neural network training models can exceed 10 billion parameters, soon it will be over 100 billion.” “Soon” didn’t take long to arrive. At the end of May, OpenAI unveiled its new 175-billion-parameter GPT-3 language model. This represented a more than 100X jump over the size of GPT-2’s 1.5 billion parameters...

July’19 Startup Funding


During the month of July, 21 technology startups took in mega-rounds of $100 million or more. Those companies together received more than $7.5 billion. On the other end of the financing spectrum, dozens of startups got seed funding or a Series A round. The dollar amounts were much smaller. Still, they are the beating heart of entrepreneurship around the world. It also was a month when som...

Systems Bits: Feb. 27


Prepare to prevent malicious AI use

According to the University of Cambridge, 26 experts on the security implications of emerging technologies have jointly authored a ground-breaking report sounding the alarm about the potential malicious use of artificial intelligence (AI) by rogue states, criminals, and terrorists. The report forecasts rapid growth in cyber-crime and the misuse of...