TSMC: King Of Data Center AI


Large language models (LLMs such as ChatGPT) are driving the rapid expansion of data center AI capacity and performance. More capable LLMs increase demand and require more compute. AI data centers need GPUs/AI accelerators, switches, CPUs, storage, and DRAM. AI data centers now consume about half of all semiconductors, and that share will be much higher by 2030. TSMC has essentially 1... » read more

Enhancing AI Datacenter PSUs With Hybrid-Si, SiC, And GaN Power Devices


The rapid growth of artificial intelligence (AI) is driving unprecedented demand for processing power in data centers, resulting in a surge in power demand at the rack level. With existing data center rack sizes, the challenge is to deliver more power and higher efficiency within the same physical footprint, while also managing cost and cooling. To address this, Infineon has developed a range of hybrid powe... » read more