Photonics: The Former And Future Solution

Twenty-five years ago, photonics was supposed to be the future of high technology. Has that future finally arrived?


Experts at the Table: Semiconductor Engineering sat down to talk about where photonics is in the hype cycle and its secure role in data centers, with James Pond, fellow at Ansys; Gilles Lamant, distinguished engineer at Cadence; and Mitch Heins, business development manager for photonic solutions at Synopsys. What follows are excerpts of that conversation.

[L-R]: Ansys’ Pond, Cadence’s Lamant, Synopsys’ Heins.

SE: Twenty-five years ago, there was a lot of excitement about photonics. Are we in another hype cycle, or will photonics truly become mainstream this time?

Heins: We’ve all been using photonics for the last 25 years. It’s in all long-haul communications, including transoceanic communications. It’s in all the data centers. The racks are fully connected to each other through integrated photonics. In that regard, it’s a lot different than it was 25 years ago. These are production chips and production systems that have been in place for quite some time. The biggest changes in the last 5 to 7 years are the push to silicon photonics and integrating a lot more onto a single die. That’s where things really have shifted quite a bit. The promise there is that you’re leveraging all the CMOS manufacturing infrastructure that’s been in place. It’s depreciated. It’s well understood. We’re not plowing tons of new ground trying to manufacture things. There are challenges, but it’s nothing like it was 25 years ago. In that regard, we’re in good shape, and there are a lot of promising technology areas you can apply this to, so there isn’t as much hype as people might think.

Lamant: I agree. Photonics in general has been around for a long time. For example, companies like Infinera have been doing long-haul datacom-type projects. What is changing is that we are seeing a lot more CPU-, GPU-, or XPU-type communications appearing in the data center, so photonics is moving closer in, instead of being used only on long links, like transatlantic cables. This is driven by multiple factors. It’s the energy efficiency and the ability to swap things for datacom. That’s where AI comes in. There is a tremendous need for bandwidth in high-performance computing (HPC). It doesn’t have to be photonic, but to get the bandwidth from the memory to the AI processing — whether it’s a GPU from NVIDIA or AMD, or whether it’s a photonic processor — the amount of data that needs to go back and forth is the real challenge. Most companies are calling that little bit of communication the ‘photonic engine.’ We see a lot of companies focusing on the photonic engine, which is a dedicated interface for doing the data exchange. This is definitely not a reversible trend. Most of those things leverage existing fabs. From a cost perspective, 25 years ago it was extremely hard to get something that was financially viable. Today, from both a cost perspective and a bandwidth perspective, we’re close to where we need to be.

Photonics for quantum computing may be a little bit hyped for now, but there is no turning back on the photonic engine and the interconnects, whether it’s inside the rack, between racks, or between two countries. We’ve reached the point where even within the rack, there are theoretical limits to what you can do with copper. It’s extremely hard to get past that, so I don’t see it going back, at least in that very focused area of photonics — basically, HPC. But even in HPC, it’s not the computing itself. It’s the data exchange between the different units, and that part is settled. There is no turning back on that.

Pond: I’ve been working in photonics since 2000, so I lived through the first wave. My first job was at a startup that was trying to commercialize photonic crystal technology, which was heavily hyped at the time. Photonics was used for long-haul and metro communication, but people wanted to try to use it for integrated photonics. The company I was at competed with Luxtera, and we were trying to build a platform for integrated photonics. In hindsight, it was completely unrealistic. After the telecom bubble — the crash in 2003 or so — that kind of work went back into academia. Then, around 2009 to 2012, there was the big boom where people realized we could make high-speed modulators in silicon, and they went back to simpler waveguides — the more tried-and-true photonic technology.

That was a really exciting time. It suddenly felt like we could actually do this. There are products out there. They’re being used. The energy efficiency is there, the cost is there. We can manufacture things that we could never do in the past. You can make a low enough loss waveguide, and so on. However, I agree with Mitch and Gilles that it always comes back to datacoms. That’s where photonics is creeping in, with shorter distances and higher speeds, due to the fundamental limitations of copper. There are a whole bunch of other promising applications around that, such as sensing, and the possibility of quantum information technologies. There may be some compute for AI and other approaches, but all of that is still, ‘Who knows?’ What’s very real is the datacom and the interconnect, and that’s not going away.

Heins: Bandwidth density and power are paramount. If you want to go to software-defined networking and software-defined data centers, latency is also an important consideration. If you want to have all your memory sitting in one corner of the data center, and the processors in another, you’re not going to do that with electronics. The latency would create significant problems. But with photonics you can. The power it takes to get a signal to the other side of the data center is significantly different with photonics. It’s one of the few technologies where you get more speed for less power, and at the same time even more density and more bandwidth, because you can run multiple wavelengths simultaneously on the same piece of fiber.

SE: What are the problems that photonics can solve that electronics can’t?

Heins: Latency is a huge one. With electronics, you must keep regenerating that signal over and over to get a good clean signal at whatever speed you want to run. For example, if you want to run at 100 GHz or the like, you’re reconditioning the signal every x number of meters. With photonics, once you’ve modulated the signal onto light, it’s zipping around the data center in one shot, at 1,000X the speed of the electronics, so latency is certainly one big distinction. Bandwidth is another, because you can run multiple wavelengths. A lot of people now are looking at WDM and DWDM. With electronics, you can either increase the number of channels or increase the speed of each channel, and both get more difficult as you go. By contrast, photonics offers the ability to run multiple wavelengths on the same physical connection, which lets you add many more channels in the same allotted space, giving you much more bandwidth.
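The WDM scaling Heins describes comes down to simple multiplication: aggregate bandwidth is fibers × wavelengths per fiber × data rate per wavelength. A minimal sketch, with purely illustrative channel counts and rates (none of these figures come from the panel):

```python
# Illustrative only: the fiber counts, wavelength counts, and per-wavelength
# rates below are hypothetical numbers, not figures from the discussion.

def aggregate_gbps(fibers: int, wavelengths_per_fiber: int,
                   gbps_per_wavelength: float) -> float:
    """Total throughput of a WDM link: each fiber carries several
    wavelengths at once, and each wavelength is an independent channel."""
    return fibers * wavelengths_per_fiber * gbps_per_wavelength

# One fiber, one wavelength -- the electrical way of scaling is to push
# this single channel's symbol rate ever higher, which gets harder.
single = aggregate_gbps(fibers=1, wavelengths_per_fiber=1,
                        gbps_per_wavelength=100)

# The same fiber carrying 8 DWDM wavelengths: 8x the bandwidth with no
# additional physical connections.
dwdm = aggregate_gbps(fibers=1, wavelengths_per_fiber=8,
                      gbps_per_wavelength=100)

print(single, dwdm)  # 100.0 vs. 800.0 Gb/s over the same fiber
```

The point is that wavelengths multiply capacity within the same physical footprint, which is the density advantage referred to above.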

Pond: The really difficult part is first converting your signal — modulating it onto light. But once you’ve done that, you can go almost any distance, because the losses in an optical fiber are so low that you can go anywhere from centimeters to kilometers. That’s the real advantage. The other thing to remember is that the underlying carrier frequency is around 200THz. In terms of bandwidth density, there’s a lot of room to do a lot of things, to send signals at extremely high speeds, when your carrier is 200THz. Finally, you’re facing off against the fundamental limitation of copper, which is that as you go to higher and higher speeds, you get increasing losses that you just can’t support. There’s no doubt that data communications over anything more than a few millimeters is always going to be better with the higher speed and lower power of photonics.

As for the other areas where photonics can come into play over electronics, sensing is one. There are different things you can do in photonics that don’t make any sense in electronics. One of the challenges we fight in photonics is that the light is sensitive to almost anything, which is frustrating in data communication sometimes, but it’s great for sensing, because it is incredibly sensitive to very small changes in temperature or refractive index, so there’s an opportunity there.

As for quantum computing, time will tell. There are many different avenues people are pursuing for quantum computing, and photonics is a very promising one. It’s hard for anyone to say at this point which one of those is going to win, or maybe they will all win and have different applications for certain quantum calculations. The other application that is quite interesting is that certain calculations for AI — essentially multiply and add, which are exactly what you need for matrix/tensor multiplication — can be done in photonics at much lower power compared to electronics. That’s very exciting, and it needs to be pursued. At the same time, betting against electronics for that type of application has been a bad bet for decades. Will the photonic engines be able to scale and continue to compete for future AI? Time will tell. It’s certainly exciting and needs to be explored, but it’s not proven yet.

Lamant: I agree. The first two things are bandwidth and power. None of us will disagree on where photonics can help versus electronics. As James said, sensing is definitely interesting, as is multiply/add. The challenge is the electronic-to-light conversion, in either direction. That’s the expensive part. Once you’re in the light domain, or if you can start in the light domain — as in lidar, for example — you need to stay in the light domain as long as you can. Thus, we also need to see investment in photonic processing, such as DWDM, where you have multiple frequencies that are very close to each other. You need the ability to stay in the optical domain to filter those frequencies. Having filters, and the ability to generate a comb of light — all of those things need to make progress for photonics to be even more efficient. Today, all the power you gain on a matrix multiplication, you can lose because you need to fetch everything from memory and transform it into a form that can be used by the light. Those conversions are very expensive. Once you’re in the light domain, a lot of things are very cheap energy-wise, but the conversion between the two domains is very expensive and very power-hungry.

Heins: There are a lot of things you can do in the sensing domain that make no sense in electronics. For example, interferometric-type sensing. Essentially, you’re able to use a beam of light that is interacting with the object and coming back, and then you’re interfering it with itself. It gives you a signature of what it is you’re seeing and can be used for applications such as molecular biosensing. These are very interesting approaches that could revolutionize a lot of point-of-care functions. Today, you take a blood test, where they take some of your blood and they send it to a lab, and it has to go through a big machine and lengthy processes. With photonics, you could bring that down to the size of something you could carry in your hand, and basically do the test at the bedside and get an answer. There are people who are already making progress on this. Indeed, as bad as COVID was, it became a great testing ground for some of these ideas. They now have sensors they can use to find COVID at essentially molecular levels. Of course, mil/aero companies are very interested in this kind of technology. They want to know what their soldiers are walking into, if there’s sarin gas or some other nasty element in the air or close by. They want to know it as soon as possible at the lowest levels possible so they have time to react. These are all very cool applications that photonics is well suited for.

Lamant: To give you another example of where we see the advantage of photonics for computing in terms of energy, remember the basic physics example of a prism. If you shine light on the prism, the light decomposes into multiple colors. That’s actually a Fourier transform. You go from a mix of things to seeing, for each frequency present, how much energy it carries — whether one color is brighter than another. If you shine red light, you only get red light out. If you shine white light, how much energy did you spend in the prism doing the conversion? About zero.

If you wanted to do it with a DSP, you would need to spend a lot of energy to do that same Fourier transform, to learn how much energy is in each channel of light. When I go to schools and talk about photonics, I say to them, look at what the light can do. Most of those kids don’t understand yet what a Fourier transform is, but this is a very natural, super simple example for understanding the potential. How you get the light in, and how you measure the intensity of the different streams of light on the other side — that’s expensive. But the prism in the middle, and what happens there, is free, with very little loss of energy if you have a good prism. This is just an easy-to-understand example of the power of what you can do with photonics.
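Lamant’s prism analogy can be made concrete: decomposing a “white” signal — a mix of tones — into its per-frequency energies is exactly what a discrete Fourier transform computes, and what a prism does physically for free. A minimal NumPy sketch (the tone frequencies and amplitudes are arbitrary illustrative choices):

```python
import numpy as np

# Build a "white light" analogue: a mix of three pure tones.
# Frequencies and amplitudes are illustrative values only.
fs = 1000                      # samples per second
t = np.arange(0, 1, 1 / fs)    # one second of signal
signal = (1.00 * np.sin(2 * np.pi * 50 * t)
          + 0.50 * np.sin(2 * np.pi * 120 * t)
          + 0.25 * np.sin(2 * np.pi * 300 * t))

# The DFT plays the role of the prism: it separates the mix into
# per-frequency components so we can read off each one's strength.
spectrum = np.abs(np.fft.rfft(signal)) / (len(t) / 2)
freqs = np.fft.rfftfreq(len(t), d=1 / fs)

# The three strongest bins sit exactly at the three input tones.
top = sorted(freqs[np.argsort(spectrum)[-3:]])
print(top)  # [50.0, 120.0, 300.0]
```

This is the DSP version Lamant contrasts with the prism: the computer burns energy on every multiply-add in the FFT, whereas the glass does the same separation passively.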

Related Reading
Sweeping Changes For Leading-Edge Chip Architectures
Large language models and huge data volumes are prompting innovation at every level.
AI Drives Need For Optical Interconnects In Data Centers
Old protocols are evolving as new ideas emerge and volume of data increases.
