
Analyzing Electro-Photonic Systems

Systems that contain both electrical and photonic components have to be designed as a single system, but modeling issues are slowing adoption.


The design and analysis of electro-optical systems is pushing tools into the complex multi-physics domain, making it challenging to create models that execute at reasonable cost — especially when they include thermal impacts.

The lack of models and standards is also slowing the progression of the technology. Still, the advantages are worth it to those willing to make the investment.

Traditionally, optical components were manufactured independently of each other and of the electronics. Thin-film fabrication technologies later enabled integrated electro-optical devices. Today, foundries offer monolithic electro-optical PDKs and fabrication technology in which electrical and photonic devices are closely coupled, and the ability to simulate electrical and optical behavior at the same time has become a requirement.

While some optical devices can be analyzed in isolation, others cannot. “Co-design and analysis become critical for photonic integration, especially where you have a component like a phase shifter,” says Jigesh Patel, technical marketing manager for photonic IC & system tools at Synopsys. “The phase shifter component has a waveguide and an electrode right on top of it. How do you model those? How do you separate out the photonic part from the electronic part? If there is a load on the electrode, it affects the performance of the waveguide. If there is a load on the source side, it affects the optical performance. This is where you basically need electro-optical co-simulation. People were taking an approximate electrical equivalent model and integrating that with other electrical circuits. That’s no longer a sound approach.”

There are secondary connections between electrical and optical, as well, that have to be taken into account. “Doing the physics, the extraction, one piece at a time, and stitching components together is not good enough for the speeds we’re talking about in photonics,” says Gilles Lamant, distinguished engineer at Cadence. “There is cross-coupling between the components within the chip, and you actually have to perform EM extraction, looking at multiple fabrics at the same time. This is also true for thermal. While the physics and the basic engines are known, what is changing is the need to go across multiple fabrics.”

Temperature is an important element of the analysis. “Even a fraction of a degree change in a photonic demodulator can have a big impact on the performance,” says James Pond, director of product management at Ansys. “It can drive it completely out of its operating point. It’s not the absolute temperature that matters. It is the differential temperature between different arms of an interferometer.”

Manufacturing variability also takes a step up in sensitivity. “A photonic signal traveling in a photonic integrated circuit is actually an electromagnetic wave traveling with a very high frequency,” says Twan Korthorst, director of photonic solutions at Synopsys. “It is very sensitive to small disturbances, such as how it has been manufactured. A digital IC or digital switch fabric still functions when fabrication tolerances are plus or minus a few percent. A photonic waveguide or photonic filter is very sensitive to fabrication variations.”

Many photonic systems have to be brought into a desired operating point. “There is quite a bit of control electronics that is required for silicon photonics just to keep everything operating properly, to maintain the thermal heaters and feedback loops, to keep everything going,” says Ansys’ Pond. “Once you start stacking these devices together, where some of them are generating a lot of heat, you’re going to be modifying the temperature across an integrated photonic interposer. The modeling challenge is to be able to understand what is happening to the temperature of the entire 3D-IC. And from a simulation perspective, you need to be able to calculate the operating temperature in different configurations and to determine the local temperature of every individual photonic component. Then we can study the impact of that on the performance of the photonic circuit.”

The modeling challenge
There are several factors that make modeling of photonic systems much more complex than equivalent models in the electrical domain.

“From a modeling point of view, the difficulty comes from the fundamental difference between what an electrical signal is and what an optical signal is,” Synopsys’ Patel explains. “The electrical signal is characterized by a current or voltage, and has a frequency associated with it. In this high-speed age, electrical signal speeds are several tens of GHz. On the other hand, on the optical side, the optical signal can have multiple wavelengths, not just a single wavelength. If you convert those wavelengths into frequency, it is on the order of 200 THz. That frequency creates all sorts of issues. The Nyquist theorem says that if you want to model or reproduce a signal, you at least have to sample it at twice the frequency content of the signal.”
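Patel’s Nyquist argument can be made concrete with back-of-the-envelope numbers. The sketch below (all values illustrative) compares sampling the optical carrier directly against sampling only the modulation envelope, which is why photonic simulators typically work with a complex baseband envelope rather than the raw carrier:

```python
# Sketch (illustrative numbers): why direct time-domain sampling of an
# optical carrier is impractical, and why simulators use a complex
# baseband envelope instead.

f_optical = 200e12   # ~200 THz optical carrier (1550 nm band)
f_elec = 50e9        # tens-of-GHz electrical modulation content

nyquist_optical = 2 * f_optical    # samples/s to resolve the carrier itself
nyquist_envelope = 2 * f_elec      # samples/s for the modulation envelope

# Samples needed to cover 1 microsecond of simulated time:
T = 1e-6
samples_carrier = nyquist_optical * T    # 4e8 samples, per wavelength channel
samples_envelope = nyquist_envelope * T  # 1e5 samples, a 4,000x reduction
```

The 4,000x gap is per channel; a 64-channel WDM link multiplies the carrier-sampling cost again, which is why envelope-based modeling is the norm.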

There are a number of elements involved. “It can be a very broadband signal,” Patel says. “It can be multi-channel, like wavelength division multiplexing (WDM), which can involve 64 or 128 lasers. You do not see that kind of thing in an electrical signal. The signal itself is different. An electrical signal is a real signal. On the other hand, the optical signal is a complex signal. There is a real part and an imaginary part.”

In addition, there is no equivalent of polarization in the electrical domain. “An electrical signal is baseband,” he says. “It is just a signal. There is no concept of polarization. With light there is at least X and Y polarization, and depending on the kind of device, that device may not be circularly symmetric. It can be any shape, and so different polarizations of the light will have different velocities as they propagate in a photonic device. At the output of the device, you have to make sure that the polarization and the dispersion between the polarizations are accurately accounted for. Also, there is something called a transverse mode profile. This is the electromagnetic pattern of radiation in the plane perpendicular to the direction of propagation. A laser has a very small aperture from which light is emitted. If it goes into a very small core, the transverse mode doesn’t come into the picture that much, but in a photonic integrated circuit, device size may be larger than fiber port size. As a result, it may excite many more modes, more transverse mode profiles.”
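One common way to capture the two polarizations Patel describes is a Jones vector: a two-element complex amplitude for the X and Y components. The sketch below uses illustrative effective indices (not from any PDK) to show how the two polarizations accumulate different phases through a birefringent element, changing the output polarization state:

```python
import numpy as np

# Sketch: a Jones vector [Ex, Ey] passing through a birefringent element
# whose X and Y modes see different effective indices. All numbers are
# illustrative, not from any real device or PDK.

lam = 1.55e-6          # wavelength (m)
L = 1e-3               # device length (m)
n_x, n_y = 2.40, 2.35  # effective indices for X and Y polarizations

jones_in = np.array([1.0, 1.0]) / np.sqrt(2)   # 45-degree linear polarization

# Each polarization accumulates its own propagation phase; the difference
# (the retardation) is what rotates or distorts the polarization state.
phase = 2 * np.pi * np.array([n_x, n_y]) * L / lam
jones_out = jones_in * np.exp(1j * phase)

retardation = phase[0] - phase[1]   # differential phase between X and Y
```

A real simulator also tracks wavelength dependence of `n_x` and `n_y` (polarization-mode dispersion), but the two-element complex state is the core bookkeeping an electrical baseband signal simply does not have.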

There are other complications, as well. “Light also has a tendency to bounce back,” says Cadence’s Lamant. “So you have forward propagation of the light, but there is a certain amount of light that will bounce back and you do need to model that.”

This is in addition to the external factors that affect performance of the components. Temperature and external electric fields can have a significant influence on the propagation constants of modes in waveguides. Temperature changes can cause materials to expand or contract, while electric fields distort the internal shapes of crystalline materials.
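The thermal sensitivity described above can be quantified with the standard thermo-optic phase-shift relation. The sketch below uses a textbook thermo-optic coefficient for silicon and an assumed geometry; the point is the scale, not the exact values:

```python
import math

# Sketch: thermo-optic phase shift in one interferometer arm.
# dn/dT for silicon (~1.8e-4 per K) is a textbook value; the
# geometry is an illustrative assumption.

lam = 1.55e-6      # wavelength (m)
L = 1e-3           # heated waveguide length (m)
dn_dT = 1.8e-4     # thermo-optic coefficient of silicon (1/K)

def phase_shift(delta_T):
    """Differential phase (rad) from a temperature offset delta_T (K)."""
    return (2 * math.pi / lam) * dn_dT * delta_T * L

# Temperature difference between arms that flips a Mach-Zehnder from
# constructive to destructive interference (a pi phase shift):
dT_pi = math.pi / ((2 * math.pi / lam) * dn_dT * L)   # ~4.3 K
```

A differential of only ~4 K between arms inverts the output, which is why fraction-of-a-degree gradients across an interposer matter and why heater feedback loops are ubiquitous.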


Fig. 1: Optical signals, photonic models, and means of analysis. Source: Synopsys

The time constants involved almost defy comprehension. Optical signals are in the 200 THz range and electrical signals in the tens of gigahertz, but to study thermal impacts, analysis may have to cover seconds. This suggests that reduced-order models will be necessary for some types of analysis.

“The optical part definitely needs a lot of multi-physical simulations,” says Andy Heinig, group leader for advanced system integration and department head for efficient electronics at Fraunhofer IIS’ Engineering of Adaptive Systems Division. “Until now, simulations could be carried out separately for the optical part. But after the multi-physical simulations, very accurate behavioral models must be derived and/or developed. Later in the development process, co-simulation between the electrical and the optical part (based on the behavioral models) has to be done. In the future, better automatic derivation of the behavioral models should be developed. Also, models that can be reduced for different use cases are necessary in the future to support better simulation times.”

Using Verilog-A
An approach used in the past was to create behavioral models for the photonic components in languages like Verilog-A. “Electrical simulators do not know how to handle photonic devices,” says Patel. “And then, specifically for photonic integrated circuits, components are in close proximity to each other, so you will have reflections going through multiple parts, like standing waves. These differences make it tricky to do photonic simulation using electrical languages, or inside an electrical simulator.”

Lamant agrees. “If you have a methodology where you use a mathematical model for propagating signals, such as Verilog-A, propagating backward requires a lot of extra equations. When you go with a more optical representation, where you represent the optical connection as a waveguide, it is more like an S-parameter and is bi-directional. This is another reason why it is important to pull together a full system simulation.”
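The bidirectional behavior Lamant describes can be sketched with transfer matrices, which carry the forward and backward wave amplitudes together. Cascading two partially reflecting interfaces then reproduces a Fabry-Perot resonance, something a purely feed-forward signal-flow model cannot capture. The reflectivity and the sign conventions below are illustrative, not from any vendor tool:

```python
import numpy as np

# Sketch: each 2x2 transfer matrix relates [forward, backward] amplitudes
# on the left of an element to those on its right, so back-reflections are
# modeled for free. Two partial reflectors around a phase section form a
# Fabry-Perot cavity. Illustrative values and sign conventions.

r = 0.5                      # field reflectivity of each interface
t = np.sqrt(1 - r**2)        # lossless interface transmission

def interface():
    return np.array([[1, r], [r, 1]]) / t

def propagation(phi):
    return np.array([[np.exp(-1j * phi), 0], [0, np.exp(1j * phi)]])

def transmission(phi):
    # [a_left, b_left] = M @ [a_right, b_right]; with no input from the
    # right (b_right = 0), the total transmission is 1 / M[0, 0].
    M = interface() @ propagation(phi) @ interface()
    return abs(1 / M[0, 0]) ** 2

on_res = transmission(np.pi / 2)   # on resonance: full transmission, 1.0
off_res = transmission(0.0)        # off resonance: 0.36 for r = 0.5
```

That the resonance sits at a phase of π/2 here is an artifact of this particular sign convention; the physics point is that the resonance exists at all, which requires the backward wave to be part of the state.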

So what is the right approach? “It’s not really a language issue,” says Patel. “C++ or Matlab — these are languages. It is how you use those languages to characterize the signal. Our models are written in C++. But how do you define optical signals? Because the optical signal is complex, exists in arrays, and requires matrix operations, Verilog-A is not a very friendly choice for those kinds of operations. The trick is basically how you combine electrical signals, because in the photonic integrated circuit you will have electrical and optical signals. It’s how you combine those, how you account for the multi-channel nature, how you account for polarization, and so on. I have not seen the attributes of the optical signal properly translated into Verilog-A. The Verilog-A language is not really suitable for handling complex numbers and multi-element signals and operations, but that doesn’t mean you can’t do it. There may be a way to characterize all of those things, but the question is at what cost.”

It may be acceptable if you are willing to give something up. “You will find people that are using a Verilog-A approach,” says Synopsys’ Korthorst. “It is used to describe photonic devices in an abstract way. But those models really miss the ability to cover reflections or multiple wavelengths, or wavelength mixing or dispersion, etc.”

At other times, a different form of analysis can provide the required answers. “You can characterize many components in terms of a photonic S-matrix, and that’s faster than analyzing the same component for every time-step of the incoming signal,” says Patel. “There are more efficient ways, and if you can identify within your photonic system certain parts that can be characterized as linear time-invariant, for example, then you can use a much faster method of analysis.”
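As an illustration of the frequency-domain shortcut Patel describes, the sketch below characterizes a hypothetical lossless Mach-Zehnder interferometer by its wavelength-dependent transmission once, then evaluates an entire WDM grid in a single vectorized call rather than time-stepping every input sample. The group index and geometry are made-up illustrative values:

```python
import numpy as np

# Sketch: frequency-domain evaluation of a linear time-invariant photonic
# component. An ideal MZI with arm-length imbalance dL is characterized by
# its transmission versus wavelength; a whole WDM channel grid is then one
# vectorized evaluation. Illustrative values only.

n_g = 4.2        # group index (assumed)
dL = 100e-6      # arm-length imbalance (m, assumed)

def mzi_transmission(lam):
    """Power transmission of an ideal, lossless MZI at wavelength(s) lam."""
    dphi = 2 * np.pi * n_g * dL / lam
    return np.cos(dphi / 2) ** 2

# 64-channel WDM grid around 1550 nm with 0.8 nm spacing:
channels = 1550e-9 + 0.8e-9 * np.arange(64)
per_channel_T = mzi_transmission(channels)   # all channels in one shot
```

This only works for the linear time-invariant parts of the system; nonlinear or actively tuned components still need time-domain treatment, which is exactly the partitioning decision Patel is pointing at.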

Standard models
While significant progress has been made on the creation of accurate models, this is also the limiter today. “If there are photonic simulators, and they are able to cope with these challenges, and if there are solutions for electro-optical co-simulation, how do you actually enable those capabilities for a variety of foundries and/or tools?” asks Korthorst. “The standardization of models is still very, very early days. So that is a challenge for the industry, or for the vendors and the foundries. There is no standard that you can just pick up and work on.”

Others agree. “The lack of standardization is an obstacle for everybody in photonics,” says Patel. “Even though foundries have PDKs, people still need custom components.”

Many times, a PDK is delivered as an encrypted model, so even simple customizations require foundry intervention. And many customizations are needed. Without detailed knowledge of how a foundry has implemented the electrical mapping of optical signal attributes, designers cannot add custom components without help from the foundry. Even if such help is free as part of the foundry agreement, it nonetheless costs time and productivity.

So how does the industry move forward? “In 2012, Si2 had a silicon photonics activity,” says Korthorst. “But it was way too early. Today looks like a good time to revive that activity. As an industry, it makes sense to create some standards, because a foundry doesn’t want to create models for tool A, B, and C, and vendors don’t like to support standard A, B, and C.”

Conclusion
Electro-optical systems have made huge strides over the past decade, and their applications are extending into multiple areas. While consolidation has been happening within the industry, it has not yet come together enough for common models to be created. Until that happens, every design will be a full-custom challenge, and that will limit growth.

Can the industry agree on what needs to be done? Probably, but standards take time, especially if there is no de facto leader. Every minor issue will cause huge debate. Design by committee tends to be slow and produces inefficient results.

Related
Chipmakers Getting Serious About Integrated Photonics
This technology could enable the next wave of Moore’s Law. What’s needed to make that happen?
SemiEngineering’s Photonics Knowledge Center
Top stories, white papers, videos and blogs on photonics.
Developers Turn To Analog For Neural Nets
Replacing digital with analog circuits and photonics can improve performance and power, but it’s not that simple.
Testing Silicon Photonics In Production
Much work still needs to be done to reduce costs and improve speed, and that requires an entire ecosystem.
