Standards: The Next Step For Silicon Photonics

More data and denser designs are opening the door for photonics.


Testing silicon photonics is becoming more critical and more complicated as the technology moves into new applications ranging from medicine to cryptography, lidar, and quantum computing. But how to test it in a way that is both consistent and predictable is still unresolved.

For the past three decades, photonics largely has been an enabler for high-speed communications, a lucrative market that now tops $1 billion. But with the need for high-speed, low-power communication becoming more pervasive — especially as more data needs to be moved and processed — photonics is gaining traction across a variety of markets. Pushing all that data through thin wires and over longer distances requires power and generates heat, and photonics offers a proven alternative.

As the market opportunity widens, so does the clamor for standards to ensure everything from functionality to reliability. The question now is who will define them: the hundreds of smaller players developing niche applications, or the handful of communications giants that have been working with photonics for years?

“Some 4,300 companies and over a million people in more than 50 countries are now involved in the field,” wrote John Kulick, chair of IEEE’s Photonics Standards Committee. “But the great majority of these companies are small- and medium-sized firms with diverse technical understandings, capabilities and experience. This means there is a great need to reduce technical ambiguity and develop greater clarity in definitions and other terminology.”[1]

Standards could help significantly. “There isn’t a commercialized, turnkey production test solution today,” said Matt Griffin, product manager at Teradyne. “There is simultaneously the challenge of combining electrical and optical measurements into the same test system at the wafer level and the lack of standardization on what test coverage will be needed. But high throughput test is going to be a critical capability to have as volumes continue to increase and these challenges will have to be solved to get there.”

The result is a series of do-it-yourself configurations that are unlikely to scale, according to Tom Daspit, product manager at Siemens Digital Industries Software. Picture a tabletop with fiber optics coming in, fiber optics going out, driven by a PC.

“When I visit my customers, I always ask, ‘How do you test it?’ And they reply, ‘We built our own in-house environment,’” said Daspit. “You feel for them because you know how much work they have to do. On the IC side, we have years of experience building test programs for electrical testers and automatic test pattern generation. There’s none of that on the photonics side. Everybody’s ad hoc.”

That translates into increased testing costs. While testing typically accounts for about 10% of the cost of a conventional semiconductor component, according to some estimates, silicon photonics testing can account for anywhere from 60% to 90% of product cost. Worse, cautioned Frank Chen, director of Applications and Product Management at Bruker, the current situation can put pressure on new companies trying to prove the value of their prototypes. “They want to achieve minimum viable product as quickly as possible, but existing learning cycles are slow and expensive. Unfortunately, they don’t have the production volume to trigger the adoption of novel and more effective solutions.”

That push is already underway. At this year’s Optical Fiber Conference, an entire two-hour panel was dedicated to standardizing photonic IC (PIC) testing, organized by NVIDIA, Advanced Micro Foundry, and Université Laval, with speakers from companies including Ansys, Keysight, and Intel.

“It’s meant to be the first in a series of panels about PIC testing at the wafer-level with the goal of eventually bringing some standardization into the process,” said Ryan Scott, photonics research manager at Keysight, who participated in the panel. “The organizers and speakers hope that by bringing together leading companies from across the PIC ecosystem to describe challenges and propose some potential solutions, it will spur the discussion and move the community towards standardization.”

Indeed, Keysight, CompoundTek, and Wuhan’s National Information Optoelectronics Innovation Center (NOEIC) announced a collaboration in 2020 to establish a globally recognized, standardized approach to PIC layout. In their joint release, they described PICs as offering a multitude of advantages over their discrete-component and bulk-optics counterparts, including significant footprint reduction, improved stability, and lower energy consumption.

Dave Armstrong, co-chairman of the Test TWIG of the Integrated Photonic Systems Roadmap (IPSR) and principal test strategist at Advantest America, pointed to some of the challenges that need to be tackled. “A comprehensive test solution needs four things — optical positioning, optical instruments, a digital interface and control, and an analog thermal control interface and control,” he said. “Most optical instruments have a heater built into them to control the wavelength, so the control loop of the heaters and the device temperature is a fourth domain you have to control in order to do an effective test of these devices.”
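Armstrong’s four domains can be pictured as one coordinated test cell. The sketch below is purely illustrative: the class and method names are hypothetical, with trivial simulated stand-ins for each domain, not any real tester API.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names are hypothetical, not a real tester API.
# It groups the four control domains Armstrong describes (optical
# positioning, optical instruments, digital I/O, analog thermal control)
# behind a single test-cell object, with simulated stand-ins.

@dataclass
class Positioner:                       # optical positioning domain
    aligned: bool = False
    def align(self):                    # stand-in for active fiber alignment
        self.aligned = True

@dataclass
class Instruments:                      # optical instrument domain
    wavelength_nm: float = 1310.0
    def set_wavelength(self, nm): self.wavelength_nm = nm
    def read_power_dbm(self): return -3.0    # simulated power reading

@dataclass
class DigitalIO:                        # digital interface and control
    def run_patterns(self): return "PASS"    # simulated pattern result

@dataclass
class ThermalLoop:                      # analog thermal control domain
    temperature_c: float = 25.0
    def settle(self, target_c): self.temperature_c = target_c

@dataclass
class PicTestCell:
    positioner: Positioner = field(default_factory=Positioner)
    instruments: Instruments = field(default_factory=Instruments)
    digital_io: DigitalIO = field(default_factory=DigitalIO)
    thermal: ThermalLoop = field(default_factory=ThermalLoop)

    def run(self, wavelength_nm, temperature_c):
        self.thermal.settle(temperature_c)     # stabilize the heaters first
        self.positioner.align()                # then actively align the fiber
        self.instruments.set_wavelength(wavelength_nm)
        return {
            "optical_power_dbm": self.instruments.read_power_dbm(),
            "digital_result": self.digital_io.run_patterns(),
        }
```

The point of the structure is the ordering: the thermal loop must settle before alignment and measurement, because heater drift shifts both the device wavelength and the coupling.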

The delicate nature of nanophotonics handling adds to the difficulties of achieving standardization, Daspit said. “When I test electrical die, I have a probe card. I can make good electrical contact once I get that all set up. But if I have optical going in, how do I align that? I’ve got a piece of fiber that’s got to hit that die at the top at a proper angle or the die at the side at a proper angle. If that die is a little bit shifted, I may not line that up. So I now need a way to move that fiberoptic, going in or out, to make sure I get the alignment.”

Alignment is the Achilles’ heel of optical test. “To be able to test the active components, you need to combine electro-optical testing,” said Dimitrios Velenis, group manager, 3DSiP Devices and Components at imec. “One big challenge for the optical part of the test is to make sure you have a good alignment. There are active alignment procedures of the input fibers that take a significant part of test time. You want to do that without fixing the permanent component because you want to reduce the cost of attaching that. So for testing, even if you do it with a multi-fiber array, the landing time is something that you need to consider. You need to test in parallel, to have as many optical inputs and outputs as possible during testing. Once you do that, then you need to access the active components with electrical signaling to test those components.”

Today there are generally up to 16 lanes of optical fibers on a part, but the number of lanes is increasing. Each one needs to be individually aligned, which increases both production and testing time. If the typical testing index time of a wafer prober is a fraction of a second per step, this process can take several seconds per alignment, not counting how many alignments are needed per die or how many sites are tested in parallel.

The difficulty is that silicon photonics requires both horizontal and vertical coupling. Photonic interfaces are typically horizontal, exiting the side of the die, which means they are buried at the wafer level. As a result, the PICs are not accessible prior to dicing.

That means the device needs to be integrated using a grating coupler, essentially a mirror that redirects the signal from the plane of the PIC into the vertical direction, so light can be driven into and out of the photonic component from above. The problem is that today’s grating couplers provide only vertical coupling of signals, while PICs naturally couple horizontally. And even though there is widespread agreement that horizontal coupling should be a standard, no one yet has a widely accepted way to achieve it economically at production scale. The industry is still grappling with how long the alignment procedure takes, how many fibers can be aligned simultaneously, and how many fibers can be connected to the wafer at once.

Current inspection, testing tools
Fortunately, there is some good news. Even though standards are still a work in progress, photonics inspection tools are already on the market, including dedicated offerings from Keysight. In addition, traditional inspection tools, such as X-ray, can be used for certain photonics applications.

However, there’s a domain split when it comes to awareness of what’s possible, according to Christopher Claypool, senior director of R&D for FilmTek Products at Bruker. “Within the semiconductor industry, there seems to be a general awareness that the non-contact techniques of reflectometry and ellipsometry are standard methods for measuring film thickness and refractive index,” he said.

By contrast, the technique of record in the photonics industry is still the prism coupler, even though there’s general recognition that it’s not a viable production method because it’s a contact technique with sub-optimal resolution. “In the photonic industry, there seems to be a real lack of awareness that you can use the non-contact technique of multiple-angle reflectometry as a high-resolution measure of refractive index and film thickness,” Claypool explained.
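To illustrate the principle Claypool describes, the sketch below models s-polarized reflectance of a single thin film at several incidence angles, then grid-searches the thickness that best matches synthetic “measured” data. The film and substrate indices are illustrative assumptions (SiO2-like film on a Si-like substrate), not production values.

```python
import math, cmath

# Minimal single-film reflectance model (s-polarization, real indices),
# illustrating how multiple-angle reflectometry constrains film thickness.
# All optical constants here are illustrative assumptions.

def reflectance(theta0_deg, d_nm, lam_nm=633.0, n0=1.0, n1=1.46, n2=3.88):
    th0 = math.radians(theta0_deg)
    cos0 = math.cos(th0)
    cos1 = cmath.sqrt(1 - (n0 * math.sin(th0) / n1) ** 2)   # Snell's law
    cos2 = cmath.sqrt(1 - (n0 * math.sin(th0) / n2) ** 2)
    r01 = (n0 * cos0 - n1 * cos1) / (n0 * cos0 + n1 * cos1)  # Fresnel, s-pol
    r12 = (n1 * cos1 - n2 * cos2) / (n1 * cos1 + n2 * cos2)
    beta = 2 * math.pi * n1 * d_nm * cos1 / lam_nm           # film phase
    r = (r01 + r12 * cmath.exp(2j * beta)) / (1 + r01 * r12 * cmath.exp(2j * beta))
    return abs(r) ** 2

# Fit: grid-search the thickness that best matches multi-angle data.
angles = [0, 15, 30, 45, 60]
measured = [reflectance(a, d_nm=500.0) for a in angles]      # synthetic target

best_d = min(range(100, 1001, 5),
             key=lambda d: sum((reflectance(a, d) - m) ** 2
                               for a, m in zip(angles, measured)))
print(best_d)   # 500
```

A single angle can leave the thickness ambiguous because the interference pattern is periodic in the film phase; measuring at several angles breaks that degeneracy, which is the core idea behind multiple-angle reflectometry.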

Another inspection and characterization method uses X-rays, which complements optical inspection methods and can measure, for instance, degrees of warpage. “Novel X-ray systems have drastically reduced inspection time, enabling a breakthrough in die-attach process control,” said Chen. “With inspection immediately following die-attach, process drift can be quickly identified and corrected before it generates failed or marginal parts. X-ray metrology provides an intuitive physical understanding of the failure mode and highlights fragile connections most likely to fail.”

Fig. 1: Examples of warpage in silicon photonic packages, detected by X-ray. Bumps highlighted in red are fragile connections most likely to fail. Source: Bruker

Juliette van der Meer, XRD product marketing manager at Bruker, concurs, noting that X-ray may have an ideal place in silicon photonics inspections in the future. “X-ray diffraction has a unique capability to do epitaxial characterization. In a non-destructive way, it gives information about the thickness of the epi layer, the concentration, as well as the strain,” she said. “For III-V materials, we already do this. It’s our bread and butter. So we likely don’t need to develop new hardware tools because the materials are already well-known. It’s the applications that are new. In my estimation, we don’t need new X-ray metrology techniques, but we need to do the application development together with our customers.”

Modularity: Standardization with customization
While marquee companies have joined with niche firms and start-ups to develop standards, the challenge remains — are the economics there?

“Big companies providing production-grade tools for integrated photonics say, ‘Yes, we can make the tools for volume production, we can make them fast enough and precise enough, but is there a market to convince us to build these systems?’” said Sylwester Latkowski, scientific director of the Photonic Integration Technology Center (PITC) and researcher at Eindhoven University of Technology in the Netherlands, which is developing terminology and definitions for photonics EDA and optical distribution frame (ODF). “The smaller companies can take advantage and provide such systems, but they say, ‘We know how to do it. We know there is demand. But a system that we build for one company is typically customized to such a degree that a similar system delivering the same functionality to another customer has to be built from scratch. We cannot do it the same way because our customers are typically unwilling to share fine details of the processes used inside.’”

Latkowski, who is also the chair of IEEE Working Group P3112, said standards and common solutions would dramatically reduce non-recurring engineering costs and shorten time to market. The question now is which is the best way forward.

“We’re actively monitoring the space to see where it makes sense to build a solution,” said Teradyne’s Griffin. “The hope is that as we see the volumes increase, as we see a little bit of a standardization, we can know the tests you’d expect to run in a production environment, and more solutions will be made publicly available.”

Adding to the drive for a universal solution is the difficulty of exchanging information, even between experts within the same company. “If you take a photonics expert and mechanical engineer who are needed for a product, they understand their field, but they cannot easily communicate with one another,” said Latkowski. “Even if they exchange the data, they don’t have tools that can easily cross-operate.”

As one approach to solving the problem, Latkowski and his colleagues published a paper in 2019 proposing ideas that are still in play.[2] Their solution is to create modular systems with application programming interfaces (APIs). That way, there would be enough standardization to keep production profitable, with enough room left for customization and competitive differentiation. The paper describes layout templates that can be used with electronic-photonic design automation (EPDA) tools, covering die orientation, naming and locations of input/output ports, fiducials, and restricted areas. It also describes an Open Test Framework with the APIs needed for a modular structure, as well as the openEPDA formats for data exchange.
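The modular idea can be sketched as a registry that sequences vendor-specific steps behind a common interface. The names below are hypothetical illustrations, not the actual openEPDA or Open Test Framework APIs.

```python
# Hypothetical sketch of a modular, API-driven test flow in the spirit of
# the Open Test Framework idea. All names here are illustrative; the real
# formats and APIs are defined by the cited paper and the openEPDA project.

from typing import Callable, Dict, List

class TestFramework:
    """Registry that lets vendor-specific modules plug into one flow."""

    def __init__(self):
        self._modules: Dict[str, Callable[[dict], dict]] = {}

    def register(self, name: str):
        def wrap(fn):
            self._modules[name] = fn     # standardized registration...
            return fn
        return wrap

    def run_flow(self, flow: List[str], device: dict) -> dict:
        # ...standardized sequencing of customized, vendor-supplied steps.
        return {name: self._modules[name](device) for name in flow}

framework = TestFramework()

@framework.register("align")
def align(device):
    return {"coupled_loss_db": 1.5}      # vendor-specific alignment routine

@framework.register("sweep")
def sweep(device):
    return {"peak_nm": 1550.2}           # vendor-specific wavelength sweep

out = framework.run_flow(["align", "sweep"], {"die": "A1"})
print(out["sweep"]["peak_nm"])   # 1550.2
```

The framework owns only the sequencing and the data format of the results; each vendor keeps its process details inside its own module, which is exactly the split the paper argues would let customized systems share a common, reusable base.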

Silicon photonics is being used in more designs to provide high-throughput data pathways with low heat, but standards for designing, manufacturing, and especially testing this technology have been slow to follow. That’s beginning to change.

“We are finally seeing the industry develop modifications to standardize test equipment in order to support testing of high-density optical engines,” said Manish Mehta, vice president of marketing and operations, Optical Systems Division at Broadcom. “It is critically important over the next three to five years that test equipment developers continue to invest in the space because testability is going to be a key part of the industry’s ability to scale.”

As Advantest’s Armstrong summed it up: “We need standards, but they need to be cost-effective. The test has to be short and sweet, but thorough.”

1. Kulick, J. How Can Photonics Standards Help to Transform Our Digital Future? IEEE Standards Association, June 22, 2021.
2. S. Latkowski, D. Pustakhod, M. Chatzimichailidis, W. Yao and X. J. M. Leijtens, “Open Standards for Automation of Testing of Photonic Integrated Circuits,” in IEEE Journal of Selected Topics in Quantum Electronics, vol. 25, no. 5, pp. 1-8, Sept.-Oct. 2019, Art no. 6100608, doi: 10.1109/JSTQE.2019.2921401.
