A Five-Year Challenge Roadmap for Photonics-Based Computing

Silicon photonics has been proving its worth in telco and communications, but there is a much brighter opportunity: photonics-based computing. The energy efficiency and data movement potential are promising, especially for increasingly data-laden analytics and AI/ML applications, but the road to a diverse hardware ecosystem for compute is still long.

We are likely at least five years to a decade away from silicon photonics-based computing, but unlike other emerging technologies that require a fundamental rethink of computational methods (quantum, for instance), the challenges are surmountable and can be addressed by iterating on existing technology, at least on the hardware side.

If we are to reach a viable new ecosystem of photonics-based computing vendors and use cases, there are four bumps along the roadmap: cost, power consumption, reliability and robustness, and scalability. At the tail end we can start to wonder what the programming landscape looks like, but in the case of photonics, the hardware has to come first.

The most important challenges ahead for creating a viable compute market for silicon photonics are both cost and technology driven, and unfortunately, many of the barriers fall into “chicken and egg” scenarios where it is impossible to have one element without the other and it is not clear which must come first. Some commercial entity has to invest in a prototype, but to do so, there needs to be a clear demonstration that such a prototype is worth the investment. We have seen this play out over the decades, but for now, only a small handful of startups are dipping a toe into photonics-based computing (as opposed to networking).

Cost aside, the technical challenges ahead for photonics-based computing include power overhead, reliability and robustness, and scalability. Each is somewhat interlinked with the others, and all represent difficult but not insurmountable barriers.

While power efficiency is one of the lures of photonics, at least for communications today, the power footprint for computing will take some serious rethinking. For large applications, including AI/ML and large-scale analytics, power dissipation across many components is expected to be high: an order of magnitude higher than in current systems, according to Sudeep Pasricha, Professor and Chair of Engineering at Colorado State University.

Pasricha and his team are directly focused on future opportunities for photonics-based computing and have recently delved into how barriers, including power consumption, can be knocked down on the path to scalable photonics computing. He says that while power consumption is one of the most critical technical barriers now, research on new devices that minimize the power footprint has been promising.

“If we are able to replace communications, one of the biggest bottlenecks in all forms of computing, with photonics, there is a huge opportunity to push the boundaries of performance and energy efficiency across platforms. Communications is the starting point, but the grand vision is to understand where it fits with computation.”

Reliability and robustness in the face of faults and noise are equally important. Photonics components must operate in an error-free manner and be resilient to the noise inherent in the traversal and coupling of light, which takes a different set of technical chops. The current way of thinking about photonics-based computing requires a standard CMOS chip substrate. “A lot of the fabricated chips today have variations because of semiconductor process imperfections. We’re trying to operate in an error-free manner in the presence of such chip-level imperfections,” Pasricha says. While some of the normal ways we deal with error correction in hardware and software can help, the fundamental physics interactions will take specialized work to ensure resilience.

Power and resilience combine to make scalability more challenging. As the complexity of an application scales on these new photonics substrates, all of the challenges related to power and reliability are compounded by more devices and greater complexity. “As we innovate at the device, circuit, and architecture level, the expected scalability will be more feasible, but at the same time, the workloads are also going to be more complex and we have to keep pace with that,” Pasricha explains. He adds that current work is focused on making the chip substrates themselves more scalable as well.

These challenges are all related to hardware, which means the next five years will be devoted to getting applications to run efficiently and scale. That means a huge software lift, one that will take more than a “magic compiler” if the benefits are to be realized. “Custom software development will require close interaction with the hardware substrate. It’s still early days, and it’s back to the chicken-and-egg problem,” says Pasricha.

All of this is five years out, if not more, at least in terms of a commercial market for silicon photonics-based computing. Pasricha says the first products will likely be accelerators or co-processors focused on large-scale analytics and AI/ML. Until then, we’ll continue following progress on the four major technical barriers, which we expect will become the work not only of academics but also of companies that already have a pipeline, including those currently focused on photonics communications.

