Software, not hardware, will drive quantum and neuromorphic computing

In emerging computing fields like quantum computing and neuromorphic computing, hardware typically gets the lion’s share of the attention. You can see the systems and the chips that drive them, talk about qubits and computing that mimics how the human brain works, sort through speeds and feeds, discuss interconnects, power consumption and transistors, and imagine it all getting smaller and denser as later generations roll out.

But as Intel noted this week at its Intel Innovation 2022 show, while hardware is important in bringing quantum and neuromorphic to life, what will drive adoption is the software that comes with it. Systems are nice to look at, but they’re decorations if organizations can’t use them.

That was the message behind some of Intel’s news related to quantum and neuromorphic computing at a conference in San Jose, California. On the quantum side, Intel unveiled the beta of its Quantum SDK (software development kit), a package that includes various applications and algorithms, a quantum runtime, a C++ quantum compiler and the Intel quantum simulator.

In the neuromorphic realm, the company unveiled Kapoho Point, a system board containing Intel Loihi 2 research chips that can be used in form factors as small as drones, satellites and smart cars. The boards – which can drive AI models with up to 1 billion parameters and solve optimization problems with up to 8 million variables – can also scale to bigger problems by stacking up to eight units (for now).

According to Intel, Kapoho Point offers 10 times the speed and 1,000 times the power efficiency of most modern processor-based systems.

Intel also offered an incremental enhancement to Lava, its open-source software stack for neuromorphic computing, first introduced a year ago alongside Loihi 2. Enhancements to the modular framework for developing neuromorphic algorithms include better support for Loihi 2 features such as programmable neuron models, graded spikes and continual learning.

Such offerings align with the expanding software push Intel is making under CEO Pat Gelsinger as it seeks to modernize a company known for decades as a hardware and components maker, one now finding its way in a changing computing world where hardware is dictated by application needs.

As with much of the company’s software, it remains to be seen how the Quantum SDK and Lava will evolve within Intel’s business model. The company says both open software packages will be used to help expand these relatively new markets, but one question is whether Intel will find ways to monetize the software directly or whether it’s more useful as a way of accelerating adoption of Intel’s quantum and neuromorphic hardware by expanding the customer base for those offerings.

Speaking to reporters a few days before the start of Innovation 2022, Anne Matsuura, director of quantum and molecular technologies at Intel, said the company had been working on software to support its quantum efforts for some time, but had not until recently thought of pulling it together into an SDK that others could use.

“Even though we say ‘software first’ and are evolving into a software-centric company, Intel is a hardware company,” Matsuura said. “Initially, we didn’t plan to release the software separately. But we were inspired by other companies. We thought we also needed to build an ecosystem of users for Intel’s quantum technologies, and we started to realize that it was actually a good idea to release the software first. That’s the only reason we waited so long. It hadn’t occurred to us.”

That said, software is a key component of the eventual deployment of an Intel-based quantum computer, she said. One goal is “to get people used to using our software, to get them used to using Intel’s quantum technology. In this way, you basically learn how to program an Intel quantum computer using the Quantum SDK. That’s part of the impact. In terms of performance, I don’t know if there’s much impact on the actual performance of the qubits themselves, but our software SDK, our stack, was created in tandem with our qubit chip, so it’s particularly well suited to operating Intel qubits.”

The foundation of the Quantum SDK is the LLVM intermediate representation used in classical computing, and the kit is optimized for hybrid quantum-classical algorithms, known as variational algorithms, which are among the most popular today. According to Matsuura, the software will also work with components such as the company’s quantum simulators and, eventually, Intel’s spin-qubit-based quantum chips, which resemble transistors.
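To make the “variational” idea concrete, here is a minimal sketch in plain Python with NumPy – not Intel’s SDK. A classical optimizer loop repeatedly evaluates the energy of a parameterized one-qubit trial state; the quantum device is stood in for by a two-amplitude statevector, and the Hamiltonian, ansatz and parameter sweep are all illustrative choices.

```python
import numpy as np

# Pauli-Z Hamiltonian for a single qubit; its ground-state energy is -1.
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def ansatz(theta):
    """Trial state Ry(theta)|0> -- the parameterized 'quantum' part."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(theta):
    """Expectation value <psi|Z|psi>, evaluated here on a simulator."""
    psi = ansatz(theta)
    return float(np.real(psi.conj() @ Z @ psi))

# Classical outer loop: sweep the parameter and keep the best energy.
thetas = np.linspace(0, 2 * np.pi, 201)
energies = [energy(t) for t in thetas]
print(f"estimated ground-state energy: {min(energies):.4f}")  # -1.0000
```

In a real variational workflow, `energy` would dispatch the circuit to quantum hardware (or a high-fidelity simulator) while the parameter search stays on the classical side – which is exactly why the SDK pairs a quantum runtime with classical compilation.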

Intel has created a 300-millimeter wafer line for its quantum dot spin-qubit chips and is focusing on everything from hardware and software architectures to applications and workloads.

Currently, the SDK includes a compiler for a binary quantum instruction set and a quantum runtime to handle program execution, Matsuura said. The compiler “allows user-defined quantum operations and breaks them down into the operations available on Intel’s quantum dot qubit chip,” she said. “We have extended the industry-standard LLVM [low-level virtual machine] intermediate representation with quantum extensions. We intentionally did this using industry standards so that in the future we can open-source the compiler and allow users to use any frontend compiler they want and target our LLVM IR interface.”

Intel has already open-sourced the quantum simulator in the SDK, which can perform one- and two-qubit operations and simulate 30 qubits on a laptop and more than 40 qubits on distributed compute nodes. Intel intends to roll out version 1.0 of the SDK in the first quarter of next year, followed later by a full SDK spanning both hardware and software.
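Those simulator limits follow from simple statevector arithmetic: a full statevector of n qubits holds 2^n complex amplitudes, so memory doubles with every added qubit. A back-of-the-envelope calculation (assuming 16-byte double-precision complex amplitudes; Intel’s actual simulator internals may differ) shows why roughly 30 qubits is a laptop-scale job while 40-plus needs a cluster:

```python
# Memory needed for a full statevector of n qubits:
# 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 30 qubits -> 16 GiB
# 40 qubits -> 16,384 GiB
```

Sixteen gibibytes fits in a well-equipped laptop; sixteen tebibytes does not, hence the jump to distributed compute nodes.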

For version 1.0, Intel is working to let developers import non-Intel tools, like Qiskit and Cirq, into the SDK. The company is also funding quantum work at institutions of higher education such as Penn State in the United States and the Deggendorf Institute of Technology in Germany.

As with the Quantum SDK, Intel initially had no intention of creating a software framework from the software developed and used internally for neuromorphic computing, according to Mike Davies, principal engineer and director of the chipmaker’s Neuromorphic Computing Lab. However, it became apparent that the lack of a general-purpose software stack was hampering industry efforts in the area.

“Until Lava, it was very difficult for groups to build on the results of other groups, even within our own community, because the software tends to be very siloed, and it’s very laborious to build these compelling examples,” Davies told reporters. “Unless those examples are developed in a way that can easily be transferred between groups, and can be designed at a high level of abstraction, it becomes very difficult to move that into the business realm, where we need to reach a large community of mainstream developers who haven’t spent years doing PhDs in computational neuroscience and neuromorphic engineering.”

Lava is an open-source framework with a permissive license, so other neuromorphic chip makers – like IBM, Qualcomm, and BrainChip – are expected to port Lava to their own frameworks. It’s not proprietary, although Intel is the major contributor, Davies said.

The latest iteration of Lava is version 0.5, although the company regularly pushes updates to GitHub, he said. Together with the architectural improvements in Loihi 2, it demonstrates advances in running deep feedforward neural networks on the chip, a basic type of network used in many supervised learning applications. Fewer chip resources are needed to support these networks, and inference runs up to 12 times faster than on Loihi 1 while being 50 times more energy efficient.

These aren’t the kinds of workloads Loihi was designed to support – GPUs and other accelerators already run deep feedforward networks well, and Intel knows it. “It’s very important that we support feedforward neural networks of the type that everyone uses and loves today, because they are an important building block for future neuromorphic applications,” Davies said.
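For readers unfamiliar with the term, a feedforward network is simply a stack of matrix multiplies separated by nonlinearities, with data flowing one way from input to output. A minimal NumPy sketch of inference (illustrative only, and unrelated to how Loihi’s spiking hardware actually implements such networks):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Nonlinearity between layers: negative values are clamped to zero.
    return np.maximum(x, 0)

# A two-layer feedforward network: 4 inputs -> 8 hidden units -> 3 outputs.
# Weights are random stand-ins for trained parameters.
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 3))

def forward(x):
    """Inference is alternating matrix multiplies and nonlinearities."""
    return relu(x @ W1) @ W2

x = rng.standard_normal(4)   # a single 4-feature input
print(forward(x).shape)      # (3,)
```

Because inference reduces to dense linear algebra, GPUs and other matrix-multiply accelerators handle these networks naturally – which is why supporting them on a spiking chip like Loihi 2 is notable mainly as a building block rather than a headline workload.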
