Microsoft Debuts Azure Quantum Elements and Azure Quantum Copilot LLM

By John Russell

June 22, 2023

At a virtual event yesterday kicked off by CEO Satya Nadella, Microsoft introduced Azure Quantum Elements (AQE), a new set of services and tools for quantum chemistry and materials science. It also added to the Microsoft AI Copilot family with “Copilot in Azure Quantum,” a GPT-4-based LLM tool to assist quantum researchers. Lastly, Microsoft broadly reviewed its quantum technology roadmap and introduced a new metric it calls reliable Quantum Operations Per Second (rQOPS).

Nadella said, “Today, we are bringing together AI and quantum for the first time with the announcement of Azure Quantum Elements [and] are ushering in a new era of scientific discovery. Azure Quantum Elements is the first of its kind, comprehensive system for computational chemistry and material science. It applies our underlying breakthroughs in supercomputing, AI and quantum to open incredible new possibilities and take scientific discovery to a completely new level. We are also announcing Copilot in Azure Quantum so scientists can use natural language to reason over some of the most complex chemistry and material science challenges.

Satya Nadella, CEO, Microsoft

“Imagine using natural language to generate code to help model the electronic structure of a complex molecule and predict its exact properties. Or imagine simply describing the scientific problem you’re trying to solve and having the system configure the underlying software needed on the best hardware to run it. Just like GitHub Copilot is transforming software development, helping developers write better and faster code, our ambition is that Copilot in Azure Quantum will have [a] similar impact on the scientific process. Our goal is to compress the next 250 years of chemistry and material science progress into the next 25. And of course, Azure Quantum Elements is only the first step as we prepare for an even bigger transformation ahead of us [with] quantum supercomputing.”

Microsoft’s big bets on AI and quantum computing were on full display. The entire event was a well-polished, tightly scripted video with a number of senior Microsoft/Azure execs including Nadella; Jason Zander, EVP, strategic missions and technologies; Nihit Pokhrel, senior applied scientist; Krysta Svore, general manager, quantum; Brad Smith, vice chair and president; and Matthias Troyer, technical fellow and corporate vice president of quantum.

One quantum analyst welcomed Microsoft’s latest work but cautioned against expecting too much, too soon.

Heather West, research manager, quantum computing infrastructure systems, platforms, and technology at IDC, said, “With advancements in quantum hardware and software development, as well as new error mitigation and suppression techniques, the ability to gain a near-term advantage may be approaching faster than what many expected, including Microsoft. While Microsoft’s announcements will certainly contribute to this achievement, Microsoft’s roadmap to developing a quantum supercomputer and its rQOPS metric for measuring a quantum supercomputer’s performance seem more aspirational than achievable given how they are still in the very early stages of their qubit development.”

The main new offering, Azure Quantum Elements (AQE), is available now in private preview but presumably will become more broadly available. Microsoft says there have been a handful of early users, among them, for example, BASF, Johnson Matthey, and SCGC. As a big investor in OpenAI, Microsoft has ready access to the underlying GPT technology and will likely create many domain-specific versions. GitHub Copilot was first released in 2021. Just this March, the company released a business productivity tool, Microsoft 365 Copilot.

Zander has written a blog summarizing yesterday’s announcements. Near-term, AQE and the related Copilot tool are perhaps more likely to drive Azure HPC resources than quantum exploration, although Azure provides ready access to a variety of tools and NISQ (noisy intermediate-scale quantum) systems from, for example, Rigetti, IonQ, Quantinuum, QCI, Pasqal, and Toshiba.
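
For readers who want a sense of what that access looks like in practice, the sketch below uses the azure-quantum Python package to list the hardware and simulator targets visible to a workspace. It is a minimal illustration only: the resource ID and location are placeholders, and the client calls shown, while believed to match the SDK's documented surface, should be treated as a sketch rather than as code from Microsoft's event.

    # Minimal sketch: enumerate the quantum targets (IonQ, Quantinuum, Rigetti, etc.)
    # attached to an Azure Quantum workspace. Placeholders throughout; the azure-quantum
    # client API is assumed here and may differ across SDK releases.
    from azure.quantum import Workspace

    workspace = Workspace(
        resource_id="/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
                    "Microsoft.Quantum/Workspaces/<workspace-name>",
        location="eastus",
    )

    # Print the names of all hardware and simulator targets this workspace can reach.
    for target in workspace.get_targets():
        print(target.name)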

“[AQE] includes many popular open source and third-party tools. In addition, scientists can use Microsoft’s specialized tools for automated reaction exploration to perform chemistry simulations with greater scale,” said Zander yesterday. “[AQE] also incorporates Microsoft’s specialized AI models for chemistry, which we train on millions of chemistry and materials data points. We base these models on the same breakthrough technologies that you see in generative AI. However, instead of reasoning on top of human language, they reason on top of the language of nature: chemistry. We are using those models to speed up certain chemistry simulations by half a million times.”

There’s been an emphasis on retaining relative ease-of-use and familiar interfaces such as Jupyter notebooks. Pokhrel gave a brief demonstration and said, “Most customers will start at our [AQE] custom user portal. Here, they can quickly get access to several widely used computational packages optimized for Azure hardware. Customers also have access to new custom AI-powered tools developed by Microsoft. Tying it all together are the workflow tools that enable scientists to automate and scale their discovery pipelines.”

Microsoft says AQE will enable users to:

  • “Accelerate time to impact, with some customers seeing a six-month to one-week speedup from project kick-off to solution.
  • “Explore more materials, with the potential to scale from thousands of candidates to tens of millions.
  • “Speed up certain chemistry simulations by 500,000 times, effectively compressing nearly one year of research into one minute. [See the arithmetic check after this list.]
  • “Improve productivity with Copilot in Azure Quantum Elements to query and visualize data, write code, and initiate simulations.
  • “Get ready for quantum computing by addressing quantum chemistry problems today with AI & HPC, while experimenting with existing quantum hardware and getting priority access to Microsoft’s quantum supercomputer in the future.
  • “Save time and money by accelerating R&D pipeline and bringing innovative products to market more quickly.”
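
As the bracketed note in the list flags, the “one year into one minute” framing is straightforward arithmetic: a non-leap year contains about 525,600 minutes, so a 500,000-fold speedup does compress roughly a year of serial simulation time into about a minute.

    # Back-of-the-envelope check of the "one year of research into one minute" claim.
    minutes_per_year = 365 * 24 * 60      # 525,600 minutes in a non-leap year
    speedup = 500_000                     # Microsoft's claimed simulation speedup
    print(minutes_per_year / speedup)     # ~1.05 minutes of compute per simulated "year"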

During the virtual roll-out, Helmut Winterling, BASF senior vice president, digitalization, automation, and innovation management, said, “Quantum chemistry offers invaluable insights to boost the related R&D work. However, this requires a huge amount of computing power. And even then, many problems cannot yet be solved even on the largest computers in the world. By integrating our existing computing infrastructure with Azure Quantum Elements in the cloud, our team will be able to further push the limits of in silico development. This includes, for instance, [catalysts] and other inorganic materials.”

Jason Zander, speaking at Microsoft’s quantum event.

One area in which Microsoft hopes Copilot in Azure Quantum will prove useful is training.

Svore noted, “We need people who get chemistry and material science to learn quantum. We need people who get quantum to learn chemistry and material science. Moreover, we need to grow the community tenfold together. The idea of empowering [researchers] is the motivating force behind Azure Quantum, and in particular, the new Copilot in Azure Quantum. This GPT-4 based Copilot is augmented with additional data related to quantum computing, chemistry and material science. It helps people learn quantum and chemistry concepts, and write code for today’s quantum computers, all in a fully integrated browser experience, for free. No Azure subscription is needed. You can ask Copilot questions about quantum, and you can ask it to develop and compile quantum code right in the browser.”
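
To make that concrete, the snippet below shows the kind of beginner circuit such a copilot might produce on request: prepare and measure a two-qubit Bell state. This is an illustrative sketch written with Qiskit (one of the frameworks supported on Azure Quantum) and its Aer simulator, not output captured from Copilot in Azure Quantum itself.

    # Hypothetical example of a copilot-style answer to "show me an entanglement demo":
    # build a Bell state and sample it. Qiskit and qiskit-aer are assumed to be installed.
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                      # put qubit 0 into superposition
    qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])

    # Run locally; on Azure Quantum the same circuit could instead be submitted
    # to a NISQ backend such as an IonQ or Quantinuum target.
    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)                # expect roughly 50/50 '00' and '11'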

On the broader question of Microsoft’s quantum technology roadmap, few details were presented. Svore gave the bulk of this presentation. There are many qubit types being developed – superconducting, trapped ion, diamond/nitrogen vacancy, photonic, etc. So far, they are all too prone to error. There seems to be consensus among most major players that machines on the order of a million physical qubits, supporting error-corrected logical qubits, will be needed for quantum computers to fulfill expectations. Microsoft shares this view.

Microsoft and a few others are betting on something called topological qubits that are inherently resistant to error. Microsoft’s approach depends on a mysterious quasi-particle – the Majorana – that has been hard to pin down. Discussing its merits is a technical topic for another time. There’s also a fair amount of agreement in the quantum community that if one could practically create and harness topological qubits, it would solve many challenges facing today’s NISQ systems. The Quantum Science Center, one of the U.S. National Quantum Information Science Research Centers (based at Oak Ridge National Laboratory), for example, is actively pursuing topological qubit development.

Svore said, “To create this new qubit, we first needed a major physics breakthrough, and I’m proud to say we’ve achieved it, and the results were just published in a journal of the American Physical Society[i]. We can now create and control Majoranas. It’s akin to inventing steel, leading to the launch of the Industrial Revolution. This achievement clears the path to the next milestone, a hardware-protected qubit that can scale, which we’re engineering right now. The third milestone is to compute with the hardware-protected qubits, through entanglement and braiding. The fourth milestone is to create a multi-qubit system that can execute a variety of early quantum algorithms.

Krysta Svore, Microsoft

“At the fifth milestone (slide above), a fundamental step change occurs. As Jason shared, our industry needs to advance beyond NISQ. This is the moment that happens for Microsoft. We will have logical qubits for the first time and a resilient quantum system. Once we have these reliable logical qubits, we are able to engineer a quantum supercomputer, delivering on our sixth milestone: unlocking solutions that have never been accessible before, solutions that are intractable on classical computers. Executing on this path will be transformative, not just for our industry, but for humanity.”

It’s probably fair to call the roadmap directional. Microsoft has not said much about its hardware development efforts, and the proposal of a new metric – reliable Quantum Operations Per Second (rQOPS) – seems a bit early. As both Zander and Svore noted in their comments, there are many basic science and engineering challenges ahead.

Svore said, “Today, NISQ machines are measured by counting their physical qubits, or by quantum volume (QV, an IBM-developed quality metric). But for a quantum supercomputer, measuring performance will be all about understanding how reliable the system will be for solving real problems. To solve valuable scientific problems, the first quantum supercomputer will need to deliver at least 1 million reliable quantum operations per second, with an error rate of at most one in every trillion operations. Our industry as a whole has yet to achieve this goal as we advance toward a quantum supercomputer.”

About rQOPS, Zander said, “We’re offering a new performance measurement called reliable quantum operations per second, or rQOPS. Since today’s quantum computers are all in level one (see slide), they all have an rQOPS of zero. Once we can produce logical qubits, we need to create machines that can solve problems that are unsolvable with classical computers. This will happen at 1 million rQOPS. The crowning achievement will be a general-purpose programmable quantum supercomputer. This is the computer that scientists will use to solve many of the most complex problems facing our society.”
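
Microsoft has elsewhere described rQOPS as, roughly, the number of reliable (logical) qubits multiplied by the logical clock speed, alongside the error-rate requirement Svore cited. Taking that framing as an assumption for illustration only, the arithmetic behind the 1 million rQOPS threshold looks like this (the logical-qubit count below is hypothetical, not a Microsoft specification):

    # Back-of-the-envelope rQOPS arithmetic, assuming rQOPS = logical qubits x logical clock rate.
    # The logical-qubit count is a hypothetical example for illustration only.
    target_rqops = 1_000_000          # stated threshold for a useful quantum supercomputer
    max_logical_error_rate = 1e-12    # at most one error per trillion operations

    logical_qubits = 100              # hypothetical count of reliable (logical) qubits
    required_clock_hz = target_rqops / logical_qubits
    print(f"{logical_qubits} logical qubits need a ~{required_clock_hz:,.0f} Hz logical clock "
          f"at a logical error rate <= {max_logical_error_rate:g}")
    # -> 100 logical qubits need a ~10,000 Hz logical clock at a logical error rate <= 1e-12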

It’s a grand vision. There are challenges ahead. It’s probably worth noting that Microsoft’s Troyer and two ETH colleagues published a cautionary paper – Disentangling Hype from Practicality: On Realistically Achieving Quantum Advantage. Nevertheless, Microsoft seems to be ramping up its quantum push.

Stay tuned.

Link to Zander blog, https://blogs.microsoft.com/blog/2023/06/21/accelerating-scientific-discovery-with-azure-quantum/

Link to Microsoft paper (InAs-Al hybrid devices passing the topological gap protocol), https://journals.aps.org/prb/abstract/10.1103/PhysRevB.107.245423

 

[i] Abstract from paper, InAs-Al hybrid devices passing the topological gap protocol, Physical Review B, published 21 June 2023

“We present measurements and simulations of semiconductor-superconductor heterostructure devices that are consistent with the observation of topological superconductivity and Majorana zero modes. The devices are fabricated from high-mobility two-dimensional electron gases in which quasi-one-dimensional wires are defined by electrostatic gates. These devices enable measurements of local and nonlocal transport properties and have been optimized via extensive simulations to ensure robustness against nonuniformity and disorder. Our main result is that several devices, fabricated according to the design’s engineering specifications, have passed the topological gap protocol defined in Pikulin et al. (arXiv:2103.12217). This protocol is a stringent test composed of a sequence of three-terminal local and nonlocal transport measurements performed while varying the magnetic field, semiconductor electron density, and junction transparencies.

“Passing the protocol indicates a high probability of detection of a topological phase hosting Majorana zero modes as determined by large-scale disorder simulations. Our experimental results are consistent with a quantum phase transition into a topological superconducting phase that extends over several hundred millitesla in magnetic field and several millivolts in gate voltage, corresponding to approximately one hundred microelectronvolts in Zeeman energy and chemical potential in the semiconducting wire. These regions feature a closing and reopening of the bulk gap, with simultaneous zero-bias conductance peaks at both ends of the devices that withstand changes in the junction transparencies. The extracted maximum topological gaps in our devices are 20–60µeV. This demonstration is a prerequisite for experiments involving fusion and braiding of Majorana zero modes.”
