The quest for a silicon quantum computer

The promise of silicon

Andrea Morello is not what you expect when you think of quantum computing. Tall, lizard-thin and sporting a luxuriant ponytail and greying goat-patch beneath his lower lip, in skin-tight pants and a pendant, he has an intense gaze that could almost hold you in a trance.

“There were no silicon qubits before we started working on this,” says Morello, winner of the American Physical Society’s inaugural Rolf Landauer and Charles H. Bennett Award in quantum computing in 2017 for work some deemed almost impossible. “We’ve really contributed to making it work, and it’s now created a field. And we’re in the lead.”

In this, Morello is part of a two-man act. His friend and one-time mentor, Andrew Dzurak, who has been working on silicon quantum computing concepts since 1998, lured the young Italian postdoctoral fellow from the University of British Columbia in Canada – where he’d been working at TRIUMF, the venerable national physics laboratory. Morello joined Dzurak in 2006 as a senior research associate at what eventually became UNSW’s Centre for Quantum Computation and Communication Technology (CQC2T).

The challenge? To build on UNSW’s promising work on solid-state quantum devices – which exploit the fuzzy, superposed units of data known as quantum bits, or ‘qubits’ – and to develop techniques for quantum control of single atoms in silicon.

A decade later, the duo are in the hot seat of what has been called the ‘space race of the century’: the global effort to build super-powerful quantum computers that could solve problems beyond the practical reach of even today’s best computers, like integer factorisation or the simulation of quantum many-body systems. And at CQC2T, the duo are key players in the world’s largest collaboration working to create a complete ecosystem for universal quantum computing.

Unlike almost every other major group elsewhere, CQC2T’s quantum computing effort is obsessively focused on creating solid-state devices in silicon, the heart of the US$380 billion global semiconductor industry that makes all of the world’s computer chips. And they’re not just creating ornate designs to show off how many qubits can be packed together, but aiming to build qubits that could one day be easily fabricated – and scaled up to the thousands, millions and billions.

“The infrastructure you need to make the kind of silicon qubit devices we build is unheard of in a university environment,” says Morello. “There’s basically no one else in a university setting that has access to these sort of advanced facilities.”

Dzurak, bespectacled and with a mop of tousled hair and a penchant for sharp jackets, is the man who makes that infrastructure possible. After a PhD at the University of Cambridge, where he had been working on gallium arsenide quantum dots to investigate fundamental physics phenomena, he joined UNSW in 1994 to establish Australia’s highest resolution electron-beam lithography capability, building devices as small as 10 nanometres.

He then became one of the architects of the UNSW node of the Australian National Fabrication Facility, an advanced nanoscale manufacturing hub with a complete silicon metal-oxide-semiconductor process line – precisely the infrastructure needed to make silicon qubit devices. He now serves as director of UNSW’s 750 m² facility.

“The complexity of the infrastructure necessary to fabricate at that scale and with that level of precision is only found in billion-dollar factories where silicon integrated circuits are made,” says Dzurak. “Our goal is to develop, and demonstrate, the science and engineering of building a 10-qubit silicon device within five years. Ideally, you want to get to 10 qubits using a method that doesn’t substantially deviate from the way that a billion transistors are put on a chip.”

Morello adds: “If you can do that, then you’ve hit the jackpot, because we would then know how to make a qubit chip at scale and sell it at an affordable price.”

Dzurak agrees: “A big challenge is to actually get near 10 qubits. Once you get to that scale, the pathway forward becomes much clearer.

“Once we have shown the scientific and technical basis for 10 qubits, then our aim is to prove that you can use it to make 100, or 1,000 or 10,000.”

And this is the core of the gamble that UNSW and its six corporate and three government backers, its 12 university and research institute partners – and, in a sense, Australia – are making: that the three elegant designs for making qubit chips in silicon developed by CQC2T’s researchers will prove much more feasible, more practical and easier to scale than any of the other technologies under development.

The lure of quantum computing

The heart of a modern computer is the microprocessor, or CPU (central processing unit), a complete computation engine on a single chip. Information is represented in binary code (yes/no, on/off): each 0 or 1 is a ‘binary digit’, or bit, the basic unit of the code that drives today’s computers.

This binary code is translated into instructions, which create everything from large calculations to luscious graphics on a screen. While computers these days are extremely fast, the mathematics of binary operations has limits – calculations must still be performed one after another, in a serial fashion, essentially the same way they would be done by hand. Even when the task is split across multiple CPUs, as in parallel processing, each processor still works through its instructions sequentially.

Computers cannot defy the mathematics of binary operations; all they can do is make those calculations faster and faster, by increasing the number of transistors on a chip and making those transistors so small and responsive that they can switch more than a billion times per second. Which is why a modern processor packs hundreds of millions of transistors onto a silicon chip the size of a fingernail.

Binary code is processed in circuits called logic gates, made from a number of transistors strung together. Logic gates compare patterns of bits, and turn them into new patterns of bits, with the output from one gate feeding the input of the next, and so on. If done in a human brain, this would be called ‘addition’, ‘subtraction’ or ‘multiplication’.
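
To make the idea concrete, here is a small illustrative sketch (not from the original article) in Python: two logic gates, XOR and AND, wired together into a ‘half adder’, the fragment of circuitry that underlies binary addition.

```python
# Illustrative sketch: two logic gates (XOR and AND) composed into a
# half adder, the building block of binary addition in a classical CPU.

def xor_gate(a: int, b: int) -> int:
    """Exclusive OR: outputs 1 when exactly one input is 1."""
    return a ^ b

def and_gate(a: int, b: int) -> int:
    """AND: outputs 1 only when both inputs are 1."""
    return a & b

def half_adder(a: int, b: int) -> tuple:
    """Add two single bits, returning (sum_bit, carry_bit)."""
    return xor_gate(a, b), and_gate(a, b)

# Each input pattern of bits is turned into a new pattern of bits.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```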

A quantum computer exponentially expands the vocabulary of binary code words by using two spooky principles of quantum physics: ‘entanglement’ and ‘superposition’. A qubit can store a 0, a 1, or an arbitrary combination of 0 and 1 at the same time. Multiple qubits can be placed in superposition states that have no classical analogue, yet constitute legitimate digital code in a quantum computer.
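
In the standard notation of quantum mechanics (added here purely as illustration), a single qubit and a pair of entangled qubits look like this:

```latex
% A single qubit holds 0 and 1 simultaneously, weighted by complex
% amplitudes alpha and beta whose squared magnitudes sum to one.
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle ,
  \qquad |\alpha|^{2} + |\beta|^{2} = 1 .
\]

% Two entangled qubits in a Bell state: neither qubit has a definite
% value of its own, and no string of classical bits can describe it.
\[
  |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr) .
\]
```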

And just as a quantum computer can store multiple values at once, so it can process them simultaneously. Instead of working in series, it can work in parallel, doing multiple operations at once.

Only when you try to interrogate what state the calculation is in at any given moment – by measuring it – does this state ‘collapse’ into one of its many possible states, giving you the answer to your calculation problem.
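
A rough way to visualise this (a toy classical simulation, not how a real quantum processor is programmed) is to track the amplitudes of a small register with NumPy: one gate operation updates every amplitude at once, and a measurement picks a single outcome at random.

```python
import numpy as np

# A 3-qubit register is described by 2**3 = 8 complex amplitudes.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in the definite state |000>

# A Hadamard gate puts a single qubit into an equal superposition;
# applying it to each qubit spreads the register over all 8 states.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
for target in range(n):
    op = np.array([[1.0]], dtype=complex)
    for q in range(n):
        op = np.kron(op, H if q == target else np.eye(2))
    state = op @ state  # one operation updates every amplitude at once

# Measurement 'collapses' the superposition: one bit string is returned,
# with probability given by the squared magnitude of its amplitude.
probs = np.abs(state) ** 2
outcome = np.random.choice(2**n, p=probs)
print(f"measured |{outcome:0{n}b}>")
```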

If software algorithms can be designed to make use of superposition and entanglement to arrive at an answer in a much smaller number of steps, the ability of a quantum computer to work in parallel would make it, in many cases, millions of times faster than any conventional computer.

How fast is that? One way to measure the processing speed of computers is to calculate their number of ‘flops’, or FLoating-point OPerations per Second. Today’s typical desktop computers run at gigaflops (billions of flops, or 10⁹).

By comparison, the world’s fastest supercomputer, China’s Sunway TaihuLight, runs at 93 petaflops (93 quadrillion flops, or around 10¹⁷) – but it relies on 10 million processing cores and uses massive amounts of energy.

Meanwhile, even a small 30-qubit universal quantum computer could, theoretically, run at the equivalent of a classical computer operating at 10 teraflops (10 trillion flops, or 10¹²), according to David Deutsch at the University of Oxford’s Centre for Quantum Computation.
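
The reason such modest qubit counts translate into such large numbers is the exponential size of a quantum register’s state space; the 10-teraflop figure is Deutsch’s estimate, but the underlying scaling (shown below for illustration) is simple arithmetic.

```latex
% An n-qubit register is described by 2^n complex amplitudes,
% so every additional qubit doubles the size of the state space:
\[
  N(n) = 2^{n}, \qquad
  N(30) = 2^{30} \approx 1.07 \times 10^{9} \ \text{amplitudes}.
\]
```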

Granted, quantum computers are not measured in flops – in fact, they just don’t operate like the computers we’ve come to know at all. But it’s this promise of extraordinary processing power, plus the fact we’re reaching the limits of what can be done with classical computers, that make them such an intoxicating prospect.

The 21st-century space race

The big players in quantum computing research are the UK, Australia, Canada and the USA, according to Ned Allen, chief scientist at Lockheed Martin in Bethesda, Maryland. But plenty of newcomers are scrambling to catch up: “There are significant investments being made in Russia and practically every other developed country,” says Tim Polk, former assistant director of cybersecurity at the White House’s Office of Science and Technology Policy.

It’s not just computing: the unique properties of the quantum world are also revolutionising everything from sensing, measurement and imaging, to navigation and communications, with a plethora of useful applications in healthcare, defence, oil and gas discovery, flood prevention, civil engineering, aerospace and transport.

It’s being called the Second Quantum Revolution: the first rewrote the rules that govern physical reality, while the second takes those rules and uses them to develop new technologies.

The UK is investing US$76 million a year in quantum technologies, the USA some US$260 million, China US$158 million and Germany US$90 million, according to a UK Government Office for Science report. Another report, by consulting firm McKinsey, estimated that in 2015 there were 7,000 people working in quantum technology research worldwide, with a combined budget of US$1.5 billion.

However, it’s in quantum computing where the race is fiercest. Tech giants Intel, Microsoft, IBM and Google are all ploughing tens of millions of dollars into the field, mostly betting on different horses – or, like Intel, on a few.

Because the need for the faster calculations promised by quantum computing is so urgent, and the potential payoff so large, some have settled on an interim solution: an ‘adiabatic’ quantum processor that can solve specific problems, such as finding the global minimum value of a highly complex function. A nifty example of this approach was developed by Canada’s D-Wave Systems, which uses a form of quantum tunnelling, coupled with binary inputs from classical computers, to find approximate solutions to complex calculations.
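
The ‘highly complex function’ an annealer minimises is typically written as an Ising (or equivalent QUBO) cost function over binary spin variables; a generic form, shown here for illustration rather than as D-Wave’s exact specification, is:

```latex
% Generic Ising cost function targeted by quantum annealers:
% h_i are local fields, J_ij are pairwise couplings, and each
% spin s_i takes the value +1 or -1.
\[
  E(s) = \sum_{i} h_{i}\, s_{i} + \sum_{i<j} J_{ij}\, s_{i} s_{j},
  \qquad s_{i} \in \{-1, +1\},
\]
% and the machine searches for the spin configuration that minimises E(s).
```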

But D-Wave’s computers don’t always provide the most efficient, exact solution; and while rapid, they are not always faster than a classical supercomputer. The quantum states of their qubits are more fragile, and their manipulation less precise – all of which narrows their usefulness to optimisation problems that need to be solved fast but don’t need to be perfect, such as pattern recognition and anomaly detection. Which is why Google and Lockheed Martin have each bought one of the US$10 million machines.

Nevertheless, D-Wave’s design doesn’t have the flexibility in the interaction between qubits to be a truly universal quantum computer – one that can perform a broad range of computations much faster than a classical computer; in some cases, exponentially faster. And that’s the holy grail.

This race, however, is still wide open, with competing technologies in play, all exploring different approaches to achieve the same goal.

Three approaches

Dzurak was recruited to UNSW by Bob Clark, a former Australian Navy officer and one-time chief defence scientist who is considered a visionary and one of the few Australians bestowed with a US Secretary of Defence Medal. Clark joined the navy at 15, rose to lieutenant, then left to complete a PhD in physics at UNSW and later at the University of Oxford, where he ended up heading a research group in experimental quantum physics at the famed Clarendon Laboratory, before returning to his alma mater in 1991.

At UNSW, Clark founded the centre that is now CQC2T, pushing for the creation of nanofabrication facilities. In 1997, Bruce Kane, then a senior research associate at UNSW, hit upon a new architecture that could make a silicon-based quantum computer a reality. His colleagues were enthralled, and Clark pressed Kane to develop the idea, and patent it. Kane’s 1998 Nature paper has now been cited over 2,500 times.

Kane’s paper also sparked the interest of Michelle Simmons, then a research fellow at the University of Cambridge’s renowned Cavendish Laboratory, who’d gained an international reputation for her work in quantum electronics. “I wanted to build something – something that could prove to be useful,” she recalls. In Britain, she says, she worked “with pessimistic academics who will tell you a thousand reasons why your ideas will not work … Australia offered the freedom of independent fellowships and the ability to work on large-scale projects.”

She joined UNSW in 1999, became a founding member of CQC2T, and in 2010 took over as director. Simmons now leads the UNSW-based collaboration of more than 180 researchers across six Australian universities – UNSW, Melbourne, Queensland, Griffith, Sydney and the Australian National University – as well as Australia’s Defence Science and Technology Group and the Australian Signals Directorate. International partners include the universities of Tokyo, Wisconsin-Madison, Purdue and Oxford, the National University of Singapore, Germany’s Max Planck and Walter Schottky institutes, and corporate partners IBM Research, Toshiba, Zyvex and Quintessence.

Recently, the UNSW quantum computing effort – its funding renewed for another seven years – has attracted another A$70 million from the Australian government, the Commonwealth Bank of Australia and telecom giant Telstra to create a consortium that will commercialise the three unique architectures they have developed. Discussions underway with additional partners will likely take this sum to A$100 million.

For Simmons, it’s the sheer range of problems quantum computers can solve that excites her, “problems where computers work on large databases or consider lots of variables, such as predicting the weather, stockmarkets, optimising speech, facial and object recognition such as in self-driving cars, looking at optimising aircraft design, targeting drug development to a patient’s DNA … [These are] ideal problems for a quantum computer.”

Along with CQC2T colleague Sven Rogge, Dzurak, Morello and Simmons are the architects of three related yet distinct approaches to quantum computing in silicon. Dzurak’s approach uses quantum dots: nanoscale semiconductor devices whose behaviour sits somewhere between that of bulk semiconductors and discrete molecules.

In a 2015 paper in Nature, Dzurak’s team detailed the first quantum logic gate in silicon, showing quantum calculations could be performed between two qubits in silicon, leading Physics World to name it one of the top 10 advances of the year. His name is on more than 150 scientific papers and he’s a co-inventor on 11 patents.

Morello’s team was the first to demonstrate the read-out and the control of the quantum state of a single electron and a single nuclear spin in silicon. In a 2014 paper in Nature Nanotechnology, they set the record for how long a quantum superposition state can be held in the solid state, exceeding 30 seconds – 10-fold better than before. This helped him, with just 50 scientific papers under his belt, to win the Landauer-Bennett Award in 2017, “for remarkable achievements in the experimental development of spin qubits in silicon”. 

Simmons’s group has developed the world’s smallest transistors and the narrowest conducting wires in silicon made with atomic precision. In a 2015 Science Advances paper, her group – working with Lloyd Hollenberg at the University of Melbourne – outlined a complete architecture for an error-corrected quantum computer, using atomic-scale qubits aligned to control lines inside a 3D design. 

Simmons has published over 400 papers, has her name on seven patents, and has won a slew of prizes, including the Foresight Institute’s 2016 Feynman Prize for her experimental work in atomic electronics. In 2017, she was in Paris to collect the L’Oréal-UNESCO For Women in Science Award.

Rogge works closely with Simmons’s team to help understand fundamental issues related to the qubit environment. In a 2013 Nature paper, Rogge’s team detailed how to couple a silicon qubit to photons for quantum interconnects, and in a 2016 Nature Nanotechnology paper – in collaboration with the universities of Melbourne and Purdue – showed how to pinpoint a phosphorus atom in silicon with absolute atomic accuracy. 

“In many ways, the quantum dots and the phosphorus atom approach, although different, are very complementary,” Morello says of UNSW’s chip designs. “Some could be made in a silicon foundry in one go, others require a serial manufacturing process. But some are almost identical.”

The road ahead

No-one yet knows what qubit design will eventually power a universal quantum computer. Or what technological approach will create the most efficient universal quantum computer that can be scaled up, at a reasonable cost, to solve the curly problems beyond the ken of supercomputers today.

All groups are racing to achieve ‘quantum supremacy’, in which a quantum computer performs a calculation faster than any known computer could. While an important milestone, this alone will not determine the winner. That will take a decade – or more – to settle. History is replete with examples where it’s not the best technology that gains a foothold, or the cheapest, or even the one that scales up fastest. Cars, light bulbs, video recorders, nuclear reactors – all have seen less than optimal designs rise to prominence. And occasionally, better designs make a comeback, like electric cars; or designs that were once thought to be in competition – like AC versus DC for electricity, or AM versus FM for radio – carve out their own niches.

It’s also possible quantum computing will not be like classical computing, with a ‘one size fits all’ approach winning out. And it will take a long time for a standard to emerge, as it did with today’s computers. It took four generations of computers to get from bulky vacuum tube circuitry and magnetic drum memory to the multi-core CPUs with silicon semiconductor logic gates common today.

It may be that some chip designs are very powerful for specific uses, whereas others will be more easily adapted to universal computations. Either way, quantum computers are unlikely to grace the desktops of the future: because they require specific environments – a vacuum and extremely low temperatures – they will most likely be accessed as cloud-based services run from central sites.

“I’m passionate about it because we’ve got a chance,” says Dzurak. “And if we make it work, then it’s something that could change the world.”


This article was first published in INGENUITY, Winter 2017, a publication of the Faculty of Engineering at UNSW. Read the original story here.

It is published under a Creative Commons licence (CC BY-NC 3.0 AU). It is permitted to copy and redistribute the material in any medium or format; and further to remix, transform, and build upon the material, provided there is clear attribution (appropriate credit and an indication if changes were made); and that the use is non-commercial.
