text
stringlengths
4.06k
10.7k
id
stringlengths
47
47
dump
stringclasses
20 values
url
stringlengths
26
321
file_path
stringlengths
125
142
language
stringclasses
1 value
language_score
float64
0.71
0.98
token_count
int64
1.02k
2.05k
score
float64
3.5
4.53
int_score
int64
4
5
secrets ride phone lines Technology Research News The ability to safeguard secret messages using the quirks of quantum physics has been thoroughly demonstrated in the laboratory. Now field tests of quantum cryptography are showing that the technology can withstand the rigors of real-world communications. Researchers in Switzerland have used this type of cryptography, which represents bits of information using single photons, to send theoretically perfectly secure messages between the cities of Geneva and Lausanne, which are 67 kilometers apart. Quantum cryptography provides perfect security because it allows users to tell for sure whether the key they are using to encrypt and decrypt a message has been compromised. Researchers at Los Alamos National Laboratory previously proved that a quantum signal could travel 50 kilometers. But that was over a spooled fiber-optic line contained in a laboratory, said Nicolas Gisin, a physics professor at the University of Geneva. "In our case the two end points were really spatially separated," he said. More importantly, the Swiss experiment used existing fiber-optic phone lines. The fibers were "nothing special," said Gisin. They were not in commercial use during the experiment, but were part of a cable containing many fibers that were, he said. Key encryption schemes use a unique mathematical key to mask each message. The sender and intended recipient use the key to encrypt a message, send it over unsecured channels, then decrypt it. The trick to keeping the message secret is making sure no one but the sender and receiver have access to the key. The quantum cryptography scheme sends encryption keys over fiber-optic lines in a perfectly secure way by representing each bit with only one photon. Using two or more photons per bit makes it possible for an eavesdropper to siphon off some extra photons in order to peek at the key without being detected. Using only one photon per bit means that an eavesdropper would have to replace the photons she intercepted, but it is impossible to replicate all of the photons correctly. This is because any given photon, or particle of light, can have one or more attributes, including polarization, which has to do with how the photon vibrates, and wave phase. The researchers' quantum cryptography scheme generates photons in one of four states based on their wave phases. The system splits each photon, sends the halves down short pieces of fiber of slightly different lengths, and then joins the two halves. Because the halves travel different distances, their waves are out of phase, meaning the crests and troughs are out of sync by a particular amount. The photons' four phase states come in two types: those whose waves match or are exactly opposite, and those whose waves are half way out of phase with one wave ahead of the other. Each type can be used to represent the 1s and 0s of digital information. It is a quirk of quantum physics -- the Heisenberg uncertainty principle -- that makes the scheme perfectly secure: you can't look for both of the pairs of states at the same time, and you only get one look before the photon disappears. If you measure a photon to see if it is a 1 or 0 based on one pair of states, but it was generated in one of the other two states, you're out of luck. Your measuring device has absorbed the photon during your first look so you will never know whether it represented a 1 or 0. This means an eavesdropper would only be able to correctly measure half of the photons he intercepts and would have to guess at the other half to produce substitutes. And he would only get about half the missing half right by chance, meaning one quarter of the substitute bits would be wrong. The sender and receiver can check the error rate and so detect the eavesdropper by comparing a few bits. If the key has been compromised, they can throw it out and send another until they get an uncompromised key to encrypt their data. To form a key, the receiver measures the photons by randomly picking one of the two sets of states. Then they compare notes and the sender tells the receiver which photons he measured correctly. They then use those bits as the key. The researchers' quantum key distribution system can only be used across relatively short distances because its performance drops off as the distance increases. At 10 kilometers the system can transmit quantum keys at 4,000 bits per second. At 20 kilometers the bit rate drops to 1,500 per second, and at 50 kilometers it drops to 100 bits per second. An ordinary modem transmits 56,000 bits per second. Once the users have an uncompromised key, however, the encrypted data can be sent over fast communications lines that include repeaters. Today's fiber-optic communication systems compensate for diminishing signal strength -- and thus span great distances -- by using repeaters, which copy and retransmit fading light pulses. Repeaters can't be used to send quantum keys because they would intercept photons in the same manner as The company id Quantique in Geneva, a spinoff from Gisin's laboratory, is marketing the quantum key distribution system. It consists of a pair of 18-inch-wide boxes that connect to personal computers via USB ports, and to each other over a fiber-optic line. Gisin's research colleagues were Damien Stucki and Hugo Zbinden of the University of Geneva, and the Olivier Guinnard and Grégoire Ribordy of id Quantique SA. They published the research in the July 12, 2002 issue of the journal New Journal of Physics. The research was funded by the TRN Categories: Quantum Computing and Communications; Cryptography Story Type: News Related Elements: Technical paper, "Quantum Key distribution over 67 km with a plug & play system," New Journal of Physics, July 12, Ultimate memory demoed makes bugs sing Nanotubes grown in place Quantum secrets ride Chip keeps atoms in line Research News Roundup Research Watch blog View from the High Ground Q&A How It Works News | Blog Buy an ad link
<urn:uuid:fe96c08d-538b-4d36-9a8d-b5ac1fc771eb>
CC-MAIN-2014-15
http://www.trnmag.com/Stories/2002/080702/Quantum_secrets_ride_phone_lines_080702.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223206147.1/warc/CC-MAIN-20140423032006-00052-ip-10-147-4-33.ec2.internal.warc.gz
en
0.915263
1,317
3.546875
4
For sending information across continents and around the globe, light is the medium of choice. The ability to send multiple wavelengths at high speeds within fibers has transformed communications. But light could do even better, much better, if it weren’t hobbled by the electronic switches, routers and other devices of optical communications technology. Since they operate by converting optical signals to electronics and back again, these devices considerably reduce the efficiency of current optical networks. Is it possible to create all-optical circuitry — something analogous to the microcircuitry of “chips” but that doesn’t require converting light to electrical current? It’s a challenge many scientists worldwide are addressing. Shanhui Fan and Fatih Yanik, Stanford University Using LeMieux, PSC’s terascale system, to simulate how light behaves, applied physicist Shanhui Fan of Stanford and graduate student Mehmet Fatih Yanik have made notable progress. Using all 3,000 LeMieux processors, they showed that it’s possible to stop light and hold it captured — in an optical holding cell — until a subtle shift in optical features releases it. Unlike earlier attempts to capture light, their finding — reported in 2004 — suggests it may be possible to corral complicated light pulses and, moreover, to do it in a way that integrates easily with existing chip technology. So far, Yanik and Fan’s device exists only in simulation, but they have teamed with a laboratory group at Stanford to build and demonstrate their scheme. Because of the powerful ability of their simulations to accurately predict how light behaves in fascinating materials called “photonic crystals,” the researchers are confident the laboratory work will yield an all-optical device to stop light in its tracks. It made news in 2001 when researchers brought light to a standstill for the first time. Two groups at Harvard demonstrated a technique that captured light in clouds of gaseous atoms. But these systems of atomic gases are impractical for an all-optical circuit. Rather than gases, the Stanford team’s approach relies on photonic crystals — layered materials, often silicon or other semiconductors, made with cavities in patterns within the crystal. Because such a device will operate at room temperature and be only microns in length, it could easily integrate with traditional microcircuitry. By careful design of irregularities in the patterns of the cavities, photonic crystals can allow — or forbid — the passage of certain wavelengths of light. This handy trick makes them attractive filters, with the potential to act as gatekeepers that allow only selected wavelengths to pass through the crystal on prescribed paths. Exactly which wavelength, or band of wavelengths, can travel through or not depends on the properties of the crystal. Yanik stumbled on the light-stopping mechanism while using LeMieux to simulate the impact of changing one property of a crystal, the index of refraction — the ratio of light’s speed in a vacuum (well established at 186,000 miles per second) to its speed in a medium, where it travels more slowly. His original goal was a tunable switch — a crystal that could be prompted, by small changes in the refractive index, to allow safe passage to different wavelengths of light. This graphic from simulation shows snapshots of the positive (red) and negative (blue) electric fields as an optical pulse propagates (left to right) through a photonic crystal, shown in three segments at four times (top to bottom). Resonant frequencies of the cavities (black dots) are tuned to stop the pulse during the time interval shown in the second and third snapshots, until the cavities are detuned and the pulse is released. For one possible design of such a switch, the simulations indicated the effect could be quite strong. Small changes in refractive index allowed a large change in the bandwidth of allowed wavelengths. And that wasn’t all. “I saw an optical signature very similar to the ones observed in atomic media,” says Yanik. “So the question became, could we use the cavities in the crystal to store electromagnetic pulses, just as they were stored in atomic media? If somehow we could get light into this structure, and then change the properties of the entire structure while the light was inside, we could change the properties of light as well and trap it.” The idea depends on a phenomenon called optical resonance, which is similar to why long and short pipes in an organ produce notes of different frequency. In an organ, each pipe is cut to the length required to amplify sound waves of a desired frequency. The sound energy bounces back and forth inside the pipe and establishes an unmoving wave pattern, or resonance, at the desired frequency. In the Stanford team’s approach, the role of the organ pipe is played by a waveguide — either an empty channel or closely spaced cavities inside the crystal that allow light to propagate. Prior to this work, many groups had used optical resonators to trap light of a single wavelength. Optical communication, however, uses light pulses to encode and transmit information, with each pulse composed of many wavelengths. Trapping such a multi-wavelength pulse in a single resonator would lose the information carried by the pulse. Yanik and Fan’s idea, however, goes a crucial step further by tuning all of the wavelengths within a pulse to the same frequency and, at the same time, adjusting the crystal to resonate at that frequency. They do this by adjusting the index of refraction once the pulse has entered the crystal. As all the frequency components are collapsed to a single frequency, the information becomes encoded by the phase and intensity of light along the waveguide. Changing the resonance of the crystal, Yanik explains, is like adjusting the spacing of stepping stones across a river. Shifting the crystal’s index of refraction is similar to spreading the stones out, so that photons — the tiniest energy chunks of light — of a particular frequency can no longer hop from stone to stone. They have been trapped. When the pulse needs to be released, the index of refraction is shifted back, the stones move closer together, and the photons zip away. “The entire idea,” says Yanik, “from refractive-index switches to light-trapping devices, was first realized on a supercomputer.” Once he and Fan identified the light-stopping possibility, Yanik adapted software he’d already written to simulate it. Using almost every one of LeMieux’s 3,000 processors, they simulated a series of possibilities until arriving at a 100-micron waveguide with 120 side-cavities. “A hundred microns,” says Fan, “fits on a chip, a small distance in practice, but a long distance to simulate.” The beauty of photonics simulations, he explains, is the ability to use the full form of Maxwell’s equations. This set of four equations, named for James Clerk Maxwell, a 19th century Scottish physicist, governs most optical and electromagnetic phenomena. Not so long ago, notes Fan, limitations in computing technology required clever approximations to apply these equations. “With a system like LeMieux,” he says, “we have the ability to solve the entire set exactly.” This means that the computational experiments precisely mimic physical reality and give the researchers high confidence that their predictions can be realized in the laboratory. To exploit the large-scale parallelism of LeMieux’s 3,000 processors, Yanik’s software parceled separate parts of the crystal waveguide to separate processors. It took 10 simulations to describe the light-trapping behavior, with each simulation of a light pulse entering the wave guide requiring two hours, which Yanik estimates as a year’s worth of computing on a desktop PC. The simulations showed that shifting the index of refraction around the pulse forces the wavelengths to adopt a single frequency, and traps the pulse in and between cavities. In the 100 micron, 120 side-cavity waveguide, a 1/10,000th shift in the index of refraction is enough to capture the information in commonly used pulses of light. Another surprising result of the simulations, says Fan, is that if the index of refraction were tuned beyond the point where the light pulse screeches to a halt, the pulse would not merely stop, but reverse in its tracks, backing out of the crystal as though it were a train reversing direction to re-emerge, caboose first, from a tunnel. This time reversal effect, he says, might prove useful in repairing signal degradation. Efforts to build the device in the lab, in collaboration with Stanford colleagues Martin Fejer and James Harris, are now running parallel to more simulations. “What we’ve done so far is a two-dimensional simulation,” says Fan, “as a proof of principle. We are now extending it to a three-dimensional simulation to arrive at the exact structure the device needs to take.” For optical networks, a device that can catch and hold light for an arbitrary length of time offers promise to alleviate the congestion that happens when too many pulses arrive simultaneously at a network junction. Beyond that, there’s the promise of quantum computing, the vision of transistors that manipulate single photons rather than electrons. It’s a future, perhaps sooner than we think, in which circuits will be a thousand times smaller and faster. Yanik and Fan’s simulations with LeMieux bring us a step closer.
<urn:uuid:32e836bd-33f1-4d25-b5f7-7ad712d6c0b7>
CC-MAIN-2014-15
http://www.psc.edu/science/2005/fan/
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609533121.28/warc/CC-MAIN-20140416005213-00270-ip-10-147-4-33.ec2.internal.warc.gz
en
0.932926
2,001
4
4
The demand for faster Internet speeds now and in the future is a given, specifically when considering the popularity of streaming services such as Netflix and the explosion of the Internet of things. Specifically, with the increasing trend to utilise big data sets to systematically extract and analyse information, faster Internet and more broadband are in demand. Thanks to the latest fibre optic technology, for some fortunate users, broadband speeds could soon be significantly faster than anything today. Researchers from the Royal Melbourne Institute of Technology (RMIT) in Australia, developed a nanophotonic device that can encode more data, and process it incredibly fast using a special form of ‘twisted’ light. The technology comprises a miniscule detector which replaces current readers as big as ‘dining tables’. This new development in fibre optics involves detecting light that has been twisted which could result in Internet speeds up to 100 times faster. The scientists, who published the results in the journal Nature Communications, indicate that the technology can be used to upgrade existing networks and significantly boost efficiency. Existing broadband fibre optics carry information on pulses at the speed of light, but the encoding and decoding of data affects data speeds. Fibre optic cables transmit information as pulses of light, but it can only be stored through the colour of the light consisting of horizontal or vertical waves. However, by twisting light into a spiral, a third dimension of light to carry information is created. This is referred to as the level of orbital angular momentum, or spin. Min Gu from RMIT states: “It’s like DNA, if you look at the double helix spiral,” “The more you can use angular momentum the more information you can carry.” The technology thus uses the oscillation, or shape, of light waves to encode data by making use of light invisible to the naked eye thereby increasing bandwidth. The light waves twisted into a spiral is known as light in a state of orbital angular momentum (OAM). According to Gu, the detector can also be used to receive quantum information sent via twisting light, meaning it could have applications in a whole range of cutting-edge quantum communications and quantum computing research. While researchers in the US had created a fibre that could twist light previously, Gu’s team is the first to create a detector that can read the information it holds. “We could produce the first chip that could detect this twisting and display it for mobile application,” Gu said. The nanophotonic device can encode more data, and process it incredibly fast using a special form of ‘twisted’ light to unlock super-fast, ultra-broadband communications. The nanophotonic device is required to overcome the “capacity crunch” of current fibre optic technology, which according to Dr Haoran Ren from RMIT’s School of Science, co-lead author of the paper, “fail to keep up with the ever-increasing demands of Big Data.” Ren said, “Our miniature OAM nano-electronic detector is designed to separate different OAM light states in a continuous order and to decode the information carried by twisted light.” Gu also estimates that the nano-electronic device will unlock the full potential of twisted light for future optical and quantum communications.” Prof Min Gu from RMIT indicated that the technology would be compatible with existing silicon-based materials and can thus be applied to broadband networks. “This technology’s high performance, low cost and tiny size makes it a viable application for the next generation of broadband optical communications,” he said. He further stated that “It fits the scale of existing fibre technology and could be applied to increase the bandwidth, or potentially the processing speed, of that fibre by over 100 times within the next couple of years. This easy scalability and the massive impact it will have on telecommunications is what’s so exciting.” The OAM nano-electronic detector can be compared to an ‘eye’ that can ‘see’ information carried by twisted light and decode it to be understood by electronics. Gu said that this technology’s high performance, low cost and tiny size makes it a viable application for the next generation of broadband optical communications. Despite the possibility that the technology could be used to upgrade fibre optic networks, the use of fibre optics instead of copper wire is still contentious. Many households receive the cheaper option of fibre to the node which produces slower speed. With fibre to the node, optic fibre cable only runs as far as a central point in the neighbourhood, and copper wire connects that node to each home. An interesting fact is that original ADSL connections use an average of 2.5km of copper wire per connection, fibre to the node uses 500 metres, fibre to the curb uses 30 metres, and fibre-to-the-premises uses none. In south Africa, fibre optic technology offers a viable alternative specifically since it is prone to recurring copper theft. Take your business connectivity guide to find the perfect solution for your business!
<urn:uuid:628979f5-e63a-4b17-905f-75304426a567>
CC-MAIN-2022-33
https://www.bitco.co.za/fibre-optic-light-breakthrough-could-make-internet-100-times-faster/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571536.89/warc/CC-MAIN-20220811224716-20220812014716-00119.warc.gz
en
0.92555
1,039
3.515625
4
A superconductor is a material that achieves superconductivity, which is a state of matter that has no electrical resistance and does not allow magnetic fields to penetrate. An electric current in a superconductor can persist indefinitely. Superconductivity can only typically be achieved at very cold temperatures. Superconductors have a wide variety of everyday applications, from MRI machines to super-fast maglev trains that use magnets to levitate the trains off the track to reduce friction. Researchers are now trying to find and develop superconductors that work at higher temperatures, which would revolutionize energy transport and storage. Who discovered superconductivity? The credit for the discovery of superconductivity goes to Dutch physicist Heike Kamerlingh Onnes. In 1911, Onnes was studying the electrical properties of mercury in his laboratory at Leiden University in The Netherlands when he found that the electrical resistance in the mercury completely vanished when he dropped the temperature to below 4.2 Kelvin — that's just 4.2 degrees Celsius (7.56 degrees Fahrenheit) above absolute zero. To confirm this result, Onnes applied an electric current to a sample of supercooled mercury, then disconnected the battery. He found that the electric current persisted in the mercury without decreasing, confirming the lack of electrical resistance and opening the door to future applications of superconductivity. History of superconductivity Physicists spent decades trying to understand the nature of superconductivity and what caused it. They found that many elements and materials, but not all, become superconducting when cooled below a certain critical temperature. In 1933, physicists Walther Meissner and Robert Ochsenfeld discovered that superconductors "expel" any nearby magnetic fields, meaning weak magnetic fields can't penetrate far inside a superconductor, according to Hyper Physics, an educational site from the Georgia State University department of physics and astronomy. This phenomenon is called the Meissner effect. It wasn't until 1950 that theoretical physicists Lev Landau and Vitaly Ginzburg published a theory of how superconductors work, according to Ginzburg's biography on The Nobel Prize website. While successful in predicting the properties of superconductors, their theory was "macroscopic," meaning it focused on the large-scale behaviors of superconductors while remaining ignorant of what was going on at a microscopic level. Finally, in 1957, physicists John Bardeen, Leon N. Cooper and Robert Schrieffer developed a complete, microscopic theory of superconductivity. To create electrical resistance, the electrons in a metal need to be free to bounce around. But when the electrons inside a metal become incredibly cold, they can pair up, preventing them from bouncing around. These electron pairs, called Cooper pairs, are very stable at low temperatures, and with no electrons "free" to bounce around, the electrical resistance disappears. Bardeen, Cooper and Schrieffer put these pieces together to form their theory, known as BCS theory, which they published in the journal Physical Review Letters. How do superconductors work? When a metal drops below a critical temperature, the electrons in the metal form bonds called Cooper pairs. Locked up like this, the electrons can't provide any electrical resistance, and electricity can flow through the metal perfectly, according to the University of Cambridge. However, this only works at low temperatures. When the metal gets too warm, the electrons have enough energy to break the bonds of the Cooper pairs and go back to offering resistance. That is why Onnes, in his original experiments, found that mercury behaved as a superconductor at 4.19 K, but not 4.2 K. What are superconductors used for? It's very likely that you've encountered a superconductor without realizing it. In order to generate the strong magnetic fields used in magnetic resonance imaging (MRI) and nuclear magnetic resonance imaging (NMRI), the machines use powerful electromagnets, as described by the Mayo Clinic. These powerful electromagnets would melt normal metals due to the heat of even a little bit of resistance. However, because superconductors have no electrical resistance, no heat is generated, and the electromagnets can generate the necessary magnetic fields. Similar superconducting electromagnets are also used in maglev trains, experimental nuclear fusion reactors and high-energy particle accelerator laboratories.Superconductors are also used to power railguns and coilguns, cell phone base stations, fast digital circuits and particle detectors. Essentially, any time you need a really strong magnetic field or electric current and don't want your equipment to melt the moment you turn it on, you need a superconductor.(opens in new tab) "One of the most interesting applications of superconductors is for quantum computers," said Alexey Bezryadin, a condensed matter physicist at the University of Illinois at Urbana-Champaign. Because of the unique properties of electrical currents in superconductors, they can be used to construct quantum computers. "Such computers are composed of quantum bits or qubits. Qubits, unlike classical bits of information, can exist in quantum superposition states of being '0' and '1' at the same time. Superconducting devices can mimic this," Bezryadin told Live Science. "For example, the current in a superconducting loop can flow clockwise and counterclockwise at the same time. Such a state constitutes an example of a superconducting qubit." What's the latest in superconductor research? The first challenge for today's researchers is "to develop materials that are superconductors at ambient conditions, because currently superconductivity only exists either at very low temperatures or at very high pressures," said Mehmet Dogan, a postdoctoral researcher at the University of California, Berkeley. The next challenge is to develop a theory that explains how the novel superconductors work and predict the properties of those materials, Dogan told Live Science in an email. Superconductors are separated into two main categories: low-temperature superconductors (LTS), also known as conventional superconductors, and high-temperature superconductors (HTS), or unconventional superconductors. LTS can be described by the BCS theory to explain how the electrons form Cooper pairs, while HTS use other microscopic methods to achieve zero resistance. The origins of HTS are one of the major unsolved problems of modern-day physics. Most of the historical research on superconductivity has been in the direction of LTS, because those superconductors are much easier to discover and study, and almost all applications of superconductivity involve LTS. HTS, in contrast, are an active and exciting area of modern-day research. Anything that works as a superconductor above 70 K is generally considered an HTS. Even though that's still pretty cold, that temperature is desirable because it can be reached by cooling with liquid nitrogen, which is far more common and readily available than the liquid helium needed to cool to the even lower temperatures that are needed for LTS. The future of superconductors The "holy grail" of superconductor research is to find a material that can act as a superconductor at room temperatures. To date, the highest superconducting temperature was reached with extremely pressurized carbonaceous sulfur hydride, which reached superconductivity at 59 F (15 C, or about 288 K), but required 267 gigapascals of pressure to do it. That pressure is equivalent to the interior of giant planets like Jupiter, which makes it impractical for everyday applications. Room-temperature superconductors would allow for the electrical transmission of energy with no losses or waste, more efficient maglev trains, and cheaper and more ubiquitous use of MRI technology. The practical applications of room-temperature superconductors are limitless — physicists just need to figure out how superconductors work at room temperatures and what the "Goldilocks" material to allow for superconductivity might be.
<urn:uuid:5103ec36-6929-473c-8881-88d420792ea4>
CC-MAIN-2022-33
https://www.livescience.com/superconductor
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572033.91/warc/CC-MAIN-20220814113403-20220814143403-00119.warc.gz
en
0.92414
1,645
4.28125
4
Google’s recent announcement that its quantum computer had achieved “quantum supremacy” has garnered significant global attention. And for good reason. Sycamore, Google’s 53-bit quantum computer reportedly performed a 200-second calculation in the time it would have taken the world’s fastest supercomputer, the IBM Summit, 10,000 years. Beyond conventional silicon computers, quantum computers represent a new era in the evolution of computational technology. Nonetheless, the challenges confronting the field suggest that there is a very long way to go. Born out of the thinking of Max Planck, Niels Bohr, and Albert Einstein, quantum theory offers new and unexplored potential for driving the evolution of computer science. Quantum computers operate on completely different principles compared to their conventional counterparts. Where classical computers are fast and efficient, they are simply not very good at problems that involve exponential complexity. Quantum researchers utilize the properties of electrons as an engine for performing exponentially fast calculations. Quantum computers are expected to transform cryptography, pattern matching, drug discovery and ultimately boost artificial intelligence (AI) training. However, the current generation of quantum computers are extremely sensitive to perturbations, noise, and other environmental effects that can cause their “quantum state” to waver and disappear— an effect referred to as decoherence. Contemporary quantum computers require exacting demands of stability and temperature for maintaining quantum states. In fact, researchers have only been able to maintain a quantum state for a tiny fraction of a second— not long enough to carry out a useful algorithm. This instability remains the biggest challenge facing quantum computing. Designing a quantum computer with qubits Research on quantum computing remains at a very early stage. Much like the traditional computers introduced in the 1950s, quantum computers remain big, clunky machines. The most common design of quantum computers rely on multiple layers of superconducting circuits sequestered in a controlled environment and cooled step-wise to temperatures colder than deep space. Where a conventional computer uses transistors as a substrate for information processing, quantum computers can use anything that demonstrates quantum behavior. This can include an atom, a molecule, or more commonly, an electron. Due to “superposition”, quantum computers can perform multiple calculations at once, giving them the potential to be exponentially more powerful than conventional computers. Superposition is best understood as the capacity for electrons to be at different positions at the same time. Quantum computers leverage the superposition of quantum states to manage calculations on orders of magnitude faster than silicon processors. As demonstrated by the famous double-slit experiment involving a single photon of light, photons may produce a wavelike interference pattern or superposition of all available paths. The most common quantum computers today leverage electrons to move beyond the binary logic of silicon computing. In conventional computing, information is stored as bits and exist as either ones or zeros. Unlike a conventional bit, the quantum bit or qubit can store and manipulate much more information than just ones and zeros. For example, A 10-qubit quantum computer can process 1,024 possible inputs at once (instead of analyzing them one at a time). The magic of qubits is that they can exist in superposition, or in multiple states at once. Using the example of Schrödinger’s cat, any given qubit can hold a 0 and a 1 at the same time. Thus, a single qubit can represent far more information than a binary bit. As an example, a four-qubit computer register can hold 16 different numbers simultaneously. Using code to manipulate electrons, many engineers are hoping to develop quantum algorithms to exploit the vast computational potential of quantum computers. Generally, the goal is to encode parts of a problem into a complex quantum state using qubits. Then, manipulating that state in order to drive it towards something that will eventually represent the solution. Solutions can be measured by collapsing the superpositions into deterministic sequences of zeros and ones. The race for high-performance quantum computers Quantum computers hold the promise of virtually limitless supercomputing power, pushing the envelope on supercomputing or high-performance computing (HPC). However, the current state of noisy quantum computers have a coherence time of a mere 100 microseconds. This is the maximum length of time in which an experiment can be run on a quantum processor before errors take over. The most common quantum computer designs today consist of superconductor computers and spin computers. Superconductors are the most well-established method for maintaining a quantum state: Metallic superconductors are used at near-zero temperatures in order to conduct electrons. Electrons must be free from all radiation or light particles and kept at a freezing temperature. Google’s quantum computer, for example, is cooled to an astonishing 460 degrees below zero. The more recent spin method of quantum computing uses a single electron within silicon to create qubits. Only a few nanometers in size, these electrons are called quantum dots and can operate at higher temperatures. In fact, a new silicon chip capable of manipulating the spin of a single electron could ultimately allow future quantum computers to be built using conventional electronic technology. Thanks largely to research by IBM, Google, Microsoft and others, the United States remains the leader in patents related to quantum computers. In the future, quantum computers are expected to become very good at highly specific problem-solving. Quantum computing performs best in probabilistic situations such as weather prediction, market forecasting, and breaking encryption. In the U.S., IBM and Google are racing to create the first truly useful quantum computer. In July 2016, Google engineers used a quantum device to simulate a hydrogen molecule. IBM is also working on developing quantum computing technologies and recently introduced the IBM Quantum Experience, a quantum computing platform delivered via the Cloud. Since 2016, IBM has provided researchers with a five-qubit cloud-based quantum computer and made its 20-qubit system available online at the end of 2017. In addition to IBM and Google, D-Wave, a Canadian company based in Vancouver has also been a leader in developing an early-stage quantum computer. D-Wave utilizes a method known as quantum annealing. Running adiabatic quantum computing algorithms, D-Wave’s machine finds a “good enough” or “local minima” solution. Volkswagen has leveraged D-Wave’s quantum annealing technology, using it to carry out research on traffic flow optimization in Beijing with 2,000 qubits. One very promising application of quantum technology is quantum communications. Researchers are working towards creating ultra-secure communication networks that could form the basis of a quantum internet. Where sensitive data is currently encrypted and transmitted using digital “keys” (1 and 0s), quantum communications has already demonstrated the capacity to secure encrypted information using qubits. Quantum key distribution (QKD), for example, combines digitally encrypted data with keys that are encoded and transmitted using quantum state using qubits. China has become a global leader in the drive to develop quantum communication technologies. Pouring vast sums of money into quantum research, China filed almost twice as many patents as the United States in the field of quantum technology in 2017 alone. That same year, the country launched a dedicated quantum communications satellite called Micius, staging the world’s first quantum key distribution-secured video conference between Beijing and Vienna. An arcane field only a decade ago, quantum computing has matured at an astonishing pace. As countries around the world continue to move the needle on supercomputing, we will likely see revolutionary applications in the field of quantum technology. Nonetheless, the mainstream application of quantum computing remains decades away. Quantum computing represents a revolution in computational technologies; that goes without saying. But there remains significant work ahead.
<urn:uuid:3601d90f-a10e-4e1c-911f-3bf8de8f1334>
CC-MAIN-2022-33
https://netsmiami.com/a-deeper-dive-into-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571950.76/warc/CC-MAIN-20220813111851-20220813141851-00520.warc.gz
en
0.907226
1,598
4
4
Unlike Bilbo’s magic ring, which entangles human hearts, engineers have created a new micro-ring that entangles individual particles of light, an important first step in a whole host of new technologies. Entanglement – the instantaneous connection between two particles no matter their distance apart – is one of the most intriguing and promising phenomena in all of physics. Properly harnessed, entangled photons could revolutionize computing, communications, and cyber security. Though readily created in the lab and by comparatively large-scale optoelectronic components, a practical source of entangled photons that can fit onto an ordinary computer chip has been elusive. New research, reported today in The Optical Society‘s (OSA) new high-impact journal Optica, describes how a team of scientists has developed, for the first time, a microscopic component that is small enough to fit onto a standard silicon chip that can generate a continuous supply of entangled photons. The new design is based on an established silicon technology known as a micro-ring resonator. These resonators are actually loops that are etched onto silicon wafers that can corral and then reemit particles of light. By tailoring the design of this resonator, the researchers created a novel source of entangled photons that is incredibly small and highly efficient, making it an ideal on-chip component. “The main advantage of our new source is that it is at the same time small, bright, and silicon based,” said Daniele Bajoni, a researcher at the Università degli Studi di Pavia in Italy and co-author on the paper. “The diameter of the ring resonator is a mere 20 microns, which is about one-tenth of the width of a human hair. Previous sources were hundreds of times larger than the one we developed.” From Entanglement to Innovation Scientists and engineers have long recognized the enormous practical potential of entangled photons. This curious manifestation of quantum physics, which Einstein referred to as “spooky action at a distance,” has two important implications in real-world technology. First, if something acts on one of the entangled photons then the other one will respond to that action instantly, even if it is on the opposite side of a computer chip or even the opposite side of the Galaxy. This behavior could be harnessed to increase the power and speed of computations. The second implication is that the two photons can be considered to be, in some sense, a single entity, which would allow for new communication protocols that are immune to spying. The Latest on: Entanglement on a chip via Google News The Latest on: Entanglement on a chip - A key role for quantum entanglementon July 29, 2022 at 12:29 pm An international team of scientists has now demonstrated experimentally, for the first time, an approach to quantum key distribution that is based on high-quality quantum entanglement -- offering ... - Ships must slow down more often to save whales, feds sayon July 29, 2022 at 8:43 am Vessel strikes and entanglement in fishing gear are the two biggest threats to the giant animals, which number less than 340 and are falling in population. Efforts to save the whales have long ... - Breakthrough could save us from the ‘quantum apocalypse’on July 27, 2022 at 10:11 am And it relies on quantum entanglement: the strange and still mysterious behaviour that Albert Einstein described as “spooky action at a distance”. In today’s computers, communications are ... - Quantum cryptography: Hacking is futileon July 27, 2022 at 10:07 am To create an entanglement, first the scientists excite each of the atoms with a laser pulse. After this, the atoms spontaneously fall back into their ground state, each thereby emitting a photon. - Everything You Wanted to Know about Quantum Computingon July 17, 2022 at 4:59 pm And entanglement is the idea that if you have different pieces ... Here is an example of a multi-qubit chip for computing things, such as quantum simulations of advanced materials, that was created at ... - quantum mechanicson July 12, 2022 at 2:40 am Scientists Capture Photographic Proof of Quantum Entanglement July 15 ... Google Announces ‘Bristlecone’ Quantum Computing Chip March 6, 2018 at 2:23 pm Google has just announced a new ... - Researchers Set New Quantum Entanglement Distance Recordon July 12, 2022 at 2:40 am Scientists have been grappling with the strangeness of quantum entanglement for decades, and it’s almost as mysterious in 2022 as it was when Einstein famously dubbed the phenomenon “spooky ac ... - ICFO boosts performances of fiber-integrated quantum memorieson July 11, 2022 at 12:53 pm Researchers from ICFO (Spain), IFN-CNR (Italy), and Heriot-Watt University (UK), have demonstrated entanglement between a fiber-integrated quantum memory and a telecommunications-wavelength photon. via Bing News
<urn:uuid:c81cc9a4-ec1c-481d-b68a-5d427fde5b54>
CC-MAIN-2022-33
https://innovationtoronto.com/2015/01/entanglement-on-a-chip-breakthrough-promises-secure-communications-and-faster-computers/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571996.63/warc/CC-MAIN-20220814052950-20220814082950-00720.warc.gz
en
0.929333
1,064
3.90625
4
Miniaturization, the process of making components smaller, has been key to the realization of Moore’s Law in computing. Moore’s original observation began with noticing the trend that the number of components integrated into a semiconductor circuit would double each year.1 With this doubling came the increase in computational power that means chips less than a centimeter in size can now perform calculations that would have been unachievable by room-sized supercomputers in the past. Image Credit: Macro photo/Shutterstock.com While the continued exponential scaling of the component number of Moore’s Law appears to have reached its limit2, the huge rise and success of information technology over this period is due to the successful miniaturization of electronic components. The limits being reached now are issues with the thermal load on components and the challenge and costs associated with the machining of chips on this scale and further shrinkage will require the use of new architectures beyond just silicon components. Increasingly, light is now being used in applications that would have previously utilized electrons and therefore been covered by the domain of electronics. Examples of this include the use of optical fibers and light transmission for faster information exchange and light is an emerging technology for use in quantum computing architectures.3 Light has the advantage of being able to travel faster than electrons and can support greater bandwidth in data transfer. However, if photonics is truly to replace electronics, there is a need for photonic chips that can perform logic operations with photons similar to the electronic transistor. Such chips need to be sufficiently small so that they can be incorporated into devices without requiring a significant footprint. Atoms have proved an invaluable tool for quantum information. The ability to prepare atoms in a superposition of states and control this at will using laser fields has made them a popular tool in quantum computers, sensors, and devices.4 However, systems based on optical trapping of atoms require extensive amounts of cooling and often require specially designed buildings and laboratories to manage vibration levels to ensure experiments can remain interferometrically stable. While such technologies seem to be scalable,4 these technologies are a long way from the compact photonics chip. However, recent research from the University of Illinois at Urbana-Champaign has demonstrated a way of creating simple, compact circuits that uses sound waves to control the stability of light.5 The new stabilization scheme is designed to be compatible with atomic control systems and work as an isolator to improve the stability of such experiments. Scaling down large atom-based experiments has proved challenging to date but with the use of these new isolators, smaller-scale quantum devices may now be a possibility. When light interacts with matter, it can undergo several processes. Those include absorption, which can promote atoms into a superposition of states needed for quantum information processes, but also other unwanted processes such as scattering. Even from well-collimated point sources such as lasers, light can be difficult to control with issues such as beam divergence and aberrations introduced by optical components, adding further challenges. The development of optical fibers and waveguides has opened new possibilities for the control of light, including mode selection. Improvements in optics and manufacturing processes have made it possible to create resonators with minimal light loss. A resonator is an optical cavity that allows a beam to travel in a closed path. These are used routinely in laser systems to allow for multiple passes of light over a given distance, enabling greater levels of light amplification. In waveguide-resonator systems, the resonator is coupled to a waveguide and any light that is far detuned from the resonance of the absorber will pass through the waveguide without interruption. For a critical coupling regime, there will be strong absorption of the light by the resonator. By using a chiral waveguide-resonator system, the team created a system that completely blocks light passing in one direction, but not the other. Normally, the waveguide would be transparent to non-resonant wavelengths but they could potentially be transmitted in both directions along the guide were there to be any back reflections. Unidirectionality of this kind has been previously achieved in waveguide-resonator systems but using magnetic fields which often requires additional bulky equipment. The achievement of unidirectionality by switching the resonator material for a chiral substrate removes the need for such fields and is an important step towards the miniaturization of atom-based devices. Back reflections can cause damage to optical components as well as unwanted behavior in optical components, so being able to suppress these also improves the device performance. As the system is compatible with 780 nm light, it is ideal for use with rubidium-atom-based systems that are currently being widely explored for quantum sensors. References and Further Reading - Mack, C. A. (2011). Fifty Years of Moore’s Law. IEEE Transactions on Semiconductor Manufacturing, 24(2), 202–207. https://doi.org/10.1109/TSM.2010.2096437 - Theis, T. N., & Wong, H. P. (2017). The End of Moore’s Law: A New Beginning for Information Technology. Computing in Science & Engineering, 19(2), 41–50. https://doi.org/10.1109/MCSE.2017.29 - Kok, P., Dowling, J. P., & Milburn, G. J. (2007). Linear optical quantum computing with photonic qubits. Reviews of Modern Physics, 79, 135–174. https://doi.org/10.1103/RevModPhys.79.135 - Cirac, J. I., & Zoller, P. (2000). A scalable quantum computer with ions in an array of microtraps. Nature, 404, 579–581. - Sohn, D. B., Örsel, O. E., & Bahl, G. (2021). Electrically driven optical isolation through phonon-mediated photonic Autler – Townes splitting. Nature Photonics. https://doi.org/10.1038/s41566-021-00884-x
<urn:uuid:47637cb1-9f99-4bb3-a6ca-d0355b4367b8>
CC-MAIN-2022-33
https://www.azooptics.com/Article.aspx?ArticleID=2056
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572515.15/warc/CC-MAIN-20220816181215-20220816211215-00721.warc.gz
en
0.925609
1,279
3.984375
4
Quantum communication systems offer the promise of virtually unbreakable encryption. Unlike classical encryption, which is used to send secure data over networks today and whose security depends on the difficulty of solving mathematical problems like the factoring of large numbers, most quantum encryption schemes keep the encryption key separate from the data. This approach ensures that an eavesdropper with access only to the data could not decipher the key. However, researchers have recently demonstrated that even quantum encryption may be susceptible to hacking. In a presentation next month at the Conference on Lasers and Electro-Optics (CLEO: 2013) in San Jose, Calif., Renato Renner of the Institute for Theoretical Physics in Zurich will discuss how he and his team of theoretical physicists are working on new ways to calculate the failure probability of certain quantum encryption schemes. The numbers would allow users to estimate how likely it would be that an adversary could read their secret messages—information that is critical for ensuring the overall security of quantum communications. Quantum key distribution (QKD) is a kind of quantum encryption in which a secret password is shared between two distant parties (usually named Alice and Bob in thought experiments). The secret password, or key, is distributed as bits of quantum data, so that if an eavesdropper (usually named Eve) tries to intercept the message, the bits will be disturbed and Alice and Bob will know the transmission has been compromised. If the key is not disturbed, it can be used to encode messages that are sent over an insecure channel. “The security of Quantum Key Distribution systems is never absolute,” says Renner. He notes that the security of QKD systems depends on three assumptions: the initial secrecy of the password, the correctness and completeness of quantum theory, and the reliability of the devices in the quantum communication system. Recent work by other research groups has illustrated how real-world devices that are not 100 percent reliable can leave weaknesses in quantum communication schemes that may be exploited by a clever hacker. For example, the photon detectors used in QKD should click with a certain probability whenever a photon is detected, but in practice the devices can be “blinded” by a strong light pulse and not click. “In fact, an adversary may use strong light pulses to ‘remotely control’ the detector,” says Renner. Since such bright light hacking techniques were first demonstrated in 2010, physicists have been keen to find ways to calculate the security of quantum encryption schemes without making assumptions about the reliability of the devices. The quest has generated a lot of interest in a field called device-independent cryptography. The Latest Bing News on: - Quantum Computing Threat Treated With Increasing Seriousness by Federal Government With Announcement of New Cryptographic Standards and Toolson August 5, 2022 at 9:00 am Concrete steps to address the threat quantum computing poses to current cryptographic standards have been taken by NIST, as it has selected four algorithms for future use. - The time is now for quantum-safe securityon August 5, 2022 at 6:35 am Agencies must understand what data is at risk and mitigate that risk with crypto-agile solutions as post quantum crypto standards are finalized. - Amazon, IBM Move Swiftly on Post-Quantum Cryptographic Algorithms Selected by NISTon August 4, 2022 at 2:03 pm A month after the algorithms were revealed, some companies have already begun incorporating the future standards into their products and services. - Single-Core CPU Cracked Post-Quantum Encryption Candidate Algorithm in Just an Houron August 4, 2022 at 2:59 am It took researchers about 62 minutes to crack a late-stage Post-Quantum Encryption candidate algorithm using a single-core CPU. - Post-quantum cryptography candidate cracked in hours using simple CPUon August 3, 2022 at 8:09 am Researchers claim to have cracked SIKE using a single-core Xeon processor - a far cry from the exotic world of quantum computers ... - MUCSE introduces first commercial post-quantum cryptography chip ready for the post-quantum eraon August 2, 2022 at 3:16 am Wuxi,Jiangsu,China -- Beijing, July 11, 2022, MUCSE, a pioneer in security and integrated circuit, announced its latest secure chip -- PQC 1.0. The PQC 1.0 is believed to be the first commercial ... - Hack Post-Quantum Cryptography Now So That Bad Actors Don’t Do It Lateron July 28, 2022 at 8:33 am The U.S. government should consider offering a public cash bounty to anyone who can crack the new forms of encryption that are being rolled out to defend against quantum computers. - Quantum cryptography: Hacking is futileon July 27, 2022 at 1:00 pm An international team has successfully implemented an advanced form of quantum cryptography for the first time. Moreover, encryption is independent of the quantum device used and therefore even more ... - A key role for quantum entanglementon July 27, 2022 at 10:57 am A method known as quantum key distribution has long held the promise of communication security unattainable in conventional cryptography. An international team of scientists has now demonstrated ... - IBM bolsters quantum cryptography for z16 mainframeon July 27, 2022 at 8:03 am IBM adds NIST’s new public-key encryption and digital signatures algorithms to defend against attacks by future quantum computers. The Latest Google Headlines on: The Latest Bing News on: - University of California, Los Angeles: UCLA-led team develops new approach for building quantum computerson August 3, 2022 at 6:32 am Quantum computing, though still in its early days, has the potential to dramatically increase processing power by harnessing the strange behavior of particles at the smallest scales. Some research ... - Developing a new approach for building quantum computerson August 2, 2022 at 10:00 am In the long term, quantum computers could provide unbreakable encryption and simulations of nature beyond today's capabilities. A UCLA-led interdisciplinary research team including collaborators ... - ‘Quantum cryptography’ raises possibility of unbreakable codeson July 27, 2022 at 4:01 pm Scientists have achieved a new form of quantum cryptography that harnesses the laws of physics to create unbreakable codes ... on the internet uses a form of encryption based on the mathematics ... - Why Public-Private Partnership Can Spur QKD Adoption In The U.S.on July 27, 2022 at 6:15 am Using QKD, two parties can create a shared random “key” to encrypt and decrypt messages, i.e., data communication sent through a designed fiberoptic cable. - The Best VPN Apps: Top 10 for 2022on July 15, 2022 at 2:28 pm This VPN service boasts lightning-fast bandwidth, unbreakable encryption, and an independently certified no-logs policy. Because of its user-friendly interfaces and superior functionality ... - Mega's unbreakable encryption proves to be anything buton June 22, 2022 at 2:01 pm Mega, the New Zealand-based file-sharing biz co-founded a decade ago by Kim Dotcom, promotes its "privacy by design" and user-controlled encryption keys to claim that data stored on Mega's servers ... - Message Encryption on Androidon February 1, 2022 at 4:04 am Install Unbreakable SMS from the Android Market. Open Unbreakable SMS and set an encryption password, known as a cipher key in the app. Type a message into the Type Your Plain Text Message box ... - What is encryption?on October 1, 2021 at 8:12 am If you've read anything about technology in the last few years, you may have seen the term encryption floating around. It's a simple concept, but the realities of its use are enormously complicated. - Unbreakable Encryption: Work Has Begun on the World's First Quantum Enigma Machineon September 15, 2016 at 7:04 am Several recent studies in cryptography and encryption have led scientists to theorize that we could send an unbreakable encrypted message with a key that is much shorter than the message itself.
<urn:uuid:4e658b49-63a7-4ea0-9f0d-19d41247b19e>
CC-MAIN-2022-33
https://innovationtoronto.com/2013/05/just-how-secure-is-quantum-cryptography/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572212.96/warc/CC-MAIN-20220815205848-20220815235848-00321.warc.gz
en
0.916283
1,727
3.703125
4
⇧ [VIDÉO] You might also like this partner content (after ad) The search for ever higher computer performance is a strong motivation for scientists. The computers of tomorrow will be quantum, allowing rapid and extremely complex calculations, the complete simulation of molecules, or the development of innovative materials. However, before accessing it, it is first necessary to create the components of these supercomputers. Recently, engineers in Sydney demonstrated a quantum integrated circuit made of silicon, made up of 10 phosphorus atoms. This represents an important step in the development of useful quantum computing in real conditions. By precisely controlling the quantum states of atoms — the different energy levels of the electrons belonging to the atom — the new silicon processor can simulate the structure and properties of an organic molecule with astonishing precision. The atomic-scale integrated circuit milestone is the culmination of 20 years of research led by Scientia’s Michelle Simmons, founder of UNSW start-up Silicon Quantum Computing (SQC). In 2012, his team had created the very first “quantum transistor”. Transistors are small electronic components that store bits of information. They are made with semiconductor materials, allowing a switching effect and the encoding of information. This is because in semiconductors there is a large group of electrons. However, according to quantum mechanics, an electron can only occupy certain energy levels. This is how the levels of the electrons making up the semiconductor correspond to “bands” or variations in permitted energy values. When a transistor is turned on — the electrical voltage is in the energy band — current flows and the computer detects the value “1”. When a transistor is in off mode—the electrical voltage is outside the permitted energy band—current no longer flows and the computer interprets this as a “0” value. Remember that a quantum computer is the equivalent of classical computers, but performing its calculations using the laws of quantum physics directly. While a classical computer manipulates bits of information, which are either 0s or 1s, a quantum computer uses qubits. These are generalizations of the classical bits, which are sort of a simultaneous superposition of these two states. Thus, recently, a team of quantum computing physicists from UNSW Sydney, in partnership with the start-up Silicon Quantum Computing, designed an atomic-scale quantum processor to simulate the behavior of a small organic molecule, mimicking its structure and energy states. This represents a major milestone in the race to build the world’s first quantum computer, and demonstrates the team’s ability to control the quantum states of electrons and atoms in silicon to a level never before achieved. Their results are published in the journal Nature. Imitate nature, but in a very demanding way This technological innovation addresses a challenge first postulated by pioneering theoretical physicist Professor Richard Feynman in his famous 1959 lecture. Plenty of Room at the Bottom. During this lecture, Feynman asserted that in order to understand how nature works, it is essential to be able to control matter at the same length scales from which matter is constructed — that is, to be able to controlling matter on the atomic scale. Scientia Professor Michelle Simmons, lead researcher on the study, said in a statement: And so that’s what we do, we literally build it from the bottom up, where we mimic the polyacetylene molecule by putting atoms in the silicon with the exact distances that represent the single and double carbon-carbon bonds “. This molecule has the advantage of being well known by researchers. They can therefore immediately determine the consistency of the result, and by extension the reliability of the chip. To design the first quantum integrated circuit, the team had to perform three distinct technological feats of atomic engineering, in near-absolute vacuum. Indeed, at this scale, a single hydrogen atom can compromise the whole manipulation. The first feat was to create small dots of uniformly sized atoms, so their energy levels would line up and electrons could easily pass through them. These dots, called Quantum Dots (QD), are dots of phosphorus atoms. By configuring their layouts, they can behave like real quantum transistors. In the present study, the quantum integrated circuit includes a chain of 10 quantum dots to simulate the precise location of atoms in the polyacetylene chain. Nevertheless, the tolerable energy band, as mentioned earlier for conventional transistors, is extremely small. This is where the second technological feat comes in, the ability to adjust the energy levels of each point individually, but also of all the points collectively. So, using a nanometric precision system, they added six control electrodes (G1 to G6 in the image below) to adjust the energy levels. This gives complete control of where electrons exist in the polyacetylene chain. By adding source (S) and drain (D) conductors, they could then measure the current flowing through the device as electrons passed through the string of 10 quantum dots. Finally, the third technical challenge was to achieve the ability to control distances between points with sub-nanometer precision. If they are too close, the energy produced is too powerful to be mastered. If they are too far apart, interactions between them become risky. The points must therefore be close enough, but remain independent, to allow the coherent transport of electrons through the chain. To be doubly sure of this consistency of the results produced by the circuit, the researchers simulated two different strands of the polymer chains at 10 points of the molecule. In the first device they cut a piece of chain to leave double bonds at the end giving 10 peaks in the current. In the second device, they cut a different fragment of the chain to leave single bonds at the end, resulting in only two peaks in the current. The current through each chain was therefore radically different due to the different bond lengths of the atoms at the end of the chain. Professor Simmons explains: “ What this shows is that you can literally mimic what is actually going on in the molecule. And that’s why it’s exciting because the signatures of the two chains are very different. Most other quantum computing architectures lack the ability to engineer atoms with sub-nanometer precision or allow atoms to be that close. This means that we can now begin to understand increasingly complicated molecules by putting the atoms in place as if they were mimicking the real physical system. “. And now ? Quantum biology… According to Professor Simmons, it is not by chance that a carbon chain of 10 atoms was chosen, because it is within the size limit of what a conventional computer is able to calculate, with up to 1024 distinct interactions of electrons in this system. Increasing it to a chain of 20 points would see the number of possible interactions increase exponentially, making it difficult for a typical computer to solve. She says: ” We are approaching the limit of what conventional computers can do, so this is like a step into the unknown. […] We are going to be able to understand the world in a different way, by addressing fundamental questions that we have never been able to answer before “. Moreover, we are talking about quantum biology. This recent disciplinary field deals with the study of processes at work in living organisms involving the laws of quantum physics. Photosynthesis, the orientation of migratory birds or even bioluminescence are governed by quantum processes. Understanding these phenomena paves the way for many innovations in the field of biomimicry. The team believes that the development of quantum computers is on a trajectory comparable to the evolution of classical computers — from a transistor in 1947 to an integrated circuit in 1958, then small computer chips that have been integrated into commercial products, like calculators or so, five years later. Incidentally, the production of this atomic-scale integrated circuit, which functions as an analog quantum processor, came less than a decade after the team declared (in 2012) that they had made the first transistor. single atom in the world, completed two years ahead of schedule. Finally, using fewer components in the circuit to control the qubits minimizes the amount of any interference with quantum states, allowing devices to be scaled up to create more complex and powerful quantum systems.
<urn:uuid:92a429a4-5a5e-4cb8-9e79-09628c75c690>
CC-MAIN-2022-33
https://articleyarn.com/an-atomic-scale-integrated-quantum-circuit-propels-us-towards-quantum-computers/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570879.37/warc/CC-MAIN-20220809003642-20220809033642-00524.warc.gz
en
0.937817
1,677
3.9375
4
What’s Under the Hood of A Quantum Computer? (PhysicsToday) The separation between hardware and user interface is the product of decades of development. Now quantum computer developers are navigating similar terrain. The quantum computing stack is everything that lies between a user and the physical qubits. The stack needs to perform essential functions; for instance, it must facilitate user interaction, turn inputs into hardware manipulation, and correct for numerous error sources. There’s no one right way to divide those tasks into discrete levels, though, and researchers and technology companies are still pursuing different visions for future quantum architectures. Harrison Ball, Michael Biercuk, and Michael Hush present the quantum computing stack proposed by Q-CTRL in this articl NOTE: The following is a summary by IQT-News’ of this article’s description of each of the key components of a quantum computer. The summary and the original article are both worth the time to read. Additional sources are provided in the original article as well for indepth followup. Classical computers store information as bits that each take a value of 0 or 1. Underlying those bits are field-effect transistors that act as switches; each can take a value of either 0 or 1 depending on whether the switch is on or off. At the most basic level, everything a computer does—save information, execute calculations, run programs—is just manipulating the values of those billions of bits with small electrical voltages. Quantum computers instead rely on qubits that can be in one of two states, ∣0〉 or ∣1〉, or a linear superposition. Whereas classical computing has largely settled on one type of bit hardware, qubits still come in many varieties. Any two-level quantum system—a nuclear spin, a photon’s polarization, or a quantum dot’s spin, to name a few—can be used as a qubit. The usefulness of a particular system, however, depends on things such as how easily the qubits are to manipulate and entangle, how long they remain in desired quantum states, and how prone they are to having their states destroyed by outside noise. Qubits are prone to errors. All sorts of environmental factors—thermal fluctuations, electromagnetic radiation, magnetic fields—can knock a qubit out of its intended state. That degradation of information is known as decoherence and can occur in a fraction of a second. Despite the use of refrigeration to reduce thermal fluctuations, decoherence eventually creeps in and produces hardware errors, like accidentally flipping a qubit’s state from ∣0〉 to ∣1〉. (The commonly used refrigeration systems, like the one shown above from IBM, are what many people picture when they imagine a quantum computer.) The number of operations that can be performed with a qubit is limited by the qubit’s decoherence time. Moreover, every set of qubit hardware has its own unique deviations from ideal performance. Hardware-aware quantum compiler The hardware-aware quantum compiler, also known as a transpiler, is responsible for figuring out how to complete a set of logic operations in a manner that accounts for the physical connections between qubits. Although physical qubits can’t easily be moved, the states of two qubits can be swapped for an effective rearrangement. The transpiler works out how to implement an arbitrary operation between qubits given the hardware constraints, such as which qubits are directly connected to each other. It also decides which qubits to use for each operation—for instance, if a particular qubit is known to be faulty, information might need to be routed around it. Quantum Error Corrector Correcting qubit errors with QEC is inherently resource intensive—some current schemes use tens of physical qubits per logical block—and will likely require more qubits than are available in existing devices to provide any real benefit. Accordingly, QEC is more important in the long term than it is for current machines. Quantum firmware aims to reduce the burden on QEC routines by dealing with more predictable noise, thereby improving QEC’s resource efficiency. Logical-level compilation and circuit optimization A single algorithm can be represented by multiple logically equivalent circuits, and the goal of circuit optimization is to find the one requiring the fewest operations or timesteps. Executing fewer operations enables the algorithm to run faster—an important goal for any quantum computer, whether or not it is using QEC. Quantum algorithms and applications Quantum algorithms play the same role as classical algorithms: They provide step-by-step instructions for completing a computational task. Although a regular algorithm could in principle be run on a quantum computer, a true quantum algorithm takes advantage of the underlying hardware’s quantum nature. A variational quantum algorithm is a compromise between classical and quantum ones. It breaks up a computation into a small quantum component and a larger classical optimization problem and therefore requires a much smaller quantum computer than, say, the quantum Fourier transform. Such algorithms are promising for solving problems in finance, logistics, and chemistry. User interface, QAAS, and operating system Most people who want to use quantum computers aren’t going to build or even buy one—at least not anytime soon. To facilitate access to the limited existing quantum computing resources, companies have put together cloud-based infrastructures that allow remote operation. As in a classical computer, the highest level of the quantum computing stack provides the interface that users interact with.
<urn:uuid:8fce0663-10ed-44a0-8469-631fc337bb17>
CC-MAIN-2022-33
https://www.insidequantumtechnology.com/news-archive/whats-under-the-hood-of-a-quantum-computer/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572515.15/warc/CC-MAIN-20220816181215-20220816211215-00723.warc.gz
en
0.912287
1,161
3.5625
4
A new phase of matter was observed in a quantum computer after physicists pulsed light on its qubits in a pattern inspired by the Fibonacci sequence. If you think this is mind-boggling, then this peculiar quirk of quantum mechanics behaves as if it has two time dimensions, rather than one; A trait that the scientists say makes qubits more powerful and able to remain stable for the duration of the experiment. This stability is called quantum coherence, and it is one of the main goals of an error-free quantum computer—and one of the most difficult to achieve. The work represents “a completely different way of thinking about the phases of matter,” according to computational quantum physicist Felipe Domitrescu of the Flatiron Institute, and lead author of a new research paper describing the phenomenon. “I’ve been working on these theoretical ideas for over five years, and seeing them actually come true in experiments is exciting.” Quantum computing is based on qubits, the quantum equivalent of computing qubits. However, when bits process information in one of two states, 1 or 0, they can be qubits at once, a condition known as quantum superposition. The mathematical nature of this superposition can be incredibly powerful from a computational point of view, making short problem solving under the right conditions. But the uncertain and unstable nature of a series of qubits also depends on how their oscillating states relate to each other – a relationship called entanglement. Frustratingly, qubits can get entangled with almost anything in their environment, which leads to errors. The more sensitive a qubit’s fuzzy state is (or the more messy its environment), the greater the risk that it will lose this coherence. Improving coherence to a point of feasibility is likely a multi-tactic approach to removing a major hurdle standing in the way of a functional quantum computer – every little bit makes a difference. “Even if you keep all of the atoms under tight control, they can lose their quantity by talking to their environment, heating up or interacting with things in ways they didn’t plan for,” Domitrescu explained. “In practice, experimental devices contain many error sources that can degrade coherence after a few laser pulses.” One way to protect qubits from decoherence is to enforce symmetry. Rotate an ordinary old square ninety degrees, and it’s still effectively the same shape. This symmetry protects it from certain rotational effects. Clicking the qubits with evenly spaced laser pulses ensures symmetry that does not depend on space, but rather in time. Domitrescu and colleagues wanted to see if they could increase this effect by adding, not symmetric periodic, but asymmetric quasi-periodic. They assumed that this would not add a one-time symmetry, but a one-time symmetry; One is actually buried inside the other. The idea was based on previous work by the team that proposed creating a so-called quasicrystalline in time, rather than space. When a crystal consists of a symmetrical network of atoms that repeat in space, such as a square lattice forest gym or honeycomb, the pattern of atoms on a semi-crystal is non-repetitive, like a Penrose tiling, yet still ordered. The team conducted their experiment on a high-end commercial quantum computer designed by Quantinuum, a quantum computing company. This monster employs 10 atoms of ytterbium (one of the favorite elements of atomic clocks). These atoms are kept in an electric ion trap, through which laser pulses can be used to control or measure them. Domitrescu and colleagues created a series of laser pulses based on Fibonacci numbers, with each part representing the sum of the previous two parts. This results in an ordered, but not repeating, sequence, just like a quasicrystal. Semi-crystalline crystals can be described mathematically as lower dimensional sections of higher dimensional lattices. Penrose tiling can be described as a two-dimensional slice of a five-dimensional cube. In the same way, the team’s laser pulses can be described as a one-dimensional representation of a two-dimensional pattern. In theory, this meant that it would likely impose two time symmetries on the qubits. The team tested their work by flashing lasers into a ytterbium qubit, first in symmetrical sequences, then semi-periodically. Then they measured the coherence of two qubits on either side of the trap. For the periodic sequence, the qubits were stable for 1.5 s. For the quasi-periodic sequences, they remained stable for 5.5 s – the duration of the experiment. The additional time symmetry added another layer of protection against quantum decoherence, the researchers said. “With this quasi-periodic sequence, there is a complex evolution that eliminates all the errors that live on the edge,” Domitrescu said. “Because of that, the edge stays quantum mechanically coherent a lot, much longer than you’d expect.” The researchers said the work is nowhere near ready to be integrated into functional quantum computers, but it does represent an important step toward that goal. The search was published in temper nature. #strange #phase #matter #appears #occupy #time #dimensions
<urn:uuid:70efbde1-8b25-4a82-a150-493e6c80e958>
CC-MAIN-2022-33
https://zeliw.com/cbmiymh0dhbzoi8vd3d3lnnjawvuy2vhbgvydc5jb20vys1uzxctcxvhbnr1bs1wagfzzs1vzi1tyxr0zxitymvoyxzlcy1sawtllwl0lwhhcy10d28tdgltzs1kaw1lbnnpb25z0geaoc5/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571472.69/warc/CC-MAIN-20220811133823-20220811163823-00524.warc.gz
en
0.937506
1,141
3.703125
4
The IBM lab responsible for inventing the scanning tunneling microscope and the atomic force microscope has invented another critical tool for helping us understand the nanoscale. Accurately measuring the temperature of objects at the nanoscale has been challenging scientists for decades. Current techniques are not accurate and they typically generate artifacts, limiting their reliability. In the 1980s, IBM scientists Gerd Binnig and the late Heinrich Rohrer wanted to directly explore a surface’s electronic structure and imperfections. The instrument they needed to take such measurements didn’t exist, yet. So they did what any good scientist would do: they invented one. It became known as the scanning tunneling microscope (STM), opening the door to nanotechnology. Just a few years later, the invention was recognized with the highest of honors, the Nobel Prize for Physics in 1986. More than 30 years later IBM scientists continue to follow in the footsteps of Binnig and Rohrer and with their latest invention. Dr. Fabian Menges, an IBM post-doc and co-inventor of the technique said, “We started back in 2010 and simply never gave up. Previous research was focused on a nanoscale thermometer, but we should have been inventing a thermometer for the nanoscale — an important distinction. This adjustment led us to develop a technique which combines local thermal sensing with the measuring capability of a microscope — we call it scanning probe thermometry.” IBM scientist Fabian Menges with his invention. How it Works: A Scanning Probe Thermometry The most common technique to measure temperature on the macroscale is to bring a thermometer into thermal contact with the sample. This is how a fever thermometer works. Once it’s placed under our tongue it equilibrates to our body temperature so that we can determine our temperature at a healthy 37 degrees C. Unfortunately, it gets a little more challenging when using a thermometer to measure a nanoscopic object. For example, it would be impossible to use a typical thermometer to measure the temperature of an individual virus. The size of the virus is too small and the thermometer cannot equilibrate without significantly disturbing the virus temperature. To solve this challenge, IBM scientists developed a single scan non-equilibrium contact thermometry technique to measure the temperature of nanoscopic objects using a scanning probe. As the scanning probe thermometer and the object cannot thermally equilibrate at the nanoscale, two signals are measured simultaneously: a small heat flux, and its resistance to heat flow. Combining these two signal the temperature of nanoscopic objects can then be quantified for an accurate result. IBM scientist Dr. Bernd Gotsmann and co-inventor explains, “The technique is analogous to touching a hot plate and inferring its temperature from sensing the heat flux between our own body and the heat source. Essentially, the tip of the probe is our the hand. Our perception to hot and cold can be very helpful to get an idea of an objects temperature, but it can also be misleading if the resistance to heat flow is unknown.” Previously, scientists weren’t accurately including this resistance dependence; but only measuring the rate of the thermal energy transfer through the surface, know as heat flux. In the paper, the authors included the effects of local variations of thermal resistance to measure the temperature of an indium arsenide (InAs) nanowire, and a self-heated gold interconnect with a combination of a few-miliKelvin and few-nanometer spatial resolution. Menges adds, “Not only is the scanning probe thermometer accurate, it meets the trifecta for tools: it’s easy to operate, simple to build, and very versatile, in that it can be used to measure the temperature of nano- and micro-sized hot spots that can locally effect the physical properties of materials or govern chemical reactions in devices such as transistors, memory cells, thermoelectric energy converters or plasmonic structures. The applications are endless.” From left to right, IBM scientists Nico Mosso, Bernd Gotsmann, Fabian Motzfeld and Fabian Menges in the Noise Free Lab. Noise Free Labs It’s no coincidence that the team began to see improvements in the development of the scanning probe thermometer 18 months ago when they moved their research into the new Noise Free Labs — six meters underground at the Binnig and Rohrer Nanotechnology Center on the campus of IBM Research-Zurich. This unique environment, which shields the experiments from vibration, acoustic noise, electromagnetic signals and temperature fluctuations, helped the team achieve sub-milli Kelvin precision. “While we had the benefit of this unique room, the technique can also produce reliable results in normal environment,” said Menges. “We hope the paper will produce both a lot of excitement and relief for scientists, who like us, have been searching for such a tool,” said Gotsmann. “Similar to the STM, we hope to license this technique to tool manufacturers who can then bring it to market as an additional function to their microscopy product line.” The scientists would like to thank the 7th Program Framework for its support under the NANOHEAT project and the Swiss National Science Foundation. A team formed by IBM Research scientist Dr. Leo Gross, University Regensburg professor Dr. Jascha Repp, and University Santiago de Compostela professor Dr. Diego Peña Gil has received a European Research Center (ERC) Synergy Grant for their project “Single Molecular Devices by Atom Manipulation” (MolDAM). In the paper “Coherent spin manipulation of individual atoms on a surface,” published in the journal Science, our team demonstrated the use of single atoms as qubits for quantum information processing. This is the first time a single-atom qubit has been achieved using a Scanning Tunneling Microscope. Our team at IBM Research developed a new technique to control the magnetism of a single copper atom, a technology that could one day allow individual atomic nuclei to store and process information. In a paper published today in the journal Nature Nanotechnology, our team demonstrated that we can control the magnetism of a single […]
<urn:uuid:887ea79b-4589-4cd4-853f-e2d7fcbd8440>
CC-MAIN-2022-33
https://www.ibm.com/blogs/research/2016/03/ibm-scientists-invent-a-thermometer-for-the-nanoscale/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571758.42/warc/CC-MAIN-20220812200804-20220812230804-00525.warc.gz
en
0.925397
1,324
3.859375
4
Transporting renewable energy to where it’s needed lies at the heart of the human endeavour to get rid of the need for fossil fuels. Superconductors can do so without loosing any of the precious electricity on the way, seemingly defying physical intuition. Find out in this article why many body physics is needed to understand their counter-intuitive behaviour, what role quantum entanglement plays and how quantum computation might lead to the discovery of materials which may give us the tools for a greener future. Dealing with climate change and the shortening fossil resources of our planet is one of the most pressing problems of our generation. Physically, both issues arise from the fact that fossil fuels are incredibly convenient for solving the two most important human tasks: Producing energy and transporting it to where it’s needed. With oil, the former task has been done by nature in the last couple of million years. We just have to pump the ready-made product out of the earth. Transportation is also easy due to its incredible energy density. Just 50kg of oil can carry a car weighing 2 metric tonnes for a thousand kilometres! The curse of Ohm At first sight, both problems are not that hard to solve. We know how to harvest the sun’s energy with solar panels, so why don’t we just put a lot of them in the deserts of the earth and then transport the electricity to cities with long cables? The main reason is probably of political nature (deserts close to Europe for example have been war zones recently), but there is also a physical aspect: With current technology, transporting energy comes with a price in the form of Ohm’s law, which holds in all normal metals like copper and iron which we use to transport electricity. Because of Ohm’s law, we inevitably lose energy when we transport it. And there is also another problem: Because the loss of energy happens in the form of heat, cables have to be thick enough (at a given energy throughput) so they don’t melt. Most current energy transport also happens at high voltage (U), because then the energy loss (P) due to heating is less. But a high voltage also means high electric fields and those fields can be damaging to electronics and humans and we need to make sure that there is no lightning jumping from the cable to the ground. All of these reasons justify why you see all these ugly masts everywhere in today’s civilized world. While you may say that aesthetics is maybe not the most important thing when it comes to an issue endangering millions of people, reality is that most people would not like to have one of these masts in their front yard. In Germany, this fact has led to the stalling of the “Energiewende” because important electricity transport lines from the windy north (where most renewable energy is produced) to the population and industry centres in the south (where all those shiny cars are built) can’t be set up due to resistance of the population living along the planned route. But there actually are materials with which you can transport the same amount of electricity as those huge masts in a single cable of just a few cms diameter under any old road! In superconductors, Ohm’s curse doesn’t hold and so they can conduct electricity without any energy loss (and with that I literally mean zero loss). How is that possible you may ask? Doesn’t this sound like a perpetuum mobile, something like a car that keeps on rolling when you just set it moving once? Every time something counter-intuitive happens in physics, chances are that it’s quantum mechanics that lies at the base of it and it is no different in superconductors. In fact, you can describe a cable of superconducting material with a single wave function, as if it was just one huge quantum particle moving. Superconductors are one of the few examples where quantum effects become truly macroscopic; the kilometres of coherence length reached surmount the wavefunction extent of the electron in a hydrogen atom by a factor of 10’000’000’000’000! Put differently, if an electron wavefunction would be the size of a human then the wavefunction in a superconductor would be as large as the distance between Earth and Pluto! Electron couple dance How does this happen exactly and why does this lead to frictionless flow of electricity? While the exact explanation of this is quite involved and resulted in multiple (!) nobel prizes being awarded to the theory’s discoverers, I want to give a simple picture of analogy here. In a normal conductor, electrons are lone wolfs, they fight themselves through the mace of atoms and get pushed around by them, loosing energy to the crystal lattice every time they bump into something. In a superconductor, something beautiful happens: As the temperature is lowered, electrons suddenly realize that they are note alone, and start to assemble in pairs (called “Cooper pairs” after their discoverer Leon Cooper, Nobel prize ’72). These pairs can then be regarded as one entity, just like a married couple often assumes one name. In the superconductor, this means that the “particles” with which we can base our theoretical description upon, are note the electrons any more, but the new “quasiparticle” (check out our article about those!) which we just called Cooper pair. But there is something weird in this picture. Everyone knows that all electrons are negatively charged and equal charges repel each other. So how can they suddenly do the opposite? Overcoming this difficulty was the insight of Bardeen, Cooper and Schrieffer who jointly got awarded the nobel prize for this. They showed that what happens is that when an electron flies through the lattice it also distorts the regularly ordered arrays of atoms. If the relaxation of this distortion is much slower than the time between two electrons passing the same place, then a second electron will feel the effect of this distortion and gets attracted by it. Effectively, blending out the lattice, the first electron has therefore exerted an attractive force on the second. It’s also clear from this picture, that this attraction will be a quite long ranged force between the electrons. In fact, in typical superconductors, the distance between the two constituents of a cooper pair is hundreds of times larger than the distance between atoms in the crystal. This means that I should have drawn the arms on above picture much much longer! The dancing couples merge How does superconductivity arise from the cooper pairs? To understand that, we must first understand what’s so special about this pairs. They differ in one substantial property from the electrons: while two electrons can’t be in the same place at the same time (a purely quantum mechanical effect also termed “Pauli exclusion principle”), two,three, four, even hundreds of Cooper pairs can! And at very low temperatures, they also do. In fact, they get so close to each other that their quantum mechanical wave functions start to overlap, so strongly that all of them can in fact be descriped by one, macroscopically large wavefunction. A Bose Einstein Condensate (BEC) has been born, one of the only macroscopic quantum effects known so far. One of the most counter-intuitive properties of this BEC is that it is also a superfluid, a fluid which can flow without any friction! This means that if you set this fluid in motion, it will never stop! And this is exactly how superconductivity emerges: a superfluid of cooper pairs has the property we were trying to explain all along: It flows without friction through its host material, i.e. without any resistance. Can this even be used? Yes and it already is! Ever seen those high-speed Maglev trains in Japan? They are based on yet another weird effect of superconductors: They push magnetic fields out of themselves! Maglevs are using this by levitating on superconducting magnets. But also the application discussed in the beginning is not in the too far future. First kilometer-long cables of superconducting material have already been built and the proof of principle been shown. The problem however remains that one has to cool these materials with liquid nitrogen for them to be superconducting. There is however a whole different class of materials in which superconductivity ocurrs at much higher temperatures. Somewhat uncreatively, they are called “High temperature superconductors”. And even 30 years after their discovery it still remains a secret how superconductivity emerges in them as the picture which I presented above can’t be used for understanding them. One thing is clear however: Quantum mechanics deeply has its mysterious fingers in their inner workings. Exciting times are ahead as today’s and tomorrow’s quantum computers study quantum materials like superconductors and they might lead to even more counter-intuitive, exciting and useful phenomena in the future! Stay tuned for more!
<urn:uuid:bf133077-325b-4f34-aaa8-6584070ddb1a>
CC-MAIN-2022-33
https://manybodyphysics.com/2018/12/13/how-quantum-physics-may-save-earth-from-global-warming/?shared=email&msg=fail
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573104.24/warc/CC-MAIN-20220817183340-20220817213340-00329.warc.gz
en
0.953902
1,895
3.515625
4
Try a quick experiment: Take two flashlights into a dark room and shine them so that their light beams cross. Notice anything peculiar? The rather anticlimactic answer is, probably not. That’s because the individual photons that make up light do not interact. Instead, they simply pass each other by, like indifferent spirits in the night. But what if light particles could be made to interact, attracting and repelling each other like atoms in ordinary matter? One tantalizing, albeit sci-fi possibility: light sabers — beams of light that can pull and push on each other, making for dazzling, epic confrontations. Or, in a more likely scenario, two beams of light could meet and merge into one single, luminous stream. It may seem like such optical behavior would require bending the rules of physics, but in fact, scientists at MIT, Harvard University, and elsewhere have now demonstrated that photons can indeed be made to interact — an accomplishment that could open a path toward using photons in quantum computing, if not in lightsabers. In a paper published today in the journal Science, the team, led by Vladan Vuletic, the Lester Wolfe Professor of Physics at MIT, and Professor Mikhail Lukin from Harvard University, reports that it has observed groups of three photons interacting and, in effect, sticking together to form a completely new kind of photonic matter. In controlled experiments, the researchers found that when they shone a very weak laser beam through a dense cloud of ultracold rubidium atoms, rather than exiting the cloud as single, randomly spaced photons, the photons bound together in pairs or triplets, suggesting some kind of interaction — in this case, attraction — taking place among them. While photons normally have no mass and travel at 300,000 kilometers per second (the speed of light), the researchers found that the bound photons actually acquired a fraction of an electron’s mass. These newly weighed-down light particles were also relatively sluggish, traveling about 100,000 times slower than normal noninteracting photons. Vuletic says the results demonstrate that photons can indeed attract, or entangle each other. If they can be made to interact in other ways, photons may be harnessed to perform extremely fast, incredibly complex quantum computations. “The interaction of individual photons has been a very long dream for decades,” Vuletic says. Vuletic’s co-authors include Qi-Yung Liang, Sergio Cantu, and Travis Nicholson from MIT, Lukin and Aditya Venkatramani of Harvard, Michael Gullans and Alexey Gorshkov of the University of Maryland, Jeff Thompson from Princeton University, and Cheng Ching of the University of Chicago. Biggering and biggering Vuletic and Lukin lead the MIT-Harvard Center for Ultracold Atoms, and together they have been looking for ways, both theoretical and experimental, to encourage interactions between photons. In 2013, the effort paid off, as the team observed pairs of photons interacting and binding together for the first time, creating an entirely new state of matter. In their new work, the researchers wondered whether interactions could take place between not only two photons, but more. “For example, you can combine oxygen molecules to form O2 and O3 (ozone), but not O4, and for some molecules you can’t form even a three-particle molecule,” Vuletic says. “So it was an open question: Can you add more photons to a molecule to make bigger and bigger things?” To find out, the team used the same experimental approach they used to observe two-photon interactions. The process begins with cooling a cloud of rubidium atoms to ultracold temperatures, just a millionth of a degree above absolute zero. Cooling the atoms slows them to a near standstill. Through this cloud of immobilized atoms, the researchers then shine a very weak laser beam — so weak, in fact, that only a handful of photons travel through the cloud at any one time. The researchers then measure the photons as they come out the other side of the atom cloud. In the new experiment, they found that the photons streamed out as pairs and triplets, rather than exiting the cloud at random intervals, as single photons having nothing to do with each other. In addition to tracking the number and rate of photons, the team measured the phase of photons, before and after traveling through the atom cloud. A photon’s phase indicates its frequency of oscillation. “The phase tells you how strongly they’re interacting, and the larger the phase, the stronger they are bound together,” Venkatramani explains. The team observed that as three-photon particles exited the atom cloud simultaneously, their phase was shifted compared to what it was when the photons didn’t interact at all, and was three times larger than the phase shift of two-photon molecules. “This means these photons are not just each of them independently interacting, but they’re all together interacting strongly.” The researchers then developed a hypothesis to explain what might have caused the photons to interact in the first place. Their model, based on physical principles, puts forth the following scenario: As a single photon moves through the cloud of rubidium atoms, it briefly lands on a nearby atom before skipping to another atom, like a bee flitting between flowers, until it reaches the other end. If another photon is simultaneously traveling through the cloud, it can also spend some time on a rubidium atom, forming a polariton — a hybrid that is part photon, part atom. Then two polaritons can interact with each other via their atomic component. At the edge of the cloud, the atoms remain where they are, while the photons exit, still bound together. The researchers found that this same phenomenon can occur with three photons, forming an even stronger bond than the interactions between two photons. “What was interesting was that these triplets formed at all,” Vuletic says. “It was also not known whether they would be equally, less, or more strongly bound compared with photon pairs.” The entire interaction within the atom cloud occurs over a millionth of a second. And it is this interaction that triggers photons to remain bound together, even after they’ve left the cloud. “What’s neat about this is, when photons go through the medium, anything that happens in the medium, they ‘remember’ when they get out,” Cantu says. This means that photons that have interacted with each other, in this case through an attraction between them, can be thought of as strongly correlated, or entangled — a key property for any quantum computing bit. “Photons can travel very fast over long distances, and people have been using light to transmit information, such as in optical fibers,” Vuletic says. “If photons can influence one another, then if you can entangle these photons, and we’ve done that, you can use them to distribute quantum information in an interesting and useful way.” Going forward, the team will look for ways to coerce other interactions such as repulsion, where photons may scatter off each other like billiard balls. “It’s completely novel in the sense that we don’t even know sometimes qualitatively what to expect,” Vuletic says. “With repulsion of photons, can they be such that they form a regular pattern, like a crystal of light? Or will something else happen? It’s very uncharted territory.” This research was supported in part by the National Science Foundation. Publication: Qi-Yu Liang, et al., “Observation of three-photon bound states in a quantum nonlinear medium,” Science, 16 Feb 2018: Vol. 359, Issue 6377, pp. 783-786; DOI: 10.1126/science.aao7293
<urn:uuid:aead25a3-1691-487c-a61e-84d91723824b>
CC-MAIN-2022-33
https://scitechdaily.com/mit-physicists-create-new-form-of-light-where-photons-interact/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570901.18/warc/CC-MAIN-20220809033952-20220809063952-00732.warc.gz
en
0.944356
1,676
3.5625
4
Entanglement is at the heart of quantum physics and future quantum technologies. Like other aspects of quantum science, the phenomenon of entanglement reveals itself at very tiny, subatomic scales. When two particles, such as a pair of photons or electrons, become entangled, they remain connected even when separated by vast distances. In the same way that a ballet or tango emerges from individual dancers, entanglement arises from the connection between particles. It is what scientists call an emergent property. How do scientists explain quantum entanglement? In the video below, Caltech faculty members take a stab at explaining entanglement. Featured: Rana Adhikari, professor of physics; Xie Chen, professor of theoretical physics; Manuel Endres, professor of physics and Rosenberg Scholar; and John Preskill, Richard P. Feynman Professor of Theoretical Physics, Allen V. C. Davis and Lenabelle Davis Leadership Chair, and director of the Institute for Quantum Information and Matter. When researchers study entanglement, they often use a special kind of crystal to generate two entangled particles from one. The entangled particles are then sent off to different locations. For this example, let's say the researchers want to measure the direction the particles are spinning, which can be either up or down along a given axis. Before the particles are measured, each will be in a state of superposition, or both "spin up" and "spin down" at the same time. If the researcher measures the direction of one particle's spin and then repeats the measurement on its distant, entangled partner, that researcher will always find that the pair are correlated: if one particle's spin is up, the other's will be down (the spins may instead both be up or both be down, depending on how the experiment is designed, but there will always be a correlation). Returning to our dancer metaphor, this would be like observing one dancer and finding them in a pirouette, and then automatically knowing the other dancer must also be performing a pirouette. The beauty of entanglement is that just knowing the state of one particle automatically tells you something about its companion, even when they are far apart. Are particles really connected across space? But are the particles really somehow tethered to each other across space, or is something else going on? Some scientists, including Albert Einstein in the 1930s, pointed out that the entangled particles might have always been spin up or spin down, but that this information was hidden from us until the measurements were made. Such "local hidden variable theories" argued against the mind-boggling aspect of entanglement, instead proposing that something more mundane, yet unseen, is going on. Thanks to theoretical work by John Stewart Bell in the 1960s, and experimental work done by Caltech alumnus John Clauser (BS '64) and others beginning in the 1970s, scientists have ruled out these local hidden-variable theories. A key to the researchers' success involved observing entangled particles from different angles. In the experiment mentioned above, this means that a researcher would measure their first particle as spin up, but then use a different viewing angle (or a different spin axis direction) to measure the second particle. Rather than the two particles matching up as before, the second particle would have gone back into a state of superposition and, once observed, could be either spin up or down. The choice of the viewing angle changed the outcome of the experiment, which means that there cannot be any hidden information buried inside a particle that determines its spin before it is observed. The dance of entanglement materializes not from any one particle but from the connections between them. Relativity Remains Intact A common misconception about entanglement is that the particles are communicating with each other faster than the speed of light, which would go against Einstein's special theory of relativity. Experiments have shown that this is not true, nor can quantum physics be used to send faster-than-light communications. Though scientists still debate how the seemingly bizarre phenomenon of entanglement arises, they know it is a real principle that passes test after test. In fact, while Einstein famously described entanglement as "spooky action at a distance," today's quantum scientists say there is nothing spooky about it. "It may be tempting to think that the particles are somehow communicating with each other across these great distances, but that is not the case," says Thomas Vidick, a professor of computing and mathematical sciences at Caltech. "There can be correlation without communication," and the particles "can be thought of as one object." Entanglement can also occur among hundreds, millions, and even more particles. The phenomenon is thought to take place throughout nature, among the atoms and molecules in living species and within metals and other materials. When hundreds of particles become entangled, they still act as one unified object. Like a flock of birds, the particles become a whole entity unto itself without being in direct contact with one another. Caltech scientists focus on the study of these so-called many-body entangled systems, both to understand the fundamental physics and to create and develop new quantum technologies. As John Preskill, Caltech's Richard P. Feynman Professor of Theoretical Physics, Allen V. C. Davis and Lenabelle Davis Leadership Chair, and director of the Institute for Quantum Information and Matter, says, "We are making investments in and betting on entanglement being one of the most important themes of 21st-century science." How Bell's Theorem Proved ‘Spooky Action at a Distance' Is Real
<urn:uuid:7253b586-bfe5-4adb-ad14-16d86ea1c486>
CC-MAIN-2022-33
https://scienceexchange.caltech.edu/topics/quantum-science-explained/entanglement?utm_source=caltechnews&utm_medium=web&utm_campaign=csequantum
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573699.52/warc/CC-MAIN-20220819131019-20220819161019-00133.warc.gz
en
0.933489
1,160
3.53125
4
Harvard University researchers have demonstrated the first material that can have both strongly correlated electron interactions and topological properties. It’s a discovery that not only paves the way for more stable quantum computing, but also an entirely new platform to explore the wild world of exotic physics. The research was published in Nature Physics. Topological insulators are materials that can conduct electricity on their surface or edge but not in the middle. The strange thing about these materials is that no matter how you cut them, the surface will always be conducting and the middle always insulating. These materials offer a playground for fundamental physics but are also promising for a number of applications in special types of electronics and quantum computing. Since the discovery of topological insulators, researchers around the world have been working to identify materials with these powerful properties. “A recent boom in condensed matter physics has come from discovering materials with topologically protected properties,” said Harris Pirie, a graduate student in the Department of Physics and first author of the paper. One potential material, samarium hexaboride, has been at the center of a fierce debate among condensed matter physicists for more than a decade. The central question: is it or isn’t it a topological insulator? “Over the last ten years, a bunch of papers have come out saying yes and a bunch of papers have come out saying no,” said Pirie. “The crux of the issue is that most topological materials don’t have strongly interacting electrons, meaning the electrons move too quickly to feel each other. But samarium hexaboride does, meaning that electrons inside this material slow down enough to interact strongly. In this realm, the theory gets fairly speculative and it’s been unclear whether or not it’s possible for materials with strongly interacting properties to also be topological. As experimentalists, we’ve been largely operating blind with materials like this.” In order to settle the debate and figure out, once and for all, whether or not it’s possible to have both strongly interacting and topological properties, the researchers first needed to find a well-ordered patch of samarium hexaboride surface on which to perform the experiment. It was no easy task, considering the majority of the material surface is a craggy, disordered mess. The researchers used ultra-high precision measurement tools developed in the lab of Jenny Hoffman, the Clowes Professor of Science and senior author of the paper, to find a suitable, atomic-scale patch of samarium hexaboride. Next, the team set out to determine if the material was topologically insulating by sending waves of electrons through the material and scattering them off of atomic defects — like dropping a pebble into a pond. By observing the waves, the researchers could figure out the momentum of the electrons in relation to their energy. “We found that the momentum of the electrons is directly proportional to their energy, which is the smoking gun of a topological insulator,” said Pirie. “It’s really exciting to be finally moving into this intersection of interacting physics and topological physics. We don’t know what we’ll find here.” As it relates to quantum computing, strongly interacting topological materials may be able to protect qubits from forgetting their quantum state, a process called decoherence. “If we could encode the quantum information in a topologically protected state, it is less susceptible to external noise that can accidentally switch the qubit,” said Hoffman. “Microsoft already has a large team pursuing topological quantum computation in composite materials and nanostructures. Our work demonstrates a first in a single topological material that harnesses strong electron interactions that might eventually be used for topological quantum computing.” The researchers are working on next steps for this research. “The next step will be to use the combination of topologically protected quantum states and strong interactions to engineer novel quantum states of matter, such as topological superconductors,” said Dirk Morr, Professor of Physics at University of Illinois at Chicago and the senior theorist on the paper. “Their extraordinary properties could open unprecedented possibilities for the implementation of topological quantum bits.” Yu Liu, Anjan Soumyanarayanan, Pengcheng Chen, Yang He, M. M. Yee, P. F. S. Rosa, J. D. Thompson, Dae-Jeong Kim, Z. Fisk, Xiangfeng Wang, Johnpierre Paglione, and M. H. Hamidian also worked on the study. The electronic measurements at Harvard and the samarium hexaboride crystal growth at UC Irvine were supported by the National Science Foundation. The crystal growth at University of Maryland was supported by the Gordon & Betty Moore Foundation. Magnetic measurements at Los Alamos National Lab and theoretical work at University of Illinois were supported by the Department of Energy.
<urn:uuid:bcc3576a-8c9f-486a-a7e8-a102c2499e58>
CC-MAIN-2022-33
https://thequantuminsider.com/2019/12/09/harvard-study-quantum-computing-research/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573193.35/warc/CC-MAIN-20220818094131-20220818124131-00333.warc.gz
en
0.923978
1,031
3.5
4
Sign in: Staff/Students Fuel such as petrol is made up of hydrocarbons – a family of molecules consisting entirely of carbon and hydrogen. Pigment and dye, coal and tar are made up of hydrocarbons too. These common, abundant materials, sometimes even associated with waste, are not often thought of as being electronically or magnetically interesting. But an international research team led by Professor Matthew J. Rosseinsky in the University’s Department of Chemistry and Professor Kosmas Prassides of Tohoku University in Japan has made a significant find. The team have discovered how to take such hydrocarbon molecular components, dress them with electrons, each of which carries a small compass – an unpaired spin – and pack them together like cookies in a box to create a quantum spin liquid – a long-sought hypothetical state of matter. The existence of quantum spin liquids was first theoretically proposed in 1973. In conventional magnets, the motion of the electron spins – the tiny magnets – freezes on cooling as they align parallel or antiparallel to each other. In contrast, the spins in a quantum spin liquid never stop fluctuating, randomly and strongly, even at the lowest temperature of absolute zero. Each individual spin points simultaneously along an infinite number of directions and is highly entangled with other spins, even those far away. As such, this sea of electron spins is predicted to be host to many exotic phenomena of both fundamental and technological interest. However, experimental realization of this unique fully-entangled state of matter has remained to date unfulfilled. Despite a four-decade-long search, there are very few quantum spin liquid candidates. Current options include certain copper inorganic minerals and some organic salts, which contain rare, heavy or toxic elements. In results published in two consecutive papers in the journal `Nature Chemistry’, the team came up with the new chemistry needed to make high-purity crystalline materials from the reaction of polyaromatic hydrocarbons with alkali metals for the first time. Materials obtained from polyaromatic hydrocarbons (molecules with many aromatic rings) were proposed in the past as candidates of new superconductors – materials with no electrical resistance and able to carry electricity without losing energy – devoid of toxic or rare elements. However, destruction of the molecular components in the synthetic treatments employed had inhibited any progress in this field. Professor Matthew Rosseinsky said: “It took us many years of work to achieve our breakthrough. But in the end, we succeeded in developing not one, but two complementary chemistry routes, which open the way to a rich variety of new materials with as-yet unknown properties.” Professor Kosmas Prassides said: “Removing the existing synthetic roadblock has led to very exciting developments. We have already discovered that some of the structures of the new materials – made entirely of carbon and hydrogen, the simplest possible combination – show unprecedented magnetic properties – spin liquid behaviour – with potential applications in superconductivity and quantum computing.” The Liverpool and Tohoku groups worked with teams led by Dr Ryotaro Arita at RIKEN, Japan and Professor Denis Arcon at the University of Ljubljana, Slovenia. The research was supported by the Mitsubishi Foundation, JSPS KAKENHI, JST-ERATO Isobe Degenerate p-Integration Project, the Engineering and Physical Sciences Research Council and the European Union. Part of the research was carried out at the synchrotron X-ray facilities at the ESRF (France) and Diamond Light Source. The papers `π-electron S = ½ quantum-spin-liquid state in an ionic polyaromatic hydrocarbon’ (DOI: 10.1038/NCHEM.2764) and `Redox-controlled potassium intercalation into two polyaromatic hydrocarbon solids` (DOI: 10.1038/NCHEM.2765) are both published in Nature Chemistry. Image: Diagrammatic representation of the structure of the ionic hydrocarbon discovered in this work as host of a quantum spin liquid. The left panel shows the molecular ions, which arrange in triangular vertex-sharing chains. The right panel depicts the co-existing spiral magnetic tubes. The two structural motifs interlink to give a complex packing architecture, as shown in projection in the middle panel. Each molecular ion has one spin (shown as grey arrow). The spins perpetually fluctuate down to low temperatures. The figure shows one of an infinite number of entangled spin arrangements. © 2017 Kosmas Prassides You must be logged in to post a comment. All recent news Top A-Level results for Liverpool’s specialist Maths School EVENT: University of Liverpool Industry-Chemistry Engagement Meeting ‘Molecular movies’ shed light on enzyme involved in greenhouse gas production Gallium oxide: Crystal Complexity Tamed by Machine Learning Becoming an Expert: Using data science to identify health inequalities for people with dementia Researchers have developed new understanding of gallium oxide by combining a machine-learning theoretical approach with experimental results. http://bit.ly/3PsGWXP Becoming an Expert: #PhD student James Watson (@Jmswats) is using data science to identify health inequalities for people with #dementia➡️http://bit.ly/3Pt0ifJ "We hosted what we believe may have been the first healthcare-related citizens' jury in Uganda, which aimed to garner attitudes towards the use of electronic medical data in the research context." @LivUniISMIB @IDIMakerere #Postcard #PublicEngagement https://news.liverpool.ac.uk/2022/08/12/postcard-citizens-jury-debate-ethical-use-of-electronic-health-data-in-uganda/
<urn:uuid:e1199a61-56f1-4c74-ba0e-91a961cbe265>
CC-MAIN-2022-33
https://news.liverpool.ac.uk/2017/04/24/from-sustainable-hydrocarbons-to-spin-liquids/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573849.97/warc/CC-MAIN-20220819222115-20220820012115-00734.warc.gz
en
0.908584
1,234
3.828125
4
On May 30, 2020, a SpaceX rocket carrying two American astronauts was launched from NASA’s Kennedy Space Center in Florida. The first for a commercial spacecraft, the launch marked the start of a new era of human spaceflight, one in which traveling to space is becoming more accessible. It has reignited the conversation around sending humans back to the Moon and to Mars. But exploring this new frontier comes with many new challenges, including concerns for astronaut safety. And while to some it might seem like the most dangerous part of the trip would be getting blasted off the planet on a tiny capsule driven by an enormous explosion, there is also risk once you get to space from the constant bombardment of tiny particles from outer space called cosmic rays. Cosmic rays are highly energetic charged particles—mostly protons—that are accelerated by some of the most violent objects in the universe. They are harmless to us here on Earth’s surface because we are protected by Earth’s magnetosphere, the region of space around our planet that is dominated by a system of magnetic fields; the protection even extends far enough to reach astronauts on the International Space Station. But once humans embark on interplanetary trips, Earth’s magnetosphere can no longer shield them, which means humans are exposed to dangerous levels of radiation. That’s a problem that Dr. Paolo Desiati of the Wisconsin IceCube Particle Astrophysics Center (WIPAC), a research center at the University of Wisconsin–Madison, is trying to solve. In collaboration with UW astronomy professor Elena D’Onghia and Kieran Furlong, a senior fellow at UW–Madison’s COWS thinktank, Desiati is developing a magnetic shield that will divert space and cosmic ray radiation away from a volume—functioning kind of like Earth’s magnetosphere. In addition to protecting astronauts and instrumentation from space radiation during interplanetary travel, the technology has another application: protecting quantum computers from the harmful decoherence effects induced by cosmic ray muon radiation on Earth’s surface. The second application was recently patented by the Wisconsin Alumni Research Foundation (WARF), and the team has been awarded support from the Draper Technology Innovation Fund (TIF) for the work necessary to finalize and commercialize the concept. The innovation has also attracted the attention of private companies and was discussed at the highest government levels of the US and Italy. Desiati, who has been at UW since 2001, is mostly responsible for the technical aspects of the project; for example, he performs all the calculations for the magnetic shield’s feasibility study. As the principal investigator on the Draper TIF award, he is also completing all the detailed studies of the magnetic shield for the quantum computing application. He and D’Onghia have been working on a magnetic shield for a few years now; the idea was hatched during the pair’s weekly brainstorming sessions at a Madison coffee shop. “Although the idea of protecting astronauts from space radiation is not new, we thought that this would soon be a major issue at this dawn of a new space age,” says Desiati. In 2019, they sought help from Discovery to Product (D2P), a UW–Madison research center that partners with a range of campus entities to advance entrepreneurial efforts. There, they were matched with a mentor, Kieran Furlong, and participated in innovator programs where they learned how to turn their idea into a marketable product. Furlong, a co-inventor on the WARF patent, is now helping them drive their technology toward a potential commercial application. “Connecting with D2P was the best thing we could have done, as it gave us a business and commercial perspective on the idea we were starting to work on,” says Desiati. “D2P programs, with the invaluable mentorship of Kieran, expanded our horizons to a wide spectrum of possible applications for our magnetic shielding innovation. Never would I have imagined that the fast-growing quantum computing technology would eventually need to be isolated from the cosmic ray muons to prevent them from disrupting the quantum coherence required for real world operations—and that WARF would have accepted to file a patent on this.” They have also gotten help from UW–Madison students. In fall 2020, Desiati and D’Onghia were “clients” for College of Engineering students in a freshman design course. As described in an article from the college, Desiati met weekly with the classes to teach them about cosmic rays and magnetic fields so they could prototype magnetic shielding system designs in groups and present their ideas to Desiati and D’Onghia. Throughout 2021, the two collaborators also workshopped the magnetic shield’s advanced preliminary design with mechanical engineering and aerospace engineering undergraduate seniors. Desiati says that connecting with the “talented UW engineering students” was another boost in the development of their project. Now, he and D’Onghia are seeking funding for the magnetic shield from NASA and other funding agencies. “The pandemic has given Elena and me the energy to take our idea and transform it into useful possible applications,” says Desiati. “At our academic jobs, we work on pure scientific research. This new adventure has made it possible to apply some of this knowledge to the service of humanity and technology. Let’s see how far we get. So far it has been a real blast.” Read more about the project: - UW–Madison Astronomy Department article - D2P Innovator Profile - UW–Madison College of Engineering article
<urn:uuid:eadd65cd-d1ec-45f1-b959-7d0b7eaf0dab>
CC-MAIN-2022-33
https://wipac.wisc.edu/wipac-scientist-and-collaborators-develop-magnetic-shield-to-protect-astronauts-and-computers/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573744.90/warc/CC-MAIN-20220819161440-20220819191440-00338.warc.gz
en
0.951332
1,183
3.734375
4
Electrical currents can be now be switched on and off at the smallest conceivable scale enabling a new generation of ‘green electronics’ with the potential for great impact on the digital economy Robert Wolkow is no stranger to mastering the ultra-small and the ultra-fast. A pioneer in atomic-scale science with a Guinness World Record to boot (for a needle with a single atom at the point), Wolkow’s team, together with collaborators at the Max Plank Institute in Hamburg, have just released findings that detail how to create atomic switches for electricity, many times smaller than what is currently used. What does it all mean? With applications for practical systems like silicon semi-conductor electronics, it means smaller, more efficient, more energy-conserving computers, as just one example of the technology revolution that is unfolding right before our very eyes (if you can squint that hard). “This is the first time anyone’s seen a switching of a single-atom channel,” explains Wolkow, a physics professor at the University of Alberta and the Principal Research Officer at Canada’s National Institute for Nanotechnology. “You’ve heard of a transistor–a switch for electricity–well, our switches are almost a hundred times smaller than the smallest on the market today.” Today’s tiniest transistors operate at the 14 nanometer level, which still represents thousands of atoms. Wolkow’s and his team at the University of Alberta, NINT, and his spinoff QSi, have worked the technology down to just a few atoms. Since computers are simply a composition of many on/off switches, the findings point the way not only to ultra-efficient general purpose computing but also to a new path to quantum computing. “We’re using this technology to make ultra-green, energy-conserving general purpose computers but also to further the development of quantum computers. We are building the most energy conserving electronics ever, consuming about a thousand times less power than today’s electronics.” While the new tech is small, the potential societal, economic, and environmental impact of Wolkow’s discovery is very large. Today, our electronics consume several percent of the world’s electricity. As the size of the energy footprint of the digital economy increases, material and energy conservation is becoming increasingly important. Wolkow says there are surprising benefits to being smaller, both for normal computers, and, for quantum computers too. “Quantum systems are characterized by their delicate hold on information. They’re ever so easily perturbed. Interestingly though, the smaller the system gets, the fewer upsets.” Therefore, Wolkow explains, you can create a system that is simultaneously amazingly small, using less material and churning through less energy, while holding onto information just right. When the new technology is fully developed, it will lead to not only a smaller energy footprint but also more affordable systems for consumers. “It’s kind of amazing when everything comes together,” says Wolkow. Wolkow is one of the few people in the world talking about atom-scale manufacturing and believes we are witnessing the beginning of the revolution to come. He and his team have been working with large-scale industry leader Lockheed Martin as the entry point to the market. “It’s something you don’t even hear about yet, but atom-scale manufacturing is going to be world-changing. People think it’s not quite doable but, but we’re already making things out of atoms routinely. We aren’t doing it just because. We are doing it because the things we can make have ever more desirable properties. They’re not just smaller. They’re different and better. This is just the beginning of what will be at least a century of developments in atom-scale manufacturing, and it will be transformational.” The Latest on: Atomic-scale manufacturing via Google News The Latest on: Atomic-scale manufacturing - Chip-Scale Atomic Clock (CSAC) Market 2022, Worth USD 561 Mn by 2028 at CAGR of 8.6% – Report Spread across 74 Pageson August 3, 2022 at 9:20 pm The Chip-Scale Atomic Clock (CSAC) market report provides a detailed analysis of global market size, regional and country-level market size, segmentation market growth, market share, competitive ... - New materials research sees transformations at an atomic levelon August 3, 2022 at 7:31 am When manufacturing techniques turn metals, ceramics or composites into a technologically useful form, understanding the mechanism of the phase transformation process is essential to shape the behavior ... - Chip-Scale Atomic Clock (CSAC) Market In 2022 : Research Insights with Upcoming Trends, Opportunities, Competitive Analysis, Forecast to 2022-2028on July 31, 2022 at 5:01 pm What are the upstream raw materials and manufacturing equipment of Chip-Scale Atomic Clock (CSAC) along with the manufacturing process of Chip-Scale Atomic Clock (CSAC)? What are the key ... - Stabenow, Peters urge passage of CHIPS Acton July 26, 2022 at 6:46 pm The CHIPS Act is one of President Joe Biden’s legislative priorities, and is also championed by both Michigan senators. The bill would invest billions of dollars in boosting domestic semiconductor ... - Fundamentals underpinning future atomic and close-to-atomic scale manufacturingon July 24, 2022 at 5:00 pm Atomic and close-to-atomic scale manufacturing (ACSM) represents the processing techniques for high-end products, which requires not only the atomic-level manufacturing precision and functional ... - Researchers develop novel 3D atomic force microscopy probeson July 21, 2022 at 5:00 pm A team of researchers has developed new kind of Atomic Force Microscopy (AFM) probes in true three-dimensional shapes they call 3DTIPs. AFM technology allows scientists to observe, measure, and ... - Atomically Precise Manufacturing Nanotech Meets The Semi Worldon July 21, 2022 at 5:00 pm One new area where atomically precise manufacturing (APM ... that by the nature of things has to be the future is one with atomic precision. Down at the nanometer scale, structures are only a few or a ... - Atomic level deposition to extend Moore’s law and beyondon July 13, 2022 at 5:00 pm Finally, atomic scale resolution can be achieved by inherently ... an increasingly important role in the field of micro-nano manufacturing. The chip makers have shown strong interest in this ... - Senior ministers to retire before Victoria’s election – as it happenedon June 23, 2022 at 2:13 am Simmons said the atomic-scale circuit technology would allow ... The “exquisite precision of the device” also proved their atomic manufacturing capabilities, she said. To build the processor ... via Bing News
<urn:uuid:180e46fb-8f7e-4924-8db2-0b37582a3ef1>
CC-MAIN-2022-33
https://innovationtoronto.com/2016/10/when-it-comes-to-atomic-scale-manufacturing-less-really-is-more-and-it-will-affect-everything/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571719.48/warc/CC-MAIN-20220812140019-20220812170019-00137.warc.gz
en
0.915353
1,473
3.53125
4
Google researchers are figuring out how to study some of the weirdest theorized physics phenomena, like wormholes that link pairs of black holes, using experiments in a lab. One central question driving theoretical physics today is how to use the same theory to explain both gravity and the rules that atoms follow, called quantum mechanics. The two haven’t played nicely yet, since gravity is an incredibly weak force, so probing it at the smallest scales is effectively impossible with today’s technology. But theoretical work has demonstrated that hints of this “quantum gravity” might emerge in certain quantum systems, ones that would one day be possible to create in the lab. One such experiment proposed by Google physicists posits that a quantum state reproducible in the physics lab can be explained as information traveling through a wormhole between two black holes. “The experimental study of such situations therefore offers a path toward a deeper understanding of quantum gravity,” the authors write in the paper published on the arXiv. It seems that gravity simply refuses to cooperate with quantum mechanics, and theorists have worked hard to string the two together—yet there are places where both concepts must exist simultaneously, such as on the surface of or inside black holes and at the moment of the Big Bang. One of the most popular theories linking the two is string theory, which replaces subatomic particles with tiny strings vibrating in a higher-dimensional space. String theory exists on scales far smaller than can be probed with particle accelerators, making it hard to test. However, a two-decade-old conjecture called the AdS/CFT correspondence essentially says that you can understand the higher-dimensional gravity in this higher-dimensional world as if it were a hologram produced by quantum mechanical particles. So a team of physicists at Google, as well as CalTech, the University of Maryland, and the University of Amsterdam, think that studying extreme quantum behaviors might provide stronger evidence of string theory’s existence. Maybe quantum computers could produce string theory-probing behaviors—or wormhole-like phenomena. Among this decades’ most important physical advances has been the development of machines that control and manipulate quantum states, what we call quantum computers and quantum simulators. The smallest objects, like electrons orbiting atoms, can only take on certain values of properties, but when you’re not looking at them, they can have different values simultaneously (until you measure them, at least, when they go back to only having one value). Two or more particles can also entangle, meaning they and their properties must be described as a single mathematical object, even if you separate the atoms across space. Google’s proposal suggests creating a circuit with two sets of connected qubits, the artificial “atoms” of the quantum computer, and dividing it into a left and right group. Pulses of inputted energy do the mathematical equivalent of evolving the qubits’ state backward in time, while another pulse is used to encode a “message” by altering the lefthand atoms’ quantum states in a specific way. Another pulse then plays the role of speeding up the qubits’ behavior. Crucial to the black hole analogy, this scrambles the message among the qubits in a mathematically similar way to how information about a particle’s properties is scrambled and potentially lost upon entering a black hole. Once the information is scrambled, each qubit on the left is entangled with its mirror-image qubit on the right. Finally, after some amount of time, the message mysteriously should reappear in the righthand qubits, without requiring any decoding. “It is not at all obvious how the message made it [to the other side of the system], and the most surprising fact of all is that the simplest explanation lies in the physics of black holes,” the authors write in the paper. Essentially, the researchers think that the information traveling between groups of qubits in the system is analogous to a message entering a black hole, traveling through a wormhole, and emerging outside of a second black hole. The researchers then go on to introduce a mathematical framework for understanding what’s going on and how it serves as an analogy to a traversable wormhole that doesn’t collapse. According to the paper, there are potential setups where this system can be realized. One setup consists of arrays of atoms’ electrons are either in the lowest-energy or a very-high “Rydberg” state, controlled by laser pulses. Another is made from arrays of trapped charged ions. Either might one day be able to realize the experiment proposed by Google. Basically, scientists think they can make a quantum computer act mathematically similar to information passing between two black holes via a wormhole. No wormholes will actually be created here on Earth. This is just a model, and like other analog systems, just because the mathematical description of a lab experiment looks like someone’s theory describing space doesn’t mean that the theory is automatically correct. These models are just a way to produce stronger mathematical evidence that a theory might be correct. None of the researchers nor Google have responded to Gizmodo’s request for comment; I’ll update the post if I hear back. This work builds on research into quantum information scrambling over time, as well as connections between this scrambling and black holes. But it has physicists buzzing with excitement nonetheless. Last week, dozens of physicists met at a Google X conference to discuss how quantum technology could be useful for quantum gravity researchers. “That was quite a moment, hearing about this experiment,” Guillaume Verdon, quantum resident at the Google-founded X who was not involved in this work, told Gizmodo. Studying quantum gravity “was the dream that brought me into quantum computing.” Quantum computers that can create these wormhole-mimicking “thermofield-double” qubit states described in the paper are on the horizon, Christopher Monroe, a University of Maryland physics professor who consulted on this research, told Gizmodo. He hopes that the trapped-ion quantum computer that his group is working on could soon serve as a platform upon which to create the quantum states required to test these ideas. “Papers like this are motivating us, and giving us a push in university, company, and government labs to build these things.”
<urn:uuid:851b9c23-8b73-4fb4-901d-153c4cbd3408>
CC-MAIN-2022-33
https://gizmodo.com/google-researchers-are-studying-wormholes-with-quantum-1839984769
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571538.36/warc/CC-MAIN-20220812014923-20220812044923-00338.warc.gz
en
0.931077
1,326
4.03125
4
If computers only do what we tell them, how do they create random numbers? Random numbers are all around us, particuarly when we look at computers. Our “auto-generated” passwords, the amount of coins you win for logging in daily to your favorite game, and, of course, the =RAND() Excel function – all random. So where do these random numbers come from? Is there some magical random place within your computer? Like all things in computer (quantum computers excluded), things just don’t happen on their own. Computers do what they’re programmed to do. The same applies to random numbers. Not to burst your bubble, but those “random” numbers aren’t actually random, as we’ll see. In fact they’re made with simple algorithms you can quickly create yourself. Origins of Random Numbers To create random numbers, we typically use a Random Number Generator (RNG) (but of course we do…). The first RNG was devised by John von Neumann in the 1940’s. Many current methods still piggyback off of his initial work. Von Neumann suspected that he could start with any number he could think of (known as a seed), square it, and slice out the middle numbers. So if we started with 675248, we’d then square it to get 455959861504, we’d then slice out the middle numbers (959861) to arrive at our “random number” . From there, we could then use 959861 as our seed to repeat the process, getting 51 as our second “random” number. |Scrape out Middle as our Random Number||959861| |Set new Seed||959861| As you can see, there’s really nothing random about this method. It’s computed using a simple equation, yet it does produce values that appear random to us. Because of these two properties, we’ll call these number pseudo-random numbers (PRN). Today’s algorithms commonly utilize the same foundation, but of course have advanced significantly. Most continue to start with an initial number (seed), perform a computation, and reiterate that computation with the last result. Von Neumann’s work isn’t used today because people noticed the “random numbers” quickly start to repeat themselves in this cycle. Today’s algorithms are commonly optimized to repeat only after billions or trillions of runs. Create Your Own Random Number Generator A simple, but pretty useful random number generator is called the Linear Congruent Generator (LCG) – and yes, it sounds much worse than it really is. Like Von Neumann, you start with a seed. We then multiply the seed by a special prime number, and the perform a modulo with it and another prime number. (These prime numbers are selected to ensure they cycle repeats only after very long runs). We then plug our random number back into the system as the new seed. #Simple Linear Congruential Generator (LCG) import numpy as np def generator(seed, n,): #create an empty list to store our n random numbers array_of_rn = np.empty(n) for i,_ in enumerate(array_of_rn): random_number = np.mod((58598 * seed), 233280) #save the random number array_of_rn[i] = random_number #reset the seed to the last random number seed = random_number return array_of_rn generator(1438786,10) ## array([ 23948., 125704., 186992., 195616., 27008., 43264., 130112., 12736., 41408., 80704.]) What Happens with Bad Random Number Generators? Remember those “special prime numbers” we talked about in the last section? Yes, those really are needed. Let’s see why. Below is a plot of 100 randomly generated numbers using our algorithm above (I generated random x’s and random y’s and plotted). As you can see, everything really does look random. Now, let’s use the exact same algorithm, same seeds, same everything, except change those special prime numbers to 2 and 8. Again, we’ll generate 100 points using two lists of random numbers. No, it’s not a mistake. You only see 3 points. Why? Because without those special primes, our algorithm continually repeats itself. In this case, each cycle is 3 points and the same 3 “random numbers” appear over and over again. Hopefully you’ve learned a little about how those random-numbers you see are made. If you look close, you’ll start to see them everywhere – especially with all of those new two-factor authentication apps. Of course today’s top-of-the-line RNGs are much more complex than the simple ones we’ve covered – and will likely get even more complex with the rise of quantum computing. But for now, the underlying mechanics are the same. They’re not truly random, but they’re the best we can do for now and they generally do the trick.
<urn:uuid:60826d5f-2f86-48f5-a080-1578539b24bd>
CC-MAIN-2022-33
https://lowhangingfruitanalytics.wordpress.com/blog/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573623.4/warc/CC-MAIN-20220819035957-20220819065957-00538.warc.gz
en
0.876357
1,140
3.8125
4
By using optical equipment in a totally unexpected way, MIT researchers have created an imaging system that makes light look slow. MIT researchers have created a new imaging system that can acquire visual data at a rate of one trillion exposures per second. Thats fast enough to produce a slow-motion video of a burst of light traveling the length of a one-liter bottle, bouncing off the cap and reflecting back to the bottles bottom. Media Lab postdoc Andreas Velten, one of the systems developers, calls it the ultimate in slow motion: Theres nothing in the universe that looks fast to this camera, he says. The system relies on a recent technology called a streak camera, deployed in a totally unexpected way. The aperture of the streak camera is a narrow slit. Particles of light photons enter the camera through the slit and pass through an electric field that deflects them in a direction perpendicular to the slit. Because the electric field is changing very rapidly, it deflects late-arriving photons more than it does early-arriving ones. The image produced by the camera is thus two-dimensional, but only one of the dimensions the one corresponding to the direction of the slit is spatial. The other dimension, corresponding to the degree of deflection, is time. The image thus represents the time of arrival of photons passing through a one-dimensional slice of space. The camera was intended for use in experiments where light passes through or is emitted by a chemical sample. Since chemists are chiefly interested in the wavelengths of light that a sample absorbs, or in how the intensity of the emitted light changes over time, the fact that the camera registers only one spatial dimension is irrelevant. But its a serious drawback in a video camera. To produce their super-slow-mo videos, Velten, Media Lab Associate Professor Ramesh Raskar and Moungi Bawendi, the Lester Wolfe Professor of Chemistry, must perform the same experiment such as passing a light pulse through a bottle over and over, continually repositioning the streak camera to gradually build up a two-dimensional image. Synchronizing the camera and the laser that generates the pulse, so that the timing of every exposure is the same, requires a battery of sophisticated optical equipment and exquisite mechanical control. It takes only a nanosecond a billionth of a second for light to scatter through a bottle, but it takes about an hour to collect all the data necessary for the final video. For that reason, Raskar calls the new system the worlds slowest fastest camera. Doing the math After an hour, the researchers accumulate hundreds of thousands of data sets, each of which plots the one-dimensional positions of photons against their times of arrival. Raskar, Velten and other members of Raskars Camera Culture group at the Media Lab developed algorithms that can stitch that raw data into a set of sequential two-dimensional images. The streak camera and the laser that generates the light pulses both cutting-edge devices with a cumulative price tag of $250,000 were provided by Bawendi, a pioneer in research on quantum dots: tiny, light-emitting clusters of semiconductor particles that have potential applications in quantum computing, video-display technology, biological imaging, solar cells and a host of other areas. The trillion-frame-per-second imaging system, which the researchers have presented both at the Optical Society's Computational Optical Sensing and Imaging conference and at Siggraph, is a spinoff of another Camera Culture project, a camera that can see around corners. That camera works by bouncing light off a reflective surface say, the wall opposite a doorway and measuring the time it takes different photons to return. But while both systems use ultrashort bursts of laser light and streak cameras, the arrangement of their other optical components and their reconstruction algorithms are tailored to their disparate tasks. Because the ultrafast-imaging system requires multiple passes to produce its videos, it cant record events that arent exactly repeatable. Any practical applications will probably involve cases where the way in which light scatters or bounces around as it strikes different surfaces is itself a source of useful information. Those cases may, however, include analyses of the physical structure of both manufactured materials and biological tissues like ultrasound with light, as Raskar puts it. As a longtime camera researcher, Raskar also sees a potential application in the development of better camera flashes. An ultimate dream is, how do you create studio-like lighting from a compact flash? How can I take a portable camera that has a tiny flash and create the illusion that I have all these umbrellas, and sport lights, and so on? asks Raskar, the NEC Career Development Associate Professor of Media Arts and Sciences. With our ultrafast imaging, we can actually analyze how the photons are traveling through the world. And then we can recreate a new photo by creating the illusion that the photons started somewhere else. Its very interesting work. I am very impressed, says Nils Abramson, a professor of applied holography at Swedens Royal Institute of Technology. In the late 1970s, Abramson pioneered a technique called light-in-flight holography, which ultimately proved able to capture images of light waves at a rate of 100 billion frames per second. But as Abramson points out, his technique requires so-called coherent light, meaning that the troughs and crests of the light waves that produce the image have to line up with each other. If you happen to destroy the coherence when the light is passing through different objects, then it doesnt work, Abramson says. So I think its much better if you can use ordinary light, which Ramesh does. Indeed, Velten says, As photons bounce around in the scene or inside objects, they lose coherence. Only an incoherent detection method like ours can see those photons. And those photons, Velten says, could let researchers learn more about the material properties of the objects, about what is under their surface and about the layout of the scene. Because we can see those photons, we could use them to look inside objects for example, for medical imaging, or to identify materials. Im surprised that the method Ive been using has not been more popular, Abramson adds. Ive felt rather alone. Im very glad that someone else is doing something similar. Because I think there are many interesting things to find when you can do this sort of study of the light itself. This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.
<urn:uuid:0b7df0ed-ab55-432f-927d-38223943336a>
CC-MAIN-2022-33
https://phys.org/news/2011-12-trillion-frame-per-second-video.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573118.26/warc/CC-MAIN-20220817213446-20220818003446-00537.warc.gz
en
0.933079
1,374
3.59375
4
It is interesting to note that there are not only 5 generations of programming languages. There are also 5 generations of computers or computer technology in other words. In this article we shall be going through history to see how computers have evolved. In case you are wondering let us briefly look at what defines a computer generation in these contexts. The thing with technology is that is it ever-changing and new advancements are always being made. What differentiates one computer generation from another is some huge leap forward. In simple terms, when a huge and significant technological advancement is made, that is what ushers in the next computer generation. Table of Contents First Generation of Computers You probably have heard before that the first computers were extremely big. Well, you heard right because that was exactly the case with the first generation computer. Typically one computer could fill an entire room. Just to put things into perspective, one of the first generation computers had 20000 vacuum tubes, 70000 resistors and 10000 capacitors. All in all, that computer weighed 30 million kilograms. The working mechanism was comprised of two major components i.e. vacuum tubes and magnetic drums. The former were meant for the electric circuits that would drive the computer. The latter was the driver of the memory component of the computer. JP Eckert and JW Mauchy built the first ever computer titled the ENIAC. The acronym stood for Electronic Numeric Integrated And Calculator. Despite being very big they had very little in terms of functionality. Memory capacity was very low and overheating was a huge problem. First generation computers employed the use of low level programming languages (i.e. machine language). The first generation of computers covered a period spanning from 1940 to 1956. Example of a first generation computer is the UNIVAC, EDVAC and IBM-701. Second Generation of Computers 1956 to 1963 was the period that saw the era of second generation computers. When it comes to the second generation computer there was one huge improvement. We earlier mentioned that circuits for the first generation computer were driven by vacuum tubes. In this era the transistor came in to take the role of the vacuum tubes. This led to improvements in energy consumption and also performance in terms of speed. Overheating was still an issue here despite the improvements. Second generation computers employed the use of what was called assembly languages. In short, programming them was no longer just strictly based on binary. The use of mnemonics in programming was beginning to pick up momentum here. Examples of second generation computers are the IBM 7094, CDC 1604, and UNIVAC 1108. Third Generation of Computers This generation was from 1964 to 1971. With third generation computers came in some interesting changes. One of the key changes was the coming in of integrated circuits. It was Robert Noyce and Jack Kilby who came up with the innovation of integrated circuits during the 1950s. This was all thanks to the fact that transistors had been engineered in such a way that they could be very small. Integrated circuits are still at the core of what computers are in this day and age. Just as speed was stepped in for the second generation computers, it was further stepped up here. Size was markedly improved here because computers became way smaller in size and footprint. The present setup of the computer as it is today started taking shape during this third generation. The idea of having input and output devices along with an operating system was not available for the previous generations. This all came along for the third generation computer. Examples of third generations of computers are the IBM-360 series, IBM-370/168 and TDC-316. Fourth Generation of Computers The fourth generation entailed further advancements to the computer. Layers and layers of new innovations were being added to the previous ones to come up with a consummate computer. One of the key highlights of this generation was the coming in of the microprocessor. This played an instrumental role in further reducing the size of the computer. It is also at this time that the idea of ‘personal computer’ came to life. Earlier, computers were still expensive and out of reach of the general public. It is also during this same generation that IBM and Apple started making a name for themselves in the PCs industry. The fourth generation of computers was from 1971 to 2010. Examples of fourth generation computers are STAR 1000, IBM 4341, PDP 11 and DEC 10. Fifth Generation of Computers Fifth generation are computers that are still evolving in many ways. Bear in mind that the major highlight of the fifth generation computer is artificial intelligence (AI). It is postulated that in the near future the computer will be able to understand natural language and also to learn on its own. This will be a huge leap forward from the computer understanding binary language and being told what to do by algorithms coded by humans. Anyways, there are several highlights of the fifth generation computer. Some of them are AI (as stated earlier), advanced semiconductor technology and advanced parallel processing, more user friendly interfaces with multimedia features. This generation includes the fascinating field of quantum computing. Some of the things we just highlighted are strongly believed to be things that will be done by a quantum computer. Currently quantum computers are still evolving and also too expensive to be made available as personal computers. The fifth generation of computers spans from 1980 to present day and the future. Examples of fifth generation computers are laptop, desktop, notebook, chromebook, ultrabook etc. That is the history of computers with respect to a generation by generation look. The journey is still on as we definitely will see more advancements being made in the evolution of computers. Technologies or fields such as quantum computing and nanotechnology will bring about some amazing computers in the near future.
<urn:uuid:bc87421e-a177-432e-823d-ba0f4ae8fd51>
CC-MAIN-2022-33
https://greenthrottle.com/generations-of-computers/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570793.14/warc/CC-MAIN-20220808092125-20220808122125-00540.warc.gz
en
0.975008
1,176
3.71875
4
Quantum entanglement is a process by which microscopic objects like electrons or atoms lose their individuality to become better coordinated with each other. Entanglement is at the heart of quantum technologies that promise large advances in computing, communications and sensing, for example detecting gravitational waves. Entangled states are famously fragile: in most cases even a tiny disturbance will undo the entanglement. For this reason, current quantum technologies take great pains to isolate the microscopic systems they work with, and typically operate at temperatures close to absolute zero. The ICFO team, in contrast, heated a collection of atoms to 450 Kelvin, millions of times hotter than most atoms used for quantum technology. Moreover, the individual atoms were anything but isolated; they collided with each other every few microseconds, and each collision set their electrons spinning in random directions. The researchers used a laser to monitor the magnetization of this hot, chaotic gas. The magnetization is caused by the spinning electrons in the atoms, and provides a way to study the effect of the collisions and to detect entanglement. What the researchers observed was an enormous number of entangled atoms - about 100 times more than ever before observed. They also saw that the entanglement is non-local - it involves atoms that are not close to each other. Between any two entangled atoms there are thousands of other atoms, many of which are entangled with still other atoms, in a giant, hot and messy entangled state. What they also saw, as Jia Kong, first author of the study, recalls, "is that if we stop the measurement, the entanglement remains for about 1 millisecond, which means that 1000 times per second a new batch of 15 trillion atoms is being entangled. And you must think that 1 ms is a very long time for the atoms, long enough for about fifty random collisions to occur. This clearly shows that the entanglement is not destroyed by these random events. This is maybe the most surprising result of the work". The observation of this hot and messy entangled state paves the way for ultra-sensitive magnetic field detection. For example, in magnetoencephalography (magnetic brain imaging), a new generation of sensors uses these same hot, high-density atomic gases to detect the magnetic fields produced by brain activity. The new results show that entanglement can improve the sensitivity of this technique, which has applications in fundamental brain science and neurosurgery. As ICREA Prof. at ICFO Morgan Mitchell states, "this result is surprising, a real departure from what everyone expects of entanglement." He adds "we hope that this kind of giant entangled state will lead to better sensor performance in applications ranging from brain imaging to self-driving cars to searches for dark matter." A Spin Singlet and QND A spin singlet is one form of entanglement where the multiple particles' spins--their intrinsic angular momentum--add up to 0, meaning the system has zero total angular momentum. In this study, the researchers applied quantum non-demolition (QND) measurement to extract the information of the spin of trillions of atoms. The technique passes laser photons with a specific energy through the gas of atoms. These photons with this precise energy do not excite the atoms but they themselves are affected by the encounter. The atoms' spins act as magnets to rotate the polarization of the light. By measuring how much the photons' polarization has changed after passing through the cloud, the researchers are able to determine the total spin of the gas of atoms. The SERF regime Current magnetometers operate in a regime that is called SERF, far away from the near absolute zero temperatures that researchers typically employ to study entangled atoms. In this regime, any atom experiences many random collisions with other neighbouring atoms, making collisions the most important effect on the state of the atom. In addition, because they are in a hot medium rather than an ultracold one, the collisions rapidly randomize the spin of the electrons in any given atom. The experiment shows, surprisingly, that this kind of disturbance does not break the entangled states, it merely passes the entanglement from one atom to another. ICFO was founded by the Government of Catalonia and the Universitat Politècnica de Catalunya (UPC), both of which are members of its board of trustees along with the Cellex and Mir-Puig Foundations, philanthropic entities that have played a critical role in the advancement of the institute. Located in the Mediterranean Technology Park in the metropolitan area of Barcelona, the institute currently hosts 400 people, organized in 25 research groups in 60 state-of-the-art research laboratories. Research lines encompass diverse areas in which photonics plays a decisive role, with an emphasis on basic and applied themes relevant to medicine and biology, advanced imaging techniques, information technologies, a range of environmental sensors, tunable and ultra-fast lasers, quantum science, photovoltaics and the properties and applications of nano-materials such as graphene, among others. In addition to two state awarded Severo Ochoa accreditations of excellence, ICFOnians have secured 15 ICREA Professorships and 37 European Research Council grants. ICFO is proactive in fostering entrepreneurial activities, spin-off creation, and creating collaborations and links between industry and ICFO researchers. To date, ICFO has helped create 7 start-up companies. Hangzhou Dianzi University is located in Hangzhou, one of the most dynamic cities in the Yang-tse River Delta area and the capital city of Zhejiang Province, one of the most prosperous provinces in China with strong economic growth, vitality and potential. Hangzhou Dianzi University (HDU) was founded in 1956. It is a comprehensive university and one of the best top 5 universities with its own distinctive features in the field of electronic science and technology, engineering and information technology as well as management and accounting,etc. HDU has over 25000 students and more than 2300 staff members. It has 21 schools and research institutes which offers 59 undergraduate programs, 93 postgraduate programs and 6 PhD programs in science, engineering, management, economics, literature, law, education and art, along with multiple interactive disciplines and specialties. HDU has successfully established partner relationships and developed many kinds of international cooperative programs with more than 90 universities and institutes all over the world, including USA, Canada, Mexico, Russia, Belarus, UK, Ireland, France, Germany, Spain, Italy, Sweden, Australia, Japan, etc.
<urn:uuid:1561b29f-7dde-45d2-b831-77053bf80562>
CC-MAIN-2022-33
https://www.eurekalert.org/news-releases/594025
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571097.39/warc/CC-MAIN-20220810010059-20220810040059-00138.warc.gz
en
0.931362
1,363
3.859375
4
One of the first electronic, programmable computers in the world is remembered today mostly by its nickname: Colossus. The fact that this moniker evokes one of the seven wonders of the ancient world is fitting both physically and conceptually. Colossus, which filled an entire room and included dinner-plate-sized pulleys that had to be loaded with tape, was built in World War II to help crack Nazi codes. Ten versions of the mammoth computer would decrypt tens of millions of characters of German messages before the war ended. Colossus was a marvel at a time when “computers” still referred to people—women, usually—rather than machines. And it is practically unrecognizable by today's computing standards, made up of thousands of vacuum tubes that contained glowing hot filaments. The machine was programmable, but not based on stored memory. Operators used switches and plugs to modify wires when they wanted to run different programs. Colossus was a beast and a capricious one at that. In the early days of computing, this was to be expected. Vacuum tubes worked in computers, but they didn’t always work very well. They took up tons of space, overheated, and burned out. The switch to transistor technology in the 1960s was revolutionary for this reason. It was the transistor that led to the creation of the integrated circuit. And it was the steady growth of transistors per unit area—doubling every two years or so for three decades—that came to be known as Moore’s Law. The switch from tubes to transistors represented a turning point in computing that—despite the huge strides since—hasn’t had a contemporary parallel until now. We are at an analogous crossroads today, a moment in which seemingly incremental and highly technical changes to computing architecture could usher in a new way of thinking about what a computer is. This particular inflection point comes as quantum computing crosses a threshold from the theoretical to the physical. Quantum computing promises processing speeds and heft that seem unimaginable by today’s standards. A working quantum computer—linked up to surveillance technology, let's say—might be able to instantly identify a single individual in real-time by combing through a database that includes billions of faces. Such a computer might also be able to simulate a complex chemical reaction, or crack through the toughest encryption tools in existence. (There’s an entire field of study dedicated to post-quantum cryptography. It’s based on writing algorithms that could withstand an attack by a quantum computer. People still aren't sure if such security is even possible, which means quantum computing could wreak havoc on global financial systems, governments, and other institutions.) It’s often said that a working quantum computer would take days to solve a problem that a classical computer would take millions of years to sort through. Now, theoretical ideas about the development of such machines—long relegated to the realm of mathematical formula—are being turned into computer chips. “As we started making these better controlled, engineered systems that do the physics as written down in the textbook, we start to engage more theorists and people who are more interested in these systems actually existing,” said Jerry Chow, the manager of the experimental quantum computing group at IBM. “It's definitely exciting because we're starting to really make systems which are of interest in terms of not only potential applications but also underlying physics.” IBM announced in April that it had figured out a critical kind of error detection by building a square lattice of four superconducting qubits—units of quantum information—on a chip roughly one-quarter-inch square. The advances the company announced represent a key step toward actually building a large-scale quantum computer, Chow told me, because it represents a physical structure that could be rebuilt bigger while keeping quantum properties in tact—one of the core challenges in quantum computing. “It's basically a primitive for this scabale architecture,” Chow said. “The idea is to continue to grow this lattice to reach the point where you can encode a perfect qubit—a pefect, logical qubit in a sea of these faulty physical qubits.” The error detection component is critical to advances in quantum computing. As Chow and his colleagues wrote of their findings in Nature, qubits are “susceptible to a much larger spectrum of errors” than classical bits. “So any way to speed this up with a protocol that can deal with errors simultaneously is likely to be a significant improvement,” said Steve Rolston, the co-director of the Joint Quantum Institute at the University of Maryland. “Almost all of the qubits in a real quantum computer are going to be there for error detection. It seems kind of crazy but it could be the case that 99 percent of the qubits that are there in a quantum computer are there for error detection and correction.” The race to build a large-scale working quantum computer has intensified in recent years—and recent months, in particular. In 2013, Google bought what it says is a quantum computer from the company D-Wave, a Canadian company which has also sold its machine to the defense contractor Lockheed Martin. (Google is also letting NASA use the D-Wave system as part of a public-private partnership.) In March of this year, Google said it had built a nine-qubit device that successfully detected one (but not both) of the key kinds of errors typical in quantum computing. After IBM's announcement that followed in April, D-Wave announced in June it had broken the 1,000-qubit barrier, a processing milestone that it said would allow “significantly more complex computational problems to be solved than was possible on any previous quantum computer.” D-Wave has a somewhat controversial history, with critics saying its claims about what its computers can do are often overstated. And yet there's no question that much has happened in the two decades since Shor's algorithm, named for the mathemetician Peter Shor, first offered a framework for quantum computing. “Peter shor came up with his algorithm in 1994,” Rolston told me. “It's been a long time now, a surprisingly long time in some ways. If you look at what's really happened in those last 20 years, mainly what people have been doing is really trying to perfect qubits and interactions with one or a handful of qubits—keeping the idea of scability in the back of their minds. There's no pont in me making a perfect qubit if I can't make hundreds, but there's also no point in desinging a hundred if I can't get one or two to behave properly.” Up until about five years ago, most quantum computing work was still being done on single-qubit level. That's rapidly changing. “The real challenge,” Chow, of IBM, said, “is how we're going to controllably put more and more of these together so we can still control what we need to but the quantum information can be protected. People say we're basically somewhere between the vacuum tube and transistor. We're still in the early days.”
<urn:uuid:a562b317-96f8-471f-b86e-9c15e7e813db>
CC-MAIN-2022-33
https://www.theatlantic.com/technology/archive/2015/07/quantum-computer-race/397181/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571538.36/warc/CC-MAIN-20220812014923-20220812044923-00343.warc.gz
en
0.969867
1,491
3.734375
4
If you have ever applied for a job before you’ve likely encountered this requirement: critical thinking skills. Throughout our day-to-day lives, we are constantly presented with choices that we need to make. Should I hit the snooze button? Should I wear a tie or not? Should I ask for a raise at work? All these choices make us stop for a moment to evaluate our options. If I hit the snooze button, then I’ll get more sleep but might be late for work. If I don’t hit the snooze button I might be tired for work, but at least I’ll show up on time. This deconstruction of weighing the pros and cons is what critical thinking is all about. According to the University of Toronto, critical thinking is the practice of using a number of different advanced thinking techniques in a variety of complex ways. Obviously, this can sound like a fairly vague definition. In its most basic sense, critical thinking involves gathering massive amounts of information, systematically analyzing that information, and making rational and informed conclusions. To go into more detail, we split critical thinking skills into three general components: - it focuses on how things are proven or presented, - it involves reflection on our decisions and the process, - and it is discipline specific. How is critical thinking different than regular thinking? To examine the difference between these two thinking techniques, we need to look at three things: - what we are focusing on, - how do we do it, - and what’s the ultimate goal. With regular thinking, we focus on the facts at hand. For example, it’s 7:30 am, I’m going to be late for work. Next, we attempt to construct relationships between different ideas and develop inferences based on those relationships. Finally, we form a plan of action for what we are thinking about. When it comes to critical thinking skills, the main idea is that the regular thinking process is undertaken in much more detail. We focus on different points of views or opinions and the merits of each. Next, we examine the relationships in depth. We must evaluate not only other people’s methods of thinking, but also our own. Finally, we use the material we have assessed to make an informed decision about what we have been thinking about, and how we thought about it. In a sense, we are thinking about thinking. Simple enough right? Well, without further ado, here are 10 sure-fire ways to improve your critical thinking skills. 1. Know what question you want to ask Before thinking about any idea critically, you want to know what question you are trying to ask. You must approach the question with an open mind and understand the reason why you want this particular problem solved. To improve your critical thinking skills, you must examine the question from a logical standpoint, not an emotional one. 2. Be self-aware One of the most important characteristics of people who think critically is that they are self-aware. They know that they aren’t always right. Critical thinkers are open to the views and opinions of others and will take their input into consideration with the same weight as their own. 3. Act with integrity Again, we are trying to improve our thinking skills, not our ability to always be right. To be a productive thinker, one must act honestly and with integrity. It’s only by acting with integrity that eventually we can come to a rational and logical conclusion. 4. Ask simple questions Going back to tip #1, the question you want to ask doesn’t need to be profoundly difficult. Does every earthly problem require a drawn out and elaborate thinking process? Sometimes when we overthink things, the original question gets lost in the quagmire. To combat this, break the overall question into smaller ones: what do I currently know about the problem? How did I come to know this information? What am I trying to solve? 5. Don’t assume things Assuming makes an *** out of you and me. You know the old saying. Even if something is globally assumed, you should question it. Way back in the day people assumed the Earth was flat. However, because critical thinkers don’t assume things, they analyzed the data and came to know that the Earth was a sphere. 6. Swap relationships For example, let’s just say that excessive video game use causes us to smoke. Instead of looking at relationships from one point of view, try swapping them. Does smoking cause excessive video game use? Although this example is merely hypothetical, switching variables in relationships allows to deconstruct these relationships and make more informed decisions. 7. Gather the information you’re presented with and evaluate it without bias Tip #2 tells us that to be a critical thinker we must be self-aware. Aware that other people’s opinions are just as important as our own. Therefore, we need to take the information they present to us and evaluate it in the same way that we evaluate our own. For example, if someone told you about the relationship between video games and smoking, you should ask yourself how they got this information and why. This is the main concept behind the media reporting on a new scientific study. Every day the media tells us that some new study shows how X causes Y. But, as all scientists know, correlation does not prove causation. We need to examine who conducted the study, how they conducted it, and why they conducted it. 8. Don’t solely rely on others Although critical thinking requires intense levels of research and analysis, don’t sell yourself short. Even if you are not an expert in the question you want answered, you should never discount your own views and ideas. Sure, you might not be an expert on Quantum Entanglement, but always include your own thoughts (however limited they may be) in the thinking process. 9. Combine all the information gathered from tips #1-#8 You’ve been open-minded, you sought others advice, you were unbiased, and you didn’t make assumptions. Now you need to combine all of this information to make a conclusion. You have all your deconstructed ideas and opinions and now need to weigh the implications of each decision. In other words, you’re examining the pros and cons of one decision vs. the other. You’ve done your research on Quantum Entanglement so now it’s time to decide if you are for it, or against it. Weigh the pros and the cons, examine the implications of your choice, and arrive at a logical conclusion. 10. Don’t try and think critically exclusively Critical thinking involves massive amounts of research, information processing, and analysis. Obviously, you can’t think this way all the time. You would never get anything done! Should you hit the snooze button? “Well, let’s examine my own rationale and the views of my co-workers, and then conduct extensive literature research on the relationship between sleeping and work productivity”. By the time you thought about this decision critically, you already missed a full day of work and the point is moot. Save your critical thinking skills for the important decisions in life. Like that honors thesis or your investment strategy. There you have it, 10 sure-fire ways to improve your critical thinking skills. When it comes to improving thinking skills, the jargon can get fairly wordy and complicated. If this all seems confusing, the best course of action would be to think critically about critical thinking! Okay, maybe that didn’t lessen the confusion. Regardless, if you want to make informed and sound decisions in life, critical thinking is your friend. It is in your best interests to learn these tips, apply them, and get thinking about thinking!
<urn:uuid:c85a94d4-d90f-4c75-982a-4c688bad8536>
CC-MAIN-2022-33
https://www.sciencelass.com/mind-and-brain/10-sure-fire-ways-to-improve-your-critical-thinking-skills/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573533.87/warc/CC-MAIN-20220818215509-20220819005509-00143.warc.gz
en
0.948203
1,667
3.671875
4
Scientists have uncovered a mathematical shortcut for calculating an all-important feature of quantum devices. Having crunched the numbers on the quantum properties of 12,000 elements and compounds, researchers have published a new equation for approximating the length of time the materials can maintain quantum information, called “coherence time.” The elegant formula allows scientists to estimate the materials’ coherence times in an instant — versus the hours or weeks it would take to calculate an exact value. “People have had to rely on complicated codes and calculations to predict spin qubit coherence times. But now people can compute the prediction by themselves instantaneously. This opens opportunities for researchers to find the next generation of qubit materials by themselves.” — Shun Kanai, Tohoku University The team, comprising scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, the University of Chicago, Tohoku University in Japan and Ajou University in Korea, published their result in April in the Proceedings of the National Academy of Sciences. Their work is supported the Center for Novel Pathways to Quantum Coherence in Materials, an Energy Frontier Research Center funded by the U.S. Department of Energy, and by Q-NEXT, a DOE National Quantum Information Science Research Center led by Argonne. The team’s equation applies to a particular class of materials — those that can be used in devices called spin qubits. “People have had to rely on complicated codes and calculations to predict spin qubit coherence times. But now people can compute the prediction by themselves instantaneously,” said study co-author Shun Kanai of Tohoku University. “This opens opportunities for researchers to find the next generation of qubit materials by themselves.” Qubits are the fundamental unit of quantum information, the quantum version of classical computer bits. They come in different forms and varieties, including a type called the spin qubit. A spin qubit stores data in a material’s spin — a quantum property inherent in all atomic and subatomic matter, such as electrons, atoms and groups of atoms. Scientists expect that quantum technologies will be able to help improve our everyday lives. We may be able to send information over quantum communication networks that are impenetrable to hackers, or we could use quantum simulations to speed up drug delivery. The realization of this potential will depend on having qubits that are stable enough — that have long enough coherence times — to store, process and send the information. While the research team’s equation gives only a rough prediction of a material’s coherence time, it gets pretty close to the true value. And what the equation lacks in precision, it makes up for in convenience. It requires only five numbers — the values of five particular properties of the material in question — to get a solution. Plug them in, and voila! You have your coherence time. Diamond and silicon carbide are currently the best-established materials for hosting spin qubits. Now scientists can explore other candidates without having to spend days calculating whether a material is worth a deeper dive. “The equation is like a lens. It tells you, ‘Look here, look at this material — it looks promising,’” said University of Chicago Professor and Argonne senior scientist Giulia Galli, a co-author of the study and Q-NEXT collaborator. “We are after new qubit platforms, new materials. Identifying mathematical relationships like this one points out new materials to try, to combine.” With this equation in hand, the researchers plan to boost the accuracy of their model. They’ll also connect with researchers who can create the materials with the most promising coherence times, testing whether they perform as well as the equation predicts. (The team has marked one success already: A scientist outside the team reported that the relatively long coherence time of a material called calcium tungstate performed as predicted by the team’s formula.) “Our results help us with advancing current quantum information technology, but that’s not all,” said Tohoku University Professor Hideo Ohno, who is currently president of the university and paper co-author. “It will unlock new possibilities by bridging the quantum technology with a variety of conventional systems, allowing us to make even greater progress with the materials we’re already familiar with. We’re pushing more than one scientific frontier.” The other authors of the paper are F. Joseph Heremans, Argonne and UChicago; Hosung Seo, Ajou University; Gary Wolfowicz, Argonne and UChicago; Christopher P. Anderson, UChicago; Sean E. Sullivan, Argonne; Mykyta Onizhuk, UChicago; and David D. Awschalom, Argonne and UChicago. This work was supported by the Center for Novel Pathways to Quantum Coherence in Materials, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, in collaboration with the U.S. Department of Energy Office of Science National Quantum Information Science Research Centers. Q-NEXT is a U.S. Department of Energy National Quantum Information Science Research Center led by Argonne National Laboratory. Q-NEXT brings together world-class researchers from national laboratories, universities and U.S. technology companies with the single goal of developing the science and technology to control and distribute quantum information. Q-NEXT collaborators and institutions will create two national foundries for quantum materials and devices, develop networks of sensors and secure communications systems, establish simulation and network testbeds, and train a next-generation quantum-ready workforce to ensure continued U.S. scientific and economic leadership in this rapidly advancing field. For more information, visit https://www.q-next.org. Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.
<urn:uuid:f09bb08a-32cc-466d-a7dc-0ad4141e60d6>
CC-MAIN-2022-33
https://www.anl.gov/article/a-mathematical-shortcut-for-determining-quantum-information-lifetimes
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572161.46/warc/CC-MAIN-20220815054743-20220815084743-00344.warc.gz
en
0.905184
1,410
3.71875
4
The 2016 Nobel Prize in physics has been awarded to David Thouless, Duncan Haldane and Michael Kosterlitz, three theoretical physicists whose research used the unexpected mathematical lens of topology to investigate phases of matter and the transitions between them. Topology is a branch of mathematics that deals with understanding shapes of objects; it’s interested in “invariants” that don’t change when a shape is deformed, like the number of holes an object has. Physics is the study of matter and its properties. The Nobel Prize winners were the first to make the connection between these two worlds. Everyone is used to the idea that a material can take various familiar forms such as a solid, liquid or gas. But the Nobel Prize recognizes other surprising phases of matter – called topological phases – that the winners proposed theoretically and experimentalists have since explored. Topology is opening up new platforms for observing and understanding these new states of matter in many branches of physics. I work with theoretical aspects of cold atomic gases, a field which has only developed in the years since Thouless, Haldane and Kosterlitz did their groundbreaking theoretical work. Using lasers and atoms to emulate complex materials, cold atom researchers have begun to realize some of the laureates’ predictions – with the promise of much more to come. Cold atoms get us to quantum states of matter All matter is made up of building blocks, such as atoms. When many atoms come together in a material, they start to interact. As the temperature changes, the state of matter starts to change. For instance, water is a liquid until a fixed temperature, when it turns into vapor (373 degrees Kelvin; 212 degrees Fahrenheit; 100 degrees Celsius); and if you cool, solid ice forms at a fixed temperature (273K; 32℉; 0℃). The laws of physics give us a theoretical limit to how low the temperature can get. This lowest possible temperature is called absolute zero (0K) (and equals -460℉ or -273℃). Classical physics governs our everyday world. Classical physics tells us that if we cool atoms to really low temperatures, they stop their normally constant vibrating and come to a standstill. But really, as we cool atoms down to temperatures approaching close to 0K, we leave the regime of classical physics – quantum mechanics begins to govern what we see. In the quantum mechanical world, if an object’s position becomes sharply defined then its momentum becomes highly uncertain, and vice versa. Thus, if we cool atoms down, the momentum of each atom decreases, and the quantum uncertainty of its position grows. Instead of being able to pinpoint where each atom is, we can now only see a blurry space somewhere within which the atom must be. At some point, the neighboring uncertain positions of nearby atoms start overlapping and the atoms lose their individual identities. Surprisingly, the distinct atoms become a single entity, and behave as one coherent unit – a discovery that won a previous Nobel. This new, amazing way atoms organize themselves at very low temperatures results in new properties of matter; it’s no longer a classical solid in which the atoms occupy periodic well-defined positions, like eggs in a carton. Instead, the material is now in a new quantum state of matter in which each atom has become a wave with its position no longer identifiable. And yet the atoms are not moving around chaotically. Instead, they are highly coherent, with a new kind of quantum order. Just like laser beams, the coherent matter waves of superfluids, superconductors and magnets can produce interference patterns. Physicists have known about quantum order in superfluids and magnets in three dimensions since the middle of the last century. We understand that the order is lost at a critical temperature due to thermal fluctuations. But in two dimensions the situation is different. Early theoretical work showed that thermal fluctuations would destroy the quantum order even at very low temperatures. What Thouless, Haldane and Kosterlitz addressed were two important questions: What is the nature of the quantum ordered state of superfluids, superconductors and magnets in low dimensions? What is the nature of the phase transition from the ordered to the disordered state in two dimensions? Thinking about defects Kosterlitz and Thouless’s innovation was to show that topological defects – vortex and anti-vortex whirls and swirls – are crucial to understand the magnetic and superfluid states of matter in two dimensions. These defects are not just local perturbations in the quantum order; they produce a winding or circulation as one goes around it. The vorticity, which measures how many times one winds around, is measured in integer units of the circulation. Kosterlitz and Thouless showed that at low temperatures, a vortex is bound up with an anti-vortex so the order survives. As the temperature increases, these defects unbind and grow in number and that drives a transition from an ordered to a disordered state. It’s been possible to visualize the vortices in cold atomic gases that Kosterlitz and Thouless originally proposed, bringing to life the topological defects they theoretically proposed. In my own research, we’ve been able to extend these ideas to quantum phase transitions driven by increasing interactions between the atoms rather than by temperature fluctuations. Figuring out step-wise changes in materials The second part of the Nobel Prize went to Thouless and Haldane for discovering new topological states of matter and for showing how to describe them in terms of topological invariants. Physicists knew about the existence of a phenomenon called the quantum Hall effect, first observed in two dimensional electrons in semiconductors. The Hall conductance, which is the ratio of the transverse voltage and the current, was observed to change in very precise integer steps as the magnetic field was increased. This was puzzling because real materials are disordered and messy. How could something so precise be seen in experiments? It turns out that the current flows only in narrow channels at the edges and not within the bulk of the material. The number of channels is controlled by the magnetic field. Every time an additional channel or lane gets added to the highway, the conductance increase by a very precise integer step, with a precision of one part in billion. Thouless’ insight was to show that the flow of electrons at the boundaries has a topological character: the flow is not perturbed by defects – the current just bends around them and continues with its onward flow. This is similar to strong water flow in a river that bends around boulders. Thouless figured out that here was a new kind of order, represented by a topological index that counts the number of edge states at the boundary. That’s just like how the number of holes (zero in a sphere, one in a doughnut, two in glasses, three in a pretzel) define the topology of a shape and the robustness of the shape so long as it is deformed smoothly and the number of holes remains unchanged. Global, not local, properties Interacting topological states are even more remarkable and truly bizarre in that they harbor fractionalized excitations. We’re used to thinking of an electron, for instance, with its charge of e as being indivisible. But, in the presence of strong interactions, as in the fractional quantum Hall experiments, the electron indeed fractionalizes into three pieces each carrying a third of a charge! Haldane discovered a whole new paradigm: in a chain of spins with one unit of magnetic moment, the edge spins are fractionalized into units of one-half. Remarkably, the global topological properties of the chain completely determine the unusual behavior at the edges. Haldane’s remarkable predictions have been verified by experiments on solid state materials containing one-dimensional chains of magnetic ions. Topological states are new additions to the list of phases of matter, such as, solid, liquid, gas, and even superfluids, superconductors and magnets. The laureates’ ideas have opened the floodgates for prizeworthy predictions and observations of topological insulators and topological superconductors. The cold atomic gases present opportunities beyond what can be achieved in materials because of the greater variety of atomic spin states and highly tunable interactions. Beyond the rewards of untangling fascinating aspects of our physical world, this research opens the possibility of using topologically protected states for quantum computing. Now, Check Out: - Odd states of matter: how three British theorists scooped the 2016 Nobel Prize for Physics - New Breakthrough Crystal Heals Itself After Being Broken in Half - All you need for quantum computing at room temperature is some mothballs - How random is your randomness, and why does it matter? - Physicists Discover a Weird New Form of Matter
<urn:uuid:0567e3bc-a220-4156-955d-3b812d4f0166>
CC-MAIN-2022-33
http://sciencerocksmyworld.com/physicists-explore-exotic-states-of-matter-inspired-by-nobel-winning-research/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571284.54/warc/CC-MAIN-20220811103305-20220811133305-00344.warc.gz
en
0.933215
1,840
3.5
4
This artist’s representation shows an electron beam (in purple) being used to create a 2D superlattice made up of quantum dots having extraordinary atomic-scale precision and placement. Credit: Peter Allen Control is a constant challenge for materials scientists, who are always seeking the perfect material — and the perfect way of treating it — to induce exactly the right electronic or optical activity required for a given application. As electronics and the devices that incorporate them — smartphones, laptops and the like — have become smaller and smaller, the semiconductor transistors that power them have shrunk to the point of being not much larger than an atom. They can’t get much smaller. To overcome this limitation, researchers are seeking ways to harness the unique characteristics of nanoscale atomic cluster arrays — known as quantum dot superlattices — for building next generation electronics such as large-scale quantum information systems. In the quantum realm, precision is even more important. New research conducted by UC Santa Barbara’s Department of Electrical and Computer Engineering reveals a major advance in precision superlattices materials. The findings by Professor Kaustav Banerjee, his Ph.D. students Xuejun Xie, Jiahao Kang and Wei Cao, postdoctoral fellow Jae Hwan Chu and collaborators at Rice University appear in the journal Nature Scientific Reports. Their team’s research uses a focused electron beam to fabricate a large-scale quantum dot superlattice on which each quantum dot has a specific pre-determined size positioned at a precise location on an atomically thin sheet of two-dimensional (2-D) semiconductor molybdenum disulphide (MoS2). When the focused electron beam interacts with the MoS2 monolayer, it turns that area — which is on the order of a nanometer in diameter — from semiconducting to metallic. The quantum dots can be placed less than four nanometers apart, so that they become an artificial crystal — essentially a new 2-D material where the band gap can be specified to order, from 1.8 to 1.4 electron volts (eV). This is the first time that scientists have created a large-area 2-D superlattice — nanoscale atomic clusters in an ordered grid — on an atomically thin material on which both the size and location of quantum dots are precisely controlled. The process not only creates several quantum dots, but can also be applied directly to large-scale fabrication of 2-D quantum dot superlattices. “We can, therefore, change the overall properties of the 2-D crystal,” Banerjee said. Each quantum dot acts as a quantum well, where electron-hole activity occurs, and all of the dots in the grid are close enough to each other to ensure interactions. The researchers can vary the spacing and size of the dots to vary the band gap, which determines the wavelength of light it emits. “Using this technique, we can engineer the band gap to match the application,” Banerjee said. Quantum dot superlattices have been widely investigated for creating materials with tunable band gaps but all were made using “bottom-up” methods in which atoms naturally and spontaneously combine to form a macro-object. But those methods make it inherently difficult to design the lattice structure as desired and, thus, to achieve optimal performance. As an example, depending on conditions, combining carbon atoms yields only two results in the bulk (or 3-D) form: graphite or diamond. These cannot be ‘tuned’ and so cannot make anything in between. But when atoms can be precisely positioned, the material can be designed with desired characteristics. “Our approach overcomes the problems of randomness and proximity, enabling control of the band gap and all the other characteristics you might want the material to have — with a high level of precision,” Xie said. “This is a new way to make materials, and it will have many uses, particularly in quantum computing and communication applications. The dots on the superlattice are so close to each other that the electrons are coupled, an important requirement for quantum computing.” The quantum dot is theoretically an artificial “atom.” The developed technique makes such design and “tuning” possible by enabling top-down control of the size and the position of the artificial atoms at large scale. To demonstrate the level of control achieved, the authors produced an image of “UCSB” spelled out in a grid of quantum dots. By using different doses from the electron beam, they were able to cause different areas of the university’s initials to light up at different wavelengths. “When you change the dose of the electron beam, you can change the size of the quantum dot in the local region, and once you do that, you can control the band gap of the 2-D material,” Banerjee explained. “If you say you want a band gap of 1.6 eV, I can give it to you. If you want 1.5 eV, I can do that, too, starting with the same material.” This demonstration of tunable direct band gap could usher a new generation of light-emitting devices for photonics applications. Story Source: Materials provided byUniversity of Chicago Original written by Whitney Clavin.Note: Content may be edited for style and length. Xuejun Xie, Jiahao Kang, Wei Cao, Jae Hwan Chu, Yongji Gong, Pulickel M. Ajayan, Kaustav Banerjee. Designing artificial 2D crystals with site and size controlled quantum dots. Scientific Reports, 2017; 7 (1) DOI: 10.1038/s41598-017-08776-3
<urn:uuid:665e8291-46ec-4723-a167-369bfb94141d>
CC-MAIN-2022-33
https://sciencebulletin.org/band-gaps-made-to-order/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570921.9/warc/CC-MAIN-20220809094531-20220809124531-00145.warc.gz
en
0.908445
1,213
3.6875
4
Researchers made colors disappear, turned common red bricks into batteries, and granted the senses of sight and touch to a nonhuman system. Was it magic? Nope: science! Read on for this week’s coolest discoveries. What is it? “Intrinsic color” — the kind we usually perceive — is created by different wavelengths of light being absorbed by the atoms, molecules, and surface structures that make up whatever we’re looking at. But now engineers at the University of Pennsylvania have designed a “system of nanoscale semiconductor strips” that makes the intrinsic color of a material disappear. Why does it matter? According to a Penn Engineering news release, the new system — described in an article in Nature Communications — could have uses in “holographic displays and optical sensors,” and could “pave the way for new types of microlasers and detectors, fundamental elements of long-sought-after photonic computers.” How does it work? The strips take advantage of so-called structural color. One example of structural color is peacock feathers, which have no one intrinsic color. The birds’ brilliant plumage is the effect of different wavelengths reflecting off the nanoscale structures on their feathers’ surfaces, colliding and interfering to create that iridescent sheen. The Penn researchers designed their nanoscale strips from tungsten disulfide on a gold backing. At only a few dozen atoms thick, the strips “are spaced out at suboptical wavelength sizes, allowing them to give off the type of structural color” exemplified by peacock feathers. In this way the strips, which should look blue, appear to have no color at all. If biological materials like feathers can have little to no intrinsic color but appear colorful due to their nanoscale structures, lead researcher Deep Jariwala said, this study suggests the reverse is also true: “If a material does have a strong intrinsic color, we show that one can do the opposite and make it disappear with appropriate nanostructuring.” What is it? Engineers at Washington University in St. Louis devised a way to turn the common red brick — same kind as you can pick up at the hardware store — into an energy storage unit. Why does it matter? Buildings whose walls have the ability to charge a phone or a computer or supply electricity to a light fixture have an obvious appeal, and the researchers imagine their creation could be useful in, for instance, emergency lighting situations, perhaps when connected with solar cells. And as they point out in their article in Nature Communications, fired red brick is “a universal building material” whose use dates back 5,000 years. That’s a lot of potential batteries. How does it work? The Washington University team developed the energy storage device by creating “a coating of the conducting polymer PEDOT, which is comprised of nanofibers that penetrate the inner porous network of a brick; a polymer coating remains trapped in a brick and serves as an ion sponge that stores and conducts electricity,” explains chemistry professor Julio D’Arcy. What is it? Scientists at the University of Chicago’s Pritzker School of Molecular Engineering discovered a “simple modification” that enables quantum systems to operate 10,000 times longer than before. Why does it matter? Business and governments have eyed quantum computing as a way to create “virtually unhackable networks or extremely powerful computers,” even a quantum internet. But they have been held back by the fragility of quantum systems, which require extreme stability. Such systems now operate on the order of milliseconds. The U. of C. discovery points the way forward, said David Awschalom, lead author of a new study in Science: “This breakthrough lays the groundwork for exciting new avenues of research in quantum science. It enables new research opportunities previously thought impractical.” How does it work? By, essentially, “tricking” the quantum system into thinking there’s no background noise, using electromagnetic pulses in addition to a precisely tuned continuous alternating magnetic field. Postdoctoral researcher Kevin Miao said, “To get a sense of the principle, it's like sitting on a merry-go-round with people yelling all around you. When the ride is still, you can hear them perfectly, but if you're rapidly spinning, the noise blurs into a background.” What is it? Scientists at Singapore’s Nanyang Technological University combined “skin-like electronics with computer vision” into an artificial intelligence system that can recognize hand gestures. Why does it matter? The technology could have uses in surgical robots, gaming interfaces, and robot-aided workplaces. Markus Antonietti, the director of Germany’s Max Planck Institute of Colloids and Interfaces — who was not involved in the project — said in NTU’s press release that “the findings from this paper bring us another step forward to a smarter and more machine-supported world. Much like the invention of the smartphone, which has revolutionized society, this work gives us hope that we could one day physically control all of our surrounding world with great reliability and precision through a gesture.” The paper was published in Nature Electronics. How does it work? The Singaporean team’s “bio-inspired” system includes a stretchable sensor, made of single-walled carbon nanotubes, that fits over the hand, while the AI system combines three different neural network approaches: one concerning visual processing, one concerning somatosensory processing and one that fuses the two. NTU’s Chen Xiaodong, the study’s lead author, said the technology is “unique” in that it resembles “the somatosensory-visual fusion hierarchy in the brain.” What is it? Researchers at Texas A&M University are working on a new method that uses machine learning to improve the quality of low-resolution images produced by electron microscopes. Why does it matter? The technique may have solved an old problem in electron microscopy, which — as the name suggests — obtains images by means of a high-energy electron beam aimed at the sample. Higher resolution can be achieved by cranking up the energy, only this can damage the sample under examination, similar to how ultraviolet rays can damage the skin. “There's always that dilemma for scientists,” said engineering professor Yu Ding, who co-authored an article on the technique in IEEE Transactions on Image Processing. “To maintain the specimen's integrity, high-energy electron beams are used sparingly. But if one does not use energetic beams, high-resolution or the ability to see at finer scales becomes limited.” How does it work? Ding and colleagues trained a neural network on pairs of images at low and high resolutions, which enabled the AI to learn how to enhance details on other low-res images. Ding explained, “Normally, a high-energy electron beam is passed through the sample at locations where greater image resolution is desired. But with our image processing techniques, we can super-resolve an entire image by using just a few smaller-sized, high-resolution images. This method is less destructive since most parts of the specimen sample needn't be scanned with high-energy electron beams."
<urn:uuid:28f6a9db-7a4c-4ef6-81f2-aea0b1116d3f>
CC-MAIN-2022-33
https://www.ge.com/news/reports/5-coolest-things-earth-week-106
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573145.32/warc/CC-MAIN-20220818003501-20220818033501-00748.warc.gz
en
0.935005
1,537
3.703125
4
Photons can have half-integer values of angular momentum when they are confined to fewer than three dimensions. That is the conclusion of physicists in Ireland, who have revived an experiment first done in the 1830s to show that photons are not limited to having just integer values of angular momentum. The discovery could have applications in quantum computing and could also boost the capacity of optical-fibre data transmission. The angular momentum of light comes in two varieties: spin and orbital. Spin is associated with optical polarization, which is the orientation of light’s electric-field oscillations. Orbital angular momentum rotates a light beam’s wavefront around its propagation axis, giving it a corkscrew shape. Individually, the two types of angular momentum come in multiples of the reduced Planck’s constant, ħ. For spin, those multiples are either +1 or –1, while the orbital variety can take any integer value. To date, physicists have assumed that a photon’s total angular momentum is simply the sum of these two parts and that it therefore comes in integer multiples of ħ. But in the latest research, Paul Eastham of Trinity College Dublin and colleagues have shown that the total angular momentum can in fact take on half-integer values. Inspiration for the work, says Eastham, came from celebrations of the 200th anniversary of the birth of Irish mathematician William Hamilton in 2005. Hamilton and physicist Humphrey Lloyd showed, in the 1830s, that a beam of light passing through a “biaxial” crystal takes on the shape of a hollow cylinder. The void at its centre is now known to be caused by the light acquiring orbital angular momentum. The bicentennial prompted renewed interest in this effect among physicists in Ireland, says Eastham, who joined Trinity College in 2009 and then started to think about exactly how such beams behave quantum-mechanically. Eastham drew on work from the early 1980s regarding matter particles confined to two dimensions, in particular Frank Wilczek’s prediction that electrons travelling on a plane around a magnetic flux could have non-integer angular momentum. Eastham and colleagues Kyle Ballantine and John Donegan realized that a similar effect could occur within a beam of light having spin and orbital momentum. Given that Maxwell’s equations require rotational symmetry in three dimensions for the normal summing of a photon’s angular momentum, and noting that the symmetry of a beam in a biaxial crystal is limited to rotation about its axis of propagation, they worked out that the beam’s photons should have half-integer angular momentum. “The vortex of a beam with orbital angular momentum is a topological defect; it is a knot that you can’t untie,” he says. “We realized it is possible to make beams with a more complicated topological defect, where both phase and polarization vary across the beam.” To demonstrate light’s fractional angular momentum experimentally, the team shone a laser beam through a biaxial crystal preceded by a polarizer and then split the beam inside an interferometer. Employing a technique devised by Miles Padgett at the University of Glasgow in the UK, they rotated the beam in one arm of the interferometer before recombining it with the (un-rotated) beam travelling through the other arm, and then measured the output. To analyse the beam’s total angular momentum, the researchers rotated the orbital and spin components by different amounts: 180° and 90°, respectively. This enabled them to sort photons into two groups with half-integer values: those having +ħ/2 and others having –ħ/2. To make sure individual photons had angular momentum of ħ/2 – rather than half of them carrying ħ and the other half zero – they measured the beam’s “shot noise”. This noise will be lower if the quantum of angular momentum flow is smaller, which is what they observed. “In my undergraduate physics lectures I learnt that light has integer angular momentum, but we have now shown that it doesn’t have to,” says Eastham, who adds that he hopes the research will encourage others to “look more at the implications of low dimensions in optics”. He also points, somewhat tentatively, to possible applications of the work, including an optical analogue of “topological” quantum computing and a new way of exploiting angular momentum to increase bandwidth in optical-fibre communications. Michael Berry of the University of Bristol describes the demonstration as “a new wrinkle on the old subject of the angular momentum of light, supported by a clever experiment”. Padgett says that the Trinity group has provided a “lovely treatment of light transmission through biaxial crystals, particularly as regards the angular momentum content of the light”. However, he adds that it is not clear whether the new findings could be applied to fibre-based communications. The research is published in Science Advances.
<urn:uuid:15cfde22-0a67-4968-b032-667ff01d2a41>
CC-MAIN-2022-33
https://physicsworld.com/a/photons-with-half-integer-angular-momentum-are-the-latest-twist-on-light/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573533.87/warc/CC-MAIN-20220818215509-20220819005509-00147.warc.gz
en
0.923912
1,066
3.625
4
Making teleportation more energy-efficient An international team of researchers has achieved an important theoretical result by finding that quantum teleportation – the process of transporting quantum information at the speed of light, which could in theory be used to teleport macroscopic objects and, one day, even humans – can be achieved in a much more energy-efficient way than was previously thought. How teleportation works For the best part of the twentieth century, teleportation was dismissed as purely as a science fiction pipe dream. The problem lay in the approach: the only possible way to achieve it, scientists thought, would be to measure the position and momentum of every single atom of the object to be teleported, send it to its destination using classical (non-quantum) information, and finally rebuild it based on the set of "instructions" received. But science says that the first step – the perfect measurement of a particle – is simply impossible due to Heisenberg's uncertainty principle.In 1993, however, researchers showed that teleportation was indeed possible in principle, as long as the original object is destroyed in the process. The mechanism circumvents Heisenberg's uncertainty principle by exploiting one of the many quirks of quantum mechanics – a phenomenon called "quantum entanglement". Entanglement happens when a pair of particles, such as electrons and protons, are intrinsically bound together. Once entanglement is achieved, the two particles will maintain synchronization, whether they are next to each other or on opposite sides of the Universe. As long as the entangled state is maintained, if one particle changes its state, the other will instantaneously do so as well. As you might expect, the theory is quite hard to get one's head around, but let's give it a shot. Imagine that we have an object "A" that we want to teleport. We also have "B" and "C", which are entangled with each other, but not with A. Now let's transport object B to the sending station right next to A, and object C to the receiving station. Back in 1993, scientists found that they could scan A and B together, extracting partial information from A. Scanning scrambles the quantum states of both A and B, and because B and C are entangled, all the remaining information from A is instantly transmitted to C. Using lasers, fiber optics or any other traditional means of communication, the sending station can then send the partial information it had gathered about A to the receiving station. Now all the information about A is at the receiving station, and object C can be reassembled as a perfect copy of the original. Object A is destroyed in the process – hence we have teleportation, and not replication. One of the prerequisites for teleportation is that B and C must first have interacted closely to create an entangled state, and then must be able to be transported to their final destinations. This means that we can teleport objects to places we've been before but not, say, to a galaxy or planet that we've never visited. As already mentioned, the system works because B and C are entangled. But there's a problem: over time, as objects are teleported, the entangled state is slowly depleted. It can be renewed by having B and C interact closely again, but this means transporting manually (without teleportation) both objects to the same place, and then back again to the sending and receiving stations. The idea is that one difficult journey can allow for many quick transfers in the future. Five years ago, physicists came up with an alternative approach to teleportation that is faster because it doesn't require the correction of C, but which is highly impractical because the entangled state is destroyed every single time that information is teleported. In both cases, entanglement can be effectively thought of as the "fuel" that powers teleportation. Now, a group of physicists at Cambridge, University College London and the University of Gdansk have worked out how entanglement could be "recycled" to increase the efficiency of these connections. They have developed two protocols that generalize the two known methods of quantum teleportation and provide an optimal solution in which the entangled state holds much longer for the teleportation of multiple objects, while eliminating the need for error correction.The first of these protocols can be used to teleport quantum states sequentially, while the second makes it possible to teleport several states at the same time, which speeds up the process and is of particular interest for applications in quantum computing. The result obtained by the researchers is purely theoretical and didn't involve any quantum information actually being teleported from one place to another. But interest in quantum teleportation is quickly surging, and labs around the world are racing to demonstrate the ability to teleport information at longer and longer distances – last year, for instance, scientists reported teleporting photons over a record 143 km (89 miles) – so it might not be long until this theoretical result is actually put into practice. But wait – didn't we say that distance shouldn't matter at all when two particles are entangled? While it is true that two particles remain entangled regardless of their distance, for the time being, we are only able to store the entangled state for a very short period of time. This means that, in practice, scientists must create an entangled state between particles B and C and then rush them to the sending and receiving stations as quickly as possible, before the entangled state is depleted. During the transmission, photon losses and signal decoherence also increase with distance, which makes things considerably worse – although scientists are actively tackling the problem. Beam me up, Scotty So will the teleportation of people ever be feasible? Last November, a group of Chinese scientists have managed to achieve teleportation from one macroscopic object to another – an ensemble of 100 million rubidium atoms – with an accuracy approaching 90 percent. The human body, on the other hand, is comprised of some 1029 matter particles, all of which would have to be teleported with an extreme degree of precision. There are other obstacles as well. As mentioned before, the object (or, in this case, person) being teleported will be destroyed at the sending station and reassembled at the receiving station. This could be painful for the traveler; however, the surviving copy is made before the original was destroyed, and so, from the point of view of our traveler – assuming that the traveler's conscience is transported with him – one could argue that no pain would ever be felt. Moreover, a human traveler is not a static system, and so the process of scanning and reconstructing him or her must be nearly instantaneous – lest we end up with a teleported version of our telenaut that is dramatically different from the original. One last consideration. At first, it would seem that quantum entanglement could hold the potential for travel at superluminal speeds: when two particles are entangled, no matter their distance, when we modify one particle, we also instantaneously modify the other. Unfortunately, all modern interpretations of quantum mechanics agree that this trick can't be used for faster-than-light travel. Nobody expects to achieve human teleportation in the foreseeable future: it is an extraordinarily tough engineering problem, and even though the process wouldn't violate any fundamental law of physics, we lack the technology to achieve it – or anything even remotely close to it. In a sense, this piece of research could be seen as a small step toward human teleportation, but don't hold your breath for Star Trek-style teleporters just yet. The study was published on the journal Physical Review Letters. An open-access version can be found here.
<urn:uuid:f2eb5bab-fe47-4aa1-a5df-a12d0daf0623>
CC-MAIN-2022-33
https://newatlas.com/energy-efficient-quantum-teleportation/25886/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572870.85/warc/CC-MAIN-20220817062258-20220817092258-00547.warc.gz
en
0.95334
1,560
3.78125
4
Quantum sensing exploits the properties of quantum mechanics to develop ultra-sensitive technology that can detect changes in electric and magnetic fields, and motion. Image Credit: SkillUp/Shutterstock.com A quantum object is characterized by its quantum mechanical behavior and properties. For example, the energy levels of a quantum object are quantized. This can be electronic, magnetic, or vibrational levels of atoms or molecules or spin states in superconductors. Another quantum characteristic is quantum coherence. This describes the ability of the quantum states to maintain their wave-like superposition over time, withstanding any environmental interference. Quantum entanglement is also a quantum mechanical feature that describes a quantum object. Entanglement refers to generating two or more entangled particles that have identical quantum characteristics regardless of the distance between them. What is Quantum Sensing? Quantum sensing is achieved when a quantum object is used to measure a physical quantity. Any of the quantum properties described above can be implemented for detection. Changes in a physical quantity can be precisely measured by quantum coherence, quantum entanglement, or quantum states. The physical parameter that a quantum sensor responds to will determine the type of quantum technology platform required. For example, trapped ions are sensitive to electric fields and will be an ideal probe for electric field detection. Spin-based quantum sensors respond primarily to magnetic fields. Some of the different quantum technology platforms and their applications in sensing are described below. Spin properties of neutral alkali atoms in their ground state are used in quantum sensing. The requisite conditions required for sensing can be prepared and read out by lasers. A thermal vapor of atoms at room temperature can be used as a magnetic probe. The Zeeman splitting of the atomic energy levels is used to detect weak magnetic fields. Magnetoencephalography (MEG) is a medical testing method that uses atomic vapor to measure magnetic fields produced by the brain's neural activity. In high-energy physics, atomic vapor-based sensing promises to enhance the detection of elementary particles. Laser-cooled atoms that free-fall inside a vacuum tube are used in gravimetry. The matter-wave property of quantum particles is used to calculate acceleration by atom interferometry. The free-falling atoms are probed by lasers and the phase shift in the laser beam caused by the atoms is measured. Gravimeters have the ability to detect gravity at a given location with very high sensitivity. An application where a gravity sensor has major implications is in construction projects. Infrastructure development is often delayed and costly because of unforeseen hidden features underground. Quantum gravimeters can detect risks early and assist in mitigating problems like sinkholes and mine shafts. Gravimeters can also be used to detect minerals and oils deep underground. An accelerometer uses the same concept as a quantum gravimeter, for navigation. The ability to track minute changes in acceleration can provide information about the terrain and the environment. Quantum navigators do not rely on Global Positioning Systems (GPS) to steer towards a target. Rydberg atoms are atoms that have absorbed energy to excite an electron to a higher, outer energy level. When the electron moves further from the nucleus of the atom, the strength of the atom’s polarization increases. This quality of Rydberg atoms makes them ideal quantum sensors for electric fields. Rydberg atoms have been successfully used as single microwave photon detectors. Rydberg atoms are also a popular candidate to simulate condensed matter systems due to their long-range interactions. Atomic clocks use very insensitive electronic transitions in specific atoms to keep time with extreme accuracy. Optical clocks are used as the absolute frequency reference and have a significant impact in any application where timekeeping is essential. For example, in GPS, for high-speed broadband communications, and in the development of autonomous vehicles. Electrical charge atomic ions trapped in eclectic or magnetic fields are also employed as quantum sensors. Laser-cooled motional states of trapped ions are extremely sensitive to electric fields and forces. Some advanced applications of trapped ions include ultrasensitive force microscopy, and detecting weak electric field noise above surfaces induced by absorbents. Trapped ions are also being explored as atomic clocks and as Rydberg ions. In the field of optomechanics, quantized mechanical vibrations coupled to light can detect weak forces. Apart from force measurements, optomechanical sensing applications include acceleration, magnetic fields, voltages, masses and spins. Quantum sensing is also achieved with photons, which are fundamental particles of light. Squeezed light, which produces partially entangled photons with quantum fluctuations below the shot noise limit, is used for extremely sensitive sensing applications. For example, the Laser Interferometer Gravitational-Wave Observatory (LIGO), employs squeezed light to detect gravitational waves. Nuclear magnetic resonance (NMR) Nuclear magnetic resonance (NMR) uses intrinsic spin properties of atomic nuclei to detect weak magnetic fields. NMR is one of the earliest quantum sensors to be commercialized. They have broad applications in clinical magnetic resonance imaging (MRI), geological and archaeological surveys, and space missions. NMR devices are sturdy and easy to operate. Defects in Diamond Color centers in diamond is another magnetic quantum sensor that has gained a wide range of applicability over the last decade. Electronic defects, fabricated in diamond crystals can be operated at room temperature with low-cost laser sources. Defects can be synthesized by injecting nitrogen, silicon, germanium, and other atoms into the diamond lattice. Microscopic mapping of magnetic fields enabled by nitrogen-vacancy centers in diamond (NV center) has led to imaging of magnetic organelles in bacteria, microscopic responses in meteorites as well Covid-19 diagnosis devices. The Superconducting Quantum Interference Device (SQUIDs) is a very sensitive magnetometer. Built with superconducting interferometers, SQUIDs are one of the oldest quantum sensors. SQUIDs have been successfully used for materials characterization and clinical magnetoencephalography. Quantum sensing has significantly advanced sensing technology in the last few years as highlighted in the examples above. With many government entities and private sectors accelerating quantum technology research and development, applications of quantum sensing will broaden and mature in the future. Other quantum mechanics-based device explorations in computing, simulation, and communications will have a profound impact on the growth of quantum sensing. More from AZoQuantum: What is Quantum Chemistry? References and Further Reading C. L. Degen, F. Reinhard, and P. Cappellaro, Quantum sensing, Rev. Mod. Phys. 89, 035002 – Published 25 July 2017 DOI:https://doi.org/10.1103/RevModPhys.89.035002 Mahiro Abe et al, Matter-wave Atomic Gradiometer Interferometric Sensor (MAGIS-100), Quantum Sci. Technol. 6 044003, 2021 https://doi.org/10.1088/2058-9565/abf719 Barzanjeh, S., Xuereb, A., Gröblacher, S. et al. Optomechanics for quantum technologies.Nat. Phys. 18, 15–24 (2022). https://doi.org/10.1038/s41567-021-01402-0
<urn:uuid:174d06e7-7863-4e29-bb91-665da622dd78>
CC-MAIN-2022-33
https://www.azoquantum.com/Article.aspx?ArticleID=324
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573242.55/warc/CC-MAIN-20220818154820-20220818184820-00748.warc.gz
en
0.888642
1,530
3.609375
4
Teleportation is the transfer of matter or energy from one location to another without either of them crossing the distance in the traditional physical sense. When Captain James T. Kirk of the "Star Trek" TV series and movies first told Starship Enterprise engineer, Montgomery "Scotty" Scott to "beam me up" in 1967, little did the actors know that by 1993, IBM scientist Charles H. Bennett and colleagues would propose a scientific theory that suggested the real-life possibility of teleportation. By 1998, teleportation became reality when physicists at the California Institute of Technology quantum-teleported a particle of light from one location to another in a lab without it physically crossing the distance between the two locations. While some similarities do exist between science fiction and science fact, the teleportation in the real world differs greatly from its fictional roots. Teleportation Roots: Quantum Physics and Mechanics The branch of science that led to that first teleportation in 1998 gets its roots from the father of quantum mechanics, German physicist Max Planck. His work in 1900 and 1905 in thermodynamics led him to the discovery of distinct packets of energy he called "quanta." In his theory, now known as Planck's constant, he developed a formula that describes how quanta, at a subatomic level, perform as both particles and waves. Many rules and principles in quantum mechanics at the macroscopic level describe these two types of occurrences: the dual existence of waves and particles. Particles, being localized experiences, convey both mass and energy in movement. Waves, representing delocalized events, spread across space-time, such as light waves in the electromagnetic spectrum, and carry energy but not mass as they move. For example, the balls on a pool table – objects that you can touch – behave like particles, while ripples on a pond behave like waves where there is "no net transport of water: hence no net transport of mass," writes Stephen Jenkins, physics professor at the University of Exeter in the U.K. Fundamental Rule: Heisenberg's Uncertainty Principle One fundamental rule of the universe, developed by Werner Heisenberg in 1927, now known as Heisenberg's uncertainty principle, says that there exists an intrinsic doubt affiliated with knowing the exact location and thrust of any individual particle. The more you can measure one of the particle's attributes, such as thrust, the more unclear the information about the particle's location becomes. In other words, the principle says you can't know both states of the particle at the same time, much less know the multiple states of many particles at once. On its own, Heisenberg's uncertainty principle makes the idea of teleportation impossible. But this is where quantum mechanics gets weird, and it's due to physicist Erwin Schrödinger's study of quantum entanglement. Spooky Action at a Distance and Schrödinger's Cat When summarized in the simplest of terms, quantum entanglement, which Einstein called "spooky action at a distance," essentially says that measurement of one entangled particle affects the measurement of the second entangled particle even if there's a wide distance between the two particles. Schrödinger described this phenomenon in 1935 as a "departure from classical lines of thought" and published it in a two-part paper in which he called the theory "Verschränkung," or entanglement. In that paper, in which he also spoke of his paradoxical cat – alive and dead at the same time until observation collapsed the existence of the cat's state into it being either dead or alive – Schrödinger suggested that when two separate quantum systems become entangled or quantumly linked because of a previous encounter, an explanation of the features of one quantum system or state is not possible if it does not include the characteristics of the other system, no matter the spatial distance between the two systems. Quantum entanglement forms the basis of quantum teleportation experiments scientists conduct today. Quantum Teleportation and Science Fiction Teleportation by scientists today relies upon quantum entanglement, so that what happens to one particle happens to the other instantaneously. Unlike science fiction, it doesn't involve physically scanning an object or a person and transmitting it to another location, because it's currently impossible to create a precise quantum copy of the original object or person without destroying the original. Instead, quantum teleportation represents moving a quantum state (like information) from one atom to a different atom across a considerable difference. Scientific teams from the University of Michigan and the Joint Quantum Institute at the University of Maryland reported in 2009 that they successfully completed this particular experiment. In their experiment, information from one atom moved to another a meter apart. Scientists held each atom in separate enclosures during the experiment. What the Future Holds for Teleportation While the idea of transporting a person or an object from the Earth to a distant location in space remains in the realm of science fiction for the moment, quantum teleportation of data from one atom to another has potential for applications in multiple arenas: computers, cybersecurity, the Internet and more. Basically any system that relies on transmitting data from one location to another could see data transmissions occur much faster than people can begin to imagine. When quantum teleportation results in data moving from one location to another without any time lapse because of superposition – the data existing in both the dual states of both 0 and 1 in a computer's binary system until measurement collapses the state into 0 or 1 – data moves faster than the speed of light. When this happens, computer technology will undergo a whole new revolution. - Analog: Science Fiction and Fact Magazine: All About Teleportation - Law and Business Review of the Americas: Telephonic Credit: The Next Generation of Branchless Banking in Mexico - California Institute of Technology: Caltech Physicists Achieve First Bona-fide Quantum Teleportation - University of Nebraska: Some Basic Ideas About Quantum Mechanics - University of Pittsburgh: Einstein on the Completeness of Quantum Theory - Stanford Encyclopedia of Philosophy: Quantum Entanglement and Information - University of Maryland: Joint Quantum Institute: First Teleportation Between Distant Atoms - Florida State University: Max Planck About the Author As a journalist and editor for several years, Laurie Brenner has covered many topics in her writings, but science is one of her first loves. Her stint as Manager of the California State Mining and Mineral Museum in California's gold country served to deepen her interest in science which she now fulfills by writing for online science websites. Brenner is also a published sci-fi author. She graduated from San Diego's Coleman College in 1972.
<urn:uuid:bd64ba41-33f6-4bee-930a-47705487bbb0>
CC-MAIN-2022-33
https://sciencing.com/is-teleportation-possible-in-real-life-13711526.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571086.77/warc/CC-MAIN-20220809185452-20220809215452-00748.warc.gz
en
0.92757
1,351
3.765625
4
The years 2020 and 2021 have changed the way we look at our workplaces. Digital and remote work has become much more prominent while the concept of physical workplaces has started to demand a rethink. It was said that major pipelines of the economy would migrate to the digital environment and the need for Artificial Intelligence and related technology would become much more important than ever before. As we stroll along a technological roadmap, the need as well as the demand for AI courses increases. When we look at applied AI courses, Google reviews suggest that such courses would reach the pinnacle of their popularity in the next five years. That said, artificial intelligence has reinvented itself as a technology in the post-pandemic era. It is now used for vaccine trials as well as vaccine development and treatment of various diseases and ailments. Autonomous driving or self-driving cars that ferry only a single passenger due to covid 19 restrictions and advanced chatbot technology that caters to the grievances of customers are recent highlights of AI technology. Let us understand such post-pandemic developments in artificial intelligence in some more detail. How did AI fast-tracked the development of new vaccines? Vaccine development is a very long process and takes many years to complete. There are usually three stages to vaccine development. Each stage takes no less than a year to get completed. However, with the help of artificial intelligence technology, we were able to analyse large data sets about Coronavirus from different countries of the world. With the help of artificial intelligence models, it also became possible to fast-track the process of data examination and vaccine trials from different countries of the world. In addition to this, artificial intelligence models also made the analysis of the sub-components or the proteins of the virus possible in a short span of time. The application of artificial intelligence technology in the genetic domain made it possible to create vaccines within one year of the first reported case of Coronavirus. When we look at the technical aspects of artificial intelligence technology, we understand the relevance of the linear fold AI algorithm that proved very handy for medical teams around the globe to examine the sequence of ribonucleic acids. The linear fold algorithm made it possible to examine and predict the secondary structure of the ribonucleic acids as well as the possible mutation that it undergoes. With the help of this algorithm, we were also able to predict the human immune response that would be generated on exposure of the human body to the inactivated virus. This reduced the time span between the development of the virus and its approval by the regulating bodies. How self-driving cars become the new normal for ferrying passengers in covid restrictions? Although the technology of autonomous vehicles was already under development for the last five years, it found a great fit with the situation created by the Covid 19 virus. Passengers needed to be ferried from one place to another and driverless cars proved to be the perfect mode of transportation for doing this without any chance of infection from another person. When we look at the technical aspects of driverless cars, we find that artificial intelligence has been able to conceive a reinforcement learning system within the vehicle that is able to learn from the environment so that the driving experience can be improvised in the long run. With the help of artificial intelligence, the safety aspects of the vehicle have also been addressed. It has become possible to connect the driverless car with the internet of things as well as satellite technology so that multiple safety levels can be created. The vehicle can also make a sense of the traffic up to a few kilometres and plan the ride accordingly. It is also possible to create a safety factor on the upper driving limit of the vehicle. In addition to this, new innovations in the form of a 5G remote driving service are in the final stage of testing. Furthermore, we have also seen the commercialisation of the technology of self-driving vehicles in Singapore as well as China. The Apollo go Robo taxi service has been launched in several cities in China and trial operations have concluded successfully. This is a positive sign for conceiving a full-fledged fleet of Robo taxis in the time to come. How has the advancement in chatbot technology led to an effective grievance redressal mechanism? In the post-pandemic era, there has been a renewed impetus for chatbot technology. The artificial intelligence technology that operates behind a chatbot is natural language processing. With the help of Natural Language Processing, we are able to analyse the various aspects of human language like intent as well as emotions. In addition to this, natural language processing technology is able to power the most sophisticated chatbots that can be used for communication with humans through digital channels. This technology is extremely important in various industries that interact with customers through a digital interface. The business process outsourcing industry, as well as the telecommunication industry, are the most important industries where chatbot technology finds its application. Since the Covid 19 virus, there have been constant innovations in chatbot technology as well as Natural Language Processing. The aim is to conceive the next generation of chatbots and virtual assistants that can communicate with humans by understanding sentiments, emotions and even linguistic patterns. One of the most important breakthroughs has come in the form of a novel framework for natural language generation called ERNIE-GEN. With the help of semantic modelling techniques that are used by ERNIE-GEN, it has become possible to maintain a human-like flow in dialogue engagement and question generation. How has the field of quantum computing witnessed significant advances with AI technology? Quantum computing has been our answer to the most complex computational tasks and the derivation of their solutions in a short span of time. This technology makes use of qubits that can simultaneously hold both values of zero and one and is a huge advancement over the erstwhile binary technology that we followed in computing operations. With the help of quantum computing technology, we have been able to process the largest amount of information possible and run various cloud processes simultaneously in real-time. Deep learning algorithms have played a great role in the advancement of quantum computing research. The next level of quantum computing would become possible once this technology is integrated with artificial intelligence. One example of this is the launch of paddle quantum which allows researchers to train Quantum neural networks with a lot of ease. In addition to this, we may witness further development in the field of Quantum computing as researchers line up to launch Quantum leaf. Quantum leaf provides such a type of development toolkit to researchers that can enable them to work in cloud-based Quantum computing ecosystems and also reduce the time span of quantum programming. Further innovation would be seen in the form of artificial intelligence devices like artificial intelligence chips that would be designed to perform specific tasks. A large number of companies have already started to make major breakthroughs in AI technology and further innovations and developments are in the pipeline.
<urn:uuid:507dc0bb-2f90-40ad-a126-8b5bc3474913>
CC-MAIN-2022-33
https://writfy.com/mega-developments-and-breakthroughs-in-ai-in-the-post-pandemic-era/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572127.33/warc/CC-MAIN-20220815024523-20220815054523-00150.warc.gz
en
0.964985
1,395
3.546875
4
In 2020, a star system close to Earth made headlines when it was discovered to host a black hole, but further investigations have revealed something much rarer dwelling in the system. Image Credit: alionaprof/Shutterstock.com Back in 2020, astronomers believed they had made a startling discovery lurking in Earth’s cosmic backyard. Using the FEROS spectrograph on the MPG/ESO 2.2 meter telescope located at the La Silla observatory in the Chilean desert, a team including researchers from the European Southern Observatory (ESO) believed they had found a black hole in the system HR 6819, located within the Milky Way. At a distance of just 1000 light-years from Earth, this would have made the black hole in HR 6819 the closest such object to Earth. So close in fact, that its host system can be seen with the naked eye. It would have also meant that the star system HR 6819 was a triple system consisting of two stars and a black hole. The discovery of the triple system with two stars and the invisible black hole was a complete surprise to astronomers when the research was published in a paper in the journal Astronomy & Astrophysics. However, not everyone was completely satisfied with the result. Another team of researchers set about testing if HR 6819 was indeed a triple system with a black hole. This team was joined by the initial team of astronomers keen to challenge their own results in what would prove to be a validation of the never-ending curiosity and drive for answers in science and those that practice it. Indeed, these researchers would discover that HR 6819’s black hole is absent. It was never really there at all. But, in its place is something rarer, a most extraordinary cosmic vampire — a star bloated after feeding from its companion. Astronomers Go Vampire Hunting Upon the close study of the HR 6819 system in 2020, astronomers were left with two competing and contradictory theories. There were two sources of light in the system orbiting each other Because this implies a third, dark body, purely supplying a source of gravity for one of the two luminous stars to orbit around — namely a black hole — authors of the Astronomy & Astrophysics paper and ESO astronomers Thomas Rivinius and Dietrich Baade believed what they had discovered was a triple system with a black hole. However, the researchers couldn’t rule out a binary system with an unusual star caught in a very brief, and thus extremely rare, phase after an interaction with its companion stripped it of stellar material. If the black hole scenario were true, the stars in the triple system version of HR 6819 should be far apart, whereas if the rare binary scenario was correct, the two stars should be closer together with no invisible interloper between them. Thus, the key to solving this conundrum was obtaining a clearer picture of HR 6819. Joined by ESO fellow Julia Bodensteiner and her team, the original researchers set about doing this by studying the mysterious system with ESO’s Very Large Telescope (VLT) and Very Large Telescope Interferometer (VLTI). Using these instruments, the team was able to determine that the stars are close together, orbiting each other in just a 40-day orbit period. This may initially seem disappointing, but the lack of a black hole makes HR 6819 no less fascinating. In fact, the astronomers discovered that one of the stars must have recently ‘fed’ upon the stellar material of the other, causing it to lose a large amount of mass — almost all of it in fact. A Rare And Short-Lived Cosmic Vampire Observing stars stripping material from a companion donor star is not uncommon, but the stage that follows this mass transfer is much tougher to spot. Following the loss of much of its material, the donor star quickly shrinks to become a very small and hot subdwarf. What the team observed in HR 6819 is a star that has lost a great deal of material but has yet to shrink to this state. This means that the mass loss must have occurred in the system’s recent history. This presents a unique opportunity for astronomers - a short window of time to observe the inner layers of a star after the outer layers have been stripped away. This short-lived and rare phase could reveal the history of the system and help researchers better understand what happens when one star feeds upon another. According to a recent and updated paper published in Astronomy & Astrophysics, all this means that the current best estimation of HD 6819 is that it is a binary system with no black hole in which two stars have interacted, with the stripping of mass from one star speeding it up like a spinning top. Astronomers have caught these stars in a rare phase of existence, meaning that not only could further investigations reveal more secrets buried in the binary, but could teach us about the evolution of binary stars. More from AZoQuantum: The Role of Diamonds in Quantum Computing References and Further Reading Rivinius. T.H, Baade. D, Hadrava. P, Heida. M, Klement. R, , ‘A naked-eye triple system with a non-accreting black hole in the inner binary,’ Astronomy and Astrophysics, [DOI: 10.1051/0004–6361/202038020] Frost. A.J., Bodensteiner. J., Baade. D., et al, , ‘HR 6819 is a binary system with no black hole,’ Astronomy & Astrophysics, https://www.aanda.org/articles/aa/full_html/2022/03/aa43004-21/aa43004-21.html
<urn:uuid:e766577b-494a-4650-95e9-38fa1c4ca0c0>
CC-MAIN-2022-33
https://www.azoquantum.com/Article.aspx?ArticleID=327
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573242.55/warc/CC-MAIN-20220818154820-20220818184820-00751.warc.gz
en
0.94774
1,213
3.84375
4
Graphene is the super substance that could replace silicon, plastic and glass By Marco Chiappetta The silicon, plastic, and glass that make up much of our tech these days could soon be replaced with something old, yet completely new: Graphene. If graphene sounds like something that could fell a superhero, you’re almost right. It’s the thinnest substance known to science, yet it’s 300 times stronger than steel and harder than a diamond. High-quality graphene is also transparent and flexible, and it’s an excellent conductor of heat and electricity. We’ve known of graphene’s existence since the mid-1800s, but scientists have been able to experiment with graphene only in the past decade. In 2004, two researchers at the University of Manchester isolated graphene for the very first time, using—believe it or not—a chunk of graphite and a roll of adhesive tape. So what exactly is graphene? Graphene is a crystalline structure composed entirely of carbon atoms, arranged in a hexagonal, honeycomb-like pattern. Graphene’s single-atom thinness (meaning it has length and width, but no height) makes it as close to 2D as any substance can be. Graphene is also a fundamental component of other allotropes (structurally different forms of the element carbon). These include charcoal, carbon nanotubes, and other fullerenes (molecules composed solely of carbon). It is graphene’s unique structure and composition that endows it with so many valuable properties. Carbon atoms have four electrons in their outer shell, three of which form strong covalent bonds with the electrons in neighboring carbon atoms. This gives graphene its signature hexagonal shape. The fourth electron in each carbon atom, now known to be fermions, behave like relativistic particles described by the Dirac equation (which, in another sci-fi twist, also implies the existence of antimatter). Getting back to graphene, it is those free electrons, in conjunction with the material’s relative uniformity, that make graphene such an excellent electrical and thermal conductor, superior to copper and silver respectively. The strong covalent bonds between the carbon atoms, meanwhile, give graphene its strength. Layers of graphene are bonded by weak van der Waals forces (the sum of attractive forces between two surfaces, accounting for a lizard’s ability to climb vertical walls, among other things). The bonds between the carbon atoms in each layer of graphene, on the other hand, are incredibly strong; in fact, a hammock fabricated from a single-atom-thick sheet of graphene could support a load of nearly 9 pounds. High-quality graphene is also lightweight, flexible, impermeable to other elements, and it’s virtually transparent. Thanks to the space between its atoms, the material absorbs just 2.3 percent of white light, allowing 97 percent to pass through. How graphene might be used Potential applications for graphene are nearly limitless. Numerous projects are already underway in industries ranging from consumer electronics to sporting goods. To date, graphene-based consumer products have been limited to items that use a small amount of the substance in protective coatings. Once the mysteries of graphene manufacturing have been unlocked—more on that later—you can expect to find the material everywhere. One area where graphene is likely to have the most immediate impact is the manufacture of flexible and transparent electronics, such as touchscreens. Graphene could replace indium, which is one of the rarest elements on Earth. (Carbon—the foundation of graphene—is one of the most abundant elements on the planet.) Graphene is also lighter, thinner, and stronger than indium. Ultra-strong windshields that double as display clusters are not out of the realm of possibility. Neither is Tony Stark’s transparent smartphone. Graphene’s electrical properties also render it an ideal material for building integrated circuits. During a Q&A session at the 2013 Intel Developers Forum, Intel CEO Brian Krzanich said the company is evaluating graphene’s potential use in chip manufacturing, replacing silicon. Routine use, he said, would be a “few generations” out, putting it roughly in the 2020 timeframe. Graphene might also serve as the foundation for next-generation solid-state capacitors that charge more quickly than today’s offerings and hold a charge for much longer. And graphene could usher in an age of ultra-powerful, lightweight batteries with far more capacity than anything available today. By super-cooling graphene and surrounding it in strong magnetic fields, researchers have also been able to alter the direction of the flow of electrons along graphene’s surface, based on the spin of the electrons, which opens up possibilities for quantum computing. Graphene won’t be relegated solely to electronics and display technology. Its excellent strength-to-weight ratio could also pave the way for strong, lightweight vehicles, while its transparency and electrical conductivity make it a good candidate for future solar panels. Punching nano-sized holes in a sheet of otherwise impermeable graphene could be used in machines that pull a single strand of DNA through the hole, for rapid DNA sequencing, or water purification or desalination. Before those fantastical devices can become reality, however, industry must first develop a reliable, cost-effective manufacturing process. That’s where the majority of current graphene research effort is concentrated. Graphene is being manufactured today using a number of methods: The “Scotch tape” method (also known as mechanical exfoliation or the cleavage method), is the simplest. This is how Andre Geim and Konstantin Novoselov isolated graphene from a larger hunk of graphite in 2004—research that led to their being awarded the Nobel Prize in Physics in 2010. The adhesive tape is used to extract small pieces of graphite from a larger chunk. A layer of graphene is peeled away from the graphite by continually folding the tape over the pieces and then separating the tape. The strength of the adhesive overcomes the weak van der Walls forces holding the layers of graphite together until there is a single layer, yielding graphene. Mechanical exfoliation can be used only to isolate relatively small pieces of graphene, however, so researchers are experimenting with other methods to produce larger quantities. Chemical vapor deposition (CVD) is one of the most promising. In this process, chemical vapors are evaporated in a furnace, leaving a graphene deposit on a thin metal substrate. A similar process has been used in the manufacture of very large integrated circuits (VLSI) for many years. Graphene can also be isolated by submerging graphite in a liquid and blasting it with ultrasonic waves to separate its individual layers, or by slicing an edge of a cylinder formed from graphene (also known as a carbon nanotube). Using these methods, scientists have been able to produce pieces of graphene of various qualities and sizes, including long graphene strands that have already been used to make super-capacitors. While some companies—most recently Samsung—have claimed breakthrough achievements in graphene manufacturing, most of the known work remains academic and has not yet scaled to real-world industrial applications. We’re still a ways off from widespread availability of graphene-based microprocessors, flexible touchscreens, and similarly exotic new devices. But when industry perfects a practical and inexpensive means of manufacturing graphene, you can bet it will become as ubiquitous as plastics are today. Image credits: The image at the top of this page is courtesy of Graphenea, a graphene manufacturer and supplier. The image of the graphite, adhesive tape dispenser, and graphene transistors was released by the copyright holder into the public domain. Dell CouponGet Xbox Live digital gift card at 10% off with Dell coupons
<urn:uuid:fc9b7df8-9547-4cb8-a567-6a86c0261c3c>
CC-MAIN-2022-33
https://www.pcworld.com/article/438921/graphene-is-the-super-substance-that-could-replace-silicon-plastic-and-glass.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573760.75/warc/CC-MAIN-20220819191655-20220819221655-00551.warc.gz
en
0.945642
1,657
3.625
4
Quantum mechanics is usually associated with weird and counterintuitive phenomena we can't observe in real life. But it turns out that quantum processes can occur in living organisms, too, and with very concrete consequences. Some species of birds, for example, use quantum mechanics to navigate. And as Plus found out at a conference on quantum physics and the nature of reality, which took place in Oxford in September, studying these little creatures' quantum compass may help us achieve the holy grail of computer science: building a quantum computer. At the conference Plus editor Rachel Thomas met up with the physicists Simon Benjamin and Erik Gauger, both from the University of Oxford, who were intrigued by research done with European Robins by biologists in Frankfurt, Germany. European robins spend their summers in Scandinavia, but avoid the chilly winter by migrating to North Africa in the autumn. Biologists believe that the birds' sense of direction comes from an internal quantum compass in the bird's eye which consists of two electrons and a quantum property called spin. Effectively, each electron behaves like a tiny bar magnet, which can point either up or down. (For a more detailed explanation of electron spin read this entertaining blog by Chad Orzel, which includes a demonstration by his toddler!) "The two electrons [in the bird compass] are correlated with each other, with their spins pointing in different directions, " explains Benjamin. "They get excited when a photon is absorbed in the bird's eye. The two electron spins then move apart from each other. The way they behave afterward, whether they stay correlated as they were originally, or the correlations change, depends on the Earth's magnetic field." Thus able to sense the Earth's magnetic field, the birds know which direction to fly in. Biologists have known about this theoretical model of the birds' navigation system, called the radical-pair model, for around thirty years. It's quantum mechanical, since spin is a quantum mechanical concept, but not sufficiently so to interest hard-core quantum physicists like Benjamin and Gauger. What caught their interest was some recent research by Roswitha and Wolfgang Wiltschko, from the Goethe University, into how easily the birds' quantum compass could be disrupted. To test the bird compass, the researchers had kidnapped some birds on their way down to North Africa and subjected them to a weak oscillatory electromagnetic field, that is a field whose strength jitters backwards and forwards about a million times a second. "That's an incredibly weak oscillatory field," says Benjamin. "Not only could it not possibly harm the birds, but it would be amazing if the birds could even tell that there was this [oscillation]." Surprisingly, though, this weak signal was enough to disrupt the birds' sense of direction. "The researchers found that at a particular speed of oscillation — 1.3 MHz — suddenly the birds were no longer able to orientate themselves," says Benjamin. "The direction they wanted to go in became random, no longer pointing to Africa." Intrigued that such a tiny perturbation should have an effect on the birds, Benjamin and Gauger looked at the mathematics describing what goes on in the birds' quantum compass. They were particularly interested to see how long it would take for the effect of the field to kick in, since basic physics suggests that detecting signals as weak as that takes some time. "There must be time for this tiny effect to build up and make a difference for the bird," says Benjamin. Using their equations Benjamin and Gauger calculated that it would take at least 120 microseconds for the birds' compass to get jammed by the field. That's very fast, certainly a time period like that can't be detected by humans, but in terms of quantum processes it's rather slow. And this is where quantum computers come into to the picture. As the name suggests, quantum computers work using quantum processes. No-one has as yet been able to build a useful working quantum computer, but once we do, these machines will be way faster and more powerful than ordinary computers. Electrons and their spins form the basic components of quantum computers. "In [quantum computing] you care about in which direction electron spins point and how they correlate with each other," says Benjamin. "But in order to make a quantum computer work , you must insulate these electron spins, the tiny magnets, from the rest of the world. For that reason people have been trying to come up with molecules that can protect electron spins, to isolate them from the rest of the world." A nitrogen-doped C60 molecule — an atom of nitrogen trapped in a carbon cage. Image Quantum Spin Dynamics group at the University of Oxford. What Benjamin and Gauger realised is that the same goes for the the birds. For the bird compass to work, interference from the outside world must be kept down to a very low level. "Otherwise it would mess up such a long lasting sensing process," says Benjamin. Since it takes the bird at least 120 microseconds to detect the oscillatory field, it must be able to insulate its quantum compass from the outside world for at least that length of time, perhaps more. That's compared to the record of 80 microseconds that's so far been achieved in the lab. "It seems that the way the bird protects the pair of two tiny magnets is better than the best we can do," says Benjamin. What's more, the exotic molecule used to insulate quantum systems in the lab — a nitrogen atom trapped inside a carbon cage, called N@C60 — is incredibly hard to make (and costs around £7 million a gramme). The birds certainly don't have access to this material, so the question is how they achieve their insulation and if we can copy their method to build quantum computers. "It's a series of ifs," says Benjamin. "Various things could be wrong. The experimental results could be wrong, or our basic idea of [how the quantum compass works] might be wrong. But if all the ingredients are correct and there really is this extraordinary protection of quantum information in the birds, then it's conceivable that we can work out what chemical it is and we might learn a thing or two." You can listen to the podcast of our interview with Simon and Erik, as well as our podcast from the conference on Quantum Physics and the Nature of Reality. You can also learn more about quantum mechanics from Simon in Caging Schrödinger's Cat, his series of audio and video podcasts about quantum nanotechnology. What happens if you rear a robin in any other place where the temperature is uniformly comfortable all year? Does it still want to fly to Africa during a European winter? Is this the same method used by all migratory birds and animals? Everyone knows that pigeons carried small messages for ancient kings. Many assume that the same pigeon could be used between any two places. They can't. The magnetic extractive properties unique to the place it was hatched in and bred for about a year get embedded within a pigeons body. Then you have to transport in its cage to say, a war front. A message capsule fixed to its feet will be delivered by that pigeon only its place of birth. It is as if it develops an invisible rubber band leash anchored to the first place. Humans have a similar desire to die in the place of their births. I think with humans it's more complex. If you have happy memories of many years spent at a place, then that becomes your rubber bands' anchor.
<urn:uuid:bc837847-5222-4807-9461-7f19370fecd0>
CC-MAIN-2022-33
https://plus.maths.org/content/comment/7689
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570921.9/warc/CC-MAIN-20220809094531-20220809124531-00152.warc.gz
en
0.964756
1,543
3.703125
4
by Thomas Stace The technology that allowed Marty McFly to travel back in time in the 1985 movie Back to the Future was the mythical flux capacitor, designed by inventor Doc Brown. We’ve now developed our own kind of flux capacitor, as detailed recently in Physical Review Letters. While we can’t send a DeLorean car back in time, we hope it will have important applications in communication technology and quantum computing. How did we do it? Well it’s all to do with symmetry. There are many kinds of symmetry in science, including one that deals with time reversal. Time reversal symmetry is a complex sort of symmetry that physicists like to think about, and relies on the imaginary as much as the real. Suppose you make a movie of an event occurring. You could then ask: “If I edited the movie to run backwards, and showed it to my friends, could they tell?” This might seem obvious: people don’t usually walk or talk backwards; spilt milk doesn’t spontaneously jump back into its carton; a golf ball doesn’t miraculously launch backwards from the fairway, landing perfectly balanced on the tee at the same moment as the club catches it. Source: Tom Stace But at a microscopic level, the story is not that clear. The collision of two billiard balls looks pretty similar in reverse; even more so for the collision of two atoms. A beam of light travelling in one direction obeys exactly the same laws of physics as a beam of light travelling in the opposite direction. Indeed, the basic equations of physics look essentially the same if we replace time with its negative. This mathematical transformation reverses the flow of time in our equations. Since the microscopic laws of physics appear to be unchanged under this mathematical transformation, we say the universe possesses time reversal symmetry, even though we cannot actually reverse time in reality. Unlike Doc Brown, we can’t make the clock tick backwards. There is a conceptual conflict here. At the macroscopic scale, the entropy of the universe — a measure of disorder or randomness — always increases, so that there is an arrow of time. This is obvious in our everyday experience: a scrambled egg is not reversible. How does this irreversiblity emerge from microscopic laws that are reversible? This remains a mystery. The Circulator Circuit Microscopic reversibility presents an important technological challenge. It complicates the diversion of electronic and radio signals around a circuit. There are various applications where engineers want electromagnetic signals (such as light or radio waves) in a circuit to behave a bit like cars around a roundabout. This is pictured below: a signal entering port A of the device should be directed to port B; a signal entering at B should go to port C; and a signal entering port Cshould be directed to port A, clockwise around the device. One way to do this is to use a network of amplifiers to switch signals as desired. But there is a profound result in quantum mechanics (the “no cloning theorem”) that means that amplification must always add noise, or randomness, to the signal. Sorry audiophiles: a perfect amplifier is impossible. If the signal is extremely weak, so that additional noise is intolerable, then noiseless circulation is accomplished with a device called a circulator. Such devices are used to separate very weak signals going to and from sensitive electronics, including in radar receivers, or in existing and future quantum computers. It turns out a device like this must locally break time reversal symmetry. If we made a movie of the signals coming and going from the circulator, and ran the movie backwards, it would look different. For example, we would see a signal entering port B and leaving via port A, rather than via C. But most devices in a quantum research laboratory, such as mirrors, beam splitters, lasers, atoms do not break time reversal symmetry, so cannot be used as circulators. Something else is needed. The practical way to break time reversal symmetry for real devices is to introduce a magnetic field. Like a rotating vortex in water, magnetic fields have a circulation, since they arise from electrical currents circulating in an electrical loop. The magnetic field defines a direction of rotation (clockwise or counterclockwise) for electrically charged particles and thus for electrical signals. So when physicists say that a device breaks time reversal symmetry, they usually mean that there is a magnetic field about somewhere. Commercial circulators are an anomaly in the world of electronics. Unlike transistors, diodes, capacitors and other circuit elements, basic materials science means that commercial circulators have not been miniaturised, and are still the size of a coin. Building them into large-scale integrated microelectronic circuits is therefore a challenge. This will become an increasing problem as we try to fit thousands of qubits on a quantum computer chip, each requiring its own circulator to enable control and read-out. Our Quantum Flux Capacitor We have developed a new way of building micrometer-sized circulators that can be fabricated on a microchip. We figured out how to integrate magnetic flux quanta — the smallest units of magnetic field — with microfabricated capacitors and other superconducting circuit elements, so that time-reversal symmetry can be broken. This lead to our new circulator proposal. As with conventional circulators, there is a magnetic field present. But because we can use just one magnetic flux quantum, our design can be microscopic. Sadly for history buffs, our design won’t help much in your DeLorean time machine: it doesn’t reverse time. But its magnetic field does break time-reversal symmetry as advertised and we expect these devices will find applications in future quantum technologies. Even sooner, they may help in high-bandwidth communications environments like mobile phone base stations in very dense populations, or for ultra-high sensitivity radar where every photon of the electromagnetic field counts. – – –
<urn:uuid:8ce31772-1b82-4b19-9a92-635557a89a2a>
CC-MAIN-2022-33
https://theminnesotasun.com/2018/06/13/scientists-have-created-a-flux-capacitor-that-could-unlock-new-dimensions-to-communications-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571950.76/warc/CC-MAIN-20220813111851-20220813141851-00553.warc.gz
en
0.92866
1,256
3.734375
4
Researchers at the Department of Energy’s Oak Ridge National Laboratory have developed a quantum chemistry simulation benchmark to evaluate the performance of quantum devices and guide the development of applications for future quantum computers. Their findings were published in npj Quantum Information. Quantum computers use the laws of quantum mechanics and units known as qubits to greatly increase the threshold at which information can be transmitted and processed. Whereas traditional “bits” have a value of either 0 or 1, qubits are encoded with values of both 0 and 1, or any combination thereof, allowing for a vast number of possibilities for storing data. While still in their early stages, quantum systems have the potential to be exponentially more powerful than today’s leading classical computing systems and promise to revolutionize research in materials, chemistry, high-energy physics, and across the scientific spectrum. But because these systems are in their relative infancy, understanding what applications are well suited to their unique architectures is considered an important field of research. “We are currently running fairly simple scientific problems that represent the sort of problems we believe these systems will help us to solve in the future,” said ORNL’s Raphael Pooser, principal investigator of the Quantum Testbed Pathfinder project. “These benchmarks give us an idea of how future quantum systems will perform when tackling similar, though exponentially more complex, simulations.” Pooser and his colleagues calculated the bound state energy of alkali hydride molecules on 20-qubit IBM Tokyo and 16-qubit Rigetti Aspen processors. These molecules are simple and their energies well understood, allowing them to effectively test the performance of the quantum computer. By tuning the quantum computer as a function of a few parameters, the team calculated these molecules’ bound states with chemical accuracy, which was obtained using simulations on a classical computer. Of equal importance is the fact that the quantum calculations also included systematic error mitigation, illuminating the shortcomings in current quantum hardware. Systematic error occurs when the “noise” inherent in current quantum architectures affects their operation. Because quantum computers are extremely delicate (for instance, the qubits used by the ORNL team are kept in a dilution refrigerator at around 20 millikelvin (or more than -450 degrees Fahrenheit), temperatures and vibrations from their surrounding environments can create instabilities that throw off their accuracy. For instance, such noise may cause a qubit to rotate 21 degrees instead of the desired 20, greatly affecting a calculation’s outcome. “This new benchmark characterizes the ‘mixed state,’ or how the environment and machine interact, very well,” Pooser said. “This work is a critical step toward a universal benchmark to measure the performance of quantum computers, much like the LINPACK metric is used to judge the fastest classical computers in the world.” While the calculations were fairly simple compared to what is possible on leading classical systems such as ORNL’s Summit, currently ranked as the world’s most powerful computer, quantum chemistry, along with nuclear physics and quantum field theory, is considered a quantum “killer app.” In other words, it is believed that as they evolve quantum computers will be able to more accurately and more efficiently perform a wide swathe of chemistry-related calculations better than any classical computer currently in operation, including Summit. “The current benchmark is a first step towards a comprehensive suite of benchmarks and metrics that govern the performance of quantum processors for different science domains,” said ORNL quantum chemist Jacek Jakowski. “We expect it to evolve with time as the quantum computing hardware improves. ORNL’s vast expertise in domain sciences, computer science and high-performance computing make it the perfect venue for the creation of this benchmark suite.” ORNL has been planning for paradigm-shifting platforms such as quantum for more than a decade via dedicated research programs in quantum computing, networking, sensing and quantum materials. These efforts aim to accelerate the understanding of how near-term quantum computing resources can help tackle today’s most daunting scientific challenges and support the recently announced National Quantum Initiative, a federal effort to ensure American leadership in quantum sciences, particularly computing. Such leadership will require systems like Summit to ensure the steady march from devices such as those used by the ORNL team to larger-scale quantum systems exponentially more powerful than anything in operation today. Access to the IBM and Rigetti processors was provided by the Quantum Computing User Program at the Oak Ridge Leadership Computing Facility, which provides early access to existing, commercial quantum computing systems while supporting the development of future quantum programmers through educational outreach and internship programs. Support for the research came from DOE’s Office of Science Advanced Scientific Computing Research program. “This project helps DOE better understand what will work and what won’t work as they forge ahead in their mission to realize the potential of quantum computing in solving today’s biggest science and national security challenges,” Pooser said. Next, the team plans to calculate the exponentially more complex excited states of these molecules, which will help them devise further novel error mitigation schemes and bring the possibility of practical quantum computing one step closer to reality. Read more from original source: https://www.eurekalert.org/pub_releases/2020-01/drnl-ora010220
<urn:uuid:42475d37-75ad-47ef-b65d-3966525c7c1f>
CC-MAIN-2022-33
https://cvmr.ca/news/3139/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571719.48/warc/CC-MAIN-20220812140019-20220812170019-00154.warc.gz
en
0.926598
1,101
3.59375
4
CHICAGO: Flashes of what may become a transformative new technology are coursing through a network of optic fibres under Chicago. Researchers have created one of the world’s largest – a field of science that depends on paradoxes so strange that Albert Einstein didn’t believe them. The network, which connects the University of Chicago with Argonne National Laboratory in Lemont, is a rudimentary version of what scientists hope someday to become the. For now, it’s opened up to businesses and researchers to test fundamentals of quantum information sharing. The network was announced this week by the Chicago Quantum Exchange – which also involves Fermi National Accelerator Laboratory, Northwestern University, the University of Illinois and the University of Wisconsin. With a US$500mil (RM2.2 trillion) federal investment in recent years and US$200mil (RM880mil) from the state, Chicago, Urbana-Champaign, and Madison form a leading region for quantum information research. Why does this matter to the average person? Because quantum information has the potential to help crack currently unsolvable problems, both threaten and protect private information, and lead to breakthroughs in agriculture, medicine and climate change. While classical computing uses bits of information containing either a 1 or zero, quantum bits, or qubits, are like a coin flipped in the air – they contain both a 1 and zero, to be determined once it’s observed. That quality of being in two or more states at once, called superposition, is one of the many paradoxes of quantum mechanics – how particles behave at the atomic and subatomic level. It’s also a potentially crucial advantage, because it can handle exponentially more complex problems. Another key aspect is the property of entanglement, in which qubits separated by great distances can still be correlated, so a measurement in one place reveals a measurement far away. The newly expanded Chicago network, created in collaboration with Toshiba, distributes particles of light, called photons. Trying to intercept the photons destroys them and the information they contain – making it far more difficult to The new network allows researchers to “push the boundaries of what is currently possible,” said University of Chicago professor David Awschalom, director of the Chicago Quantum Exchange. However, researchers must solve many practical problems before large-scale quantum computing and networking are possible. For instance, researchers at Argonne are working on creating a “foundry” where dependable qubits could be forged. One example is a with tiny pockets to hold and process qubits of information. Researchers at Argonne also have by freezing neon to hold a single electron. Because quantum phenomena are extremely sensitive to any disturbance, they might also be used as tiny sensors for medical or other applications – but they’d also have to be made more durable. The quantum network was launched at Argonne in 2020, but has now expanded to Hyde Park and opened for use by businesses and researchers to test new communication devices, security protocols and algorithms. Any venture that depends on secure information, such as banks’ financial records of hospital medical records, would potentially use such a system. Quantum computers, while in development now, may someday be able to perform far more complex calculations than current computers, such as, which could be useful in developing drugs to treat diseases such as Alzheimer’s. In addition to driving research, the quantum field is stimulating economic development in the region. A hardware company, EeroQ, announced in January that it’s moving its headquarters to Chicago. Another local software company,, was recently acquired, and several others are starting up in the region. Because quantum computing could be used to hack into traditional encryption, it has also attracted the bipartisan attention of federal lawmakers. The National Quantum Initiative Act was signed into law by President Donald Trump in 2018 to accelerate quantum development for national security purposes. In May, President Joe Biden directed federal agency to migrate to quantum-resistant cryptography on its most critical defence and intelligence systems. Ironically, basic mathematical problems, such as 5+5=10, are somewhat difficult through quantum computing. Quantum information is likely to be used for high-end applications, while classical computing will likely continueto be practical for many daily uses. Renowned physicist Einstein famously scoffed at the paradoxes and uncertainties of quantum mechanics, saying that God does not “play dice” with the universe. But quantum theories have been proven correct in applications from nuclear energy to MRIs. Stephen Gray, senior scientist at Argonne, who works on algorithms to run on quantum computers, said quantum work is very difficult, and that no one understands it fully. But there have been significant developments in the field over the past 30 years, leading to what some scientists jokingly called Quantum 2.0, with practical advances expected over the next decade. “We’re betting in the next five to 10 years there’ll be a true quantum advantage (over classical computing),” Gray said. “We’re not there yet. Some naysayers shake their canes and say it’s never going to happen. But we’re positive.” Just as early work on conventional computers eventually led to cellphones, it’s hard to predict where quantum research will lead, said Brian DeMarco, professor of physics at the University of Illinois at Urbana-Champaign, who works with the Chicago Quantum Exchange. “That’s why it’s an exciting time,” he said. “The most important applications are yet to be discovered.” – Chicago Tribune/dpa
<urn:uuid:d60d3089-e84f-4a74-ab94-3551ba4cb385>
CC-MAIN-2022-33
https://venturecurrent.com/chicago-network-plans-to-remodel-computing-medicine-cybersecurity/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571996.63/warc/CC-MAIN-20220814052950-20220814082950-00754.warc.gz
en
0.939515
1,173
3.578125
4
Superconductivity is a fascinating phenomenon in which, below a so-called critical temperature, a material loses all its resistance to electrical currents. In certain materials, at low temperatures, all electrons are entangled in a single, macroscopic quantum state, meaning that they no longer behave as individual particles but as a collective – resulting in superconductivity. The general theory for this collective electron behaviour has been known for a long time, but one family of materials, the cuprates, refuses to conform to the paradigm. It was long thought that for these materials the mechanism that ‘glues together’ the electrons must be special, but recently the attention has shifted and now physicists investigate the non-superconducting states of cuprates, hoping to find out their differences with normal superconductors. Most superconductors, when heated to exceed their critical temperature, change into ‘ordinary’ metals. The quantum entanglement that causes the collective behaviour of the electrons fades away, and the electrons start to behave like an ordinary ‘gas’ of charged particles. Cuprates are special, first of all because their critical temperature is considerably higher than that of other superconductors. On top of that, they have very special measurable properties even in their ‘metal phase’. In 2009, physicist Nigel Hussey observed experimentally that the electrons in these materials form a new type of structure, different from that in ordinary metals, and the term ‘strange metal’ was born. At nearly the same time, originating in Stanford in the United States, physicists started applying the theoretical machinery of string theory – a theory for a very different phenomenon, the behavior of gravity at the quantum level – to the description of electrons in metals. Completely unexpectedly, this machinery turned out to be able to predict certain phenomena that experimentally were known to occur in cuprates and other strange metals. Theoretical physicists Jan Zaanen and Koenraad Schalm (Leiden University) were involved in the early stages of these developments and made important contributions. In 2017, the pioneering work was transformed into a national research programme funded by NWO: Strange Metals. The programme is a special collaboration that involves both experimental and theoretical groups. Special behaviour at low temperatures The higher the temperature of a material, the more ‘noise’ measurements will show. To make the special properties of the strange metal state clearly visible, one would like to study the material at a temperature that is as low as possible, at most 1 degree above the absolute temperature minimum of -273°C. The obstacle for this is superconductivity itself: most strange metals already turn into superconductors when cooled to temperatures around -200°C. For this reason, in the Strange Metals programme, the choice was made to focus exclusively on a material with the chemical name Bi2Sr2CuO6, also known as ‘Bi2201’. This material becomes superconducting at about 35 degrees above the absolute minimum temperature. That is still too ‘hot’ for good measurements, but now the researchers can use a trick: superconductivity can be suppressed by a magnetic field. The general rule of thumb is: the larger the critical temperature of a material, the stronger the magnetic field required to suppress superconductivity. Since for Bi2201 the critical temperature is already quite low, the required magnetic field comes just within reach of the biggest magnets available in the Netherlands. This allowed PhD students Jake Ayres and Maarten Berben working within the groups of Hussey (HFML-FELIX, Bristol) and Van Heumen to eventually study the strange metal state of Bi2201 at various low temperatures and various magnetic field strengths. In this domain, the differences between strange metals and ordinary metals become strikingly visible. For ordinary metals, for example, one expects the electrical resistance to increase quadratically with temperature: increase the temperature by a factor of two, and the resistance will grow by a factor of four. The same holds if it is not the temperature but the magnetic field that is increased. The Dutch/UK team has now shown that these golden rules do not hold for cuprates. In these materials a new phase exists where the resistance depends linearly on the temperature and field strength: if one of these increases by a factor of two, so does the resistance. Contrary to what was observed before, the group discovered that this behaviour persists for a large range of the parameters. At the moment, there are two widely accepted theories that could explain the linear behaviour of the resistance. The first theory assumes that the linear behaviour only occurs near very specific values of the temperature and magnetic field strength. With the new measurements, this theory has now come under considerable pressure. The second theory is the theory of extreme quantum entanglement that comes from the string theoretic approach. Within this theory it is possible to observe the linear behavior for a large range of parameters. Surprisingly, therefore, it seems that to describe strange metals, one truly needs a theory that can also be used to describe quantum gravity! Quantum gravity in the lab The link between strange metals and quantum gravity has special observable effects. In an extensive analysis, the team shows that within the conventional models of electrical transport, it is absolutely impossible to properly explain the data. Their analysis shows that there exists a previously unobserved mechanism that makes the electrons lose energy. This loss occurs at extremely short time scales related to a fundamental constant of nature in quantum mechanics: Planck’s constant. According to general theory, this is the shortest time scale at which a quantum system can lose energy – something which moreover is only possible when the system is maximally entangled. This fingerprint of quantum gravity behaviour in the data excites many supporters of the link with string theory: it would be a first clue of physics far beyond the usual model of metals. To shed further light on the tension between ‘normal’ and ‘strange’ behaviour of metals, further experiments are needed. In that respect, promising developments still lie ahead within the Strange Metals program. Using a technique called ‘optical spectroscopy’, Van Heumen expects to be able to provide new details soon, and the groups of Mark Golden (Amsterdam) and Milan Allan (Leiden) are also working on results that could cause new surprises when it comes to the mysterious relation between quantum gravity and strange metals. Incoherent transport across the strange metal regime of overdoped cuprates, J. Ayres, M. Berben, M. Čulo, Y.-T. Hsu, E. van Heumen, Y. Huang, J. Zaanen, T. Kondo, T. Takeuchi, J. R. Cooper, C. Putzke, S. Friedemann, A. Carrington and N. E. Hussey. Nature 595 (2021) 661-666.
<urn:uuid:b7c650bc-ec08-436c-babf-04b03c1f1644>
CC-MAIN-2022-33
https://hum.uva.nl/en/shared-content/subsites/institute-of-physics/en/news/2021/07/from-quantum-gravity-to-strange-metals.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570730.59/warc/CC-MAIN-20220807211157-20220808001157-00754.warc.gz
en
0.92629
1,426
3.796875
4
Feb 27, 2015 The values of two inherent properties of one photon – its spin and its orbital angular momentum – have been transferred via quantum teleportation onto another photon for the first time by physicists in China. Previous experiments have managed to teleport a single property, but scaling that up to two properties proved to be a difficult task, which has only now been achieved. The team's work is a crucial step forward in improving our understanding of the fundamentals of quantum mechanics and the result could also play an important role in the development of quantum communications and quantum computers. Alice and Bob Quantum teleportation first appeared in the early 1990s after four researchers, including Charles Bennett of IBM in New York, developed a basic quantum teleportation protocol. To successfully teleport a quantum state, you must make a precise initial measurement of a system, transmit the measurement information to a receiving destination and then reconstruct a perfect copy of the original state. The "no-cloning" theorem of quantum mechanics dictates that it is impossible to make a perfect copy of a quantum particle. But researchers found a way around this via teleportation, which allows a flawless copy of a property of a particle to be made. This occurs thanks to what is ultimately a complete transfer (rather than an actual copy) of the property onto another particle such that the first particle loses all of the properties that are teleported. The protocol has an observer, Alice, send information about an unknown quantum state (or property) to another observer, Bob, via the exchange of classical information. Both Alice and Bob are first given one half of an additional pair of entangled particles that act as the "quantum channel" via which the teleportation will ultimately take place. Alice would then interact the unknown quantum state with her half of the entangled particle, measure the combined quantum state and send the result through a classical channel to Bob. The act of the measurement itself alters the state of Bob's half of the entangled pair and this, combined with the result of Alice's measurement, allows Bob to reconstruct the unknown quantum state. The first experimentation teleportation of the spin (or polarization) of a photon took place in 1997. Since then, the states of atomic spins, coherent light fields, nuclear spins and trapped ions have all been teleported. But any quantum particle has more than one given state or property – they possess various "degrees of freedom", many of which are related. Even the simple photon has various properties such as frequency, momentum, spin and orbital angular momentum (OAM), which are inherently linked. More than one Teleporting more than one state simultaneously is essential to fully describe a quantum particle and achieving this would be a tentative step towards teleporting something larger than a quantum particle, which could be very useful in the exchange of quantum information. Now, Chaoyang Lu and Jian-Wei Pan, along with colleagues at the University of Science and Technology of China in Hefei, have taken the first step in simultaneously teleporting multiple properties of a single photon. In the experiment, the team teleports the composite quantum states of a single photon encoded in both its spin and OAM. To transfer the two properties requires not only an extra entangled set of particles (the quantum channel), but a "hyper-entangled" set – where the two particles are simultaneously entangled in both their spin and their OAM. The researchers shine a strong ultraviolet pulsed laser on three nonlinear crystals to generate three entangled pairs of photons – one pair is hyper-entangled and is used as the "quantum channel", a second entangled pair is used to carry out an intermediate "non-destructive" measurement, while the third pair is used to prepare the two-property state of a single photon that will eventually be teleported. The image above represents Pan's double-teleportation protocol – A is the single photon whose spin and OAM will eventually be teleported to C (one half of the hyper-entangled quantum channel). This occurs via the other particle in the channel – B. As B and C are hyper-entangled, we know that their spin and OAM are strongly correlated, but we do not actually know what their values are – i.e. whether they are horizontally, vertically or orthogonally polarized. So to actually transfer A's polarization and OAM onto C, the researchers make a "comparative measurements" (referred to as CM-P and CM-OAM in the image) with B. In other words, instead of revealing B's properties, they detect how A's polarization and OAM differ from B. If the difference is zero, we can tell that A and B have the same polarization or OAM, and since B and C are correlated, that C now has the same properties that A had before the comparison measurement. On the other hand, if the comparative measurement showed that A's polarization as compared with B differed by 90° (i.e. A and B are orthogonally polarized), then we would rotate C's field by 90° with respect to that of A to make a perfect transfer once more. Simply put, making two comparative measurements, followed by a well-defined rotation of the still-unknown polarization or OAM, would allow us to teleport A's properties to C. One of the most challenging steps for the researchers was to link together the two comparative measurements. Referring to the "joint measurements" box in the image above, we begin with the comparative measurement of A and B's polarization (CM-P). From here, either one of three scenarios can take place – one photon travels along path 1 to the middle box (labelled "non-destructive photon-number measurement"); no photons enter the middle box along path 1; or two single photons enter the middle box along path 1. The middle box itself contains the second set of entangled photons mentioned previously (not shown in figure) and one of these two entangled photons is jointly measured with the incoming photons from path 1. But the researcher's condition is that if either no photons or two photons enter the middle box via path 1, then the measurement would fail. Indeed, what the middle box ultimately shows is that exactly one photon existed in path 1, and so exactly one photon existed in path 2, given that two photons (A and B) entered CM-P. To show that indeed one photon existed in path two required the third and final set of entangled photons in the CP-OAM box (not shown), where the OAM's of A and B undergo a comparative measurement. The measurements ultimately result in the transfer or teleportation of A's properties onto C – although it may require rotating C's (as yet unknown) polarization and OAM depending on the outcomes of the comparative measurements, but the researchers did not actually implement the rotations in their current experiment. The team's work has been published in the journal Nature this week. Pan tells physicsworld.com that the team verified that "the teleportation works for both spin-orbit product state and hybrid entangled state, achieving an overall fidelity that well exceeds the classical limit". He says that these "methods can, in principle, be generalized to more [properties], for instance, involving the photon's momentum, time and frequency". Physicist Wolfgang Tittel from the University of Calgary, who was not involved in the current work (but wrote an accompanying "News and Views" article in Nature) explains that the team verified that the teleportation had indeed occurred by measuring the properties of C after the teleportation. "Of course, the no-cloning theorem does not allow them to do this perfectly. But it is possible to repeat the teleportation of the properties of photon A, prepared every time in the same way, many times. Making measurements on photon C (one per repetition) allows reconstructing its properties." He points out that although the rotations were not ultimately implemented by the researchers, they found that "the properties of C differed from those of A almost exactly by the amount predicted by the outcomes of the comparative measurements. They repeated this large number of measurements for different preparations of A, always finding the properties of C close to those expected. This suffices to claim quantum teleportation". While it is technically possible to extend Pan's method to teleport more than two properties simultaneously, this is increasingly difficult because the probability of a successful comparative measurement decreases with each added property. "I think with the scheme demonstrated by [the researchers], the limit is three properties. But this does not mean that other approaches, either other schemes based on photons, or approaches using other particles (e.g. trapped ions), can't do better," says Tittel. Pan says that to teleport three properties, their scheme "needs the experimental ability to control 10 photons. So far, our record is eight photon entanglement. We are currently working on two parallel lines to get more photon entanglement." Indeed, he says that the team's next goal is to experimentally create "the largest hyper-entangled state so far: a six-photon 18-qubit Schrödinger cat state, entangled in three degrees-of-freedom, polarization, orbital angular momentum, and spatial mode. To do this would provide us with an advanced platform for quantum communication and computation protocols". The work is published in Nature.
<urn:uuid:aaba1d6b-a7dd-44c6-bddc-0c0050ba7281>
CC-MAIN-2022-33
https://seqre.net/two-quantum-properties-teleported-together-first-time
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573104.24/warc/CC-MAIN-20220817183340-20220817213340-00355.warc.gz
en
0.937289
1,899
3.828125
4
Scientists pinpoint the singularity for quantum computers Researchers from the University of Bristol have discovered that super-powerful quantum computers, which scientists and engineers across the world are racing to build, need to be even more powerful than previously thought before they can beat today's ordinary PCs. Quantum computers are a new type of machine that operate on quantum mechanical hardware and are predicted to give enormous speed advantages in solving certain problems. Research groups at leading universities and companies, including Google, Microsoft and IBM, are part of a worldwide race to realise the first quantum computer that crosses into the 'quantum computational singularity'. This represents a problem so complex that today's top supercomputer would take centuries to find a solution, while a quantum computer could crack it in minutes. Now a team of scientists from Bristol have discovered that the boundary to this singularity is further away than previously thought. The research is reported this week in Nature Physics. The results apply to a highly influential quantum algorithm known as 'boson sampling', which was devised as a very direct route to demonstrate quantum computing's supremacy over classical machines. The boson sampling problem is designed to be solved by photons (particles of light) controlled in optical chips – technology pioneered by Bristol's Quantum Engineering and Technology Labs (QETLabs). Predicting the pattern of many photons emerging from a large optical chip is related to an extremely hard random matrix calculation. With the rapid progress in quantum technologies, it appeared as though a boson sampling experiment that crossed into the quantum computational singularity was within reach. However, the Bristol team were able to redesign an old classical algorithm to simulate boson sampling, with dramatic consequences. Dr Anthony Laing, who heads a group in QETLabs and led this research, said: "It's like tuning up an old propeller aeroplane to go faster than an early jet aircraft. "We're at a moment in history where it is still possible for classical algorithms to outperform the quantum algorithms that we expect to ultimately be supersonic. "But demonstrating such a feat meant assembling a crack team of scientists, mathematicians, and programmers." Classical algorithms expert Dr Raphaël Clifford, from Bristol's Department of Computer Science, redesigned several classical algorithms to attack the boson sampling problem, with the 1950's Metropolised Independence Sampling algorithm giving the best performance. The simulation code was optimised by QETLabs researcher 'EJ', a former LucasArts programmer. Expertise on computational complexity came from Dr Ashley Montanaro, of Bristol's School of Mathematics, while QETLabs students Chris Sparrow and Patrick Birchall worked out the projected performance of the competing quantum photonics technology. At the heart of the project and bringing all these strands together was QETLabs PhD student and first author on the paper, Alex Neville, who tested, implemented, compared, and analysed, all of the algorithms. He said: "The largest boson sampling experiment reported so far is for five photons. "It was believed that 30 or even 20 photons would be enough to demonstrate quantum computational supremacy." Yet he was able to simulate boson sampling for 20 photons on his own laptop, and increased the simulation size to 30 photons by using departmental servers. Alex added: "With access to today's most powerful supercomputer, we could simulate boson sampling with 50 photons." The research builds on Bristol's reputation as a centre of activity for quantum science and the development of quantum technologies. Through QETLabs, the university has embarked on an ambitious programme to bring quantum technologies out of the laboratory and engineer them in to useful devices that have real-world applications for tackling some of society's toughest problems. In addition to collaborations with tech companies such as Microsoft, Google, and Nokia, start-ups and new business activities focused on quantum technologies have emerged in Bristol. An important theme across the overall quantum research activity is developing our understanding of exactly how quantum technologies can provably outperform conventional computers. Recently Dr Montanaro, together with Professor Noah Linden of the School of Mathematics, organised a Heilbronn Focused Research Group on the topic of quantum computational supremacy. This meeting brought some of the world leaders in the field, from both industry and academia, to Bristol for a week of intense discussions and collaboration. Among the attendees was one of the theorists who devised boson sampling, Professor Scott Aaronson, from UT Austin. Although outperforming classical computers might take a little longer than originally hoped, Dr Laing is still optimistic about the prospects for building a device to do just that. He said: "We now have a solid idea of the technological challenge we must meet to demonstrate that quantum machines can out-compute their classical counterparts. For boson sampling, the singularity lies just beyond 50 photons. It's a tougher nut to crack than we first thought, but we still fancy our chances." With Dr Laing's group focused on practical applications of quantum technologies, the current work puts bounds on the size and sophistication of photonic devices that will be required to tackle industrially relevant problems that are beyond the capabilities of today's classical algorithms.
<urn:uuid:728c0b5f-0f76-4096-860a-a7c92f99df68>
CC-MAIN-2022-33
https://phys.org/news/2017-10-scientists-singularity-quantum.html?utm_content=buffer1feee&utm_medium=social&utm_source=facebook.com&utm_campaign=buffer
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573849.97/warc/CC-MAIN-20220819222115-20220820012115-00752.warc.gz
en
0.937086
1,070
3.75
4
In a world where technological advances are constantly on the rise, the word ‘impossible’ does not exist. From virtual reality, cryptocurrencies and quantum computing to flying cars of Dubai Police, things are only starting to get more advanced and innovative. This, in turn, has led to massive progress in countries all over the world, as well as open opportunities for people working in tech. One of the greatest innovations of the century includes artificial intelligence, commonly referred to as AI. When people hear this term, they usually think of robots wreaking havoc upon humans on earth. They are usually portrayed as evil forces that aim to overthrow the human race, making them a debatable topic in scientific and academic circles. But are any of those portrayals real? Continue reading below for more information. What is artificial intelligence (AI)? Back when Alan Turing broke the Nazi cypher device Enigma that helped the Allied Forces win the war, he turned the tides and changed history once again. He wanted to ask and answer the question: Can machines have the human capacity to think? During the 1950s, he published a seminal paper called Computing Machinery and Intelligence. It is considered to be the first scientific paper that established the foundation, goals and vision of artificial intelligence, opening new doors for further research in the field. Artificial intelligence (AI) defines the simulation of human intelligence in machines that are automated to mimic the actions and thoughts of human beings. It often revolves around human characteristics that include the capacity to reason, find meaning and learn from previous experiences, among other things. How does it work? Even though the term AI has been popular for many years now, a huge chunk of the population that is not tech-savvy still doesn’t know how it works. How did scientists, robotics engineers and software developers manage to make computers act like humans? How is that possible? Oftentimes, when people discuss the mechanisms of AI, they focus on one component which is machine learning. Before a machine can learn certain algorithms and patterns, AI should have a steady foundation of specialised hardware and software design. Some developers would use programming languages such as Java, R and Python, to name a few. AI systems work through a combination of sets of data that focuses on analysing patterns and correlations. Through these data patterns, an AI machine can predict future events and execute human-like actions. For instance, when making a chatbot, developers will incorporate examples of text chats. This will make it easier for the system to create meaningful exchanges with people, making them feel like they’re not just talking to a machine. Pros and cons of artificial intelligence (AI) There’s no doubt that the invention of AI has contributed lots of progress toward machine learning, revolutionizing various fields like healthcare, autonomous flying and shopping. However, just like any other technological advance, it comes with its advantages and disadvantages. When does AI become harmful and become beneficial to different sectors of society? If you’re an adult that has a full-time job, working for eight hours straight can put so much strain on your well-being. This is why you have to take breaks that would help you rest your mind and body so you can perform better at work. You also have to take your paid time off, rest on the weekends and prioritize other things outside of work. This means that human workers won’t be available to provide services 24/7. Humans are not designed that way. However, with AI machines, companies can make them work nonstop without the fear of putting someone’s health at risk. When companies focus on AI, they would increase their productivity rate, generate larger revenues and lessen costs in hiring new employees. Takes risks that humans are not capable of There are still undiscovered parts of the world that humans haven’t reached yet, especially the deepest parts of oceans. When basic ethics are applied, you can’t just send someone on an exploration quest and put them in grave danger. This also means no matter how trained an individual is, forcing them to defuse bombs during disasters or mine for coal and oil is still considered a risky business. Thanks to artificial intelligence, all of these things are possible. Let’s take a look at the 1986 Chernobyl nuclear power plant explosion that happened in Ukraine. During that point in history, there were no AI-powered robots that can be used to lessen the effects of radiation and put the fire under control. Consequently, the humans who took the risk to get close and address the situation at hand died within minutes. What does this mean? If AI was used during this deadly, hazardous situation, hundreds of lives could have been saved. Handles repetitive jobs well There are a lot of repetitive jobs in the market that can cause increased burnout, reduced creativity, less employee engagement and costly labour costs, among other things. This includes bookkeeping, telemarketing, proofreading and research analytics. These jobs, according to experts, will be replaced by automation and computerization, helping companies better analyse customer behaviour and data. For instance, when it comes to market research, marketers and advertisers would do a great job in creating meaningful content, products and messaging. However, through AI and automated surveys, marketing companies can compile huge sets of data in one go. An example would be GrowthBot, conducting market research with just one click of a Slack command. It can also lessen lengthy bank processes. When you visit banks, you have to undergo several document verifications when applying for a loan. But through AI Cognitive Automation, banks can speed up transactions in under a minute, increasing customer satisfaction and overall productivity. Makes the right decisions Since there is a complete absence of emotions in AI-powered systems, they can make the right decisions in a short period. It only works with its programmed data and previous automated tasks, helping it settle on a decision that’s not bound by emotions or practicality. This makes it the ideal choice for many industries, especially healthcare. In Cambridge, Massachusetts, PathAI is starting to develop a machine learning system that would help pathologists diagnose illnesses more accurately. To further expand its goals, the company has collaborated with drug developers and organizations such as the Bill & Melinda Gates Foundation and Bristol-Myers Squibb. AI is also used in diagnosing deadly blood diseases in Boston, Massachusetts. Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School, is using AI-enhanced microscopes to look for harmful bacteria such as staphylococcus and E. coli. Doctors are now studying blood samples faster than they did when they had to depend on manual scanning. The machines have 95% accuracy after scientists fed them 25,000 photos of blood samples. Lack of creativity Yes, you can teach machines how to think, talk like humans and study with big sets of data within seconds, but you can never teach them creativity. Keep in mind that they can only execute the commands and data integrated into them, so when it comes to being creative, they can’t compete with the human brain. Humans are known for their intellect and emotions. They know how to push the limit, think out of the box and make things that machines cannot do. Their thoughts completely depend on their feelings and comprehension that AI-powered systems can never replicate. Perhaps the most dangerous effect of AI development is increased unemployment. Since more companies are aiming to generate sales and reduced costs, they are prompted to replace human workers with AI and automation. They argue that AI robots can perform similar tasks with better efficiency and accuracy, so people looking for jobs that may be repetitive might have little to no chance of getting hired. While this can be a sign of huge progress for many companies, it can also mean that workers won’t have many job opportunities to seek. Unable to incorporate ethics One of the reasons why AI is still debatable despite its great potential is because of its inability to incorporate ethics in certain situations. Remember that AI only has algorithms and data that it can use to make decisions and follow patterns, so it only focuses on logical results. This might lead to discriminatory conclusions and inserted bias. The complete reliance on AI-powered systems may lead to inaccuracies that can put someone’s life on the line. For instance, there is a software program that shows bias when identifying future criminals. It showed bias against black people, incarcerating innocent individuals just because of the colour of their skin. If you want to know more about AI technology and other related topics about it, visit Vula Telematix and find articles that will help you understand how they work.
<urn:uuid:98bf0ef6-1196-4612-b239-f1c49a947022>
CC-MAIN-2022-33
https://www.vulatelematix.co.za/blog/what-you-need-to-know-about-ai/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571959.66/warc/CC-MAIN-20220813142020-20220813172020-00756.warc.gz
en
0.961109
1,798
3.765625
4
Chenoa van den Boogaard, Physics & Astronomy editor Teleportation has finally become a reality. But before you get too excited, the type of teleportation scientists are experimenting with is not the same as what you’ve seen on Star Trek. Scientists are not trying to teleport people or objects from one place to another. Instead, they are teleporting information in the quantum world, where things simply don’t behave the same way as they do in the world we can see. Quantum teleportation has the potential to revolutionize current technology, especially in the areas of communication and computing. “Teleportation will likely be a key element of the quantum internet, when it comes to fruition,” explains John Nichol, an assistant professor of physics at the University of Rochester. Imagine a world where information could be sent and received instantaneously through a quantum internet and computers could store and calculate information at a substantially higher speed than is currently possible. But before we explore what quantum teleportation can do, let’s take a look at how it actually works. To understand quantum teleportation, we must first look at quantum entanglement, a process so strange that Albert Einstein famously described it as “spooky action at a distance.” Quantum entanglement occurs when a pair or group of particles are generated in such a way that their behaviour can no longer be described as independent of each other, even when the particles are separated by great distances. A particle’s behaviour is described by its quantum state, which defines all the possible outcomes of a measurement on that particle. When particles exist in quantum entanglement, their quantum states are always correlated, meaning a measurement on one particle will allow us to know something about the state of the other. For example, imagine two entangled particles as a pair of gloves. If you were to mail each glove in separate boxes to two different locations, by opening only one box, you could determine whether the left or right glove was in the other box. How particles are connected and communicate within a system of quantum entanglement is still a mystery, but scientists have succeeded in using the phenomenon to their advantage. In 1993, a group of physicists from Canada, the United States, France, and Israel collaborated to transfer, or teleport, the quantum state of an independent photon across two entangled photons. The process can be explained by considering a scenario in which there are two observers, Alice and Bob, who each possess one of a pair of entangled photons. The entangled photons have quantum states of either up or down. Because they are correlated, when measured, one photon will be in an up state and the other in a down state because entangled particles can never be in the same state (in this case, both up or down). Alice then introduces a third independent photon to the system (the yellow photon in the figure below), which also has a quantum state of either up or down and forces it to interact with her entangled photon. If the state of the yellow photon is up, it will change the state of Alice’s entangled photon to down, which results in Bob’s photon’s state becoming up due to the correlation with its entangled partner. This process effectively teleports the state of the yellow photon to Bob’s photon, which also results in the annihilation of Alice’s entangled photon and the yellow photon. Importantly, information telling Bob that his entangled photon is now an identical copy of the yellow photon must reach him from Alice. If he is able to determine instantaneously that his photon has changed, this means that the information has travelled to him faster than the speed of light, which is impossible within our current understanding of physics. The yellow photon has not really been teleported in the sense that it has physically moved. Instead, Bob’s photon has taken on the quantum state of the yellow photon, forming an exact copy. The yellow photon is annihilated because an original particle and its copy cannot exist simultaneously, as outlined by the no-cloning theorem in physics. This means that if Star Trek were using this type of teleportation, Captain Kirk would be annihilated and an identical copy of him would be formed at his destination every time Scotty beamed him up. Since 1993, several experiments have successfully demonstrated teleportation in various other materials including atoms, ions, and superconducting circuits. In 2017, a group of Chinese physicists achieved space-based teleportation by teleporting information from Earth to the Micius satellite, a record distance of 1,400 kilometers. Scientists are continuing to push quantum teleportation into space, with plans to integrate teleportation into the design of space-based telescopes and satellites. “There are already efforts underway in different countries to create quantum networks in space,” explains Nichol. “Teleportation is often a key element of benchmarking these satellite-based systems [and] can also be used to create remotely entangled pairs of particles for secure communication protocols.” In June of this year, scientists from the University of Rochester and Purdue University confirmed that quantum teleportation is possible between electrons, a discovery that has major implications for the world of quantum computing. The researchers, including John Nichol, explained their findings in an article published in Nature Communications. In current computing methods, billions of transistors called bits, transfer information through a single binary value 0 or 1. By comparison, quantum bits (or qubits) have the ability to exist as both 0 and 1 simultaneously. “Electrons are desirable qubits because they can be manipulated quickly and their coherence times (the length of time over which they can retain quantum information) can be extremely long,” says Nichol. “Compared with photons, electrons also interact easily with each other, which is a key requirement for quantum computing.” Quantum teleportation has the potential to revolutionize the way we obtain and pass on information, whether it is in areas of communication, computing, healthcare, economics, or other industries. One day, we may even succeed in teleporting matter. But for now, that kind of spooky action is still in the distant future. Banner image by Matthias Weinberger, CC BY-NC-ND 2.0
<urn:uuid:00df02f9-a4b9-4b8f-8fd0-34dbf99e8b7e>
CC-MAIN-2022-33
https://blog.scienceborealis.ca/teleportation-is-possible-in-the-quantum-world-at-least/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572870.85/warc/CC-MAIN-20220817062258-20220817092258-00555.warc.gz
en
0.950794
1,272
4.03125
4
16 September 2011—The long-promised arrival of practical quantum computers—machines that exploit the laws of quantum mechanics to solve complex problems much faster than conventional computers do—seems a step closer, thanks to two recent advances by physicists. In the first development, reported in the 2 September issue ofNature by a group led by Serge Haroche of the École Normale Supérieure and the Collège de France in Paris, the researchers created a real-time feedback mechanism for a quantum computer. Control mechanisms, such as feedback loops, are central to the operation of large conventional computers. In the second advance, reported the same week inScience by a group led by Matteo Mariantoni and John Martinis of the University of California, Santa Barbara, scientists created a quantum central processing unit (CPU) with memory. The rudimentary device is the first quantum computer based on the common von Neumann processor-memory architecture that conventional computers use. Dick Slusher, director of the Quantum Institute at the Georgia Institute of Technology, in Atlanta, and other experts unanimously praised the work of both groups. However, Slusher says that ”for quantum computing to be fault tolerant—a condition required to scale up to true applications like factoring useful coding keys—the error levels must be much lower than achieved so far.” Quantum computing is an emerging field that has witnessed considerable advances in recent years, including progress toward silicon devices. However, it has proved difficult to create a practical quantum computer that would rival the processing abilities of a conventional machine. Part of the difficulty lies in the fragility of quantum states, which break down (or ”decohere,” in the parlance of quantum mechanics) rather quickly. So far, only rudimentary quantum computers with a handful of ”qubits” (quantum bits) have been built. (In May, D-Wave Systems sold Lockheed Martin a special type of computer that relies on a ”quantum annealing” processor, but many quantum computing experts remain skeptical that it is a true quantum computer.) As they seek to create larger quantum systems, scientists have tried to incorporate some of the same systems-engineering concepts that are used in conventional computers, but the equivalent quantum systems have proved elusive—until now. ”These machines are very fragile,” says Haroche. ”The coupling to their environment causes decoherence, which destroys the quantum features required to achieve their tasks. Correcting the effects of decoherence is thus a very important aspect of quantum information. One possibility is to control the quantum machine by quantum feedback.” Yet therein lies a challenge: In the quantum world, the mere act of observing photons or atoms perturbs their motion and changes their positions and velocities—and therefore the value the qubit holds. So for quantum feedback to work, one must be able to observe the system by performing ”weak measurements,” perturbing it only minimally, and the computer must take the perturbation into account before applying the correction. Haroche and his colleagues use a small collection of atoms as a kind of quantum sensor to overcome this challenge. They pass atoms through a microwave cavity that contains the qubits as photons. The atoms obtain a detectable signal—a shift in their phase. This technique provides information about the state of the photons, but it does so by performing only a weak measurement and does not lead to a total collapse of the light’s quantum nature. Measuring changes in the final state of atoms that sequentially pass through the light field provides a signal that can be used to control the light. ”The work is a very impressive demonstration experiment showing that the many techniques developed in the systems engineering community can be translated to the quantum regime—if one is clever enough,” says Michael Biercuk, a quantum physicist at the University of Sydney, in Australia. The challenge of translating a classical system, in this case the common von Neumann processor-memory architecture, into a quantum system also motivated the second team of researchers. To build a quantum CPU and RAM, the UC Santa Barbara group used two superconducting Josephson junctions—two pieces of superconducting metal separated by a thin insulating layer—as qubits. They connected the qubits using a bus made of a superconducting microwave resonator. Each qubit also had a separate resonator that acted as RAM. With the help of microwave pulses, the qubits could influence one another’s state in a way that performed calculations, and the results could be stored in the quantum RAM. They tested their CPU by allowing it to solve a few quantum algorithms, including the equivalent of the Fourier transform. The demonstration could quickly lead to a larger-scale quantum processor based on superconducting circuits, according to the UC Santa Barbara team. The most complex algorithms performed so far have used a quantum computing system based on trapped ions, but Biercuk says the superconducting system is quickly catching up, and that’s ”extremely exciting.” While no one expects a quantum computer to rival a conventional computer in the very near future, experts were pleased with these recent developments. Raymond Laflamme, executive director of the Institute for Quantum Computing at the University of Waterloo, in Canada, said both experiments had ”very strong results,” and that they ”demonstrate an increasing amount of control of quantum processors.” About the Author Saswato R. Das, a New York City–based writer, contributes frequently to IEEE Spectrum. For one assignment, Das got the last interview with famed science fiction writer Arthur C. Clarke before he died in 2008.
<urn:uuid:d68a8d92-1971-4707-808e-ffca082366b3>
CC-MAIN-2022-33
https://spectrum.ieee.org/practical-quantum-computers-creep-closer-to-reality
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572077.62/warc/CC-MAIN-20220814204141-20220814234141-00758.warc.gz
en
0.934865
1,179
3.5
4
A new phase of matter has been discovered in a quantum computer after physicists beamed light at its qubits in a pattern inspired by the Fibonacci sequence. If you think it’s mind-boggling, this strange quirk of quantum mechanics behaves as if it has two time dimensions instead of one; this trait, the scientists say, makes the qubits more robust, able to remain stable throughout an experiment. This stability is called quantum coherence, and it’s one of the main goals of an infallible quantum computer—and one of the hardest to achieve. The work is “a completely different way of thinking about the phases of matter,” according to computational physicist Philippe Dumitrescu of the Flatiron Institute, lead author of a new paper describing the phenomenon. “I have been working on these theoretical ideas for more than five years, and it is very interesting to see how they are actually implemented in experiments.” Quantum computing is based on qubits, the quantum equivalent of computing bits. However, when bits process information in one of two states, 1 or 0, qubits can be in both at the same time, a state known as quantum superposition. The mathematical nature of this superposition can be incredibly powerful from a computational standpoint, being able to solve problems quickly under the right circumstances. But the fuzzy, undefined nature of a series of qubits also depends on how their undefined states relate to each other, a relationship called entanglement. Unfortunately, qubits can get entangled with just about anything in their environment, introducing errors. The more delicate the blurred state of a qubit (or the more chaos in its environment), the higher the risk of it losing this coherence. Improving consistency to the point of viability is likely a multitactic approach to removing a major hurdle standing in the way of a functional quantum computer—every little thing counts. “Even if you keep all the atoms under tight control, they can lose their quantumness by talking to the environment, heating up or interacting with things differently than you planned,” Dumitrescu explained. “In practice, experimental devices have many sources of errors that can degrade coherence after just a few laser pulses.” Ensuring symmetry can be one way to protect qubits from decoherence. Rotate the good old square ninety degrees and it will still have the same shape. This symmetry protects it from some rotational effects. Exposing the qubits to uniformly distributed laser pulses guarantees a symmetry based not on space, but on time. Dumitrescu and his colleagues wanted to see if they could enhance this effect by adding not symmetric periodicity, but asymmetric quasi-periodicity. They suggested that this would add not one temporal symmetry, but two; one is effectively hidden inside the other. The idea was based on earlier work by the group, which proposed creating something called a quasi-crystal in time rather than space. If a crystal consists of a symmetrical lattice of atoms that repeats in space, like a square grid in a gym or a honeycomb, then the pattern of atoms on a quasicrystal is non-repeating, like a Penrose tiling, but still ordered. The team ran their experiment on an advanced commercial quantum computer developed by Quantinuum, a quantum computing company. This beast uses 10 ytterbium atoms (one of the preferred elements for atomic clocks) for its qubits. These atoms are held in an electrical ion trap from which laser pulses can be used to control or measure them. Dumitrescu and his colleagues created a sequence of laser pulses based on Fibonacci numbers, where each segment is the sum of the previous two segments. The result is a sequence that is ordered but not repeated, as in a quasicrystal. Quasicrystals can be mathematically described as low-dimensional segments of multidimensional lattices. A Penrose tiling can be described as a two-dimensional slice of a five-dimensional hypercube. Similarly, the team’s laser pulses can be described as a one-dimensional representation of a two-dimensional pattern. Theoretically, this meant that he could potentially impose two temporal symmetries on qubits. The team tested their work by flashing lasers on an array of ytterbium qubits, first in a symmetrical sequence and then quasi-periodically. They then measured the coherence of the two qubits at both ends of the trap. For a periodic sequence, the qubits were stable for 1.5 seconds. For a quasi-periodic sequence, they remained stable for 5.5 seconds, the duration of the experiment. The extra temporal symmetry added another layer of protection against quantum decoherence, the researchers say. “With this quasi-periodic sequence, a complex evolution occurs that eliminates all the errors living on the edge,” said Dumitrescu. “Because of this, the edge remains quantum mechanically coherent for much, much longer than you would expect.” The work is not yet ready to be integrated into functional quantum computers, but it represents an important step towards that goal, the researchers said. The study was published in Nature. #strange #phase #matter #span #time #dimensions
<urn:uuid:d80dc6a3-0e60-4d73-921c-6b15271ba2ad>
CC-MAIN-2022-33
https://silgitsin.com/this-strange-new-phase-of-matter-seems-to-span-2-time-dimensions/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572043.2/warc/CC-MAIN-20220814143522-20220814173522-00359.warc.gz
en
0.942761
1,117
3.71875
4
Quantum computers may one day rapidly find solutions to problems no regular computer might ever hope to solve, but there are vanishingly few quantum programmers when compared with the number of conventional programmers in the world. Now a new beginner’s guide aims to walk would-be quantum programmers through the implementation of quantum algorithms over the cloud on IBM’s publicly available quantum computers. Whereas classical computers switch transistors either on or off to symbolize data as ones or zeroes, quantum computers use quantum bits, or “qubits,” which because of the peculiar nature of quantum physics can exist in a state called superposition where they are both 1 and 0 at the same time. This essentially lets each qubit perform two calculations at once. The more qubits are quantum-mechanically linked, or entangled (see our explainer), within a quantum computer, the greater its computational power can grow, in an exponential fashion. Currently quantum computers are noisy intermediate-scale quantum (NISQ) platforms, meaning their qubits number up to a few hundred at most and are error-ridden as well. Still, quantum processors are widely expected to grow in terms of qubit count and quality, with the aim of achieving a quantum advantage that enables them to find the answers to problems no classical computers could ever solve. Although the field of quantum programming started in the 1990s, it has to date drawn only a small community. “Programming quantum computers may seem like a great challenge, requiring years of training in quantum mechanics and related disciplines,” says the guide’s senior author, Andrey Lokhov, a theoretical physicist at Los Alamos National Laboratory, in New Mexico. “Additionally, the field is dominated by physics and algebraic notations that at times present unnecessary entry barriers for mainstream computer and mathematically trained scientists.” Now, with their new guide, Lokhov and his colleagues hope to help pave the way “for the upcoming quantum-computing revolution,” he says. “We believe that our guide fills a missing space in the field of quantum computation, introducing nonexpert computer scientists, physicists, and engineers to quantum algorithms and their implementations on real-world quantum computers.” The new guide explains the basics of quantum computing and quantum programming, including quantum algorithms. “Very much like how classical algorithms describe a sequence of instructions that need to be executed on a classical computer, a quantum algorithm represents a step-by-step procedure, where each of the steps needs to be performed on a quantum computer,” Lokhov says. “However, the term ‘quantum algorithm’ is usually reserved for algorithms that contain inherently quantum operations, such as quantum superposition or quantum entanglement, which turn out to be computationally powerful.” “We believe that our guide fills a missing space in the field of quantum computation, introducing nonexpert computer scientists, physicists, and engineers to quantum algorithms and their implementations on real-world quantum computers.” —Andrey Lokhov To implement such quantum operations on quantum computers, quantum programs are represented as circuits describing a sequence of elementary operations, called gates, that are applied on a set of qubits. One major difference between quantum and classical programming lies in a central principle of quantum mechanics—when it comes to measuring a quantum program’s results, the process is inherently probabilistic, or subject to random variation. “Our guide aims to explain the basic principles of quantum programming, which are quite different from classical programming, with straightforward algebra that makes understanding the underlying fascinating quantum-mechanical principles optional,” Lokhov says. “We have received positive feedback from many scientists—beginners in the field—who were able to quickly familiarize themselves with the basics of quantum programming using our guide.” The new guide provides the minimal knowledge needed to start implementing and running quantum algorithms right away. These include 20 standard quantum algorithms, including Shor’s algorithm for factoring integers and Grover’s algorithm for database searching. “In addition, our review covers the most successful hybrid quantum-classical algorithms, such as the quantum approximate optimization algorithm, as well as classical tools that are useful for certifying the performance of quantum algorithms, such as quantum tomography,” Lokhov says. “Hence, the guide surveys a combination of quantum, classical, and hybrid algorithms that are foundational for the field of quantum computing.” The guide then walks quantum programmers through implementing these algorithms over the cloud on IBM’s publicly available quantum computers, such as its 5-qubit IBMQX4. The guide discusses the results of the implementation and explains differences between the simulator and the actual hardware runs. Lokhov notes that currently, in order to show that a new quantum algorithm works efficiently, one needs to give a mathematical proof. In contrast, in classical computing, many efficient algorithms were discovered heuristically—that is, by trial and error, or by loosely defined rules—with theoretical guarantees coming much later. The hope is that new quantum algorithms may get discovered in a similar fashion the more quantum programmers there are. “We believe that our guide could be useful for introducing more scientists to quantum computing and for inviting them to experiment with the forthcoming quantum computers with larger numbers of qubits,” Lokhov says. - Building a Quantum Computing Workforce from the Ground Up ... › - Waiting for Quantum Computing? Try Probabilistic Computing - IEEE ... › - Meet Twist: MIT's Quantum Programming Language - IEEE Spectrum ›
<urn:uuid:21be0c8b-f251-49e8-bb69-608f91a8e536>
CC-MAIN-2022-33
https://spectrum.ieee.org/quantum-computing-for-dummies
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572161.46/warc/CC-MAIN-20220815054743-20220815084743-00359.warc.gz
en
0.924259
1,151
3.609375
4
Scientists have now developed a universal quantum gate, which could become the key component in a quantum computer. Light particles completely ignore each other. In order that these particles can nevertheless switch each other when processing quantum information, researchers at the Max Planck Institute of Quantum Optics in Garching have now developed a universal quantum gate. Quantum gates are essential elements of a quantum computer. Switching them with photons, i.e. light particles, would have practical advantages over operating them with other carriers of quantum information. The light-saber fights of the Jedi and Sith in the Star Wars saga may well suggest something different, but light beams do not notice each other. No matter how high their intensity, they cut through each other without hindrance. When individual light particles meet, as is necessary for some applications of quantum information technology, nothing at all happens. Photons can therefore not switch each other just like that, as would have to be the case if one wanted to use them to operate a quantum gate, the elementary computing unit of a quantum computer. A quantum computer can master some tasks, such as searching through databases, much faster than conventional computers. Physicists have already developed quantum gates for the super-computers of the future, for example by using nitrogen atoms contained in diamonds as impurities as the smallest computing unit. But “to have a quantum computer compute with photons would have practical advantages,” says Stephan Ritter, who leads a Research Group in Gerhard Rempe’s Division at the Max Planck Institute of Quantum Optics. “This is because quantum information has to be in the form of photons in order to be transmitted over large distances. If we can use photons to process it as well, we do not have to transfer it to other carriers, such as atoms, in order to compute with it.” An atom in a resonator mediates between light particles In order for photons to sense each other’s presence in the first place, let alone switch each other, they need mediators. In the experiments being conducted by Stephan Ritter’s team of physicists, this mediating role is taken on by a single atom in a resonator. The resonator consists of two mirrors 0.5 mm apart. The Garching-based researchers use a laser beam to trap the atom in the resonator. For their experiments, the scientists now need two photons each carrying one qubit. A qubit is the quantum mechanical equivalent of the bit of a conventional computer. It can, however, not only encode the zero and the one, but assume all possible states in between as well. The researchers write the states of the two qubits into the polarization of the two light particles, i.e. into the direction of oscillation of the electromagnetic waves. The Max Planck physicists send the two photons, one shortly after the other, onto the system of atom and resonator. The first photon thereby transfers information to the atom by changing its state – but only if the photon has the right polarization. This change then has an effect on the polarization of the second photon when it impinges onto the system of atom and resonator a short time later. The quantum gate operates in a deterministic way “Our system only becomes a universal quantum gate because the second photon can also transfer information onto the first photon, however,” says Bastian Hacker, who conducted the experiments as part of his doctoral thesis. To this end, the scientists initially store the two photons in an optical fiber more than one kilometer in length after the light particles have been reflected at the resonator. At the same time, they conduct a measurement on the atom, which can also affect the polarization state of the two photons due to the surprising properties of quantum mechanics. As is the case with a conventional bit, there are only two possible measurement results. They provide the researchers with reliable information about which rotation of the polarization of the first photon they can use to complete the gate operation. “Our quantum gate operates in a deterministic way,” says Stephan Ritter. This means that the scientists can reliably predict which changes the light particles should experience in the quantum gate depending on the original polarization of the photons fed in. In addition, the gate carries out these operations on all photons which impinge on the resonator with the trapped atom – at least in principle. In reality, unavoidable technical shortcomings decrease the efficiency of the quantum gate as well as the precision of its operations. However, the researchers already have some ideas about how they can improve the two characteristics of the quantum gate: by using mirrors with lower losses, for example, or a storage device for the photons which is more efficient than an optical fibre. In other implementations of quantum gates between photons with which physicists have already experimented, the errors are inherent, however, because chance always plays a role here. Two experiments demonstrate how reliable the quantum gate is The Garching-based researchers have conducted two experiments to demonstrate how reliably their quantum gate already operates. Which operations the quantum gate executes here depends only on how the two input photons are polarized. In one experiment, the researchers circularly polarize the first photon so that its direction of oscillation rotates either clockwise or counter-clockwise. The second photon is linearly polarized, i.e. so that it oscillates in a horizontal or vertical plane. On a photon pair with these input states, the quantum gate acts like a CNOT operation, where the first qubit controls the second one. This is because, depending on the direction in which the first photon rotates, the quantum gate flips the polarization of the second photon – from the vertical to the horizontal plane, for example – or not. CNOT gates are essential for a quantum computer, because they can be used to execute all logic operations. For the second experiment, the researchers in Garching polarize both photons linearly. Fed with such input states, the quantum gate entangles the two photons. Entangled photons can no longer be described independently of each other, but only with a common state – no matter how great the distance between the two light particles. As much as entanglement puts our imagination to the test, for the quantum computer it is an indispensable ingredient like the CNOT gate. “Only the entanglement of qubits allows the strength of the quantum computer to be unfolded,” says Stephan Welte, who also contributed crucial work to the experiments as part of his doctoral thesis. The atom in the resonator as the key element of a quantum computer “With the quantum gate, we now have a key element for an optical quantum computer,” says Gerhard Rempe, Director at the Max Planck Institute in Garching. It will be a while before such a quantum computer completes some computing tasks at a speed which will outclass any conventional computer, however; not least because this requires the quantum gate to compute more reliably. Nevertheless, Gerhard Rempe already has definite ideas about how such a super-computer could be operated with an atom in the resonator. This would not require many of these systems, each of which can quite easily fill a laboratory. “The logic operations can be carried out one after the other with a single atom in a resonator,” says Gerhard Rempe. The European Commission obviously also believes that these quantum technology concepts have a future. It plans to invest one billion euros into their development over a period of approx. ten years. This funding could also speed up the process of realizing the superfast quantum computer – which is also what Stephan Ritter and his colleagues in Garching are hoping. Publication: Bastian Hacker, et al., “A photon-photon quantum gate based on a single atom in an optical resonator,” Nature (2016) doi:10.1038/nature18592 PDF copy of the Study: A photon-photon quantum gate based on a single atom in an optical resonator
<urn:uuid:54983e5a-b430-4ae1-8f45-dd4682e4a670>
CC-MAIN-2022-33
https://scitechdaily.com/researchers-develop-a-universal-quantum-gate/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571950.76/warc/CC-MAIN-20220813111851-20220813141851-00560.warc.gz
en
0.938638
1,635
3.796875
4
Achieving the immense promise of quantum computing requires new developments at every level, including the computing hardware itself. A Lawrence Berkeley National Laboratory (Berkeley Lab)-led international team of researchers has discovered a way to use ion beams to create long strings of “color center” qubits in diamond. Their work is detailed in the journal Applied Physics Letters. Creating large numbers of high-quality quantum bits (qubits), in close enough proximity for coupling to each other, is one of the great challenges of quantum computing. Collaborating with colleagues worldwide, the team has been exploring the use of ion beams to create artificial color centers in diamond for use as qubits. Color centers are microscopic defects – departures from the rigorous lattice structure of a crystal, such as diamond. The type of defect that is of specific interest for qubits is a nitrogen atom next to a vacancy, or empty space, in a diamond lattice. (Nitrogen is commonly found in the crystal lattice of diamond, which is primarily a crystalline form of carbon, and can contribute to the color of the stone.) When excited by the rapid energy deposition of a passing ion, nitrogen-vacancy centers can form in the diamond lattice. The electron and nuclear spins of nitrogen-vacancy centers and the adjacent carbon atoms can all function as solid-state qubits, and the crystal lattice can help protect their coherence and mutual entanglement. The result is a physically durable system that does not have to be used in a cryogenic environment, which are attractive attributes for quantum sensors and also for qubits in this type of solid-state quantum computer. However, making enough qubits, and making them close enough to each other, has been a challenge. When swift (high-energy) heavy ions such as the beams this team used – gold ions with a kinetic energy of about one billion electron volts – pass through a material, such as nitrogen-doped diamond, they leave a trail of nitrogen-vacancy centers along their tracks. Color centers were found to form directly, without need for further annealing (heat treatment). What’s more, they formed all along the ion tracks, rather than only at the end of the ion range as had been expected from earlier studies with lower-energy ions. In these straight “percolation chains,” color-center qubits are aligned over distances of tens of microns, and are just a few nanometers from their nearest neighbors. A technique developed by Berkeley Lab’s Molecular Foundry measured color centers with depth resolution. The work on qubit synthesis far from equilibrium was supported by the Department of Energy’s Office of Science. The next step in the research will be to physically cut out a group of these color centers – which are like a series of beads on a string – and show that they are indeed so closely coupled that they can be used as quantum registers. Results published in the current article show that it will be possible to form quantum registers with up to about 10,000 coupled qubits – two orders of magnitude greater than achieved thus far with the complementary technology of ion-trap qubits – over a distance of about 50 microns (about the width of a human hair). “Interactions of swift heavy ions with materials have been studied for decades for a variety of purposes, including the behavior of nuclear materials and the effects of cosmic rays on electronics,” said Schenkel. He added that researchers worldwide have sought to make quantum materials by artificially inducing color centers in diamond. “The solid-state approaches to quantum computing hardware scale beautifully, but integration has been a challenge. This is the first time that direct formation of color-center qubits along strings has been observed.” The stars, like diamonds On a miniscule and ephemeral scale (nanometers and picoseconds) the deposition of energy by the ion beams produces a state of high temperature, which Schenkel likens to the surface of the sun, in the 5000 K range, and pressure. Besides knocking carbon atoms out of the crystal lattice of diamond, this effect could enable fundamental studies of exotic states of transient warm dense matter, a state of matter that is present in many stars and large planets and which is difficult to study directly on Earth. It might also enable formation of novel qubits with tailored properties that cannot be formed with conventional methods. “This opens a new direction for expanding our ability to form quantum registers,” said Schenkel. Currently, color-center strings are formed with beams from large particle accelerators, such as the one at the German laboratory GSI that was used in this research. In the future, they might be made using compact laser-plasma accelerators like the ones being developed at the Berkeley Lab Laser Accelerator (BELLA) Center. The BELLA Center is actively developing its ion-acceleration capabilities with funding by the DOE Office of Science. These capabilities will be used as part of LaserNetUS. Ion pulses from laser-plasma acceleration are very intense and greatly expand our ability to form transient states of highly excited and hot materials for qubit synthesis under novel conditions. More facets in materials science far from equilibrium The process of creating these color centers is interesting in its own right and has to be better understood as part of further progress in these applications. The details of how an intense ion beam deposits energy as it traverses the diamond samples, and the exact mechanism by which this leads to color-center formation, hold exciting prospects for further research. “This work demonstrates both the discovery science opportunities and the potential for societally transformative innovations enabled by the beams from accelerators,” says ATAP Division Director Cameron Geddes. “With accelerators, we create unique states of matter and new capabilities that are not possible by other means.” The authors includes several scientists from Berkeley Lab: Arun Persaud, who led the study, and Thomas Schenkel, head of the Accelerator Technology and Applied Physics (ATAP) Division’s Fusion Science & Ion Beam Technology Program, as well as Casey Christian (now with Berkeley Lab’s Physics Division), Edward Barnard of Berkeley Lab’s Molecular Foundry, and ATAP affiliate Russell E. Lake. For information about licensing or collaboration, contact Berkeley Lab’s Intellectual Property Office at firstname.lastname@example.org.
<urn:uuid:79a5baa0-491c-410a-809e-e351d41cf2ed>
CC-MAIN-2022-33
https://thequantuminsider.com/2021/05/18/ion-trap-advance-berkeley-lab-pioneers-way-that-could-increase-scalability-to-over-10000-qubits-for-quantum-sensing-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571153.86/warc/CC-MAIN-20220810100712-20220810130712-00764.warc.gz
en
0.937644
1,329
3.71875
4
May 9, 2020 IST Austria scientists demonstrate quantum radar prototype New detection technique based on quantum technology developed at IST Austria – Study published in Science Advances Physicists at the Institute of Science and Technology Austria (IST Austria) have invented a new radar prototype that utilizes quantum entanglement as a method of object detection. This successful integration of quantum mechanics into our everyday devices could significantly impact the biomedical and security industries. The research is published in the journal Science Advances. Quantum entanglement is a physical phenomenon where two particles remain inter-connected, sharing physical traits regardless of how far apart they are from one another. Now, scientists from the research group of Professor Johannes Fink at the Institute of Science and Technology Austria (IST Austria) along with collaborators Stefano Pirandola from the Massachusetts Institute of Technology (MIT) and the University of York, UK, and David Vitali from the University of Camerino, Italy — have demonstrated a new type of detection technology called ‘microwave quantum illumination’ that utilizes entangled microwave photons as a method of detection. The prototype, which is also known as a ‘quantum radar’, is able to detect objects in noisy thermal environments where classical radar systems often fail. The technology has potential applications for ultra-low power biomedical imaging and security scanners. Using quantum entanglement as a new form of detection The working principles behind the device are simple: Instead of using conventional microwaves, the researchers entangle two groups of photons, which are called the ‘signal’ and ‘idler’ photons. The ‘signal’ photons are sent out towards the object of interest, whilst the ‘idler’ photons are measured in relative isolation, free from interference and noise. When the signal photons are reflected back, true entanglement between the signal and idler photons is lost, but a small amount of correlation survives, creating a signature or pattern that describes the existence or the absence of the target object—irrespective of the noise within the environment. “What we have demonstrated is a proof of concept for Microwave Quantum Radar,” says lead author and at the time of the research project postdoc in the Fink group Shabir Barzanjeh, whose previous research helped advance the theoretical notion behind quantum enhanced radar technology. “Using entanglement generated at a few thousandths of a degree above absolute zero (-273.14 °C), we have been able to detect low reflectivity objects at room-temperature.” Quantum technology can outperform classical low-power radar While quantum entanglement in itself is fragile in nature, the device has a few advantages over conventional classical radars. For instance, at low power levels, conventional radar systems typically suffer from poor sensitivity as they have trouble distinguishing the radiation reflected by the object from naturally occurring background radiation noise. Quantum illumination offers a solution to this problem as the similarities between the ‘signal’ and ‘idler’ photons — generated by quantum entanglement — makes it more effective to distinguish the signal photons (received from the object of interest) from the noise generated within the environment. Barzanjeh who is now an Assistant Professor at the University of Calgary on the prototype’s performance: “The main message behind our research is that ‘quantum radar’ or ‘quantum microwave illumination’ is not only possible in theory but also in practice. When benchmarked against classical low-power detectors in the same conditions we already see, at very low-signal photon numbers, that quantum-enhanced detection can be superior.” Prominent milestone towards advancing 80 year-old radar technology Throughout history, basic science has been one of the key drivers of innovation, paradigm shift and technological breakthrough. Whilst still a proof of concept, the group’s research has effectively demonstrated a new method of detection that, in some cases, may already be superior to classical radar. “Throughout history, proof of concepts such as the one we have demonstrated here have often served as prominent milestones towards future technological advancements. It will be interesting to see the future implications of this research, particularly for short-range microwave sensors.” says Barzanjeh. Last author and group leader Professor Johannes Fink adds “This scientific result was only possible by bringing together theoretical and experimental physicists that are driven by the curiosity of how quantum mechanics can help to push the fundamental limits of sensing. But to show an advantage in practical situations we will also need the help of experienced electrical engineers and there still remains a lot of work to be done in order to make our result applicable to real-world detection tasks.” S. Barzanjeh, S. Pirandola, D. Vitali & J. M. Fink. 2019. Science Advances. DOI: 10.1126/sciadv.abb0451 This IST Austria part of the project was supported by funding from the European Union (ERC Starting Grant QUNNECT, no. 758053), the EU’s Horizon 2020 research and innovation programme under grant agreement number 862644 (FET Open QUARTET), and IST Austria.
<urn:uuid:793e1f55-92af-48c4-8875-60be0ffab1ec>
CC-MAIN-2022-33
https://ista.ac.at/en/news/scientists-demonstrate-quantum-radar-prototype/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573744.90/warc/CC-MAIN-20220819161440-20220819191440-00364.warc.gz
en
0.921772
1,080
3.578125
4
What is Quantum Computing? Quantum Computing concentrates on the development of computer technology following the quantum theory principle. The classical computers that we use in the present times use a bit to process information. Bit uses either 1 or 0 to decode a piece of information. However, in quantum computing, the computer uses quantum bits or qubits to process data. It uses the unique ability of subatomic particles which enable them to exist in more than one state. Quantum and classical both try to solve problems, but the approach followed by each is different. Some of the major players engaged in quantum computing include Accenture, Alibaba Group, Amazon Bracket, AT&T, Atos Quantum, Baidu, Google Quantum AI Lab, IBM, Intel, and Microsoft. What makes Quantum Computers unique? Quantum Computing can be explained via two vital quantum physics features, ‘Superposition’ and ‘Entanglement’, that form the basis of supercomputers. These two features enable these supercomputers to operate at an exponentially higher speed as compared to conventional computers with less energy consumption. Superposition refers to the counterintuitive ability of quantum objects, like an electron that can exist in multiple states at the same time. Entanglement refers to the phenomenon quantum entities are created or manipulated in a way that none of them can be described without referencing others. Interesting Facts About Quantum Computing: - Quantum Computing is considered more efficient than modern computing because it uses quantum tunnelling, which supports in reducing the power up to a thousand times. - IBM’s Deep Blue computer was successful in defeating chess champion Garry Kasparov as this computer was capable of calculating 200 million potential moves every second. Quantum computing is even more efficient than IBM’s Deep Blue computer as it can calculate one trillion per second. - Quantum computers require cold temperature for accurate functioning. - The increased speed of quantum computing would speed up the learning speed of Artificial Intelligence. Why is Quantum Computing becoming so important? Why do we need it? In 2017 IP Expo, Professor Brian Cox said that quantum computers have a huge capacity to find answers related to life, the universe, and encryption. He stated that quantum computing could be considered as a massive stack of possibilities and sets of data. The objective to develop quantum computing was to execute Shor’s algorithm of large numbers. This led to the prime driver towards the field of quantum computing. Shor’s algorithm is a well-known algorithm for factoring prime numbers on a classical (non-quantum) computer that needs an amount of time that is basically exponential with the size of N. Here N is any number, let say 25. If we enter N=25, then the quantum computer returns the factors of 25. To develop a broader view of quantum computers, one must understand that quantum computing delivers incredible speedups for specific problems. On that front, the researchers are working understand the type of problem suitable for quantum speed-ups and accordingly develop the algorithms to resolve them. In simple terms, quantum computing is believed to solve problems related to optimisation, which plays a crucial role in every field from defence to financial trading. What are the different types of Quantum Computing? There are three types of quantum computing (as depicted in the image below). Quantum Annealing is used for solving optimisation problems. On this front, as highlighted above, researchers are looking for possible best possible configuration. An example of this is the quantum experiment which was conducted by Volkswagen in association with Google and D-Wave Systems, which aimed to reduce heavy travel in the city Beijing. The experiment was a success as it was able to reduce the traffic by selecting an ideal path for each vehicle. While classical computers can take many years to compute the optimisation solution, quantum computing can make this happen within a few hours or even less. Quantum Annealing can be provided beneficial for various industrial problems like air traffic control. Quantum Simulations cater to specific problems in quantum physics and are beyond the capacity of classical computing. An area where this type of quantum computing is more suitable includes modelling the effect of a chemical stimulation on massive subatomic particles. It is capable of simulating protein folding. Misfold protein can result in diseases such as Alzheimer’s and Parkinson’s. In this area, quantum computing can help in computing the massive protein folding sequence to prepare an effective medication. In the upcoming period, there are possibilities that we see that quantum simulations would be used for rapid drug designer testing. Universal Quantum Computing: Universal Quantum Computing is most challenging to build; however, these computers are highly powerful and are most generally applicable. These computers would make use of 100,000 qubits. In present times, we can access not more than 128 qubits. The idea behind developing the universal quantum computing is to direct the machine at any complex computation and get a quick solution. This includes solving the other two quantum computing types discussed above and even beyond that. In the long run, experts believe that universal quantum computers could be beneficial in the fields of Artificial Intelligence. INTERESTING READ: Top Technology Predictions for the Current Year & Beyond
<urn:uuid:d5e3ded3-7464-45b0-aa1b-ced686b25808>
CC-MAIN-2022-33
https://kalkinemedia.com/definition/q/quantum-computing
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571847.45/warc/CC-MAIN-20220812230927-20220813020927-00763.warc.gz
en
0.922158
1,091
3.71875
4
Quantum computing has become a buzzword in the IT industry. Some people think it'll change how we do computing forever and give us more processing power than we ever imagined. Some fear this new technology might break all current encryption and security. Others are creating sci-fi shows based on quantum computing, like Devs, which appears in this list of our community's favorite TV shows. But most people, even many developers, aren't quite sure what quantum computing is. Let's clear up some of the confusion. Quantum computing terms you need to know Before we get into how quantum computing works, let's look at some key terms that you'll need to know to understand the concept. The quantum in quantum computing refers to quantum mechanics. A quantum in physics is the minimum amount of any physical property that can exist. For instance, a photon is a single quantum of light. Quantization of energy and how it affects the interactions between matter and energy is part of the fundamental framework for describing the physical world. Qubit is short for quantum bit — the quantum version of the bit we use in classical computing. Standard bits can only be one of two values: 1 or 0. Qubits, on the other hand, hold a superposition of all possible states. Every quantum state can be represented as a sum of two or more other distinct states, and quantum particles combine all possible states. They remain in all of these states at once until they're actually observed and measured. Think of a coin flip. Once the coin lands on the ground, it'll be heads or tails, but while it's in the air, it still has a chance of being either one. Quantum computers use the concept of superposition to manipulate qubits and affect their probabilities before making a final measurement to get the answer. Entanglement is a process by which quantum particles can link up so that their states stay linked no matter how far apart they are in space. They share a unified quantum state and can exert an influence on each other. By entangling qubits in a quantum computer, more information can be represented simultaneously, giving the quantum computer more computing power and the ability to solve more complicated problems. In a quantum computer, entanglement is a good thing, but interference is bad. Quantum interference is part of a qubit’s natural behavior that can influence the probability of the final measurement of its superposition. Quantum computers try to reduce interference as much as possible to ensure more accurate results. How does quantum computing work? A quantum computer has three main parts. The first part is the structure that holds the qubits used for computation. These qubits must be stored in a way that minimizes quantum interference. In some quantum computers, superfluids chill the qubit housing to a hundredth of a degree Celsius above absolute zero to keep the qubits stable. Other quantum computers use a vacuum to help with qubit cohesion and minimize interference between them. The second part is a mechanism for transferring information to the qubits. To use them for computations, their behavior must be controlled so they can hold, change, and read information. There are a few ways to do this. Lasers, microwaves, and voltage are the most common. The third and final major part of a quantum computer is a standard computer where the code written for the quantum computer is run. It interfaces with the control mechanism, which sends instructions to the qubits. Where can quantum computing be used? Quantum computing is still in its early stages, and it's not quite ready to be used in everyday businesses. Still, some companies are starting to find new uses for the technology. Most of the work in quantum computing is currently being done by scientists and quantum computing experts who create proof-of-concept applications and test them on a small scale to help identify future uses for the technology. That way, they'll be ready when quantum hardware develops to the point that it's practical for more uses. Also, while a quantum computer can do certain things many magnitudes faster than a classical computer, they don't do everything quicker and aren't practical for some computational problems. Here are some of the many industries where quantum computing will have the biggest impact. The power of quantum computers threatens to make current cryptography techniques obsolete, such as RSA encryption, which is used to secure much of the sensitive data in the digital world. The good news is that there are already companies working on new cryptography techniques that even quantum computers can't crack. Machine learning is changing many things about our world, but running machine learning algorithms on traditional computers takes a lot of time and resources. Scientists and Quantum Computing Researchers are looking into new ways to make machine learning faster and more efficient using quantum computers. Quantum computers have many uses in the healthcare industry. They simulate chemical reactions much faster than standard computers, and they're also used for protein folding, where they help speed up the creation of new drugs. Quantum computing is also used in fintech, where its power makes parsing massive amounts of financial data quicker and model creation more accurate. It can also be used in fraud detection and portfolio risk optimization. Quantum computers are good at optimization. There are many challenges involved in supply chains and international shipping routes that can take a standard computer literally years to solve, but a quantum computer can solve in only minutes. Programming languages and SDKs used in quantum computing The programming languages used in quantum computing may have a similar syntax to those used in standard programming, but they were created specifically to handle the quantum computing environment. But that doesn't mean you can't still use standard programming languages. There are high-level SDKs (Software Development Kits) written in languages like Python that allow you to branch into quantum computing without needing to learn a new language. Here are some of the many programming languages and SDKs used in quantum computing: - QCL: QCL (Quantum Computing Language) is one of the first programming languages used for quantum computing. Its syntax resembles the C programming language, and its data types are similar to the primitive data types in C. - Q: Q was the second programming language implemented in quantum computers. It was designed as an extension of C++, so C++ developers can start working with it quickly. - OpenQASM: OpenQASM (Open Quantum Assembly Language) is a low-level language released by IBM for use with quantum computers. - Q#: Q# is an open-source quantum programming language offered by Microsoft. It has some features that developers who know the Python, C#, and F# programming languages will recognize. - Silq: Silq is an open-source high-level programming language written in the D programming language. It's available on Github and is relatively new. The first version was published in 2020. - Cirq: Cirq is a Python library created by Google for writing, manipulating, and optimizing quantum circuits. Cirq abstracts away many of the low-level details of quantum hardware in a language familiar to many developers. - Qiskit SDK: Qiskit is a software development kit created specifically for working with the OpenQASM programming language and IBM Q quantum processors. It's written in Python, so developers don't have to have high-level knowledge of quantum hardware to use it. - Braket SDK: The Braket SDK is yet another quantum computing SDK written in Python that works with Amazon's proprietary Braket quantum computing platform. How to get started in quantum computing As we said, quantum computing isn't yet practical enough to be used in the average business. So you can't get a job writing code for quantum computers yet, unless the job is with a business currently experimenting with the technology or building their own quantum computers. Still, you can experiment with quantum computer coding right now. Here are four places you can do that: - Amazon Braket: Amazon will give you one free hour per month to experiment with their quantum computing platform, and it provides an SDK written in Python to interact with the Braket platform so you can write quantum code in a familiar programming language. - IBM Quantum: You can also sign up for an account with IBM to run experiments on their quantum computing platform. You can write your code in Python here using the Qiskit SDK. - Azure Quantum: You can experiment with the quantum computers that Microsoft has access to, and when you sign up, you can get a free $200 credit. - DWave Leap: DWave also provides developers with limited free access to their quantum computing platform. Python is a good choice if you're ready to jump into quantum computing today since Circ, the Qiskit SDK, and the SDK for Amazon's Braket are based on the language. Check out our Learn Python 3 course to learn what you need to know to get started. Or, if you'd rather work with some of the low-level languages used for quantum computing, try Learn C++.
<urn:uuid:a86505a1-2152-4a14-a753-ea33180990c2>
CC-MAIN-2022-33
https://www.codecademy.com/resources/blog/what-is-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572212.96/warc/CC-MAIN-20220815205848-20220815235848-00365.warc.gz
en
0.938099
1,881
3.640625
4
AI machine learning presents a roadmap to define new materials for any need, with implications in green energy and waste reduction. Scientists and institutions dedicate more resources each year to the discovery of novel materials to fuel the world. As natural resources diminish and the demand for higher value and advanced performance products grows, researchers have increasingly looked to nanomaterials. Nanoparticles have already found their way into applications ranging from energy storage and conversion to quantum computing and therapeutics. But given the vast compositional and structural tunability nanochemistry enables, serial experimental approaches to identify new materials impose insurmountable limits on discovery. Now, researchers at Northwestern University and the Toyota Research Institute (TRI) have successfully applied machine learning to guide the synthesis of new nanomaterials, eliminating barriers associated with materials discovery. The highly trained algorithm combed through a defined dataset to accurately predict new structures that could fuel processes in clean energy, chemical, and automotive industries. “We asked the model to tell us what mixtures of up to seven elements would make something that hasn’t been made before,” said Chad Mirkin, a Northwestern nanotechnology expert, and the paper’s corresponding author. “The machine predicted 19 possibilities, and, after testing each experimentally, we found 18 of the predictions were correct.” The study, “Machine learning-accelerated design and synthesis of polyelemental heterostructures,” will be published December 22 in the journal Science Advances. Mirkin is the George B. Rathmann Professor of Chemistry in the Weinberg College of Arts and Sciences; a professor of chemical and biological engineering, biomedical engineering, and materials science and engineering at the McCormick School of Engineering; and a professor of medicine at the Feinberg School of Medicine. He also is the founding director of the International Institute for Nanotechnology. Mapping the materials genome According to Mirkin, what makes this so important is the access to unprecedentedly large, quality datasets because machine learning models and AI algorithms can only be as good as the data used to train them. The data-generation tool, called a “Megalibrary,” was invented by Mirkin and dramatically expands a researcher’s field of vision. Each Megalibrary houses millions or even billions of nanostructures, each with a slightly distinct shape, structure and composition, all positionally encoded on a two-by-two square centimeter chip. To date, each chip contains more new inorganic materials than have ever been collected and categorized by scientists. Mirkin’s team developed the Megalibraries by using a technique (also invented by Mirkin) called polymer pen lithography, a massively parallel nanolithography tool that enables the site-specific deposition of hundreds of thousands of features each second. When mapping the human genome, scientists were tasked with identifying combinations of four bases. But the loosely synonymous “materials genome” includes nanoparticle combinations of any of the usable 118 elements in the periodic table, as well as parameters of shape, size, phase morphology, crystal structure and more. Building smaller subsets of nanoparticles in the form of Megalibraries will bring researchers closer to completing a full map of a materials genome. Mirkin said that even with something similar to a “genome” of materials, identifying how to use or label them requires different tools. “Even if we can make materials faster than anybody on earth, that’s still a droplet of water in the ocean of possibility,” Mirkin said. “We want to define and mine the materials genome, and the way we’re doing that is through artificial intelligence.” Machine learning applications are ideally suited to tackle the complexity of defining and mining the materials genome, but are gated by the ability to create datasets to train algorithms in the space. Mirkin said the combination of Megalibraries with machine learning may finally eradicate that problem, leading to an understanding of what parameters drive certain materials properties. ‘Materials no chemist could predict’ If Megalibraries provide a map, machine learning provides the legend. Using Megalibraries as a source of high-quality and large-scale materials data for training artificial intelligence (AI) algorithms, enables researchers to move away from the “keen chemical intuition” and serial experimentation typically accompanying the materials discovery process, according to Mirkin. “Northwestern had the synthesis capabilities and the state-of-the-art characterization capabilities to determine the structures of the materials we generate,” Mirkin said. “We worked with TRI’s AI team to create data inputs for the AI algorithms that ultimately made these predictions about materials no chemist could predict.” In the study, the team compiled previously generated Megalibrary structural data consisting of nanoparticles with complex compositions, structures, sizes and morphologies. They used this data to train the model and asked it to predict compositions of four, five and six elements that would result in a certain structural feature. In 19 predictions, the machine learning model predicted new materials correctly 18 times — an approximately 95% accuracy rate. With little knowledge of chemistry or physics, using only the training data, the model was able to accurately predict complicated structures that have never existed on earth. “As these data suggest, the application of machine learning, combined with Megalibrary technology, may be the path to finally defining the materials genome,” said Joseph Montoya, senior research scientist at TRI. Metal nanoparticles show promise for catalyzing industrially critical reactions such as hydrogen evolution, carbon dioxide (CO2) reduction and oxygen reduction and evolution. The model was trained on a large Northwestern-built dataset to look for multi-metallic nanoparticles with set parameters around phase, size, dimension and other structural features that change the properties and function of nanoparticles. The Megalibrary technology may also drive discoveries across many areas critical to the future, including plastic upcycling, solar cells, superconductors and qubits. A tool that works better over time Before the advent of megalibraries, machine learning tools were trained on incomplete datasets collected by different people at different times, limiting their predicting power and generalizability. Megalibraries allow machine learning tools to do what they do best — learn and get smarter over time. Mirkin said their model will only get better at predicting correct materials as it is fed more high-quality data collected under controlled conditions. “Creating this AI capability is about being able to predict the materials required for any application,” Montoya said. “The more data we have, the greater predictive capability we have. When you begin to train AI, you start by localizing it on one dataset, and, as it learns, you keep adding more and more data — it’s like taking a kid and going from kindergarten to their Ph.D. The combined experience and knowledge ultimately dictates how far they can go.” The team is now using the approach to find catalysts critical to fueling processes in clean energy, automotive and chemical industries. Identifying new green catalysts will enable the conversion of waste products and plentiful feedstocks to useful matter, hydrogen generation, carbon dioxide utilization and the development of fuel cells. Producing catalysts also could be used to replace expensive and rare materials like iridium, the metal used to generate green hydrogen and CO2 reduction products. Reference: “Machine learning-accelerated design and synthesis of polyelemental heterostructures” 22 December 2021, Science Advances. The research was supported by TRI. Additional support came from the Sherman Fairchild Foundation, Inc., and the Air Force Office of Scientific Research (award numbers FA9550-16-1-0150 and FA9550-18-1-0493). Northwestern co-authors are materials science and engineering doctoral student Carolin B. Wahl and chemistry doctoral student Jordan H. Swisher, both members of the Mirkin lab. Authors from TRI include Muratahan Aykol and Montoya. This work made use of the EPIC facility of Northwestern University’s NUANCE Center, which has received support from the Soft and Hybrid Nanotechnology Experimental (SHyNE) Resource (NSF ECCS-1542205); the MRSEC program (NSF DMR-1720139) at the Materials Research Center; the International Institute for Nanotechnology (IIN); the Keck Foundation; and the State of Illinois, through the IIN.
<urn:uuid:e35ff137-2317-4d4a-9e84-3b371520a3df>
CC-MAIN-2022-33
https://scitechdaily.com/ai-used-to-predict-synthesis-of-complex-novel-materials-materials-no-chemist-could-predict/?utm_source=pocket_mylist
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571472.69/warc/CC-MAIN-20220811133823-20220811163823-00567.warc.gz
en
0.918852
1,780
3.625
4
This article is an exploration of how quantum computing enhances machine learning and artificial intelligence systems. The difference between classical computing and quantum computing is that classical computing is exclusively binary, with data stored in physical bits of “zeros” or “ones” but never both concurrently; while in quantum computing, there is an allowance for linearity such that a combination of both states simultaneously is possible giving room for significantly more data to be stored in a unit (quantum bit) than in a regular one. An illustration of the importance of quantum computing is in spotting relationships between very large datasets. A conventional system would consider each item in a parallel manner and would take a long time; in some cases, due to the size of the datasets, it might never arrive at a solution. A quantum computer on the other hand would resolve the problem in a matter of seconds. Impact of Quantum Computing The application of quantum algorithms in techniques involving artificial intelligence will enhance the learning abilities of machines. This will result in the development of prediction systems like those of the financial industry being improved. There is, however, a waiting period before these improvements will be evident. The processing power needed to derive value from the numerous streams of data being collected, particularly for the application of artificial intelligence techniques like machine learning continually increases. Researchers have been putting efforts into expediting these processes by the application of quantum computing algorithms to AI techniques; this has resulted in a previously non-existent discipline referred to as Quantum Machine Learning being formed. Artificial intelligence and machine learning technologies are two main aspects of research in quantum computing algorithm application. A characteristic of this system of calculation is its allowance for the representation of multiple states simultaneously; this is especially suitable for AI techniques. Intel notes that voice assistants would be beneficiaries of the implementation with quantum computing increasing accuracy in folds, enhancing the quantity of data they are capable of handling as well as their processing power. Machines can process a higher amount of calculation variables when quantum computing is used, leading to answers being arrived at more speedily than if a person does it. Increased Algorithm Accuracy Quantum computing is applicable in many fields for the solution of problems because it is capable of representing and handling numerous states. Intel has made several forays into researching quantum algorithms owing to the sheer number of opportunities it presents. An example would be material science which is a field where the initial applications will yield results; where small molecule modeling is a task heavily reliant on computing. Bigger, more complex machines will give room for medicine design and logistics optimizations to discern the route with the greatest efficiency. Supervised learning forms the bulk of industrial application of AI in such areas as recognition of images and prediction of consumption trends. Fernandez Lorenzo expounds that going on various QML proposals put forward, this aspect will very likely experience potentially exponential growth. In the aspect of reinforcement learning, there is still plenty of ground to cover; as well as specified application to the solution of practical issues plaguing the industry. Another promising, but less explored aspect is that of non-supervised learning. A researcher considers the case of dimensionality reduction algorithms, used for the representation of data in a space more limited than that occupied by the original but still retains the most vital characteristics of the parent dataset. He states that quantum computing will be useful in identifying more general properties than the ones specific to the dataset. The capability of reinforcement learning to manage complicated scenarios is evident in its video gaming application. The most difficult task with regards to time consumed and computing workload is the training received by the algorithm. Fernandez Lorenzo highlights that theoretical proposals have been put forward to hasten this training by engaging quantum computers which may instigate a significantly more advanced artificial intelligence than is currently obtainable. Use in the Banking Sector The unification of quantum computing and artificial intelligence in the sector of finance may aid the fight against fraud and improve its detection. Models trained to utilize a quantum computer would be able to identify patterns that would likely elude more mainstream instruments. Models are also being developed whereby numerical calculations can be used in conjunction with professional advice to arrive at financial resolutions. An NBD researcher from BBVA identifies a key benefit of these models as their ease of interpretation when compared to neural network algorithms, increasing the chances of them being approved by a regulatory board. Provision of customized products and services to customers is the learning of the banking sector currently. This is done by utilizing developed systems of recommendation. Several quantum models have been suggested for the improvement of the performance of these systems. Fernandez believes that in the not-so-distant future, the sector would be able to project favorable strategies for investment inspired by quantum algorithms. To arrive at this destination, research is being done into investigating the links between machine learning and quantum supremacy concerning what existing quantum processors are capable of. The breakthrough will be dependent on how possible it would be to build models that regular computers would be almost incapable of implementing. Studies are yet to be done on how these models would be applicable in the industry from a practical viewpoint. The limitations on machine language algorithms due to classical computers’ computational power will be far less in quantum computers. Sycamore, a quantum processor Google claims to have developed, solved in 200 seconds a task that would take the world’s fastest supercomputer at least 10,000 years to solve. A potential problem that could have arisen from quantum computing would be sensitivity to environmental alterations potentially leading to errors, but a research team at Max Planck Institute for the Science of Light showed that artificial intelligence neural networks are capable of correcting quantum errors.
<urn:uuid:4b62a9cb-045d-4585-b3eb-df48277ab495>
CC-MAIN-2022-33
https://dataenigmaco.wordpress.com/2021/06/30/the-impact-of-quantum-computing-data-science-and-artificial-intelligence/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573630.12/warc/CC-MAIN-20220819070211-20220819100211-00766.warc.gz
en
0.944685
1,135
3.828125
4
An international team led by Princeton University scientists has discovered an elusive massless particle theorized 85 years ago. The particle could give rise to faster and more efficient electronics because of its unusual ability to behave as matter and antimatter inside a crystal, according to new research. The researchers report in the journal Science July 16 the first observation of Weyl fermions, which, if applied to next-generation electronics, could allow for a nearly free and efficient flow of electricity in electronics, and thus greater power, especially for computers, the researchers suggest. Proposed by the mathematician and physicist Hermann Weyl in 1929, Weyl fermions have been long sought by scientists because they have been regarded as possible building blocks of other subatomic particles, and are even more basic than the ubiquitous, negative-charge carrying electron (when electrons are moving inside a crystal). Their basic nature means that Weyl fermions could provide a much more stable and efficient transport of particles than electrons, which are the principle particle behind modern electronics. Unlike electrons, Weyl fermions are massless and possess a high degree of mobility; the particle’s spin is both in the same direction as its motion — which is known as being right-handed — and in the opposite direction in which it moves, or left-handed. “The physics of the Weyl fermion are so strange, there could be many things that arise from this particle that we’re just not capable of imagining now,” said corresponding author M. Zahid Hasan, a Princeton professor of physics who led the research team. The researchers’ find differs from the other particle discoveries in that the Weyl fermion can be reproduced and potentially applied, Hasan said. Typically, particles such as the famous Higgs boson are detected in the fleeting aftermath of particle collisions, he said. The Weyl fermion, however, was discovered inside a synthetic metallic crystal called tantalum arsenide that the Princeton researchers designed in collaboration with researchers at the Collaborative Innovation Center of Quantum Matter in Beijing and at National Taiwan University. The Weyl fermion possesses two characteristics that could make its discovery a boon for future electronics, including the development of the highly prized field of efficient quantum computing, Hasan explained. For a physicist, the Weyl fermions are most notable for behaving like a composite of monopole- and antimonopole-like particles when inside a crystal, Hasan said. This means that Weyl particles that have opposite magnetic-like charges can nonetheless move independently of one another with a high degree of mobility. The researchers also found that Weyl fermions can be used to create massless electrons that move very quickly with no backscattering, wherein electrons are lost when they collide with an obstruction. In electronics, backscattering hinders efficiency and generates heat. Weyl electrons simply move through and around roadblocks, Hasan said. “It’s like they have their own GPS and steer themselves without scattering,” Hasan said. “They will move and move only in one direction since they are either right-handed or left-handed and never come to an end because they just tunnel through. These are very fast electrons that behave like unidirectional light beams and can be used for new types of quantum computing.” The Latest on: Weyl fermions via Google News The Latest on: Weyl fermions - Weyl loops link upon July 11, 2022 at 5:01 pm These studies brought them to Weyl loops, which are structures involving Weyl fermions – massless particles first predicted in 1929 by the theoretical physicist Herman Weyl as a solution to the Dirac ... - Advanced Quantum Condensed Matter Physicson June 21, 2022 at 10:48 pm Applications as manifest in the quantum Hall effect, topological insulators and Weyl semimetal are presented ... and ending with modern aspects … such as Dirac materials and Dirac fermions. The ... - At the SLS (IMAGE)on July 31, 2021 at 8:05 am The 3 PSI researchers Junzhang Ma, Ming Shi and Jasmin Jandke (from left to right) at the Swiss Light Source SLS, where they succeeded in proving the existence of Weyl fermions in paramagnetic ... - Christopher Weberon August 17, 2020 at 3:06 am His current work focuses on the newly-discovered Dirac and Weyl semimetals, materials in which electrons behave as though massless. - Evidence for Weyl fermions by the local nuclear magnetic resonance techniqueson June 25, 2019 at 5:09 am Tantalum has one of the largest quadrupole moments among all elements which makes it a rather useful local probe for excitations of Weyl fermions in the new Weyl semimetal TaP. We found three NQR ... - Institute of Physics, Chinese Academy of Scienceson July 18, 2018 at 11:38 am IOP has made many breakthroughs in the fundamental research of physics, e.g. the discoveries of “Weyl fermions in condensed matter”, “three-component fermions in the topological semimetal ... - The Physics of Neutrinoson June 21, 2018 at 12:47 pm The Weyl equation governing the motion of Weyl fermions tells us that Weyl fields are eigenstates of helicity and are therefore massless. Massive neutrinos may be of the Dirac or Majorana type. The ... - Berry Phases in Electronic Structure Theoryon March 28, 2018 at 1:20 pm Teicher, S. M. L. Svenningsson, I. K. Schoop, L. M. and Seshadri, R. 2019. Weyl nodes and magnetostructural instability in antiperovskite Mn3ZnC. APL Materials, Vol ... - Researchers Stumble Upon a New Type of Quantum Materialon December 27, 2017 at 12:35 pm The researchers realized that the reduction of mass could be attributed to the presence of Weyl fermions. It's only recently that the existence of solid-state conducting materials capable of ... - Physicists Predict The Existence of New Particle in the "Material Universe"on November 29, 2015 at 7:33 am Three of these quasiparticles, the Dirac, Majorana, and Weyl fermions, were discovered in such materials, despite the fact that the latter two had long been elusive in experiments." A crystal of ... via Bing News
<urn:uuid:74120d7d-4675-4729-a0f5-3d6dc69723ca>
CC-MAIN-2022-33
https://innovationtoronto.com/2015/07/after-85-year-search-massless-particle-with-promise-for-next-generation-electronics-discovered/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570692.22/warc/CC-MAIN-20220807181008-20220807211008-00569.warc.gz
en
0.927779
1,379
3.734375
4
JILA researchers make coldest quantum gas of molecules As featured on the cover of the Feb. 22 issue of Science, the team produced a gas of potassium-rubidium (KRb) molecules at temperatures as low as 50 nanokelvin (nK). That's 50 billionths of a Kelvin, or just a smidge above absolute zero, the lowest theoretically possible temperature. The molecules are in the lowest-possible energy states, making up what is known as a degenerate Fermi gas. In a quantum gas, all of the molecules' properties are restricted to specific values, or quantized, like rungs on a ladder or notes on a musical scale. Chilling the gas to the lowest temperatures gives researchers maximum control over the molecules. The two atoms involved are in different classes: Potassium is a fermion (with an odd number of subatomic components called protons and neutrons) and rubidium is a boson (with an even number of subatomic components). The resulting molecules have a Fermi character. JILA is jointly operated by the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder. NIST researchers at JILA have been working for years to understand and control ultracold molecules, which are more complex than atoms because they not only have many internal energy levels but also rotate and vibrate. The JILA team made their first molecular gas 10 years ago. “The basic techniques for making the gas are the same ones we've used before, but we have a few new tricks such as significantly improving the cooling of the atoms, creating more of them in the lowest-energy state,” NIST/JILA Fellow Jun Ye said. “This results in a higher conversion efficiency so we get more molecules.” The JILA team produced 100,000 molecules at 250 nK and as many as 25,000 molecules at 50 nK. Before now, the coldest two-atom molecules were produced in maximum numbers of tens of thousands and at temperatures no lower than a few hundred nanoKelvin. JILA's latest gas temperature record is much lower than (about one-third of) the level where quantum effects start to take over from classical effects, and the molecules last for a few seconds–remarkable longevity, Ye said. The new gas is the first to get cold and dense enough for the matter waves of these molecules to be longer than distances between them, making them overlap with each other to create a new entity. Scientists call this quantum degeneracy. (Quantum matter can behave as either particles or matter waves, that is, waveform patterns of the probability of a particle's location). Quantum degeneracy also means an increase in the repulsion among fermionic particles, which tend to be loners anyway, resulting in fewer chemical reactions and a more stable gas. This is the first experiment in which scientists have observed collective quantum effects directly affecting the chemistry of individual molecules, Ye said. “This is the first quantum degenerate gas of stable molecules in bulk, and the chemical reactions are suppressed–a result that nobody had predicted,” Ye said. The molecules created in this experiment are called polar molecules because they have a positive electric charge at the rubidium atom and a negative charge at the potassium atom. Their interactions vary by direction and can be controlled with electric fields. Polar molecules thus offer more tunable, stronger interactions and additional control “knobs” compared with neutral particles. These new ultralow temperatures will enable researchers to compare chemical reactions in quantum versus classical environments and study how electric fields affect the polar interactions. Eventual practical benefits could include new chemical processes, new methods for quantum computing using charged molecules as quantum bits, and new precision measurement tools such as molecular clocks. The process for making the molecules begins with a gas mixture of very cold potassium and rubidium atoms confined by a laser beam. By sweeping a precisely tuned magnetic field across the atoms, scientists create large, weakly bound molecules containing one atom of each type. This technique was pioneered by Ye's colleague, the late Deborah Jin, in her 2003 demonstration of the world's first Fermi condensate. To convert these relatively fluffy molecules into tightly bound molecules without heating the gas, scientists use two lasers operating at different frequencies–each resonating with a different energy jump in the molecules–to convert the binding energy into light instead of heat. The molecules absorb near-infrared laser light and release red light. In the process, 90 percent of the molecules are converted through an intermediate energy state, to the lowest and most stable energy level. The research is supported by NIST, the Air Force Office of Scientific Research, the Army Research Office and the National Science Foundation. Paper: L. De Marco, G. Valtolina, K. Matsuda, W.G. Tobias, J.P. Covey and J. Ye. 2018. A Fermi Degenerate Gas of Polar Molecules. Science. Feb 22, 2019 issue. DOI: 10.1126/science.aau7230 All latest news from the category: Physics and Astronomy This area deals with the fundamental laws and building blocks of nature and how they interact, the properties and the behavior of matter, and research into space and time and their structures. innovations-report provides in-depth reports and articles on subjects such as astrophysics, laser technologies, nuclear, quantum, particle and solid-state physics, nanotechnologies, planetary research and findings (Mars, Venus) and developments related to the Hubble Telescope. Important milestone on the way to transition metal catalysis with aluminum Chemists successfully synthesize a cationic, low-valent aluminum complex salt via metathesis. The chemists Philipp Dabringhaus, Julie Willrett and Prof. Dr. Ingo Krossing from the Institute of Inorganic and Analytical Chemistry… A simple way of sculpting matter into complex shapes A new method for shaping matter into complex shapes, with the use of ‘twisted’ light, has been demonstrated in research at the University of Strathclyde. When atoms are cooled to…
<urn:uuid:57413460-526b-4eeb-8321-728e59c1170d>
CC-MAIN-2022-33
https://www.innovations-report.com/physics-and-astronomy/jila-researchers-make-coldest-quantum-gas-of-molecules/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572220.19/warc/CC-MAIN-20220816030218-20220816060218-00770.warc.gz
en
0.917082
1,289
3.546875
4
Just to recall we discussed in my previous post that persistent current flows in a superconductor to cancel out the externally applied magnetic field and this current does not contain electrons instead it consists of cooper pairs . This effect is called Meissner Effect . Now in my first blog we talked about tunneling ,means if conductors are extremely near to each other than barrier cannot control the flow of electrons or flow of current from one conductor to another Josephson Junction is an amalgamation of all these concepts, If we will separate two superconductors (any conductor behaves like a superconductor only under a critical temperature) by a thin barrier and apply external magnetic field to it. A current consists of cooper pairs start flowing in these super conductors to oppose the external magnetic field as per Meissner Effect. Now because these superconductors are separated by a very thin barrier ,This current of copper pair tunnels through the barrier and reaches to other superconductor . So, the net current in these superconductors is the coupling of these two currents one is the actual current which is flowing because of external magnetic field and one is flowing because of tunneling. Copper Pairs are tunnelling through the barrier, this barrier is called Josephson junction Now in my first blog, I talked about wavefunction of electron but in case of superconductors current is flown by copper pairs and this current is also tunneled from one superconductor to another .So in these two superconductors currents are flowing in both directions simultaneously and this dual current is changing corresponding to external magnetic field ,wavefunctions of these cooper pair looks like as below Here Ψ1 represents the wave function of copper pairs present in persistent current flowing in super conductor and Ψ2 represents the wave function of copper pairs tunnel to this super conductor and K represents the coefficient of tunneled current . µ1 and µ2 represents the energy levels of wave functions. Here Ψ1 and Ψ2 can be represented as below Where n1 and n2 are copper pairs density and θ1 and θ2 are phases. Here phases basically define the direction of current or copper pair movement. Now these wave functions can be represented in terms of current as well because current is nothing but measurement of number of electrons pass in one second or in some time unit Current is measured in ampere and below is definition of ampere “The SI unit of electric current is the ampere, which is the flow of electric charge across a surface at the rate of one coulomb per second” And Electric charge is measured in terms of coulomb which is basically defines the charge carried by one electron. Electric charge on an electron is approximately 1.6021773310−19 coulomb. So, if this value is 2 coulombs per second it means 2 electrons passed in one second Now intensity of this current is represented by µ1 and µ2 in Equation 1 .In much simpler way We can understand that current is flow of electrons from negative to positive ,now if we want to increase the speed of this flow, we apply voltage to this flow. Voltage pushes electrons and electrons moves with greater speed and this change in speed changes the overall energy of wave function. Please note in case of superconductor voltage pushes copper pairs, as current consists of cooper pairs not electrons. Full derivation on this is as below Till now, hopefully we understood that current flows in both direction in these Josephson junction based superconductors ,in presence of external magnetic field and Intensity of these current or wave function of current can be controlled by voltage or by changing the external magnetic field Now let us understand how to measure this coupled current As you can see above JJ2 and JJ1 are Josephson junctions, H is the external magnetic field and IB is the net persistent current which is flowing in this superconductor to expel the magnetic field H. Now in JJ1 junction I1 current is coming in anti-clockwise direction while I2 current is coming from clockwise direction, so net current which is flowing in this junction is the coupled effect of both these currents. Now current IB is the current, which is expelling the magnetic field, so this current will not change until external magnetic field will not change. But currents at junctions can be controlled by application of voltage or in simple words let us say if IB= I1+ I2 Let us say we apply some voltage on junction 1 and I1 is increased to I1´ then I2 will decrease to I2´ to maintain the same persistent current IB ,because external magnetic field is not changed ,so current IB also can’t be changed, but I1 and I2 can change on application of voltage. IB= I1´+ I2´ Now overall magnetic field of this entire system which is created by current IB will not change. But because current is changing at junctions, magnetic field at junctions will change and this change in magnetic field can be measured using magnetic flux. This effect is called AC Josephson Effect . We can change the current at junction by changing the external magnetic field as well. This effect is called DC Josephson Effect. This magnetic flux is indirectly proportional to current and voltage on junctions and we have mentioned in equation 5 and 6 above that wave function of copper pairs can be represented in terms of current and voltage. So if we can measure this magnetic flux ,we can measure this wavefunction from outside .This magnetic flux can be measured by a device called SQUID. Now after all this explanation you might be thinking where is the qubit in all this To understand it let us see what a bit is Bit is 0 when current does not flow and Bit is 1 when current does flow right? Now in case of qubit ,current is always flowing ,its never 0. But qubit means ,there are two currents flowing instead of one and wave function of this qubit is the combination of these two currents. What is changing is the intensities of these two current , if current I2 is extremely high then current I1 will be low to maintain the net persistent current IB, and when these currents changes ,wave function changes ,and because magnetic flux is proportional to these currents ,so when we measure this magnetic flux ,we are measuring different values of this wavefunction, or different states of qubit. This magnetic flux can be measured using SQUID, so we can measure the qubit as well without collapsing the wave function. I hope you all have better understanding of qubit Now. In next blog we will read about how to identify different states of qubits from different values of flux.
<urn:uuid:985bc464-b176-40d8-934b-881d2a3c330d>
CC-MAIN-2022-33
https://acsharmablog.com/2020/07/15/super-conducting-qubit-ii/?like_comment=182&_wpnonce=1a5a1fd5ac
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571911.5/warc/CC-MAIN-20220813081639-20220813111639-00370.warc.gz
en
0.927547
1,387
3.515625
4
Emulating impossible “unipolar” laser pulses paves the way for processing quantum information A laser pulse that sidesteps the inherent symmetry of light waves could manipulate quantum information, potentially bringing us closer to room temperature quantum computing. The study, led by researchers at the University of Regensburg and the University of Michigan, could also accelerate conventional computing. Quantum computing has the potential to accelerate solutions to problems that need to explore many variables at the same time, including drug discovery, weather prediction and encryption for cybersecurity. Conventional computer bits encode either a 1 or 0, but quantum bits, or qubits, can encode both at the same time. This essentially enables quantum computers to work through multiple scenarios simultaneously, rather than exploring them one after the other. However, these mixed states don’t last long, so the information processing must be faster than electronic circuits can muster. While laser pulses can be used to manipulate the energy states of qubits, different ways of computing are possible if charge carriers used to encode quantum information could be moved around—including a room-temperature approach. Terahertz light, which sits between infrared and microwave radiation, oscillates fast enough to provide the speed, but the shape of the wave is also a problem. Namely, electromagnetic waves are obliged to produce oscillations that are both positive and negative, which sum to zero. The positive cycle may move charge carriers, such as electrons. But then the negative cycle pulls the charges back to where they started. To reliably control the quantum information, an asymmetric light wave is needed. “The optimum would be a completely directional, unipolar ‘wave’, so there would be only the central peak, no oscillations. That would be the dream. But the reality is that light fields that propagate have to oscillate, so we try to make the oscillations as small as we can,” said Mackillo Kira, a professor of electrical engineering and computer science at U-M and leader of the theory aspects of the study in Light: Science & Applications. Since waves that are only positive or only negative are physically impossible, the international team came up with a way to do the next best thing. They created an effectively unipolar wave with a very sharp, high-amplitude positive peak flanked by two long, low-amplitude negative peaks. This makes the positive peak forceful enough to move charge carriers while the negative peaks are too small to have much effect. They did this by carefully engineering nanosheets of a gallium arsenide semiconductor to design the terahertz emission through the motion of electrons and holes, which are essentially the spaces left behind when electrons move in semiconductors. The nanosheets, each about as thick as one thousandth of a hair, were made in the lab of Dominique Bougeard, a professor of physics at the University of Regensburg. Then, the group of Rupert Huber, also a professor of physics at the University of Regensburg, stacked the semiconductor nanosheets in front of a laser. When the near-infrared pulse hit the nanosheet, it generated electrons. Due to the design of the nanosheets, the electrons welcomed separation from the holes, so they shot forward. Then, the pull from the holes drew the electrons back. As the electrons rejoined the holes, they released the energy they’d picked up from the laser pulse as a strong positive terahertz half-cycle preceded and followed by a weak, long negative half-cycle. “The resulting terahertz emission is stunningly unipolar, with the single positive half-cycle peaking about four times higher than the two negative ones,” said Huber. “We have been working for many years on light pulses with fewer and fewer oscillation cycles. The possibility of generating terahertz pulses so short that they effectively comprise less than a single half-oscillation cycle was beyond our bold dreams,” he added. Next, the team intends to use these pulses to manipulate electrons in room temperature quantum materials, exploring mechanisms for quantum information processing. The pulses could also be used for ultrafast processing of conventional information. “Now that we know the key factor of unipolar pulses, we may be able to shape terahertz pulses to be even more asymmetric and tailored for controlling semiconductor qubits,” said Qiannan Wen, a PhD student in applied physics at U-M and a co-first-author of the paper, along with Christian Meineke and Michael Prager, PhD students in physics at the University of Regensburg. Collaborators at Justus Liebig University Giessen and Helmut Schmidt University, both in Germany, contributed to the experiment and the characterization of the nanosheets. Kira, Huber and Bougeard conceived the study along with Markus Stein, a postdoctoral researcher in physics at Justus Liebig University Giessen. Huber and his PhD students Meineke, Johannes Hayes and Lukas Kastner along with Bougeard, Prager and staff scientist Dieter Schuh, designed the setup and the terahertz pulse emitter. Prager, Schuh and Bougeard then grew the semiconductor nanosheets and tested the sample quality, and Stein and colleagues from Giessen tested optical properties. Kira and Wen developed the quantum theory and carried out numerical simulations to interpret the results. Meineke, Hayes, Kastner and Huber, with support from Kilian Fritsch and Oleg Pronin, a PhD student and professor of laser technology and spectroscopy at Helmut Schmidt University in Hamburg, Germany, carried out the experiments and analyzed the data. This research was supported by the German Research Foundation (DFG) through Project ID 422 314695032-SFB 1277 (Subprojects A01 and B02), W. M. Keck Foundation, and the National Science Foundation program Designing Materials to Revolutionize and Engineer our Future (2118809).
<urn:uuid:a0757f09-bcf9-406c-a27d-153091718948>
CC-MAIN-2022-33
https://micl.engin.umich.edu/stories/emulating-impossible-unipolar-laser-pulses-paves-the-way-for-processing-quantum-information
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570879.1/warc/CC-MAIN-20220808213349-20220809003349-00372.warc.gz
en
0.918614
1,259
3.59375
4
Scientists from the University of Queensland, Australia, have used single particles of light (photons) to simulate quantum particles traveling through time. They showed that one photon can pass through a wormhole and then interact with its older self. Their findings were published in Nature Communications. The source of this time travel conundrum comes from what are called “closed time-like curves” (CTC). CTCs are used to simulate extremely powerful gravitational fields, like the ones produced by a spinning black hole, and could, theoretically (based on Einstein’s theory of general relativity), warp the fabric of existence so that spacetime bends back on itself – thus creating a CTC, almost like a path that could be used to travel back in time. According to Scientific American, many physicists find CTCs “abhorrent, because any macroscopic object traveling through one would inevitably create paradoxes where cause and effect break down.” Others disagree with this assessment, however; in 1991, physicist David Deutsch showed that these paradoxes (created by CTCs) could be avoided at the quantum scale because of the weird behavior of these fundamental particles that make up what we call matter. It’s well known that at the quantum scale, these particles do not follow the rules that govern classical mechanics, but behave in strange and unexpected ways that really shouldn’t even be possible. Welcome to the world of Quantum physics, where pioneering Physicist Niels Bohr once said, “if quantum mechanics hasn’t profoundly shocked you, you haven’t understood it yet.” “We choose to examine a phenomenon which is impossible, absolutely impossible, to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery.” – Richard Feynman, a Nobel laureate of the twentieth century (Radin, Dean. Entangled Minds: Extrasensory Experiences in a Quantum Reality. New York, Paraview Pocket Books, 2006.) In the quantum world, paradoxes that we don’t understand are common findings, but this should not deter people from taking this science seriously. Even Einstein didn’t believe a lot of quantum theory, but I’d like to think that if he were alive today, he would definitely be having some fun, given all of the recent breakthroughs. “It’s intriguing that you’ve got general relativity predicting these paradoxes, but then you consider them in quantum mechanical terms and the paradoxes go away.” –University of Queensland physicist Tim Ralph (source) Tim Ralph (quoted above) and his PhD student Martin Ringbauer simulated a Deutsch’s model of CTCs, according to Scientific American, “testing and confirming many aspects of the two-decades-old theory.” Although it’s just a mathematical simulation, the researchers (and their team/colleagues) emphasize that their model is mathematically equivalent to a single photon traveling through a CTC. Nothing has actually been sent back through time though; to do that, scientists would have to find a real CTC, which has yet to happen as far as we know. Of course, there always remains the possibility that black budget science has. Think in terms of the ‘grandfather paradox,’ a hypothetical scenario where someone uses a CTC to travel back through time to cause harm to their grandfather, thus preventing their later birth. Now imagine a particle going back in time to flip a switch on the particle-generating machine that created it – this is a possibility that these physicists say they have shown through their simulation. You can read the specifics of the experiment here. Why This Is A High Probability In my opinion, there is no doubt time travel is possible. Why do I believe this? Well, it’s because we know one hundred percent that superposition is real on a quantum scale. “The maddening part of that problem is that the ability of particles to exist in two places at once is not a mere theoretical abstraction. It is a very real aspect of how the subatomic world works, and it has been experimentally confirmed many times over.”(source) “One of the supreme mysteries of nature… is the ability, according to the quantum mechanic laws that govern subatomic affairs, of a particle like an electron to exist in a murky state of possibility — to be anywhere, everywhere or nowhere at all — until clicked into substantiality by a laboratory detector or an eyeball.” (New York Times) This means that one particle can exist in multiple states at one time. This is best demonstrated by the quantum double slit experiment. Recent experiments have also confirmed quantum entanglement, showing that space is really just a construct that gives the illusion of separation. One thing that suggests there is a high probably of time travel, in conjunction with the experiment mentioned in this article, is the fact that there are experiments showing that particles can actually be entangled through time. This is illustrated by what is called the ‘delayed choice experiment.’ Like the quantum double slit experiment, the delayed choice/quantum eraser has been demonstrated and repeated time and time again. For example, physicists at The Australian National University (ANU) have successfully conducted John Wheeler’s delayed-choice thought experiment. Their findings were recently published in the journal Nature Physics. (source) In 2007 (Science 315, 966, 2007), scientists in France shot photons into an apparatus and showed that their actions could retroactively change something which had already happened. This particular experiment illustrates how what happens in the present can change what happened in the past. It also shows how time can go backwards, how cause and effect can be reversed, and how the future caused the past. “If we attempt to attribute an objective meaning to the quantum state of a single system, curious paradoxes appear: quantum effects mimic not only instantaneous action-at-a-distance, but also, as seen here, influence of future actions on past events, even after these events have been irrevocably recorded.” – Asher Peres, pioneer in quantum information theory (source)(source)(source) Although we do not have access to a CTC quite yet, there are good reasons to believe that this type of time travel is possible at the quantum mechanical level, and that is why I chose to mention these other experiments, to show that ‘time’ doesn’t even really exist as we think it does. You can access an excellent description of the delayed choice experiment using a cosmic scale explanation here, which makes it easier to understand. Why these same quantum mechanical laws have not been observed on the macroscopic level is yet to be understood, but physicists are working on the problem. For example, in 2012 physicists David Wineland and Serge Haroche received the Nobel Prize in physics for demonstrating how “quantum weirdness” could not only exist at the subatomic micro-world level, but also show itself in the macro-world. At one time, superposition was only thought to exist in the inaccessible quantum world, but not anymore. We know it’s possible, we just haven’t figured out how. We do, however, seem to be getting closer to finding out. (source) (source) Perhaps one day, we will have determined the key to this puzzle and be able to observe large objects like cars, humans, apples, and oranges behave in the ways that matter does on a subatomic level, and perhaps one day we will find a wormhole, or a CTC in space, to conduct actual experiments that go beyond theory. That being said, a lot of what used to be considered theoretical in quantum physics is no longer theoretical, like quantum entanglement.
<urn:uuid:2f96e375-245e-4f14-b04f-eb9f111445ea>
CC-MAIN-2022-33
https://www.thelastamericanvagabond.com/physicists-send-particles-light-past-proving-time-travel-possible/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570767.11/warc/CC-MAIN-20220808061828-20220808091828-00372.warc.gz
en
0.950079
1,636
3.734375
4
A team of researchers from the University of California, Davis and the University of Washington have demonstrated that the conductance of DNA can be modulated by controlling its structure, thus opening up the possibility of DNA’s future use as an electromechanical switch for nanoscale computing. Although DNA is commonly known for its biological role as the molecule of life, it has recently garnered significant interest for use as a nanoscale material for a wide-variety of applications. In their paper published in Nature Communications, the team demonstrated that changing the structure of the DNA double helix by modifying its environment allows the conductance (the ease with which an electric current passes) to be reversibly controlled. This ability to structurally modulate the charge transport properties may enable the design of unique nanodevices based on DNA. These devices would operate using a completely different paradigm than today’s conventional electronics. “As electronics get smaller they are becoming more difficult and expensive to manufacture, but DNA-based devices could be designed from the bottom-up using directed self-assembly techniques such as ‘DNA origami’,” said Josh Hihath, assistant professor of electrical and computer engineering at UC Davis and senior author on the paper. DNA origami is the folding of DNA to create two- and three-dimensional shapes at the nanoscale level. “Considerable progress has been made in understanding DNA’s mechanical, structural, and self-assembly properties and the use of these properties to design structures at the nanoscale. The electrical properties, however, have generally been difficult to control,” said Hihath. New Twist on DNA? Possible Paradigms for Computing In addition to potential advantages in fabrication at the nanoscale level, such DNA-based devices may also improve the energy efficiency of electronic circuits. The size of devices has been significantly reduced over the last 40 years, but as the size has decreased, the power density on-chip has increased. Scientists and engineers have been exploring novel solutions to improve the efficiency. “There’s no reason that computation must be done with traditional transistors. Early computers were fully mechanical and later worked on relays and vacuum tubes,” said Hihath. “Moving to an electromechanical platform may eventually allow us to improve the energy efficiency of electronic devices at the nanoscale.” This work demonstrates that DNA is capable of operating as an electromechanical switch and could lead to new paradigms for computing. To develop DNA into a reversible switch, the scientists focused on switching between two stable conformations of DNA, known as the A-form and the B-form. In DNA, the B-form is the conventional DNA duplex that is commonly associated with these molecules. The A-form is a more compact version with different spacing and tilting between the base pairs. Exposure to ethanol forces the DNA into the A-form conformation resulting in an increased conductance. Similarly, by removing the ethanol, the DNA can switch back to the B-form and return to its original reduced conductance value. One Step Toward Molecular Computing In order to develop this finding into a technologically viable platform for electronics, the authors also noted that there is still a great deal of work to be done. Although this discovery provides a proof-of-principle demonstration of electromechanical switching in DNA, there are generally two major hurdles yet to be overcome in the field of molecular electronics. First, billions of active molecular devices must be integrated into the same circuit as is done currently in conventional electronics. Next, scientists must be able to gate specific devices individually in such a large system. The Latest on: Molecular computing via Google News The Latest on: Molecular computing - Molecular Partners AG Investigation: Robbins LLP is Investigating Molecular Partners AG (MOLN) on Behalf of Shareholderson July 28, 2022 at 2:21 pm Shareholder rights law firm Robbins LLP is investigating the officers and directors of Molecular Partners AG (NASDAQ: MOLN) to determine whether they breached fiduciary duties or violates securities ... - SLAC expands and centralizes computing infrastructure to prepare for data challenges of the futureon July 26, 2022 at 5:00 pm A computing facility at the Department of Energy’s SLAC National Accelerator Laboratory is doubling in size, preparing the lab for new scientific endeavors that promise to revolutionize our ... - Cadence Expands into Molecular Simulation with Acquisition of OpenEye Scientific, a Pioneering Leader in Computational Molecular Designon July 25, 2022 at 6:06 am Cadence Design Systems, Inc. (Nasdaq: CDNS) announced today that it has entered into a definitive agreement to acquire privately held OpenEye Scientific Software, Inc., a leading provider of ... - Liquid Biopsy Market to Grow at a CAGR of 19.7% during 2018 – 2028 | BlueWeave Consultingon July 20, 2022 at 8:00 am North America dominates the market owing to the presence of large key players such as Biocept, F. Hoffmann-La Roche AG, Qiagen N.V., and others. Furthermore, the high cancer prevalence and widespread ... - Caris' Precision Oncology Alliance Welcomes Northside Hospital Cancer Instituteon July 20, 2022 at 5:30 am With the most board-certified oncologists in Georgia, Northside Hospital Cancer Institute delivers a powerful combination of doctors, treatment ... - Quantum computing and chemistry companies link for semiconductor researchon July 13, 2022 at 1:41 am Methods developed to model molecular systems and defect sub-systems will be incorporated into InQuanto for other researchers to use. “JSR’s scientists know materials science, we know quantum computing ... - Molecular computer uses 10,000 times less energy than a normal oneon July 8, 2022 at 9:54 am For most of computing history, as chips have decreased in size they have also required less energy to run. But this relationship broke around 15 years ago, meaning that computers that perform ... - 7 Quantum Computing Stocks to Buy for the Next 10 Yearson July 8, 2022 at 9:45 am Quantum computing has long been a concept stuck in ... It will improve the way medicines are developed by simulating molecular processes. It will reduce energy loss in batteries via optimized ... - New Advances in the Search for Molecular Magnetson July 5, 2022 at 5:00 pm have managed to synthesize and extensively characterize a series of cobalt molecules that exhibit the properties of molecular magnets, an encouraging result for the future of quantum-scale computing. - New advances in the search for molecular magnetson July 5, 2022 at 10:32 am These molecules that exhibit magnetic bi-stability are called molecular magnets ... for application in spintronics and quantum-scale computing," adds Bandeira. More information: Patrícia S. via Bing News
<urn:uuid:35f8f503-af59-4f1e-bf08-b2813dfb2fea>
CC-MAIN-2022-33
https://innovationtoronto.com/2015/12/uc-davis-scientists-demonstrate-dna-based-electromechanical-switch/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571909.51/warc/CC-MAIN-20220813051311-20220813081311-00174.warc.gz
en
0.939062
1,441
3.578125
4
Importance of computers in the present day is well known to all. These machines have almost taken over manpower and mostly for the betterment (with the exception of creating unemployment). Still, people expect computers to be more useful and powerful in times to come, and different computing technologies of the future are always on constant watch. Where a classical computer works with 0s and 1s, a quantum computer will have the advantage of using 1s, 0s and superpositions of 1s and 0s. The future of computing and the new fields of computer sciences paving the way for the next digital revolution are common topics of discussion. In this direction, quantum computing technologies and their emergence in the near future are discussed. It is expected that quantum computing technologies will reach the masses by 2020. This article presents how quantum computing will change lives, society, the economy and the entire working system. Computing technologies, in general, are based on a series of assumptions, which are: •A technological society could eventually achieve the capability of creating a computer simulation that is indistinguishable from reality to the inhabitants of the simulation. •Such a society would not do this once or twice. These would create many such simulations. •Left to run long enough, the societies within the simulations would eventually be able to create their own simulations, also indistinguishable from reality to the sub-simulations inhabitants. Certain tasks, which have long been thought impossible (or intractable) for classical computers, will be achieved quickly and efficiently by quantum computers. These computers will be millions of times more powerful than conventional computers, and quantum computing could lead to huge improvements in machine learning, artificial intelligence, computer simulations and cryptography. All of this could fundamentally alter the way our society operates. Quantum computers will be able to outperform conventional computers in the fields of machine learning (training computers to use data to, effectively, make decisions without additional human input, to run search engines, spam email filters, voice- or facial-recognition technologies or self-driving cars, for example) and simulation technologies. What quantum computing is Quantum computing is essentially harnessing and exploiting the amazing laws of quantum mechanics to process information. A traditional computer uses long strings of bits, which encode either 0 or 1. A quantum computer, on the other hand, uses quantum bits, or qubits. A qubit is a quantum system that encodes 0 and 1 into two distinguishable quantum states. Qubits represent atoms, ions, photons or electrons and their respective control devices that work together to act as computer memory and a processor. But, because qubits behave quantum mechanically, we can capitalise on the phenomena of superposition and entanglement. Superposition is the ability of a quantum system to be in multiple states at the same time, that is, something can be here and there, or up and down at the same time. Entanglement is an extremely strong correlation that exists between quantum particles—so strong that two or more quantum particles can be inextricably linked in perfect unison, even if separated by great distances. The particles remain perfectly correlated even if separated by great distances. These are so intrinsically connected that these can be said to dance in instantaneous, perfect unison, even when placed at opposite ends of the universe. Such quantum effects are extremely useful to the future of computing and communications technology. Thanks to superposition and entanglement, a quantum computer can process a vast number of calculations simultaneously. Where a classical computer works with 0s and 1s, a quantum computer will have the advantage of using 1s, 0s and superpositions of 1s and 0s. Qubits could be made of photons, atoms, electrons, molecules or perhaps something else. But these are notoriously tricky to manipulate, since any disturbance causes these to fall out of their quantum state (or decohere). Decoherence is the Achilles heel of quantum computing, but it is not insurmountable. The field of quantum error correction examines how to stave off decoherence and combat other errors. While quantum computers have been theoretically demonstrated to have incredible potential, and scientists are working around the world to realise that potential, there is much work to be done before these hit the market. There are quantum computers already, but not of sufficient power to replace classical computers. While practical quantum technologies are already emerging—including highly-effective sensors, actuators and other devices—a true quantum computer that outperforms a classical computer is still years away. Theorists are continually figuring out better ways to overcome decoherence, while experimentalists are gaining more and more control over the quantum world through various technologies and instruments. Pioneering work being done today is paving the way for the upcoming quantum era. Quantum computers will be able to efficiently simulate quantum systems. This will allow to study, in remarkable detail, interactions between atoms and molecules. This could help design new drugs and materials, such as superconductors that work at room temperature. Another of the many benefits of quantum computers over classical ones is searching through a space of potential solutions for the best solution. Researchers are constantly working on new quantum algorithms and applications. But the true potential of quantum computers likely has not even been imagined yet. Future uses of quantum computers are bound only by imagination. Quantum technologies offer ultra-secure communications, sensors of unprecedented precision and computers that are exponentially more powerful than any supercomputer for a given task. These technologies are destined to fundamentally change our lives, and the first commercially-available quantum devices are only now beginning to emerge. Quantum computing has the capability to unlock answers to some of humanity’s most pressing questions that are presently unsolvable with current computing technologies. It is expected that in less than ten years, quantum computers will begin to outperform everyday computers, leading to breakthroughs in artificial intelligence, discovery of new pharmaceuticals and beyond. The very fast computing power of quantum computers has the potential to disrupt traditional businesses and challenge cyber security. Businesses need to be ready for a quantum future because it is coming. The technology could herald radical changes for the following areas, to name a few: •Safer airplanes. Lockheed Martin plans to use its D-Wave to test jet software that is currently too complex for classical computers. •Discover distant planets. Quantum computers will be able to analyse the vast amount of data collected by telescopes and seek out Earth-like planets. •Win elections. Campaigners will comb through reams of marketing information to best exploit individual voter preferences. •Boost GDP. Hyper-personalised advertising, based on quantum computation, will stimulate consumer spending. •Detect cancer earlier. Computational models will help determine how diseases develop. •Help automobiles drive themselves. Google is already using a quantum computer to design software that can distinguish cars from landmarks. •Reduce weather-related deaths. Precision forecasting will give people more time to take cover. •Cut back on travel time. Sophisticated analysis of traffic patterns in the air and on the ground will forestall bottlenecks and snarls. •Develop more effective drugs. By mapping amino acids, for example, or analysing DNA-sequencing data, doctors would be able to discover and design superior drug based treatments. Developed countries are making huge investments for the development of quantum technologies in order to become the epicentres of this technology revolution in the near future. However, quantum computing might struggle to impact everyday life as it may be suppressed by those opposed to the changes it might bring. Kanchan Verma is M.Tech from Department of Computer Science and Engineering PIT, Kapurthala (PTU campus), Jalandhar, Punjab
<urn:uuid:babc200c-8b8e-4ba5-bd56-61966d59fcbb>
CC-MAIN-2022-33
http://www.electronicsforu.com/technology-trends/living-quantum-computing
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572304.13/warc/CC-MAIN-20220816120802-20220816150802-00375.warc.gz
en
0.923417
1,583
3.671875
4
BOULDER, Colo.—An atomic clock that uses an aluminum atom to apply the logic of computers to the peculiarities of the quantum world now rivals the world's most accurate clock, based on a single mercury atom. Both clocks are at least 10 times more accurate than the current U.S. time standard. The measurements were made in a yearlong comparison of the two next-generation clocks, both designed and built at the Commerce Department's National Institute of Standards and Technology (NIST). The clocks were compared with record precision, allowing scientists to measure the relative frequencies of the two clocks to 17 digits-the most accurate measurement of this type ever made. The comparison produced the most precise results yet in the worldwide quest to determine whether some of the fundamental constants that describe the universe are changing slightly over time, a hot research question that may alter basic models of the cosmos. The research is described in the March 6 issue of Science Express.* The aluminum and mercury clocks are both based on natural vibrations in ions (electrically charged atoms) and would neither gain nor lose one second in over 1 billion years-if they could run for such a long time-compared to about 80 million years for NIST-F1, the U.S. time standard based on neutral cesium atoms. The mercury clock was first demonstrated in 2000 and is now four times better than its last published evaluation in 2006, thanks to ongoing improvements in the clock design and operation. The mercury clock continues its reign as the world's most accurate for now, by a margin of 20 percent over the aluminum clock, but the designers say both experimental clocks could be improved further. "The aluminum clock is very accurate because it is insensitive to background magnetic and electric fields, and also to temperature," says Till Rosenband, the NIST physicist who built the clock and is the first author of the new paper. "It has the lowest known sensitivity of any atomic clock to temperature, which is one of the most difficult uncertainties to calibrate." Both the aluminum clock and the mercury clock are based on ions vibrating at optical frequencies, which are 100,000 times higher than microwave frequencies used in NIST-F1 and other similar time standards around the world. Because optical clocks divide time into smaller units, they can be far more precise than microwave standards. NIST scientists have several other optical atomic clocks in development, including one based on thousands of neutral strontium atoms. The strontium clock recently achieved twice the accuracy of NIST-F1, but still trails the mercury and aluminum clocks. Highly accurate clocks are used to synchronize telecommunications networks and deep-space communications, and for satellite navigation and positioning. Next-generation clocks may also lead to new types of gravity sensors, which have potential applications in exploration for underground natural resources and fundamental studies of the Earth. Laboratories around the world are developing optical clocks based on a variety of different designs and atoms; it is not yet clear which design will emerge as the best candidate for the next international standard. The new paper provides the first published evaluation of the operational quantum logic clock, so-named because it is based on the logical reasoning process used in quantum computers (see sidebar below for details). The clock is a spin-off of NIST research on quantum computers, which grew out of earlier atomic clock research. Quantum computers, if they can be built, will be capable of solving certain types of complex problems that are impossible or prohibitively costly or time consuming to solve with today's technologies. The NIST quantum logic clock uses two different kinds of ions, aluminum and beryllium, confined closely together in an electromagnetic trap and slowed by lasers to nearly "absolute zero" temperatures. Aluminum is a stable source of clock ticks, but its properties cannot be detected easily with lasers. The NIST scientists applied quantum computing methods to share information from the aluminum ion with the beryllium ion, a workhorse of their quantum computing research. The scientists can detect the aluminum clock's ticks by observing light signals from the beryllium ion. NIST's tandem ion approach is unique among the world's atomic clocks and has a key advantage: "You can pick from a bigger selection of atoms," explains NIST physicist Jim Bergquist, who built the mercury clock. "And aluminum has a lot of good qualities-better than mercury's." An optical clock can be evaluated precisely only by comparison to another clock of similar accuracy serving as a "ruler." NIST scientists used the quantum logic clock to measure the mercury clock, and vice versa. In addition, based on fluctuations in the frequencies of the two clocks relative to each other over time, NIST scientists were able to search for a possible change over time in a fundamental quantity called the fine-structure constant. This quantity measures the strength of electromagnetic interactions in many areas of physics, from studies of atoms and molecules to astronomy. Some evidence from astronomy has suggested the fine-structure constant may be changing very slowly over billions of years. If such changes are real, scientists would have to dramatically change their theories of the fundamental nature of the universe. The NIST measurements indicate that the value of the fine-structure constant is not changing by more than 1.6 quadrillionths of 1 percent per year, with an uncertainty of 2.3 quadrillionths of 1 percent per year (a quadrillionth is a millionth of a billionth). The result is small enough to be "consistent with no change," according to the paper. However, it is still possible that the fine-structure constant is changing at a rate smaller than anyone can yet detect. The new NIST limit is approximately 10 times smaller than the best previous measurement of the possible present-day rate of change in the fine-structure constant. The mercury clock is an especially useful tool for such tests because its frequency fluctuations are magnified by any changes in this constant. Background on the mercury clock is available at: www.nist.gov/public_affairs/releases/mercury_atomic_clock.htm. Background on quantum computing is available at: www.nist.gov/public_affairs/quantum/quantum_info_index.html. The work described in the new Science Express paper was supported in part by the Office of Naval Research and Disruptive Technology Office. As a non-regulatory agency of the Commerce Department, NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life. The NIST quantum logic clock is so named because it borrows techniques that are key to quantum computers, which would solve problems using quantum mechanics, nature's instruction book for the smallest particles of matter and light. Logic is reasoning that determines an action or result based on which one of different possible options is received as input. In the NIST clock, the input options are two different quantum states, or internal energy levels, of an aluminum ion. Information about this state is transferred to a beryllium ion, which, depending on the input, produces different signals that are easily detected. NIST scientists use lasers to cool the two ions which are held 4 thousandths of a millimeter apart in an electromagnetic trap. Aluminum is the larger of the two ions, while the beryllium emits light under the conditions of this experiment. Scientists hit the ions with pulses from a "clock laser" within a narrow frequency range. If the laser frequency is at the center of the frequency range, the precise "resonance frequency" of aluminum, this ion jumps to a higher energy level, or 1 in the binary language of computers. Otherwise, the ion remains in the lower energy state, or 0. If there is no change in the aluminum ion, then another laser pulse causes both ions to begin rocking side to side in unison because of their physical proximity and the interaction of their electrical charges. An additional laser pulse converts this motion into a change in the internal energy level of the beryllium ion. This pulse reverses the direction of the ion's magnetic "spin," and the beryllium goes dark, a signal that the aluminum remained in the 0 state. On the other hand, if the aluminum ion jumps to the higher energy level, then the additional laser pulses fail to stimulate a shared rocking motion and have no effect on the beryllium ion, which keeps emitting light. Scientists detect this light as a signal that the aluminum ion jumped from 0 to 1. The goal is to tune the clock laser to the exact frequency that prompts the aluminum to jump from 0 to 1. The actual measurement of the ticking of the clock is provided not by the ions but rather by the clock laser's precisely tuned center frequency, which is measured with a "frequency comb," a tool for measuring very high optical frequencies, or colors of light. See: www.nist.gov/public_affairs/newsfromnist_frequency_combs.htm. *T. Rosenband, D.B. Hume, P.O. Schmidt, C.W. Chou, A. Brusch, L. Lorini, W.H. Oskay, R.E. Drullinger, T.M. Fortier, J.E. Stalnaker, S.A. Diddams, W.C. Swann, N.R. Newbury, W.M. Itano, D.J. Wineland, and J.C. Bergquist. 2008. Frequency ratio of Al+ and Hg+ single-ion optical clocks; metrology at the 17th decimal place. Science Express. Published online March 6.
<urn:uuid:ea95b353-6430-4970-b2f9-4d8dee1b489a>
CC-MAIN-2022-33
https://www.nist.gov/news-events/news/2008/03/nist-quantum-logic-clock-rivals-mercury-ion-worlds-most-accurate-clock-0
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571758.42/warc/CC-MAIN-20220812200804-20220812230804-00575.warc.gz
en
0.930888
1,998
3.5625
4
Making 3-D nanosuperconductors with DNA Three-dimensional (3-D) nanostructured materials—those with complex shapes at a size scale of billionths of a meter—that can conduct electricity without resistance could be used in a range of quantum devices. For example, such 3-D superconducting nanostructures could find application in signal amplifiers to enhance the speed and accuracy of quantum computers and ultrasensitive magnetic field sensors for medical imaging and subsurface geology mapping. However, traditional fabrication tools such as lithography have been limited to 1-D and 2-D nanostructures like superconducting wires and thin films. Now, scientists from the U.S. Department of Energy's (DOE) Brookhaven National Laboratory, Columbia University, and Bar-Ilan University in Israel have developed a platform for making 3-D superconducting nano-architectures with a prescribed organization. As reported in the Nov. 10 issue of Nature Communications, this platform is based on the self-assembly of DNA into desired 3-D shapes at the nanoscale. In DNA self-assembly, a single long strand of DNA is folded by shorter complementary "staple" strands at specific locations—similar to origami, the Japanese art of paper folding. "Because of its structural programmability, DNA can provide an assembly platform for building designed nanostructures," said co-corresponding author Oleg Gang, leader of the Soft and Bio Nanomaterials Group at Brookhaven Lab's Center for Functional Nanomaterials (CFN) and a professor of chemical engineering and of applied physics and materials science at Columbia Engineering. "However, the fragility of DNA makes it seem unsuitable for functional device fabrication and nanomanufacturing that requires inorganic materials. In this study, we showed how DNA can serve as a scaffold for building 3-D nanoscale architectures that can be fully "converted" into inorganic materials like superconductors." To make the scaffold, the Brookhaven and Columbia Engineering scientists first designed octahedral-shaped DNA origami "frames." Aaron Michelson, Gang's graduate student, applied a DNA-programmable strategy so that these frames would assemble into desired lattices. Then, he used a chemistry technique to coat the DNA lattices with silicon dioxide (silica), solidifying the originally soft constructions, which required a liquid environment to preserve their structure. The team tailored the fabrication process so the structures were true to their design, as confirmed by imaging at the CFN Electron Microscopy Facility and small-angle X-ray scattering at the Complex Materials Scattering beamline of Brookhaven's National Synchrotron Light Source II (NSLS-II). These experiments demonstrated that the structural integrity was preserved after they coated the DNA lattices. "In its original form, DNA is completely unusable for processing with conventional nanotechnology methods," said Gang. "But once we coat the DNA with silica, we have a mechanically robust 3-D architecture that we can deposit inorganic materials on using these methods. This is analogous to traditional nanomanufacturing, in which valuable materials are deposited onto flat substrates, typically silicon, to add functionality." The team shipped the silica-coated DNA lattices from the CFN to Bar-Ilan's Institute of Superconductivity, which is headed by Yosi Yeshurun. Gang and Yeshurun became acquainted a couple years ago, when Gang delivered a seminar on his DNA assembly research. Yeshurun—who over the past decade has been studying the properties of superconductivity at the nanoscale—thought that Gang's DNA-based approach could provide a solution to a problem he was trying to solve: How can we fabricate superconducting nanoscale structures in three dimensions? "Previously, making 3-D nanosuperconductors involved a very elaborate and difficult process using conventional fabrication techniques," said Yeshurun, co-corresponding author. "Here, we found a relatively simple way using Oleg's DNA structures." At the Institute of Superconductivity, Yeshurun's graduate student Lior Shani evaporated a low-temperature superconductor (niobium) onto a silicon chip containing a small sample of the lattices. The evaporation rate and silicon substrate temperature had to be carefully controlled so that niobium coated the sample but did not penetrate all the way through. If that happened, a short could occur between the electrodes used for the electronic transport measurements. "We cut a special channel in the substrate to ensure that the current would only go through the sample itself," explained Yeshurun. The measurements revealed a 3-D array of Josephson junctions, or thin nonsuperconducting barriers through which superconducting current tunnels. Arrays of Josephson junctions are key to leveraging quantum phenomena in practical technologies, such as superconducting quantum interference devices for magnetic field sensing. In 3-D, more junctions can be packed into a small volume, increasing device power. "DNA origami has been producing beautiful and ornate 3-D nanoscale structures for almost 15 years, but DNA itself is not necessarily a useful functional material," said Evan Runnerstrom, program manager for materials design at the U.S. Army Combat Capabilities Development Command Army Research Laboratory of the U.S. Army Research Office, which funded the work in part. "What Prof. Gang has shown here is that you can leverage DNA origami as a template to create useful 3-D nanostructures of functional materials, like superconducting niobium. This ability to arbitrarily design and fabricate complex 3-D-structured functional materials from the bottom-up will accelerate the Army's modernization efforts in areas like sensing, optics, and quantum computing." "We demonstrated a pathway for how complex DNA organizations can be used to create highly nanostructured 3-D superconducting materials," said Gang. "This material conversion pathway gives us an ability to make a variety of systems with interesting properties—not only superconductivity but also other electronic, mechanical, optical, and catalytic properties. We can envision it as a "molecular lithography," where the power of DNA programmability is transferred to 3-D inorganic nanofabrication."
<urn:uuid:50c28882-6700-42e8-9b31-ef0a15ecff2f>
CC-MAIN-2022-33
https://phys.org/news/2020-11-d-nanosuperconductors-dna.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573242.55/warc/CC-MAIN-20220818154820-20220818184820-00774.warc.gz
en
0.934256
1,314
3.671875
4
By: Arjun Walia, Collective-Evolution | Quantum entanglement: a phenomenon that Einstein thought was so “spooky” that there was no way it could be valid, posits that the “space” between physical objects isn’t actually empty space as our senses perceive it to be, but rather, that either information is travelling faster than the speed of light, or even better, instantaneously with no “time” involved. It implies that everything is connected, that if there was a “big bang,” it happened when all physical matter was one, and then exploded out into little pieces that spread throughout the cosmos. The tricky part to understand is that all those little piece, those plants, those starts, and all the intelligent life that has most certainly formed, is still all connected in some sort of way we have yet to understand. In the past couple of years alone, quantum entanglement has left the realm of theoretical physics due to several experiments conducted by physicists around the world. For example, an experiment devised by the Griffith University’s Centre for Quantum Dynamics, led by Professor Howard Wiseman and his team of researchers at the university of Tokyo, recently published a paper in the journal Nature Communications confirming what Einstein did not believe to be real: the non-local collapse of a particle’s wave function (source)(source), and this is just one example of many. They did this by splitting a single photon between two laboratories, and testing whether measurement of it in one laboratory would actually cause a change in the local quantum state in the other laboratory. In doing so, researchers were able to verify the entanglement of the split single photon. Researchers have since replicated this experiment over and over again, with results of entanglement seen at kilometres of distance. “Space is just the construct that gives the illusion that there are separate objects.” — Dr. Quantum, from the 2004 film, What The Bleep Do We Know Below you can see a visual demonstration from the documentary. In an interview with Dr. Jeffrey Mishlove, a past director of the Association for Humanistic Psychology Dr. Elizabeth Rauscher, a world-renowned physicist, researcher, and presenter who has done a lot of work for NASA, among several other organizations, admitted that quantum entanglement has been replicated in space with experiments that’ve been done with NASA astronauts, as well as replicated in a number of laboratories around the world. You can watch that full interview here. “What it really is, is that particles that are born together stay in connection with each other over even kilometres of distance.” —Dr. Elizabeth Rauscher Now that this fact has hit the mainstream, a new study in the journal Science shows how scientists were able to produce entangled photons on a satellite orbiting 300 miles above the planet and beam the particles onto two different ground-based labs that were 750 miles apart, all without losing the particles’ strange linkage. According to the Washington Post, “it is the first time anyone has ever generated entangled particles in space, and represents a 10-fold increase in the distance over which entanglement has been maintained.” But, according to the interview linked above with Dr. Rauscher, it’s clearly not the first time. “It’s a really stunning achievement, and I think it’s going to be the first of possibly many such interesting and exciting studies that this particular satellite will open up,” said Shohini Ghose, a physicist at Wilfrid Laurier University in Canada. “Who knows, maybe there’ll be a space entanglement race?” The post goes on to emphasize that: “There’s good a reason world governments may soon race to test out quantum theory in orbit, and it’s not just so they can claim the title of ‘spookiest.’ Entangled particles could one day be used for ‘quantum communication’ — a means of sending super secure messages that doesn’t rely on cables, wireless signals, or code. Because any interference with an entangled particle, even the mere act of observing it, automatically affects its partner, these missives can’t be hacked. To hear quantum physicists tell it, entangled particles could help build a ‘quantum internet,’ give rise to new kinds of coding, and allow for faster-than-light communication — possibilities that have powerful appeal in an era where hospitals, credit card companies, government agencies, even election systems are falling victim to cyber attacks.” What About Black Budget Science? As we’ve mentioned a number of times before, there are severe restrictions on science. And this is no secret. For example, Scientists working for the Canadian government have started to raise their voices, accusing the federal government of “muzzling” them and their findings on various issues. Apparently, the union representing this group of researchers will be taking “the unusual step of demanding Ottawa enshrine scientific independence in their collective agreement.” (source) What’s even worse is the black budget world. We are talking about Special Access Programs (SAP). From these we have unacknowledged and waived SAPs. These programs do not exist publicly, but they do indeed exist. They are better known as ‘deep black programs.’ A 1997 U.S. Senate report described them as “so sensitive that they are exempt from standard reporting requirements to the Congress.” (source). Think about all of the resources put into this world, into the military industrial complex, a term coined by president Eisenhower. What about all of the science that’s going on within this system? All of it is classified, that deals with technology and concepts that are much more advanced and controversial than what we see in the mainstream. We know this from declassified material, like project STARGATE. We don’t really hear about black budget programs, or about people who have actually looked into them. However, the topic was discussed in 2010 by Washington Post journalists Dana Priest and William Arkin. Their investigation lasted approximately two years and concluded that America’s classified world has: “Become so large, so unwieldy and so secretive that no one knows how much money it costs, how many people it employees, how many programs exist within it or exactly how many agencies do the same work.” (source) Another person was aviation journalist Bill Sweetman. Within the Pentagon, he estimated that approximately 150 special access programs existed that weren’t even acknowledged. These programs are not known about by the highest members of government and the highest ranking officials in the military. He determined that most of these programs were dominated by private contractors (Lockheed Martin, Boeing, etc.) and that he had no idea as to how these programs were funded. (source) Another example was the U.S. air strike against Libya in 1986. The raid employed F-111 fighter aircraft. Left out of the mission, however, was the F-117A Nighthawk, better known as the stealth fighter. It had been operational since 1983, but was still classified in 1986. In a form of logic both perverse and rational, the F-117A was so radically advanced that keeping it secret was more important than using it for this military mission. Perhaps the Canadian Avro Arrow could be another. It’s also noteworthy to mention that the U.S. has a history of government agencies existing in secret for years. The National Security Agency (NSA) was founded in 1952, its existence was hidden until the mid 1960s. Even more secretive is the National Reconnaissance Office, which was founded in 1960 but remained completely secret for 30 years. Given the mixture of a treasure chest of government money, and private connections, the likelihood exists that six decades later there is a clandestine group that possesses: - Technology that is vastly superior to that of the “mainstream” world. - The ability to explore areas of our world and surroundings presently unavailable to the rest of us. - Scientific and cosmological understandings that give them greater insights into the nature of our world. Inouye was the highest ranking Asian-American politician in U.S. history, serving the democratic party from 1963 until his death in 2012. There exists a shadowy government with its own Air Force, its own Navy, its own fundraising mechanism, and the ability to pursue its own ideas of the national interest, free from all checks and balances, and free from the law itself. — Senator Daniel Inouye, highest ranking Asian-American politician in United States history (source).
<urn:uuid:145c830e-15f4-45c2-b390-b666cf55afdf>
CC-MAIN-2021-10
http://www.thesleuthjournal.com/spookiest-phenomena-quantum-entanglement/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178351374.10/warc/CC-MAIN-20210225153633-20210225183633-00182.warc.gz
en
0.961939
1,818
3.546875
4
Quantum computers promise huge speedups on some computational problems because they harness a strange physical property called entanglement, in which the physical state of one tiny particle depends on measurements made of another. In quantum computers, entanglement is a computational resource, roughly like a chip’s clock cycles — kilohertz, megahertz, gigahertz — and memory in a conventional computer. In a recent paper in the journal Proceedings of the National Academy of Sciences, researchers at MIT and IBM’s Thomas J. Watson Research Center show that simple systems of quantum particles exhibit exponentially more entanglement than was previously believed. That means that quantum computers — or other quantum information devices — powerful enough to be of practical use could be closer than we thought. Where ordinary computers deal in bits of information, quantum computers deal in quantum bits, or qubits. Previously, researchers believed that in a certain class of simple quantum systems, the degree of entanglement was, at best, proportional to the logarithm of the number of qubits. “For models that satisfy certain physical-reasonability criteria — i.e., they’re not too contrived; they’re something that you could in principle realize in the lab — people thought that a factor of the log of the system size was the best you can do,” says Ramis Movassagh, a researcher at Watson and one of the paper’s two co-authors. “What we proved is that the entanglement scales as the square root of the system size. Which is really exponentially more.” That means that a 10,000-qubit quantum computer could exhibit about 10 times as much entanglement as previously thought. And that difference increases exponentially as more qubits are added. Logical or physical? This matters because of the distinction, in quantum computing, between logical qubits and physical qubits. A logical qubit is an abstraction used to formulate quantum algorithms; a physical qubit is a tiny bit of matter whose quantum states are both controllable and entangled with those of other physical qubits. A computation involving, say, 100 logical qubits would already be beyond the capacity of all the conventional computers in the world. But with most of today’s theoretical designs for general-purpose quantum computers, realizing a single logical qubit requires somewhere around 100 physical qubits. Most of the physical qubits are used for quantum error correction and to encode operations between logical qubits. Since preserving entanglement across large groups of qubits is the biggest obstacle to developing working quantum devices, extracting more entanglement from smaller clusters of qubits could make quantum computing devices more practical. Qubits are analogous to bits in a conventional computer, but where a conventional bit can take on the values 0 or 1, a qubit can be in “superposition,” meaning that it takes on both values at once. If qubits are entangled, they can take on all their possible states simultaneously. One qubit can take on two states, two qubits four, three qubits eight, four qubits 16, and so on. It’s the ability to, in some sense, evaluate computational alternatives simultaneously that gives quantum computers their extraordinary power. In the new paper, Peter Shor, the Morss Professor of Applied Mathematics at MIT, and Movassagh, who completed his PhD with Shor at MIT, analyze systems of qubits called spin chains. In quantum physics, “spin” describes the way a bit of matter — it could be an electron, or an atom, or a molecule — orients itself in a magnetic field. Shor and Movassagh consider bits of matter with five possible spin states: two up states, two corresponding down states, and a zero, or flat, state. Previously, theorists had demonstrated strong entanglement in spin chains whose elements had 21 spin states and interacted with each other in complex ways. But such systems would be extremely difficult to build in the lab. Chain, chain, chain A spin chain can be envisioned as a sequence of particles lined up next to each other. Interactions between the spins of adjacent particles determine the total energy of the system. Shor and Movassagh first considered the set of all possible orientations of their spin chain whose net energy was zero. That means that if somewhere there was a spin up, of either of the two types, somewhere there had to be a corresponding spin down. Then they considered the superposition of all those possible states of the spin chain. But the major breakthrough of the paper was to convert that superposition into the lowest-energy state of a Hamiltonian. A Hamiltonian is a matrix — a big grid of numbers — that figures in the standard equation for describing the evolution of a quantum system. For any given state of the particles in the system, the Hamiltonian provides the system’s total energy. In the previous 30 years, Movassagh says, no one had found an example of a Hamiltonian whose lowest-energy state corresponded to a system with as much entanglement as his and Shor’s exhibits. And even for Shor and Movassagh, finding that Hamiltonian required a little bit of luck. “Originally, we wanted to prove a different problem,” Movassagh says. “We tried to come up with a model that proved some other theorem on generic aspects of entanglement, and we kept failing. But by failing, our models became more and more interesting. At some point, these models started violating this log factor, and they took on a life of their own.” Pros and cons “It’s a beautiful result, a beautiful paper,” says Israel Klich, an associate professor of physics at the University of Virginia. “It certainly made for a lot of interest in some parts of the physics community. The result is in fact very, very succinct and simple. It’s a relatively simple Hamiltonian whose ground state one can understand by simple combinatorial means.” “Inspired by this work, we recently introduced a new variation on this model that is even more entangled, which has, actually, linear scaling of entanglement,” Klich adds. “The reason this was possible is that if you look at the ground state wave function, it’s so easy to understand how entanglement builds up there, and that gave us the idea of how to string it on to be even more entangled.” But John Cardy, an emeritus professor of physics at Oxford University and a visiting professor at the University of California at Berkeley, doesn’t find the MIT researchers’ Hamiltonian so simple. “If you read the description of the Hamiltonian, it takes a lot of description,” he says. “When we have physically reasonable Hamiltonians, we can just write them down in one expression. They do have an equation that tells you what the Hamiltonian is. But to explain what all those ingredients are requires this whole formalism that is deliberately designed, as far as I can tell, to get the result that they want.” “But I don’t want to sound unduly negative, because this is the way that science proceeds,” he adds. “You find one counterexample, then you might find others that are more reasonable.”
<urn:uuid:86e56308-3c89-4ad7-8d6e-df0726392ea9>
CC-MAIN-2021-10
https://news.mit.edu/2016/simple-quantum-computers-1118
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178372367.74/warc/CC-MAIN-20210305122143-20210305152143-00503.warc.gz
en
0.940422
1,551
3.796875
4
Findings by three teams may solve a 40-year-old mystery. A compound whose odd electrical behaviour has puzzled physicists for decades could turn out to be a boon for quantum physics and electronic-device makers. When theorists proposed in 2005 that it should be possible to find materials that conduct electricity at the surface while the rest of the sample behaves as an insulator, physicists were intrigued. They wanted to study the quantum effects that should emerge in such materials, and to explore applications in low-power electronics and quantum computing. But topological insulators, as the materials were called, proved fiendishly difficult to make. Some researchers have slaved to produce thin films using complex techniques that are unlikely ever to scale up to the levels needed for industrial purposes. Others have contented themselves with compounds that approximate topological insulators but still have a degree of internal conductivity. Now, three papers1,2,3 suggest that samarium hexaboride, a poorly understood compound that was first found to gain conducting properties at very low temperatures4 in 1969 by researchers at Bell Labs in New Jersey, may in fact be a topological insulator in its bulk form. In the most recent paper1in the trend, posted online on 28 November, researchers at the University of California, Irvine, report seeing remarkably fast-moving electrons on the surface of SmB6 crystals, which they take as a sign of a superb surface conductor. Five days earlier, researchers at the University of Maryland in College Park had reported measurements tracing the path of electrons injected into SmB6 samples as they were cooled2. Those results suggest that the material is insulating in its interior at temperatures below around 30 kelvin. And, in a paper posted on 21 November3, scientists from the University of Michigan in Ann Arbor and University of California, Irvine, describe their measurements of conductivity through the surface and bulk of the material, and find evidence that the surface conducting behaviour persists despite imperfections and impurities, as would be expected from a true topological insulator. A spurt of interest in topological insulators over the past few years (see ‘Charging up’) led to a 2010 prediction that SmB6would be such a material5. “I’d say we’ve been tentatively vindicated,” says Piers Coleman of Rutgers University in Piscataway, New Jersey, one of the four theoretical physicists who made the prediction. “We’re thrilled by these new results.” The prediction grew, in part, from studies of materials known as Kondo insulators, which, unlike ordinary insulators, retain some of the small amount of conductivity they do have when they are cooled to a few degrees above absolute zero. SmB6, which is often categorized as a Kondo insulator, fits this description. Coleman and other theorists realized that the material’s behaviour would make sense if it were a topological insulator. That would mean that the quantum properties of the material would be such that electrons cannot flow through it freely, as they would in an ordinary conductor, except at the material’s surface. If this proves correct, Coleman thinks that insights gleaned from SmB6 and other Kondo insulators could carry over to all topological insulators. SmB6is an unusual topological insulator because the electrons in the outer shells of the samarium atoms interact with one another strongly, such that a coordinated motion emerges. This could make the material useful for creating some exotic quantum effects, including magnetic monopoles, or Majorana fermions — quasiparticles that might be useful for quantum computing, says Shoucheng Zhang, who has pioneered work on topological insulators at Stanford University in California. Zhang adds that the rush of interest in SmB6is part of a trend to study materials with electrons that interact strongly with each other. “Now we’re looking at a number of systems. It’s a very exciting development,” he says. Peter Armitage, who has been working on topological insulating behaviour in bismuth-based compounds at Johns Hopkins University in Baltimore, Maryland, says that in the field of condensed-matter physics, experiment usually leads theory, but this is a remarkable example of the opposite. He is now hoping to start experiments on SmB6in the next week or two to confirm and study the surface states. “These are beautiful effects that were hiding under our noses,” he says. “This is a very big advance.” Botimer, J. et al. Preprint at http://arxiv.org/abs/1211.6769 (2012). Zhang, X. et al. Preprint at http://arxiv.org/abs/1211.5532 (2012). Wolgast, S. 5 al. Preprint at http://arxiv.org/abs/1211.5104 (2012). Menth, A., Buehler, E. & Geballe, T. H. Phys. Rev. Lett. 22, 295–297 (1969). Dzero, M., Sun, K., Galitski, V. & Coleman, P. Phys. Rev. Lett. 104, 106408 (2010). Related links in Nature Research Related external links About this article Cite this article Samuel Reich, E. Hopes surface for exotic insulator. Nature 492, 165 (2012). https://doi.org/10.1038/492165a
<urn:uuid:af85fad9-44bc-4e89-b607-7eb31c1d41d6>
CC-MAIN-2022-33
https://www.nature.com/articles/492165a?error=cookies_not_supported&code=6207c977-b161-42ba-a1a8-135df1f49ce8
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571719.48/warc/CC-MAIN-20220812140019-20220812170019-00174.warc.gz
en
0.940242
1,200
3.640625
4
Last year, researchers at Fermilab received over $3.5 million for projects that delve into the burgeoning field of quantum information science. Research funded by the grant runs the gamut, from building and modeling devices for possible use in the development of quantum computers to using ultracold atoms to look for dark matter. For their quantum computer project, Fermilab particle physicist Adam Lyon and computer scientist Jim Kowalkowski are collaborating with researchers at Argonne National Laboratory, where they’ll be running simulations on high-performance computers. Their work will help determine whether instruments called superconducting radio-frequency cavities, also used in particle accelerators, can solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits. “Fermilab has pioneered making superconducting cavities that can accelerate particles to an extremely high degree in a short amount of space,” said Lyon, one of the lead scientists on the project. “It turns out this is directly applicable to a qubit.” Researchers in the field have worked on developing successful quantum computing devices for the last several decades; so far, it’s been difficult. This is primarily because quantum computers have to maintain very stable conditions to keep qubits in a quantum state called superposition. Classical computers use a binary system of 0s and 1s – called bits – to store and analyze data. Eight bits combined make one byte of data, which can be strung together to encode even more information. (There are about 31.8 million bytes in the average three-minute digital song.) In contrast, quantum computers aren’t constrained by a strict binary system. Rather, they operate on a system of qubits, each of which can take on a continuous range of states during computation. Just as an electron orbiting an atomic nucleus doesn’t have a discrete location but rather occupies all positions in its orbit at once in an electron cloud, a qubit can be maintained in a superposition of both 0 and 1. Since there are two possible states for any given qubit, a pair doubles the amount of information that can be manipulated: 22 = 4. Use four qubits, and that amount of information grows to 24 = 16. With this exponential increase, it would take only 300 entangled qubits to encode more information than there is matter in the universe. Qubits don’t represent data in the same way as bits. Because qubits in superposition are both 0 and 1 at the same time, they can similarly represent all possible answers to a given problem simultaneously. This is called quantum parallelism, and it’s one of the properties that makes quantum computers so much faster than classical systems. The difference between classical computers and their quantum counterparts could be compared to a situation in which there is a book with some pages randomly printed in blue ink instead of black. The two computers are given the task of determining how many pages were printed in each color. “A classical computer would go through every page,” Lyon said. Each page would be marked, one at a time, as either being printed in black or in blue. “A quantum computer, instead of going through the pages sequentially, would go through them all at once.” Once the computation was complete, a classical computer would give you a definite, discrete answer. If the book had three pages printed in blue, that’s the answer you’d get. “But a quantum computer is inherently probabilistic,” Kowalkowski said. This means the data you get back isn’t definite. In a book with 100 pages, the data from a quantum computer wouldn’t be just three. It also could give you, for example, a 1 percent chance of having three blue pages or a 1 percent chance of 50 blue pages. An obvious problem arises when trying to interpret this data. A quantum computer can perform incredibly fast calculations using parallel qubits, but it spits out only probabilities, which, of course, isn’t very helpful – unless, that is, the right answer could somehow be given a higher probability. Consider two water waves that approach each other. As they meet, they may constructively interfere, producing one wave with a higher crest. Or they may destructively interfere, canceling each other so that there’s no longer any wave to speak of. Qubit states can also act as waves, exhibiting the same patterns of interference, a property researchers can exploit to identify the most likely answer to the problem they’re given. “If you can set up interference between the right answers and the wrong answers, you can increase the likelihood that the right answers pop up more than the wrong answers,” Lyon said. “You’re trying to find a quantum way to make the correct answers constructively interfere and the wrong answers destructively interfere.” When a calculation is run on a quantum computer, the same calculation is run multiple times, and the qubits are allowed to interfere with one another. The result is a distribution curve in which the correct answer is the most frequent response. Listening for signals above the noise In the last five years, researchers at universities, government facilities and large companies have made encouraging advancements toward the development of a useful quantum computer. Last year, Google announced that it had performed calculations on their quantum processor called Sycamore in a fraction of the time it would have taken the world’s largest supercomputer to complete the same task. Yet the quantum devices that we have today are still prototypes, akin to the first large vacuum tube computers of the 1940s. “The machines we have now don’t scale up much at all,” Lyon said. There’s still a few hurdles researchers have to overcome before quantum computers become viable and competitive. One of the largest is finding a way to keep delicate qubit states isolated long enough for them to perform calculations. If a stray photon — a particle of light — from outside the system were to interact with a qubit, its wave would interfere with the qubit’s superposition, essentially turning the calculations into a jumbled mess – a process called decoherence. While the refrigerators do a moderately good job at keeping unwanted interactions to a minimum, they can do so only for a fraction of a second. “Quantum systems like to be isolated,” Lyon said, “and there’s just no easy way to do that.” Which is where Lyon and Kowalkowski’s simulation work comes in. If the qubits can’t be kept cold enough to maintain an entangled superposition of states, perhaps the devices themselves can be constructed in a way that makes them less susceptible to noise. It turns out that superconducting cavities made of niobium, normally used to propel particle beams in accelerators, could be the solution. These cavities need to be constructed very precisely and operate at very low temperatures to efficiently propagate the radio waves that accelerate particle beams. Researchers theorize that by placing quantum processors in these cavities, the qubits will be able to interact undisturbed for seconds rather than the current record of milliseconds, giving them enough time to perform complex calculations. Qubits come in several different varieties. They can be created by trapping ions within a magnetic field or by using nitrogen atoms surrounded by the carbon lattice formed naturally in crystals. The research at Fermilab and Argonne will be focused on qubits made from photons. Lyon and his team have taken on the job of simulating how well radio-frequency cavities are expected to perform. By carrying out their simulations on high-performance computers, known as HPCs, at Argonne National Laboratory, they can predict how long photon qubits can interact in this ultralow-noise environment and account for any unexpected interactions. Researchers around the world have used open-source software for desktop computers to simulate different applications of quantum mechanics, providing developers with blueprints for how to incorporate the results into technology. The scope of these programs, however, is limited by the amount of memory available on personal computers. In order to simulate the exponential scaling of multiple qubits, researchers have to use HPCs. “Going from one desktop to an HPC, you might be 10,000 times faster,” said Matthew Otten, a fellow at Argonne National Laboratory and collaborator on the project. Once the team has completed their simulations, the results will be used by Fermilab researchers to help improve and test the cavities for acting as computational devices. “If we set up a simulation framework, we can ask very targeted questions on the best way to store quantum information and the best way to manipulate it,” said Eric Holland, the deputy head of quantum technology at Fermilab. “We can use that to guide what we develop for quantum technologies.” This work is supported by the Department of Energy Office of Science.
<urn:uuid:bfc6ade8-cf58-430b-903f-ef04170b0d1f>
CC-MAIN-2021-10
https://scitechdaily.com/solving-vexing-problem-in-building-quantum-computers-with-particle-accelerator-technology/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178365454.63/warc/CC-MAIN-20210303042832-20210303072832-00026.warc.gz
en
0.943926
1,887
3.578125
4
Quantum logical operations realized with single photons Scientists from all over the world are working on concepts for future quantum computers and their experimental realization. Commonly, a typical quantum computer is considered to be based on a network of quantum particles that serve for storing, encoding and processing quantum information. In analogy to the case of a classical computer a quantum logic gate that assigns output signals to input signals in a deterministic way would be an essential building block. A team around Dr. Stephan Dürr from the Quantum Dynamics Division of Prof. Gerhard Rempe at the Max Planck Institute of Quantum Optics has now demonstrated in an experiment how an important gate operation – the exchange of the binary bit values 0 and 1 – can be realized with single photons. A first light pulse containing one photon only is stored as an excitation in an ultracold cloud of about 100 000 rubidium atoms. This gives rise to the effect that a second light pulse that passes through the cloud exhibits a phase shift of 180 degrees. "Photons are ideal carriers of quantum information because they hardly interact with their environment and can easily be transmitted over long distances," explains Dr. Stephan Dürr, leader of the project. "Therefore we are very interested in the development of a photon-photon-quantum gate where a single light pulse can modify an incoming photonic qubit in a deterministic way." Modern data processing is based on the principle that information can be encoded in a binary system. In this context, logic gates fulfil the task of implementing truth tables which uniquely assign a specific output pattern to a given input signal. For instance an input value of 0 can be transformed into an output value of 1 or vice versa. In a photon-photon-quantum gate, this corresponds to the process of a single photon manipulating the state of a second single photon in a deterministic way. This interaction has to be mediated by matter. Up to now no physical system could be found to provide a sufficiently strong interaction. In this experiment a cloud of about 100 000 rubidium atoms is cooled down to a temperature of 0.5 microkelvin and caught in a dipole trap composed of several light fields. Next, a rapid sequence of three light pulses impinges onto the cloud: the first so-called control pulse determines whether the second target pulse is significantly modified when it passes through the cloud, i.e. whether the gate operation is switched on or off. A third pulse is used to retrieve an excitation that has potentially been stored. The light pulses consist of two components: on the one hand, they contain red signal light so weak that a light pulse carries only one photon on average. With a wavelength of 780 nm it is near-resonant with a certain atomic transition. Without further treatment the light pulse would pass through the atomic cloud and acquire a certain phase shift. However, by adding blue coupling light of high intensity with a wavelength of 480 nm the photon in the signal pulse can be stored in a controlled and reversible way. During this process, one atom in the cloud is transferred into a highly excited Rydberg state where one electron is located at a large distance from the nucleus. In the next step, the atoms are irradiated with the target pulse which is also composed of both signal and coupling light. As the Rydberg atom exhibits a long-range van der Waals interaction with the other atoms in the cloud, atomic energy levels inside a certain region around the Rydberg atom are shifted. This results in a larger detuning of the target pulse from the atomic levels compared to the case without a previously stored control pulse. Because of this detuning the target pulse picks up a phase shift that differs by 180 degrees from the phase shift obtained when no control excitation is stored. "It is this additional phase shift, caused by the van der Waals interaction, that really matters," says Dr. Dürr. "This makes it possible to generate quantum states that are orthogonal to each other, which corresponds to a bit flip from 0 to 1." In the last step, a coupling light pulse retrieves the signal photon that is stored in the cloud. In a series of measurements, using wave plates and a polarizing beam splitter the scientists determined the polarization of both red signal photons after passing through the atomic cloud. Thereby they were able to show that the light pulse had picked up an additional phase shift of 180 degrees whenever the signal laser was switched on during the control pulse. The whole cycle – the storage of the control pulse, the propagation of the target pulse and the retrieval of the control excitation – takes only a few microseconds. "The experiment demonstrates that we can rotate the polarization plane of the photonic qubit in the target pulse with just one control photon," resumes Dr. Dürr. "This is an important prerequisite for the realization of a quantum gate. However, a quantum gate also has to provide the possibility to generate an entangled final state from two separate initial states. To achieve this goal we are planning to do further experiments."
<urn:uuid:615f17ec-41cf-4d91-a706-e4c116222f21>
CC-MAIN-2021-10
https://phys.org/news/2016-05-quantum-logical-photons.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178359624.36/warc/CC-MAIN-20210227234501-20210228024501-00027.warc.gz
en
0.928411
1,031
3.578125
4
Researchers have demonstrated the ability of an optical chip to simulate the motion of atoms within molecules at the quantum level, which could open superior means for developing chemicals that are used as pharmaceuticals. In an optical chip, light is used to process information, in the place of electricity, where the chip functions as a quantum computing circuit when it uses single particles of light, called photons. Data collected from the chip can be used to carry out a frame-by-frame reconstruction of atomic motions to produce a virtual movie of the quantum vibrations of a molecule, which is the core concept of the study reported in the Nature journal on May 30th, 2018. These outcomes are the fruit of an association between scientists from the University of Bristol, MIT, IUPUI, Nokia Bell Labs, and NTT. In addition to opening the door for highly efficient pharmaceutical developments, the study could induce innovative techniques of molecular modeling for industrial chemists. In the 1960s, when lasers were invented, experimental chemists conceptualized their use in the disintegration of molecules. However, the vibrations inside molecules instantaneously redistribute the laser energy before disintegration of the targeted molecular bond. If the behavior of molecules has to be controlled, it is necessary to gain insights into the way they vibrate at the quantum level. However, massive computational power is required to model these dynamics, even more than what is anticipated from future generations of supercomputers. The Quantum Engineering and Technology Labs at Bristol have pioneered the application of optical chips, in which single photons of light are controlled, as the fundamental circuitry for quantum computers. It is anticipated that quantum computers will become exponentially faster when compared to traditional supercomputers in solving specific problems. However, developing a quantum computer is a highly difficult long-term goal. As described in Nature, the researchers illustrated an innovative course to achieve molecular modeling that could turn out an early application of photonic quantum technologies. The new techniques harness an analogy between the vibrations of atoms in molecules and photons of light in optical chips. According to Bristol physicist Dr Anthony Laing, who headed the study, “We can think of the atoms in molecules as being connected by springs. Across the whole molecule, the connected atoms will collectively vibrate, like a complicated dance routine. At a quantum level, the energy of the dance goes up or down in well-defined levels, as if the beat of the music has moved up or down a notch. Each notch represents a quantum of vibration. Light also comes in quantised packets called photons. Mathematically, a quantum of light is like a quantum of molecular vibration. Using integrated chips, we can control the behaviour of photons very precisely. We can program a photonic chip to mimic the vibrations of a molecule. Dr Anthony Laing “We program the chip, mapping its components to the structure of a particular molecule, say ammonia, then simulate how a particular vibrational pattern evolves over some time interval. By taking many time intervals, we essentially build up a movie of the molecular dynamics.” Dr Laing added. Talking about the versatility of the simulator, first author Dr Chris Sparrow, who was a student on the project, stated that, “The chip can be reprogrammed in a few seconds to simulate different molecules. In these experiments we simulated the dynamics of ammonia and a type of formaldehyde, and other more exotic molecules. We simulated a water molecule reaching thermal equilibrium with its environment, and energy transport in a protein fragment. In this type of simulation, because time is a controllable parameter, we can immediately jump to the most interesting points of the movie. Or play the simulation in slow motion. We can even rewind the simulation to understand the origins of a particular vibrational pattern. Dr Chris Sparrow Joint first author, Dr Enrique Martín-Lopéz, who is at present a Senior Researcher with Nokia Bell Labs, added, “We were also able to show how a machine learning algorithm can identify the type of vibration that best breaks apart an ammonia molecule. A key feature of the photonic simulator that enables this is its tracking of energy moving through the molecule, from one localised vibration to another. Developing these quantum simulation techniques further has clear industrial relevance.” Japanese Telecoms company NTT fabricated the photonic chip used in the experiments. Dr Laing described the main directions for the future of the study, “Scaling up the simulators to a size where they can provide an advantage over conventional computing methods will likely require error correction or error mitigation techniques. And we want to further develop the sophistication of molecular model that we use as the program for the simulator. Part of this study was to demonstrate techniques that go beyond the standard harmonic approximation of molecular dynamics. We need to push these methods to increase the real-world accuracy of our models. This approach to quantum simulation uses analogies between photonics and molecular vibrations as a starting point. This gives us a head start in being able to implement interesting simulations. Building on this, we hope that we can realise quantum simulation and modelling tools that provide a practical advantage in the coming years. Dr Anthony Laing The researchers acknowledge support from the European Research Council (ERC). A.N. is thankful for support from the Wilkinson Foundation. J.C. is supported by EU H2020 Marie Sklodowska-Curie grant number 751016. Y.N.J. was supported by NSF grant number DMR-1054020. J.L.O’B. acknowledges a Royal Society Wolfson Merit Award and a Royal Academy of Engineering Chair in Emerging Technologies. A.L acknowledges a fellowship support from EPSRC. Credit: University of Bristol
<urn:uuid:fb6f2a22-64cc-43cf-9167-e32669d9e8d9>
CC-MAIN-2021-10
https://www.azoquantum.com/News.aspx?newsID=6075
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178383355.93/warc/CC-MAIN-20210308082315-20210308112315-00429.warc.gz
en
0.92564
1,168
3.75
4
January 20, 2006 feature Quantum Computing Steps Forward With the University of Michigan’s latest production of a quantum chip, it’s another step forward for quantum computers that will someday dwarf the abilities of today’s machines. Working with individual ions or atoms – much smaller than the transistors of even the most advanced microchips - quantum computers may be both more powerful and more compact than existing computers by various orders of magnitude. Common computers today are thousands of times more powerful and more compact than the first 30 ton behemoths, but they use virtually the same logic. The fundamental design has gone unchanged for 50 years. Quantum computing is whole new ball game. The secret lies in the almost magical property of quantum matter to adopt two states simultaneously. Normal integrated circuits store data using transistors which have just two states – on and off. Each quantum circuit, or qubit, can represent at least three states: on, off or both by an effect called quantum superposition. This means much more data can be stored on each individual circuit. Actually, qubits can potentially contain many states. Dr Andrew White, Senior Lecturer in Physics at University of Queensland describes a qubit like this: “A quantum computer takes that on or off state and adds many different possible states. The first thing, if you think of the globe, let the South Pole be on, the North Pole off – that’s not a very good description of the globe. A quantum computer let’s you describe information by saying, look, you can take an arrow from Earth’s center and point it at the North Pole, South Pole or Los Angeles or London, and that’s richer description. You can fit much more information on a single qubit.” Based on Dr. White’s description, a single qubit could replace a whole bank of conventional memory. Normal memory holds a large array of binary numbers expressed as on or off transistors – ones or zeros. Many transistors are needed to express anything more than just a simple number – hence today’s computers need for large memories. For example: you need 8 bits plus one bit for error correction to store the binary number for 256 which is expressed as 11111111. Going back to our globe example, our arrow could point to Amsterdam which could represent 256 – or any other number. A single qubit could store more information than thousands of transistors. This compact storage leads to another advantage: speed. Without the need to access many memory locations to read data, retrieval is almost instantaneous. Quantum computers will represent a huge leap in processing power as well – they could execute instructions exponentially faster because there would be almost no limit to the size of the instruction. Currently, most computers use 32 or 64 bit instructions. There is another exciting benefit to working with quantum reactions: Entanglement. It describes the ability of quantum matter to “link” two particles. Change one particle and the other changes – instantaneously, even though there is no physical connection! And distance may be irrelevant! This property – not fully understood – would enable computers to talk to each other with no time lag over long distances. Anton Zeilinger at the Institute of Experimental Physics in Vienna, Austria, preformed an experiment to demonstrate entanglement: their group strung an optical-fiber cable in a sewer tunnel under the Danube River with an "entangled" photon at each end. They measured of the state of polarization in one photon (horizontal, vertical, etc…) establishing that the other proton immediately had an identical polarization. What will be the difference to normal computer users? Try instant access to any type of data – whether it is in your computer or on the other side of the planet. As for processing power, few users rarely exceed the abilities of today’s computers. Much computer hardware is used to generate the fancy graphical interface we call Windows – with plenty left over in reserve. Those not familiar with computer science are often surprised to learn there are still a few applications that cannot run easily on today’s computers. They lack of sufficient processing power to do climate modeling, artificial intelligence or break strong encryption. The NSA (National Security Agency) would love to be able to break many a foreign power’s encrypted communications, but has been stymied by the lack of a sufficiently fast computer for the job. Experts estimate it would take more than the lifetime of the Universe using all the computers in the world to break a 1024 bit encryption key – the current standard for serious encryption applications. It’s worth noting that most commercial encryption only uses a 40 bit key. A quantum computer has the potential to break any encryption in a few days. Scientists who study global warming and climate would like to have finer-grained models to be able to predict the weather more effectively and determine the real impact man’s activities have over the planet. Current computers, although fast, still take hours or days to produce weather simulations that lack detail. Artificial intelligence is another field that could use the extra processing power. Current algorithms simply can’t be processed fast enough and, admittedly, may need more refining. However, a quantum computer could theoretically contain more processing power than the human brain in a smaller space – making true AI possible. In fact, more powerful computers often come along well before a use is found for them. In the future, more uses will be found for quantum machines as their tremendous processing power becomes available. But having the machine is not enough. All of today’s software is based on the silicon technology it runs on. New software is already being written to take advantage of quantum computation. One of the most important steps is to write software for error checking. All computers use some type of system to make sure a bit hasn’t accidentally “flopped” from a one to a zero. Quantum computer components, because of their atomic size, will be very susceptible to errors. In fact, one of the biggest problems faced by the scientists working on quantum computing is the problem associated with checking the state of an object so small. How does one check the value of a qubit without changing it? Error checking will be of critical importance and computer scientists have already developed some ideas to insure accuracy in quantum systems. They have also already developed algorithms and equipment for super strong quantum encryption designed to allow hacker-proof security for communications. The National Security Agency and Federal Reserve banks can now buy a quantum cryptographic system from several companies. Anyone who intercepts and tries to read the stream of photons used will disturb the photons in a way that is detectable to both sender and receiver. Quantum encryption represents the first major commercial implementation for what has become known as quantum information science - a blending of quantum mechanics and information theory. As for the software you use in day-to-day computing, no changes will be necessary. Just as software emulators permit Apple users to run Windows and Windows software on the Mac’s Power PC processor – albeit sacrificing some speed – an emulator could quite easily run any programs today at speeds that make the today’s fastest processors look frozen. So you won’t need to run out and buy Microsoft Office 2030 for Quantum Computers – although Bill Gates, if he’s still alive, might like that. It may also change the way we do computing. Like times past when computers were very expensive, we may share a large, centralized quantum computer – one that has the capacity to handle quadrillions of transactions. Connections would be via fiber optic connections and personal data – a whole lifetimes worth – could be stored on a quantum USB-type memory the size of a credit card. This would eliminate the need to have millions of PCs that require upgrading every few years. Don’t expect any of this to happen tomorrow. Scientists are still struggling with some tough problems. Which is the best material from which to make quantum systems? How to check qubit values and not lose the information at the same time? What mechanisms are involved in entanglement? Some experts predict it will be 20 years before we see the first fully functional computers that use quantum materials. No mater how long it takes, money will continue to flow into research efforts. Silicon-based processors are beginning to near the physical limit of smallness and speed. Intel’s best processors currently fabricated using .15 micron process and run 3GHZ. One day we may have more processing power than we know what to do with. It will be up to our imaginations – something no computer may ever accurately match - to think of new problems for these enormously powerful machines to solve. by Philip Dunn, Copyright 2005 PhysOrg.com
<urn:uuid:ad55af28-6c7e-4627-98f4-900368b557e0>
CC-MAIN-2021-10
https://phys.org/news/2006-01-quantum.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178361776.13/warc/CC-MAIN-20210228205741-20210228235741-00550.warc.gz
en
0.928287
1,800
3.71875
4
Physicists at LMU, together with colleagues at Saarland University, have successfully demonstrated the transport of an entangled state between an atom and a photon via an optic fiber over a distance of up to 20 km – thus setting a new record. ‘Entanglement’ describes a very particular type of quantum state which is not attributed to a single particle alone, but which is shared between two different particles. It irrevocably links their subsequent fates together – no matter how far apart they are – which famously led Albert Einstein to call the phenomenon as “spooky action at a distance”. Entanglement has become a cornerstone of new technologies based on effects at the quantum level and is distribution over long distances a central goal in quantum communication. Now LMU researchers led by physicist Harald Weinfurter, in collaboration with a team at the University of the Saarland in Saarbrücken, have shown that the entangled state of an atom and a photon can be transmitted via an optic fiber (like those used in telecommunications networks) over a distance of up to 20 km. The previous record was 700 meters. “The experiment represents a milestone, insofar as the distance covered confirms that quantum information can be distributed on a large scale with little loss,” says Weinfurter. “Our work therefore constitutes a crucial step toward the future realization of quantum networks.” Quantum networks essentially consist of quantum memories (made up of one or more atoms, for example) that act as nodes, and communication channels in which photons (light quanta) can propagate to link the nodes together. In their experiment, the researchers entangled a rubidium atom with a photon, and were able to detect the entangled state – which now shares the quantum properties of both particles – after its passage through a 20-km coil of optic fiber. The biggest problem the experimenters faced start with the properties of the rubidium atom. Following targeted excitation, these atoms emit photons with a wavelength of 780 nanometers, in the near-infrared region of the spectrum. “In an optic fiber made of glass, light at this wavelength is rapidly absorbed,” Weinfurter explains. Conventional telecommunications networks therefore make use of wavelengths around 1550 nanometers, which markedly reduces losses in transit. Obviously, this wavelength would also improve the experimenters’ chances of success. So Matthias Bock, a member of the group in Saarbrücken, built what is called a quantum frequency converter that was specifically designed to increase the wavelength of the emitted photons from 780 to 1520 nanometers. This task itself posed a number of extremely demanding technical challenges. For it was imperative to ensure that conversion from only a single photon to only one other photon happens and that none of the other properties of the entangled state, especially the polarization of the photon, were altered during the conversion process. Otherwise, the entangled state would be lost. “Thanks to the use of this highly efficient converter, we were able to maintain the entangled state over a much longer range at telecommunications wavelengths, and therefore to transport the quantum information that it carries over long distances,” says Weinfurter. In the next step, the researchers plan to frequency convert the light emitted by a second atom, which should enable them to generate entanglement between the two atoms over long telecommunications fibers. The properties of glass-fiber cables vary depending on factors such as the temperature and strain to which they are exposed. For this reason, the team intends to first carry out this experiment under controlled conditions in the laboratory. In the event of success, field experiments will be undertaken also adding new nodes to a growing network. After all, even long journeys can be successfully completely by taking one step at a time. Physical Review Letters, 2020 The Latest Updates from Bing News & Google News Go deeper with Bing News on: - IBM adds 10 historically Black colleges and universities to quantum computing centeron February 22, 2021 at 5:02 am The IBM-HBCU Quantum Center announced on Monday that it is adding 10 historically Black colleges and universities to the center's 13 founding institutions. The center was launched last fall with the ... - Encrypted Quantum Computing: When Ignorance Is Wantedon February 21, 2021 at 3:58 pm Quantum technologies for computers open up new concepts of preserving the privacy of input and output data of a computation. Scientists from the University of Vienna, the Singapore University of Techn ... - IBM Reveals Five Year Quantum Development Roadmapon February 18, 2021 at 7:47 pm Every year we get closer to mainstream use of quantum computers. IBM's Quantum roadmap shows how the company plans to make quantum accessible to more developers. - Quantum network is step towards ultrasecure interneton February 17, 2021 at 8:29 pm Physicists have taken a major step towards a future quantum version of the Internet by linking three quantum devices in a network. A quantum internet would enable ultrasecure communications and unlock ... - BP Joins IBM Quantum Networkon February 17, 2021 at 3:58 am BP (NYSE: BP) has announced that it has joined the IBM Quantum Network to advance the use of quantum computing in the energy industry. ... Go deeper with Google Headlines on: Go deeper with Bing News on: - China launches cloud-based quantum computing operating system software to challenge U.S. in technologyon February 22, 2021 at 11:31 pm Quantum computers achieve their immense power by replacing traditional bits with qubits, which can function as both a '1' and a '0' at the same time. - Researchers create 'beautiful marriage' of quantum enemieson February 22, 2021 at 1:45 pm Cornell University scientists have identified a new contender when it comes to quantum materials for computing and low-temperature electronics. - Planet Earth Report –“The Quantum Century to Events That Could Have Ended Humanity”on February 22, 2021 at 6:57 am Planet Earth Report” provides descriptive links to headline news by leading science journalists about the extraordinary discoveries, technology, people, and events changing our knowledge of Planet ... - B’luru institute takes big leap in quantum communicationon February 22, 2021 at 4:38 am Researchers team headed by Urbasi Sinha at the Raman Research Institute (RRI) successfully demonstrated free space quantum distribution key between two ... - Scientists create ‘beautiful marriage’ of quantum enemieson February 22, 2021 at 3:17 am Cornell scientists have identified a new contender when it comes to quantum materials for computing and low-temperature electronics. Using nitride-based materials, the researchers created a material ...
<urn:uuid:62bb4dc2-39a0-4f75-ada2-5874a9fe139f>
CC-MAIN-2021-10
https://innovationtoronto.com/2020/01/another-step-on-the-way-to-quantum-networks/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178355944.41/warc/CC-MAIN-20210226001221-20210226031221-00274.warc.gz
en
0.92637
1,388
3.5
4
Revising Moore's Law Ever noticed that computers become outdated remarkably quickly? It's nice to have increasingly powerful computers available, and very profitable for the computer industry to have a new model available every couple of years. So how do they manage to make it happen? Every new generation of computer hardware has roughly twice the processing power of the version two years before it. It's a phenomenon known as Moore's Law and it's held true for nearly 50 years. But could Moore's Law be coming to an end? Could we be reaching the limit of how fast computer processors can actually be? And if so, what then? Moore's Law states that the number of transistors that fit on a certain area on a computer chip doubles every two years. In the past few years, it's become clear that we're reaching the limit of just how small, and just how powerful, we can make processors. As a result, developers are now looking towards radical design changes, using exotic materials, and applying plenty of creative thinking in the quest for solutions. One of the fields attracting a lot of attention is the study of quantum behaviour of electrons and how this applies to computing. Existing (or "classical") computer hardware works by storing data in a binary format within transistors. The smallest piece of information – a "bit" – can have one of two states: "off" or "on", "0" or "1". Quantum computing, on the other hand, allows us to use many physical systems (such as electrons, photons, or tiny magnets) as quantum bits, or "qubits". These qubits can be engineered to contain the same binary information as classical bits – i.e. "0" or "1" – but, that's not all. Unlike any existing computer, one made of qubits can also encode an exponentially-larger amount of information than a simple binary state. Let's put this into perspective. Fourteen bits in your computer's central processing unit (CPU) can contain, well, 14 bits of binary information – 14 pieces of information which are either "0" or "1". Conversely, 14 qubits in a quantum computer can contain the equivalent of 214 bits of information. That's 16,384 bits, far more than the 14 pieces of binary information possible in a classical system. Let's take it one step further and use 300 qubits as an example. Three hundred qubits is the equivalent of 2300 classical bits which is approximately the same as the number of particles in the entire universe. So how can quantum bits store so much more information than classical bits? Well, it's all down to a phenomenon known as quantum entanglement. A quantum particle is said to be "entangled" with another when its properties are only defined in relation to the other. Two entangled quantum particles could be physically separated, but if you observe them individually you will find correlations between them that cannot be accounted for by assuming they act independently of each other. It may appear as if acting on one particle influences the other one instantly, even faster than the speed of light. In reality, the entanglement makes the particles acquire "non-local" properties. No "action at a distance" is required, and the principles of relativity (i.e. no information can be transported faster than the speed of light) are respected. Odd as this may sound, entangled particles create a distinguishable and legitimate state that can be used as a code to carry additional information without using additional bits. The availability of these entangled states is the reason quantum bits can encode exponentially more information that classical ones. While qubits can store an exponentially-greater amount of information than classical bits, quantum computing is still in its infancy. In fact, at the moment, there are only a few examples where quantum computers can be used to complete tasks more effectively than classical hardware. These include: - The ability to decipher encrypted information much faster than is currently possible - The ability to search an unsorted database quickly and effectively. The most advanced calculation done with quantum bits so far is the factoring of 15 = 3 × 5. This may seem unimpressive, but it proves that quantum computing can be used in this capacity. With more research and more time, we'll be able to factorise extremely large numbers – ones that are thousands of digits long – in a matter of minutes, rather than the millions of years it would take now. Given these limitations, it's not true to say that quantum computers will be able to replace existing computers. For one thing, the expected clock speed of a quantum computer is not likely to be any faster than that of a classical one. Therefore, if we run the the same algorithm on a quantum and on a classical computer, the classical one will usually win. Quantum computers will only be better if an algorithm exists where the presence of entangled quantum states can be exploited to reduce the number of steps required in a calculation. At this stage we don't know of any quantum algorithm to reduce the complexity of, say, web browsing or text editing, but the search is on. Regardless of how powerful and widespread quantum computers will be in decades to come, the basic research being undertaken to construct these machines is already very useful in the construction of classical systems. One of the most promising uses for quantum computing today involves the use of single atoms coupled to silicon transistors. That is, the exact same components used in classical computers but scaled to single atoms. In this way, many of the things we learn in the pursuit of a quantum computer can be reused for the purpose of pushing classical ones yet a step further in their miniaturisation. Quantum computing won't provide us with a replacement to classical computers if and when Moore's Law grinds to a halt. But it will help solve some interesting and challenging problems in computing. Andrea Morella is a senior lecturer in Quantum Nanosystems at University of New South Wales. This article first appeared in The Conversation on June 2. Republished with permission.
<urn:uuid:d379b8ee-52fc-46d4-b484-b68cd97f82df>
CC-MAIN-2021-10
https://www.eurekareport.com.au/investment-news/revising-moores-law/86840
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178357935.29/warc/CC-MAIN-20210226175238-20210226205238-00433.warc.gz
en
0.932246
1,253
3.515625
4
Welcome to Introduction to Quantum Computing. I am your guide, Associate Professor Chris Ferrie, a researcher in the UTS Centre for Quantum Software and Information. This is Lecture 6. It would probably be a good idea to have read the previous Lectures before continuing. What did you learn last week? In the last few weeks you completed your basic training. You now know about quantum information, how to write it, how to read it, and how to dial it up to 11 with entanglement. Entangled states were those that could not be written as product states (in any basis!). With multiqubit states and gates, you have all the tools you need to start creating quantum algorithms.What will you learn this week? This week you will be introduced to the first two canonical quantum protocols: superdense coding and teleportation. These demonstrate that entanglement can be used as a resource for some tasks. You’ll have your first taste as designing protocols as well as analysing them.What will you be able to do at the end of this week 5? At the end of this module, you should be able to answer these questions: What is superdense coding? What is quantum teleportation? What is entanglement swapping? How is entanglement a useful resource? Superdense coding by design Recall the Holevo theorem: a qubit cannot convey more than one bit of information. But… what if it could? Superdense coding is a communication protocol which allows two bits of information to be sent using a single qubit. How? Entanglement, of course! Simple communication protocols are usually phrased as a game with players named Alice and Bob. The players can have agreed upon strategies beforehand and are often constrained by what information they can pass to each other. There is always some goal, or win condition, that the players are working toward. In the case of superdense coding, Alice can only send a single qubit to Bob, but must convey two bits of information. They can meet beforehand and agree on some strategy, which they obviously must do since Holevo’s theorem proves that only a single bit of information can be conveyed with the qubit Alice sends. So, first of all, we know that two qubits are required. But Alice can only send one, and so Bob must possess the other. If the state of the two qubits is a product state, Bob’s qubit contains no information about the bits Alice needs to convey. So, the state must be entangled. We know how to do that with the Hadamard and CNOT gate. The first step in the protocol is for Alice and Bob to create a pre-arranged entangled pair of qubits. Once they are separated, Alice can only send back her qubit to Bob. Alice needs to perform some unitary on her qubit to encode the bits she wants to send. Whatever she ends up doing to her qubit, call it unitary U, we can see that Bob still needs to perform some action to decode the information contained in the pair. Why? Notice that, after Bob possesses both qubits, the state of the pair is U|0⟩⊗|0⟩ + U|1⟩⊗|1⟩, which is still entangled. To get a definitive answer, Bob has to disentangle the state to reduce it to one of the basis states, which depends on both bits. In other words, he must end up with the state |b₁⟩⊗|b₂⟩. Let’s assume he does this by inverting the original entangling operation. By working backwards through this computation, we can figure out what Alice needs to do for the whole protocol to work out as desired. That is, we start with |b₁⟩⊗|b₂⟩ and apply the inverse of each operation preceding it. In this case, both the Hadamard and CNOT are self-inverse, so we apply them to get. Now we need to find some operations of the form (U⊗I), where U depends on b₁b₂, to get back to our initial entangled state |0⟩⊗|0⟩ + |1⟩⊗|1⟩. We’ve done half the work in noticing that a Z gate can apply the phase. But at this point you might be thinking we are stuck since it’s Bob’s qubit that depends on b₂. However, a quick check shows that the symmetry of this entangled state saves us: |0⟩⊗|b₂⟩ + |1⟩⊗|¬b₂⟩ =|b₂⟩⊗|0⟩ + |¬b₂⟩⊗|1⟩. We can then see that the bit flipping X gate will get us back to the correct state. Putting it all together, the protocol looks like this. As an exercise, step through each gate of the algorithm to prove the the entire circuits acts as |0⟩⊗|0⟩ ↦|b₁⟩⊗|b₂⟩. It’s worth pausing here and asking, you know, why? Presumably sending two bits is much easier than sending a qubit to someone. But recalling Holevo’s theorem again, imagine that an eavesdropper intercepted the qubit. Could they decode the two-bit message? No! The most they could learn is one bit. Quantum entanglement enables secure quantum communication. Superdense coding allowed Alice to communicate two bits with one qubit. Quantum teleportation is in some sense the opposite — it allows Alice to instead communicate one qubit with two bits.This should be surprising — how could Alice communicate a qubit over the telephone with just two bits? Again, the answer is entanglement. Since it is so similar to superdense coding, I’ll just show you the circuit. Let’s approach this one from the perspective of verifying the circuit. We need to prove that the state in the first register |𝜓⟩ = 𝛼|0⟩ + 𝛽|1⟩ is the same as the same as the final register at the output. To do so, we step through applying one gate at a time. Not so bad, right? Now, I know you may be wondering what this has to do with teleportation, as seen on TV. Well, I have nothing to say that this webcomic doesn’t already say about that. Something cool about the teleportation protocol is that it preserves entanglement. That is, if the qubit Alice wants to send to Bob is entangled with another qubit held by, say, Charlie, the qubit Bob ends up with at the end of the protocol will be entangled with Charlie. In circuit form, it looks like this. But it gets even better! Imagine now Alice and Bob share an entangled pair, Alice and Charlie share an entangled pair, and Charlie and Diane share an entangled pair. So, we have 6 qubits in total and they are paired off. Alice and Bob perform the teleportation protocol, as do Charlie and Diane. Alice and Charlie measure — and hence collapse — each of the qubits they had initially entangled. Those qubits are individually teleported to Bob and Diane, respectively. But, here’s the kicker — Bob and Diane now share entanglement! This protocol is called entanglement swapping and I’m sure you can imagine how it might be useful in a quantum networking scenario. IBM’s Qiskit Textbook contains an introductory discussion of both Quantum Teleportation and Superdense Coding in Chapter 3.
<urn:uuid:90810447-6c64-4d55-ba3e-ebd58e16f8ab>
CC-MAIN-2021-10
https://csferrie.medium.com/my-first-quantum-protocol-de336d290322
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178389472.95/warc/CC-MAIN-20210309061538-20210309091538-00594.warc.gz
en
0.934616
1,694
3.578125
4
Quantum physics—the laws that govern the behavior of smallest components of our universe, such as fundamental particles, atoms and molecules—is admittedly a tough subject, a complicated path of intricate mathematics and scientific theory. Those outside the field who brave the journey often find themselves in a confusing place where the classical principles they learned in school no longer apply and the new rules seem…well…a bit unbelievable. In the quantum world, things can be in two places at once? Better yet, they can be two things at once? What??? If this has been your experience, don’t worry—you’re in very good company. Respected scientists, including Albert Einstein, felt the same way, and made many attempts to prove that these strange new theories couldn’t be correct. Each attempt, however, failed, and instead reinforced the reality of quantum physics in contrast to our conventional intuition. But this is good news—the properties buried in quantum theory hold great promise for exciting, real-world applications. So how do we make sense of these bizarre new rules? What really makes quantum physics so different, so strange, and so promising? To start, let’s take a look back to 1900 and the work of physicist Max Planck, who first drew back the curtain on the mysterious quantum world. That year, Planck was embroiled in a nagging physics problem—how to explain the radiation of light emanating from hot objects. At the time, there were two conflicting laws, neither of which was quite right. Sandwiching visible light on the electromagnetic spectrum are infrared waves, which have longer wavelengths and a lower frequency, and ultraviolet waves, which have shorter wavelengths and a higher frequency. One law—Wien’s law—could accurately predict the experimental results of ultraviolet waves, but fell apart when it came to infrared waves. Conversely, the Rayleigh-Jeans law covered infrared waves, but didn’t work for ultraviolet. What Planck needed, then, was one law that would correctly apply to both ends of the spectrum. For the birth of quantum physics, the details of Planck’s solution to this problem were far less important than the trick he used to arrive at it. This trick, which Planck later on called “happy guesswork,” was simple but unsettling: the radiation energy had to be chopped up into tiny packages, or particles of light. Based on everything physicists knew at the time, this claim was outrageous: light was understood as a wave, which left little space for particles of light, nowadays known as photons. So now light could be…both? While it was not his intent, Planck’s trick was the first step in a chain reaction that turned the physics world upside-down. We now understand that it’s not just light, but all of the fundamental components of our universe that embrace this dual nature and the other properties of the quantum world. To explain, let’s take another step back, this time to our early science education, and picture electrons—the negatively charged fundamental particles that, together with the positively charged protons and neutral neutrons, make up atoms. Are you picturing them as miniature billiard balls? What about a light wave? Do you imagine it as a tiny version of what comes crashing against the shoreline? These are convenient pictures, because they are easy to imagine. But what is your evidence that these mental pictures really describe the nature of an electron, and the nature of light? With your sensory perception, you cannot see a single electron, nor observe a light wave oscillate. And, as it turns out, neither light, nor electrons, nor atoms, nor even molecules are simply waves, or just particles. When it comes to strange quantum properties, this dual wave-particle nature is just the tip of the iceberg. One of the most striking concepts is that of quantum entanglement. It can be illustrated like this: imagine being the proud parent of two children, Susy and Sam, who have just hit the age of disagreeing with each other all the time. They both like mac & cheese as well as pizza. Sadly, this is no longer sufficient to guarantee a drama-free dinner. As a counter strategy, you and your partner team up and question Sam and Susy simultaneously in different rooms. This way, they cannot coordinate their dissent, and you have a 50 percent chance of random agreement on the dinner choice. Believe it or not, in the quantum world you would be doomed. In an experiment, the two parties could be photons, and the dinner question could be a measurement of their polarization. Polarization corresponds to the direction of oscillation—moving up and down or from side to side—when light behaves as a wave. Even if you separate the two parties, eliminating all communication, quantum physics allows for an invisible link between them known as entanglement. Quantum-Susy might change her answer from day to day (even pizza gets boring after a while), but every single time there is perfect anti-correlation with quantum-Sam’s answer: if one wants pizza, the other opts for mac & cheese—all the time! This is just one example of the many bizarre properties we know to be true based on careful calculation and experimentation. But if we’re so sure, why do we witness so little of the quantum world? Much of quantum physics happens at length scales so small that they remain hidden to us, even when using the most powerful microscopes. In addition, witnessing quantum physics at work turns out to be radically different from what you might call an “observation.” Seeing that an object is the color red is a fairly straightforward, unobtrusive process. Probing a quantum object like an electron or photon is an entirely different matter. True quantum behavior tends to be fragile, and attempting to measure it often constitutes a major but unavoidable disruption that usually prevents quantum weirdness from becoming directly visible. However, just because we cannot see quantum physics in action doesn’t mean that is hasn’t affected our lives in a tangible, positive way. The impact of quantum physics has been enormous: not only is it the prime common factor in nearly all physics Nobel Prizes awarded in the past one-hundred years, but it has also been a crucial driving force in technological advances ranging from lasers and superconductors to medical imaging like MRIs. Indeed, imagining a world in which quantum physics had never been discovered would amount to eliminating a lot of the technology we take for granted each and every day. The grandest vision, perhaps, is that of harnessing the power of quantum physics for a completely new kind of supercomputer. Such a quantum computer could solve tasks in a heartbeat that would currently require centuries of computation time on the fastest computers available today. Sounds intriguing? Many physicists around the world working on the hardware of such a machine would agree. (To learn more about what would make a quantum computer so powerful, check out the slideshow above.) They would also explain, however, how daunting the challenges are in this endeavor. Overcoming the fragile nature of quantum behavior is not an easy task—one that rivals the quantum leap of faith taken by Planck and his colleagues to bring us into this new and exciting world.
<urn:uuid:828f2e24-96cc-4bc2-a9aa-e674374c87f8>
CC-MAIN-2021-10
https://helix.northwestern.edu/article/why-quantum-physics-weird-and-stunningly-useful
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178363782.40/warc/CC-MAIN-20210302065019-20210302095019-00514.warc.gz
en
0.951854
1,498
3.75
4
What is Spintronics? Spintronics, also known as spin electronics, is an emerging solid-state device technology that exploits the intrinsic spin properties of an electron and its associated magnetic moment, in addition to the electron charge. Conventional electronic and semiconductor devices rely on the transport of electron charge carriers. Whereas, spintronics deal with spin-charge coupling in metallic systems with implications in the efficiency of data storage and transfer. Spintronic systems are of particular interest in the field of quantum computing and neuromorphic computing. Every electron can exist in one of the two spin states: spin-up and spin-down. In other words, electrons can rotate either clockwise or counterclockwise with constant frequency around its axis. They can represent 0 or 1 in logic operations. In ordinary materials, the spin-up magnetic moments cancel with spin-down magnetic moments and therefore are of no use for spintronics. Ferromagnetic materials are needed for spintronics which can provide a surplus accumulation of different spins in a tiny region called domains. These majority-up and majority-down domains are randomly scattered and with an externally applied magnetic field will line up the domains in the direction of the electric field. Spintronics is the driving technology behind next-generation nano-electronic devices to increase their memory and processing capabilities while reducing power consumption. In these devices, the spin polarization is controlled either by magnetic layers or via spin-orbit coupling. Spin waves, also known as magnons, can be used to carry spin current without causing heat. Spintronics is also used in the semiconductor industry to manufacture different types of transistors, lasers and integrated magnetic sensors. The miniaturization of microelectronic components is a basic necessity for semiconductor devices. However, over the years of miniaturization, the physical size of semiconductor electronics will soon approach a fundamental barrier. Therefore, device engineers and physicists feel that quantum mechanics can help in future miniaturization. After all, electronic spin is a quantum phenomenon. Spintronics combined with nanotechnology can be a perfect solution for the future miniaturized devices. Types of Spintronics - Metal-based spintronics: Giant-magneto resistance (GMR) in magnetic (metal) multilayers was discovered in 1988 and led to the birth of spintronics. GMR based metal spintronics became the standard technology for read-heads of hard disk drives. Later, a large tunnel-magnetoresistance (TMR) between two magnetic metals separated by a thin insulator was demonstrated at room temperature in 1994. Magnetic tunnel junction (MTJ) is currently the preferred choice for manufacturing of magnetic random-access memory (MRAM) devices. - Semiconductor based spintronics: Despite rapid advancement in metal-based spintronics, a major focus was to find novel ways to generate and utilize spin-polarization currents in semiconductors. Doped semiconductor materials display dilute ferromagnetism, and strong ferromagnetism is essential for achieving spintronics. The selection of materials for semiconductor spintronics depends on the ability of the material to provide ferromagnetism at room temperature. Majority of the work is focussed on the use of GaAs (Gallium Arsenide) and InAs (Indium Arsenide) at semiconductor-semiconductor or semiconductor-metal interfaces. Spins in semiconductors can be easily manipulated and controlled. Spintronics based devices can easily integrate with existing semiconductor technology. Semiconductor spintronics combined with photonics and magnetics can provide multi-functional devices such as spin-transistors, spin-LEDs, memory devices, optical switches operating at terahertz frequencies and few other devices. There are different ways to create spin polarisation and harness spin degree of freedom in metals and semiconductors. Few important ways are listed below. - Spin-injection from a ferromagnetic material. - A stray field (magnetic or electric) can induce population difference of spin polarised electrons in ferromagnetic materials. - Electromagnetic waves such as circularly polarized light and microwave excite spin polarised electrons in semiconductors depending on optical selection rule. A spin-polarised electron current can be extended further to spin generation by electromagnetic waves, including spin pumping and high-frequency spin induction. - A thermal gradient can also produce spin polarised carrier flow using the spin Seebeck and Nernst effect, and this can be useful in energy harvesting. There are many spintronic based devices in the market ranging from transistors, oscillators, memory units to quantum computing. A few prominent devices are listed below. - Spin transistor: The basic idea of a spin transistor is to control the spin orientation by applying a gate voltage. A spin field-effect transistor (FETs) consists of ferromagnetic source and drain electrodes, a semiconductor channel that contains a layer of electrons, and a gate electrode attached to the semiconductor. The spin-polarised electrons are injected from the source electrode. The spin precision angle controls the flow of current. The gate electrode controls the rotation of the electron spin after entering the semiconductor channel. The success of these transistors depends on efficient injection of spin currents from a ferromagnetic metal to a semiconductor. - Quantum dots or Spin-based computers: In quantum dots, electron motion is quantized in all directions and conducting electrons are confined within the nano-meter distances. The charge and spin of electrons can be controlled in these dots. The spin of an electron confined to quantum dots can be used as quantum bits, and these arrays of quantum dots can serve to build quantum computers. Already, quantum dots are useful in electronic and optic devices such as quantum-dot lasers, memory chips, and also in quantum cryptography. - Hard disk drive (HDD) read head: GMR based HDD was introduced by IBM in 1997. Later, TMR based HDD was introduced by Seagate in 2005. A new head assisted magnetic recording (HAMR) drive was demonstrated by Seagate in 2012. - Magnetic Sensors: Magnetic sensors are used to detect position, angle, rotation and magnetic fields. These sensors are built mainly based on Hall, GMR and AMR effects. A highly sensitive magnetic sensor is used in magnetoencephalography to map the brain. Spintronics is at the verge of becoming a major technology for microelectronics. Many devices have started entering the market with a recent launch of magnetic memory production at a large scale. However, there is a need for further improvements in spintronic device applications, and few are noted below. - Development of low power devices - Unconventional computing such as stochastic computing using spintronic devices - Energy harvesting using spin-diodes or spin-caloritronics - Need for the development of artificial neurons and synapses based on spintronic devices.
<urn:uuid:c70bfd27-fa81-4d1c-a049-93c4e4259681>
CC-MAIN-2021-10
https://www.ssla.co.uk/spintronics/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178364932.30/warc/CC-MAIN-20210302221633-20210303011633-00318.warc.gz
en
0.901134
1,435
3.71875
4
Google’s recent announcement that its quantum computer had achieved “quantum supremacy” has garnered significant global attention. And for good reason. Sycamore, Google’s 53-bit quantum computer reportedly performed a 200-second calculation in the time it would have taken the world’s fastest supercomputer, the IBM Summit, 10,000 years. Beyond conventional silicon computers, quantum computers represent a new era in the evolution of computational technology. Nonetheless, the challenges confronting the field suggest that there is a very long way to go. Born out of the thinking of Max Planck, Niels Bohr, and Albert Einstein, quantum theory offers new and unexplored potential for driving the evolution of computer science. Quantum computers operate on completely different principles compared to their conventional counterparts. Where classical computers are fast and efficient, they are simply not very good at problems that involve exponential complexity. Quantum researchers utilize the properties of electrons as an engine for performing exponentially fast calculations. Quantum computers are expected to transform cryptography, pattern matching, drug discovery and ultimately boost artificial intelligence (AI) training. However, the current generation of quantum computers are extremely sensitive to perturbations, noise, and other environmental effects that can cause their “quantum state” to waver and disappear— an effect referred to as decoherence. Contemporary quantum computers require exacting demands of stability and temperature for maintaining quantum states. In fact, researchers have only been able to maintain a quantum state for a tiny fraction of a second— not long enough to carry out a useful algorithm. This instability remains the biggest challenge facing quantum computing. Designing a quantum computer with qubits Research on quantum computing remains at a very early stage. Much like the traditional computers introduced in the 1950s, quantum computers remain big, clunky machines. The most common design of quantum computers rely on multiple layers of superconducting circuits sequestered in a controlled environment and cooled step-wise to temperatures colder than deep space. Where a conventional computer uses transistors as a substrate for information processing, quantum computers can use anything that demonstrates quantum behavior. This can include an atom, a molecule, or more commonly, an electron. Due to “superposition”, quantum computers can perform multiple calculations at once, giving them the potential to be exponentially more powerful than conventional computers. Superposition is best understood as the capacity for electrons to be at different positions at the same time. Quantum computers leverage the superposition of quantum states to manage calculations on orders of magnitude faster than silicon processors. As demonstrated by the famous double-slit experiment involving a single photon of light, photons may produce a wavelike interference pattern or superposition of all available paths. The most common quantum computers today leverage electrons to move beyond the binary logic of silicon computing. In conventional computing, information is stored as bits and exist as either ones or zeros. Unlike a conventional bit, the quantum bit or qubit can store and manipulate much more information than just ones and zeros. For example, A 10-qubit quantum computer can process 1,024 possible inputs at once (instead of analyzing them one at a time). The magic of qubits is that they can exist in superposition, or in multiple states at once. Using the example of Schrödinger’s cat, any given qubit can hold a 0 and a 1 at the same time. Thus, a single qubit can represent far more information than a binary bit. As an example, a four-qubit computer register can hold 16 different numbers simultaneously. Using code to manipulate electrons, many engineers are hoping to develop quantum algorithms to exploit the vast computational potential of quantum computers. Generally, the goal is to encode parts of a problem into a complex quantum state using qubits. Then, manipulating that state in order to drive it towards something that will eventually represent the solution. Solutions can be measured by collapsing the superpositions into deterministic sequences of zeros and ones. The race for high-performance quantum computers Quantum computers hold the promise of virtually limitless supercomputing power, pushing the envelope on supercomputing or high-performance computing (HPC). However, the current state of noisy quantum computers have a coherence time of a mere 100 microseconds. This is the maximum length of time in which an experiment can be run on a quantum processor before errors take over. The most common quantum computer designs today consist of superconductor computers and spin computers. Superconductors are the most well-established method for maintaining a quantum state: Metallic superconductors are used at near-zero temperatures in order to conduct electrons. Electrons must be free from all radiation or light particles and kept at a freezing temperature. Google’s quantum computer, for example, is cooled to an astonishing 460 degrees below zero. The more recent spin method of quantum computing uses a single electron within silicon to create qubits. Only a few nanometers in size, these electrons are called quantum dots and can operate at higher temperatures. In fact, a new silicon chip capable of manipulating the spin of a single electron could ultimately allow future quantum computers to be built using conventional electronic technology. Thanks largely to research by IBM, Google, Microsoft and others, the United States remains the leader in patents related to quantum computers. In the future, quantum computers are expected to become very good at highly specific problem-solving. Quantum computing performs best in probabilistic situations such as weather prediction, market forecasting, and breaking encryption. In the U.S., IBM and Google are racing to create the first truly useful quantum computer. In July 2016, Google engineers used a quantum device to simulate a hydrogen molecule. IBM is also working on developing quantum computing technologies and recently introduced the IBM Quantum Experience, a quantum computing platform delivered via the Cloud. Since 2016, IBM has provided researchers with a five-qubit cloud-based quantum computer and made its 20-qubit system available online at the end of 2017. In addition to IBM and Google, D-Wave, a Canadian company based in Vancouver has also been a leader in developing an early-stage quantum computer. D-Wave utilizes a method known as quantum annealing. Running adiabatic quantum computing algorithms, D-Wave’s machine finds a “good enough” or “local minima” solution. Volkswagen has leveraged D-Wave’s quantum annealing technology, using it to carry out research on traffic flow optimization in Beijing with 2,000 qubits. One very promising application of quantum technology is quantum communications. Researchers are working towards creating ultra-secure communication networks that could form the basis of a quantum internet. Where sensitive data is currently encrypted and transmitted using digital “keys” (1 and 0s), quantum communications has already demonstrated the capacity to secure encrypted information using qubits. Quantum key distribution (QKD), for example, combines digitally encrypted data with keys that are encoded and transmitted using quantum state using qubits. China has become a global leader in the drive to develop quantum communication technologies. Pouring vast sums of money into quantum research, China filed almost twice as many patents as the United States in the field of quantum technology in 2017 alone. That same year, the country launched a dedicated quantum communications satellite called Micius, staging the world’s first quantum key distribution-secured video conference between Beijing and Vienna. An arcane field only a decade ago, quantum computing has matured at an astonishing pace. As countries around the world continue to move the needle on supercomputing, we will likely see revolutionary applications in the field of quantum technology. Nonetheless, the mainstream application of quantum computing remains decades away. Quantum computing represents a revolution in computational technologies; that goes without saying. But there remains significant work ahead.
<urn:uuid:fd57ad02-c1f8-44d3-9a95-24b85304a419>
CC-MAIN-2021-10
https://netsmiami.com/a-deeper-dive-into-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178351374.10/warc/CC-MAIN-20210225153633-20210225183633-00200.warc.gz
en
0.907226
1,594
4
4
A team of researchers at The University of Texas at Austin and the University of California, Riverside have found a way to produce a long-hypothesized phenomenon—the transfer of energy between silicon and organic, carbon-based molecules—in a breakthrough that has implications for information storage in quantum computing, solar energy conversion, and medical imaging. The research is described in a paper out today in the journal Nature Chemistry. Silicon is one of the planet’s most abundant materials and a critical component in everything from the semiconductors that power our computers to the cells used in nearly all solar energy panels. For all of its abilities, however, silicon has some problems when it comes to converting light into electricity. Different colors of light are comprised of photons, particles that carry light’s energy. Silicon can efficiently convert red photons into electricity, but with blue photons, which carry twice the energy of red photons, silicon loses most of the energy as heat. The new discovery provides scientists with a way to boost silicon’s efficiency by pairing it with a carbon-based material that converts blue photons into pairs of red photons that can be more efficiently used by silicon. This hybrid material can also be tweaked to operate in reverse, taking in red light and converting it into blue light, which has implications for medical treatments and quantum computing. “The organic molecule we’ve paired silicon with is a type of carbon ash called anthracene. It’s basically soot,” said Sean Roberts, a UT Austin assistant professor of chemistry. The paper describes a method for chemically connecting silicon to anthracene, creating a molecular power line that allows energy to transfer between the silicon and ash-like substance. “We now can finely tune this material to react to different wavelengths of light. Imagine, for quantum computing, being able to tweak and optimize a material to turn one blue photon into two red photons or two red photons into one blue. It’s perfect for information storage.” For four decades, scientists have hypothesized that pairing silicon with a type of organic material that better absorbs blue and green light efficiently could be the key to improving silicon’s ability to convert light into electricity. But simply layering the two materials never brought about the anticipated “spin-triplet exciton transfer,” a particular type of energy transfer from the carbon-based material to silicon, needed to realize this goal. Roberts and materials scientists at UC Riverside describe how they broke through the impasse with tiny chemical wires that connect silicon nanocrystals to anthracene, producing the predicted energy transfer between them for the first-time. “The challenge has been getting pairs of excited electrons out of these organic materials and into silicon. It can’t be done just by depositing one on top of the other,” Roberts said. “It takes building a new type of chemical interface between the silicon and this material to allow them to electronically communicate.” Roberts and his graduate student Emily Raulerson measured the effect in a specially designed molecule that attaches to a silicon nanocrystal, the innovation of collaborators Ming Lee Tang, Lorenzo Mangolini and Pan Xia of UC Riverside. Using an ultrafast laser, Roberts and Raulerson found that the new molecular wire between the two materials was not only fast, resilient and efficient, it could effectively transfer about 90% of the energy from the nanocrystal to the molecule. “We can use this chemistry to create materials that absorb and emit any color of light,” said Raulerson, who says that, with further fine-tuning, similar silicon nanocrystals tethered to a molecule could generate a variety of applications, from battery-less night-vision goggles to new miniature electronics. Other highly efficient processes of this sort, called photon up-conversion, previously relied on toxic materials. As the new approach uses exclusively non-toxic materials, it opens the door for applications in human medicine, bioimaging and environmentally sustainable technologies, something that Roberts and fellow UT Austin chemist Michael Rose are working towards. At UC Riverside, Tang’s lab pioneered how to attach the organic molecules to the silicon nanoparticles, and Mangolini’s group engineered the silicon nanocrystals. “The novelty is really how to get the two parts of this structure—the organic molecules and the quantum confined silicon nanocrystals—to work together,” said Mangolini, an associate professor of mechanical engineering. “We are the first group to really put the two together.” The paper’s other authors include Devin Coleman and Carter Gerke of UC Riverside. Reference: “Achieving spin-triplet exciton transfer between silicon and molecular acceptors for photon upconversion” by Pan Xia, Emily K. Raulerson, Devin Coleman, Carter S. Gerke, Lorenzo Mangolini, Ming Lee Tang and Sean T. Roberts, 2 December 2019, Nature Chemistry. Funding for the research was provided by the National Science Foundation, the Robert A. Welch Foundation, the Research Corporation for Science Advancement, the Air Force Office of Scientific Research and the Department of Energy. Additionally, Raulerson holds the Leon O. Morgan Graduate Fellowship at UT Austin.
<urn:uuid:eb2e028d-778f-4b1d-87dd-1fc2bc241542>
CC-MAIN-2021-10
https://scitechdaily.com/new-way-to-split-and-sum-photons-with-silicon-is-breakthrough-for-quantum-computing-solar-energy/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178365454.63/warc/CC-MAIN-20210303042832-20210303072832-00041.warc.gz
en
0.913998
1,080
3.703125
4
As engineers and researchers work on developing and perfecting their machine learning and AI algorithms, the end goal is ultimately to recreate the human brain. The most perfect AI imaginable would be able to process the world around us through typical sensory input but leverage the storage and computing strengths of supercomputers. With that end goal in mind, it's not hard to understand the ways that AI is evolving as it continues to be developed. Deep learning AI is able to interpret patterns and derive conclusions. In essence, it's learning how to mimic the way that humans process the world around us. That said, from the onset, AIs generally need typical computer input, like coded data. Developing AIs that can process the world through audio and visual input, sensory input, is a much harder task. In order to understand artificial intelligence in the context of a perception-based interface, we need to understand what the end goal is. We need to understand how the brain is modeled and works. Our brain from a computer's perspective Our brains are essentially the world's most powerful supercomputers, except for the fact that they're made out of organic material, rather than silicon and other materials. Our right brain is largely perception-based, it's focused on the interpretation of environmental inputs like taste, feel, sound, sight, etc. Our left brain, on the other hand, is focused on rational thought. Our senses provide patterns to our right brain, and to our left brain, those senses provide the rationale for decision making. In a sense, we have two AIs in our head that work together to create a logical, yet also emotionally swayed machine. Human intelligence and our definition of what an intelligent thing is all drawback to how we ourselves process the world. In order for artificial intelligence to truly succeed, that is to be the best version of itself that it can be, then it needs to be intelligent from a human perspective. All this draws back to modern AI in a simple way, AI is programmed in how to make a decision. Machine learning algorithms allow code to be pseudo-organically generated so that algorithms can "learn" in a sense. All of this programming is based on reasoning, on "if, then, do this." Arguably, our brain's decision-making process is just as much based on emotions and feeling as it is reason. Emotional intelligence is a significant portion of what makes intelligence. It's the ability to read a situation, to understand other human's emotions and reactions. In order for AIs to evolve and be the best possible algorithm, they need to be able to process sensory input and emotion. Integrating emotional & human intelligence into modern AI Most artificial intelligence systems are primarily created on the foundation of deep learning algorithms. This is the means of exposing a computer program to thousands of examples and AI learning how to solve problems through this process. Deep learning can be boiled down to teaching a computer how to be smart. After any given deep learning phase for an AI, the system can perceive the inputs that it was trained on and make decisions therein. The decision-making tree that the AI forms from traditional deep learning mimics the way the right side of our brain works. It is based on the perception of inputs, of pseudo-senses. Deep learning is a way of getting computers to reason, not just with if-then statements, but through the understanding of the situation. That said, the current situations AI are being trained on aren't as complex as interpreting a conversation with Becky to see if she's into you. Rather it's more along the lines of is this a dark cat, a black bag, or the night sky. Primitive, but still sensory perception... While deep learning is currently heavily focused on one pathway, meaning AIs are developing specialties, eventually it won't be too far fetched to start training AIs on multiple things at once. Just like a toddler might learn colors and numbers at the same time. Expanding this out, as computer processing power grows, perhaps accelerated by practical quantum computing, there's no question that AIs will evolve to become more human. Understanding what this all means Advanced AI will continue to deal with understanding and processing patterns from the world around us. Through this, it will develop more complex models on how to process that information. In a sense, AIs are like toddlers, but soon they're going to be teenagers, and eventually, they may graduate with a doctorate. All figuratively of course... though, an age where an AI graduates a university probably isn't that far off. When we think about intelligent humans, we usually think of the most rationally minded people. Yet, we miss out on what is so unique about human intelligence – creativity. In a sense, we take for granted our creativity, yet it is the thing that makes us the most intelligent of living beings. Our ability to process situations, not just understand what the sum of two numbers is, is what makes us uniquely intelligent. So uniquely intelligent that we can design and create artificially intelligent beings that will soon be able to match our human intelligence. While modern AIs are primarily focused on singular strands of intelligence, whether that be finding which picture contains a bicycle or which email is spam, we're already training AIs to be all-around smart, humanly smart.
<urn:uuid:8e65d9b5-2d9b-49aa-a0f3-36d981d9c303>
CC-MAIN-2021-10
https://interestingengineering.com/artificial-intelligence-is-evolving-to-process-the-world-like-humans
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178366477.52/warc/CC-MAIN-20210303073439-20210303103439-00401.warc.gz
en
0.955445
1,091
3.53125
4
In a paper slated for the July issue of Nature Physics, researchers at MIT, MIT Lincoln Laboratory, Japan’s Institute of Physical and Chemical Research and NEC describe a new technique that extends the time a qubit can stay in superposition. Perhaps even more important, the same technique can be used to measure the physical characteristics of qubits that knock them out of superposition in the first place, paving the way to better qubit designs. Researchers have sought to realize qubits in a variety of ways, from test tubes full of molecules sandwiched between powerful magnets to trapped ions manipulated by lasers. The MIT researchers and their colleagues instead used a superconducting circuit, made from several layers of aluminum deposited on a silicon wafer and cooled to just a fraction of a degree above absolute zero. Because of weird quantum effects, current flow through the circuit can be in superposition: The current is, in effect, flowing clockwise and counterclockwise at once. Before this new paper, the previous published record for keeping a superconducting qubit in superposition was less than 10 microseconds. By repeatedly zapping their qubit with microwave radiation, however, Jonas Bylander and Simon Gustavsson, both postdocs at MIT’s Research Laboratory of Electronics (RLE), and their colleagues were able to keep it alive for 23 microseconds. That may not sound like a very long time, but it’s much closer to the threshold qubits need to cross in order to perform useful computations. Margin of error Like a conventional computer program, a quantum computer program would be a long series of simple mathematical operations. What determines the minimum lifetime of a qubit, though, is not the time it takes to perform an operation but the time it takes to ensure that it performed the operation correctly. “Just as people had done in the ’50s for computer science with conventional classical computation, people have developed what are called error-correcting codes that correct for the errors that occur in these qubits,” says William Oliver, the senior staff member at Lincoln Laboratory and visiting scientist at RLE who led the new study. “To make those error-correction codes feasible, the qubit has to have some minimum lifetime. You could think of the error-correcting code itself as some kind of processing that you need to do to ensure the operation is performed correctly.” The main threat to the superposition of a qubit is the type of unwanted disturbance that electrical engineers call “noise.” It could be electrical noise from the cables used to program the qubit; it could be heat, or thermal noise (which cooling is intended to prevent); it could even be the electrical properties of impurities in the materials that constitute the qubit itself. By carefully controlling the rate at which they fire microwaves at the qubit, the researchers can filter out noise that occurs outside a narrow frequency band, preventing the qubit from falling out of superposition. By changing the rate at which they fire the microwaves, however, Bylander, Gustavsson and their colleagues can also measure exactly how much noise the qubit experiences within any given frequency band. Knowing the frequency profile of the noise could help physicists identify its sources and determine how to mitigate it. The technique could also be applied to other types of qubits, not just those that use superconducting circuits. One key to the system is carefully tailoring the shape of the microwave pulses so that firing them frequently won’t cause computational errors in the qubit. Compounding the problem is that the signal that controls the pulses has to travel to microwave emitters inside the refrigeration tank that helps cool the qubit. “You send some pulse down, [but] it might look different at the sample, because of imperfections in the wires,” Gustavsson says. Gustavsson found a way to “predistort” the signal so it would have the proper shape when it reached the qubit. Not only do the microwave pulses extend the qubit’s lifetime, but in an actual quantum computer, they would also instruct the qubits in the execution of their error-correcting code. The complexity of that code would vary according to the algorithm that the quantum computer is running, but it would probably have somewhere around 10,000 separate instructions. In the Nature Physics paper, the researchers report hitting their qubit with 250 microwave pulses. They say, however, that since the experiments reported in the paper, they’ve refined their system so that they can get in about 1,000 pulses before their qubit falls out of superposition. Yet Patrice Bertet, who researches superconducting quantum circuits for the French Atomic Energy Commission, says that using microwaves to extend the lifetime of a superconducting qubit is not the most interesting aspect of Bylander and his colleagues’ new paper. “It is an additional tool that has never been used before, and it’s a nice experiment, it’s a very good experiment,” Bertet says. But he points out that it is simply an extension of a technique that has been used successfully on other types of qubits. More intriguing, he says, is that “they are able to provide a rather detailed spectrum of the noise that the flux qubit sees.” “When the microscopic details [of noise] are a bit clearer, it might help fix it or fight it,” Bertet says. “It’s not yet clear how, but nevertheless, it’s good to know what enemy you fight.”
<urn:uuid:a038cc6d-8ee2-4aed-b322-dfca641dde61>
CC-MAIN-2021-10
https://news.mit.edu/2011/qubit-practical-0602
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178366969.45/warc/CC-MAIN-20210303134756-20210303164756-00123.warc.gz
en
0.950293
1,160
3.984375
4
Using two flat-top diamonds and a lot of pressure, scientists have forced a magnetic crystal into a spin liquid state, which may lead to insights into high-temperature superconductivity and quantum computing. It sounds like a riddle: What do you get if you take two small diamonds, put a small magnetic crystal between them and squeeze them together very slowly? The answer is a magnetic liquid, which seems counterintuitive. Liquids become solids under pressure, but not generally the other way around. But this unusual pivotal discovery, unveiled by a team of researchers working at the Advanced Photon Source (APS), a U.S. Department of Energy (DOE) Office of Science User Facility at DOE’s Argonne National Laboratory, may provide scientists with new insight into high-temperature superconductivity and quantum computing. Though scientists and engineers have been making use of superconducting materials for decades, the exact process by which high-temperature superconductors conduct electricity without resistance remains a quantum mechanical mystery. The telltale signs of a superconductor are a loss of resistance and a loss of magnetism. High-temperature superconductors can operate at temperatures above those of liquid nitrogen (-320 degrees Fahrenheit), making them attractive for lossless transmission lines in power grids and other applications in the energy sector. But no one really knows how high-temperature superconductors achieve this state. This knowledge is needed to increase these materials’ operating temperature towards ambient temperature, something that would be required for full-scale implementation of superconductors in energy-conserving power grids. “A quantum spin liquid is a superposition of spin states, fluctuating but entangled. It’s fair to say that this process, should it create a quantum spin liquid with quantum superposition, will have made a qubit, the basic building block of a quantum computer.” — Daniel Haskel, physicist and group leader, XSD One idea put forth in 1987 by the late theorist Phil Anderson of Princeton University involves putting materials into a quantum spin liquid state, which Anderson proposed could lead to high-temperature superconductivity. The key is the spins of the electrons in each of the material’s atoms, which under certain conditions can be nudged into a state where they become “frustrated” and unable to arrange themselves into an ordered pattern. To relieve this frustration, electron spin directions fluctuate in time, only aligning with neighboring spins for short periods of time, like a liquid. It is these fluctuations that may aid in the electron pair formation needed for high-temperature superconductivity. Pressure provides a way to “tune” the separation between electron spins and drive a magnet into a frustrated state where magnetism goes away at a certain pressure and a spin liquid emerges, according to Daniel Haskel, the physicist and group leader in Argonne’s X-ray Science Division (XSD) who led a research team through a series of experiments at the APS to do just that. The team included Argonne assistant physicist Gilberto Fabbris and physicists Jong-Woo Kim and Jung Ho Kim, all of XSD. Haskel is careful to say that his team’s results, recently published in Physical Review Letters, do not conclusively demonstrate the quantum nature of the spin liquid state, in which the atomic spins would continue to move even at absolute zero temperatures — more experiments would be needed to confirm that. But they do show that, by applying slow and steady pressure, some magnetic materials can be pushed into a state similar to a liquid, in which the electron spins become disordered and magnetism disappears, while preserving the crystalline arrangement of the atoms hosting the electron spins. Researchers are confident they have created a spin liquid, in which the electron spins are disordered, but are not certain if those spins are entangled, which would be a sign of a quantum spin liquid. If this is a quantum spin liquid, Haskel said, the ability to create one by this method would have wide implications. “Some types of quantum spin liquids can enable error-free quantum computing,” Haskel said. “A quantum spin liquid is a superposition of spin states, fluctuating but entangled. It’s fair to say that this process, should it create a quantum spin liquid with quantum superposition, will have made a qubit, the basic building block of a quantum computer.” So what did the team do, and how did they do it? That brings us back to the diamonds, part of a unique experimental setup at the APS. Researchers used two diamond anvils, cut in a similar way to what you’d see in jewelry stores, with a wide base and a narrower, flat edge. They positioned the smaller flat edges together, inserted a sample of magnetic material (in this case a strontium-iridium alloy) between them, and pushed. “The idea is that as you pressurize it, it brings the atoms closer together,” said Fabbris. “And since we can do that slowly, we can do that continuously, and we can measure the properties of the sample as we go up in pressure.” When Fabbris says that pressure was applied slowly, he isn’t kidding — each one of these experiments took about a week, he said, using a sample of about 100 microns in diameter, or about the width of a thin sheet of paper. Since researchers didn’t know at what pressure magnetism would disappear, they had to carefully measure with each very slight increase. And see it disappear they did, at around 20 gigapascals — equivalent to 200,000 atmospheres, or about 200 times more pressure than can be found at the bottom of the Mariana Trench in the Pacific Ocean, the deepest trench on Earth. The spins of the electrons remained correlated over short distances, like a liquid, but remained disordered even at temperatures as low as 1.5 Kelvin (-457 degrees Fahrenheit). The trick, Haskel said — and the key to creating a spin liquid state — was to preserve the crystalline order and symmetry of the atomic arrangement, since the unwanted effect of random disorder in atomic positions would have led to a different magnetic state, one without the unique properties of the spin liquid state. Haskel likens the electron spins to neighbors on a city block — as they get closer, they all want to make each other happy, changing their spin direction to match their neighbors’. The goal is to get them so close together that they cannot possibly keep all of their neighbors happy, thereby “frustrating” their spin interactions, while still maintaining the structure of the city block. The research team used the intense X-ray imaging capabilities of the APS to measure the magnetism of the sample, and according to Haskel and Fabbris, the APS is the only facility in the United States where such an experiment could be done. In particular, Fabbris said, the ability to focus in on one type of atom, ignoring all others, was crucial. “The samples are very small, and if you try to measure magnetism with other techniques in a university lab, you will pick up the magnetic signal from components in the diamond anvil cell,” Fabbris said. “The measurements we did are impossible without a light source like the APS. It is uniquely capable of this.” Now that the team has achieved a spin liquid state, what’s next? More experimentation is needed to see if a quantum spin liquid has been created. Future experiments will involve probing the nature of spin dynamics and correlations more directly in the spin liquid state. But the recent results, Haskel said, provide a path for realizing these elusive quantum states, one that could lead to new insights into superconductivity and quantum information sciences. Haskel also pointed forward to the APS Upgrade, a massive project that will see the instrument’s brightness increased up to 1,000 times. This, he said, will allow for much deeper probes into these fascinating states of matter. “It’s up to anyone’s imagination which surprising quantum mechanical effects are waiting to be discovered,” he said. Reference: “Possible Quantum Paramagnetism in Compressed Sr2IrO4” by D. Haskel, G. Fabbris, J. H. Kim, L. S. I. Veiga, J. R. L. Mardegan, C. A. Escanhoela, Jr., S. Chikara, V. Struzhkin, T. Senthil, B. J. Kim, G. Cao, and J.-W. Kim, 11 February 2020, Physical Review Letters.
<urn:uuid:a4dd394f-840a-4ce4-ae5f-25024b51cb3e>
CC-MAIN-2021-10
https://scitechdaily.com/counterintuitive-superconductivity-and-quantum-computing-breakthrough-using-pressure-to-make-liquid-magnetism/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178376144.64/warc/CC-MAIN-20210307044328-20210307074328-00204.warc.gz
en
0.9368
1,822
3.625
4
If you are looking for some great science websites for interactive learning, then these eleven plus sites should, at the very least, scratch and itch. Most of these are aimed at younger learners but some will be as, if not more, entertaining for adults. 1. Khan Academy is great for people of all ages Khan Academy is one of the best resources for STEM learning on the web. And, guess what? It is free. This interactive website is filled to the brim with fantastic content led by professionals and teachers who are experts on the content, with the occasional STEM celebrity appearances. There is not that much gamification on this website. Most of the learning is done through fun interactive quizzes. The site is perfect if you need to build on the current topics you are learning from at school or are an adult. Khan Academy has courses for every level, from elementary school to college. 2. Curiosity Machine will teach you about AI Curiosity Machine helps children build, share, and receive feedback from experts. Its main focus is on teaching children, and their parents, about the power of Artificial Intelligence. Its main focus to bring family members together to learn and build their own AI. It has a specific “Family Challenge” which is a “free, hands-on AI education program that brings families, schools, communities, and technology know-it-alls together to give everyone the chance to learn, play and create with AI.” Families will be guided through the basics of AI and are then encouraged to look around their local communities for potential problems to solve using their new skills. Proposals can then be submitted to win the competition. 3. Teachers TryScience is full of online experiments Teachers TryScience is a website specifically designed to spark any young mind’s interest in science, technology, engineering, and math. At its very core, it aims to bring design-based learning to children at home or at school. According to the website, it helps children “to solve a problem in environmental science, students might need to employ physics, chemistry, and earth science concepts and skills.” To this end, it has a large collection of interactive experiments, field trips, and other adventures. It also includes lesson plans, strategies, and tutorials for teachers to better help them deliver awe-inspiring science lessons for their ever-curious students. 4. The Exploratorium is the go-to site for interactive learning The Exploratorium is the website arm of the San Francisco Exploratorium. This site offers hands-on experiences that will help teach children about basic, and more complex, scientific principles. It covers subjects from many disciplines of science from biology and earth science to astronomy. The site also has a parent and teacher section that will provide free resources to help you plan and incorporate its interactive material to boost your child’s learning. 5. Science Kids will engage your kid’s mind Science Kids is another interactive learning website that focuses on teaching children the wonders of science. The site has a great variety of interactive science games covering subjects from living things to physical processes and everything in between. The great thing about this site’s content is that it not only educates young minds but helps them put that knowledge to practical use to cement it in their memory. One particularly useful game will have your child design and build a virtual electrical circuit. Each subject comes in modules that are then subdivided into subcategories. Living things, by way of example, is divided into food chains, microbes, and the human body, etc. 6. BrainPOP will do just that BrainPOP is the place for science learning and it’s very well designed to boot. It is a very active site for young students with a myriad of animations, movies, and short interactive quizzes. It covers topics like cellular life and genetics, ecology and behavior, forces of nature, our fragile environment, scientific inquiry, and paleontology and anthropology. So young aspiring scientist is bound to find something that will spark their interest. It also has some interactive coding lessons which are always fantastic ways to learn something they might not normally be exposed to. The site will have them hacking government websites in no time – only joking of course. 7. HHMI Biointeractive – it’s in the name HHMI‘s website is full of great 3-D interactive, virtual labs, and printable activities for you to use. Its material is both engaging and interesting for science-buffs of all ages. These guys are famed for their award-winning virtual labs and high-quality informative videos so you know you are in good hands. Their site includes “Click & Learn” activities that include embedded video clips and animations, videos all of which have stop points and assessments to help check you’ve been paying attention. 8. Annenberg Learner Interactives is a great resource for Earth Science students Annenberg Learner Interactives‘ Earth Science-related topics are full of great and easy-to-understand graphics and other interactive content. It has a good collection of interactive lessons cover the big things like the Earth’s structure to plate tectonics. The site also covers many other subjects within Earth Sciences, such as the Rock Cycle and Volcanoes, which really makes this subject come alive to any young student. It also has other resources for other scientific subjects with interactive games and other lessons. 9. National Geographic Kids is fun and educational Being created by National Geographic you know you can trust this site to be top quality. And it doesn’t disappoint. This site includes a large collection of videos, interactive activities, and fun games that will keep children of all ages engaged for hours on end. National Geographic Kids‘ site is broken down into helpful subcategories for ease of navigating your child’s learning. Each section contains extensive and informative write-ups on different animals from lions to whales supported with world-class National Geographic footage. Each section also includes memory games, quizzes, and other different activities to reinforce their learning by applying their new-found knowledge. 10. PhET Interactive Simulations is all about Physics simulations PhET Interactive Simulations is a real gem of an interactive and fun science-related website. Built and run by the University of Boulder, Colorado it has a vast collection of simulators covering most topics with physics from circuits to waves to quantum mechanics. Be warned, however, you might find yourself aimlessly playing around with variables without noticing hours of your precious time have passed by. Do not, we repeat do not, try the spring simulation it is too much fun. It also has some materials covering Earth Science, chemistry, and life sciences but these are far less extensive. 11. Wonderville is great for all ages Wonderville is another great science-related website that is packed with interactive activities for children. According to the website Wonderville “makes learning science fun for kids. We help teachers teach and students learn. Used in 170 countries, our awarding-winning STEM content helps create lifelong learners.” Other than fun and entertaining games it also has a very good blog for the more curious children who want to go deeper into a subject. Adults love using Brilliant.org. The interactive games on this website do not try to teach you through memorization. The Brilliant team is dedicated to teaching you how to think critically about STEM topics. From Geometry to Quantum Computing, this website is an excellent way to spend your free time, if you are a dedicated life-long learner. Scientific Thinking is one of our favorite courses on Brilliant.Org. 13. The Raspberry Pi Foundation Raspberry Pi is a powerful but tiny affordable computer that can be used to do everything from creating your own DIY projects at home to learning programming. The mini-computer is great for both kids and adults interested in getting into the science of computing and programming. The projects pages have a wide range of projects for people of any age. You will have to get your hands on one of the many Rasberry Pi computers to get started. But they are cheap!
<urn:uuid:f1795aa1-89b8-45b9-aad1-290ec52010dc>
CC-MAIN-2021-10
https://www.sapiensdigital.com/11-best-science-websites-for-interactive-learning.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178364027.59/warc/CC-MAIN-20210302160319-20210302190319-00604.warc.gz
en
0.939395
1,693
3.53125
4
Cryptography is the backbone of our current digital society, but how did it become so important? Interestingly, the systematic study of cryptography as a science (and perhaps as an art) started only during the past 100 years. The word cryptography is derived from the Greek krypto and graphein, which mean hide and writing. The first type of cryptography was simple writing, since the majority of people could not read (New World, 2007). Later, most of the great civilizations used some kind of cryptography to transfer important private information. The earliest form of cryptography was a cipher (a cipher is an algorithm used for encryption or decryption). Ciphers had the problem of being easily broken using the frequency of the letters, and once a generalized way of breaking them was found they became obsolete. Middle ages to today The next big advance came in the 1600s when the first cryptographic key was recorded, which caused a big shift in the space, moving the importance from hiding the system to hiding the key. The system could be public, but one could still not read the message without the key. That overcame the problem of a system as a whole becoming obsolete with the discovery of its mechanism. Then, during 19th Century the first use of a rotor for encryption was recorded. In the 20th Century the invention of the Enigma machine (used by the German military during WWII) was a technical milestone, being one of the hardest ciphers to break. However, that too was eventually broken by Poland, and British cryptographers designed a means to obtain the daily key. After the war, cryptography found its way into commercial applications, with IBM being the first company to systematically develop a crypto-group and what ended up being the first U.S. standard for encryption. The standard, though, was short-lived, since it was also broken by a simple but very powerful method called a brute-force attack. Brute-force involves simply trying all the possible combinations in a very computationally intensive process. That is also why advances in computing power are followed by increases in the complexity of the private keys. Cryptography has been a continuous game of chase between the complexity of the cryptographic keys and the computing power available. In principle, any key is vulnerable to a brute-force attack; the more complex the key the more time consuming such an attack is. The importance of cryptography in the digital age Advances in technology and computing power have enabled people to move more and more of their data to the digital sphere. Moving data through any digital means—aside from the obvious advantage it brings to speed, accessibility, and convenience—comes with the mirroring disadvantage of being harder to protect. The need to protect digital data from being used for unlawful purposes is being tackled by cryptography. However, as with all rights, there are competing interests. Law enforcement has a legitimate right to intercept communications in certain circumstances. Balancing these rights requires a balance known as the tightrope between security and privacy. The importance of cryptography can be summarized by the fact that it is the only tool the user has in the digital world to protect their private data. And as we move more and more into the digital world, by association, cryptography is becoming more and more important. The state of cryptography today Today the need to communicate with parties we cannot necessarily trust, has given rise to “public-key cryptography” or “asymmetric cryptography.” This kind of cryptography relies on public keys that the sender uses to encrypt the message and private keys which the receiver has and uses to decipher the message. This process is one-way, meaning that no one else can decipher the message. Even these state-of-the-art methods are still breakable. If nothing else, an algorithm can be broken by a brute-force attack that cycles through every possible key. Therefore, the goal of present-day cryptography is to create algorithms that make it computationally infeasible for an attacker to recover the private key. What about privacy? Even though state-of-the-art cryptographic protocols are virtually unbreakable because of required computing time, companies and individuals are ever in search of more ways to transact more privately. Recently, with advances in computing power and cryptography, trust has become a new target for individuals and organizations concerned with privacy. Cryptographers have thought that if it is possible to encrypt and effectively hide the data from people who don’t have to see it, perhaps there is a way to still transact with them without showing the data. And sure enough, during the 1980s tools such as zero-knowledge proofs and calculations on encrypted data were discovered. By applying mathematical transformations to the underlying data, these tools enable people to interact with and validate encrypted data, effectively creating another revolution in the field. Now the data exchange can be private, even between parties that transact directly. Increased efficiency for high-demand protocols In 2012 Project Pinocchio from IBM and Microsoft found a way to reduce the computing needs of a zero-knowledge proof by 20x and for zero-knowledge verification by more than 50x, making it efficient enough for practical uses. It now can be used to hide the data between two parties and still allow them to transact, not only theoretically, but fast enough to have private and commercial applications. This breakthrough opened new possibilities to businesses and researchers, who started wondering what other applications are within reach and what other technological possibilities exist. That same curiosity is what drove us at decentriq to explore these technologies in the first place. Our team develops novel implementations for cutting-edge and privacy-preserving technologies. We explore applications such as: - Secure and private online voting - Augmented privacy for exchanges, enabling them to not have to reveal their whole order book - A bulletproof way for anyone to provide a proof of cryptographic assets without ever revealing the funds available in one’s account - A marketplace for alternative data providers and buyers that enables the business to try the data before deciding to buy it, while keeping the data hidden - Making possible a demonstration of the predictive power of a model on new data without disclosing the model or the data All these applications are made possible by recent and ongoing research, both by decentriq and by third-party open-source projects fueled by demand for increased security and privacy in individual and commercial datasets. What does the future of cryptography hold? These cutting-edge discoveries and advancements in cryptography are cultivating an exciting future for the field. What appears to be the biggest change on the horizon is quantum computing. Quantum computing, using the properties of the superpositioned particles, is able to exponentially increase the computing power available to us. That means the cryptographic transformations that today are inefficient to run on a silicon chip can be run efficiently on a quantum chip, potentially rendering today’s encryption obsolete. Today, we encrypt data as it travels over the internet and when it is at rest on a storage device. But we have to decrypt data to use or analyze it, creating a potential security vulnerability. Homomorphic encryption is a new idea that solves that problem, allowing users to process data without decrypting it. With homomorphic encryption, we process encrypted data and produce encrypted results. And while this is not a novel idea, new breakthroughs that vastly improved performance brought the possibility of efficient encrypted data processing back to the forefront. Thus, the chase continues. The advances in quantum computing have given rise to quantum encryption, which uses the properties of quantum particles to ensure unbreakable encryption. There are already several projects working on quantum encryption and how it can be implemented. Even though quantum computing at scale may be many years away, we at decentriq follow the technology closely to make sure we are ahead of the curve for our customers when the time comes. Nevertheless, until then, we apply our cryptographic skills to the betterment of cutting-edge protocols, making them more efficient, user-friendly, and wider known to everyone who could benefit from them. We believe that in a world where the most valuable asset is information, it is worth exploring novel technological uses for confidential computing to protect it.
<urn:uuid:6623ba7e-381d-4a2a-a34f-79bed7723229>
CC-MAIN-2021-10
https://blog.decentriq.com/evolution-of-cryptography/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178375529.62/warc/CC-MAIN-20210306223236-20210307013236-00486.warc.gz
en
0.954277
1,659
3.671875
4
Only a few decades ago quantum technology was purely a theoretical thing. Something that scientists dreamed of. But now, even though this is still an emerging field of physics and engineering, we are very close to breaking through a great discovery. Just in October 2019 Google “performed a test computation in just 200 seconds that would have taken the best-known algorithms in the most powerful supercomputers thousands of years to accomplish.” The effects of quantum computing are promising in various areas because mathematical operations can be done considerably faster than the most powerful supercomputers to date can. This is achieved by relying on the principles of quantum physics. While this might seem like something arcane or even impossible for some people, quantum-based technology has been in use for a while now. Take MRI machines, for example, which creates images based on the spinning of atoms inside our body. Or some more common technology, like one that can fit in your hand and could tell you where you are at any given time, GPSs. Global Positioning Systems are based on quantum theory. Among all the quantum technology, the one that causes more controversy is the quantum computer. Its immense computing power can be used to speed up research in medical fields, test more efficient building materials, have better control over certain processes, create algorithms that solve complex problems. If you think of some of the listed benefits the impact of quantum computing does not seem all that bad. So, why the controversy? What are the negative effects of quantum computing? To understand this better we have to first answer one question. What is quantum computing? The fact that there are quantum computers that can already operate 1 trillion times faster than what a supercomputer can, has led people to seriously question the cybersecurity implications of quantum computing. Traditional computers store data in bits. Based on a binary system in which values are 1 an 0 which can translate to 2 states of information, positive and negative respectively. This binary system in which traditional computing is based is also the reason why Megabytes are composed of 1,024 Kilobytes, instead of just 1000 as the name implies. Although current computing is not obsolete, the binary system has some limitations when it comes to really long and complex operations. Like the ones in which thousands of variables have an impact. On the other hand, we have quantum-based computers that use qubits instead of bits. The interesting thing about these qubits is that they do not only represent a 1 or a 0 value, but they can also exist as both values at the same time. This quantum property is called superposition, the ability to be positive and negative at the same time. One of the ways that a qubit can be created is by using superconductivity to maintain a quantum state of the particle. To achieve this, the superconducting qubits must be kept extremely cold, even colder than the vacuum of space. In fact, an absolute 0 cold, or as close as possible. This is the lowest limit of temperature according to the laws of thermodynamics. 0° Kelvin is about -273.15 C°. At this temperature, the matter has no heat or vibration remaining in it and it only retains its quantum mechanical properties. Although interesting, that might not seem like very useful information for the casual observer. The most important piece of information that you need to understand quantum computing is that it leverages the quantum properties of particles to exponentially increase processing power. For instance, Google’s Sycamore Quantum Computer completed a complex computation in 200 seconds. A calculation that would take even the most powerful supercomputers an estimate of 10,000 years. Although that calculation does not have any real use outside the world of quantum computing it is still pretty impressive. However, that same capability of solving complex problems becomes a menace when it is faced with mathematical problems that should not be solved. This is the case with cybersecurity. Although Cybersecurity is composed of various components, including best practices, encryption is fundamental to maintain information unreadable for unwanted eyes. Current encryption is based on mathematical formulas that transform this clear data into an encrypted message that is supposed to be secure. This way you can transmit or store information and no one without the proper digital key will be able to access it. Breaking an encryption key is a mathematically daunting task. To the point of being considered impossible to achieve by today’s computing power. The most straightforward way to break an encryption code is to try all the possible keys until you get the right one. It would seem simple, but imagine this: A simple 64 bits encryption has 1,800,000,000,000,000,000 possible combinations. A 128 bits encryption code has more than 300 undecillion possible solutions. Even the world’s fastest supercomputer would take an estimate of a trillion years to find that key. Up to a certain extent, conventional computers can do this. In July 2002 a group announced that it had uncovered a symmetric 64-bit key. However, it took 300,000 people and more than four and a half years of work to achieve this. A quantum computing method called Grover’s algorithm, however, speeds up the process, turning that 128-bit key into the quantum-computational equivalent of a 64-bit key. The defense is straightforward, though: Make keys longer. A 256-bit key, for example, has the same security against a quantum attack as a 128-bit key has against a conventional attack. Under these terms, a quantum computer that can operate trillions of times faster than the fastest supercomputer becomes a game-changer. Encryption is vital to cybersecurity and privacy, at a personal level, at a corporate level and even at a government level. Although the current most secure encryption methods (256 bits) will not become useless against quantum computing, it’s security will be considerably weakened. The implications of quantum computing in cybersecurity are tangible and it puts a lot more than just a text message at risk. We can only expect that as this technology evolves, encryption methods will evolve as well.
<urn:uuid:1306b6a4-f067-4167-98a6-0e58700a6d5a>
CC-MAIN-2021-10
https://tiktechtalk.com/what-is-a-quantum-computer-and-why-is-it-a-cybersecurity-game-changer/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178369420.71/warc/CC-MAIN-20210304143817-20210304173817-00008.warc.gz
en
0.948829
1,251
3.515625
4
Physicists at MIT and elsewhere have observed evidence of Majorana fermions — particles that are theorized to also be their own antiparticle — on the surface of a common metal: gold. This is the first sighting of Majorana fermions on a platform that can potentially be scaled up. The results, published in the Proceedings of the National Academy of Sciences, are a major step toward isolating the particles as stable, error-proof qubits for quantum computing. In particle physics, fermions are a class of elementary particles that includes electrons, protons, neutrons, and quarks, all of which make up the building blocks of matter. For the most part, these particles are considered Dirac fermions, after the English physicist Paul Dirac, who first predicted that all fermionic fundamental particles should have a counterpart, somewhere in the universe, in the form of an antiparticle — essentially, an identical twin of opposite charge. In 1937, the Italian theoretical physicist Ettore Majorana extended Dirac’s theory, predicting that among fermions, there should be some particles, since named Majorana fermions, that are indistinguishable from their antiparticles. Mysteriously, the physicist disappeared during a ferry trip off the Italian coast just a year after making his prediction. Scientists have been looking for Majorana’s enigmatic particle ever since. It has been suggested, but not proven, that the neutrino may be a Majorana particle. On the other hand, theorists have predicted that Majorana fermions may also exist in solids under special conditions. Now the MIT-led team has observed evidence of Majorana fermions in a material system they designed and fabricated, which consists of nanowires of gold grown atop a superconducting material, vanadium, and dotted with small, ferromagnetic “islands” of europium sulfide. When the researchers scanned the surface near the islands, they saw signature signal spikes near zero energy on the very top surface of gold that, according to theory, should only be generated by pairs of Majorana fermions. “Majorana ferminons are these exotic things, that have long been a dream to see, and we now see them in a very simple material — gold,” says Jagadeesh Moodera, a senior research scientist in MIT’s Department of Physics. “We’ve shown they are there, and stable, and easily scalable.” “The next push will be to take these objects and make them into qubits, which would be huge progress toward practical quantum computing,” adds co-author Patrick Lee, the William and Emma Rogers Professor of Physics at MIT. Lee and Moodera’s coauthors include former MIT postdoc and first author Sujit Manna (currently on the faculty at the Indian Institute of Technology at Delhi), and former MIT postdoc Peng Wei of University of California at Riverside, along with Yingming Xie and Kam Tuen Law of the Hong Kong University of Science and Technology. If they could be harnessed, Majorana fermions would be ideal as qubits, or individual computational units for quantum computers. The idea is that a qubit would be made of combinations of pairs of Majorana fermions, each of which would be separated from its partner. If noise errors affect one member of the pair, the other should remain unaffected, thereby preserving the integrity of the qubit and enabling it to correctly carry out a computation. Scientists have looked for Majorana fermions in semiconductors, the materials used in conventional, transistor-based computing. In their experiments, researchers have combined semiconductors with superconductors — materials through which electrons can travel without resistance. This combination imparts superconductive properties to conventional semiconductors, which physicists believe should induce particles in the semiconductor to split , forming the pair of Majorana fermions. “There are several material platforms where people believe they’ve seen Majorana particles,” Lee says. “The evidence is stronger and stronger, but it’s still not 100 percent proven.” What’s more, the semiconductor-based setups to date have been difficult to scale up to produce the thousands or millions of qubits needed for a practical quantum computer, because they require growing very precise crystals of semiconducting material and it is very challenging to turn these into high-quality superconductors. About a decade ago, Lee, working with his graduate student Andrew Potter, had an idea: Perhaps physicists might be able to observe Majorana fermions in metal, a material that readily becomes superconductive in proximity with a superconductor. Scientists routinely make metals, including gold, into superconductors. Lee’s idea was to see if gold’s surface state — its very top layer of atoms — could be made to be superconductive. If this could be achieved, then gold could serve as a clean, atomically precise system in which researchers could observe Majorana fermions. Lee proposed, based on Moodera’s prior work with ferromagnetic insulators, that if it were placed atop a superconductive surface state of gold, then researchers should have a good chance of clearly seeing signatures of Majorana fermions. “When we first proposed this, I couldn’t convince a lot of experimentalists to try it, because the technology was daunting,” says Lee who eventually partnered with Moodera’s experimental group to to secure crucial funding from the Templeton Foundation to realize the design. “Jagadeesh and Peng really had to reinvent the wheel. It was extremely courageous to jump into this, because it’s really a high-risk, but we think a high-payoff, thing.” Over the last few years, the researchers have characterized gold’s surface state and proved that it could work as a platform for observing Majorana fermions, after which the group began fabricating the setup that Lee envisioned years ago. They first grew a sheet of superconducting vanadium, on top of which they overlaid nanowires of gold layer, measuring about 4 nanometers thick. They tested the conductivity of gold’s very top layer, and found that it did, in fact, become superconductive in proximity with the vanadium. They then deposited over the gold nanowires “islands” of europium sulfide, a ferromagnetic material that is able to provide the needed internal magnetic fields to create the Majorana fermions. The team then applied a tiny voltage and used scanning tunneling microscopy, a specialized technique that enabled the researchers to scan the energy spectrum around each island on gold’s surface. Moodera and his colleagues then looked for a very specific energy signature that only Majorana fermions should produce, if they exist. In any superconducting material, electrons travel through at certain energy ranges. There is however a desert, or “energy gap” where there should be no electrons. If there is a spike inside this gap, it is very likely a signature of Majorana fermions. Looking through their data, the researchers observed spikes inside this energy gap on opposite ends of several islands along the the direction of the magnetic field, that were clear signatures of pairs of Majorana fermions. “We only see this spike on opposite sides of the island, as theory predicted,” Moodera says. “Anywhere else, you don’t see it.” “In my talks, I like to say that we are finding Majorana, on an island in a sea of gold,” Lee adds. Moodera says the team’s setup, requiring just three layers — gold sandwiched between a ferromagnet and a superconductor — is an “easily achievable, stable system” that should also be economically scalable compared to conventional, semiconductor-based approaches to generate qubits. “Seeing a pair of Majorana fermions is an important step toward making a qubit,” Wei says. “The next step is to make a qubit from these particles, and we now have some ideas for how to go about doing this.” The study was published in Massachusetts Institute of Technology
<urn:uuid:99ec6afb-ae6f-4ff2-abc4-e24801a5de16>
CC-MAIN-2021-10
https://www.sciencecover.com/4162-2-first-sighting-of-mysterious-majorana-fermion-on-a-common-metal/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178350942.3/warc/CC-MAIN-20210225095141-20210225125141-00490.warc.gz
en
0.942605
1,744
3.609375
4
Storing quantum bits of information, or qubits, is a lot harder than storing ordinary binary digits. It’s not simply ones or zeroes, but the whole range of subtle quantum superpositions between them. Electrons can easily slide out of those states if they’re not stored in the right materials, which is why electrical engineers at Princeton are working with a UK manufacturer to create a better storage material — synthetic diamonds — from scratch. They published an account of their success on Thursday in Science. For decades, physicists, materials engineers, and others have been trying to achieve the conceptual promise of quantum-encrypted communications because the data transferred in that process is theoretically immune to covert surveillance. Any attempt to observe that data between parties — à la the Heisenberg Uncertainty Principle — would fundamentally alter that information, quickly revealing that it was compromised. The problem has been storing and preserving qubits and then converting them to fiber optic-ready photons, and using diamonds appears to be the route toward achieving both. But not just any diamond will do, which is why Princeton’s team has been hard at work creating a synthetic one, as they describe in their paper. “The properties that we’re targeting are what’s relevant for quantum networks,” electrical engineer Nathalie de Leon tells Inverse. At Princeton, where de Leon is an assistant professor, her team’s focus is essentially inventing quantum hardware. “It’s applications where you want something that has a long storage time, and then also has a good interface with photons so that you can send light over very long distances.” Photonic interactions matter a lot for high-speed international communications because all of the information traveling along fiber optic cables moves through our global infrastructure as discrete photons — cruising at 69 percent of the speed of light. (Nice.) “That puts a lot of constraints on the optical characteristics,” de Leon says. “As one example, it’s really important that the color be stable. If the color of the photon is jumping around over time, then that’s really bad for these protocols.” Right now, de Leon’s group is trying to craft a version of these synthetic diamonds that can convert to the standard 1,550-nanometer wavelength on which photons now traverse fiber optic cables. Currently, her team’s synthetic diamonds support 946-nanometer photon wavelengths. (Photon “color” is a bit of a euphemism here since both of these wavelengths are shades of infrared outside the visible spectrum.) The hurdle that her team just succeeded in crossing is storing those qubits in crystalline quantum repeaters, similar to the repeaters that are currently used to prevent signal loss and degradation in today’s fiber-optic communications. The critical step in this process was producing synthetic diamonds with as little unwanted impurities as possible (nitrogen, mainly) and more of the impurities they actually did want (silicon and boron). “Nitrogen turns out to be the predominant defect that you get in these diamonds,” de Leon says. Her group’s partners at the British diamond maker Element Six had to create above-average vacuum conditions since even ordinary vacuums can leave enough nitrogen in the chamber to contaminate the artificially-made crystals. Because nitrogen has one more free electron than carbon, nitrogen impurities disturb the unique electrical makeup that the researchers are hoping for. Other small defects can undermine the qubit-storing potential of these diamonds, too. The goal is to have pairs of atom-sized vacancies in the crystal framework alongside a substituted silicon atom where a single carbon used to be, but sometimes those pairs can bunch up together in “vacancy clusters” that start to redistribute their electrons in annoying, counterproductive ways. Sometimes polishing and etching damage on the surface of the diamond can also cause a domino effect, messing with this pattern of electrons, too. This is where adding boron — which has one less free electron than carbon — can help. “What we had to do,” de Leon says, “is both start with this ultra-high purity diamond and then grow in some boron to basically soak up any of the extra electrons that we couldn’t control. Then there was a lot of materials processing — boring stuff like thermal annealing and repairing the surface at the end to make sure that we still get rid of a lot of these other types of defects that give you extra charges.” Mastering both of these challenges, many in the field suspect, are the keys to fully functional and nearly impossible to crack quantum encryption. Before the dawn of synthetic diamonds only a few years ago, researchers in the field of quantum optics had to rely on natural diamonds to do their work — one specific diamond, in particular. According to de Leon, everyone in the field of quantum optics had to rely on a single, naturally-made diamond from Russia that just happened to have the right percentage of boron, nitrogen, and other impurities to make their research possible. Fragments of the diamond were cleaved off and distributed to research groups across the world. “Many of the groups had their own little piece of the ‘magic’ Russian diamond,” as de Leon told Princeton’s in-house news service in 2016. “At Harvard, we called ours ‘Magic Alice’ and ‘Magic Bob.’” So, TL;DR, Western scientists are getting better at manufacturing their own magical quantum computing diamonds instead of depending on slivers of Russia’s magical quantum computing diamond. This is a factual sentence that sounds ridiculous. Classic 2018.
<urn:uuid:7ebf5cae-2c75-4e41-9d5f-4cfb723845d1>
CC-MAIN-2021-10
https://www.inverse.com/article/46728-synthetic-diamonds-are-necessary-for-quantum-computing-privacy
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178368431.60/warc/CC-MAIN-20210304021339-20210304051339-00570.warc.gz
en
0.939895
1,194
3.59375
4
Quantum computers are largely hypothetical devices that could perform some calculations much more rapidly than conventional computers can. Instead of the bits of classical computation, which can represent 0 or 1, quantum computers consist of quantum bits, or qubits, which can, in some sense, represent 0 and 1 simultaneously. Although quantum systems with as many as 12 qubits have been demonstrated in the lab, building quantum computers complex enough to perform useful computations will require miniaturizing qubit technology, much the way the miniaturization of transistors enabled modern computers. Trapped ions are probably the most widely studied qubit technology, but they’ve historically required a large and complex hardware apparatus. In today’s Nature Nanotechnology, researchers from MIT and MIT Lincoln Laboratory report an important step toward practical quantum computers, with a paper describing a prototype chip that can trap ions in an electric field and, with built-in optics, direct laser light toward each of them. “If you look at the traditional assembly, it’s a barrel that has a vacuum inside it, and inside that is this cage that’s trapping the ions. Then there’s basically an entire laboratory of external optics that are guiding the laser beams to the assembly of ions,” says Rajeev Ram, an MIT professor of electrical engineering and one of the senior authors on the paper. “Our vision is to take that external laboratory and miniaturize much of it onto a chip.” The Quantum Information and Integrated Nanosystems group at Lincoln Laboratory was one of several research groups already working to develop simpler, smaller ion traps known as surface traps. A standard ion trap looks like a tiny cage, whose bars are electrodes that produce an electric field. Ions line up in the center of the cage, parallel to the bars. A surface trap, by contrast, is a chip with electrodes embedded in its surface. The ions hover 50 micrometers above the electrodes. Cage traps are intrinsically limited in size, but surface traps could, in principle, be extended indefinitely. With current technology, they would still have to be held in a vacuum chamber, but they would allow many more qubits to be crammed inside. “We believe that surface traps are a key technology to enable these systems to scale to the very large number of ions that will be required for large-scale quantum computing,” says Jeremy Sage, who together with John Chiaverini leads Lincoln Laboratory’s trapped-ion quantum-information-processing project. “These cage traps work very well, but they really only work for maybe 10 to 20 ions, and they basically max out around there.” Performing a quantum computation, however, requires precisely controlling the energy state of every qubit independently, and trapped-ion qubits are controlled with laser beams. In a surface trap, the ions are only about 5 micrometers apart. Hitting a single ion with an external laser, without affecting its neighbors, is incredibly difficult; only a few groups had previously attempted it, and their techniques weren’t practical for large-scale systems. That’s where Ram’s group comes in. Ram and Karan Mehta, an MIT graduate student in electrical engineering and first author on the new paper, designed and built a suite of on-chip optical components that can channel laser light toward individual ions. Sage, Chiaverini, and their Lincoln Lab colleagues Colin Bruzewicz and Robert McConnell retooled their surface trap to accommodate the integrated optics without compromising its performance. Together, both groups designed and executed the experiments to test the new system. “Typically, for surface electrode traps, the laser beam is coming from an optical table and entering this system, so there’s always this concern about the beam vibrating or moving,” Ram says. “With photonic integration, you’re not concerned about beam-pointing stability, because it’s all on the same chip that the electrodes are on. So now everything is registered against each other, and it’s stable.” The researchers’ new chip is built on a quartz substrate. On top of the quartz is a network of silicon nitride “waveguides,” which route laser light across the chip. Above the waveguides is a layer of glass, and on top of that are niobium electrodes with tiny holes in them to allow light to pass through. Beneath the holes in the electrodes, the waveguides break into a series of sequential ridges, a “diffraction grating” precisely engineered to direct light up through the holes and concentrate it into a beam narrow enough that it will target a single ion, 50 micrometers above the surface of the chip. With the prototype chip, the researchers were evaluating the performance of the diffraction gratings and the ion traps, but there was no mechanism for varying the amount of light delivered to each ion. In ongoing work, the researchers are investigating the addition of light modulators to the diffraction gratings, so that different qubits can simultaneously receive light of different, time-varying intensities. That would make programming the qubits more efficient, which is vital in a practical quantum information system, since the number of quantum operations the system can perform is limited by the “coherence time” of the qubits. “As far as I know, this is the first serious attempt to integrate optical waveguides in the same chip as an ion trap, which is a very significant step forward on the path to scaling up ion-trap quantum information processors [QIP] to the sort of size which will ultimately contain the number of qubits necessary for doing useful QIP,” says David Lucas, a professor of physics at Oxford University. “Trapped-ion qubits are well-known for being able to achieve record-breaking coherence times and very precise operations on small numbers of qubits. Arguably, the most important area in which progress needs to be made is technologies which will enable the systems to be scaled up to larger numbers of qubits. This is exactly the need being addressed so impressively by this research.” “Of course, it's important to appreciate that this is a first demonstration,” Lucas adds. “But there are good prospects for believing that the technology can be improved substantially. As a first step, it's a wonderful piece of work.”
<urn:uuid:d800b8d1-c139-4e79-ab6c-a8b6bfd0ee15>
CC-MAIN-2021-10
https://news.mit.edu/2016/toward-practical-quantum-computers-0808
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178363782.40/warc/CC-MAIN-20210302065019-20210302095019-00531.warc.gz
en
0.941985
1,340
4.1875
4
Blog: Quantum Algorithms In previous articles we introduced the main topics related to Quantum Computing ( https://firstname.lastname@example.org/quantum-computing-7662907581e5) in order to have a basic idea of what is it and the definitions we need to know so that one is capable of understanding the entire work. Quantum computing algorithms can be divided into three groups: algorithms based on the quantum version of the Fourier transform, search algorithms and quantum simulations. In this section some well-known and relevant algorithms of quantum computing will be introduced, we will proceed to explain their purpose and their logic and, in turn, we will analyze in detail their respective circuits . The algorithms to analyze are the following: 1. Deutsch Algorithm 2. Grover Algorithm Deutsch’s algorithm combines what is known as quantum parallelism with another quantum phenomenon called interference. The problem that this algorithm tries to solve is how to determine if the function 𝑓(𝑥) of binary variable and binary image is constant or alternates with a minimum number of calls to the function 𝑓(𝑥). It is shown that classically two calls are needed while in the quantum algorithm is enough with one. It is the first quantum algorithm that showed quantum superiority. The circuit of this algorithm is as follows: In figure 1 one can see that we have two qubits to which a Hadamard gate is applied. This gateway is to prepare the qubit in a superposition. In this case we have as input with the help of the Hadamard gates we put the qubits into the following superposition states In figure 1, we have the Uf gate that performs the following action U stands for unitary and for practical purposes we will treat it as a black box. This gate affects the state by adding the following term Applied, then, to (3) we obtain Finally, we apply another Hadamard gate to the first qubit, remaining (7) as follows The established conditions tell us that if is 0 and 1 in other cases. In order to keep things easy to read, we rewrite (8) In this way, by measuring the first qubit, we can determine 𝑓(0)⊕𝑓(1). That is, the system allows us to know a global property of the function in a single evaluation. With a “classic” device we would have taken at least two evaluations. It should be noted that if we were in classical computing, we could not obtain information from the two solutions at the same time. However, in quantum computing solutions can interfere with each other to give us a global solution, as we got. Finally we are going to analyze Grover’s algorithm. We will do it without introducing too much into the circuit, unlike the other example. This algorithm belongs to a special class of quantum algorithms called quantum search algorithms. This type of algorithm attempts to solve the following problem: given a search space of size 𝑁, and without prior knowledge of what it contains, we want to find an element of that search space that satisfies a known property in the shortest possible time. Classically N operations are needed to solve this type of algorithms, but in the quantum version they are solved making √𝑁 operations. The algorithm works in the following way : before observing the search set we have no idea which element fulfills our property. Moreover, all positions have the same probability. In this way, we can express it in terms of a state called uniform superposition (Hadamard transformation) Since all positions have the same probability, when measuring the probability we would obtain 1/𝑁 = 1/2^𝑛. This is where we use a procedure called amplitude amplification. This increases the probability of obtaining the correct item in the final state. Next we will ennumerate the steps to carry out the algorithm - We begin the amplitude amplification in |𝑠⟩. This state is build as follows The initial state is 2. We apply a reflection 𝑈𝑓 to the state Geometrically corresponds to a reflection of the |𝜓t⟩ state over -|𝑤⟩. 3. We apply a new reflection 𝑈s to the state |𝑠⟩. This reflection is as follows 𝑈𝑠=2|𝑠⟩⟨𝑠|−1. In this way, the resultant state is We perform a rotation of the initial state to the winning state. 4. Return to step 1. We repeat this algorithm several times until reaching the winning state. In the end, as we said at the beginning we will end up doing √𝑁 operations. All in all, we have introduced two important algorithms in Quantum Computing. Understanding these algorithms will make the reading of future articles easier. With this article we finished the theoretical basis of Quantum Computing. In the next articles we are going to explain the theoretical basis of Artificial Intelligence. Keep it up! Michael A. Nielsen & Isaac L. Chuang. Quantum Computation and Quantum Information, 10th Anniversary Edition. Cambridge University Press, 2009. Michael A. Nielsen & Isaac L. Chuang. Quantum Computation and Quantum Information, 10th Anniversary Edition. Figure 1.19. Quantum circuit implementing Deutsch’s algorithm. Cambridge University Press, 2009.
<urn:uuid:d903ade5-1637-4b8d-a6a4-73b593f50d1f>
CC-MAIN-2021-10
https://timmccloud.net/blog-quantum-algorithms/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178363782.40/warc/CC-MAIN-20210302065019-20210302095019-00535.warc.gz
en
0.914212
1,162
3.609375
4
Researchers found that making adjustments to existing telecommunications equipment used in optics research could be optimized for quantum photonics research, which could lead to new ways to use these resources for both traditional and quantum communication. A team from the Department of Energy’s Oak Ridge National Laboratory conducted the series of experiments to gain a better understanding of quantum mechanics and pursue advances in quantum networking and quantum computing, which could lead to practical applications in cybersecurity and other areas. ORNL quantum researchers Joseph Lukens, Pavel Lougovski, Brian Williams, and Nicholas Peters—along with collaborators from Purdue University and the Technological University of Pereira in Colombia—summarized results from several of their recent academic papers in a special issue of the Optical Society’s Optics & Photonics News, which showcased some of the most significant results from optics-related research in 2019. Their entry was one of 30 selected for publication from a pool of 91. Conventional computer “bits” have a value of either 0 or 1, but quantum bits, called “qubits,” can exist in a superposition of quantum states labeled 0 and 1. This ability makes quantum systems promising for transmitting, processing, storing, and encrypting vast amounts of information at unprecedented speeds. To study photons—single particles of light that can act as qubits—the researchers employed light sources called quantum optical frequency combs that contain many precisely defined wavelengths. Because they travel at the speed of light and do not interact with their environment, photons are a natural platform for carrying quantum information over long distances. Interactions between photons are notoriously difficult to induce and control, but these capabilities are necessary for effective quantum computers and quantum gates, which are quantum circuits that operate on qubits. Nonexistent or unpredictable photonic interactions make two-photon quantum gates much more difficult to develop than standard one-photon gates, but the researchers reached several major milestones in recent studies that addressed these challenges. “Using this equipment to manipulate quantum states is the technological underpinning of all these experiments, but we did not expect to be able to move in the other direction and improve classical communication by working on quantum communication,” Lukens said. “These interesting and unanticipated findings have appeared as we delve deeper into this research area.” One such tool, a frequency beam splitter, divides a single beam of light into two frequencies, or colors, of light. “Imagine you have a beam of light going down an optical fiber that has a particular frequency, say, red,” Lukens said. “Then, after going through the frequency beam splitter, the photon will leave as two frequencies, so it will be both red and blue.” The members of this team were the first researchers to successfully design a quantum frequency beam splitter with standard lightwave communications technology. This device takes in red and blue photons simultaneously, then produces energy in either the red or the blue frequency. By using this method to deliberately change the frequencies of photons, the team tricked the stubborn particles into beneficial interactions based on quantum interference, the phenomenon of photons interfering with their own trajectories. “It turned out that off-the-shelf devices can deliver impressive control at the single-photon level, which people didn’t know was possible,” Lougovski said. Additionally, the researchers completed the first demonstration of a frequency tritter, which splits a beam of light into three different frequencies instead of two. Their results indicated that multiple quantum information processing operations can run at the same time without introducing errors or damaging the data. Another key accomplishment was the team’s design and demonstration of a coincidence-basis controlled-NOT gate, which enables one photon to control a frequency shift in another photon. This device completed a universal quantum gate set, meaning any quantum algorithm can be expressed as a sequence within those gates. “Quantum computing applications require much more impressive control levels than any sort of classical computing,” Lougovski said. The team also encoded quantum information in multiple independent values known as degrees of freedom within a single photon, which allowed them to observe quantum entanglement-like effects without needing two separate particles. Entanglement usually involves two linked particles in which changes made to the state of one particle also apply to the other. Finally, the researchers have completed quantum simulations of real-world physics problems. In collaboration with scientists at the Air Force Research Laboratory, they are now developing tiny, specialized silicon chips similar to those common in microelectronics in pursuit of even better photonic performance. “In theory, we can get all these operations onto a single photonic chip, and we see a lot of potential for doing similar quantum experiments on this new platform,” Lukens said. “That’s the next step to really move this technology forward.” Future quantum computers will allow scientists to simulate incredibly complex scientific problems that would be impossible to study on current systems, even supercomputers. In the meantime, the team’s findings could help researchers embed photonic systems into current high-performance computing resources. “We have a very diverse and talented team,” Lougovski said. “The most important thing is we’re getting results.” This research was funded by ORNL’s Laboratory Directed Research and Development program. UT-Battelle LLC manages Oak Ridge National Laboratory for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science. — Provided by Oak Ridge National Laboratory
<urn:uuid:a5570d1a-863e-412e-bc3b-6e34df9fde2e>
CC-MAIN-2021-10
https://thequantumdaily.com/2020/01/27/tweaking-existing-telecommunications-gear-could-help-quantum-communication-research/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178357929.4/warc/CC-MAIN-20210226145416-20210226175416-00096.warc.gz
en
0.937778
1,187
3.65625
4
Two newly published studies show that the accuracy and lifetime of silicon qubits are now suitable for large-scale quantum computers. A dramatic increase in the amount of time data can be stored on a single atom means silicon could once again play a vital role in the development of super-fast computers. The silicon chip revolutionized most aspects of everyday life since it was invented in the 1950s. It’s changed the way that we communicate with each other, and how we operate almost all everyday items, from cars to airplanes, fridges to televisions and our smart-phones and tablets. The reason for this is that silicon can be “crafted” into a dazzling array of complex electronic structures and devices, such as the billion or so transistors crammed into each silicon chip. While modern computers use these silicon chips (or integrated circuits) to perform an array of complex calculations, there are still some important problems that existing computers can’t solve. For example, medical researchers would love to be able to invent new pharmaceuticals with computer-aided design, much like the way automotive engineers design new cars, but they cannot do this today. The reason is that the molecules that make up the medicine are not “macro” objects, like a car, but they live in the “micro” or quantum world, which is far more complex to calculate. In fact, no computer as we know it today will ever be able to properly design such molecular systems. So we must turn to a new type of computer – a quantum computer – in which the “bits” of data used for the calculations are themselves stored on quantum particles, like individual atoms, or electrons. Such quantum computers are also expected to be able to solve other important problems, such as searching large data sets, or solving complex financial problems. The search for the best qubit For the past two decades or so, researchers around the world have been exploring a range of different physical systems to act as the “quantum bits” in such a quantum computer. Now it appears that silicon, which underpinned the previous information revolution, could well provide the key to the next quantum revolution. Over the past three years, our two research teams at UNSW have shown that silicon can be used to make functioning quantum bits, or qubits. In particular we found that a single atom of phosphorus could be used to tightly hold an electron, which also carries a “spin” (like a tiny magnet) that could be used as a quantum bit. But the binary code (0 or 1) stored on the electron spin got scrambled very quickly, making a fairly poor qubit. The core of the phosphorus atom also contains a nuclear spin, which could act as an excellent memory storage qubit thanks to its very weak sensitivity to the noise present in the surrounding environment. Even so, when placed inside a “natural” silicon chip, a phosphorus nuclear spin loses the quantum information encoded on it in less than a second. Storage time increased New research published in Nature Nanotechnology – two papers from our groups and one from a Dutch-US collaboration – show that the accuracy and lifetime of silicon qubits are now in a realm that makes them suitable for the manufacture of large-scale quantum computers. Our teams in Australia have used a specially purified type of silicon that contains only one isotope, called Si-28. This isotope is completely non-magnetic, because its nucleus has no spin. The electrical properties of a chip of purified Si-28 are identical to those of natural silicon, and so it works equally well for any electronic device. But when an electron or nuclear spin qubit are configured inside pure Si-28, the absence of magnetic noise allows us to store and manipulate the quantum state with unprecedented accuracy. In one of the new papers our team demonstrated that we can perform quantum logic operations on a single electron trapped in an “artificial atom”, which is created by small metallic electrodes on the surface of the chip. These devices are remarkably similar to existing silicon transistors, providing great promise for commercial manufacture. Thanks to the ultra-pure Si-28, we can now reach an accuracy of quantum operations well above 99%. This accuracy is significant because it surpasses the minimum requirement to ensure that the (rare) errors can be corrected using special codes. In a separate paper we report a similar accuracy, beyond 99%, for the operations on the electron spin held by a phosphorus “natural atom” in the same Si-28 material. In addition, with the nuclear spin of the phosphorus we have established the new world record for how long quantum information can be held onto a quantum bit in solid state: above 35 seconds, which is an eternity in the quantum world. The accuracy of the operations was a staggering 99.99%. With the exquisite quantum bits now demonstrated within a silicon electronic device, building functional quantum computers has become a much more realistic prospect. The new quantum revolution might well be built upon the old, trusted and omnipresent silicon microchip. - M. Veldhorst, et al., “An addressable quantum dot qubit with fault-tolerant control-fidelity,” Nature Nanotechnology (2014); doi:10.1038/nnano.2014.216 - Juha T. Muhonen, et al, “Storing quantum information for 30 seconds in a nanoelectronic device,” Nature Nanotechnology (2014); doi:10.1038/nnano.2014.211 Image: Dr Stephanie Simmons, UNSW
<urn:uuid:a04cb84b-abc2-4dee-804a-b43c6680981a>
CC-MAIN-2021-10
https://scitechdaily.com/silicon-qubits-key-quantum-revolution/?replytocom=400033
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178362899.14/warc/CC-MAIN-20210301182445-20210301212445-00097.warc.gz
en
0.918722
1,158
4.03125
4
People being people, most of us have gotten used to the idea that the methods we routinely use to protect our information are reliable and safe. This is why you educate your users to check if that little padlock appears in their browser search window before they check their bank balance. It's why we go to the trouble of implementing email encryption as well as secure file transfer systems. But in the tech industry, change is always on the horizon, which means you need to get used to the idea that what you thought was invulnerable today might easily be threatened tomorrow. One of those changes is quantum computing, and it's a field that's developing quickly. For example, earlier this year, Google announced that it had built the largest quantum computing chip ever: a 72-qubit (a quantum bit) processor. To put that into context, it's important to explain how a qubit differs from the bit you learned about back in computer science class. Those bits are basic units of information represented by either a 1 or a 0. Qubits, which are represented by the symbol '0> and '1>, can also encompass values of 1 or 0, but can then extend those values to essentially an infinite number of states in between 1 and 0. What happens is that the probability of some number changes as you move between 1 and 0. We're not going to go into detail about how this works (you can read more about it here), except to say that, by having more potential values between 1 and 0, you can perform some types of computation faster. In some cases, many thousands of times faster than what's possible with today's more advanced desktop CPU architectures, like the Intel i9. Because of the way quantum computers work, they can be used for jobs that are difficult for these more traditional CPU chipsets. This would include tasks such as multidimensional modeling, simulations, and, yes, codebreaking. It's the codebreaking and encryption cracking that's worrying security experts, and is also freaking out some folks involved with cryptocurrencies as well as those involved with the many other developments being made possible by blockchain technology. Blockchains and cryptocurrencies are, after all, simply very large numbers used to create a unit of whatever currency you're considering. Bitcoin, for example, depends on public key cryptography. Public key cryptography is considered one of the most vulnerable to cracking by a quantum computer, which is part of what's making folks with large Bitcoin investments sweat. What this means to you is that some types of encryption that you depend on are no longer considered secure. Exactly how that may apply to you is described in more detail in this "Report on Post-Quantum Cryptography" published by the US Department of Commerce's National Institute of Standards and Technology (NIST). What you'll find in this NIST paper is that public key encryption is vulnerable to cracking by using algorithms on a quantum computer. But other means of encryption, including Advanced Encryption Standard (AES), which uses symmetric keys, and Secure Hash Algorithm (SHA-2 and SHA-3), will remain secure with some modifications. Table 1 - Impact of Quantum Computing on Common Cryptographic Algorithms - Credit: NIST The most widely used version of AES, which uses 256-bit keys, is actually relatively secure against quantum computing attacks. AES-256 is commonly used for mundane tasks such as Wi-Fi encryption. However, another commonly used version of encryption, secure sockets layer (SSL), uses public key encryption. Calming Your Quantum Computing Fears For now, you don't need to worry, though as an IT professional, you should start to plan. Despite the rapid development of quantum computing, researchers don't appear to have reached the point where they can routinely decrypt routine business communications. While that may come someday, you're still fairly safe for now as long as you remember these key points: SSL communications are still safe; and because they are ephemeral, your users don't need to worry that there'll be a stored copy of their banking session or credit card purchase to be retrieved and cracked at a later date. However, that may change in the future. AES-256 will be safe, even against quantum attacks, for some time. Unless your data is valuable enough for a nation-state to spend millions of dollars to crack it, you don't need to worry. However, if your business handles national security data, then maybe you need to find a better way and it'd be a good idea to start staying on top of devleoping cryptographic trends. Age is important. Unless you need to protect your data for decades against future quantum attacks by using advanced algorithms, then some form of symmetric encryption (including AES) will do. Be prepared for encryption using longer key lengths because those are much harder to crack. Some keys can be found by using brute force techniques but, if the time to crack them by using the fastest quantum computer exceeds the expected age of the universe, then you're probably safe. Longer key lengths will require more computer power to handle, but probably not enough to bog down your systems when they're needed. Remember that the quality of encryption is only one part of the security puzzle. Poorly executed encryption, weak or faulty software surrounding the encryption, and poor security practices can still expose your critical data through other vulnerabilities. For example, it doesn't help to encrypt your communications if the bad guys can walk into your office and steal the data out of an unlocked file cabinet or, more often, the trash can. While some forms of encryption now have a limited lifetime, the fact is, you still have time to determine what data you have that may have vulnerabilities because of encryption, and then evaluate whether or not the risk down the road will affect you immediately. For most day-to-day operations, it won't. But if you deal with sensitive data that has a long lifetime, then you need to start planning for the future now.
<urn:uuid:7eb566da-1b47-484d-8bf2-ab5cf64f038c>
CC-MAIN-2021-10
https://www.pcmag.com/news/is-quantum-computing-really-a-threat-to-it-security
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178350846.9/warc/CC-MAIN-20210225065836-20210225095836-00138.warc.gz
en
0.962035
1,217
3.515625
4
Bell's theorem is an important philosophical and mathematical statement in the theory of quantum mechanics. It showed that a category of physical theories called local hidden variables theory could not account for the degree of correlations between the spins of entangled electrons predicted by quantum theory. The commonly accepted conclusion of the theorem is that quantum theory is inherently nonlocal in some way, although this is a topic of intense philosophical debate. Mathematically, Bell's theorem is justified by an important lemma, the CHSH inequality. This inequality bounds the amount of correlation possible between electron spins in a hidden variables theory. The violation of the CHSH inequality in both the theory and experimental results of quantum mechanics proves the theorem. The historical importance of Bell's theorem is that it proved Einstein, Podolsky, and Rosen [EPR] incorrect in their discussion of the EPR paradox. EPR advocated a philosophy of science called local realism. This philosophy specified that interactions should not be able to communicate instantly across large distances (nonlocally) in violation of relativity, and that systems ought to have a "realistic" definite value of quantities before these quantities are measured. However, experiments measuring the correlations between spins of entangled electrons seemed to communicate spin states instantaneously between locations (nonlocally). EPR thus concluded that quantum mechanics is incomplete; there must have been some extra hidden variable that set the responses of the entangled electrons to spin measurements at the beginning of the experiment, when both electrons were generated at the same site. Bell's theorem showed that this interpretation is not true: in order to reconcile theory with experimental results of quantum mechanics, one of locality and realism must be rejected. In fact, in the popular Copenhagen interpretation of quantum mechanics, not only does quantum mechanics "communicate" spin states instantaneously, but before quantities are measured, systems do not take definite values of these quantities: both locality and realism are rejected. Bell's theorem: No theory of local realism such as a local hidden variables theory can account for the correlations between entangled electrons predicted by quantum mechanics. The experimental results of quantum mechanics have several loopholes that the past fifty years of research in quantum theory has worked to close. Below two of the major loopholes are discussed: The detection loophole It is possible that the detectors at spin measurement sites have less than 100% efficiency, and only detect highly correlated spins, allowing uncorrelated spins to remain undetected. Thus, experiments would report a higher correlation than actually exists, meaning the actual amount of correlation might be explainable by a local hidden variables theory. In the best case scenario, any detectors below 67% efficiency would not close this loophole; in the standard case, about 83% efficiency is required. Traditional Bell test experiments have ignored this loophole by postulating the fair sampling assumption, which states that the spins measured at each detector are a fair representation of the actual distribution of entangled quantum states produced. While this seems intuitively physically reasonable, it is impossible to prove. The communication loophole Since it takes a small but nonzero amount of time to actually perform spin measurements and report the result, it is possible that after one spin is measured, the detector somehow communicates the result at lightspeed to the other detector, which is able to influence the spin of the other particle at measurement time. The only way to close the communication loophole is to separate the two detectors by a large distance and perform spin measurements in such a small amount of time that light could not have traveled between the detectors. This causally separates each detector from the other's influence. In late 2015, an experiment whose results were published in Nature claimed to have performed a fully loophole-free Bell experiment demonstrating violation of the CHSH inequality. Similar recent papers claim to have performed loophole-free Bell experiments for entangled photon polarizations, analogous to electron spins. It has been suggested that the late 2015 result has not quite closed all loopholes due to the possibility of communication between detectors in the past before the entangled electrons were emitted, which might have been able to correlate detectors in some way. This is known as the setting independence loophole. There exists a currently proposed experiment designed to circumvent this loophole by configuring detector settings using light from two very distant galaxies so far separated that light has not traveled between the two since the big bang. As a result, the detector settings will originate from sources not in causal contact, which means that it will be impossible for the detectors to have become correlated at any point in time. The proof of Bell's theorem considers an arbitrary local hidden variables theory and shows that any such classical theory attempting to mimic the results of quantum mechanics gives measurement results constrained by an inequality called the CHSH inequality for Clauser, Horne, Shimony, and Holt, although this inequality is also often referred to slightly incorrectly as Bell's inequality. The CHSH inequality is as follows: where and are two possible orientations for one Stern-Gerlach detector in a Bell experiment and and are two possible orientations for the second detector. The value gives the correlation of spins along these orientations, and is defined by the expectation of the product of spin states along each direction in quantum mechanics. The formal derivation of these inequality is fairly extensive, since it introduces a possible hidden variable and then defines the expectations in terms of integrals involving this variable. However, the intuition behind it is simple: each of the expectations along orientation , , , and is at most 1, because each correlation is at best 1 in a classical theory. So one has three plus signs and minus sign in a sum of terms that are bounded by one in absolute value; thus, the sum is bounded by . The tricky part of the derivation lies in manipulating the integral expressions that define the correlations to obtain nice inequalities that don't depend on the hidden variable or detector angle. An easy example of the violation of the CHSH inequality occurs in experiments measuring the spin in the entangled singlet state of two spin- particles: The desired correlations in quantum mechanics can be found by taking the expectations of spin measurements made at two Stern-Gerlach apparatuses. Letting be a measurement performed at apparatus A in orientation and similar, consider the following four possible measurements: where the and orientations are rotated by degrees with respect to and , so that with expectation values taken in the singlet state: The computations of the values given above are tedious but routine exercises in the formalism of spin measurement; see the quantum entanglement wiki for details of how they are performed. Substituting into the left-hand side of the CHSH inequality, one finds: This violates the bound of 2 predicted in a local hidden variables theory, as a result of quantum entanglement! Below, the correlation as a function of angle between orientations and is plotted. The disparity between the classical and quantum predictions is evident, especially at the degree difference used in the above calculation, where each correlation took value where it would have correlation at most in local hidden variables theory. - Hensen, B. Loophole-free Bell inequality violation using electron spins separated by 1.3 kilometres. Nature, 526, 682–686 (29 October 2015). - Gallicchio, J. Testing Bell’s Inequality with Cosmic Photons: Closing the Setting-Independence Loophole. Physical Review Letters, 112, 110405 – Published 18 March 2014. - Gill, R. Bell. Retrieved 22 December 2013, from https://en.wikipedia.org/w/index.php?curid=41434416
<urn:uuid:3b2309f6-2e9c-429e-a332-f9fa5274ea54>
CC-MAIN-2021-10
https://brilliant.org/wiki/bells-theorem/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178351374.10/warc/CC-MAIN-20210225153633-20210225183633-00221.warc.gz
en
0.941499
1,536
3.8125
4
The polarization of the photon refers to the “direction” of the axis of the energy field, with the magnetic axis being offset by 90°. Polarization can be a direction like up or down, left or right. Polarization can also mean spin in a clockwise or counter-clockwise direction. The axis of the electric field may be rotating or spinning in time and we would call the photon spin left or spin right. If the axis appears to stay vertical, we do not know for sure that the axis is vertical. We only know for sure that it has the highest probability of being measured in a vertical direction and zero probability of being measured in a horizontal direction. A simple form of Quantum Entanglement refers to the process of splitting a photon into a pair of photons and sending the two photons in different directions. Both photons start with the polarization of the original photon and the measurement of one photon in one location tells you the polarization of the other photon, or at least the probability of detecting the other photon at a specific angle. In 2002, Dietrich Dehlinger and M. W. Mitchell posted a paper called “Entangled photons, nonlocality and Bell inequalities in the undergraduate laboratory”. A closer look at this paper allows us to test and refine any Quantum Entanglement model. Alice and Bob – Entangled Photons The experiment described and performed by Dehlinger and Mitchell centers around the detection of photon pairs at two different locations. The polarization of a photon stream is first fixed in a specific direction from vertical with a linear polarizer, then the phase of one component is fixed with a birefringent quartz plate. The photon stream is then directed at beta barium borate (BBO) crystals that cause a small fraction of the laser photons to spontaneously decay into photon pairs with the same total energy as the original photon (a process called spontaneous parametric downconversion). Two single-photon counting modules (SPCMs), are used to detect the photons. One detector is traditionally named Bob and the other Alice. Because the photons of a downconverted pair are produced at the same time they cause coincident, i.e., nearly simultaneous, firings of the SPCMs. Simultaneous firings are considered coincidences if they occur within 25 nanoseconds of each other. The experiment is done by recording the number of coincidences that occur for various settings for the measurement angle of Bob and Alices detectors. The number of coincidences at different observer angles is what must be modelled correctly as a “local realistic hidden variable theory” (HVT). Modelling Entangled Photons – Close but not Quite Dehlinger and Mitchell propose a model where each photon has a polarization angle λ. When a photon meets a polarizer set to an angle γ , it will always register as Vγ if λ is closer to γ than to γ + π/2, i.e., - if |γ − λ| ≤ π/4 then vertical - if |γ − λ| > 3π/4 then vertical - horizontal otherwise. Dehlinger and Mitchell go on to generate their experimental results and generate the graphic below on the left. The open circles representing Alice at 0° and Bob at 0 to 180°. The closed circles represent Alice settings at 45° and Bob at 0 to 180°. Represented by icons, the model is plotted against the same angles and produces results shown as calculated on the right. Clearly the calculated results differ from the experimental results. The angle 22.5° shows the most difference between the model and experiment. Dehlinger and Mitchell choose this angle to analyse in detail and show that their experimental results match with quantum physics. In their words “Our HVT is very simple, and yet it agrees pretty well with quantum mechanics. We might hope that some slight modification would bring it into perfect agreement.”. Refining the Model – adding Probability To make the model a little more accurate, Animated Physics models the photons as not only having a specific “average” direction, but also as having a “wobble” or “instantaneous” direction. Represented as icons, photons present a more “fuzzy” picture of their polarization. The sample 24° photon, with a 30° wobble, will most of the time be picked up as a vertical, but sometimes when the combined angle is over 45°, it will be picked up as a horizontal. To determine polarity, we use these equations. - Chance of vertical measurement = (cos((γ − λ)*2)+1)/2 - Chance of horizontal measurement = (cos((γ − λ + π/2)*2)+1)/2 Now consider some angles. A horizontal photon has a 100% chance of getting through a horizontal polarizer. A vertical photon has 0% chance of getting through a horizontal polarizer. A photon with a polarization angle of 45° has a 50% chance of getting through a horizontal polarizer and a 50% chance of getting through a vertical polarizer. Finally, a photon with polarization angle of 22.5° has a 85% chance of getting through a horizontal polarizer and a 15% chance of getting through a vertical polarizer. Visualizing the Entanglement Model Let’s start the experiment. To calibrate the equipment, we test that Alice and Bob get maximum matches with both set at 0° (a), they get minimum matches with Alice at 0° and Bob at 90° since the vertical photons have no chance of getting though Bob’s filter (b), with the photon stream set to 45°, Bob and Alice match all photons when both set to 45° (c). The green numbers represent the probability of a photon getting through at that angle. With the equipment calibrated, we fix the polarizer that Alice is using at 0°, and rotate Bob’s polarizer through a variety of angles and collect our data. These results demonstrates that this model matches the results of experiment. In fact, the model follows the same cos(γ − λ)² rule that is used by quantum mechanics. To have fun playing around with photon polarization settings as well as filter angles, click on “Shoot the Photon”.
<urn:uuid:50c3f712-5469-414a-b839-ab072969cc8d>
CC-MAIN-2021-10
http://animatedphysics.com/insights/modelling-quantum-entanglement/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178351374.10/warc/CC-MAIN-20210225153633-20210225183633-00222.warc.gz
en
0.921857
1,318
3.703125
4
PT Symmetry Goes Quantum No physicist would tamper with the conservation of energy—the fundamental law that says energy cannot be created or destroyed. Researchers have, however, taken an interest in devices whose energy is conserved somewhat artificially. Known as PT-symmetric systems , these devices are engineered to have a balance of energy flowing in and out. And they feature many unusual properties, which have, for example, been harnessed to make optical components that only allow light to travel in one direction or that stop the flow of light altogether [2, 3]. Some of these properties might now be realized in the quantum domain. David Zueco of the University of Zaragoza in Spain and colleagues have proposed a realistic circuit in which microwaves interact with a quantum bit (qubit) and that would satisfy the requirements of PT symmetry . PT symmetry stands for parity-time symmetry. This terminology implies that a physical system looks exactly the same if one performs two operations on it. The first is a parity operation, which swaps left and right so as to exchange the components feeding energy in (the gain) with components allowing energy to escape (the loss). The second is time reversal, which is akin to running a “movie” of the device backwards. A simple example of a PT-symmetric system is a stick that’s being cooled on one end and heated on the other, both at exactly the same rate (Fig. 1, left). If you swap the sources performing the heating and cooling and then time reverse these processes, the system looks exactly the same. Researchers were initially interested in whether PT symmetry could be a fundamental property of nature, a discovery that would have far-reaching consequences for quantum theory. The reason has to do with quantum theory’s underlying mathematics. Physicists have a strict requirement that the Hamiltonian—the function that is used to calculate a system’s energies—must predict real, not complex, energies. Hamiltonians having the mathematical property of being Hermitian are guaranteed to produce real energies, so quantum theory has been built on the assumption that all viable Hamiltonians are Hermitian. But it turns out that PT-symmetric Hamiltonians also predict real energies and may thus serve as a starting point for an alternative or potentially more fundamental formulation of quantum mechanics. Textbook quantum mechanics has so far proven resilient to such attempts at a reformulation. But motivated by their potentially interesting properties, experimentalists have realized a variety of PT-symmetric devices by artificially engineering gain and loss in these systems. Most of these demonstrations have involved optical setups, since many methods exist to amplify and dampen light . One experiment, for example, used two coupled ring-shaped resonators—one providing gain, the other providing loss . Using such devices, researchers have explored many of the fascinating aspects of PT symmetry, such as the fact that it gives rise to “exceptional points”—conditions where a system’s allowed modes coalesce into a single mode. An optical PT-symmetric device operated near an exceptional point can behave in unconventional ways, examples of which include a laser that turns on as it incurs more loss or a “topological” waveguide that always transmits waves into a well-defined output mode irrespective of how the waves were injected . Most of the devices explored so far have been comprised of macroscopic components, and any quantum effects in them were negligible. The paper from Zueco and colleagues indicates that the field is coming back to its roots, with researchers asking how a quantum device possessing PT symmetry would behave. Crucially, the appropriate device would have dynamics that could only be accurately described by the Schrödinger equation, and its gain and loss components would need to be precisely controllable in the lab. Bose-Einstein condensates , optomechanical systems , and several other setups have already been suggested for this task. Zueco and co-workers’ proposed addition to this list of possibilities may prove to be particularly attractive for studying PT symmetry and its ramifications in the quantum realm. That’s because they focus on using the tools of circuit quantum electrodynamics (QED), which is one of the fastest growing areas for studying quantum effects and making technological use of them . In a circuit-QED device, single microwave photons are confined to resonant cavities. Light confined in this way can be made to interact strongly with qubit-like components that have quantized energy levels much like those of an atom. Zueco and co-workers propose a relatively simple setup comprised of two coupled microwave cavities, each with a qubit placed nearby (Fig. 1, right) . They then imagine achieving a balance of gain and loss in the resonators by driving the qubits with microwaves of the right amplitude and frequency. Using a simple model for this device, the researchers find the value of the coupling between the two resonators that will lead to a PT-symmetric phase transition and its associated exceptional point. They also predict how the transmission of a microwave signal through the circuit would be modified by the onset of this transition. At first sight, the expected effects are not very different from those predicted by classical models of PT symmetry. But new features are expected to emerge in the quantum domain. Quantum fluctuations are intrinsically linked to both gain and loss through the fluctuation-dissipation theorem. As a result, correlations between the fluctuations of the modes that merge at an exceptional point could lead to a big enhancement in the fluctuation amplitude of the merged mode. The proposed device might also enable the study of the quantum versions of so-called chiral population transfer schemes [9, 10]. Here, one varies a device's parameters such that its allowed energies sweep out a loop around an exceptional point. The way in which the loop is cycled—say clockwise versus counterclockwise—determines the final state of the device, which could be used to build robust and broadband switching elements. By interconnecting two very active fields of research— PT symmetry and circuit QED—the new proposal puts these and other research directions within reach. If researchers succeed in making the proposed device in the lab, we would legitimately be at the starting point of the new field of PT-symmetric quantum mechanics. This research is published in Physical Review A. - C. M. Bender and S. Boettcher, “Real Spectra in Non-Hermitian Hamiltonians Having PT Symmetry,” Phys. Rev. Lett. 80, 5243 (1998). - R. El-Ganainy, K. G. Makris, M. Khajavikhan, Z. H. Musslimani, S. Rotter, and D. N. Christodoulides, “Non-Hermitian Physics and PT Symmetry,” Nat. Phys. 14, 11 (2018). - T. Goldzak, A. A. Mailybaev, and N. Moiseyev, “Light Stops at Exceptional Points,” Phys. Rev. Lett. 120, 013901 (2018). - F. Quijandría, U. Naether, Ş. K. Özdemir, F. Nori, and D. Zueco, “PT-Symmetric Circuit QED,” Phys. Rev. A 97, 053846 (2018). - B. Peng et al., “Parity–Time-Symmetric Whispering-Gallery Microcavities,” Nat. Phys. 10, 394 (2014). - H. Cartarius and G. Wunner, “Model of a PT-Symmetric Bose-Einstein Condensate in a -Function Double-Well Potential,” Phys. Rev. A 86, 013612 (2012). - K. V. Kepesidis, T. J. Milburn, J. Huber, K. G. Makris, S. Rotter, and P. Rabl, “PT-Symmetry Breaking in the Steady State of Microscopic Gain–Loss Systems,” New J. Phys. 18, 095003 (2016). - G. Wendin, “Quantum Information Processing with Superconducting Circuits: A Review,” Rep. Prog. Phys. 80, 106001 (2017). - H. Xu, D. Mason, L. Jiang, and J. G. E. Harris, “Topological Energy Transfer in an Optomechanical System with Exceptional Points,” Nature 537, 80 (2016). - J. Doppler, A. A. Mailybaev, J. Böhm, U. Kuhl, A. Girschik, F. Libisch, T. J. Milburn, P. Rabl, N. Moiseyev, and S. Rotter, “Dynamically Encircling an Exceptional Point for Asymmetric Mode Switching,” Nature 537, 76 (2016).
<urn:uuid:a8259b7d-469c-48af-9493-b61097f4cc5f>
CC-MAIN-2021-10
https://physics.aps.org/articles/v11/54
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178351374.10/warc/CC-MAIN-20210225153633-20210225183633-00222.warc.gz
en
0.927434
1,903
3.578125
4
A team of researchers realize the first quantum-logic computer operation between two separate quantum modules in different laboratories. Today’s quantum computers contain up to several dozen memory and processing units, the so-called qubits. A team of researchers from the Max Planck Institute of Quantum Optics in Garching and ICFO, part of the QIA project, have successfully interconnected two such qubits located in different labs to a distributed quantum computer by linking the qubits with a 60-meter-long optical fiber. Over such a distance they realized a quantum-logic gate – the basic building block of a quantum computer. It makes the system the worldwide first prototype of a distributed quantum computer. The limitations of previous qubit architectures Quantum computers are considerably different from traditional “binary” computers: Future realizations of them are expected to easily perform specific calculations for which traditional computers would take months or even years – for example in the field of data encryption and decryption. While the performance of binary computers results from large memories and fast computing cycles, the success of the quantum computer rests on the fact that one single memory unit – a quantum bit, also called “qubit” – can contain superpositions of different possible values at the same time. Therefore, a quantum computer does not only calculate one result at a time but instead many possible results in parallel. The more qubits there are interconnected in a quantum computer; the more complex calculations it can perform. The basic computing operations of a quantum computer are quantum-logic gates between two qubits. Such an operation changes – depending on the initial state of the qubits – their quantum mechanical states. For a quantum computer to be superior to a normal computer for various calculations, it would have to reliably interconnect many dozens, or even thousands of qubits for equally thousands of quantum operations. Despite great successes, all current laboratories are still struggling to build such a large and reliable quantum computer, since every additionally required qubit makes it much harder to build a quantum computer in just one single set-up. The qubits are implemented, for instance, with single atoms, superconductive elements, or light particles, all of which need to be isolated perfectly from each other and the environment. The more qubits are arranged next to one another, the harder it is to both isolate and control them from outside at the same time. Data line and processing unit combined One way to overcome the technical difficulties in the construction of quantum computers is presented in a new study in the journal Science by first author Severin Daiss, Stefan Langenfeld and colleagues from the research group of Gerhard Rempe at the Max Planck Institute of Quantum Optics in Garching and the Institute of Photonic Sciences (Castelldefels, Spain). The team succeeded in connecting two-qubit modules across a 60-meter distance in such a way that they effectively form a basic quantum computer with two qubits. “Across this distance, we perform a quantum computing operation between two independent qubit setups in different laboratories,” Daiss emphasizes. This enables the possibility to merge smaller quantum computers into a joint processing unit. Simply coupling distant qubits to generate entanglement between them has been achieved in the past, but now, the connection can additionally be used for quantum computations. For this purpose, the researchers employed modules consisting of a single atom as a qubit that is positioned amidst two mirrors. Between these modules, they send one single light quanta, a photon, that is transported in the optical fiber. This photon is then entangled with the quantum states of the qubits in the different modules. Subsequently, the state of one of the qubits is changed according to the measured state of the “ancilla photon”, realizing a quantum mechanical CNOT-operation with a fidelity of 80 percent. A next step would be to connect more than two modules and to host more qubits in the individual modules. “Our scheme opens up a new development path for distributed quantum computing”Gerhard Rempe / Director of the Max-Planck-Institut für Quantenoptik Higher performance quantum computers through distributed computing Researcher Gerhard Rempe believes the result will allow to further advance the technology. It could enable, for instance, to build a distributed quantum computer consisting of many modules with few qubits that are interconnected with the newly introduced method. This approach could circumvent the limitation of existing quantum computers to integrate more qubits into a single setup and could therefore allow more powerful systems. Severin Daiss, Stephan Langenfeld, Stephan Welte, Emanuele Distante, Philip Thomas, Lukas Hartung, Olivier Morin, Gerhard Rempe. A Quantum-Logic Gate between Distant Quantum-Network Modules. Science, Vol. 371, Issue 6529, pp. 614-617. This article was originally published at Max Planck Institute of Quantum Optics newsroom and edited for clarity. More News?All News February 15th, 2021 Quantum systems learn joint computing Researchers realize the first quantum-logic computer operation between two separate quantum modules in different laboratories. February 5th, 2021 How complex oscillations in a quantum system simplify with time A team of researchers have shown that in a one-dimensional quantum system, the initially complex distribution of vibrations or phonons can change over time into a simple Gaussian bell curve. February 1st, 2021 Advances in visualizing large quantum states A study published in Physical Review A by Quantum Flagship projects PASQuanS and AQTION reports on a novel fast computational method that enables to visualize and thus to better understand large many-body quantum systems
<urn:uuid:c439f587-40d2-47cc-a747-f96db557bf90>
CC-MAIN-2021-10
https://qt.eu/about-quantum-flagship/newsroom/quantum-systems-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178364008.55/warc/CC-MAIN-20210302125936-20210302155936-00262.warc.gz
en
0.908104
1,177
3.953125
4
Post written by Paul Smith-Goodson is Moor Insights & Strategy's analyst in-residence for Quantum Computing. Qubits are the heartbeat of quantum computers. Their hard-to-imagine properties are what gives quantum machines their awesome computational power. Superconducting devices, spinning atoms, polarized photons, quantum dots, and trapped ions are not futuristic video games. They are different qubit technologies. Moreover, each qubit type has its peculiar advantages and disadvantages. Out of all the qubit types, superconducting is the most common. However, trapped ion qubits, a relatively new qubit technology, shows a great deal of promise. In addition to having faster gate speeds, superconducting qubits are solid-state fabrications. On the other hand, trapped ions are more stable and have better connectivity to other qubits than their superconducting counterpart. Functionally, all qubits depend on strange quantum properties. Instead of classical bits – a one or a zero – a quantum computer’s qubits (quantum bits) can be coded as a one, or a zero, or both a one and a zero. Qubits can also exist in all the possible states at the same time. That condition is called superpositioning. Bits in a classic computer act individually, while quantum properties allow qubits to become “entangled” with each other. Once entangled, a group of qubits can act as a single qubit. That enables a solution to multiple inputs to appear on a single qubit. Compared to classical computers that can only work on one computation at a time, superpositioning gives quantum computers the potential to execute millions of simultaneous operations. Quantum teleportation is another quantum feature. It sounds like science fiction, but it’s not. Instead of teleporting matter, quantum teleportation is limited to sharing quantum states between entangled particles regardless of how far apart they are. In the future, teleportation will be useful for controlling and using qubits in remote quantum servers as well as for telecommunications. A qubit is a qubit is a qubit … almost Superconducting heavy hitters It’s worth noting that Intel is not dependent on superconducting qubits. It is also investigating another qubit technology that operates in silicon, called spin qubit. The quantum state of spin qubits depends on the spin of an electron on silicon. One reason for Intel’s interest in spin qubits is because it is another technology that can leverage Intel’s vast experience in silicon manufacturing. Rigetti Computing, a recent but impressive California start-up, also uses superconducting qubits. It is a full-stack company that beefed up its application development capability by a recent acquisition of QxBranch. Superconducting qubit quirks Superconducting qubits are the most mature of all the qubit technologies. That means we know what improvements are required even though we may not yet know how to do it. Superconducting qubits also have the advantage of being built using existing semiconductor techniques. Superconducting qubits have a few disadvantages: - They require near absolute zero temperatures to operate - They are very susceptible to quantum noise - They retain their quantum states for short periods - Limited gate connectivity to qubits IonQ, an upstart ion startup IonQ, Alpine Quantum Technologies (Austria), and Honeywell all use trapped ion technology. However, IonQ is its driving force. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim. Monroe is the Bice Zorn Professor and a Distinguished Professor of Physics at the University of Maryland and Fellow of the Joint Quantum Institute. He is currently the Chief Scientist for IonQ. Kim is a professor in the department of electrical and computer engineering at Duke University. Building a better ion trap Trapped ion technology isn’t a radically new concept. It’s used to make some of the most accurate atomic clocks in the world. Like atomic clocks, IonQ uses an isotope of ytterbium to build its qubits. They start with a neutral atom of ytterbium, then use lasers to remove an electron from the atom’s outer shell. This process converts a regular atom of ytterbium into a ytterbium ion (Yb+). The ytterbium ion is held in place by electromagnetic fields in a linear ion trap. According to IonQ, because this technology is easy to reconfigure, they can load a hundred or more ions in a linear chain. Also, they can do it without the need to fabricate a new chip. So far, they have used single-qubit gates on a linear chain of 79 ions. There are many advantages to trapped ion qubits. Compared to superconducting qubits, they need less overhead for error correction. Entangling groups of qubits in a shared trap is easy due to the Coulomb force. Another big plus is the fact that dilution refrigerators are not needed. Long term view There is much work to be done before a universal fault-tolerant quantum computer is available. The long-term viability of superconducting and trapped ion qubits looks good. Superconducting qubits will steadily improve as a result of the financial resources of our biggest and best tech companies. If trapped ion computer researchers can solve the scaling problem with lasers, they have a good chance of exceeding the capabilities of their superconducting counterparts. Almost all researchers agree we are in the early experimental stages of quantum computing. Best estimates are it will take another 15-20 years for quantum computing to reach maturity. There is an excellent chance that future research will discover better qubit technologies or materials. Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including IBM, Google, and Intel, which may be cited in this article. I do not hold any equity positions with any companies cited in this column.Follow me on Twitter or LinkedIn. Check out my website.
<urn:uuid:0e7866c2-1f4c-47f4-959e-8046fc573bbd>
CC-MAIN-2021-10
https://www.forbes.com/sites/moorinsights/2019/09/16/quantum-computer-battle-royale-upstart-ions-versus-old-guard-superconductors/amp/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178383355.93/warc/CC-MAIN-20210308082315-20210308112315-00463.warc.gz
en
0.912344
1,286
3.59375
4
A novel method for "plucking" individual particles of light out of a laser pulse could lead to major breakthroughs in quantum computing, researchers say. Using a combination of supercooled atoms and cutting-edge optical technology, physicists from the Weizmann Institute of Science in Israel were able to extract a single photon from a beam of light. Individual photons are of great interest to physicists because they are governed by the laws of quantum mechanics rather than the rules of classical physics (which normally apply to light). Many scientists also see photons as a promising candidate to carry information in future quantum computing systems. [Wacky Physics: The Coolest Little Particles in Nature] "Light composed of photons is already the best carrier of information we have," said Barak Dayan, a senior scientist at the Weizmann Institute of Science, whose lab developed the new method. "But once we move into quantum technologies, we are going to have to use single photons as the carriers of information, so being able to control single photons will be crucial." In a previous study published in the journal Science in 2014, the researchers showed how the method could be used to create an all-optical router for quantum communication systems. They created a switch to send single photons down different pathways and encode them with quantum information, with the position of the switch determined by its interaction with the photons. A key benefit of quantum communication is that it is ultrasecure, because the process of measuring any quantum system generally disturbs it, the researchers said. This would normally alert the operator to any eavesdroppers, but according to Dayan, the solution they devised could be used to spy on certain systems. At present, most single-photon sources are imperfect and occasionally produce more than one photon. "One of the worries is that someone smart could make sure that, if there's one photon, their device doesn't do anything, but if there are two photons, it intercepts the spare one," Dayan said. This is known as the "photon number splitting attack," and it could be used to decode messages without the interception (of the particle) being detected. Alternatively, operators could use the approach to purify their transmissions by removing extra photons, Dayan said. Researchers have removed single photons from a beam of light before, in a process called photon subtraction that uses low-reflectivity beam splitters to divert the particles. But the method is probabilistic, meaning it is hit-or-miss whether a photon will be removed with each pulse of light. In addition, the only way to determine whether the process was a success is to use a photon detector, which absorbs the particle and means it can't be used for anything else. [The 9 Biggest Unsolved Mysteries in Physics] "In our case, there are two advantages," Dayan told Live Science. "One: In principle, it always happens — it's deterministic. Two: You're not losing the photon, just diverting it, and you can use it for other processes." The solution uses a single rubidium atom held in place by lasers that cool it to near absolute zero. (Absolute zero equates to minus 273.15 degrees Celsius, or 459.76 degrees Fahrenheit.) Coupled to this is a micro optical resonator — effectively, a 30-micron-wide sphere of glass (for perspective, an average strand of human hair is about 100 microns wide) used to confine light long enough for individual photons to interact with the atom. Light is fed into the resonator using a nanoscale fiber-optic cable. The researchers rely on a physical effect they call "single-photon Raman interaction," or SPRINT. This causes the atom to block the transmission of light until a single photon is reflected, at which point, it becomes transparent to the remaining photons. Unlike previous methods of photon subtraction, the SPRINT effect, by its very nature, always removes a single photon from an incoming beam, the scientists said. And though the researchers currently send the extracted photons toward a detector to confirm their findings, the particles of light could be diverted elsewhere, they added. But Dayan is keen to stress that, for now, his team's work is designed to demonstrate the SPRINT effect, rather than to build a practical quantum communication device. "The realization is very complex — there's a reason no one has done this before," he said. "It combines several technologies, and that combination is very challenging. That's why it has taken us years to build this lab and this experimental setup." The use of supercooled atoms is beyond the scope of commercial systems, but Dayan said researchers are working on a number of technologies designed to mimic the unique properties of atoms, including quantum dots, which are tiny semiconductors that exhibit interesting quantum effects, such as being able to absorb light from one wavelength and convert it to highly saturated light at a different wavelength. "Once one of these technologies matures, that effect we have demonstrated will be applicable there as well," Dayan said. The new study was published online Nov. 23 in the journal Nature Photonics.
<urn:uuid:6e6c2d37-8b2d-45d5-971d-b7c608f28dc9>
CC-MAIN-2021-10
https://www.livescience.com/53087-extracting-photons-advances-quantum-computing.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178362481.49/warc/CC-MAIN-20210301090526-20210301120526-00024.warc.gz
en
0.955535
1,063
3.6875
4
Microwave photonics circuit elements will need to be similar to their RF analogs to provide the desired functionality. One of these analogous circuit elements is a terahertz microwave cavity resonator, which can be integrated onto an IC with standard CMOS processes. This is one of many circuit elements that can be placed on an IC and used to enable unique applications. These fibers will soon be integrated into semiconductor wafers as microwave lines to communicate with unique circuit elements like terahertz microcavity resonators. Microwave components have a lot more going on than what ends up in your microwave oven. Terahertz wave sources, detectors, and components have yet to be miniaturized, and the terahertz portion of the microwave spectrum is still largely unexplored. So far, the best we can do is get into the high GHz (low THz) region for oscillation, detection, and wave manipulation. This region is critical for many applications, including quantum computing, imaging, sensing, and ultra-fast communication. One fundamental set of components is terahertz microcavity resonators. These components are part of a larger photonics platform and they play analogous roles to RF resonators on a PCB. The simple geometry of these resonators also allows them to be placed on a chip alongside other photonic structures. If you’re a budding photonics engineer, keep reading to learn more about these resonator structures and how they might play a role in current and upcoming photonics systems. What Are Terahertz Microcavity Resonators? Much like any other resonator, terahertz microcavity resonators have a fundamental frequency that lies in the terahertz region. In terms of wavelength, a 1 THz wave in air has a wavelength of only 300 microns, which is quite large compared to today’s transistors. These structures provide the same function as well; they allow a wave matching the fundamental frequency or one of its harmonics to excite a high-Q resonance, whereby a standing wave can form in the cavity. Much like a wave on a string or in a waveguide, this standing wave at one of the eigenfrequencies will have very high intensity due to constructive interference inside the cavity. The very strong, very coherent electromagnetic wave in this structure can then be used for some other application. The challenges in working with these structures are wave generation and detection, both of which need to be solved for terahertz microcavity resonators to be useful at the chip level. Geometry and Eigenfrequencies The image below shows a simple rectangular terahertz microcavity resonator and its discrete eigenfrequency spectrum. The eigenfrequencies can be tuned to desired values by adjusting the geometry, just like any other resonator. The equation below applies to a closed rectangular cavity and provides a good first approximation for a slightly lossy cavity (i.e., with high dielectric constant contrast at the edge). Rectangular terahertz microcavity resonator geometry and eigenfrequencies. Although a rectangular geometry is shown above, more complex structures may be used for different applications. In a different structure (e.g., circular, hemispherical, or cylindrical) with an open edge, the eigenfrequencies may not obey such a simple equation. Instead, they may be determined from a dispersion relation that is a transcendental equation, which requires a numerical technique to extract specific frequencies. This is a well-known procedure for solving Sturm-Liouville problems in waveguides and resonators. If you have a much more complex structure that can’t be approximated as a simple shape, the various eigenfrequencies and the spatial distribution of the electromagnetic field can be determined using a 3D field solver (FDFD technique). A field solver you would normally use for IC packages can also be used for modeling terahertz microcavity resonators. Applications for terahertz microcavity resonators are still being researched, as are the device architectures required for different applications. Some proposed applications of terahertz microcavity resonators include: Sensing and imaging: High-Q terahertz microcavity resonators can be used for highly coherent imaging and sensing, with applications in molecular detection and biological imaging. Silicon photonics: While this application area is normally discussed in terms of SMF or MMF wavelengths, devices in this area can also operate at THz frequencies and will need terahertz microcavity resonators to act as filters and amplifiers. Communication: Currently, the world record for the highest data rate transmission belongs to an experimental wireless system operating at THz frequencies. Miniaturizing these systems at the chip level will require microcavity structures, including terahertz microcavity resonators. The important advancement provided by these structures is that they can occur on an integrated circuit. Today, these applications still involve large optical systems where an infrared mode comb in a femtosecond soliton laser is used to generate a terahertz wave through interference. Similarly, large systems are also used for the detection and manipulation of terahertz waves. Terahertz microcavity resonators are one class of components that can provide high-Q or low-Q reception of THz frequencies, which can then be passed to a detector element or other photonic circuit. The range of useful materials for building terahertz microcavity resonators, or for building coupling structures, is also an open research question. Some material platforms used for terahertz microcavity resonators include: Silicon: This material is the most promising for the fabrication of terahertz devices and their integration alongside other electronic circuits. GaAs, other III-V’s, and II-VI’s: This promising set of photonic materials has already shown interesting results at ~3 THz frequencies, particularly for the generation of laser light. This material platform is promising for photonics in general. Photonic crystals: Periodic nanostructures that are fabricated through chemical deposition methods provide a tunable platform for fabricating a range of terahertz devices, including terahertz microcavity resonators. Dielectrics: This broad range of materials includes oxides, salts, polymers, and other materials that can support transmission or absorption in various THz frequency ranges. For integration, the best set of materials should bond to the industry’s current range of semiconductors. Microcavity resonator materials should be chosen to integrate into existing semiconductor materials platforms and manufacturing processes. As your technology and designs push into more advanced spaces with the years to come, more advanced software that can navigate the nuances and challenges of THz components will be necessary. Be sure to prepare adequately as you stay ahead of the frequency curve.
<urn:uuid:e096ec4d-5147-4061-bbe6-4385f44d526d>
CC-MAIN-2021-10
https://resources.system-analysis.cadence.com/view-all/2020-todays-and-tomorrows-terahertz-microcavity-resonators
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178361723.15/warc/CC-MAIN-20210228175250-20210228205250-00224.warc.gz
en
0.889878
1,468
3.78125
4
BASIC KNOWLEDGE - SEMICONDUCTORS The workings and applications of semiconductors Thanks to semiconductors, the world is a safer, smarter and more convenient place. But what are they made of, what do they do, and where are they found? Without semiconductors, our world would look much more like it did in the late 1950s or early 1960s; we would have no electronic hand calculators, microwave ovens, digital alarm clocks, cellphones, tablets, personal computers, electronically controlled transmissions or washing machines. What are semiconductors and what do they do? Semiconductors are the backbone of the information technology and modern electronics industries—and therefore of our society as we know it. Without them, the vast majority of the electronic devices prevalent today would not exist. Despite its status as an essential building block of any electronic device, a semiconductor’s purpose is relatively simple: to allow an amplified current to move along a circuit board, which in turn enables the elements on the circuit board to be powered. Semiconductors are typically made of an insulating material such as silicon, and often have atom-sized impurities mixed in during the production process (known as “doping”) to influence their conductivity depending on the application. A brief history of semiconductor technology Semiconductors revolutionized the electronics industry back when the first transistor was developed in the 1940s, as they enabled signals to be amplified to such an extent that they powered an electrical circuit. Scientists soon discovered that semiconductors could be reduced in size, paving the way for the development of computer processors, memory chips, integrated circuits and systems on a chip (SoC). While these devices have gradually became more complex, rugged, efficient and reliable, it’s their reduction in size above all (to a matter of nanometers) that’s enabled a host of technologies to become smaller and more powerful. These technologies, in turn, have opened the door to most of the communication, transportation, entertainment, industry and medical innovations that have helped to shape society over the past 70 years. Types, groups and classifications The majority of semiconductor materials are inorganic, and can be divided into two basic groups: intrinsic, where purity is retained, and extrinsic, which are “doped” with impurities to affect the material’s conductivity. They can also be divided by type, namely N-type and P-type. In semiconductors, electrons move across the substrate to holes as part of the process of electrical conductance. N-type semiconductors are made by doping the material with an electron donor element, meaning there are more electrons than holes. In this case, the electrons are known as the majority carriers and the holes as the minority carriers. In P-type semiconductors, the holes are the majority carriers, while the electrons are the minority carriers. With the advent of the metal-oxide-semiconductor process in the late 1950s, which enabled semiconductors to be miniaturized for the first time, silicon became the most commonly used element in their production. This is due to its ease of production and strong electrical and mechanical characteristics. Other semiconductor materials include: gallium arsenide, which is used in radio-frequency modules and is difficult to produce; germanium, which was used in early transistor technology (along with lead sulfide); silicon carbide; gallium phosphide; and cadmium sulfide. One semiconductor material that’s gaining ground in the field of electronics is gallium nitride (GaN). Hailed as the silicon of the future, gallium nitride semiconductors are highly temperature resistant, conduct more current, improve power density and are more efficient overall. The material has found major support within the aerospace industry, and is now increasingly being used in household appliances and road vehicles. A constant companion in everyday life Once reserved for televisions and radios, semiconductors are now unavoidable in day-to-day life. From making toast in the morning to switching on a light, checking the weather or reading an e-book, even the most banal activities are made possible thanks to semiconductors. They’re the reason why smartphones are more powerful than the supercomputers of 20 years ago, why cars will soon be able to drive themselves, and why it’s possible to communicate with people instantly all over the world. Semiconductors are as critical to modern life as air or water—and with artificial intelligence, quantum computing and advanced wireless networks on the horizon, their importance won’t be diminishing any time soon. It’s a little known fact that Britney Spears is an expert in semiconductor physics. (Yes, you read that right.) Britney Spears knows the ins and outs of the vital laser components that have made it possible to hear her super music in a digital format.
<urn:uuid:d9b33337-2bf9-457f-8c71-ef78cf766840>
CC-MAIN-2021-10
https://www.power-and-beyond.com/the-workings-and-applications-of-semiconductors-a-909344/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178358798.23/warc/CC-MAIN-20210227084805-20210227114805-00264.warc.gz
en
0.944238
1,024
3.546875
4