Research on quantum networking is well under way. In April 2012, Gerhard Rempe and other researchers at the Max Planck Institute of Quantum Optics in Germany announced their first working quantum network to the world. Then, just this year, Wolfgang Tittel and his researchers at the University of Calgary transported a light particle’s properties through six kilometres of cable.

Scientists know how to transmit quantum data through fibre optics or similar free-space physical transmission. In a quantum transmission, photons are altered across a long link of highly sensitive atoms. In a fibre optic cable, that transmission is sent through tiny glass fibres via light emissions. Free-space connections also carry a quantum signal via light emissions, but without the glass fibre, so a clear line of sight must exist between the starting point and the destination of the signal. That means we can transmit data even quicker than through fibre optics, but it’s trickier to control. Of course, there is already plenty of fibre optic cable in developed parts of the world, but the usual usage, as of 2016, is for the binary digital signals that we all know and love.

Quantum data is intriguing, even mind-blowing, because while a binary bit can only contain a 1 or a 0, a quantum bit (qubit) can be in a superposition of both. Qubits are elusive, and their physical properties add a bizarre new dimension to computer science.

Binary data has been around since before ENIAC was introduced in 1946, and look where we’ve taken it in over seventy years! We used to require a roomful of machinery for simple arithmetic; now we can transfer minutes of audio and 1080p video from one end of the world to the other within seconds, from our 100-gram pocket-sized devices. That’s all binary data, and we still haven’t completely explored its potential! As of now, we can only transmit a very simple piece of data, such as a light particle’s information, in the quantum way.
Broadcast transmission, which much of the internet relies on, is so far impossible with quantum signals. There must be a single point A and a single point B unless we find some way around that. Merely looking at a qubit changes its data; a look is a photonic alteration in and of itself! Imagine designing firewalls and network monitoring tools for that… And the existing block and stream ciphers we use for the encryption of binary data absolutely won’t work with qubits either.

So you’d think that the eventual implementation of quantum networks will pose challenges to information security like we’ve never seen before. It will, indeed. But because merely looking at a photon, or changing its direction in any way, will change its data, man-in-the-middle (MITM) attacks will be yesterday’s news. Well, at least as we know them. (Never say never!)

Here’s how MITM attacks usually work:

1. A client machine initiates a transmission to a server on the internet, and the attacker gets between the legitimate client-to-server transmission.
2. The client requests a cryptographic key with the server as the intended recipient. The man-in-the-middle attacker simply passes that request through to the server.
3. The server sends a key to the client but, unbeknownst to the client and the server, the attacker makes a copy of that key before it reaches the client.
4. Because a cryptographic key has been received by the client, the client and the server think they have a secure, encrypted connection, such as over HTTPS, while the person using the client machine is doing their online banking.
5. Because the attacker can now decrypt with that key, they have access to all of the supposedly secure and highly sensitive financial data being transmitted in that session.

Those usual sorts of MITM attacks won’t work with quantum networks and qubits, for merely looking at the photon alters it, and the client, the server, or both will be aware of that alteration.
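The classical interception described above can be sketched as a toy simulation. The names and the "key exchange" here are hypothetical simplifications for illustration, not a real TLS handshake or any actual protocol.

```python
# Toy sketch of classical MITM key interception.
# All names are illustrative, not a real cryptographic protocol.

def relay_key(server_key, attacker_log):
    """The attacker forwards the server's key to the client unchanged,
    but silently keeps a copy for decrypting the session later."""
    attacker_log.append(server_key)  # the copy neither side can detect
    return server_key                # the client receives the key as normal

attacker_log = []
client_key = relay_key("SESSION-KEY-1234", attacker_log)

# Both sides proceed as if the channel were secure...
print(client_key == "SESSION-KEY-1234")  # True
# ...while the attacker holds an identical copy of the key:
print(attacker_log[0] == client_key)     # True
```

With a qubit-based exchange, the copying step itself would disturb the photons' states, which is exactly the alteration the client or server would detect.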
It’s the photon itself that contains the qubit.

Large swaths of Canada and the United States could already be covered in fibre optic cable if it weren’t for the avarice and corporate collusion of certain tier one ISPs. Much of the developed world, including parts of Canada and the US, does have fibre optic networks already, but we’d have so much more if it weren’t for corporate greed. Even binary data sent over fibre optics is more secure and much faster than binary data sent over coaxial cable. I can only imagine the corporate resistance to free-space links later on! It’s these sorts of factors that will impede the implementation of quantum networking technology for the service of ordinary people.

Quantum cryptography offers tremendous potential for information security as well. Artur Ekert of the National University of Singapore demonstrated some of this potential at the American Association for the Advancement of Science back in 2012. Just as MITM attacks seem impossible in the transmission of qubits, so does attempting to interfere with the transmission of a quantum cryptographic key. Just looking at the photon changes it, so the targeted parties will become aware of the attack.

This is the curious world of transmitting information through the tiniest possible things: quantum things, photons. Securing that information, quantum information security, is a whole new world of advantages, complexity, and challenges.

About the Author: Kim Crawley spent years working in general tier two consumer tech support, most of it as a representative of Windstream, a secondary American ISP. Malware-related tickets intrigued her, and her knowledge grew from fixing malware problems on thousands of client PCs. Her curiosity led her to research malware as a hobby, which grew into an interest in all things information security related. By 2011, she was already ghostwriting study material for the InfoSec Institute’s CISSP and CEH certification exam preparation programs.
Ever since, she’s contributed articles on a variety of information security topics to CIO, CSO, Computerworld, SC Magazine, and 2600 Magazine. Her first solo-developed PC game, Hackers Versus Banksters, had a successful Kickstarter and was featured at the Toronto Comic Arts Festival in May 2016. This October, she gave her first talk at an infosec convention, a penetration testing presentation at BSides Toronto. She considers her sociological and psychological perspective on infosec to be her trademark. Given the rapid growth of social engineering vulnerabilities, always considering the human element is vital.

Editor’s Note: The opinions expressed in this guest author article are solely those of the contributor, and do not necessarily reflect those of Tripwire, Inc.
At any second of our lives, some 420 billion solar neutrinos pass through every square inch of Earth’s surface, including our bodies, our pets, our homes, our cars, all of our proud possessions. Don’t be alarmed: this has been happening since the birth of our Sun, for over four and a half billion years. There were solar neutrinos long before we made our appearance on Earth.

Neutrinos were long unseen, only theorized, first by Wolfgang Pauli in 1930 to explain the process of beta decay, and later understood to be produced in fusion, the source of a star’s energy. Now physicists have used the underground Borexino detector in Italy to find the first solar neutrinos formed in the fusion process responsible for most of our Sun’s energy.

Deep in the core of the Sun, roiling heat and pressure cause pairs of protons to fuse, forming heavier atoms and releasing particles called neutrinos in the process. Neutrinos almost never interact with regular particles; they fly straight through the empty space between the atoms in our bodies and all other normal matter. Occasionally, given the right environment, a neutrino will collide with an atom and knock an electron loose.

The Borexino instrument, in a laboratory 1.4 kilometres beneath the Italian Apennine Mountains, is a metal sphere shielded by a large tank containing 1,000 tonnes of water to prevent neutrons and gamma rays from entering. The 2,000 photomultiplier tubes lining the walls of the sphere are intended to pick up the signals of proton-proton (p-p) neutrinos, which are produced in 99% of the Sun’s fusion reactions.

What drives fusion in the Sun, the process that showers us with these phantom particles, ghosts that pass through our bodies? Heat and light from the Sun are the result of the fusion happening within the inner 0.2 solar radii of the million-mile-diameter sphere we call the Sun.
Inside this zone, the pressure is millions of times greater than at the surface of the Earth, and the temperature exceeds 15 million Kelvin. On the Fahrenheit scale, that’s about 26,999,540.3 degrees! The temperature and pressure are enough to keep some 8.4 × 10^56 (almost an octodecillion) hydrogen atoms bumping against one another and fusing for around another 4–5 billion years before they are spent. So the life insurance for your line of descendants will last for quite a while! That’s because our star is not Betelgeuse, which could explode at any time, though we wouldn’t know it for 640 years; that means it could have happened 639 years, 364 days ago and we won’t know until tomorrow.

But more about neutrinos: why do we care? We know about things that physically affect us; the rest is in the realm of theory or superstition, depending on your society’s level of sophistication and your thirst for knowledge. If your curiosity is curbed by a personal agenda or by torpor, then the progress of you and your similar-minded network will be stymied, and if other networks are not similarly stymied (namely, the rest of the world), then you and your network are left hopelessly behind, mired in age-old beliefs.

Science can often reveal truths that dispel the myths, superstitions, and fears humankind has dealt with for hundreds of thousands of years, and even more recent mysteries that put science in the realm of science fiction. But no one could see, hear, or feel neutrinos, so no mythology needed to be built around them; we only sensed the visible light of the Sun, whose fusion process produced the neutrinos. We dismiss ideas like ghosts, for example, because we cannot see them. We can draw a parallel with neutrinos, which don’t interact with ordinary particles. Now we can see their direct evidence, thanks to science and hard-headed, curious scientists. (I don’t know about ghosts and photomultiplier tubes; it’s just an analogy.)
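The Fahrenheit figure quoted above can be checked with a quick conversion, Kelvin to Celsius and then Celsius to Fahrenheit:

```python
# Convert the Sun's core temperature from Kelvin to Fahrenheit.
kelvin = 15_000_000
celsius = kelvin - 273.15          # Kelvin and Celsius differ by a fixed offset
fahrenheit = celsius * 9 / 5 + 32  # standard Celsius-to-Fahrenheit formula

print(round(fahrenheit, 1))  # 26999540.3, matching the figure in the text
```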
When there is near-unanimity among scientists in accepting anthropogenic climate change, then the outliers must have an agenda, that intellectual torpor, or know something we don’t know. Would you bet on the latter? Progress depends on curiosity, unbiased thinking, and an open mind: the willingness to seek knowledge. Neutrinos and non-visible wavelengths are examples of things we cannot see without the enhanced tools science has built for that clearer vision.

What other mysteries can we solve by having an open mind? Have you seen a wormhole in space that enables us to travel from galaxy to galaxy (the movie Interstellar [which should be Intergalactic] deploys them)? Our theories don’t allow us to go faster than the speed of light, so science fiction resorts to wormholes and warp speed (warping space). But our theories are based on the science of a Type 0 civilization (on the Kardashev scale). Does quantum entanglement change this? That’s the point. Do we only believe, or even consider, what can be sensed and conditioned by the body of knowledge of a Type 0 civilization, or, worse yet, by an agenda?

Jim Hoover is a recently retired systems engineer. He has advanced degrees in Economics and English. Prior to his aerospace career, he taught in high schools, and he has also served as an adjunct college instructor. He recently published a science fiction novel called Extraordinary Visitors and writes political and science columns on several websites.
Researchers at the National Institute of Standards and Technology (NIST) and Wavsens LLC have developed a method for using radio signals to create real-time images and videos of hidden and moving objects, which could help firefighters find escape routes or victims inside buildings filled with fire and smoke. The technique could also help track hypersonic objects such as missiles and space debris.

The new method, described in Nature Communications, could provide critical information to help reduce deaths and injuries. Locating and tracking first responders indoors is a prime goal for the public safety community, and hundreds of thousands of pieces of orbiting space junk are considered dangerous to humans and spacecraft.

“Our system allows real-time imaging around corners and through walls and tracking of fast-moving objects such as millimeter-sized space debris flying at 10 kilometers per second, more than 20,000 miles per hour, all from standoff distances,” said physicist Fabio da Silva, who led the development of the system while working at NIST.

This demonstration of the m-Widar (micro-Wave image detection, analysis and ranging) system shows, in the video on the left, a person walking and later crouching and lying down in an anechoic chamber. The transmitters and receiver are in a vertical line on the right side of the chamber. The second video on the right shows the instrument’s view of the same scene. About 21 seconds into the video, a wallboard is inserted between the person and the instrument in the anechoic chamber, to show that m-Widar can “see” through walls. Credit: NIST

“Because we use radio signals, they go through almost everything, like concrete, drywall, wood, and glass,” da Silva added. “It’s pretty cool because not only can we look behind walls, but it takes only a few microseconds of data to make an image frame.
The sampling happens at the speed of light, as fast as physically possible.”

The NIST imaging method is a variation on radar, which sends an electromagnetic pulse, waits for the reflections, and measures the round-trip time to determine the distance to a target. Multisite radar usually has one transmitter and several receivers that pick up echoes and triangulate them to locate an object.

“We exploited the multisite radar concept but in our case use lots of transmitters and one receiver,” da Silva said. “That way, anything that reflects anywhere in space, we are able to locate and image.”

Da Silva explains the imaging process like this: to image a building, the actual volume of interest is much smaller than the volume of the building itself, because it’s mostly empty space with sparse stuff in it. To locate a person, you would divide the building into a matrix of cubes. Ordinarily, you would transmit radio signals to each cube individually and analyze the reflections, which is very time-consuming. By contrast, the NIST method probes all cubes at the same time and uses the return echo from, say, 10 out of 100 cubes to calculate where the person is. All transmissions return an image, with the signals forming a pattern and the empty cubes dropping out.

Da Silva has applied for a patent, and he recently left NIST to commercialize the system under the name m-Widar (microwave image detection, analysis and ranging) through a startup company, Wavsens LLC (Westminster, Colorado).

The NIST team demonstrated the technique in an anechoic (non-echoing) chamber, making images of a 3D scene involving a person moving behind drywall. The transmitter power was equivalent to 12 cellphones sending signals simultaneously to create images of the target from a distance of about 10 meters (30 feet) through the wallboard. Da Silva said the current system has a potential range of up to several kilometers.
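The ranging principle underlying radar, distance recovered from round-trip echo time, reduces to a one-line formula. The echo time below is an illustrative number chosen to land near the demo's roughly 10-meter standoff distance, not a figure from the paper:

```python
# Radar ranging: a pulse travels to the target and back,
# so the one-way distance is half the round trip at light speed.
C = 299_792_458  # speed of light in m/s

def range_from_echo(round_trip_seconds):
    return C * round_trip_seconds / 2

# An echo arriving ~67 nanoseconds after transmission puts the
# target at roughly 10 m.
print(round(range_from_echo(67e-9), 1))  # 10.0
```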
With some improvements the range could be much farther, limited only by transmitter power and receiver sensitivity, he said.

The basic technique is a form of computational imaging known as transient rendering, which has been around as an image reconstruction tool since 2008. The idea is to use a small sample of signal measurements to reconstruct images based on random patterns and correlations. The technique has previously been used in communications coding, network management, machine learning, and some advanced forms of imaging. Da Silva combined signal processing and modeling techniques from other fields to create a new mathematical formula for reconstructing images.

Each transmitter emits a different pulse pattern simultaneously, in a specific type of random sequence; the pulses interfere in space and time with those from the other transmitters and produce enough information to build an image. The transmitting antennas operated at frequencies from 200 megahertz to 10 gigahertz, roughly the upper half of the radio spectrum, which includes microwaves. The receiver consisted of two antennas connected to a signal digitizer. The digitized data were transferred to a laptop computer and uploaded to a graphics processing unit to reconstruct the images.

The NIST team used the method to reconstruct a scene at 1.5 billion samples per second, corresponding to an image frame rate of 366 kilohertz (366,000 frames per second). By comparison, this is about 100 to 1,000 times more frames per second than a cellphone video camera. With 12 antennas, the NIST system generated 4,096-pixel images with a resolution of about 10 centimeters across a 10-meter scene. This coarse resolution can be useful when sensitivity or privacy is a concern. However, the resolution could be improved by upgrading the system using existing technology, including more transmitting antennas and faster random signal generators and digitizers.
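The quoted throughput figures are internally consistent, as a rough cross-check shows: dividing the sample rate by the frame rate gives the number of samples available per frame, which lands close to the 4,096-pixel image size.

```python
# Cross-check of the throughput figures reported above.
samples_per_second = 1.5e9   # 1.5 billion samples per second
frames_per_second = 366e3    # 366 kHz frame rate

samples_per_frame = samples_per_second / frames_per_second
print(round(samples_per_frame))  # ~4098, in line with 4096-pixel frames
```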
In the future, the images could be improved by using quantum entanglement, in which the properties of individual radio signals become interlinked; entanglement can improve sensitivity, and radio-frequency quantum illumination schemes could increase reception sensitivity. The new imaging technique could also be adapted to transmit visible light instead of radio signals (ultrafast lasers could boost image resolution but would lose the capability to penetrate walls) or sound waves used for sonar and ultrasound imaging applications.

In addition to imaging emergency conditions and space debris, the new method might also be used to measure the velocity of shock waves, a key metric for evaluating explosives, and to monitor vital signs such as heart rate and respiration, da Silva said.

Reference: “Continuous Capture Microwave Imaging” by Fabio C. S. da Silva, Anthony B. Kos, Grace E. Antonucci, Jason B. Coder, Craig W. Nelson and Archita Hati, 25 June 2021, Nature Communications.

This work was funded in part by the Public Safety Trust Fund, which provides funding to organizations across NIST leveraging NIST expertise in communications, cybersecurity, manufacturing and sensors for research on critical, lifesaving technologies for first responders.
Physicists set a new record by linking together a hot soup of 15 trillion atoms in the bizarre phenomenon called quantum entanglement. The finding could be a major breakthrough for creating more accurate sensors to detect ripples in space-time called gravitational waves, or even the elusive dark matter thought to pervade the universe.

Entanglement, the quantum phenomenon Albert Einstein famously described as "spooky action at a distance," is a process in which two or more particles become linked, and any action performed on one instantaneously affects the others, regardless of how far apart they are. Entanglement lies at the heart of many emerging technologies, such as quantum computing and cryptography.

Entangled states are infamously fragile: their quantum links can be easily broken by the slightest internal vibration or interference from the outside world. For this reason, scientists attempt to reach the coldest possible temperatures in experiments to entangle jittery atoms; the lower the temperature, the less likely atoms are to bounce into each other and break their coherence. For the new study, researchers at the Institute of Photonic Sciences (ICFO) in Barcelona, Spain, took the opposite approach, heating atoms to millions of times hotter than a typical quantum experiment to see if entanglement could persist in a hot and chaotic environment.

"Entanglement is one of the most remarkable quantum technologies, but it is famously fragile," said Jia Kong, a visiting scientist at ICFO and lead author of the study. "Most entanglement-related quantum technology has to be applied in a low-temperature environment, such as a cold atomic system. This limits the application of entanglement states. [Whether or not] entanglement can survive in a hot and messy environment is an interesting question."
Things get hot and messy

The researchers heated a small glass tube filled with vaporized rubidium and inert nitrogen gas to 350 degrees Fahrenheit (177 degrees Celsius), coincidentally the perfect temperature to bake cookies. At this temperature, the hot cloud of rubidium atoms is in a state of chaos, with thousands of atomic collisions taking place every second. Like billiard balls, the atoms bounce off each other, transferring their energy and spin.

But unlike classical billiards, this spin does not represent the physical motion of the atoms. In quantum mechanics, spin is a fundamental property of particles, just like mass or electric charge, that gives particles an intrinsic angular momentum. In many ways, the spin of a particle is analogous to a spinning planet, having both angular momentum and a weak magnetic field, called a magnetic moment. But in the wacky world of quantum mechanics, classical analogies fall apart. The very notion that particles like protons or electrons are rotating solid objects with size and shape doesn't fit the quantum worldview, and when scientists try to measure a particle's spin, they get one of two answers: up or down. There are no in-betweens in quantum mechanics.

Fortunately, the tiny magnetic fields created by a particle's spin allow scientists to measure spin in a number of unique ways. One of them involves polarized light: electromagnetic waves that oscillate in a single direction. The researchers shot a beam of polarized light at the tube of rubidium atoms. Because the atoms' spins act like tiny magnets, the polarization of the light rotates as it passes through the gas and interacts with its magnetic field. This light-atom interaction creates large-scale entanglement between the atoms and the gas.
When the researchers measure the rotation of the light waves that emerge from the other side of the glass tube, they can determine the total spin of the gas of atoms; the measurement transfers the correlation onto the atoms and leaves them in an entangled state.

"The [measurement] we used is based on light-atom interaction," Kong said. "With proper conditions, the interaction will produce correlation between light and atoms, and then if we do correct detection, the correlation will be transferred into atoms, therefore creating entanglement between atoms. The surprising thing is that these random collisions didn't destroy entanglement."

In fact, the "hot and messy" environment inside the glass tube was key to the experiment's success. The atoms were in what physicists call a macroscopic spin singlet state, a collection of entangled pairs of particles whose total spin sums to zero. The initially entangled atoms pass their entanglement to each other via collisions in a game of quantum tag, exchanging their spins but keeping the total spin at zero, allowing the collective entangled state to persist for at least a millisecond. For instance, particle A is entangled with particle B, but when particle B hits particle C, it links both particles with particle C, and so on.

This "means that 1,000 times per second, a new batch of 15 trillion atoms is being entangled," Kong said in a statement. One millisecond "is a very long time for the atoms, long enough for about 50 random collisions to occur. This clearly shows that the entanglement is not destroyed by these random events. This is maybe the most surprising result of the work."

Because the scientists can only characterize the collective state of the entangled atoms, the application of their research is limited to special uses. Technologies like quantum computers are likely out of the question, since the state of individually entangled particles needs to be known to store and send information.
However, their results may help to develop ultra-sensitive magnetic field detectors, capable of measuring fields more than 10 billion times weaker than Earth's magnetic field. Such powerful magnetometers have applications in many fields of science. In neuroscience, for example, magnetoencephalography is used to take images of the brain by detecting the ultra-faint magnetic signals given off by brain activity.

"We hope that this kind of giant entangled state will lead to better sensor performance in applications ranging from brain imaging, to self-driving cars, to searches for dark matter," Morgan Mitchell, a professor of physics and the lab's group leader, said in the statement.

Their results were published online May 15 in the journal Nature Communications. Originally published on Live Science.
Quantum computing is to classical computing what space travel is to the horse and cart. Comparisons have even been made to the cognitive awakening of early man. Then again, every generation believes it is in the grip of advanced and unparalleled technology. That may be true, but despite the incredible scientific breakthroughs of past centuries, we still don’t really understand how vast swathes of our world work, including our own bodies. Quantum computing has the potential to address these myriad gaps on a granular level as never before.

Some predict that this novel technology will create whole new families of drugs and diagnostic processes, new industrial and chemical processes, new ways to address climate change, new methods of logistics and transport, advanced space exploration, surveillance and monitoring… the list goes on.

Most estimates put working quantum computers within a future window of between five and 20 years. GlobalData predicts quantum supremacy (when quantum computers surpass classical computers in computational power and accuracy) will be achieved within five years, but this timeframe will deliver intermediate quantum computers that offer an advantage for specific optimisation applications rather than a full spectrum of use cases. The important milestone, however, is that the technology has left the lab and is on the cusp of commercialisation: early adoption and exploration by businesses is already under way.

What is quantum computing?

Today’s classical computers store information on binary bits: transistor states represented as either 0 or 1. Computing power scales linearly with the number of transistors, which means the main limitation of classical computing is the finite processing power that can be held on a chip. All calculations are deterministic, with the same input always producing the same output, and all processing is carried out in sequential order.
Instead of classical computing’s binary processing, quantum computing uses the properties of quantum physics: the counterintuitive behaviour of subatomic particles that produces the quantum states of superposition and entanglement. Quantum computing bits are called qubits and can represent 0 and 1 simultaneously. As qubits are added, computational power grows exponentially, not linearly.

For example, think about the problem of finding a way out of a complex maze where there are millions of possible exit routes. A classical computer using binary processing would check each escape route one after the other, in a linear manner, until it found a correct solution. A quantum computer, on the other hand, would test all possible escape routes simultaneously and come up with a solution in a fraction of the time. The theoretical limits of quantum computing are therefore vast, and its computational power is orders of magnitude greater than classical computing’s. According to IBM, if you wanted to find one item in a list of one trillion, and each item took one microsecond to check, a classical computer would take about a week to complete this task versus only a second for a quantum computer.

How will quantum computing change the business world?

According to Markets and Markets, the quantum computing market is expected to reach $1.77bn by 2026, up from $472m in 2021. This level of investment in such an unproven technology demonstrates a consensus about its potential for disruption. Mass commercial applications could transform everything from drug discovery and disease diagnostics to calculating financial risk and refining industrial processes.
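The IBM search comparison quoted earlier (one trillion items, one microsecond per check) can be sanity-checked against the square-root scaling of quantum search. The week-versus-second contrast is an order-of-magnitude claim; the raw arithmetic, assuming a Grover-style search needing on the order of sqrt(N) steps, looks like this:

```python
import math

N = 10**12   # one trillion items to search
step = 1e-6  # one microsecond per check

classical_seconds = N * step           # checking items one by one
quantum_seconds = math.sqrt(N) * step  # Grover-style search: ~sqrt(N) steps

print(round(classical_seconds / 86_400, 1))  # ~11.6 days of linear scanning
print(quantum_seconds)                       # ~1 second
```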
Potential applications include:

- Pharmaceutical industry: drug discovery, disease diagnostics and personalised medicine through gene sequencing and analysis.
- Optimisation problems: supply chain logistics, delivery fleet optimisation, mapping, traffic/air traffic control and transport systems.
- Climate change: forecasting, climate modelling and carbon capture technologies (the UK Met Office is already investing in quantum computing to help improve weather forecasting).
- Financial services: forecasting financial risk with complex financial modelling.
- Machine learning: the convergence of quantum computing and artificial intelligence (AI) has the potential to be a game changer. The ability to analyse huge quantities of data using quantum computing will provide the information needed for high-performance AI.

Where is quantum computing’s global centre of gravity?

The US and China are locked in a battle for global quantum supremacy. The US launched its National Quantum Initiative in 2019, pledging $1.2bn over five years. In 2020, the White House Office of Science and Technology Policy, together with the National Science Foundation and the Department of Energy, announced a fund of $1bn to establish 12 AI and quantum information science research centres nationwide.

Similarly, in 2016, China’s 13th five-year plan included the aspiration to become the pre-eminent global quantum computing and communication superpower. Indeed, China leads in quantum communications via satellites and long-path optical fibres, having launched the world’s first quantum satellite, Micius, in 2016. China is also building a Quantum Information Sciences National Laboratory with initial funding of $1bn.

Patent data from GlobalData demonstrates that the US and China are at the global forefront of the sector’s technology development. The UK, however, punches above its weight as a pioneer in the quantum computing sector.
The National Quantum Technology Programme (NQTP) was established in 2013 with an estimated public and private sector investment of £1bn by 2024, according to the NQTP’s 2020 strategic intent report. Promising start-ups include Cambridge Quantum Computing and Oxford Quantum Circuits, with a major sector hub evolving around Oxford University. Global quantum computing hubs of note have also developed in Australia, Canada, Germany, Japan, Russia, Singapore and South Korea. All of these hubs have the backing of policymakers and have benefited from concerted efforts by governments that recognise the need to stay abreast of this emerging technology. Public funding of quantum technologies had reached an estimated $24.4bn globally by mid-2021, according to quantum resources and careers company Qureca.

The quantum computing private sector is seeing significant growth, with deal sizes and numbers increasing. According to PitchBook, private investment in quantum computing companies reached $1.02bn by September 2021, more than the combined figure for the previous three years. Consolidation is beginning, which indicates a maturing of the sector. Deal activity demonstrates this flood of private investment, with the US at the forefront.

Quantum threats and challenges

For all the potential advantages, the technology behind quantum computers has many hurdles to clear before it is ready for market. A quantum computer is still prohibitively expensive for most companies or organisations to own. For now, exploration is taking place in the cloud, with shared services the preferred way to access the technology. Common standards are still being worked out and possible qubit architectures are still in formation, with five main quantum computing architectures in contention. These various methods are being used by start-ups and tech giants alike, and most look promising, but none has yet dominated the market.
For example, Google is leading in the area of superconducting qubits, Silicon Valley-based start-up PsiQuantum is pioneering photonic qubits and UK start-up Cambridge Quantum Computing uses trapped-ion qubits. No matter the qubit architecture, until the high error rates in quantum computing outcomes are fixed, the technology will not be widely used for real-world problems. Quantum computing companies are also struggling to attract and retain talent, and this will be a significant challenge for the sector.

The greatest risk quantum technology poses is its potential to break all current cryptography. Businesses need to become alert to the security risks that are likely to ensue with quantum supremacy. However, according to the Global Risk Institute, it will be at least ten years before such attacks are feasible. While most industry insiders believe quantum supremacy will be achieved within a decade, public perception of the risk timeline is somewhat out of step, and businesses are therefore lagging on mitigating potential risks. A survey from the Global Risk Institute assessing the quantum risk timeline found that 90% of respondents placed the quantum threat nearer the 20-year mark.

GlobalData’s prediction of five years for quantum supremacy comes with caveats. Even when the hardware and software are available, businesses will still need to know how to use quantum computing and to understand that it may not be a panacea for business problems. The analyst says the technology will be exceedingly useful, but initially for a narrow and well-defined set of problems. For now, IT departments should keep abreast of developments, but it is other parts of the company that will need to be prepared for problems that are amenable to quantum computing solutions. No business needs a quantum computer just yet, but early-mover exploration is accessible in the cloud and it is time to start thinking about the possibilities.
Google has just announced that it’s achieved “quantum supremacy” by using a quantum computer it built to perform a test computation that would take even the most powerful non-quantum supercomputers thousands of years to execute. It’s an early yet significant milestone in the development of quantum computers—one which Google refers to as the field’s “hello world” moment—because it stands as a proof-of-concept of the real-world applications of this technology, which will likely one day include everything from creating unbreakable encryption to modeling quantum systems to helping AI program itself. The idea that Google researchers—who published their breakthrough in a paper in the journal Nature—are dealing with science and technology that’s just barely on the edge of humanity’s understanding isn’t hyperbolic in the least considering the complexity of quantum computers. At their core, quantum computers distinguish themselves from non-quantum computers thanks to their use of quantum bits, or qubits, of information. Unlike binary bits of information, which are made up of either zeroes or ones, qubits are “superpositions” of both zeroes and ones. This means that qubits exist in two equally valid states simultaneously. It’s difficult to grasp exactly what this means because, as Google itself notes, this kind of dual-state existence runs so counter to our normal day-to-day experiences. A big part of the reason the physical rules that govern the quantum world and quantum computing are so difficult to grasp is because we don’t have any useful references, or even metaphors, for the way the subatomic world works. (How often is your sandwich both there and not there in front of you at lunch?) But on the quantum scale, particles existing in a superposition of multiple states is the norm. 
While this feature of the quantum world is totally counterintuitive to our everyday lives, it does make the existence of qubits possible. Qubits are useful because their dual nature as both zeroes and ones means they can be used to perform much more complex calculations much faster than normal computers. In other words, quantum computers can have exponentially more computational power relative to normal computers because they have access to far more computations far more quickly.

"Excited about what quantum computing means for the future – it gives us another way to speak the language of the universe and better understand the world, not just in 1s and 0s but in all of its states: beautiful, complex, and with limitless possibility." — Sundar Pichai (@sundarpichai) October 23, 2019

For instance, if you have two bits of information, they can exist in one of four possible states (00, 01, 10, 11), whereas two qubits of information can be put into a superposition of those four states, meaning the system carries a weighting for each of the four states simultaneously. In practice, this means performing calculations much faster thanks to the availability of exponentially more information, which is exactly what's happened with this landmark quantum supremacy computation. In the case of Google's quantum computer, a 54-qubit processor named "Sycamore" has been deployed, which can simultaneously be in 2^54 possible computational states. Or—wait for it—18,014,398,509,481,984 simultaneous computational states. Yes, this means quadrillions of simultaneous computational states. The technology giant's #Sycamore quantum processor was able to perform a specific task in 200 seconds that would take the world's best supercomputers 10,000 years to complete.
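The state-counting above can be made concrete with a few lines of NumPy. This toy sketch (my own illustration, not Google's code) builds an equal superposition of two qubits and shows that the state vector carries one amplitude per basis state:

```python
import numpy as np

# A classical 2-bit register holds exactly one of four values; a 2-qubit
# register carries a complex amplitude for each basis state |00>..|11>.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)   # single-qubit equal superposition

state = np.kron(plus, plus)         # two-qubit state: 2**2 = 4 amplitudes
print(state)                        # [0.5 0.5 0.5 0.5]

probs = np.abs(state) ** 2          # Born rule: each outcome has probability 0.25
print(probs.sum())                  # amplitudes always normalise to (numerically) 1

# A 54-qubit register like Sycamore's would need 2**54 amplitudes:
print(2 ** 54)                      # 18014398509481984
```

Doubling the qubit count squares the number of amplitudes, which is why simulating even ~50 qubits strains classical supercomputers.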
Looking forward, there is an enormous amount of research to be done and an equally enormous number of breakthroughs to be made before quantum processors like Google's Sycamore become widely available. No source found while researching this article gave a concrete estimate of when quantum computing will become commonplace, but the general consensus seems to be that it will take decades. But once quantum computers like these do come online, Google says they'll be able to help with everything from solving complex climate change problems to finding cures for diseases to coming up with more efficient battery designs to "simulating the world on a molecular level." Which means this "hello world" moment is going to mean us saying hello back to worlds that we literally can't yet imagine but, with quantum computing's help, one day will. What do you think of Google's "quantum supremacy" moment? And where do you think quantum computing is going to take us in the decades to come? Let us know in the comments!
Technology Research News

Give an electron two paths to get to one location and it will usually take both. This fact of quantum physics plays a leading role in a computer architecture that could replace today's chip technology when it reaches its limits in a decade or so. According to the laws of quantum physics, electrons are waves as well as particles. Like ocean waves, where two crests meet they reinforce each other, and where a crest and a trough meet they cancel each other out. Researchers at the University of Missouri at Rolla have devised a scheme for using electron wave interference to represent the ones and zeros of digital computing.

Traditional electronic computers use combinations of transistors, which are tiny electronic switches, as the logic units that perform the binary arithmetic at the heart of digital computing. Electron wave computers would instead use networks of microscopic wire rings that form the two paths for the electron waves to follow, said Cheng-Hsiao Wu, a professor of electrical and computer engineering at the University of Missouri at Rolla. "You do not need transistors to control the flow of charge if all the devices involved are very small and at low temperature," said Wu. The researchers' proposal involves using modified forms of Aharonov-Bohm rings, which are used in basic physics research, to form the logic gates of computers. Aharonov-Bohm rings are circles of extremely thin wire, commonly made several times smaller than a red blood cell. Due to their wave nature, electrons entering an Aharonov-Bohm ring travel in both directions at once, meeting -- and reinforcing each other -- at the output. Using a magnetic field perpendicular to the ring, researchers can speed up or slow down the electron wave traveling in one side of the ring, throwing the waves in the two sides out of sync and causing them to cancel each other out when they meet at the other end.
The reinforced waves and the canceled waves could represent the ones and zeros of computing, according to the researchers. Aharonov-Bohm rings normally have an input and an output terminal. The researchers' scheme calls for making three- and four-terminal Aharonov-Bohm rings. Their work shows that three-terminal rings could be combined to form IF-THEN, XOR, OR, AND and INVERTER logic units. These logic units could, in turn, be combined to form half adders and full adders. A half adder adds two binary numbers but cannot carry; a full adder includes the carry function. A single four-terminal Aharonov-Bohm ring could also be used as a half adder, said Wu. "It replaces eight transistors for the same function." And two connected four-terminal Aharonov-Bohm rings could serve as a full adder. "This replaces about two dozen transistors in traditional microelectronic circuits," he said. In addition to the potential for making smaller, and therefore faster, computer circuits, electron wave computers could solve certain problems faster than even the fastest ordinary computer by examining all of the possible solutions to a problem at once, according to Wu. Electron wave interference could be used to make massively parallel processing computers, he said. "Millions of inputs enter a large network [of rings] simultaneously with desirable outputs when the waves arrive at the output terminals. This is similar to optical computing." Optical computers use light waves that reinforce and cancel each other out. Last year, researchers at the University of Rochester demonstrated an optical computer running a quantum search algorithm. The electron wave scheme is an idea worth trying, said Ian Walmsley, a professor of experimental physics at the University of Oxford and a professor of optics at the University of Rochester. "The nice thing about electrons is that [their] wavelengths are inherently smaller than optical wavelengths, so the whole machine can be smaller.
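The two-path interference these rings rely on can be sketched numerically. This is my own toy model, not the researchers' calculation: equal-amplitude waves travel the two arms, and the relative phase the magnetic field introduces decides whether they reinforce (logic 1) or cancel (logic 0) at the output:

```python
import numpy as np

def ring_output(phase_shift):
    """Toy Aharonov-Bohm ring: an electron wave splits over the two arms
    with equal amplitude; a perpendicular magnetic field adds a relative
    phase to one arm before the waves recombine at the output terminal."""
    upper_arm = np.exp(1j * 0.0)          # reference arm
    lower_arm = np.exp(1j * phase_shift)  # arm shifted by the enclosed flux
    amplitude = (upper_arm + lower_arm) / 2
    return abs(amplitude) ** 2            # transmitted intensity in [0, 1]

print(ring_output(0.0))    # 1.0 -- waves in sync reinforce: logic 1
print(ring_output(np.pi))  # ~0 (numerically) -- half-cycle shift cancels: logic 0
```

Sweeping the phase between 0 and π gives the continuous fringes seen in real Aharonov-Bohm experiments; the logic scheme uses only the two extremes.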
At present I see the advance as a technical one rather than a fundamental one," he added. "It's a very neat idea but... completely theoretical," said Mike Lea, a professor of physics at the University of London. "I'd be quite skeptical about claims without at least some analysis of the likely practicalities based on real experiments," he said. The researchers are working out the physics for larger networks of Aharonov-Bohm rings, said Wu. "I would like to convince experimentalists elsewhere to simply extend the original Aharonov-Bohm effect to three or four terminals. I promise nice results will come out of such a simple extension," he said. Given that today's semiconductor technology is likely to reach its limits by the year 2015, researchers and engineers should have a good idea of how to build devices smaller than 10 nanometers by then, said Wu. At that point, electron wave computing could be a contender for the next-generation computer architecture, he said. Wu's research colleague was Diwakar Ramamurthy. They published the research, "Logic Functions from Three-Terminal Quantum Resistor Networks for Electron Wave Computing," in the February 15, 2002 issue of the journal Physical Review B. The research was funded by the university.
Laser emits linked photons

Technology Research News

The way lasers work can only be explained by quantum physics, the realm of atoms and subatomic particles. Lasers stimulate already-energized atoms, causing them to emit energy in the form of photons, the particles of light. A team of researchers at the University of Oxford in England is taking the technology deeper into the bizarre regions of quantum physics with the development of a rudimentary laser that produces linked pairs of photons. The work promises to make perfectly secure communications devices more practical and to advance long-term efforts to build ultra-powerful quantum computers.

The device makes it easier to produce linked, or entangled, sets of two or even four photons. The researchers have demonstrated "laser-like operation" for entangled photons, said Antia Lamas-Linares, a graduate student at the University of Oxford. When two or more quantum particles become entangled, one or more of their properties march in lockstep. For example, two photons can have their polarizations, or electric field orientations, entangled. But when photons are entangled they exist in an unmeasurable netherworld of quantum mechanics where they are in some mixture of all possible polarizations until one of the pair is observed or otherwise comes into contact with the environment. When this happens, both photons are knocked out of entanglement and into the same definite polarization, regardless of the physical distance between them.

The usual way of producing pairs of entangled photons is shining ultraviolet laser light into a crystal, which transforms a tiny percentage of the ultraviolet photons into entangled pairs of infrared photons. The Oxford device bounces the entangled photon pairs back into the crystal while the laser is still shining on it. For each pair sent back into the crystal, four new pairs are generated.
The laser action produces more pairs of entangled photons for the same amount of power as non-lasing schemes, "and, perhaps more importantly, higher-number entangled photon states," she said. Ordinary conversion produces about 5,000 detectable photon pairs per second, said Lamas-Linares. "Our source in its current form would produce four times more pairs, and the number would grow exponentially with the number of passes." In addition, the device entangles groups of four photons. "Current sources produce about one 4-photon state per minute, while our source will amplify this by a factor of 16, making it feasible to perform experiments on them," she said. The Oxford device currently passes the light through the crystal only twice. Ordinary lasers use a reflective chamber, or cavity, to bounce light back and forth through a gas hundreds of times, each pass causing the gas atoms to emit more photons. The researchers' next step is to add a reflective cavity to their device, making it more like a true laser and multiplying further the number of entangled photons it could produce. "We are working on building a cavity system... to obtain a more conventional lasing action," said Lamas-Linares. The goal is to produce a device that can generate useful numbers of pairs of entangled photons. "Entanglements are the main resource in quantum information," said Lamas-Linares. "One of the main problems in the field currently is to produce entanglement in a controllable and reliable way." Current sources of entangled photons are not bright enough for some proposed quantum information processing experiments and a brighter source would make them possible, said Paul Kwiat, a professor of physics at the University of Illinois. A true entangled-photon laser "would be a very bright source of entanglement," he said. 
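The quoted rates can be laid out as simple arithmetic. The base figures below come straight from the article; extrapolating beyond two passes is my own hypothetical illustration of the "grows exponentially with the number of passes" claim:

```python
base_pairs_per_s = 5000   # detectable pairs/s from ordinary down-conversion (quoted)
base_quads_per_min = 1    # 4-photon states per minute from current sources (quoted)

# Per the researchers' figures, each extra pass through the crystal multiplies
# the pair rate by 4, and the 4-photon rate by 4**2 = 16.
for passes in range(1, 5):
    gain = 4 ** (passes - 1)
    pairs = base_pairs_per_s * gain
    quads = base_quads_per_min * gain ** 2
    print(f"{passes} pass(es): {pairs} pairs/s, {quads} four-photon states/min")
```

With two passes, the current device's configuration, this reproduces the article's numbers: 20,000 pairs per second and a sixteen-fold boost in four-photon states.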
The Oxford source of entangled photons could be used for quantum cryptography in five years and is currently being used as a tool by physicists to explore the fundamentals of quantum mechanics, said Lamas-Linares. "That is really our main interest," she said. Lamas-Linares' research colleagues were John C. Howell and Dik Bouwmeester of the University of Oxford. They published the research, "Stimulated Emission of Polarization-Entangled Photons," in the August 30, 2001 issue of the journal Nature. The research was funded by the UK Engineering and Physical Sciences Research Council (EPSRC), the UK Defence Evaluation and Research Agency and the European Union (EU).
Back in February 2020, scientists from the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago revealed that they had achieved quantum entanglement — in which the behavior of a pair of tiny particles becomes linked, so that their states are perfectly correlated — over a 52-mile (83.7-kilometer) quantum-loop network in the Chicago suburbs. You may be wondering what all the fuss is about, if you're not a scientist familiar with quantum mechanics — that is, the behavior of matter and energy at the smallest scale of reality, which is peculiarly different from the world we can see around us. But the researchers' feat could be an important step in the development of a new, vastly more powerful version of the internet in the next few decades. Instead of the bits that today's network uses, which can only express a value of either 0 or 1, the future quantum internet would utilize qubits of quantum information, which can take on an infinite number of values. (A qubit is the unit of information for a quantum computer; it's like a bit in an ordinary computer.) That would give the quantum internet way more bandwidth, which would make it possible to connect super-powerful quantum computers and other devices and run massive applications that simply aren't possible with the internet we have now. "A quantum internet will be the platform of a quantum ecosystem, where computers, networks, and sensors exchange information in a fundamentally new manner where sensing, communication, and computing literally work together as one entity," explains David Awschalom via email. He's a spintronics and quantum information professor in the Pritzker School of Molecular Engineering at the University of Chicago and a senior scientist at Argonne, who led the quantum-loop project.

Explaining the Quantum Internet

So why do we need this and what does it do? For starters, the quantum internet is not a replacement of the regular internet we now have.
Rather, it would be a complement to it, or a branch of it, able to take care of some of the problems that plague the current internet. For instance, a quantum internet would offer much greater protection from hackers and cybercriminals. Right now, if Alice in New York sends a message to Bob in California over the internet, that message travels in more or less a straight line from one coast to the other. Along the way, the signals that transmit the message degrade; repeaters read the signals, amplify them and correct the errors. But this process allows hackers to "break in" and intercept the message. A quantum message wouldn't have that problem. Quantum networks use particles of light (photons) to send messages, which are not vulnerable to cyberattacks in the same way. Instead of encrypting a message using mathematical complexity, says Ray Newell, a researcher at Los Alamos National Laboratory, we would rely upon the peculiar rules of quantum physics. With quantum information, "you can't copy it or cut it in half, and you can't even look at it without changing it." In fact, just trying to intercept a message destroys it, as Wired magazine noted. That would enable encryption vastly more secure than anything available today. "The easiest way to understand the concept of the quantum internet is through the concept of quantum teleportation," Sumeet Khatri, a researcher at Louisiana State University in Baton Rouge, says in an email. He and colleagues have written a paper about the feasibility of a space-based quantum internet, in which satellites would continually broadcast entangled photons down to Earth's surface, as this Technology Review article describes. "Quantum teleportation is unlike what a non-scientist's mind might conjure up in terms of what they see in sci-fi movies," Khatri says. "In quantum teleportation, two people who want to communicate share a pair of quantum particles that are entangled.
Then, through a sequence of operations, the sender can send any quantum information to the receiver (although it can't be done faster than light speed, a common misconception). This collection of shared entanglement between pairs of people all over the world essentially constitutes the quantum internet. The central research question is how best to distribute these entangled pairs to people distributed all over the world." Once it's possible to do that on a large scale, the quantum internet would be so astonishingly fast that far-flung clocks could be synchronized about a thousand times more precisely than the best atomic clocks available today, as Cosmos magazine details. That would make GPS navigation vastly more precise than it is today, and map Earth's gravitational field in such detail that scientists could spot the ripple of gravitational waves. It also could make it possible to teleport photons from distant visible-light telescopes all over Earth and link them into a giant virtual observatory. "You could potentially see planets around other stars," says Nicholas Peters, group leader of the Quantum Information Science Group at Oak Ridge National Laboratory. It also would be possible for networks of super-powerful quantum computers across the globe to work together and create incredibly complex simulations. That might enable researchers to better understand the behavior of molecules and proteins, for example, and to develop and test new medications. It also might help physicists to solve some of the longstanding mysteries of reality. "We don't have a complete picture of how the universe works," says Newell. "We have a very good understanding of how quantum mechanics works, but not a very clear picture of the implications. The picture is blurry where quantum mechanics intersects with our lived experience."
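The shared entangled pair Khatri describes can be illustrated with a toy simulation (a minimal sketch of measurement statistics, not a real network protocol): each party's measurement of a Bell pair looks like a fair coin flip on its own, yet the two results always agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2): the entangled pair the two parties share.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2            # Born rule: only |00> and |11> ever occur

# Sample 10,000 joint measurements in the computational basis.
outcomes = rng.choice([0b00, 0b11], size=10_000, p=[probs[0], probs[3]])
alice_bits = outcomes >> 1           # first qubit's result
bob_bits = outcomes & 1              # second qubit's result

print((alice_bits == bob_bits).all())       # True: the two bits always agree
print(abs(alice_bits.mean() - 0.5) < 0.05)  # True: each bit alone is a fair coin
```

Note what the sketch also shows: the agreed-upon bits are random, which is why entanglement distributes shared secret keys rather than faster-than-light messages.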
Challenges of Building the Quantum Internet

But before any of that can happen, researchers have to figure out how to build a quantum internet, and given the weirdness of quantum mechanics, that's not going to be easy. "In the classical world you can encode information and save it and it doesn't decay," Peters says. "In the quantum world, you encode information and it starts to decay almost immediately." Another problem is that because the amount of energy that corresponds to quantum information is really low, it's difficult to keep it from interacting with the outside world. Today, "in many cases, quantum systems only work at very low temperatures," Newell says. "Another alternative is to work in a vacuum and pump all the air out." In order to make a quantum internet function, Newell says, we'll need all sorts of hardware that hasn't been developed yet. So it's hard to say at this point exactly when a quantum internet would be up and running, though one Chinese scientist has envisioned that it could happen as soon as 2030.

Originally Published: Mar 30, 2020
Artificial Intelligence has been around since the 1950s. Alan Turing envisioned a machine that could think. He devised a test, aptly named the Turing Test, published in an article titled "Computing Machinery and Intelligence". He proposed the notion that a computational machine could answer a series of questions from a panel of judges, with responses that were rational, thoughtful and indistinguishable from a human's. Prior to that, Turing spent a decade creating the blueprint for "machine intelligence". John McCarthy, then a professor at Dartmouth College, coined the term Artificial Intelligence and organized an international conference dedicated to the pursuit of AI. There he met Marvin Minsky, and together they worked to advance the theories and concepts of bringing AI to life, using the LISP language as their programming language of choice. They ran into obstacles, including limited processing power, limited storage capacity, high costs, lack of funding and the underlying complexity of the concepts involved.

In the mid-1960s, an MIT professor, Joseph Weizenbaum, created a computer program named ELIZA. It simulated a virtual "doctor" and was able to interpret natural-language input with a somewhat intelligent set of responses. Although limited in functionality, it is credited as one of the first AI programs in existence.

In the mid-1980s, a technique known as backpropagation was developed. It trains a network against a known set of data. The program receives a set of input data, which flows through a series of neurons; each neuron performs a calculation, produces a number between 0 and 1 and, depending on whether a threshold is exceeded, fires a signal to connected neurons, similar to the synapses in the human brain. The data then flows through a series of "hidden" neurons, which perform similar calculations, and finally reaches the output, producing a final response.
That response is compared to the known data and, if differences appear, the error is flowed backwards through the neural network to recalculate the weights on each of the neurons. Soon, multi-layered neural networks were created to increase capacity and accuracy.

In 1984, the entire field of Artificial Intelligence slowed down. At a conference of the American Association of Artificial Intelligence, the term "AI Winter" was used to describe this stagnation in the field. Basically, the hype surrounding AI was under scrutiny by funding groups such as government bureaucrats and venture capitalists. This pessimism pushed the AI field into obscurity for some time. In the early to mid-1990s, AI was becoming known in the business world. Two of the main reasons were the increase in compute power and the focus on specific problems within specific domains. "An Intelligent Agent is a system that perceives its environment and takes actions which maximize its chances of success." In 1997, IBM's Deep Blue knocked off the world chess champion Garry Kasparov.

Strong vs. Weak Artificial Intelligence

In the pursuit of Artificial Intelligence, there are basically two camps.

Weak or Narrow Artificial Intelligence

Personal Assistants typically reside on computers and smartphones. They can learn the behavior of the individual using the application. Drawing on preferences, places of travel, history of searches and browser trails, the application is able to inform its users to alter a course of action based on given parameters in real time. These assistants are becoming more precise as they are embedded into everyday applications.

Strong or Artificial General Intelligence

The second type of AI is known as Strong AI or Artificial General Intelligence. Strong AI or AGI is the intelligence of a machine that matches or surpasses the abilities of a human being in performing tasks.
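The backpropagation loop described earlier — forward pass, comparison with known data, error flowed backwards to recalculate weights — can be sketched in a few lines of NumPy. This is an illustrative toy (a 2-4-1 network learning XOR); the learning rate, layer sizes and random seed are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Known data: the XOR truth table the network should learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output weights

for _ in range(20_000):
    hidden = sigmoid(X @ W1 + b1)    # forward pass through the hidden neurons
    out = sigmoid(hidden @ W2 + b2)  # final response, a number between 0 and 1
    err = out - y                    # compare the response with the known data
    # Backward pass: flow the error back through the network and
    # recalculate the weights on each neuron.
    g_out = err * out * (1 - out)
    g_hid = (g_out @ W2.T) * hidden * (1 - hidden)
    W2 -= hidden.T @ g_out; b2 -= g_out.sum(axis=0)
    W1 -= X.T @ g_hid;      b1 -= g_hid.sum(axis=0)

print(np.round(out).ravel())  # should approach the XOR targets [0, 1, 1, 0]
```

XOR is the classic demonstration because no single-layer network can learn it; the hidden layer, trained by the backward error flow, is essential.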
Some of its characteristics are the abilities to learn, reason, communicate in natural language, possess creativity, have morals and be self-aware. When computers become aware of themselves, they will be able to recursively build machines that are more intelligent than themselves. This will lead to an exponential increase in the pace of progress, eventually moving the intelligence of machines beyond the comprehension of human beings. This hypothetical event is commonly referred to as the Singularity.

Artificial Neural Networks

Artificial neural networks (ANNs) are about function approximation. Basically, you have a neural network that takes in input. That input is interpreted by neurons, which have weighted connections. Based on the result of its calculation, if a neuron exceeds a specified threshold, it fires a 1 or 0 as output, which is sent downstream to other neurons. It's possible to over-train a model, in which case the output has to contort itself drastically, sending the results into a tailspin so that they become nonsense. Due to the complexity of the human brain, researchers have not been able to reproduce it at this point in time. Artificial neural networks attempt to simulate that complexity with some level of accuracy. There are pre-canned packages you can purchase which do all the heavy lifting for you, exposing this technology to more people.

Artificial Quantum Neural Networks

Researchers are combining quantum physics with computers. Although this research is in the early stages, these computers are known as Quantum Turing Machines. In classical computing, everything is based on the concept of "binary", or being in the state 0 or 1. In the quantum world, the binary unit is replaced with a unit named the Quantum Bit, or qubit for short. When quantum mechanical principles are applied, the neural unit can be in more than two states: 0, 1 or both.
This has great implications, as these new systems can perform certain calculations extremely fast and can solve some problems deemed intractable in the classical binary approach. One example is the ability of quantum computers to break the public-key encryption that is the foundation of internet security today. The underlying rules that make up quantum physics are quite complex. One company leading the charge with quantum computers is D-Wave. They define quantum computation as follows:

"Rather than store information as 0s or 1s as conventional computers do, a quantum computer uses qubits – which can be a 1 or a 0 or both at the same time. This 'quantum superposition', along with the quantum effects of entanglement and quantum tunneling, enable quantum computers to consider and manipulate all combinations of bits simultaneously, making quantum computation powerful and fast."

This cutting-edge technology is making strides in problem solving and could potentially be used to advance the world of Artificial Intelligence by leaps and bounds.

Morals and Ethics

With the rise of intelligent machines, some thought needs to be spent on the ethical consequences. How will AI and robots behave amongst humans? What will determine the moral blueprints for acceptable behavior? What if an AI kills a human? Would it go to machine prison? Could it get married? Or buy insurance? Who owns it? Will machines have funerals? Can a machine be sold? What if it steals? Should they be entitled to vote? What if your machine gets stolen or abducted? Can they reproduce? If so, are they responsible for the care of their youth until they graduate from high school? Can you euthanize a robot? Do robots get paid for services rendered?

Every day we get closer to the reality of Artificial General Intelligence. At some point in the future, machines will be integrated into our society. It's up to us, now, to determine the roles, rights, duties and responsibilities assigned to our new intelligent beings.
Current AI Organizations

One organization dedicated to the pursuit of Artificial Intelligence, created by Paul Allen, one of the original founders of Microsoft, is the Allen Institute for Artificial Intelligence, with the motto: "Our mission is to contribute to humanity through high-impact AI research and engineering."

"OpenAI is a non-profit artificial intelligence research company. Our goal is to advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return."

"We're committed to advancing the field of machine intelligence and developing technologies that give people better ways to communicate. In the long term, we seek to understand intelligence and make intelligent machines."

Microsoft Research has an Artificial Intelligence (AI) Group: "The Artificial Intelligence (AI) group consists of an elite team of researchers who have strong expertise in artificial intelligence, machine learning, game theory, and information retrieval. The group is devoted to the following research directions: large-scale distributed machine learning, cloud computing, robot, game-theoretic machine learning, and deep learning techniques for text mining."

IBM has a research team going back to the 1950s, when AI was first introduced; IBM is known for its cognitive machine called Watson. Another leading tech company, Baidu, has a research facility in Silicon Valley called the Silicon Valley AI Lab.

Intelligent machines have grown since the early days of the mid-1950s. With the increases in storage capacity, compute power, accessible software, shared knowledge and technology advances over the past 60 years, we've witnessed the rise of Artificial Intelligence into smart applications called Personal Assistants. How soon until we make another leap into the world of Artificial General Intelligence, where machines can learn and interact in real time and pass the Turing test?
How soon will machines work side by side with humans to solve complex problems, reduce costs and make the world a better place? True artificial intelligence could be integrated into mainstream society sooner than we think.
Mapping quantum structures with light to unlock their capabilities Rather than installing new “2D” semiconductors in devices to see what they can do, this new method puts them through their paces with lasers and light detectors. A new tool that uses light to map out the electronic structures of crystals could reveal the capabilities of emerging quantum materials and pave the way for advanced energy technologies and quantum computers, according to researchers at the University of Michigan, the University of Regensburg and the University of Marburg. A paper on the work is published in Science. Applications include LED lights, solar cells and artificial photosynthesis. “Quantum materials could have an impact way beyond quantum computing,” said Mackillo Kira, a professor of electrical engineering and computer science at the University of Michigan, who led the theory side of the new study. “If you optimize quantum properties right, you can get 100% efficiency for light absorption.” Silicon-based solar cells are already becoming the cheapest form of electricity, although their sunlight-to-electricity conversion efficiency is rather low, about 30%. Emerging “2D” semiconductors, which consist of a single layer of crystal, could do that much better—potentially using up to 100% of the sunlight. They could also elevate quantum computing to room temperature from the near-absolute-zero machines demonstrated so far. “New quantum materials are now being discovered at a faster pace than ever,” said Rupert Huber, a professor of physics at the University of Regensburg in Germany, who led the experimental work. “By simply stacking such layers one on top of the other under variable twist angles, and with a wide selection of materials, scientists can now create artificial solids with truly unprecedented properties.” The ability to map these properties down to the atoms could help streamline the process of designing materials with the right quantum structures. 
But these ultrathin materials are much smaller and messier than earlier crystals, and the old analysis methods don't work. Now, 2D materials can be measured with the new laser-based method at room temperature and pressure. The measurable operations include processes that are key to solar cells, lasers and optically driven quantum computing. Essentially, electrons pop between a "ground state," in which they cannot travel, and states in the semiconductor's "conduction band," in which they are free to move through space. They do this by absorbing and emitting light. The quantum mapping method uses a 100 femtosecond (100 quadrillionths of a second) pulse of red laser light to pop electrons out of the ground state and into the conduction band. Next the electrons are hit with a second pulse of infrared light. This pushes them so that they oscillate up and down an energy "valley" in the conduction band, a little like skateboarders in a halfpipe. The team uses the dual wave/particle nature of electrons to create a standing wave pattern that looks like a comb. They discovered that when the peak of this electron comb overlaps with the material's band structure—its quantum structure—electrons emit light intensely. That powerful light emission, along with the narrow width of the comb lines, helped create a picture so sharp that researchers call it super-resolution. By combining that precise location information with the frequency of the light, the team was able to map out the band structure of the 2D semiconductor tungsten diselenide. Not only that, but they could also get a read on each electron's orbital angular momentum through the way the front of the light wave twisted in space.
Manipulating an electron's orbital angular momentum, also known as a pseudospin, is a promising avenue for storing and processing quantum information. In tungsten diselenide, the orbital angular momentum identifies which of two different "valleys" an electron occupies. The messages that the electrons send out can show researchers not only which valley the electron was in but also what the landscape of that valley looks like and how far apart the valleys are, which are the key elements needed to design new semiconductor-based quantum devices. For instance, when the team used the laser to push electrons up the side of one valley until they fell into the other, the electrons emitted light at that drop point too. That light gives clues about the depths of the valleys and the height of the ridge between them. With this kind of information, researchers can figure out how the material would fare for a variety of purposes. The paper is titled, "Super-resolution lightwave tomography of electronic bands in quantum materials." This research was funded by the Army Research Office, the German Research Foundation and the U-M College of Engineering Blue Sky Research Program. The Army Research Office is an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory.
Chips measure electron spin Technology Research News Practical quantum computers are at least a decade away, and some researchers are betting that they will never be built. This is because controlling individual particles like atoms, electrons and photons is extraordinarily challenging. Information carried in particles always comes in shades of gray and can be corrupted or wiped out by the slightest wisp of energy from the environment. A pair of experiments has brightened prospects for quantum computing, however, by making it more likely that a practical means of reading electron-based quantum bits, or qubits, can be developed. Research teams from the University of California at Los Angeles and from Delft University of Technology in the Netherlands have developed electronic methods of detecting the spins of individual electrons. Spin is a property of electrons that is akin to the rotation of a top. The two spin directions, spin up and spin down, are magnetically opposite, like the two poles of a kitchen magnet. The spins can represent the 1s and 0s of digital information. Particles that are isolated from their environment are in the weird quantum state of superposition, meaning they are in some mix of the two spin directions. This means a qubit can be in some mix of 1 and 0, which allows a string of qubits to represent every binary number at once. This gives a quantum computer the ability to check every possible answer to a problem with a single set of operations, promising speedy solutions to problems that classical computers have to churn through one answer at a time. These include factoring large numbers, a problem whose difficulty is the foundation of most of today's security codes. Electronic equipment has become sensitive enough that it is no longer difficult to detect the presence of a single electron. But detecting an electron's spin orientation is another matter.
In recent years, researchers have succeeded in detecting electron spin optically using specialized laser setups. The key to using electron spin in quantum computers whose architecture is similar to today's computer chips is being able to detect the spin orientation electronically. The UCLA team's method of electron spin detection uses devices that are already mass-produced. The researchers flipped a single electron spin in a commercial transistor chip, and detected the spin flip by measuring changes in current flowing through the device. Several proposed quantum computer architectures call for circuits that can be manufactured using today's chipmaking techniques. "The transistor structure used for our experiment [closely] resembles some proposed spin-based qubit architectures," said Hong-Wen Jiang, a professor of physics at the University of California at Los Angeles. "We believe that our read-out scheme can be readily adapted in a scalable quantum information processor," he said. Electrons travel through a transistor via a semiconductor channel that is electrically insulated. The transistor is controlled by a gate electrode, which produces an electric field that penetrates the insulator and increases the conductivity of the channel, allowing electrons to flow. Occasionally defects occur, producing one or more spots in the insulator that can draw individual electrons from the channel and trap them. The researchers sought out transistors that contained single defect traps, set the gate voltage so that the trap had an equal chance of attracting an electron or not, and applied a large magnetic field to the trap. A high magnetic field causes electrons in the spin-down state to have slightly more energy than spin-up electrons. The researchers flipped the electron's spin with a microwave pulse. 
An electron that is spin-up fills the trap but a higher-energy spin-down electron leaves room, electrically speaking, for a second, spin-up electron from the channel to join it in the trap. The difference between having one and having two electrons in the trap is measurable as a change in the current flowing through the transistor. Two electrons decrease the amount of current. The researchers can observe a microwave pulse flipping the spin of an electron in the trap by measuring the current. In its present form, the UCLA device uses a randomly-positioned defect as its electron trap, and electrons cycle through the trap rapidly enough that the spin measurement is an average of a few thousand electrons. The researchers are conducting similar experiments in specially designed semiconductor structures that promise greater control over electron spin, the ability to entangle two spins, and to eventually build a scalable quantum processor, said Jiang. Properties of entangled particles, including spin, remain in lockstep regardless of the distance between them. Entanglement is a basic requirement of quantum algorithms, and entangled electrons would enable information to be teleported between circuits within a quantum computer. Meanwhile, the Delft team devised a way to measure the spin of an electron trapped in a quantum dot -- a tiny semiconductor device that produces electric fields capable of confining one or a few electrons. "The technique works fully electrically, and is therefore... suitable for integration with existing solid-state technologies," said Jeroen Elzerman, a researcher at Delft University of Technology. The researchers applied a large magnetic field to the trapped electron, which caused the spin-down state to have slightly more energy than the spin-up state. They tuned the quantum dot's electric field so that the energy of a spin-down electron was just high enough for it to escape, but the energy of a spin-up electron was below the threshold. 
Therefore, if an electron is present it is spin-up, and if the quantum dot is empty, the electron that escaped is spin-down. The researchers' next step is to use pulsed microwaves to control the exact quantum superposition of the spin, said Elzerman. They then plan to entangle two spins. "When this is done, all the basic ingredients for a quantum computer are in place," he said. Coupling many spins and controlling their interactions accurately enough to perform a quantum algorithm is a matter of improving control over the fabrication process, said Elzerman. "We need cleaner and purer materials and more reproducible electron beam lithography so that all dots on a single chip are really identical," he said. Jiang's research colleagues were Ming Xiao and Eli Yablonovitch of UCLA, and Ivar Martin of Los Alamos National Laboratory. They published the research in the July 22, 2004 issue of Nature. The research was funded by the Defense Advanced Research Projects Agency (DARPA) and the Defense Microelectronics Activity (DMEA). Elzerman's research colleagues were Ronald Hanson, Laurens Willems van Beveren, Benoit Witkamp, Lieven Vandersypen and Leo Kouwenhoven. They published the research in the July 22, 2004 issue of Nature. The research was funded by DARPA, the Office of Naval Research, the European Union and the Dutch Organization for Fundamental Research on Matter (FOM).
Timeline: 10 years; 10-20 years
TRN Categories: Physics; Quantum Computing and Communications
Story Type: News
Related Elements: Technical papers, "Electrical detection of the spin resonance of a single electron in a silicon field-effect transistor," Nature, July 22, 2004; "Single-shot read-out of an individual electron spin in a quantum dot," Nature, July 22, 2004
August 11/18, 2004
D-Wave implements quantum annealing, while Google has digitized adiabatic quantum computation. D-Wave advertises their line of quantum computers as having thousands of qubits, though these systems are designed specifically for quadratic unconstrained binary optimization (QUBO). More information is available on D-Wave's manufacturing process.

It is D-Wave's claim that: "It is best suited to tackling complex optimization problems that exist across many domains such as":

- Sampling / Monte Carlo
- Pattern recognition and anomaly detection
- Software / hardware verification and validation
- Bioinformatics / cancer research

D-Wave's QPU uses quantum annealing (QA), a metaheuristic for finding the global minimum of a given objective function over a given set of candidate solutions (candidate states) by a process using quantum fluctuations. Quantum annealing is used mainly for problems where the search space is discrete (combinatorial optimization problems) with many local minima, such as finding the ground state of a spin glass.

D-Wave's architecture differs from traditional quantum computers. It is not known to be polynomially equivalent to a universal quantum computer and, in particular, cannot execute Shor's algorithm, because Shor's algorithm is not a hill-climbing process; it requires a universal quantum computer. D-Wave claims only to do quantum annealing. See also: "Experimental quantum annealing: case study involving the graph isomorphism problem" and "Defects in Quantum Computers".

Google's claim is: "The goal of the Google Quantum AI lab is to build a quantum computer that can be used to solve real-world problems. Our strategy is to explore near-term applications using systems that are forward compatible to a large-scale universal error-corrected quantum computer using linear array technology".
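The QUBO objective that D-Wave's hardware targets is easy to state concretely. The sketch below is a classical brute-force stand-in (not how an annealer works physically): it minimizes the same objective a D-Wave QPU samples low-energy states of, and the tiny Q matrix is an invented example.

```python
from itertools import product

def solve_qubo(Q):
    """Brute-force QUBO: minimize sum_ij Q[i][j] * x[i] * x[j]
    over all binary vectors x. An annealer samples low-energy
    states of this same objective, but without enumerating them."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Diagonal terms reward setting a bit; the off-diagonal term
# penalizes setting both, so the optimum sets exactly one bit.
Q = [[-1, 2],
     [0, -1]]
print(solve_qubo(Q))  # → ((0, 1), -1)
```

Brute force takes 2^n evaluations, which is exactly why hardware that explores the energy landscape in parallel is attractive for large n.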
See Google's papers "State preservation by repetitive error detection in a superconducting quantum circuit" and "Digitized adiabatic quantum computing with a superconducting circuit".

An inaccurate layperson's explanation: a graphics card has more cores than a CPU. GPUs are optimized for taking huge batches of data and performing the same operation over and over very quickly, unlike PC microprocessors, which tend to skip all over the place. Architecturally, the CPU is composed of just a few cores with lots of cache memory that can handle a few software threads at a time. In contrast, a GPU is composed of hundreds of cores that can handle thousands of threads simultaneously.

Why is Google's new device newsworthy then? Is it better than D-Wave's machine in some respects? If so, how? There are "Annealing QPUs" and "Universal QPUs", as explained above; an incomplete list is offered on Wikipedia's page "List of Quantum Processors". In quantum annealing, the strength of the transverse field determines the quantum-mechanical probability to change the amplitudes of all states in parallel. In the case of annealing a purely mathematical objective function, one may consider the variables in the problem to be classical degrees of freedom, and the cost function to be the potential energy function (classical Hamiltonian). Moreover, an annealer may be able to do this without the tight error controls needed to harness the quantum entanglement used in more traditional quantum algorithms. That makes it easier to provide more qubits, but the kinds of problems they are able to solve are more limited than those addressable by a universal QPU. In general, the ground state of a Hamiltonian can be used to encode a wider variety of problems than NP (known QMA-complete problems), and so the decision to focus on NP optimization problems has led to restrictions which prevent the device from being used for general-purpose quantum computing (even if noise were not an issue).
There is an interesting subtlety as regards noise: if you add noise to the adiabatic algorithm, it degrades gracefully into one of the best classical algorithms for the same problem. The adiabatic model can encode universal quantum computation; however, the limitations of D-Wave's implementation mean that specific machine cannot. Google's universal QPU can solve a wider range of problems than D-Wave's QPU (in its current implementation) if they can solve their decoherence problem.

In the case of Google's Bristlecone, caution is warranted. Bristlecone is a scaled-up version of a 9-qubit Google design that has failed to yield acceptable error rates for a commercially viable quantum system. In real-world settings, quantum processors must have a two-qubit error rate of less than 0.5 percent. According to Google, its best result has been a 0.6 percent error rate using its much smaller 9-qubit hardware. The commercial success of quantum computing will require more than high qubit numbers. It will depend on quality qubits with low error rates and long-lasting circuit connectivity in a system with the ability to outperform classical computers in complex problem solving, i.e., "quantum supremacy". Google will use its record number of more useful qubits to correct the error rate of those error-prone qubits. More qubits are needed to solve bigger problems, and longer-living (coherent) qubits are needed to hold the information long enough for the quantum algorithm to run. IBM describes the problem as: "Quantum Volume: preferring fewer errors per qubit over more qubits"; see also: What is the leading edge technology for creating a quantum computer with the fewest errors? Google plans to use surface codes to resolve this problem; for more info and a comparison to spin glass models see: "Quantum Computation with Topological Codes: from qubit to topological fault-tolerance".
IBM has a video titled "A Beginner's Guide to Quantum Computing" which explains quantum computing for laypersons in under 20 minutes. Microsoft intends to take the wind from everyone's sails in the months to come with the integration of Q# (Q sharp) into Visual Studio, along with information about their Majorana-fermion-based qubits and a great reduction in the error rate. See: "Majorana-based fermionic quantum computation". This will enable a system that uses less than 25% as many (better) qubits to accomplish the same amount of work as Google's qubits. The website "The Next Platform" describes the current situation as: "Quantum Computing Enters 2018 Like it's 1968".
Nowadays, the very abstract ideas underlying quantum physics are being translated into reality thanks to new technological capabilities in the fields of nanotechnology and optical interactions. One of these ideas, the idea of a quantum internet and a quantum computer, will be discussed further in this article. While the subject is very broad, we'll try to summarize the basic ideas behind these technologies.

The Quantum Internet allows quantum data (quantum bits, or qubits) to be sent from one quantum computer to another. The medium here is either a fiber optic cable or a free space connection with a clear line of sight between the starting point and the destination point of a signal. Classical computers work with conventional bits that can be either zero or one. Quantum mechanics, however, allows qubits to be in a superposition state that can be 1 and 0 at the same time. Therefore, we can encode more information in qubits than in conventional bits. The amount of information that can be stored and processed using qubits is 2^n, where n is the number of qubits. So, in a two-qubit system, we need four numbers (bits) to determine the state of the system. To define the state of a three-qubit system we need 8 numbers. If we have 300 qubits, the equivalent in classical bits of information is 2^300.

A quantum computer is a computer where the amount of information processed in parallel grows exponentially with the number of qubits. However, the improvement is not in the speed of an individual operation but rather in the total number of operations needed to reach the result. Therefore, quantum computers are not generally faster; they are faster only for specific types of calculations. We can easily grasp this concept by playing the light switch game provided by D-Wave. The game explains why a quantum computer is faster than a conventional computer in the process of finding the best combination of switches when the number of switches is large.
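The exponential growth of 2^n described above is easy to see numerically; this short sketch uses plain Python, no quantum library involved:

```python
def amplitudes(n_qubits: int) -> int:
    """A register of n qubits is described by 2**n complex amplitudes."""
    return 2 ** n_qubits

for n in (2, 3, 300):
    print(n, "qubits ->", amplitudes(n), "amplitudes")

# Even storing just one byte per amplitude for a 300-qubit state would
# exceed the estimated number of atoms in the observable universe (~10**80).
print(amplitudes(300) > 10 ** 80)  # True
```

This is exactly why simulating even a few hundred qubits classically is hopeless, while a quantum device carries that state natively.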
As stated, "The quantum computer begins with the bits in superposition (the switch can be in both ON and OFF states), ends with them behaving as regular classical bits, and finds the answer along the way". However, with only 500 switches, there is not enough time in the universe to check all the configurations when conventional processors are used.

So far, only a small number of quantum algorithms have been found. Here are some of the most famous ones:

- Shor's Algorithm (factorization)
- Grover's Algorithm (quick search in an unordered database)
- Deutsch–Jozsa Algorithm (determines whether a function is constant or balanced)

Let's review Shor's algorithm in a bit more detail. It allows us to solve either of the two mathematically equivalent problems below:

- finding the period of a complex periodic function, or
- decomposing a very large number into prime factors

The second of these tasks is of significant practical importance, since it is used in cryptography. When encrypting and decrypting secret messages (public-key encryption), large numbers are used for which the factorization is known. It is clear that such numbers are easy to obtain: it is enough to multiply two large prime numbers, and we get a very large number for which the factorization is known. The recipient of an encoded secret message can decode it because the decoding procedure uses the factorization of a long number, and he or she knows this decomposition. If a third party could factor this number into its prime factors, he or she would also be able to decode the message. However, this decomposition takes a lot of time; therefore, from a practical point of view, it is impossible to decode such messages. But if the third party had a quantum computer, then he or she could decompose long numbers into prime factors quite fast and therefore could easily decipher such messages. The common cryptography methods used today would stop working.
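The equivalence between the two problems above can be demonstrated classically. In the sketch below the period is found by brute force (the one step a quantum computer performs exponentially faster), while the rest is the ordinary classical post-processing of Shor's algorithm; the numbers 15 and 21 are toy examples.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r % n == 1: the 'period' of a**x mod n.
    Shor's quantum step finds r fast; here we brute-force it."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical post-processing: turn the period r into a factor of n.
    Works when r is even and a**(r//2) is not -1 mod n."""
    r = order(a, n)
    if r % 2:
        return None
    cand = gcd(pow(a, r // 2) - 1, n)
    return cand if 1 < cand < n else None

print(shor_classical(15, 2))  # period of 2**x mod 15 is 4 -> gcd(3, 15) = 3
```

For a cryptographically sized n the `order` loop above would run longer than the age of the universe; replacing it with quantum period finding is the entire point of the algorithm.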
This is one of the arguments that make the creation of a quantum computer important. On the other hand, quantum networking provides another secure communication benefit. Quantum Key Distribution (QKD) enables secure communication whose security relies on quantum mechanics. For instance, the spin of an electron can be used as a qubit, since it can undergo transitions between the spin-up and spin-down quantum states, represented classically by 0 and 1. In other words, qubits are based on physical properties of particles, such as electron spin or the polarization of a photon. However, if we were to measure the electron's spin, some of its properties would change. If we were to cool the electron to a temperature near absolute zero (-273 Celsius), the electron would be spin down ↓. If we wanted to write information to a qubit, we would put the electron into a spin-up state ↑ by hitting it with a pulse of microwaves of a specific frequency. We would not know the spin of an electron until we measure it, and when we measure it, the qubit's physical properties are changed. For the same reason, it is impossible to make exact copies of a qubit, i.e. to clone it. This is known as the quantum no-cloning theorem.

Qubits are perfectly suited to secure communication. If Bob and Alice exchange an encryption key using qubits and Eve intercepts the communication, both Alice and Bob know that someone tampered with the qubits, as the physical properties of the qubits changed. Therefore, extracting quantum information without leaving a trace is impossible: Eve's eavesdropping on the communication can be easily detected.

Nowadays, we can send qubits over short distances, up to 200 kilometers, over telecommunication fibers. The reason for that is decoherence – a situation where the system being measured loses its specific quantum properties. In other words, the pure state quickly turns into a mixture when the quantum system interacts with the environment.
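The tamper-evidence described above is the heart of the BB84 key-distribution protocol, and its statistics can be mimicked with a classical Monte Carlo sketch. This is a deliberate simplification (real QKD manipulates actual photon polarizations; the function and its parameters here are invented for illustration): an intercepting Eve who measures each photon in a randomly guessed basis corrupts roughly 25% of the sifted key.

```python
import random

def bb84_error_rate(n_bits=20_000, eavesdrop=False, seed=1):
    """Toy BB84 simulation. Alice sends each bit in a random basis, Bob
    measures in a random basis, and they keep only the positions where
    their bases matched (sifting). Returns the error rate of the kept bits."""
    rng = random.Random(seed)
    kept = errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        a_basis = rng.randint(0, 1)
        photon_bit, photon_basis = bit, a_basis
        if eavesdrop:
            e_basis = rng.randint(0, 1)
            if e_basis != photon_basis:         # Eve guessed the wrong basis:
                photon_bit = rng.randint(0, 1)  # her measurement randomizes the bit
            photon_basis = e_basis              # and re-prepares it in her basis
        b_basis = rng.randint(0, 1)
        bob_bit = photon_bit if b_basis == photon_basis else rng.randint(0, 1)
        if b_basis == a_basis:                  # bases compared over a public channel
            kept += 1
            errors += bob_bit != bit
    return errors / kept

print(bb84_error_rate(eavesdrop=False))  # 0.0 on a clean channel
print(bb84_error_rate(eavesdrop=True))   # ≈ 0.25 when Eve intercepts
```

Alice and Bob therefore sacrifice a random sample of the sifted key for comparison: any error rate far above the channel's natural noise betrays an eavesdropper.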
So, the real challenge in building the quantum Internet is to send qubits further than a few hundred kilometers. A single photon sent over a fiber optic cable can be lost. As we know, qubits cannot be copied or amplified, so they cannot be resent without notice. To solve this issue, a box called a quantum repeater is placed in the middle of the communication line, and a pair of photons is exchanged between the repeater and the quantum computer on the left side of the line. Similarly, another pair of photons is exchanged between the repeater and a quantum computer located to the right of the communication line. Quantum repeaters are crucial for entanglement over long distances using fiber optic cables. The vision is to build a long-range quantum internet that will operate in parallel to the Internet we know today. We have already mentioned that transmission of quantum signals over long distances is prevented by fiber attenuation and the no-cloning theorem. Therefore, one realistic scenario is that the future Quantum Internet will consist of a global network of quantum repeaters developed and used to extend the range of communication. However, there is also another approach to this problem, based on the deployment of satellite technology. China launched the world's first quantum communication satellite, Micius, in 2016, and has since been busy testing and extending the limitations of sending entangled photons from space to ground stations on Earth and back again. Chinese and European researchers have tested the system by creating a secure video conference between Europe and China. There are certain issues associated with quantum computing besides decoherence, such as the search for new algorithms as well as new methods of error correction. All of these problems can be described in one phrase – scalability issues. Quantum computers are the "holy grail" of modern physics and informatics.
The idea of a quantum computer and a quantum network looks unrealistic at first. A classical computer was probably perceived the same way in the time of Charles Babbage, whose design was realized only about a hundred years later. Quantum computers with two or three qubits already exist, but they require advanced technologies (pure substances, precise implantation of individual atoms, highly accurate measurement systems, etc.). However, as mentioned earlier, the main challenge is not technological but fundamental: scalability. It is unlikely that quantum computers will replace classical computers in the near future. We can only speculate that quantum computers will be put into clouds to offer unique services, with personal computers transmitting or accessing quantum-encrypted information through those cloud-based machines. Hopefully, the scientific and technical progress of our time is fast enough that we will not have to wait too long for quantum computing to become a common reality.
As early as 1959 the American physicist and Nobel laureate Richard Feynman noted that, as electronic components begin to reach microscopic scales, effects predicted by quantum mechanics occur—which, he suggested, might be exploited in the design of more powerful computers. In particular, quantum researchers hope to harness a phenomenon known as superposition. In the quantum mechanical world, objects do not necessarily have clearly defined states, as demonstrated by the famous experiment in which a single photon of light passing through a screen with two small slits will produce a wavelike interference pattern, or superposition of all available paths. (See wave–particle duality.) However, when one slit is closed—or a detector is used to determine which slit the photon passed through—the interference pattern disappears. In consequence, a quantum system “exists” in all possible states before a measurement “collapses” the system into one state. Harnessing this phenomenon in a computer promises to expand computational power greatly. A traditional digital computer employs binary digits, or bits, that can be in one of two states, represented as 0 and 1; thus, for example, a 4-bit computer register can hold any one of 16 (2⁴) possible numbers. In contrast, a quantum bit (qubit) exists in a wavelike superposition of values from 0 to 1; thus, for example, a 4-qubit computer register can hold 16 different numbers simultaneously. In theory, a quantum computer can therefore operate on a great many values in parallel, so that a 30-qubit quantum computer would be comparable to a digital computer capable of performing 10 trillion floating-point operations per second (TFLOPS)—comparable to the speed of the fastest supercomputers. During the 1980s and ’90s the theory of quantum computers advanced considerably beyond Feynman’s early speculations.
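The 4-qubit register above can be made concrete with a few lines of NumPy: the register's state is a vector of 2⁴ = 16 complex amplitudes, and applying a Hadamard gate to each qubit puts it into an equal superposition of all 16 values at once. This is an illustrative sketch of the linear algebra, not code from any of the experiments described here.

```python
import numpy as np

# Single-qubit Hadamard gate: maps |0> to (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = np.zeros(16)
state[0] = 1.0          # start in |0000>

H4 = H
for _ in range(3):      # build H (x) H (x) H (x) H via Kronecker products
    H4 = np.kron(H4, H)

state = H4 @ state      # equal superposition over all 16 basis states

print(state)                   # 16 equal amplitudes of 1/4
print(np.sum(state ** 2))      # probabilities sum to 1
```

Each of the 16 amplitudes ends up at (1/√2)⁴ = 1/4, so every 4-bit value is present simultaneously with probability 1/16 — the parallelism the passage describes.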
In 1985 David Deutsch of the University of Oxford described the construction of quantum logic gates for a universal quantum computer, and in 1994 Peter Shor of AT&T devised an algorithm to factor numbers with a quantum computer that would require as few as six qubits (although many more qubits would be necessary for factoring large numbers in a reasonable time). When a practical quantum computer is built, it will break current encryption schemes based on multiplying two large primes; in compensation, quantum mechanical effects offer a new method of secure communication known as quantum encryption. However, actually building a useful quantum computer has proved difficult. Although the potential of quantum computers is enormous, the requirements are equally stringent. A quantum computer must maintain coherence between its qubits (known as quantum entanglement) long enough to perform an algorithm; because of nearly inevitable interactions with the environment (decoherence), practical methods of detecting and correcting errors need to be devised; and, finally, since measuring a quantum system disturbs its state, reliable methods of extracting information must be developed. Plans for building quantum computers have been proposed; although several demonstrate the fundamental principles, none is beyond the experimental stage. Three of the most promising approaches are presented below: nuclear magnetic resonance (NMR), ion traps, and quantum dots. In 1998 Isaac Chuang of the Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California at Berkeley created the first quantum computer (2-qubit) that could be loaded with data and output a solution. Although their system was coherent for only a few nanoseconds and trivial from the perspective of solving meaningful problems, it demonstrated the principles of quantum computation. 
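The number theory behind Shor's factoring algorithm can be illustrated classically. The sketch below (my own illustration; the brute-force period search is exactly the step a quantum computer would accelerate) finds the period r of a^x mod N and then extracts factors from gcd(a^(r/2) ± 1, N):

```python
from math import gcd

def factor_via_period(N, a):
    """Classical illustration of the reduction used by Shor's algorithm:
    find the period r of a^x mod N, then gcd(a^(r/2) +/- 1, N) yields
    nontrivial factors. The quantum speedup lies entirely in finding r;
    here we find it by brute force."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess: a shares a factor with N
    r, x = 1, a % N
    while x != 1:                 # brute-force period search
        x = (x * a) % N
        r += 1
    if r % 2:
        return None               # odd period: pick another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root: pick another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_via_period(15, 7))   # period 4 -> factors (3, 5)
```

For N = 15 and a = 7 the powers cycle 7, 4, 13, 1, so r = 4 and gcd(7² ± 1, 15) yields 3 and 5 — consistent with the six-qubit scale mentioned for small demonstrations.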
Rather than trying to isolate a few subatomic particles, they dissolved a large number of chloroform molecules (CHCl₃) in water at room temperature and applied a magnetic field to orient the spins of the carbon and hydrogen nuclei in the chloroform. (Because ordinary carbon has no magnetic spin, their solution used an isotope, carbon-13.) A spin parallel to the external magnetic field could then be interpreted as a 1 and an antiparallel spin as 0, and the hydrogen nuclei and carbon-13 nuclei could be treated collectively as a 2-qubit system. In addition to the external magnetic field, radio frequency pulses were applied to cause spin states to “flip,” thereby creating superimposed parallel and antiparallel states. Further pulses were applied to execute a simple algorithm and to examine the system’s final state. This type of quantum computer can be extended by using molecules with more individually addressable nuclei. In fact, in March 2000 Emanuel Knill, Raymond Laflamme, and Rudy Martinez of Los Alamos and Ching-Hua Tseng of MIT announced that they had created a 7-qubit quantum computer using trans-crotonic acid. However, many researchers are skeptical about extending magnetic techniques much beyond 10 to 15 qubits because of diminishing coherence among the nuclei. Just one week before the announcement of a 7-qubit quantum computer, physicist David Wineland and colleagues at the U.S. National Institute of Standards and Technology (NIST) announced that they had created a 4-qubit quantum computer by entangling four ionized beryllium atoms using an electromagnetic “trap.” After confining the ions in a linear arrangement, a laser cooled the particles almost to absolute zero and synchronized their spin states. Finally, a laser was used to entangle the particles, creating a superposition of both spin-up and spin-down states simultaneously for all four ions.
Again, this approach demonstrated basic principles of quantum computing, but scaling up the technique to practical dimensions remains problematic. Quantum computers based on semiconductor technology are yet another possibility. In a common approach a discrete number of free electrons (qubits) reside within extremely small regions, known as quantum dots, and in one of two spin states, interpreted as 0 and 1. Although prone to decoherence, such quantum computers build on well-established, solid-state techniques and offer the prospect of readily applying integrated circuit “scaling” technology. In addition, large ensembles of identical quantum dots could potentially be manufactured on a single silicon chip. The chip operates in an external magnetic field that controls electron spin states, while neighbouring electrons are weakly coupled (entangled) through quantum mechanical effects. An array of superimposed wire electrodes allows individual quantum dots to be addressed, algorithms executed, and results deduced. Such a system necessarily must be operated at temperatures near absolute zero to minimize environmental decoherence, but it has the potential to incorporate very large numbers of qubits.
Flat solar panels still face big limitations when it comes to making the most of the available sunlight each day. A new spherical solar cell design aims to boost solar power harvesting potential from nearly every angle without requiring expensive moving parts to keep tracking the sun’s apparent movement across the sky. The spherical solar cell prototype designed by Saudi researchers is a tiny blue sphere that a person can easily hold in one hand like a ping pong ball. Indoor experiments with a solar simulator lamp have already shown that it can achieve between 15 percent and 100 percent more power output compared with a flat solar cell with the same ground area, depending on the background materials reflecting sunlight into the spherical solar cell. The research group hopes its nature-inspired design can fare similarly well in future field tests in many different locations around the world. “The placement and shape of the housefly’s eyes increase their angular field of view so they can see roughly 270 degrees around them in the horizontal field,” says Nazek El-Atab, a postdoctoral researcher in microsystems engineering at the King Abdullah University of Science and Technology (KAUST). “Similarly, the spherical architecture increases the ‘angular field of view’ of the solar cell, which means it can harvest sunlight from more directions.” To create the spherical solar cell design, El-Atab and her colleagues built upon their previous work, which demonstrated how to create thinner and more flexible solar cell designs based on a corrugated groove technique. The new work is detailed in a paper that has been submitted for review to the journal MRS Communications.
Measurement setup of the spherical solar cell under a solar simulator in air, using regular white paper as the reflective background material.
Photo: Nazek El-Atab/KAUST Testing with the solar simulator lamp showed that the spherical solar cell provided 24 percent more power output over a traditional flat solar cell upon immediate exposure to sunlight. That power advantage jumped to 39 percent after both types of solar cells had begun to heat up and suffered some loss in power efficiency—an indication that the spherical shape may have some advantages in dissipating heat. The spherical solar cell also delivered about 60 percent more power output than its flat counterpart when both could collect only scattered sunlight under a simulated roof rather than receiving direct sunlight. Additional experiments with different reflective backgrounds—including an aluminum cup, aluminum paper, white paper, and sand—showed that the hexagonal aluminum cup background helped the spherical solar cell outperform the flat solar cell by 100 percent in terms of power output. The Saudi team created the spherical solar cell using the monocrystalline silicon solar cells that currently account for almost 90 percent of the world’s solar power production. That choice sprang from the goal of helping to maximize the light-harvesting potential of such solar cells, along with the aim of potentially making it easier to scale up production if the design proves cost efficient. “What surprises me is the authors have demonstrated the ultra-flexibility that can be achieved with rigid silicon solar cells using the corrugation technique in a series of articles,” says Zhe Liu, a postdoctoral researcher in solar engineering at MIT, who was not involved in the study. 
“I’m more excited about the ability to make spherical cells, which means you can have industrial IBC-type (interdigitated back contact) silicon solar cells cover any shapes and ‘solarize’ everywhere.” Previous solar cell designs have fabricated tiny microscale spherical cells—sometimes made with nanowires or quantum dot cells—on top of a flat surface to help better collect both direct and scattered sunlight, says Rabab Bahabry, an assistant professor of physics at the University of Jeddah in Saudi Arabia. But the larger spherical solar cell may offer improved efficiency and coverage compared with the microsphere arrays when it comes to collecting sunlight reflected from background surfaces. Creating the large spherical solar cell required the researchers to etch alternating grooves in 15 percent of a flat solar cell to make a pattern resembling a band of elliptical shapes connected at the middle. A CO2 laser created the appropriate pattern in a polymeric hard mask covering the solar cell and allowed a deep reactive ion etching tool to create grooves in the exposed areas of the silicon solar cell. The flex and bend in those groove areas allowed the researchers to subsequently fold the solar cell into a spherical shape. Dust accumulation on a spherical solar cell is limited to the silicon area with a small tilt angle. Image: Rabab Bahabry/University of Jeddah and KAUST The loss of solar cell material in the areas that have been etched out reduces the overall potential solar power output. But the researchers see cost over time favoring spherical solar cells over flat solar cells in certain parts of the world because the spherical design is less prone to dust accumulation and may help dissipate heat that might otherwise reduce the solar cell’s efficiency. In addition, the spherical solar cells don’t require additional costly moving parts to continually track the sun. 
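A hedged geometric toy model (my own illustration, not the paper's analysis) shows why a sphere gains on a flat panel without tracking: a flat horizontal panel collects the direct beam in proportion to the sine of the sun's elevation, while a sphere presents the same cross-section to the sun at any angle.

```python
import numpy as np

# Daylight parameter t runs sunrise -> sunset; the flat-panel incidence
# factor is modeled as sin(t), while the sphere's is constant at 1.
t = np.linspace(0.0, np.pi, 100_001)
flat_avg = np.mean(np.sin(t))   # average flat-panel collection, ~2/pi of peak
gain = 1.0 / flat_avg           # sphere vs flat, direct beam only
print(gain)                     # ~1.57, i.e. ~57% more direct-beam collection
```

The ~57% figure from this crude model is only meant to show the right order of magnitude; the measured gains (24% to 100%) depend on heating, scattered light, and the reflective background, none of which this sketch includes.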
Still, the spherical solar cells may not replace traditional solar cell technology at utility-scale solar power plants, says Liu at MIT. In his view, this particular spherical solar cell design could find use in more niche market applications. He noted that one of his colleagues is currently searching for a solar cell design to cover a golf ball so that it can power a tracker inside the ball. But Liu sees much promise in such ultra-flexible solar cell designs being installed in buildings, cars, or even mobile devices. “The application of spherical design may seem very limited, but the ability to make commercial silicon solar cells into any shapes would enable broad adaption of photovoltaic in autonomous devices, such as IoT (Internet of Things) sensors, and autonomous vehicles,” Liu says. “If we can fully power these autonomous devices with shaped photovoltaic panels, this could be a game changer.” For future testing, Liu says he would like to see how the spherical solar cell performs in a wide array of both outdoor and indoor lighting environments at different times of day. He also wants to see how well the spherical solar cells can be integrated into certain applications that they might power. And he is curious about seeing a “quantified cost” summary of all the processing steps required to make such spherical solar cells in order to better understand the technology’s commercialization potential. The Saudi researchers had to manually fold and form their spherical solar cells in their latest demonstration, but they have already begun designing and developing ways to automate the process using “robotic hands” to mimic the manual folding, says Muhammad Mustafa Hussain, a professor of electrical and computer engineering at KAUST who was one of the study’s coauthors. Eventually, Hussain and his colleagues envision building and testing large arrays of the spherical solar cells. 
And they’re already working on new shapes that resemble tents or umbrellas to see if those offer any advantages. They are also integrating solar cells with the surfaces of drones that have unusual shapes. The COVID-19 pandemic that forced the closure of research labs has delayed the Saudi group’s initial plans for outdoor testing. But Hussain says the group still plans to move forward with field trials before the end of 2020. He expects help from the KAUST alumni network in eventually testing the spherical solar cells in California, along with countries such as Bangladesh, China, India, South Korea, Germany, Spain, Brazil, Colombia, Mexico, South Africa, Australia, and New Zealand. “We will be creating arrays of spherical cells for 100-square-foot to 1,000-square-foot areas, and will compare functionality over cost benefit with that of traditional cells,” Hussain says. “Next, we will deploy it in different geographic locations throughout the year to understand its performance and reliability.” Editor’s note: A correction to this article was made on 16 June 2020. The sentence on indoor experiments was revised to correct an inaccurate interpretation of the power output comparison between the spherical solar cell and flat solar cell in the submitted paper. Jeremy Hsu has been working as a science and technology journalist in New York City since 2008. He has written on subjects as diverse as supercomputing and wearable electronics for IEEE Spectrum. When he’s not trying to wrap his head around the latest quantum computing news for Spectrum, he also contributes to a variety of publications such as Scientific American, Discover, Popular Science, and others. He is a graduate of New York University’s Science, Health & Environmental Reporting Program.
A diverse range of breakthrough technologies, including “artificial leaves” that turn CO2 into fuel, and a technique that harvests water from air, could soon be playing a role in tackling the world’s most pressing challenges, according to a list published today by the World Economic Forum. The technologies were selected by the World Economic Forum’s Expert Network and Global Future Councils in collaboration with Scientific American and its board of advisors. Each technology was chosen for its potential to improve lives, transform industries and safeguard the planet. The experts were also looking for indications that the technologies have reached a level of maturity that would enable widespread take-up in the coming three to five years. “New technologies are redefining industries, blurring traditional boundaries and creating new opportunities on a scale never seen before. Public and private institutions must develop the correct policies, protocols and collaborations to allow such innovation to build a better future, while avoiding the risks that unchecked technological change could pose,” said Murat Sönmez, Head of the Center for the Fourth Industrial Revolution and member of the managing board of the World Economic Forum. The top 10 technologies to make this year’s list are: Liquid biopsies mark a step forward in the fight against cancer. First, they are an alternative where traditional tissue-based biopsies are not possible. Second, they provide a full spectrum of information compared to tissue samples, which only reflect the information available in the sample. Lastly, by homing in on circulating-tumor DNA (ctDNA), genetic material that routinely finds its way from cancer cells into the bloodstream, disease progression or resistance to treatment can be spotted much faster than otherwise relying on symptoms or imaging. 
Harvesting clean water from air The ability to extract clean water from air is not new; however, existing techniques require high moisture levels and a lot of electricity. This is changing. A team from MIT and the University of California, Berkeley has successfully tested a process using porous crystals that capture the water using no energy at all. Another approach, by a start-up called Zero Mass Water from Arizona, is able to produce 2-5 litres of water a day based on an off-grid solar system. Deep learning for visual tasks Computers are beginning to recognize images better than humans. Thanks to deep learning, an emerging field of artificial intelligence, computer-vision technologies are increasingly being used in applications as diverse as driving autonomous vehicles, medical diagnostics, damage assessment for insurance claims and monitoring of water levels and crop yield. Liquid fuels from sunshine Can we mimic the humble leaf to create an artificial photosynthesis to generate and store energy? The prospects are looking increasingly positive. The answer lies in using sunlight-activated catalysts to split water molecules into oxygen and hydrogen, and then using the same hydrogen to convert CO2 into hydrocarbons. Such a closed system – wherein CO2 emitted by combustion is then transformed back into fuel instead of the atmosphere – could prove to be revolutionary for the solar and wind industries. The human cell atlas An international collaboration aimed at deciphering the human body, called the Human Cell Atlas, was launched in October 2016.
The project, backed by the Chan Zuckerberg Initiative aims to identify every cell type in every tissue; learn exactly which genes, proteins and other molecules are active in each type and the processes which control that activity; determine where the cells are located exactly; how the cells normally interact with one another, and what happens to the body’s functioning when genetic or other aspects of a cell undergo change, among other things. The end product will be an invaluable tool for improving and personalizing health care. The Fourth Industrial Revolution is providing farmers with a new set of tools to boost crop yield and quality while reducing water and chemical use. Sensors, robots, GPS, mapping tools and data-analytics software are all being used to customize the care that plants need. While the prospect of using drones to capture plant health in real time may be some way off for most of the world’s farmers, low-tech techniques are coming online too. Salah Sukkarieh, of the University of Sydney, for instance, has demonstrated a streamlined, low-cost monitoring system in Indonesia that relies on solar power and cell phones. Affordable catalysts for green vehicles Progress is being made on a promising zero-emission technology, the hydrogen-fed fuel cell. Progress to date has been stymied by the high price of catalysts which contain platinum. However, much progress has been made reducing reliance on this rare and expensive metal, and the latest developments involve catalysts that include no platinum, or in some cases no metal at all. Vaccines based on genes are superior to more conventional ones in a number of ways. They are faster to manufacture for one thing, which is crucial at times of a violent outbreak. Compared to manufacturing proteins in cell cultures or eggs, producing genetic material should also be simpler and less expensive. 
A genomics-based approach to vaccines also enables more rapid adaptation in the event of a pathogen mutating, and finally allows scientists to identify people who are resistant to a pathogen, isolate the antibodies that provide that protection, and design a gene sequence that will induce a person’s cells to produce those antibodies. Sustainable design of communities Applying green construction to multiple buildings at once has the potential to revolutionize the amount of energy and water we consume. Sending locally-generated solar power to a smart microgrid could reduce electricity consumption by half and reduce carbon emissions to zero if a project currently under development at the University of California at Berkeley goes to plan. Meanwhile, the same project’s plan to re-design water systems so that waste water from toilets and drains is treated and re-used on site, with rainwater diverted to toilets and washers, could cut demand for potable water by 70%. Quantum computers’ almost limitless potential has only ever been matched by the difficulty and cost of their construction, which explains why the small ones built to date have not yet managed to exceed the power of supercomputers. But progress is being made, and in 2016 the technology firm IBM provided the public access to the first quantum computer in the cloud. This has already led to more than 20 academic papers being published using the tool, and today more than 50 start-ups and large corporations worldwide are focused on making quantum computing a reality. With such progress behind us, the word on people’s lips now is “Quantum Ready.”
Light switch promises powerful computers
Technology Research News
At first glance, a switch is a simple concept. It is either on or off. Today's computer chips harbor millions of microscopic electrical switches. These transistors turn on when an electromagnetic field generated by a control electrode lowers the transistor's resistance to the flow of electrons, which allows electrical current to flow from one end of the device to the other. The presence or absence of this flow represents a 1 or a 0 of digital computing. Circuits that switch light rather than electricity would make for faster computers, but it's difficult to use a beam of light to turn another light beam on and off. Light beams usually just pass through each other, especially if they are relatively weak. Researchers from the University of Toronto in Canada have figured out a way to allow beams of individual photons to affect each other, and have made a device that switches light in a manner similar to the way electrical transistors switch electrical current. Photon transistors could pave the way for fast, low-power, all-optical computing. Extremely low-power switches are also a necessary component of quantum computers, which use the delicate differences in the states of atoms and subatomic particles to compute. The researchers demonstrated the photon switch by shooting two weak beams of light into a crystal that was simultaneously bombarded by intense laser light of another wavelength. "The switch allows two beams of light so weak that they contain at most a single photon, and most often none at all, to meet up inside a thin optical crystal," said Aephraim Steinberg, an associate professor of physics at the University of Toronto in Canada. One of the weird quantum traits of light is that it is simultaneously a continuous wave and a stream of tiny particles, or photons. Different colors of light are different wavelengths.
Red light, for example, is around 650 nanometers, or millionths of a millimeter, from crest to trough, while higher-frequency blue light measures around 450 nanometers. Lit up by an intense laser beam of blue light that measures half the wavelength of the weak red beams, the researchers' crystal allows weak beams of red light to pass through unless they both contain a photon. "The crystal is transparent to the two weak signal beams except when both beams contain a photon, in which case the two photons annihilate [each other], and are prevented from passing. This is the switch effect," said Steinberg. The red color of the weak beams disappears, turning the switch off, when each contains a photon because the two photons essentially merge into one higher-energy photon of blue light, a process known as upconversion, according to Steinberg. "A single red photon doesn't possess enough energy to "turn blue" and will therefore be transmitted undisturbed," he said. "But since any pair of red photons will upconvert, it's as though a single photon is enough to switch off the path for the other photon." The switching interaction occurs in a region of the crystal that is about one tenth of a millimeter across, but the equipment required for the researchers' prototype includes an inch-long crystal and a six-foot-wide table containing lasers and detectors. Because the actual switching is purely optical, it could in theory be miniaturized using techniques that exist today. The researchers' prototype works about 60 percent of the time, but the concept could lead to a reliable switch, according to Steinberg. The researchers' eventual aim is to use the switch in quantum computers, Steinberg said. "Our hope is that this could be used as a fundamental logic gate inside quantum computers, whose [potential] uses are still... being discovered," said Steinberg.
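The energy bookkeeping of upconversion is simple to check: since a photon's energy is E = hc/λ, two photons at wavelength λ carry exactly the energy of one photon at λ/2. The sketch below uses the article's illustrative red at 650 nm (the experiment's actual wavelengths are not given, so these numbers are for illustration only):

```python
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

def photon_energy_eV(wavelength_nm):
    """Photon energy E = hc/lambda, converted to electron-volts."""
    return h * c / (wavelength_nm * 1e-9) / eV

red  = photon_energy_eV(650)   # ~1.9 eV
blue = photon_energy_eV(325)   # photon at half the wavelength
print(red, blue)
print(2 * red - blue)          # 0: two red photons == one "blue" photon
```

A single 1.9 eV red photon cannot become a 3.8 eV photon on its own, which is why, in Steinberg's phrase, one red photon "doesn't possess enough energy to turn blue" and passes through undisturbed.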
Quantum computers could be much faster than the fastest possible electronic computers, because they have the potential to examine every possible answer to a problem at once. "If you know how to ask the computer the right question, instead of getting the results of just a single calculation, you may find out something about the results of all possible calculations, something the classical computer would've had to run exponentially many times to determine," Steinberg said. The research is impressive, and "potentially very significant," said Robert Boyd, a professor of optics at the University of Rochester. "It's been well-established that a strong beam of light can be used to control another beam of light. The novel feature of the present approach is that the two weak beams interact in the presence of a strong beam, which allows the interaction to be strong even though the control and signal beams are both weak," he said. This method has the potential to produce energy-efficient optical switches that operate with very weak power levels, which would be useful for applications like telecommunications and optical computing devices, said Boyd. The switches are potentially useful for quantum computing for similar reasons. "The signal levels must necessarily be very weak" for quantum applications, he said. Although there are many research efforts under way to bring quantum computing to reality, it is hard to know if and when these fantastically fast computers will materialize, said Steinberg. "Thousands of people around the world are working towards the construction of quantum computers and algorithms for use on them, but none of us knows if a full-scale device will ever work," he said. "I'd say it's equally likely that we will never see a quantum computer in our lifetimes, or that people will stumble across the right architecture for one in the next ten years or so." Steinberg's research colleagues were Kevin J. Resch and Jeff S. Lundeen. 
They published the research in the November 15, 2001 issue of Physical Review Letters. The research was funded by the Canadian Natural Sciences and Engineering Research Council, Photonics Research Ontario, the Canada Fund for Innovation, the Ontario Research and Development Challenge Fund, and the U.S. Air Force. Timeline: > 10 years. TRN Categories: Optical Computing, Optoelectronics and Photonics. Story Type: News. Related Elements: Technical paper, "Nonlinear Optics with Less Than One Photon," Physical Review Letters, September 17, 2001.
A Laser-Sharp View of Electron Correlations
The photoelectric effect refers to the emission of electrons from a metal illuminated with light—a phenomenon that was discovered over 100 years ago and explained by Einstein. Today, the effect is the basis for a powerful experimental method known as angle-resolved photoemission spectroscopy (ARPES). This technique uses light to take a “picture” of a material’s electronic energy bands, the structure of which dictates many material properties. Researchers have steadily increased the resolution of these electron pictures by various means, including employing lasers as the light source. Now, using laser-based ARPES, Anna Tamai of the University of Geneva and colleagues provide an unprecedented test of a theory for materials in which electron correlations are strong. The researchers studied the unconventional superconductor Sr₂RuO₄ and determined that correlations enhance a parameter known as spin-orbit coupling (SOC) by a factor of 2—in agreement with the theoretical prediction. Their accurate measurement of SOC may also help physicists resolve a puzzle surrounding the superconducting state of Sr₂RuO₄. Sr₂RuO₄ has been a font of interesting physics. In 1994, experimentalists discovered that the material becomes superconducting at about 1.5 K. Theorists soon speculated that Sr₂RuO₄ was unlike other known superconductors. They also conjectured that its “Cooper pairs” of electrons, which carry the superconducting current, had a spin of 1 instead of a spin of 0, indicating an unusual pairing mechanism. But the type of pairing has been an ongoing subject of debate. Support for the spin-1 picture comes from various early experiments, such as measurements of the superconducting phase, muon spin rotation, and the Kerr effect, whereas a recent nuclear magnetic resonance (NMR) experiment indicates spin-0 pairs.
The nature of the pairing is also relevant to the possibility that Sr₂RuO₄ is a topological superconductor, an exotic phase of interest for a robust form of quantum computing. Sr₂RuO₄ is also attractive because its physics, including the pairing mechanism, is affected by interactions (or "correlations") between the electrons. In fact, the material has become a model system for understanding these effects both experimentally and theoretically. A theory developed specifically for materials with strong correlations, known as dynamical mean-field theory (DMFT), predicts that electron correlations in Sr₂RuO₄ enhance the coupling between electron momentum and spin (spin-orbit coupling) [5, 6]. But this predicted enhancement has yet to be tested. The work by Tamai and co-workers provides the best such test to date. The team investigated electron correlations in Sr₂RuO₄ using ARPES to measure three energy bands near the Fermi energy. These bands are derived from three of the 4d orbitals of the ruthenium atoms, and their qualitative shape has been measured in previous ARPES experiments. What was harder to see until now was a theoretically predicted separation (in energy and momentum) between the bands. This band "splitting" occurs at the Fermi energy, and it is caused by SOC involving the 4d electrons. The team determined the Fermi surface and the energy bands of Sr₂RuO₄ with unprecedented accuracy by using an 11-eV laser light source with an energy resolution of 3 meV and an angular resolution of 0.2°. Compared with earlier ARPES studies, the bands measured by Tamai et al. have narrower widths, making it easier to see the distortions induced by SOC. The group also took steps to suppress contributions from surface states, ensuring that their measured energy bands correspond to "bulk" electrons. (The experiments were performed at 5 K in the "normal" state of Sr₂RuO₄.) The team determined the magnitude of the correlation-enhanced SOC in Sr₂RuO₄ experimentally by measuring the band splitting.
The enhanced SOC is about twice as large as its "bare" value (no correlations), in agreement with the value calculated within DMFT. A separate, direct measurement of the correlation effects comes from comparing the measured bands with three calculations based on density-functional theory (DFT). This computational tool is more standard than DMFT, but it typically applies to materials without strong electron correlations. DFT calculations were performed without SOC, with bare SOC, and with an "effective" SOC that includes an enhancement from electron correlations (Fig. 1). The excellent agreement of the third calculation (Fig. 1, right) with the ARPES data provides a direct measurement of the enhanced SOC. These tests of DFT and DMFT give weight to the applicability of these approaches to Sr₂RuO₄ as well as to other materials with multiple d-electron orbitals, strong SOC, and strong electron correlations, such as the iron-based superconductors. The "cleanliness" of the ARPES data also allowed the authors to confirm a fundamental assumption of DMFT that is related to the determination of so-called electron self-energies. These are shifts in energy that result from electron interactions, and they can have sizable effects on the energy bands. DMFT typically assumes the self-energies are momentum independent to simplify calculations. The researchers confirmed this "ansatz" by extracting self-energies from their measured bands, a result that lends further support to the applicability of DMFT to Sr₂RuO₄. Beyond testing theory, knowing the strength of the SOC is of interest for understanding superconductivity in Sr₂RuO₄—a far from settled topic. Strong SOC could substantially mix the spin-0 and spin-1 states of the Cooper pairs. Figuring out whether the SOC is sufficiently strong for this mixing to occur will require additional calculations.
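The band-structure signature at stake here, a splitting opened up where two bands would otherwise cross, can be illustrated with a toy two-band model. This is an illustrative sketch only: the Hamiltonian and numbers are invented for the example and are not the material-specific DFT/DMFT calculation. Doubling the SOC parameter doubles the splitting, which is the kind of change the high-resolution ARPES measurement resolves:

```python
import numpy as np

def band_splitting(soc):
    """Minimum gap between the two bands of a toy two-band Hamiltonian
    H(k) = [[+k, soc], [soc, -k]] (arbitrary units). Without SOC the
    bands cross at k = 0; the coupling opens a gap of 2*soc there."""
    ks = np.linspace(-1.0, 1.0, 2001)
    gaps = []
    for k in ks:
        h = np.array([[k, soc], [soc, -k]])
        e = np.linalg.eigvalsh(h)       # eigenvalues in ascending order
        gaps.append(e[1] - e[0])
    return min(gaps)

bare = band_splitting(0.1)       # "bare" SOC
enhanced = band_splitting(0.2)   # correlation-enhanced SOC (factor of 2)
print(bare, enhanced)            # the gap doubles along with the coupling
```

The measured quantity in the experiment plays the role of this gap: reading it off the bands gives the effective coupling directly.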
But this step is worth making as physicists try to reconcile the new NMR data, which suggest spin-0 Cooper pairs, with older measurements, which support spin-1 pairs. This research is published in Physical Review X.
- A. Tamai et al., "High-resolution photoemission on Sr₂RuO₄ reveals correlation-enhanced effective spin-orbit coupling and dominantly local self-energies," Phys. Rev. X 9, 021048 (2019).
- A. P. Mackenzie and Y. Maeno, "The superconductivity of Sr₂RuO₄ and the physics of spin-triplet pairing," Rev. Mod. Phys. 75, 657 (2003).
- T. M. Rice and M. Sigrist, "Sr₂RuO₄: An electronic analogue of ³He?," J. Phys. Condens. Matter 7, L643 (1995).
- A. Pustogow et al., "Pronounced drop of ¹⁷O NMR Knight shift in superconducting state of Sr₂RuO₄," arXiv:1904.00047.
- G. Zhang et al., "Fermi surface of Sr₂RuO₄: Spin-orbit and anisotropic Coulomb interaction effects," Phys. Rev. Lett. 116, 106402 (2016).
- M. Kim et al., "Spin-orbit coupling and electronic correlations in Sr₂RuO₄," Phys. Rev. Lett. 120, 126401 (2018).
- Q. H. Wang, C. Platt, Y. Yang, C. Honerkamp, F. C. Zhang, W. Hanke, T. M. Rice, and R. Thomale, "Theory of superconductivity in a three-orbital model of Sr₂RuO₄," Europhys. Lett. 104, 17013 (2013).
Cryptography is a method of protecting information and communications through the use of codes, so that only those for whom the information is intended can read and process it. The prefix "crypt-" means "hidden" or "vault" — and the suffix "-graphy" stands for "writing." In computer science, cryptography refers to secure information and communication techniques derived from mathematical concepts and a set of rule-based calculations called algorithms, to transform messages in ways that are hard to decipher. These deterministic algorithms are used for cryptographic key generation, digital signing, verification to protect data privacy, web browsing on the internet, and confidential communications such as credit card transactions and email. Cryptography is closely related to the disciplines of cryptology and cryptanalysis. It includes techniques such as microdots, merging words with images, and other ways to hide information in storage or transit. However, in today's computer-centric world, cryptography is most often associated with scrambling plaintext (ordinary text, sometimes referred to as cleartext) into ciphertext (a process called encryption), then back again (known as decryption). Individuals who practice this field are known as cryptographers.

Modern cryptography concerns itself with the following four objectives:
- Confidentiality: the information cannot be understood by anyone for whom it was unintended
- Integrity: the information cannot be altered in storage or transit between sender and intended receiver without the alteration being detected
- Non-repudiation: the creator/sender of the information cannot deny at a later stage his or her intentions in the creation or transmission of the information
- Authentication: the sender and receiver can confirm each other's identity and the origin/destination of the information

Procedures and protocols that meet some or all of the above criteria are known as cryptosystems.
Cryptosystems are often thought to refer only to mathematical procedures and computer programs; however, they also include the regulation of human behavior, such as choosing hard-to-guess passwords, logging off unused systems, and not discussing sensitive procedures with outsiders. Cryptosystems use a set of procedures known as cryptographic algorithms, or ciphers, to encrypt and decrypt messages to secure communications among computer systems, devices such as smartphones, and applications. A cipher suite uses one algorithm for encryption, another algorithm for message authentication, and another for key exchange. This process, embedded in protocols and written in software that runs on operating systems and networked computer systems, involves public and private key generation for data encryption/decryption, digital signing and verification for message authentication, and key exchange.

Types of cryptography

Symmetric-key (single-key) encryption algorithms use one secret key, which the creator/sender uses to encipher data (encryption) and the receiver uses to decipher it (decryption); block ciphers of this kind operate on fixed-length blocks of bits. Types of symmetric-key cryptography include the Advanced Encryption Standard (AES), a specification established in November 2001 by the National Institute of Standards and Technology as a Federal Information Processing Standard (FIPS 197) to protect sensitive information. The standard is mandated by the U.S. government and widely used in the private sector. In June 2003, AES was approved by the U.S. government for classified information. It is a royalty-free specification implemented in software and hardware worldwide. AES is the successor to the Data Encryption Standard (DES) and Triple DES (3DES). It uses longer key lengths (128-bit, 192-bit, 256-bit) to prevent brute-force and other attacks.
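The defining property of symmetric encryption, one shared key used for both enciphering and deciphering, can be sketched with a deliberately simple XOR stream construction. This is a toy for illustration only: it is not AES, it is not safe for real use, and the function names are invented for the example.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key by hashing a counter
    (a toy construction for illustration, not a vetted cipher)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: XOR with the keystream is its own inverse,
    which is exactly what makes a scheme like this symmetric."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"shared secret"
ciphertext = xor_cipher(key, b"attack at dawn")
plaintext = xor_cipher(key, ciphertext)   # the same key decrypts
print(plaintext)                          # b'attack at dawn'
```

A real symmetric cipher such as AES replaces the hash-of-a-counter keystream with carefully designed block transformations, but the shared-key shape is the same.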
Public-key or asymmetric-key encryption algorithms use a pair of keys: a public key associated with the creator/sender for encrypting messages, and a private key that only the originator knows (unless it is exposed or they decide to share it) for decrypting that information. Types of public-key cryptography include RSA, used widely on the internet; the Elliptic Curve Digital Signature Algorithm (ECDSA), used by Bitcoin; the Digital Signature Algorithm (DSA), adopted as a Federal Information Processing Standard for digital signatures by NIST in FIPS 186-4; and Diffie-Hellman key exchange. To maintain data integrity in cryptography, hash functions, which return a deterministic output from an input value, are used to map data to a fixed data size. Types of cryptographic hash functions include SHA-1 (Secure Hash Algorithm 1), SHA-2, and SHA-3. Attackers can bypass cryptography, hack into computers that are responsible for data encryption and decryption, and exploit weak implementations, such as the use of default keys. However, cryptography makes it harder for attackers to access messages and data protected by encryption algorithms. Growing concerns about the power of quantum computing to break current encryption standards led the National Institute of Standards and Technology (NIST) to put out a call for papers among the mathematical and science community in 2016 for new public-key cryptography standards. Unlike today's computer systems, quantum computing uses quantum bits (qubits) that can represent both 0 and 1 simultaneously. While a large-scale quantum computer may not be built in the next decade, the existing infrastructure requires standardization of publicly known and understood algorithms that offer a secure approach, according to NIST. The deadline for submissions was in November 2017; analysis of the proposals is expected to take three to five years.
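The two properties the hash-function paragraph above relies on, deterministic output and a fixed digest size regardless of input length, are easy to see with Python's standard-library hashlib (a minimal SHA-256 illustration):

```python
import hashlib

# The same input always maps to the same fixed-size digest.
msg = b"Cryptography is a method of protecting information"
digest = hashlib.sha256(msg).hexdigest()
print(len(digest))   # 64 hex characters = 256 bits, whatever the input size

# Even a one-byte change produces a completely different digest, which
# is how hashes detect tampering (the "integrity" objective above).
tampered = hashlib.sha256(msg + b"!").hexdigest()
print(digest == tampered)  # False
```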
History of cryptography

The word "cryptography" is derived from the Greek kryptos, meaning hidden. The origin of cryptography is usually dated from about 2000 B.C., with the Egyptian practice of hieroglyphics. These consisted of complex pictograms, the full meaning of which was known only to an elite few. The first known use of a modern cipher was by Julius Caesar (100 B.C. to 44 B.C.), who did not trust his messengers when communicating with his governors and officers. For this reason, he created a system in which each character in his messages was replaced by a character three positions ahead of it in the Roman alphabet. In recent times, cryptography has turned into a battleground of some of the world's best mathematicians and computer scientists. The ability to securely store and transfer sensitive information has proved a critical factor in success in war and business. Because governments do not wish certain entities in and out of their countries to have access to ways to receive and send hidden information that may be a threat to national interests, cryptography has been subject to various restrictions in many countries, ranging from limitations on the usage and export of software to the public dissemination of mathematical concepts that could be used to develop cryptosystems. However, the internet has allowed the spread of powerful programs and, more importantly, the underlying techniques of cryptography, so that today many of the most advanced cryptosystems and ideas are now in the public domain.
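Caesar's three-position substitution translates directly into a few lines of Python (a sketch using the modern 26-letter alphabet rather than the Roman one; a negative shift decrypts):

```python
def caesar(text: str, shift: int) -> str:
    """Replace each letter with the one `shift` positions ahead in the
    alphabet, wrapping around; non-letters pass through unchanged."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            result.append(chr(base + (ord(ch) - base + shift) % 26))
        else:
            result.append(ch)
    return "".join(result)

msg = "RETURN TO ROME"
secret = caesar(msg, 3)    # encrypt with Caesar's shift of three
print(secret)              # UHWXUQ WR URPH
print(caesar(secret, -3))  # shifting back recovers the original
```

With only 25 possible shifts, the scheme falls to trial and error in seconds, which is why it survives today only as a teaching example.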
Tool reads quantum bits

Technology Research News

The key to quantum computing is being able to use the spins of subatomic particles such as electrons to represent the ones and zeros of computing. A particle can be spin-up or spin-down in a way similar to a top spinning either clockwise or counterclockwise. If you could reliably distinguish between spin-up and spin-down energy in large numbers of particles, the spin possibilities in each particle could serve as a quantum bit, or qubit, representing a one or a zero, and you could build a fantastically powerful computer in very little space. The trouble is, it's difficult to measure spin. Scientists have done so by trapping isolated atoms and using lasers to measure spin states, but they are still a long way from being able to read the millions of quantum bits required to form a practical quantum computer. Researchers at the University of California at Berkeley have taken a step towards that goal by showing that it is possible to measure the spin of a quantum state of an electron in a nickel atom embedded in a copper oxide crystal. The development has the potential to make a promising quantum scheme considerably more practical. There are four major problems to be solved in making a quantum computer: its qubits must be able to represent a one or zero long enough for the computer to perform logic operations on them; the qubits must be able to interact with each other to carry out those operations; there must be some way to read the information contained in a qubit in order to see the results of the operations; and the system must contain a lot of qubits to do useful computing. By measuring the spin of a single atom, the Berkeley researchers have found a way to read the information contained in a certain type of qubit. This type of qubit -- a single atom embedded in a solid made of other atoms -- has already shown potential for solving the other three problems associated with quantum computing.
A theoretical proposal by University of Maryland researcher Bruce Kane shows that qubits made from phosphorus atoms embedded in silicon could hold their spin states for a long enough time to do computing, could be placed closely enough to interact with each other, and could be made in large quantities. The Berkeley method addresses the key missing piece in that plan by showing that it is possible to measure the spin of a single electron within an impurity, or atom of one material embedded in another. The researchers used a scanning tunneling microscope (STM) to measure the spin of an electron associated with a nickel impurity embedded in copper oxide, but they had to make some modifications to do so. Scanning tunneling microscopes use tips that resemble needles, but are so sharp that they taper to a single atom. The tip hovers over the surface of a material and maps the changes the material's electron energy makes to the electron current flowing through the tip, similar to the way a seismograph maps vibrations in the ground. Because spin-up and spin-down states have different energy, the researchers were able to distinguish between them. "We are trying to get an electron to jump into one of the quantum states from a nearby metal tip. The spin-down state exists at a lower energy than the spin-up state at the atom we studied, so by measuring the rate at which the electrons jump into the state as a function of their energy we can tell which is which," said Davis. To make the scheme work, however, the researchers had to solve a pair of problems. First, the spin energy of an electron can only be split into discernible spin-up or spin-down states under certain conditions, said Davis. "In each [impurity] atom there's a single wave function of the electron... you can split that wave function into a spin-up and spin-down state if you're in a high magnetic field at low temperatures," he said.
The amount by which the two energy levels are split is proportional to the strength of the magnetic field, so the stronger the magnetic field, the easier it is to distinguish the two levels. Second, heat energy easily drowns out spin energy. "The amount of energy associated with the temperature has to be smaller than the splitting between the two levels, [otherwise] thermal energy would just be knocking electrons up and down from the bottom [energy level] to the top one all the time," said Davis. The researchers solved the problems by measuring electron spin in a nickel impurity embedded in a superconductor at a relatively low temperature. Copper oxide is a high-temperature superconductor, meaning its electrons are free to travel without resistance at 85 kelvin, or -188 degrees Celsius, which, though very cold, is less cold than the temperature of 4 kelvin, or -269 degrees Celsius, required by low-temperature superconductors. Because nickel is magnetic, it exerts a magnetic force that is very strong at distances of 10 or 20 nanometers away from the atom. "The effective field at the nickel atom is hundreds of Tesla. So we didn't need a big external magnet, we got it for free by putting a magnetic atom into the solid," said Davis. The researchers next plan to use the same technique to measure electron spin in a phosphorus atom embedded in a silicon chip, which is the setup required in the Kane quantum computer proposal. Because phosphorus is not magnetic, the Berkeley researchers need to generate a large magnetic field in order to measure the spins of its quantum particles. The researchers are planning to build an STM that can generate an eight-tesla field at temperatures as low as 20 millikelvin in order to carry out the measurements, said Davis.
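The requirement that the splitting exceed the thermal energy can be checked with back-of-envelope numbers for the planned 8-tesla, 20-millikelvin phosphorus experiment. This is a rough estimate only, assuming the free-electron g-factor of about 2:

```python
# Zeeman splitting vs. thermal energy, SI units throughout.
MU_B = 9.274e-24   # Bohr magneton, J/T
K_B = 1.381e-23    # Boltzmann constant, J/K

g = 2.0            # assumed free-electron g-factor
B = 8.0            # planned STM field, tesla
T = 0.020          # planned operating temperature, kelvin (20 mK)

zeeman = g * MU_B * B    # spin-up / spin-down energy splitting
thermal = K_B * T        # characteristic thermal energy

print(f"splitting: {zeeman:.2e} J, thermal: {thermal:.2e} J")
print(f"ratio: {zeeman / thermal:.0f}")   # splitting roughly 500x thermal
```

At these numbers the splitting dwarfs the thermal energy, which is why such an extreme combination of field and refrigeration is worth building.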
If the researchers are able to measure spin states in phosphorus atoms, "then that's really big news because that was the really big problem of the Kane proposal," said Paul Kwiat, a physics professor at the University of Illinois at Urbana-Champaign. "The main reason people were skeptical about [the Kane proposal] was the need for reading out single spins, which seemed like it was not going to be very easy, and it still may not be very easy. But certainly this is an experiment in the right direction," Kwiat said. The Kane proposal is probably the most promising model so far for quantum computing, largely because it is based on silicon, Kwiat added. "If you can do something in silicon... and you get it to work, you can hand it to the silicon industry," he said. Researchers in the quantum field generally agree that practical quantum computers are at least two decades away, if they can be built at all. "It's like asking when fusion will generate cheap energy. It's a possible but technically hard challenge," said Davis. Davis' research colleagues were Eric W. Hudson of the University of California at Berkeley and the National Institute of Standards and Technology, Christine M. Lang and Vidya Madhavan of the University of California at Berkeley, Shuheng H. Pan of the University of California at Berkeley and Boston University, Hiroshi Eisaki from the University of Tokyo in Japan and Stanford University, and Shin-ichi Uchida of the University of Tokyo. They published the research in the June 21, 2001 issue of the journal Nature. The research was funded by the Office of Naval Research (ONR) and the Department of Energy (DOE). 
Timeline: > 20 years
TRN Categories: Quantum Computing
Story Type: News
Related Elements: Technical paper, "Interplay of magnetism and high-Tc superconductivity at individual Ni impurity atoms in Bi2Sr2CaCu2O8+d," Nature, June 21, 2001; additional images at the Davis group website.
QUANTUM INFORMATION PROCESSING

Entanglement is central to the phenomenon of quantum teleportation. It is the process by which quantum information (e.g. the exact state of an atom or photon) is transmitted, exactly, over a classical communication channel from one location to another, with the help of previously shared quantum entanglement between the sending and receiving locations. Anybody who has watched Star Trek is familiar with the idea of teleportation, where an object or person is made to "disappear" in one place while a perfect replica emerges somewhere else ("apparate" for Harry Potter fans). The science behind teleportation is usually not explained by science fiction writers, but the effect they portray is dramatic. Information from the original object is "extracted" and transmitted to the receiving end, where it is used to construct the replica. The replica does not contain actual material of the original, but is invariably created from the same kinds of atoms, their arrangement modeled in exactly the same way as the original. Think of a fax machine that works on 3-dimensional objects as well as documents, but produces an exact copy instead of an approximate facsimile, destroying the original during the process of "scanning". Quantum teleportation is a much more realistic and subtler effect in which information is transferred between entangled quantum states. The idea of teleporting quantum particles emerged from purely theoretical considerations of a young researcher named William Wootters, who, in 1980, wrote his Ph.D. thesis centered on the question: from what principles can Born's rule in quantum theory be derived? Important to his considerations was a task known as quantum state tomography. Since measurement of a quantum state results in its modification, obtaining a complete characterization of a quantum state requires measurements on many identical copies of itself.
Quantum state tomography is the process by which a quantum state is reconstructed using measurements on an ensemble of identical quantum states. In the fall of 1989, Asher Peres found strong numerical evidence that joint measurements on a pair of systems yielded better tomography than separate measurements did. It seemed, therefore, that if a pair of similarly prepared particles was separated in space, an experimenter would be less likely to identify their state than if they were together. After attending a seminar delivered by Wootters in 1992, Charlie Bennett of IBM Research Division, T.J. Watson Research Center, started to ponder whether the inherent nonlocality associated with spatially separated entangled systems could achieve the same quality of quantum state tomography as in the case when they were in contact. In 1993, Bennett and an international group of six scientists including Wootters showed that the quantum state of a system could indeed be transferred from one party to a distant party using only local operations and classical communication, provided the original is destroyed, and in so doing were able to circumvent the no-cloning theorem. The trick was dubbed "quantum teleportation" by its authors. The abstract of their paper, published in Physical Review Letters, reads: "An unknown quantum state can be disassembled into, then later reconstructed from, purely classical information and purely nonclassical Einstein-Podolsky-Rosen (EPR) correlations. To do so the sender Alice, and the receiver Bob, must prearrange the sharing of an EPR-correlated pair of particles. Alice makes a joint measurement on her EPR particle and the unknown quantum system, and sends the classical result of this measurement.
Knowing this, Bob can convert the state of his EPR particle into an exact replica of the unknown state which Alice destroyed." In a conventional facsimile transmission, the original object is practically unscathed after the scanning process is complete, although scanning in this case is capable of extracting only partial information about the object. The scanned information is then transmitted to the receiving station, where it is imprinted on paper (or on some other surface) to produce an approximate copy of the original. In contrast, two entangled objects B and C (Figure 12) that were originally in contact are separated in quantum teleportation—object B is brought to the sending station, while object C is transmitted to the receiving station. A, the original object to be teleported, is scanned together with object B at the sending station. This process is irreversible, as it disrupts the states of both A and B. The scanned information is accepted by the receiving station, where it is used to "select one of several treatments" to be applied to object C. This makes C an exact replica of A. The information is considered teleported because the original object A never travels the distance between the two locations. In subsequent years, various groups have demonstrated teleportation experimentally in a variety of systems, including single photons, coherent light fields, nuclear spins, and trapped ions. Quantum teleportation holds immense promise, as it can facilitate long-range quantum communications. One day it could also be the enabling technology for a "quantum internet". In 2014, physicists from the Kavli Institute of Nanoscience at the Delft University of Technology in the Netherlands reported successful transmission of quantum data involving the spin state of an electron to another electron about 10 feet away (Figure 13).
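The protocol quoted above (a joint Bell measurement at the sender, two classical bits, and a conditional correction at the receiver) can be simulated directly with statevectors. Below is a minimal numpy sketch, not a full quantum simulator: qubit 0 holds the unknown state, qubits 1 and 2 share the entangled pair, and each of the four possible measurement outcomes is checked.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
X = np.array([[0, 1], [1, 0]])                  # bit-flip correction
Z = np.array([[1, 0], [0, -1]])                 # phase-flip correction
I2 = np.eye(2)

def kron(*ops):
    out = np.eye(1)
    for op in ops:
        out = np.kron(out, op)
    return out

def teleport(psi, m0, m1):
    """Teleport single-qubit state psi from qubit 0 (Alice) to qubit 2
    (Bob), post-selecting Alice's measurement outcome (m0, m1).
    Returns Bob's corrected single-qubit state."""
    # Qubits 1 and 2 share the Bell pair (|00> + |11>)/sqrt(2).
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
    state = np.kron(psi, bell)                   # 8-dim three-qubit vector
    # Alice's Bell measurement = CNOT(0 -> 1), then H on qubit 0.
    cnot01 = np.eye(8)[[0, 1, 2, 3, 6, 7, 4, 5]]
    state = kron(H, I2, I2) @ (cnot01 @ state)
    # Project onto Alice's outcome (m0, m1) and renormalize qubit 2.
    proj = state.reshape(2, 2, 2)[m0, m1, :]
    bob = proj / np.linalg.norm(proj)
    # Bob's corrections, chosen by the two classically sent bits.
    if m1:
        bob = X @ bob
    if m0:
        bob = Z @ bob
    return bob

psi = np.array([0.6, 0.8])   # an arbitrary "unknown" state
for m0 in (0, 1):
    for m1 in (0, 1):
        assert np.allclose(teleport(psi, m0, m1), psi)
print("state recovered at Bob for every measurement outcome")
```

Note that the state arrives at Bob only after the two classical bits do, which is why teleportation cannot send information faster than light.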
Successful experiments with quantum teleportation have been reported in the past, but the results of the Delft University study have an unprecedented replication rate of 100 percent for the studied distance. In 2016, researchers at the National Institute of Standards and Technology (Valivarthi, et al., 2016) were able to teleport quantum information carried by photons over 6.2 kilometers (km) of optical fiber, four times farther than the previous record. The researchers used a variation of the method described above: here three observers participate rather than the conventional two. Bob and Alice each make measurements of an entangled state and another photon, about 8 kilometers from each other. Their results are then sent to Charlie, who combines the two results to achieve quantum teleportation. This method ensured that the experiment extended beyond a single lab location, and it was done using existing dark fiber and wavelengths of light commonly used in current fiber internet.
In what has been hailed as a computing milestone, a team of researchers from the University of Science and Technology of China has achieved quantum supremacy thanks to a device that can manipulate tiny particles of light. Dubbed Jiuzhang, the system performed a quantum computation called "Gaussian boson sampling", which has been shown to be intractable for classical computers. Quantum supremacy is achieved when a quantum device is proven able to carry out a task that a classical computer would find impossible, or would take far too long to complete. While Jiuzhang achieved Gaussian boson sampling in just 200 seconds, the researchers estimated that the same calculation would take the world's fastest supercomputer, Fugaku, 600 million years to complete. Quantum supremacy has only been claimed once before. Last year, Google's researchers showed off a 54-qubit processor that they said could run a test computation in 200 seconds – a calculation that, according to the research, would take the world's biggest supercomputers 10,000 years to complete. Qubits come with unprecedented computational power due to their ability to exist in a superposition of states, and therefore to carry out many calculations at once. Researchers expect that, armed with enough stable qubits, quantum computers will shake up industries ranging from AI and finance to transportation and supply chains. The crux of the challenge consists of creating and maintaining enough qubits to make a quantum computer useful, and there are different ways to do so. The quantum technology developed by Google, for example, is entirely different from Jiuzhang's setup: the search giant, for its part, is investing in metal-based superconducting qubits. This is also IBM's preferred quantum technique, and both tech giants have poured large sums of money into superconducting circuits to push quantum computing research.
For superconducting qubits to remain controllable, however, they need to be kept at very cold temperatures – colder than deep space. Needless to say, making this practical is still a significant barrier. The extreme sensitivity of qubits to their external environment also means that it is hard to scale up the devices. Instead of superconducting circuits, Jiuzhang manipulates photons. The device was built specifically for the quantum task that it carried out, Gaussian boson sampling, which consists of simulating and predicting the erratic behavior of photons. The task consists of injecting particles of light into a network of beam splitters and mirrors that give photons multiple choices of paths to travel through before reaching different output ports. Photons, however, come with strange quantum properties that complicate the matter: there is no way of knowing deterministically which path they will take. What's more, if two identical photons hit a beam splitter at exactly the same time, they will stick together and both travel the same randomly chosen path. All of this makes it very difficult for classical computers to identify patterns of photon behavior, and to predict the output configuration of photons based on how the particles were input. The difficulty of the calculation also increases exponentially as more photons get involved, which means that a Gaussian boson sampling device is difficult to scale up. Christine Silberhorn, professor of integrated quantum optics at Paderborn University in Germany, has been working on Gaussian boson sampling for many years. "The scheme has its own challenges," she tells ZDNet. "Scaling up the system is hard, because all components have to be engineered for a quantum experiment, and they have to work accurately together. Moreover, it requires the detections and processing of very large datasets."
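The two-photon "sticking" just described is the Hong-Ou-Mandel effect, and for a single 50:50 beam splitter its statistics can be worked out in a few lines of Python. This is a sketch of the textbook single-splitter calculation, not of Jiuzhang's full interferometer network:

```python
from collections import defaultdict
from math import sqrt, factorial

# One photon enters each input port (a, b) of a 50:50 beam splitter.
# In terms of creation operators, the splitter maps
#   a† -> (c† + d†)/sqrt(2),   b† -> (c† - d†)/sqrt(2),
# so expanding a† b† |0,0> term by term gives the output amplitudes.
coeffs = defaultdict(float)   # (n_c, n_d) operator monomial -> coefficient
for out_a, amp_a in [((1, 0), 1 / sqrt(2)), ((0, 1), 1 / sqrt(2))]:
    for out_b, amp_b in [((1, 0), 1 / sqrt(2)), ((0, 1), -1 / sqrt(2))]:
        nc, nd = out_a[0] + out_b[0], out_a[1] + out_b[1]
        coeffs[(nc, nd)] += amp_a * amp_b

# (c†)^n |0> = sqrt(n!) |n>, so convert operator coefficients to
# Fock-state probabilities.
probs = {fock: (c * sqrt(factorial(fock[0]) * factorial(fock[1]))) ** 2
         for fock, c in coeffs.items()}
print({fock: round(p, 3) for fock, p in probs.items()})
# {(2, 0): 0.5, (1, 1): 0.0, (0, 2): 0.5}
```

The two one-photon-per-port paths to the (1, 1) outcome cancel exactly, so the photons always exit bunched together. It is interference of this kind, compounded across hundreds of splitters, that makes the sampling distribution so hard to reproduce classically.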
The researchers equipped Jiuzhang with 300 beam splitters and 75 mirrors, and said that they managed to measure up to 76 photons during their experiments – enough particles of light to make the calculation intractable for a classical computer. Cracking the Gaussian boson sampling problem has limited usefulness. For now, in fact, the experiment has done little more than show that Jiuzhang is better than classical computers at solving one very specific task – simulating the unpredictable behavior of photons. That doesn't mean, however, that a large-scale quantum computer will be built anytime soon to solve real-life problems. The value of the experiment rather lies in the proof that light-based quantum computers might be just as promising as their matter-based counterparts, which so far, courtesy of big tech's interest, have grabbed most of the headlines. "This experiment is an important milestone experiment for quantum simulations based on linear optical systems," says Silberhorn. "It demonstrates the high potential for scalable quantum computation using photons." Researchers have recently taken an interest in photonic quantum computers because of the potential that particles of light have to remain stable even in uncontrolled environments. Unlike devices based on superconducting qubits, photons don't require extreme refrigeration, and could in theory scale up much faster. "The boson sampling experiment reported by the USTC group is a real tour de force, and illustrates the potential of photonics as a quantum technology platform," Ian Walmsley, chair in experimental physics at Imperial College London, told ZDNet. "This is a real step forward in developing technologies that harness the power of quantum physics to perform tasks that are not possible using current technologies." The new milestone achieved by the team at the University of Science and Technology of China, therefore, is likely to bring new impetus to the ongoing race to build up quantum technologies.
Google and IBM are only two examples of deep-pocketed players who have shown interest in developing quantum computers, and a rich ecosystem is growing at pace to bring new innovations to the space. In addition to industry players, nation states have shown strong interest in developing quantum technologies. The Chinese government, for one, is investing heavily in the field. In fact, Jian-Wei Pan, who led the research team that worked on Jiuzhang, was also behind a recent quantum cryptography breakthrough that achieved quantum key distribution over a record-breaking 745 miles.
https://www.zdnet.com/article/quantum-supremacy-milestone-achieved-by-light-emitting-quantum-computer/
This morning, Google scientists confirmed in a blog post that their quantum computer had needed just 200 seconds to solve a problem that they claim would take the world's fastest supercomputer 10,000 years to complete. The team first ran the algorithm last spring using a 54-qubit processor called "Sycamore." While the achievement is called quantum supremacy, it doesn't mean that quantum computers are suddenly more capable than classical computers, since Google's quantum computer only beat the competition at a single, highly contrived problem. Quantum computers with day-to-day applications may still be decades away, but this is an important scientific milestone when comparing quantum computers to their classical counterparts. "For such large-scale endeavors it is good engineering practice to formulate decisive short-term goals that demonstrate whether the designs are going in the right direction," Google's John Martinis and Sergio Boixo, chief scientists of quantum hardware and quantum computing theory, wrote in the blog post. "So, we devised an experiment as an important milestone to help answer these questions." Quantum computers are a new kind of computing device that could one day be capable of solving problems that classical computers can't. Instead of a series of transistors linked together, representing two-choice bits like in classical computers, their base unit is the quantum bit, or qubit, a piece of hardware that mimics the behavior of a subatomic particle. Qubits communicate via the probability-driven theory of quantum mechanics instead of the regular rules of logic. They're still two-choice systems that output binary code, but getting to the answer incorporates the quantum mathematical ideas of entanglement, superposition, and interference. This new architecture may one day excel at simulating the behavior of subatomic particles well enough to create new medicines and new materials.
It might also be able to crack the code that modern-day encryption is based on. Scientists must first find a physical system that assumes quantum properties. But quantum states are incredibly fragile—the slightest bump of heat or vibrational energy can make initialized qubits lose their quantumness and turn into regular bits. Google's engineers built theirs from loops of superconducting wire, controlled by quick, customized microwave pulses. Google's quantum supremacy experiment essentially sets up random circuits out of these qubits. Certain outputs become more common than others. It's easy for Sycamore to find these output strings, but with each new qubit, it would take a regular supercomputer exponentially more time to come up with an answer. Google's scientists ran the experiment repeatedly, incorporating a new qubit until the supercomputer simulating the quantum computer couldn't keep up, according to the paper published in Nature. The main application of such an experiment is that it can produce truly random numbers, something useful in various fields of science, cryptography, art, and of course, online gambling. But a hypothesis called the Church-Turing thesis claims that a theoretical computer called the Turing machine, which basically simplifies all computers to symbols on tape, is the most efficient way to solve computer problems. Google's quantum computer provides evidence against this thesis. Rumors and hype have surrounded the Google quantum supremacy announcement since the team published a paper on the arXiv physics preprint server in 2016 describing how they'd achieve the milestone. Last month, the Financial Times reported that it had found the Google paper describing the completed quantum supremacy experiment on a NASA server, but Google would not confirm the veracity of the report until today. Already, scientists are debating whether the quantum supremacy experiment actually demonstrates what it claims.
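The shape of the sampling task described above can be sketched with a toy state-vector simulation. This is a hedged illustration, not Google's code: a single random unitary stands in for a deep random circuit, and 5 qubits stand in for Sycamore's 54 (the memory needed grows as 2 to the number of qubits, which is why classical simulation eventually cannot keep up):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5                    # qubits; the state vector has 2**n complex amplitudes
dim = 2 ** n

# A Haar-random unitary (QR of a complex Gaussian matrix, with phase fix)
# plays the role of a deep random circuit.
q, r = np.linalg.qr(rng.standard_normal((dim, dim))
                    + 1j * rng.standard_normal((dim, dim)))
u = q * (np.diag(r) / np.abs(np.diag(r)))

state = u @ np.eye(dim)[0]        # evolve the all-zeros starting state
probs = np.abs(state) ** 2        # output distribution over bitstrings

print(probs.sum())                # normalized: probabilities sum to 1
print(probs.max(), probs.min())   # some output strings far likelier than others
```

The spread between the most and least likely bitstrings is the "certain outputs become more common than others" signature that Sycamore samples easily but classical machines must pay exponentially for.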
On Monday, the IBM quantum team published a blog post arguing that a classical computer could more accurately run the Google problem in just 2.5 days. Simply put, it's hard to prove a claim that a classical computer can't do something. Regardless, this experiment does not mean that quantum computers are suddenly going to appear in your iPhone; there's a lot of work left. John Preskill, the Caltech physicist who coined the term quantum supremacy, told Gizmodo in March that while it's certainly worth pursuing these supremacy experiments, "I think it's more important to try and develop the tools that we need to scale up further: perfecting error-correction methods, improving the qubits, and addressing the systems engineering issues that you need to control the platform with thousands or millions of qubits." It's clear that we've entered a new era of quantum computing (it's called the NISQ era), as companies now have noisy but functional devices that may actually be useful soon. I'm at Google's lab in Santa Barbara, California today to get a first look at the device. I'll report back with more images of the computer and what it's actually like.
https://gizmodo.com/google-confirms-achieving-quantum-supremacy-1839288099
A new project will use the electric field in an accelerator cavity to try to levitate a tiny metallic particle, allowing it to store quantum information. Quantum computing could solve problems that are difficult for traditional computer systems. It may seem like magic. One step toward achieving quantum computing even resembles a magician’s trick: levitation. A new project at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility will attempt this trick by levitating a microscopic particle in a superconducting radiofrequency (SRF) cavity to observe quantum phenomena. Typically at Jefferson Lab and other particle accelerator facilities, SRF cavities enable studies of the atom’s nucleus. They do this by accelerating subatomic particles, such as electrons. This project will use the same type of cavity to instead levitate a microscopic particle of metal, between 1 and 100 micrometers in diameter, with the cavity’s electric field. “No one has ever intentionally suspended a particle in an electric field in a vacuum using SRF cavities,” said Drew Weisenberger, a principal investigator on this project, as well as Chief Technology Officer and head of the Radiation Detector and Imaging Group in the Experimental Nuclear Physics Division at Jefferson Lab. If the project team is able to levitate a particle, they might be able to then impart a quantum state on it by cooling the trapped particle to its lowest possible energy level (because that’s when quantum properties occur). “Storing quantum information on a levitated nanoparticle is our ultimate goal, but for now, it is a proof of principle experiment,” said Pashupati Dhakal, another principal investigator on the project and a staff scientist at Jefferson Lab in the Accelerator Operations, Research and Development Division. 
“We want to know if we can trap and levitate particles inside the cavity using the electric field.” Exploring the Quantum with Accelerator Cavities The idea for this project came from observations of accelerator experts. They think they have already unintentionally levitated unwanted and rare nanoparticles of metal, such as niobium and iron, inside SRF cavities during particle accelerator operations. They suspect that this unintentional levitation has impacted the performance of SRF cavity components. Researchers are attempting to use a several-decades-old technique called “laser trapping”, as a step toward reliably imparting a quantum state on a particle suspended in a laser beam. But, the Jefferson Lab project team thinks that SRF cavities may provide a better tool for those researchers. “An electric field could go potentially beyond the capabilities of laser trapping,” Weisenberger said. Intrinsic characteristics of SRF cavities will overcome some limits of laser trapping. A levitated particle in an SRF cavity that is under vacuum and chilled to super cold temperatures will only interact with the cavity’s electric field and not lose information to the outside, which is important for maintaining a quantum state. “Like storing information on a computer chip, the quantum state will stay and not dissipate,” Weisenberger said. “And that could eventually lead to applications in quantum computing and quantum communications.” This project, titled “SRF Levitation and Trapping of Nanoparticles Experiment,” is funded by the Laboratory Directed Research & Development program, which provides resources for Jefferson Lab personnel to make rapid and significant contributions to critical science and technology problems relevant to the mission of Jefferson Lab and the DOE. A Multidisciplinary Approach The project was conceived and launched by Rongli Geng in October 2021 before he transitioned to Oak Ridge National Laboratory. 
It has now shifted to a larger and more multi-disciplinary team led by Weisenberger and Dhakal, the current co-principal investigators. Weisenberger's team researches detector technology for nuclear physics research, whereas Dhakal's work focuses on developing SRF cavities to accelerate electrons at high speeds. Weisenberger says that the multidisciplinary approach will bring together their expertise as they branch out into the less familiar territory of this LDRD project. Both principal investigators remark that the project is moving forward well, thanks to the diligence and expertise supplied by every member of the team. Team members include John Musson, Frank Marhauser, Haipeng Wang, Wenze Xi, Brian Kross and Jack McKisson. "It's an interesting step outside of the usual things that we do," Weisenberger said. "The LDRD program lets loose Jefferson Lab scientists and engineers on a research question that isn't directly related to what we're actually hired to do, but is making use of all the expertise that we bring and it's a great resource to tap to try to stretch. That's what we're doing with this project, stretching." Building and Testing Before turning the project over to Weisenberger and Dhakal, Geng and his colleagues had determined the required parameters of the cavity and electric field with simulations and calculations. "We have everything on paper but we have to make it into a reality," Dhakal said. The team is currently setting up the experiment in real life. "We have to see if what was simulated can actually happen," Weisenberger said. First, they'll assemble a mock-up of the experiment at room temperature. Then, they'll circulate liquid helium around the outer surfaces of the cavity to cool it to superconducting temperatures approaching absolute zero. Next comes the most difficult part.
They must get a single microscopic particle in the correct region of the cavity while the cavity is locked up inside a containment vessel at superconducting temperatures, under vacuum, and with the electric field on. “We’ve come up with a way to remotely launch a particle in the cavity under experimental conditions, we just have to test it now,” Weisenberger said. “In the research and development world, you often can’t do what you thought you could do. We try and test and run into problems, try to solve the problems, and keep going.” This is a year-long project with the possibility of another year of funding, depending on how things go. It is also an early stage, proof of principle project. If it is ultimately successful, there would still be a long road of R&D before the concepts could be applied toward building quantum computers. Such computers would require levitating and imparting quantum states on tens to hundreds to thousands of much smaller particles predictably and reliably. Still, the researchers are looking forward to the discoveries they hope this study will enable regarding microscopic particle levitation and potential observation of a quantum state. “I’m optimistic,” Dhakal said. “Either way, we’ll discover something. Failure is just as much a part of R&D as success. You learn from both. Basically, whether the particle levitates or not, or whether we can impart the quantum state to it or not, it’s something that’s never been done before. It’s very challenging and exciting.” The team already has a research paper in the works for this project, but only time will tell whether they can realize this bit of magic in the laboratory.
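To get a feel for the scales involved, here is a back-of-envelope estimate of the weight the cavity's field would need to support. The 10-micrometre diameter sits inside the 1 to 100 micrometre range quoted above, and the niobium density is a standard tabulated value; neither number comes from the experiment itself:

```python
import math

# Back-of-envelope: weight of a 10-micrometre-diameter niobium sphere.
# Illustrative values only -- not parameters from the Jefferson Lab project.
radius = 5e-6        # m (half of a 10-micrometre diameter)
density = 8570.0     # kg/m^3, tabulated density of niobium
g = 9.81             # m/s^2

mass = density * (4 / 3) * math.pi * radius ** 3
weight = mass * g
print(f"mass   ~ {mass:.2e} kg")   # ~4.5e-12 kg, a few picograms
print(f"weight ~ {weight:.2e} N")  # ~4.4e-11 N
```

A trapping force of a few tens of piconewtons is tiny by everyday standards, which is part of why a field confined in a resonant cavity is a plausible tool for the job.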
https://scitechdaily.com/levitation-classic-magic-trick-may-enable-quantum-computing/
Quantum computers are predicted to be faster than any supercomputer that can be built with silicon chips. The laws of physics are supposed to be symmetrical in time and space. That means a ball thrown at a certain speed and in a certain direction would always go the same distance, no matter when you throw or where on Earth you throw it (assuming no other factors, like weather, came into play). There are, however, things that do break space symmetry. Magnets have north and south poles, which means the magnetic spins of atoms within a magnet are not, as would be expected, spread in all directions but aligned in one direction or another. The same is true of crystals. Though fixed in space, some atoms in a crystal have preferred positions, which makes them appear different based on where you observe them from. Time symmetry, however, never breaks. The Nobel Prize-winning physicist Frank Wilczek noticed this and started asking questions. What if, for example, you threw the ball once at 12pm on Friday, and then again at 10am on Tuesday, at the exact same speed in the exact same conditions—and the two throws went a completely different distance? Wilczek knows that, in the macro world, that wouldn't happen. But as Albert Einstein once pointed out, things get "spooky" at the atomic level. In 2012, Wilczek proposed that perhaps at the atomic level it would be possible to create a type of matter that broke time symmetry; he called it a "time crystal." The idea kicked off a storm of interest. Soon, however, calculations by Masaki Oshikawa at the University of Tokyo showed that time crystals would be impossible. His team found that any system in the lowest energy state and in equilibrium would not be able to break time symmetry. By equilibrium, physicists mean all molecules of, say, water in the liquid state have at least a certain amount of energy.
If you add energy in the form of heat to liquid water, all the molecules that gain a certain amount of energy will reach a new equilibrium, becoming steam and entering a new state of matter. Every distinct state of matter we know about is in equilibrium. However, if physicists could get matter to enter a nonequilibrium state, they could also, based on Oshikawa's calculations, make time crystals. If they did, it would be a hitherto unknown state of matter. The trouble was physicists didn't know how to do it—until now. Two studies published in Nature March 9 show such nonequilibrium states can be entered and matter in those states can be classified as types of time crystals. Christopher Monroe, a physicist at the University of Maryland-College Park, created the system using 10 charged atoms of the element ytterbium and four sets of lasers. One laser converted each ytterbium atom into a magnet. Another laser introduced disorder to ensure the atoms were in a nonequilibrium state. Then, a third laser made the magnetic atoms flip—as if the north pole switched to the south pole in a large magnet. The combination of these three lasers (the fourth was used to read the status of each atom) caused the atoms to oscillate, but at half the frequency at which their magnetic states were being flipped. In other words, it broke time symmetry. "It's like playing with a jump rope, and somehow our arm goes around twice but the rope only goes around once," Norman Yao, a University of California, Berkeley physicist who helped design the experiments, told Nature. Mikhail Lukin, a physicist at Harvard University, also created a time crystal but in a different system. He used a "dirty" diamond, which is like a normal diamond but has lots of nitrogen atom impurities. He used microwave pulses to flip the spins of nitrogen atoms—and, it turned out, just like Monroe's ytterbium atoms, the frequency at which the atoms flipped their spins differed from the frequency of the pulses.
The time crystals created by Monroe and Lukin are slightly different from the ones Wilczek proposed. They require regular inputs of energy, which Wilczek's time crystals would not have needed. "It's less weird than [Wilczek's] idea, but it's still fricking weird," Yao told Nature. Because of this difference, not everyone is convinced what the physicists have achieved are actually time crystals. "This is an intriguing development, but to some extent it's an abuse of the term," Oshikawa told Nature. The creators of the newly announced time crystals aren't fussed about definitions. They are, instead, excited by the possibility of using their methods to accelerate the development of quantum computers, which are predicted to be faster than any supercomputer that can be built with silicon chips. Quantum computers require atoms to exist in entangled states, where changing the state of one automatically causes the other to change state too. At present, such states can be achieved only at extremely low temperatures. Lukin got all the nitrogen atoms in his dirty diamond to change position together at a constant frequency—meaning they were held in quantum entanglement—and he did it at room temperature. "In my main job, I use atoms to create quantum computers," Monroe told me. "Our finding could help create larger quantum computers that don't need to be at such cold temperatures."
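The period-doubling at the heart of both experiments, a drive at one frequency with a response at half that frequency, can be caricatured in a few lines. This is a deliberately stripped-down sketch: real time crystals rely on interactions and disorder to keep the half-frequency response rigid even when each flip is imperfect.

```python
# Toy picture of a driven (discrete) time crystal: each drive pulse flips
# every spin, so the spin pattern repeats only every TWO pulses -- a response
# at half the drive frequency ("the rope only goes around once").
spins = [1, -1, 1, 1, -1]          # initial magnetic orientations

history = [spins[:]]
for pulse in range(4):             # four drive pulses
    spins = [-s for s in spins]    # each pulse flips every spin
    history.append(spins[:])

# The pattern recurs with period 2 in units of the drive period.
print(history[0] == history[2] == history[4])  # -> True
print(history[0] == history[1])                # -> False
```

In the jump-rope analogy from the article, each loop of `pulse` is one swing of the arm, while `history` returning to its starting pattern only every second pulse is the rope coming around once.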
https://www.nextgov.com/emerging-tech/2017/03/physicists-have-created-impossible-state-matter-could-power-quantum-computers/136072/
Science education in schools has a long way to go to be taken seriously, and this is why it is so important for teachers in the classroom to understand how to teach it. It is not enough to teach only what the curriculum lists; we need to teach what is really going on, as opposed to what is shown in a lecture. So, let's start with the basics of science education. What is science? Scientific knowledge is the process of discovering, analyzing and understanding the nature of the world around us. We are all born curious about the world, but scientific understanding is not innate: we need to be taught how to acquire it, and a classroom setting is crucial for that. The first step is to have a good understanding of what science is. Science is a collection of things that we observe, think about and understand, and they have a physical basis. The process of discovery is the understanding of these things. We are constantly making discoveries, trying to understand the world in new ways and then coming up with new explanations for what we find. The underlying idea is that everything we observe has a physical cause. When we study, for example, the way the Earth rotates, we are actually looking at the behaviour of the planet itself. The motion of the planet can be described by mathematical functions and measured; this is a physical theory. We know this because telescopes let us see into the sky, and it is also possible to look through the atmosphere and observe the behaviour directly. This is the physics of how things interact. The second step is the description of the universe as we see it. We live in a scientific world, where all the elements of our universe are observable. There are different types of observational evidence that can provide clues to how the world works, and the universe has a certain kind of physical structure that does not change arbitrarily.
So how do we explain the laws of physics, for instance? We can do this by measuring the motions of objects and their properties, describing them with equations, and then comparing the predictions to the behaviour we see in nature. There are two kinds of physical theories that help us understand the universe: quantum mechanics and general relativity. In quantum mechanics, you are looking at elementary particles, such as quarks and electrons, that behave in very particular ways. They have properties like momentum, mass, spin and charge. General relativity, on the other hand, uses the Einstein equations to describe how gravity works: how mass and energy shape the space that objects travel through. We measure the speed of light, look at how light behaves, and see whether it behaves the way the equations predict. These equations are different from those of quantum mechanics, and physicists are still working out how the two theories fit together. There is a third kind of phenomenon, called quantum entanglement. This concerns the way in which certain particles interact with other particles, creating shared quantum states. It can give us new ideas about the world. You can think of it like having two entangled toys, where pressing on one causes the other to move as well. The result of this interaction is an entangled state. These states are described by quantum theory, but they are properties of systems of particles rather than of any single particle. In the next step, we learn about what the world looks like when we look out at the universe. When you look at an image on the wall, or a photograph, you can see how that image looks and what its colours are.
In this step, you will also learn how the laws governing the universe relate to these different kinds of observation. These three steps help you understand the way our universe behaves. There are other subjects you can study in school, such as astronomy, geology and mathematics, that build on the same foundations. The final step is learning to use these concepts to understand what is going on in the world, and to solve problems. Science education The idea of science as a school subject is not new. Many of the sciences we know have been taught in classrooms over the past century, and we should carry the lessons of that history into the present day. For a long time, however, there was no national curriculum and no national standards of scientific knowledge, so when formal science education was introduced, every school taught it differently. As we learn from our own history, there is an important lesson in this. Science was always taught in a way that it was not
https://mentorsofnewthought.com/how-to-teach-science-in-schools/
What are quantum computers? Quantum computers work in a different way to conventional computers. While the latter are binary and store information in the form of 0 or 1, quantum computers use a system of measurement based on quantum physics and the nature of matter itself, built on qubits. These are quantum systems that can be zero, one or a linear combination of both. That combination is called superposition. It means they can store and simultaneously process far more data in far less physical memory space than binary machines. For example, 8 bits on binary systems can represent any one number between 0 and 255, while 8 qubits can represent all 256 of those values at the same time. That explanation is a gross over-simplification for a technology that's innately hard to understand. What it means is that quantum computers should handle bigger problems and vastly larger data sets than conventional PCs, enabling analysis of vast amounts of data that couldn't be solved – or even stored – until now, and all at the same time. That last element is key. "As opposed to a classical computer that must, essentially, trial every route before deciding on the optimum, quantum computers have the potential to provide an optimum by taking all routes into account at the same time," explained researchers from GlobalData. In 2019, Google demonstrated how a quantum computer could solve in minutes a problem it would take conventional computers 10,000 years to solve. This is still a relatively early-stage technology, and while Google, IBM, Microsoft, PsiQuantum, governments and startups are developing quantum computing technologies, they are expensive, suffer from decoherence, require error correction, and don't yet scale beyond a certain point, typically 50 or 60 qubits. This is subject to change: IBM promises its 127-qubit Quantum Eagle processor in 2021 and says it will introduce 1,000+ qubit systems by 2023.
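The 8-bit versus 8-qubit comparison above can be made concrete with a tiny state-vector calculation. This is an illustrative NumPy sketch, not code from any particular quantum SDK:

```python
import numpy as np

# 8 classical bits hold ONE value from 0..255; an 8-qubit register carries an
# amplitude for EVERY one of the 256 basis states simultaneously.
n = 8
h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = np.array([1.0])                        # build the register qubit by qubit
for _ in range(n):
    # Apply H to a fresh |0> qubit and tensor it onto the register.
    state = np.kron(state, h @ np.array([1.0, 0.0]))

print(state.shape)                    # (256,): one amplitude per bitstring
print(np.allclose(state ** 2, 1 / 256))  # all 256 values equally weighted -> True
```

Measurement still returns a single 8-bit string, which is why this superposition is a resource for computation rather than a way to read out 256 answers at once.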
The technology will eventually become sufficiently mature to meet real-world challenges. What will this mean for business? As the technology improves, it will be applied. Boston Consulting Group (BCG) recently predicted quantum computing will add up to $850 billion in economic value by 2050. “Recent advances and roadmaps from major hardware companies such as IBM, Google, Honeywell, IonQ, PsiQuantum and others have increased the confidence that we will have machines powerful enough to tackle important business and society problems before the end of this decade. Impacted companies and governments should get prepared for an accelerated timeline,” said BCG partner, Jean-Francois Bobier. But how will the technology create this value? Bobier believes use of quantum computers to run vast simulations will benefit medical research and drug discovery, battery design and fluid dynamics. “For a top pharma company with an R&D budget in the $10 billion range, quantum computing could represent an efficiency increase of up to 30%,” BCG states. Sci-tech research can also reach new frontiers: Google researchers used quantum computing to demonstrate a genuine “time crystal,” a piece of matter that evades the second law of thermodynamics. These systems may enable government and enterprise to optimize logistics and insurers to better manage risk, while machine-learning advances may help protect against (and, perhaps, enable) online crime and fraud and become the mind in autonomous vehicle systems. Military uses may include radar, highly-secure, very-long-distance communications and submarine detection systems as are being developed in China. Many financial institutions, including JP Morgan, Goldman Sachs and Wells Fargo are exploring how the tech can be applied to financial instruments such as stocks. An April 2021 Goldman Sachs report showed risk analysis could be conducted at 1,000 times the speed of existing technologies but warned of high error rates at this time. 
Quantum computer hardware capable of running such tests successfully is expected to become available in 10 to 20 years' time. Most quantum computing reports note the security use cases of cryptography and encryption. Quantum computers are expected to be capable of breaking existing encryption but, conversely, should also enable the development of more powerful encryption standards. In Europe, the European quantum communication network EuroQCI is developing an ultra-secure EU-wide communications infrastructure to secure critical infrastructure. In France, President Emmanuel Macron announced a 1.8 billion euro Quantum Plan initiative for supporting research and development of quantum technologies. France isn't alone. China, the U.S., UK, and many other nations are investing billions in what is becoming a quantum computing arms race for technological supremacy. Quantum computers may also contribute to the struggle against climate change, according to a BCG report. Because they can model complex molecular interactions existing computers cannot, they may enable researchers to innovate technological solutions that reduce emissions in carbon-intensive industries like construction, fertilizer production and transportation. "Quantum computing could help bring more low-carbon technologies into economic reach," says BCG. "It is in the best interest of governments and companies to fast-track progress in the race for our future." Oxford Quantum Circuits launched the UK's first quantum computing as a service (QaaS) platform in July. Clients can access these machines via the cloud to identify the impact on their business. "Early adopters will have the advantage of understanding what [quantum] means in terms of their market and their business," said Dr. Ilana Wisby, the company's CEO. IBM and other big names also support online access to quantum computing resources. QaaS platforms may play an important part in making the benefits of this machinery available.
A commercial 50 qubit quantum computer costs in the region of $15 million, not to mention running and maintenance costs. The cost and fragility of quantum systems will limit their appeal for some time yet. Orange is involved with three European projects investigating quantum communications: CiViQ (Continuous Variable Quantum Communications), OPENQKD (Open European Quantum Key Distribution testbed) and QOSAC (Quantum Overarching System Architecture Concepts). Jon Evans is a highly experienced technology journalist and editor.
https://www.orange-business.com/en/blogs/what-are-quantum-computers-and-what-do-they-mean-my-business
Quantum researchers have managed to simulate the reversal of time at the quantum scale, using IBM's quantum computers. Although the effect is simulated, the implication of this experiment is that the arrow of time doesn't necessarily have to flow in one direction, but can be reversed, allowing what once was to be once again. The passage of time, at least here in the macroscopic world that we live in, travels in only one direction (forward, that is), and that effect is marked by the concept of entropy, in which isolated systems continually fall into an increasing state of disorder, like the erosion of a mountain into smaller rocks and sand, or the scattering of a set of racked billiard balls after the break. Interestingly, entropy is virtually the only quantity in the physical sciences that requires the arrow of time to move in one direction; otherwise, the application of most physical laws can work just as well backwards as they do forward. An extremely simple mathematical example would be that 1+2=3, but the equation also works in reverse: 2+1=3. Quantum physicists from the Moscow Institute of Physics and Technology, along with colleagues in Switzerland and the U.S., decided to see if they could reverse the state of disorder in an isolated system, for instance in a single electron, effectively hitting the rewind button on entropy. "Suppose the electron is localized when we begin observing it. This means that we're pretty sure about its position in space," explains study co-author Andrey Lebedev from MIPT and ETH Zurich. "The laws of quantum mechanics prevent us from knowing [its position] with absolute precision, but we can outline a small region where the electron is localized." But with the passage of time, the boundaries of the electron become smeared over an increasingly larger area of space, indicating an increasing state of chaos associated with the particle—this is the concept of entropy at work.
Basically, over the course of the observation of the electron, the particle falls into an ever-increasing state of chaos. While we don't see the opposite happen in our everyday world, the laws of quantum mechanics don't actually prohibit the electron from falling into a more ordered state. It would be much like billiard balls that are scattered across a table somehow falling into their racked position if the table were jarred in just the right way—an utterly implausible scenario, but certainly not an impossible one.

Armed with that improbability, the quantum scientists turned to a system that could give their proverbial pool table a jolt that had a reasonable chance to actually re-rack the balls: a quantum computer. In this case, the computer's individual qubits (analogous to the computational "bits" used by a regular computer) would represent individual electrons: each qubit would be allowed to fall out of the well-ordered state it started in, and then be given a mathematical "kick" to see if it reverted back to its earlier ordered state from the resulting chaotic one.

The experiment was conducted in four stages. The first stage was to set the individual qubits to their initial "ground" state—racking the billiard balls, as it were—a highly ordered configuration that would be analogous to the localization of an electron in a small region. In stage two, the break shot is made: the researchers allowed the ordered state of the qubits to degrade into chaos, much as the position of our imaginary electron would appear to become smeared over an increasingly large area of space as time passes. Needless to say, this was the simplest stage, since natural entropy did all the work. Stage three: back to the past!
Once the state of the qubits had reached a sufficient state of disorder, a special program was then run that modified the state of the computer—the improbable "kick" given to the billiard table to send the balls back to their racked position—to simulate an effect on our isolated electron that would otherwise appear to be a chaotic happenstance, such as a random fluctuation in the cosmic microwave background (CMB), the faint background radiation left over from the birth of the Universe. In stage four, the disordered state of the qubits would revert to their initial state, simulating the reversal of time in our hypothetical electron to an earlier, more ordered state. Or, returning to our billiard table analogy, the improbable circumstance of a movement in the table causing the balls to return to their racked position.

Each iteration of the experiment had a high success rate, with 85 percent of the cases returning to their ground state when two qubits were used. However, when three qubits were used, the experiment only had a 50/50 chance of succeeding, apparently due to errors caused by imperfections present in the computers themselves.

But what does this mean in terms of real-world applications? "Our findings break ground for investigations of the time reversal and the backward time flow in real quantum systems," according to the study paper. The research team predicts that this experiment could also help improve future quantum computing systems. "Our algorithm could be updated and used to test programs written for quantum computers and eliminate noise and errors," according to Lebedev.

[Image: "Great Andromeda Nebula" by Isaac Roberts, 1899, via Wikimedia Commons. The Andromeda galaxy (M31) is two million light-years away, so we are viewing M31's light from two million years ago, a time before humans existed on Earth.]

News Source: phys.org
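The four stages described above rest on a basic fact of quantum mechanics: evolution under a unitary operation can be undone by applying its inverse. The following NumPy sketch is only an illustration of that idea, not the researchers' actual algorithm (which ran as a specific circuit on IBM hardware); the choice of a random unitary and all names here are my own.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def random_unitary(n: int) -> np.ndarray:
    """Draw a random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix column phases so the result is well spread

# Stage 1: two qubits start in the ordered ground state |00>.
ground = np.zeros(4, dtype=complex)
ground[0] = 1.0

# Stage 2: the state "degrades" under some evolution U (entropy at work).
U = random_unitary(4)
scrambled = U @ ground

# Stages 3-4: the improbable "kick" is the inverse evolution U-dagger,
# which rewinds the scrambled state back to the ground state.
rewound = U.conj().T @ scrambled

fidelity = abs(np.vdot(ground, rewound)) ** 2
print(f"fidelity with ground state: {fidelity:.6f}")
```

On real hardware the rewind is imperfect, which is why the reported success rate fell from 85 percent with two qubits to roughly 50 percent with three.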
<urn:uuid:bbd1ef08-bf68-4101-81f6-771175c99450>
CC-MAIN-2022-05
https://www.unknowncountry.com/headline-news/researchers-reverse-time-in-a-quantum-computer/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320303512.46/warc/CC-MAIN-20220121162107-20220121192107-00197.warc.gz
en
0.948615
1,209
3.65625
4
Quantum computers are a new type of machine that operate on quantum mechanical hardware and are predicted to give enormous speed advantages in solving certain problems. Research groups at leading universities and companies, including Google, Microsoft and IBM, are part of a worldwide race to realise the first quantum computer that crosses into the 'quantum computational singularity'. This represents a problem so complex that today's top supercomputer would take centuries to find a solution, while a quantum computer could crack it in minutes. Now a team of scientists from Bristol have discovered that the boundary to this singularity is further away than previously thought. The research is reported in Nature Physics.

The results apply to a highly influential quantum algorithm known as 'boson sampling', which was devised as a very direct route to demonstrate quantum computing's supremacy over classical machines. The boson sampling problem is designed to be solved by photons - particles of light - controlled in optical chips, technology pioneered by Bristol's Quantum Engineering and Technology Labs (QETLabs). Predicting the pattern of many photons emerging from a large optical chip is related to an extremely hard random matrix calculation. With the rapid progress in quantum technologies, it appeared as though a boson sampling experiment that crossed into the quantum computational singularity was within reach. However, the Bristol team were able to redesign an old classical algorithm to simulate boson sampling, with dramatic consequences.

Dr. Anthony Laing, who heads a group in QETLabs and led this research, stated: "It's like tuning up an old propeller aeroplane to go faster than an early jet aircraft. We're at a moment in history where it is still possible for classical algorithms to outperform the quantum algorithms that we expect to ultimately be supersonic. But demonstrating such a feat meant assembling a crack team of scientists, mathematicians, and programmers."
Classical algorithms expert Dr. Raphaël Clifford, from Bristol's Department of Computer Science, redesigned several classical algorithms to attack the boson sampling problem, with the 1950s-era Metropolised Independence Sampling algorithm giving the best performance. The simulation code was optimised by QETLabs researcher 'EJ', a former LucasArts programmer. Expertise on computational complexity came from Dr. Ashley Montanaro, of Bristol's School of Mathematics, while QETLabs students Chris Sparrow and Patrick Birchall worked out the projected performance of the competing quantum photonics technology. At the heart of the project, bringing all these strands together, was QETLabs PhD student and first author on the paper, Alex Neville, who tested, implemented, compared, and analysed all of the algorithms.

He stated: "The largest boson sampling experiment reported so far is for five photons. It was believed that 30 or even 20 photons would be enough to demonstrate quantum computational supremacy." Yet he was able to simulate boson sampling for 20 photons on his own laptop, and increased the simulation size to 30 photons by using departmental servers. Alex Neville added: "With access to today's most powerful supercomputer, we could simulate boson sampling with 50 photons."

The research builds on Bristol's reputation as a centre of activity for quantum science and the development of quantum technologies. Through QETLabs, the university has embarked on an ambitious programme to bring quantum technologies out of the laboratory and engineer them into useful devices that have real-world applications for tackling some of society's toughest problems. In addition to collaborations with tech companies such as Microsoft, Google, and Nokia, start-ups and new business activities focused on quantum technologies have emerged in Bristol.
An important theme across the overall quantum research activity is developing our understanding of exactly how quantum technologies can provably outperform conventional computers. Recently Dr. Montanaro, together with Professor Noah Linden of the School of Mathematics, organised a Heilbronn Focused Research Group on the topic of quantum computational supremacy. This meeting brought some of the world leaders in the field, from both industry and academia, to Bristol for a week of intense discussions and collaboration. Among the attendees was one of the theorists who devised boson sampling, Professor Scott Aaronson, from the University of Texas at Austin.

Although outperforming classical computers might take a little longer than originally hoped, Dr. Laing is still optimistic about the prospects for building a device to do just that. He stated: "We now have a solid idea of the technological challenge we must meet to demonstrate that quantum machines can out-compute their classical counterparts. For boson sampling, the singularity lies just beyond 50 photons. It's a tougher nut to crack than we first thought, but we still fancy our chances." With Dr. Laing's group focused on practical applications of quantum technologies, the current work puts bounds on the size and sophistication of photonic devices that will be required to tackle industrially relevant problems that are beyond the capabilities of today's classical algorithms.

The paper titled "Classical boson sampling algorithms with superior performance to near-term experiments" is authored by A. Neville, C. Sparrow, R. Clifford, E. Johnston, P. Birchall, A. Montanaro and A. Laing and has appeared in Nature Physics.
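The "extremely hard random matrix calculation" behind boson sampling is the matrix permanent: output probabilities are proportional to squared permanents of submatrices of the chip's transfer matrix. As a rough illustration of why this gets hard (this is my own sketch, not the Bristol team's optimised simulation code), here is Ryser's formula, whose cost grows as roughly 2^n for an n x n matrix:

```python
from itertools import combinations

def permanent(a):
    """Permanent of a square matrix via Ryser's inclusion-exclusion formula."""
    n = len(a)
    total = 0.0
    for r in range(1, n + 1):               # all non-empty column subsets
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in a:                   # product over rows of subset column sums
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# The permanent of the n x n all-ones matrix is n!, and the 2^n subsets
# visited above hint at why exact evaluation near n = 50 photons is infeasible.
print(permanent([[1, 1], [1, 1]]))                    # 2.0
print(permanent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))   # 1.0
```

Unlike the permanent, the superficially similar determinant is easy to compute, which is what makes boson sampling such a sharp test case for quantum versus classical performance.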
<urn:uuid:bb95ac98-1f79-461a-9954-8ecf629affd0>
CC-MAIN-2020-16
http://primeurmagazine.com/weekly/AE-PR-11-17-35.html
s3://commoncrawl/crawl-data/CC-MAIN-2020-16/segments/1585370497042.33/warc/CC-MAIN-20200330120036-20200330150036-00058.warc.gz
en
0.938865
1,071
3.546875
4
The one thing everyone knows about quantum mechanics is its legendary weirdness, in which the basic tenets of the world it describes seem alien to the world we live in. Superposition, where things can be in two states simultaneously: a switch both on and off, a cat both dead and alive. Or entanglement, what Einstein called "spooky action at a distance," in which objects are invisibly linked, even when separated by huge distances. But weird or not, quantum theory is approaching a century old and has found many applications in daily life. As John von Neumann once said: "You don't understand quantum mechanics, you just get used to it."

Much of electronics is based on quantum physics, and the application of quantum theory to computing could open up huge possibilities for the complex calculations and data processing we see today. Imagine a computer processor able to harness superposition, to calculate the result of an arbitrarily large number of permutations of a complex problem simultaneously. Imagine how entanglement could be used to allow systems on different sides of the world to be linked and their efforts combined, despite their physical separation. Quantum computing has immense potential, making light work of some of the most difficult tasks, such as simulating the body's response to drugs, predicting weather patterns, or analyzing big datasets.

Such processing possibilities are needed. The first transistors could only just be held in the hand, while today they measure just 14 nm, 500 times smaller than a red blood cell. This relentless shrinking, predicted by Intel founder Gordon Moore as Moore's law, has held true for 50 years, but it cannot hold indefinitely. Silicon can only be shrunk so far, and if we are to continue benefiting from the performance gains we have become used to, we need a different approach.
Advances in semiconductor fabrication have made it possible to mass-produce quantum-scale semiconductors – electronic circuits that exhibit quantum effects such as superposition and entanglement. The image, captured at the atomic scale, shows a cross-section through one potential candidate for the building blocks of a quantum computer, a semiconductor nano-ring. Electrons trapped in these rings exhibit the strange properties of quantum mechanics, and semiconductor fabrication processes are poised to integrate the elements required to build a quantum computer.

While we may be able to construct a quantum computer using structures like these, there are still major challenges involved. In a classical computer processor a huge number of transistors interact conditionally and predictably with one another. But quantum behavior is highly fragile; for example, under quantum physics even measuring the state of the system, such as checking whether the switch is on or off, actually changes what is being observed. Conducting an orchestra of quantum systems to produce useful output that couldn't easily be handled by a classical computer is extremely difficult. But there have been huge investments: the U.K. government announced 270 million pounds (about $417 million) in funding for quantum technologies in 2014, for example, and the likes of Google, NASA and Lockheed Martin are also working in the field. It's difficult to predict the pace of progress, but a useful quantum computer could be ten years away.

The basic element of quantum computing is known as a qubit, the quantum equivalent to the bits used in traditional computers. To date, scientists have harnessed quantum systems to represent qubits in many different ways, ranging from defects in diamonds, to semiconductor nano-structures or tiny superconducting circuits. Each of these has its own advantages and disadvantages, but none yet has met all the requirements for a quantum computer, known as the DiVincenzo Criteria.
The most impressive progress has come from D-Wave Systems, a firm that has managed to pack hundreds of qubits onto a small chip similar in appearance to a traditional processor.

The benefits of harnessing quantum technologies aren't limited to computing, however. Whether or not quantum computing will extend or augment digital computing, the same quantum effects can be harnessed for other means. The most mature example is quantum communications. Quantum physics has been proposed as a means to prevent forgery of valuable objects, such as a banknote or diamond. Here, the unusual negative rules embedded within quantum physics prove useful: perfect copies of unknown states cannot be made, and measurements change the systems they are measuring. These two limitations are combined in such a quantum anti-counterfeiting scheme, making it impossible to copy the identity of the object they are stored in.

The concept of quantum money is, unfortunately, highly impractical, but the same idea has been successfully extended to communications. The idea is straightforward: the act of measuring quantum superposition states alters what you try to measure, so it's possible to detect the presence of an eavesdropper making such measurements. With the correct protocol, such as BB84, it is possible to communicate privately, with that privacy guaranteed by fundamental laws of physics. Quantum communication systems are commercially available today from firms such as Toshiba and ID Quantique. While the implementation is clunky and expensive now, it will become more streamlined and miniaturised, just as transistors have miniaturised over the last 60 years. Improvements to nanoscale fabrication techniques will greatly accelerate the development of quantum-based technologies. And while useful quantum computing still appears to be some way off, its future is very exciting indeed.
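The eavesdropper-detection idea behind BB84 can be illustrated with a purely classical toy simulation: measuring in the wrong basis randomizes a bit, so an intercept-resend attacker leaves a statistical fingerprint. This sketch is my own simplification, not a faithful or secure implementation of the protocol:

```python
import random

random.seed(1)
N = 2000  # number of photons sent

def bb84(eavesdrop: bool) -> float:
    """Return the error rate in the sifted key, with or without an attacker."""
    alice_bits  = [random.randint(0, 1) for _ in range(N)]
    alice_bases = [random.randint(0, 1) for _ in range(N)]  # 0 rectilinear, 1 diagonal
    photons = list(zip(alice_bits, alice_bases))

    if eavesdrop:
        # Eve measures in a random basis and resends; a wrong basis randomizes the bit.
        resent = []
        for bit, basis in photons:
            eve_basis = random.randint(0, 1)
            bit = bit if eve_basis == basis else random.randint(0, 1)
            resent.append((bit, eve_basis))
        photons = resent

    bob_bases = [random.randint(0, 1) for _ in range(N)]
    bob_bits = [bit if basis == b else random.randint(0, 1)
                for (bit, basis), b in zip(photons, bob_bases)]

    # Sifting: keep positions where Alice's and Bob's bases agree, then compare
    # them to estimate the error rate (a real run would sacrifice only a sample).
    sifted = [(a, b) for a, ab, b, bb in zip(alice_bits, alice_bases, bob_bits, bob_bases)
              if ab == bb]
    return sum(a != b for a, b in sifted) / len(sifted)

print(f"error rate, no eavesdropper: {bb84(False):.3f}")
print(f"error rate, with eavesdropper: {bb84(True):.3f}")
```

Without an eavesdropper the sifted keys agree exactly; an intercept-resend attack pushes the error rate toward 25 percent, which the legitimate parties can detect before using the key.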
<urn:uuid:8e133bdc-daa8-43f2-8ae0-2d7fbfe4cd2e>
CC-MAIN-2020-16
https://www.theepochtimes.com/get-used-to-it-quantum-computing-will-bring-immense-processing-possibilities_1741305.html
s3://commoncrawl/crawl-data/CC-MAIN-2020-16/segments/1585370493684.2/warc/CC-MAIN-20200329015008-20200329045008-00220.warc.gz
en
0.937072
1,123
3.546875
4
- Periodic Table of Elements (International Union of Pure and Applied Chemistry)
- Interactive Periodic Table of Elements (Los Alamos National Laboratory)
- Periodic Table (American Chemical Society)

The periodic table is a tabular arrangement of the chemical elements, ordered by their atomic number, electron configuration, and recurring chemical properties, whose structure shows periodic trends. Generally, within one row (period) the elements are metals to the left, and non-metals to the right, with the elements having similar chemical behaviours placed in the same column. Table rows are commonly called periods and columns are called groups. Six groups have accepted names as well as assigned numbers: for example, group 17 elements are the halogens; and group 18 are the noble gases. Also displayed are four simple rectangular areas or blocks associated with the filling of different atomic orbitals. The organization of the periodic table can be used to derive relationships between the various element properties, and also to predict the chemical properties and behaviours of undiscovered or newly synthesized elements.

Russian chemist Dmitri Mendeleev was the first to publish a recognizable periodic table in 1869, developed mainly to illustrate periodic trends of the then-known elements. He also predicted some properties of unidentified elements that were expected to fill gaps within the table. Most of his forecasts proved to be correct. Mendeleev's idea has been slowly expanded and refined with the discovery or synthesis of further new elements and the development of new theoretical models to explain chemical behaviour. The modern periodic table now provides a useful framework for analyzing chemical reactions, and continues to be widely used in chemistry, nuclear physics and other sciences.
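Because each period ends at a noble gas, an element's row in the table can be derived from its atomic number alone. A minimal illustrative sketch (the function name and structure are my own):

```python
# Atomic number of the noble gas closing each period: He, Ne, Ar, Kr, Xe, Rn, Og.
NOBLE_GAS_Z = [2, 10, 18, 36, 54, 86, 118]

def period_of(atomic_number: int) -> int:
    """Return the periodic-table row (period) containing the given atomic number."""
    if not 1 <= atomic_number <= 118:
        raise ValueError("known elements run from Z = 1 to Z = 118")
    for period, last_z in enumerate(NOBLE_GAS_Z, start=1):
        if atomic_number <= last_z:
            return period

print(period_of(1))    # hydrogen  -> period 1
print(period_of(26))   # iron      -> period 4
print(period_of(118))  # oganesson -> period 7
```

Group assignment is less uniform (the s-, p-, d- and f-blocks fill in different orders), which is why real periodic-table software stores per-element data rather than computing it.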
All the elements from atomic numbers 1 (hydrogen) through 118 (oganesson) have been either discovered or synthesized, completing the first seven rows of the periodic table. The first 98 elements exist in nature, although some are found only in trace amounts and others were synthesized in laboratories before being found in nature. Elements 99 to 118 have only been synthesized in laboratories or nuclear reactors. The synthesis of elements having higher atomic numbers is currently being pursued: these elements would begin an eighth row, and theoretical work has been done to suggest possible candidates for this extension. Numerous synthetic radionuclides of naturally occurring elements have also been produced in laboratories. — Wikipedia
<urn:uuid:11afcee0-196a-4acc-9da4-27a73e7a8e29>
CC-MAIN-2020-16
https://cosma.org/periodic-table/
s3://commoncrawl/crawl-data/CC-MAIN-2020-16/segments/1585370506580.20/warc/CC-MAIN-20200402014600-20200402044600-00421.warc.gz
en
0.935718
1,338
3.8125
4
Quantum theory is the theoretical basis of modern physics that explains the nature and behavior of matter and energy on the atomic and subatomic level. The nature and behavior of matter and energy at that level is sometimes referred to as quantum physics and quantum mechanics.

In 1900, physicist Max Planck presented his quantum theory to the German Physical Society. Planck had sought to discover the reason that radiation from a glowing body changes in color from red, to orange, and, finally, to blue as its temperature rises. He found that by making the assumption that energy existed in individual units in the same way that matter does, rather than just as a constant electromagnetic wave - as had been formerly assumed - and was therefore quantifiable, he could find the answer to his question.

The existence of these units became the first assumption of quantum theory. Planck wrote a mathematical equation involving a new constant - now known as Planck's constant - to represent these individual units of energy, which he called quanta. The equation explained the phenomenon very well; Planck found that at certain discrete temperature levels (exact multiples of a basic minimum value), energy from a glowing body will occupy different areas of the color spectrum. Planck assumed there was a theory yet to emerge from the discovery of quanta, but, in fact, their very existence implied a completely new and fundamental understanding of the laws of nature. Planck won the Nobel Prize in Physics for his theory in 1918, but developments by various scientists over a thirty-year period all contributed to the modern understanding of quantum theory.

The Development of Quantum Theory

- In 1900, Planck made the assumption that energy was made of individual units, or quanta.
- In 1905, Albert Einstein theorized that not just the energy, but the radiation itself was quantized in the same manner.
- In 1924, Louis de Broglie proposed that there is no fundamental difference in the makeup and behavior of energy and matter; on the atomic and subatomic level either may behave as if made of either particles or waves. This theory became known as the principle of wave-particle duality: elementary particles of both energy and matter behave, depending on the conditions, like either particles or waves.
- In 1927, Werner Heisenberg proposed that precise, simultaneous measurement of two complementary values - such as the position and momentum of a subatomic particle - is impossible. Contrary to the principles of classical physics, their simultaneous measurement is inescapably flawed; the more precisely one value is measured, the more flawed will be the measurement of the other value. This theory became known as the uncertainty principle, which prompted Albert Einstein's famous comment, "God does not play dice."

The Copenhagen Interpretation and the Many-Worlds Theory

The two major interpretations of quantum theory's implications for the nature of reality are the Copenhagen interpretation and the many-worlds theory. Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be (for example, a wave or a particle), but that it cannot be assumed to have specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality does not exist. This translates to a principle called superposition that claims that while we do not know what the state of any object is, it is actually in all possible states simultaneously, as long as we don't look to check.

To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrödinger's cat. First, we have a living cat and place it in a thick lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box.
We do not know if the cat is alive or if the cyanide capsule has broken and the cat has died. Since we do not know, the cat is both dead and alive, according to quantum law - in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The second interpretation of quantum theory is the many-worlds (or multiverse) theory. It holds that as soon as a potential exists for any object to be in any state, the universe of that object transmutes into a series of parallel universes equal to the number of possible states in which the object can exist, with each universe containing a unique single possible state of that object. Furthermore, there is a mechanism for interaction between these universes that somehow permits all states to be accessible in some way and for all possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among the scientists who have expressed a preference for the many-worlds theory.

Quantum Theory's Influence

Although scientists throughout the past century have balked at the implications of quantum theory - Planck and Einstein among them - the theory's principles have repeatedly been supported by experimentation, even when the scientists were trying to disprove them. Quantum theory and Einstein's theory of relativity form the basis for modern physics. The principles of quantum physics are being applied in an increasing number of areas, including quantum optics, quantum chemistry, quantum computing, and quantum cryptography.

See Brian Greene's introduction to quantum theory on Nova.
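The two quantitative statements in this history can be written compactly. Planck's quantization relates the energy of one quantum to the radiation frequency, and Heisenberg's uncertainty principle bounds how precisely position and momentum can be known together:

```latex
% Planck's quantum of energy: radiation of frequency \nu is exchanged in
% discrete units (quanta), with h Planck's constant.
E = h\nu

% Heisenberg's uncertainty principle for position and momentum,
% where \hbar = h / 2\pi is the reduced Planck constant.
\Delta x \, \Delta p \ \ge\ \frac{\hbar}{2}
```

The second relation makes precise the claim above: shrinking the uncertainty in position necessarily inflates the uncertainty in momentum, and vice versa.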
<urn:uuid:08f877dc-5fbf-4b60-bd10-2313df806475>
CC-MAIN-2020-16
https://whatis.techtarget.com/definition/quantum-theory
s3://commoncrawl/crawl-data/CC-MAIN-2020-16/segments/1585370504930.16/warc/CC-MAIN-20200331212647-20200401002647-00462.warc.gz
en
0.954545
1,083
3.796875
4
Donuts, math, and superdense teleportation of quantum information

Putting a hole in the center of the donut—a mid-nineteenth-century invention—allows the deep-fried pastry to cook evenly, inside and out. As it turns out, the hole in the center of the donut also holds answers for a more efficient and reliable type of quantum information teleportation, a critical goal for quantum information science.

Quantum teleportation is a method of communicating information from one location to another without moving the physical matter to which the information is attached. Instead, the sender (Alice) and the receiver (Bob) share a pair of entangled elementary particles—in this experiment, photons, the smallest units of light—that transmit information through their shared quantum state. In simplified terms, Alice encodes information in the form of the quantum state of her photon. She then sends a key to Bob over traditional communication channels, indicating what operation he must perform on his photon to prepare the same quantum state, thus teleporting the information.

Quantum teleportation has been achieved by a number of research teams around the globe since it was first theorized in 1993, but current experimental methods require extensive resources and/or only work successfully a fraction of the time. Now, by taking advantage of the mathematical properties intrinsic to the shape of a donut—or torus, in mathematical terminology—a research team led by physicist Paul Kwiat of the University of Illinois at Urbana-Champaign has made great strides by realizing "superdense teleportation". This new protocol, developed by coauthor physicist Herbert Bernstein of Hampshire College in Amherst, MA, effectively reduces the resources and effort required to teleport quantum information, while at the same time improving the reliability of the information transfer.
With this new protocol, the researchers have experimentally achieved 88 percent transmission fidelity, twice the classical upper limit of 44 percent. The protocol uses pairs of photons that are "hyperentangled"—simultaneously entangled in more than one state variable, in this case in polarization and in orbital angular momentum—with a restricted number of possible states in each variable. In this way, each photon can carry more information than in earlier quantum teleportation experiments. At the same time, this method makes Alice's measurements and Bob's transformations far more efficient than their corresponding operations in quantum teleportation: the number of possible operations being sent to Bob as the key has been reduced, hence the term "superdense".

Kwiat explains, "In classical computing, a unit of information, called a bit, can have only one of two possible values—it's either a zero or a one. A quantum bit, or qubit, can simultaneously hold many values, arbitrary superpositions of 0 and 1 at the same time, which makes faster, more powerful computing systems possible. So a qubit could be represented as a point on a sphere, and to specify what state it is in, one would need longitude and latitude. That's a lot of information compared to just a 0 or a 1."

"What makes our new scheme work is a restrictive set of states. The analog would be, instead of using a sphere, we are going to use a torus, or donut shape. A sphere can only rotate on an axis, and there is no way to get an opposite point for every point on a sphere by rotating it—because the axis points, the north and the south, don't move. With a donut, if you rotate it 180 degrees, every point becomes its opposite. Instead of axis points you have a donut hole.
Another advantage: the donut shape actually has more surface area than the sphere, mathematically speaking—this means it has more distinct points that can be used as encoded information.” Lead author, Illinois physics doctoral candidate Trent Graham, comments, “We are constrained to sending a certain class of quantum states called ‘equimodular’ states. We can deterministically perform operations on this constrained set of states, which are impossible to perfectly perform with completely general quantum states. Deterministic describes a definite outcome, as opposed to one that is probabilistic. With existing technologies, previous photonic quantum teleportation schemes either cannot work every time or require extensive experimental resources. Our new scheme could work every time with simple measurements.” This research team is part of a broader collaboration that is working toward realizing quantum communication from a space platform, such as the International Space Station, to an optical telescope on Earth. The collaboration—Kwiat, Graham, Bernstein, physicist Jungsang Kim of Duke University in Durham, NC, and scientist Hamid Javadi of NASA’s Jet Propulsion Laboratory in Pasadena, CA—recently received funding from NASA Headquarters’ Space Communication and Navigation program (with project directors Badri Younes and Barry Geldzahler) to explore the possibility. “It would be a stepping stone toward building a quantum communications network, a system of nodes on Earth and in space that would enable communication from any node to any other node,” Kwiat explains.
“For this, we’re experimenting with different quantum state properties that would be less susceptible to air turbulence disruptions.” The team’s recent experimental findings are published in the May 28, 2015 issue of Nature Communications, and represent the collaborative effort of Kwiat, Graham, and Bernstein, as well as physicist Tzu-Chieh Wei of State University of New York at Stony Brook, and mathematician Marius Junge of the University of Illinois. This research is funded by NSF Grant No. PHY-0903865, the NASA NIAC Program, and NASA Grant No. NNX13AP35A. It is partially supported by National Science Foundation Grants No. DMS-1201886, No. PHY-1314748, and No. PHY-1333903.
______________________
Contact: Siv Schwink, communications coordinator, Department of Physics, 217/300-2201.
Paul Kwiat, Department of Physics, University of Illinois at Urbana-Champaign.
Image by Precision Graphics, copyright Paul Kwiat, University of Illinois at Urbana-Champaign.
Source: http://engineering.illinois.edu/news/article/11151?
Particle accelerator technology could solve one of the most vexing problems in building quantum computers Last year, researchers at Fermilab received over $3.5 million for projects that delve into the burgeoning field of quantum information science. Research funded by the grant runs the gamut, from building and modeling devices for possible use in the development of quantum computers to using ultracold atoms to look for dark matter. For their quantum computer project, Fermilab particle physicist Adam Lyon and computer scientist Jim Kowalkowski are collaborating with researchers at Argonne National Laboratory, where they'll be running simulations on high-performance computers. Their work will help determine whether instruments called superconducting radio-frequency cavities, also used in particle accelerators, can solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits. "Fermilab has pioneered making superconducting cavities that can accelerate particles to an extremely high degree in a short amount of space," said Lyon, one of the lead scientists on the project. "It turns out this is directly applicable to a qubit." Researchers in the field have worked on developing successful quantum computing devices for the last several decades; so far, it's been difficult. This is primarily because quantum computers have to maintain very stable conditions to keep qubits in a quantum state called superposition. Classical computers use a binary system of zeroes and ones—called bits—to store and analyze data. Eight bits combined make one byte of data, which can be strung together to encode even more information. (There are about 31.8 million bytes in the average three-minute digital song.) In contrast, quantum computers aren't constrained by a strict binary system. Rather, they operate on a system of qubits, each of which can take on a continuous range of states during computation. 
Just as an electron orbiting an atomic nucleus doesn't have a discrete location but rather occupies all positions in its orbit at once in an electron cloud, a qubit can be maintained in a superposition of both zero and one. Since there are two possible states for any given qubit, a pair doubles the amount of information that can be manipulated: 2² = 4. Use four qubits, and that amount of information grows to 2⁴ = 16. With this exponential increase, it would take only 300 entangled qubits to encode more information than there is matter in the universe. Qubits don't represent data in the same way as bits. Because qubits in superposition are both zero and one at the same time, they can similarly represent all possible answers to a given problem simultaneously. This is called quantum parallelism, and it's one of the properties that makes quantum computers so much faster than classical systems. The difference between classical computers and their quantum counterparts could be compared to a situation in which there is a book with some pages randomly printed in blue ink instead of black. The two computers are given the task of determining how many pages were printed in each color. "A classical computer would go through every page," Lyon said. Each page would be marked, one at a time, as either being printed in black or in blue. "A quantum computer, instead of going through the pages sequentially, would go through them all at once." Once the computation was complete, a classical computer would give you a definite, discrete answer. If the book had three pages printed in blue, that's the answer you'd get. "But a quantum computer is inherently probabilistic," Kowalkowski said. This means the data you get back isn't definite. In a book with 100 pages, the data from a quantum computer wouldn't be just three. It also could give you, for example, a one percent chance of having three blue pages or a one percent chance of 50 blue pages.
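The doubling rule above is easy to verify directly: a register of n entangled qubits spans 2ⁿ basis states. A quick sketch of the arithmetic (the ~10⁸⁰ figure for atoms in the observable universe is a common ballpark estimate, assumed here as a stand-in for the article's "matter in the universe"):

```python
def n_states(num_qubits):
    """Basis states spanned by a register of entangled qubits: 2**n."""
    return 2 ** num_qubits

assert n_states(2) == 4    # a pair doubles the information: 2^2 = 4
assert n_states(4) == 16   # four qubits: 2^4 = 16

# 300 entangled qubits already exceed ~10^80, a common estimate of the
# number of atoms in the observable universe (assumption, not from the article).
print(n_states(300) > 10 ** 80)  # True
```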
An obvious problem arises when trying to interpret this data. A quantum computer can perform incredibly fast calculations using parallel qubits, but it spits out only probabilities, which, of course, isn't very helpful—unless, that is, the right answer could somehow be given a higher probability. Consider two water waves that approach each other. As they meet, they may constructively interfere, producing one wave with a higher crest. Or they may destructively interfere, canceling each other so that there's no longer any wave to speak of. Qubit states can also act as waves, exhibiting the same patterns of interference, a property researchers can exploit to identify the most likely answer to the problem they're given. "If you can set up interference between the right answers and the wrong answers, you can increase the likelihood that the right answers pop up more than the wrong answers," Lyon said. "You're trying to find a quantum way to make the correct answers constructively interfere and the wrong answers destructively interfere." When a calculation is run on a quantum computer, the same calculation is run multiple times, and the qubits are allowed to interfere with one another. The result is a distribution curve in which the correct answer is the most frequent response.
Listening for signals above the noise
In the last five years, researchers at universities, government facilities and large companies have made encouraging advancements toward the development of a useful quantum computer. Last year, Google announced that it had performed calculations on their quantum processor called Sycamore in a fraction of the time it would have taken the world's largest supercomputer to complete the same task. Yet the quantum devices that we have today are still prototypes, akin to the first large vacuum tube computers of the 1940s. "The machines we have now don't scale up much at all," Lyon said.
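The water-wave analogy above maps directly onto qubit amplitudes: outcome probabilities are the squared magnitudes of summed complex amplitudes, so paths that arrive in phase reinforce and paths that arrive out of phase cancel. A toy illustration of that rule (my own sketch, not code from any quantum processor):

```python
def prob(*amplitudes):
    """Probability of an outcome reachable via several paths: |sum of amplitudes|^2."""
    return abs(sum(amplitudes)) ** 2

a = 0.5  # each of two paths alone would give probability 0.25
constructive = prob(a, a)   # in phase: amplitudes add, probability 1.0
destructive = prob(a, -a)   # out of phase: amplitudes cancel, probability 0.0
print(constructive, destructive)
```

Note that constructive interference quadruples the single-path probability rather than doubling it—exactly the lever quantum algorithms use to make correct answers "pop up" more often than wrong ones.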
There are still a few hurdles researchers have to overcome before quantum computers become viable and competitive. One of the largest is finding a way to keep delicate qubit states isolated long enough for them to perform calculations. If a stray photon—a particle of light—from outside the system were to interact with a qubit, its wave would interfere with the qubit's superposition, essentially turning the calculations into a jumbled mess—a process called decoherence. While the dilution refrigerators used to cool these systems do a moderately good job at keeping unwanted interactions to a minimum, they can do so only for a fraction of a second. "Quantum systems like to be isolated," Lyon said, "and there's just no easy way to do that." That is where Lyon and Kowalkowski's simulation work comes in. If the qubits can't be kept cold enough to maintain an entangled superposition of states, perhaps the devices themselves can be constructed in a way that makes them less susceptible to noise. It turns out that superconducting cavities made of niobium, normally used to propel particle beams in accelerators, could be the solution. These cavities need to be constructed very precisely and operate at very low temperatures to efficiently propagate the radio waves that accelerate particle beams. Researchers theorize that by placing quantum processors in these cavities, the qubits will be able to interact undisturbed for seconds rather than the current record of milliseconds, giving them enough time to perform complex calculations. Qubits come in several different varieties. They can be created by trapping ions within a magnetic field or by using nitrogen atoms surrounded by the carbon lattice formed naturally in crystals. The research at Fermilab and Argonne will be focused on qubits made from photons. Lyon and his team have taken on the job of simulating how well radio-frequency cavities are expected to perform.
By carrying out their simulations on high-performance computers, known as HPCs, at Argonne National Laboratory, they can predict how long photon qubits can interact in this ultralow-noise environment and account for any unexpected interactions. Researchers around the world have used open-source software for desktop computers to simulate different applications of quantum mechanics, providing developers with blueprints for how to incorporate the results into technology. The scope of these programs, however, is limited by the amount of memory available on personal computers. In order to simulate the exponential scaling of multiple qubits, researchers have to use HPCs. "Going from one desktop to an HPC, you might be 10,000 times faster," said Matthew Otten, a fellow at Argonne National Laboratory and collaborator on the project. Once the team has completed their simulations, the results will be used by Fermilab researchers to help improve and test the cavities for acting as computational devices. "If we set up a simulation framework, we can ask very targeted questions on the best way to store quantum information and the best way to manipulate it," said Eric Holland, the deputy head of quantum technology at Fermilab. "We can use that to guide what we develop for quantum technologies."
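The memory argument for HPCs above is concrete: simulating an n-qubit register as a dense state vector requires storing 2ⁿ complex amplitudes. A back-of-the-envelope sketch (the 16-byte complex128 amplitude size is my assumption, not a figure from the project):

```python
def statevector_bytes(num_qubits, bytes_per_amplitude=16):
    """Memory for a dense n-qubit state vector at complex128 precision."""
    return (2 ** num_qubits) * bytes_per_amplitude

print(statevector_bytes(30) // 2**30)  # 30 qubits -> 16 GiB, the edge of a desktop
print(statevector_bytes(45) // 2**40)  # 45 qubits -> 512 TiB, firmly HPC territory
```

Every added qubit doubles the memory bill, which is why desktop simulators stall in the low tens of qubits while HPC installations can push somewhat further.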
Rachel Goldman’s lab is working to produce “designer alloys” with carefully tailored electrical and light-absorbing properties. These materials could one day be used to build solar cells with double the efficiency of the flat-panel silicon cells that dot rooftops today. The new cells, called concentrator photovoltaics, use gallium arsenide semiconductors instead of the silicon-based semiconductors used in today’s cells. Gallium arsenide could move us toward the utility-scale solar arrays we’ll need to make solar energy a large part of our electrical infrastructure. In her most recent paper, Goldman and her collaborators advanced the science by figuring out how incorporating small fractions of nitrogen and bismuth in gallium arsenide semiconductors affects their structure and light-absorbing properties, creating a new map for bandgap engineering of designer semiconductor alloys. The advance could accelerate the development of concentrator photovoltaics, and could also lead to advances in semiconductor lasers and quantum computing. Goldman is a professor of materials science and engineering. We sat down with her recently to learn more about her work.
How is your “magic ratio” useful in solar cells?
Concentrator photovoltaics will depend on the development of alloys that are safer and less expensive than those currently used in gallium arsenide semiconductors. In our earlier research, we developed alloys that use a combination of nitrogen and bismuth. Since then, we’ve been working to develop a more complete understanding of exactly how the nitrogen-bismuth combination functions, and how changing the proportion of those two elements affects the alloy’s overall properties. That research led us to the “magic ratio”—the precise proportion of bismuth to nitrogen that works best with a gallium arsenide substrate. We’ve found that by slightly tweaking that ratio within a certain range, we can control what bandwidth of light the alloy absorbs.
What’s the main hurdle standing in the way of concentrator photovoltaics?
Turning “near-infrared” light into electricity is one big challenge—this is light that’s just outside the visible spectrum. A gallium arsenide solar cell consists of several thin layers of metal alloy sprayed onto a gallium arsenide substrate. It’s these thin layers that turn light into electrical charge. Each layer absorbs only a specific wavelength of light. A wavelength that slips through one layer can be caught by the next. The “magic ratio” should help researchers dial in the exact mix of an alloy to absorb whatever bandwidth of light they choose.
How were you able to do what others couldn’t?
We had to start by acknowledging that the conventional way of thinking about alloy composition doesn’t work for bismuth-nitrogen alloys. Making an alloy out of individual atoms is a little like filling a box with a mix of differently sized marbles. If you know the sizes of the marbles and the size of the box, you can calculate the combination of marbles that will fill the box exactly. Researchers can calculate the composition of most alloys by using x-ray diffraction to measure the “box” and then calculating the combination of atoms that fits. That doesn’t work with bismuth and nitrogen. Bismuth is very large and nitrogen is very small, so it’s more like mixing sand and marbles. It’s hard to measure the size of a single grain of sand and even harder to predict how it will flow around all those marbles. So we worked with labs in New Mexico, Poland and Romania, as well as here at U-M, to develop a series of measurements that would each solve part of the puzzle. Then we brought them all together to precisely determine the ratio of nitrogen to bismuth in a wide range of sample alloys, and how that ratio affects light absorption properties.
Where else might these kinds of alloys be useful?
A better understanding of nitrogen-bismuth alloys could help us build more efficient infrared lasers, which are widely used in fiber-optic communications and in the military. They could also be used in quantum computing, to build transistors that use the spin of electrons as a way to store information.
When will the results of this research go into widespread use?
There’s still a lot of progress to be made. But this research opens the door to a better understanding of exactly how these alloys work and how to make them do what we want, in solar power and elsewhere.
Goldman’s most recent paper is titled “Mapping the composition-dependence of the energy bandgap of GaAsNBi alloys.” It is published in the August 23, 2019 issue of Applied Physics Letters. U-M graduate researcher Jordan Occena, T. Jen and J.W. Mitchell are also authors on the paper. An earlier, related paper is titled “Bi-enhanced N incorporation in GaAsNBi alloys.” It was published in the June 15, 2017 issue of Applied Physics Letters.
The concept of quantum computing was proposed in the 1980s. By capturing the uncertainty of molecules at absolute zero (-273℃), scientists succeeded in creating a computing apparatus that brings us to the next level. In 2017, such studies are about to bear fruit. D-Wave, the first company that aims for the commercial use of this technology, exhibited satisfying benchmark results showing that their quantum computer solved certain problems one million times faster than a conventional computer. In March, IBM announced a plan to bring a quantum computer, the “IBM Q” system, to market in a few years. They note, “quantum computers will deliver solutions to important problems where patterns cannot be seen because the data doesn’t exist and the possibilities that you need to explore to get to the answer are too enormous to ever be processed by classical computers.” (IBM, 2017) Conventional, or “classic” computers are based on transistors, in which each “bit,” the smallest unit of memory, has a binary state, a “0” or a “1.” Although computers have become significantly more powerful, smaller, and cheaper over the last several decades, they cannot flee from the fetters of the binary system. Quantum computers, in contrast, store information in a “qubit,” which can be represented by an atom in ambiguous states, say, a “0” or “1” or “0 and 1.” And, most importantly, those computers are able to calculate such different universes simultaneously, thus greatly compressing the computation time. I enjoy Japanese chess —shogi— in my free time. It has more complex rules than Western chess does: in shogi, on a slightly larger nine-by-nine board, we can promote and strengthen a piece, capture an opponent’s piece, and redeploy it under our own control. Specifically, shogi has 10 to the 71st power possible states, whereas Western chess has 10 to the 47th. But what do these numbers mean in terms of computation? I assume you have once played tic-tac-toe, which has 765 possible positions.
If you have a computer that calculates the best move for each position respectively in one millisecond (a thousandth of a second), you can obtain the perfect answer immediately because 765 milliseconds is shorter than a second. Chess and shogi, however, require an astronomical amount of time. In fact, it is more than astronomical. 10⁴⁷ milliseconds is equivalent to roughly 10³⁶ (a 1 followed by 36 zeros) years. The earth is barely 4.5 billion years old. The whole universe is said to have a history of fewer than 10¹¹ years. Our planet is a baby compared to the time necessary to solve the board game completely. Like chess and shogi, a certain type of problem is theoretically able to be solved, but practically unsolvable because of time. Were you to purchase thousands of cutting-edge supercomputers, you would likely be able to remove some zeros from the number of “years for computing” but would never see the result while you are alive. Therefore, computer scientists have devoted their time to inventing algorithmic devices and estimating the result by employing statistical approaches. But what if a new technology shifts paradigms and changes the laws of the universe? Quantum computing is, of course, not exclusive to board games. It will unravel the mystery of our DNA, helping invent more effective medicine. The technology will augment artificial intelligence, which would read our subtle nuances rather than “yes” or “no.” Quantum computers will also enable us to decipher any transaction on the Internet at a moment’s notice. All of the online encryption technology is underpinned by “practically” irreversible keys, so named because of classic computing limitations. Once this assumption is overturned, our privacy and a country’s cyber-security will be vulnerable. I am quite sure that this will become a fierce, controversial issue among politicians around the world. Some take another view on this technology.
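The back-of-the-envelope conversion above is easy to check (the one-millisecond-per-position rate is the author's assumption, kept here):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 seconds

def years_at_one_ms_per_position(positions):
    """Years needed to evaluate every position at one position per millisecond."""
    return positions / 1000 / SECONDS_PER_YEAR

print(years_at_one_ms_per_position(765))     # tic-tac-toe: ~2.4e-8 years, under a second
print(years_at_one_ms_per_position(10**47))  # chess: ~3.2e36 years
```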
In the Guardian’s article “Has the age of quantum computing arrived?”, MIT professor Scott Aaronson, who has dubbed himself “Chief D-Wave Sceptic”, says, “there was no reason to believe they played a causal role or that they were faster than a classical computer.” (Anthony, 2016) Quantum computing is still in the early stages of development, and the company D-Wave has long been “accused of hype and exaggeration.” (Anthony, 2016) Also, like the decryption issue, ethics matters. Quantum computing is so powerful that we should handle it properly, ethically, and openly. Without exception, technology is a double-edged sword. Is quantum computing a savior or a devil? Or something ambiguous between them, like us, human beings? The answer is not a “0” or “1.”
Works Cited
IBM. “IBM Unveils Roadmap for Commercial.” IBM News Room - 2017-03-06 IBM Building First Universal Quantum Computers for Business and Science - United States. IBM, 06 Mar. 2017. Web. 07 Mar. 2017.
Anthony, Andrew. “Has the Age of Quantum Computing Arrived?” The Observer. Guardian News and Media, 22 May 2016. Web. 07 Mar. 2017.
Today’s date — 20 May 2019 — marks a major milestone in measurement history. For the first time, the definitions of the base units that comprise the International System of Units (SI) are entirely derived from constants of nature like the speed of light and Avogadro’s number instead of human-made artifacts. The kilogramme, the SI base unit that held out the longest, will now be defined in terms of the Planck constant rather than the platinum-iridium cylinder known as ‘Le Grand K’, or ‘The Big K’. Physical constants, unlike physical objects, are inherently stable and do not experience minute fluctuations in their properties. As a result, the definitions of all seven SI base units — the second, the metre, the kilogramme, the ampere, the kelvin, the mole, and the candela — will remain accurate and unchanging for their countless applications in science, manufacturing, commerce, and other industries where near-perfect calibration is required. The End of Big K So how was Le Grand K, a fixture of scientific measurement for nearly 130 years, finally dethroned? The redefinition of the kilogramme only became possible with the groundbreaking discovery of the quantum Hall effect by German physicist Klaus von Klitzing in 1980. The scientific community immediately recognised the significance of his findings, and he went on to receive the Nobel Prize in Physics a mere five years later. “At the time of the discovery of the effect, I never believed that this has some influence on the kilogramme,” said von Klitzing during his 2016 Lindau lecture, which focused on how his Nobel Prize-awarded work will contribute to the new and improved SI. The Nobel Laureate will also participate in the upcoming 2019 Lindau Meeting. The quantum Hall effect is a quantum-mechanical version of the conventional Hall effect, first discovered by American physicist Edwin Hall in 1879.
While completing his graduate studies at Johns Hopkins University, he noticed that when a magnetic field was applied perpendicularly to a thin metal sheet through which an electric current is flowing, a small voltage appeared from one side of the sheet to the other. The magnetic field exerts a force on the moving electric charges, and the accumulation of charge on one side of the conductor leaves the other side oppositely charged, leading to the observed potential difference. Discovering the Quantum Hall Effect Von Klitzing wanted to observe the Hall effect in more extreme conditions, at very low temperatures with a much stronger magnetic field. He performed experiments with two-dimensional electron systems, in which electrons are forced to move within an extremely thin layer, and smoothly varied the magnetic field. Surprisingly, he found that the observed Hall resistance — the ratio of the created voltage to the current — changed in discrete steps with exceptionally high accuracy. In other words, the Hall resistance was exactly quantised. The resistance quantum, h/e², where e is the electron charge and h is the Planck constant, is now known as the ‘von Klitzing constant’. Because of its extraordinarily high precision, the von Klitzing constant has been used in resistance calibrations worldwide since 1990. In combination with the Josephson constant (KJ = 2e/h), which originates from another electrical phenomenon called the Josephson effect, the von Klitzing constant (RK = h/e²) can be used in experiments to connect mass to the Planck constant. In 1999, scientists Peter Mohr and Barry Taylor at the National Institute of Standards and Technology proposed the redefinition of the kilogramme with such a method, motivated by recent progress in the development of the Kibble balance.
Also known as a ‘watt balance’, this device precisely measures mass through the use of electrical measurements. “These two constants were the origin of the change we expect in the future for our SI system,” said von Klitzing in 2016, before the revised definitions were formally accepted. “The Josephson effect and the quantum Hall effect are the driving force for the expected change in the SI system in 2018.” The New and Improved SI And as predicted, in November 2018, representatives from more than 60 countries voted to redefine the kilogramme in terms of the Planck constant during the 26th meeting of the General Conference on Weights and Measures in France. The new SI was chosen to come into effect today, World Metrology Day 2019, whose theme is “The International System of Units – Fundamentally better”. The date itself, 20th May, refers back to the signing of the Metre Convention in 1875 by representatives of 17 nations, which created the International Bureau of Weights and Measures (BIPM). While most of us won’t notice a difference in everyday life, the new SI improves the precision of measurements for nanotechnology, communications, security, medicine, and emerging technologies such as quantum computing. In other words, the ‘fundamentally better’ SI might not impact you and me directly, but it will provide greater stability and accuracy to countless applications that have a significant effect on society. Additional information: During a lecture at the upcoming 69th Lindau Nobel Laureate Meeting, Klaus von Klitzing will talk about the “Quantum Hall Effect and the New SI System”. Read an abstract of his lecture here.
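Because h and e are exactly defined in the revised SI, both constants named above can be computed to full precision. A quick numerical check (the check itself is mine; the two input values are the defined 2019 SI constants):

```python
h = 6.62607015e-34   # Planck constant in J*s, exact by definition in the new SI
e = 1.602176634e-19  # elementary charge in C, exact by definition in the new SI

R_K = h / e**2   # von Klitzing constant (ohms), basis of resistance calibration
K_J = 2 * e / h  # Josephson constant (Hz per volt)

print(R_K)  # ~25812.807 ohms
print(K_J)  # ~4.836e14 Hz/V
```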
Artificial intelligence (AI) is intelligence exhibited by machines. What follows is a cluster of topics in or related to AI.
Types of AI
Artificial intelligence is classified into types based on the degree to which an AI system can replicate or go beyond human capabilities. One classification system uses four types: reactive machines, limited memory machines, theory of mind and self-aware AI. Another classification divides AI into two divisions: Weak AI or Narrow AI, and Strong AI or General AI or Artificial General Intelligence. Different branches of AI are referred to by the method used to achieve AI.
- Weak or narrow AI, also called artificial narrow intelligence
- Artificial superintelligence (ASI)
- Reactive machines: These do not have past memory and cannot use past information for future actions
- Limited memory machines: These can use past experiences to inform future decisions
- Theory of mind: In humans, this is the ability to infer other people’s thoughts, desires and beliefs, and to understand that they may be different from your own. Theory-of-mind-level AI would be able to interact socially with people
- Self-aware AI (hypothetical, not yet realized)
Branches of AI
Machine learning is a technique for realizing AI; it is an application of AI in which machines are given access to data from which they learn for themselves.
Machine learning tools
Tools, algorithms, libraries and interfaces for machine learning.
Artificial neural network (ANN) processing devices can be algorithms or actual hardware that are loosely modeled after the neuronal structure of the mammalian cerebral cortex. Neural networks are used in the branch of machine learning called deep learning. The following are types of neural networks used in machine learning, as well as topics associated with neural networks.
Deep learning frameworks
A deep learning framework is an interface, library or tool which allows users to build deep learning models more easily and quickly, without getting into the details of the underlying algorithms. These libraries are useful for individuals who want to implement deep learning techniques but don’t have robust fluency in back-propagation, linear algebra or computer math; they provide pre-written code for functions and modules that can be reused for deep learning training for different purposes.
- Deep Q-Learning: Algorithm in deep reinforcement learning
- Deep Voice 1: Trains deep neural networks to learn from large amounts of data and simple features
Reinforcement learning is an area of machine learning focusing on how machines and software agents react in a specific context to maximize performance and achieve a reward known as the reinforcement signal. The following are algorithms, tools and research topics related to reinforcement learning.
Supervised learning is a type of machine learning in which data is fully labelled and algorithms learn to approximate a mapping function well enough that they can accurately predict output variables given new input data. This section contains supervised learning techniques. For example, gradient descent is a technique to optimize neural networks in supervised machine learning; gradient descent optimization algorithms are used to speed up the learning process of deep neural networks. Another example, the support vector machine (SVM), is a discriminative classifier formally defined by a separating hyperplane, used for regression and classification tasks.
A decision tree is a simple representation for classifying samples. Decision tree algorithms are used in supervised machine learning, where data is continuously split according to a parameter.
- Classification and regression trees (CART)

Unsupervised learning is a branch of machine learning that tries to make sense of data that has not been labeled, classified or categorized, by extracting features and patterns on its own. The following are methods used in unsupervised machine learning.

In unsupervised machine learning, clustering is the process of grouping similar entities together in order to find similarities among data points and group similar data points together.

Ensemble methods are meta-algorithms that combine several machine learning techniques into one predictive model. The purpose is to decrease variance (bagging), decrease bias (boosting) or improve predictions (stacking).

Machine learning classification problems often involve too many factors or variables, also called features. When most of the features are correlated or redundant, dimensionality reduction algorithms are used to reduce the number of random variables: certain features are selected and others are extracted.

Parameterized statistical models

Machine learning models are parameterized so their behavior can be tuned for a given problem. Noise-contrastive estimation (NCE) is an estimation principle for parameterized statistical models: a way of learning a data distribution by comparing it against a defined noise distribution. The technique casts an unsupervised problem as a supervised logistic regression problem. NCE is often used to train neural language models in place of maximum likelihood estimation.

- Noise-contrastive estimation

Computer vision is the ability of artificially intelligent systems to "see" like humans. In the computer vision field, machines are developed that automate tasks requiring visual cognition. Deep learning and artificial neural networks are used to develop computer vision. The following are topics related to computer vision, as well as tools and libraries.
Companies developing or selling computer vision products are listed under the Computer vision subheading in the AI applications and companies section.

Natural language processing

Natural language processing (NLP) is a branch of AI that helps computers understand, interpret and manipulate human language. The following are tools and topics related to NLP. Companies developing or selling NLP applications are found in the AI applications and companies section under Natural language processing.

Advances in deep learning are expected to increase understanding of quantum mechanics, and it is thought that quantum computers will accelerate AI. Quantum computers have the potential to surpass conventional ones in machine learning tasks such as data pattern recognition. The following are topics, companies and technologies that link quantum computing and AI.

Semantic computing deals with the derivation, description, integration and use of semantics (meaning, context and intention) for resources including data, documents, tools, devices, processes and people. Semantic computing includes analytics, semantic description languages, integration of data and services, interfaces and applications. In AI, semantic computing involves the creation of ontologies that are combined with machine learning to help computers create new knowledge. Semantic technology helps cognitive computing extract useful information from unstructured data for pattern recognition and natural language processing.

- The IEEE Computer Society Technical Committee on Semantic Computing (TCSC)

IoT (Internet of Things)

The Internet of Things (IoT) refers to objects that connect and transfer data via the internet, and to the sharing of information between devices. IoT-based smart systems generate large volumes of data, including sensor data valuable to researchers in healthcare, bioinformatics, information science, policy and decision making, government and enterprise.
AI can be combined with machine learning for data analysis and prediction.

Artificial life and evolutionary computation

While some lines of AI research aim to simulate the human brain, the artificial life, or animate, approach is concerned with the conception and construction of artificial animals, as simulations or actual robots. It aims to explain how certain faculties of the human brain might be inherited from the simplest adaptive abilities of animals. Evolutionary computation is a generic optimization technique that draws inspiration from the theory of evolution by natural selection.

AI applications and companies

The following are companies using AI to develop products or producing AI software for various applications. AI programs designed for specific applications are also listed.

Medical, veterinary and pharmaceutical

- Enzbond: prediction programs for enzyme development

Computer vision, image recognition and generation

Computer vision has applications in healthcare, security, manufacturing and transportation.

Natural language processing

Art and music creation

Industry/factory automation and monitoring

Employee behavior analytics

Social media/human interaction/recruitment

- Viv Labs

AI software and API development

Other AI companies

Documentaries, videos and podcasts

Company directory (flattened in extraction; entries listed as recovered):

- Digital education platform (San Antonio, US)
- Barry N. Perkins (San Francisco, California, US)
- Redwood City, California, US
- Las Vegas, US
- Direct Aviated Response (DAR) System
- Automated digital advertising
- Destin AI: chatbot
- Digital risk analysis (Los Angeles, US)
- Customized AI conversation bots
- Automated job candidate search (San Francisco, US)
- Automated, customizable chatbots (San Francisco, California, US)
- Mountain View, US
- App for fashion recommendations (San Francisco, US)
- AI-enhanced acne treatment
Harvard scientists have taken a critical step toward building a quantum computer — a device that could someday harness, for example, the intrinsic properties of subatomic particles such as electrons to perform calculations far faster than the most powerful supercomputers.

As described in a paper published April 13 in Science, researchers have, for the first time, demonstrated a system in which two semiconducting spin quantum bits, or qubits, interact with each other in a process known as entanglement. Without that entanglement, quantum computers simply can’t exist.

“Entanglement is an essential component of quantum computing — it’s what gives you the ability to do generalized, universal quantum computation,” said Amir Yacoby, professor of physics and of applied physics, who led the research. “Without this kind of entanglement, there’s no way to get anywhere in this field.”

Quantum computers rely on quantum mechanical properties of particles to store data and perform computations. Unlike the transistors used in digital computers, which encode data “bits” as either zero or one, qubits can hold both values simultaneously. In theory, that inherently parallel nature allows quantum computers to be vastly more powerful than traditional computers, which perform operations in sequence.

As a first step toward making those parallel computations possible, researchers working in Yacoby’s lab have established a new method for creating an entangled state between two qubits. By taking advantage of the electrostatic interaction between the particles, Yacoby, in collaboration with postdoctoral researchers Oliver Dial and Hendrik Bluhm, and graduate students Michael Shulman and Shannon Harvey, was able to create pairs of qubits in a state that has no classical analog, known as an entangled state. By entangling one qubit with another, researchers can control the state of one qubit by operating on the other.
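A rough numerical picture of what an entangled state is (using a generic textbook Bell state, not the specific spin-qubit state from the paper): the two-qubit system has a single list of amplitudes that cannot be factored into separate one-qubit descriptions, so measurements of the two qubits come out perfectly correlated.

```python
import random

# Amplitudes of the Bell state (|00> + |11>)/sqrt(2),
# over the basis states 00, 01, 10, 11.
bell = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]

def measure(amplitudes, rng):
    """Sample a basis state with probability |amplitude|^2."""
    r, total = rng.random(), 0.0
    for i, a in enumerate(amplitudes):
        total += abs(a) ** 2
        if r < total:
            return i
    return len(amplitudes) - 1

rng = random.Random(42)
outcomes = [measure(bell, rng) for _ in range(1000)]
# Only 00 (index 0) and 11 (index 3) ever occur: the two qubits are
# perfectly correlated, with no way to describe either one on its own.
print(set(outcomes))
```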
This interconnectedness gives quantum computers their advantage over their classical counterparts.

“There are two elements to this paper,” Yacoby explained. “The first is determining how to create these entangled states. We took advantage of the fact that our qubits are made of electrons, so we used their electrostatic interaction to create this conditional, or entangled, state between them. As a method for creating entanglement, that has not been demonstrated before.

“The second element in the paper is that the electrostatic interaction is weak, so it takes time to create that entanglement,” Yacoby continued. “But during that time, various elements are trying to interact with the individual qubits, which cause them to lose their information. It took some creative thinking to design a system that would allow their entanglement to accumulate, but would limit their interaction with the rest of their environment.”

As Shulman put it, “The trick is to keep them sensitive to each other, and to nothing else.”

The solution, Yacoby said, came in the form of an echo. Though subtle fluctuations in the evolution of each qubit cause the entangled qubits to get out of sync, researchers found a novel way to solve the problem. They allowed the qubits to interact for a precise amount of time, then flipped each qubit, causing them to return to their initial state. The method has two benefits, Yacoby said. First, allowing a pair of qubits to interact builds the necessary entanglement between them. Second, bringing the bits back to their initial state preserves the data that had been coded into them.

“You can think of it like runners on parallel tracks,” Yacoby said. “They run a certain distance, and then on cue they turn and run back the same amount, so they wind up back where they began.”

Similar to traditional computers, Yacoby’s design for a quantum computer begins with a thin wafer of semiconducting material — in this case gallium arsenide — “grown” at the Weizmann Institute.
Researchers then deposit nanometer-size wires onto the wafer to form metal “gates.” The entire device is then supercooled to a few hundredths of a degree above absolute zero to slow the motion of atoms in the wafer. When a voltage is applied, the gates trap electrons, allowing researchers to construct their quantum bits.

“Conceptually, people had laid out the idea that this type of entanglement was possible as early as 15 years ago, but the gap between being able to conceive of something and demonstrating it in a real system is huge,” Yacoby said. “It’s huge in the sense that, when people were laying out these concepts, they didn’t take into account all the problems that exist in a real system. Just because nature doesn’t fundamentally forbid it, doesn’t mean it can be done. But the fact is it can be done, it can be done today, and it can be done quite elegantly.”
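The echo trick in the runners analogy can be caricatured numerically. In this toy model (the noise model and numbers are invented for illustration, not taken from the experiment), a slowly fluctuating phase scrambles the averaged signal when left alone, but cancels exactly when the qubit is flipped halfway through:

```python
import random, cmath

def run(phase_noise, echo, rng):
    """One experimental shot: a random but constant phase accumulates
    in each half-period; an echo flips its sign midway so it cancels."""
    phi = rng.gauss(0, phase_noise)       # slow, shot-to-shot fluctuation
    total = phi + (-phi if echo else phi)
    return cmath.exp(1j * total)

rng = random.Random(1)
free = sum(run(1.0, echo=False, rng=rng) for _ in range(500)) / 500
echoed = sum(run(1.0, echo=True, rng=rng) for _ in range(500)) / 500
# Without the echo the averaged signal washes out; with it, the
# "runners" return to their start and the signal survives at full strength.
print(round(abs(free), 2), round(abs(echoed), 2))
```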
Machine learning is the newest thing at BYU, thanks to the work of engineer Dah-Jye Lee, who has created an algorithm that allows computers to learn without human help. According to Lee, his algorithm differs from others in that it doesn’t specify for the computer what it should or shouldn’t look for. Instead, his program simply feeds images to the computer, letting it decide on its own what is what.

Similar to how children learn differences between objects in the world around them in an intuitive way, Lee uses object recognition to show the computer various images but doesn’t differentiate between them. Instead, the computer is tasked with doing this on its own. According to Lee: “It’s very comparable to other object recognition algorithms for accuracy, but, we don’t need humans to be involved. You don’t have to reinvent the wheel each time. You just run it.”

Of course, computers can’t think, reason, or rationalize in quite the same way as humans, but researchers at Carnegie Mellon University are using computer vision and machine learning, through a program called NEIL (Never Ending Image Learner), as ways of optimizing the capabilities of computers. NEIL’s task isn’t so much to deal with hard data, like numbers, which is what computers have been doing since they first were created. Instead, NEIL goes a step further, translating the visual world into useful information by way of identifying colors and lighting, classifying materials, recognizing distinct objects, and more.

This information then is used to make general observations, associations, and connections, much like the human mind does at an early age. While computers aren’t capable of processing this information with an emotional response, a critical component that separates them from humans, there are countless tasks that NEIL can accomplish today or in the near future that will help transform the way we live. Think about it: how might computer vision and machine learning change the way you live, work, and interact with your environment?
While your smart device of today may appear to be multi-tasking, with GPS, text messaging and music streaming all running at once, in reality it’s cycling between these tasks serially. Computers have been operating this way since the computer age began. Quantum computers, on the other hand, would address simultaneity from the ground up. They would perform many operations in parallel and be well suited to machine learning, where there’s a need to search instantly through a myriad of possibilities and choose the best solution. One of the more controversial aspects of quantum computing’s massive potential is its ability to render today’s data encryption technologies obsolete. (For a surprisingly easy-to-follow explanation of the difference between classical and quantum computing, see this 1999 article by Lov K. Grover, inventor of what may be the fastest possible search algorithm that could run on a quantum computer.)

One focus of the lab will be to advance machine learning. Google Director of Engineering Hartmut Neven blogs: Machine learning is all about building better models of the world to make more accurate predictions. And if we want to build a more useful search engine, we need to better understand spoken questions and what’s on the web so you get the best answer.

In venture capital circles, machine learning startups are about to catch fire. This makes sense as the size of the data sets that companies and organizations need to utilize spirals beyond what the human brain can fathom. As Derrick Harris at Gigaom reports, Skytree landed $18 million in Series A funding from US Venture Partners, United Parcel Service and Scott McNealy, the Sun Microsystems co-founder and former CEO. The company began just over a year earlier with $1.5 million in seed funding. As big data gets bigger ever more quickly, machine learning makes it possible to identify meaningful patterns in real time that would elude sharp humans even with the best of query tools.
Still, there’s often a place for human judgment to flesh out the findings of machine learning algorithms. The flagship Skytree product, Skytree Server, lets users run advanced machine learning algorithms against their own data sources at speeds much faster than current alternatives. The company claims such rapid and complete processing of large datasets yields extraordinary boosts in accuracy. Skytree’s new beta product, Adviser, allows novice users to perform machine learning analysis of their data on a laptop and receive guidance about methods and findings. As the machine learning space becomes more accessible to a wider audience, expect to see more startups get venture funding.

Writing for Mason Research at George Mason University, Michele McDonald reports on how machine learning is helping doctors determine the best course of treatment for their patients. What’s more, machine learning is improving efficiency in medical billing and even predicting patients’ future medical conditions. Wojtusiak points out that current research and studies focus on the average patient, whereas those being treated want personalized care at the lowest risk for the best outcome. Machine learning can identify patterns in reams of data and place the patient’s conditions and symptoms in context to build an individualized treatment model. As such, machine learning seeks to support the physician based on the history of the condition as well as the history of the patient.

The data to be mined is vast and detailed. It includes the lab tests, diagnoses, treatments, and qualitative notes of individual patients who, taken together, form large populations. Machine learning uses algorithms that recognize the data, identify patterns in it and derive meaningful analyses. For example, researchers at the Machine Learning and Inference Lab are comparing five different treatment options for patients with prostate cancer.
To determine the best treatment option, machine learning must first categorize prostate cancer patients on the basis of certain commonalities. When a new patient comes in, algorithms can figure out which group he is most similar to. In turn, this guides the direction of treatment for that patient. Given the high-stakes consequences involved in patient care, the complexity that must be sorted out when making diagnoses and the ongoing monitoring of interventions against outcomes, machine learning development in health care is risk-mitigating and cost-effective. For more about the Machine Learning and Inference Lab and the health care pilot projects they are working on, see the original article here.

As the new frontier in computing, machine learning brings us software that can make sense of big data, act on its findings and draw insights from ambiguous information. Spam filters, recommendation systems and driver assistance technology are some of today’s more mainstream uses of machine learning. Like life on any frontier, creating new machine learning applications, even with the most talented of teams, can be difficult and slow for a lack of tools and infrastructure. DARPA (the Defense Advanced Research Projects Agency) is tackling this problem head on by launching the Probabilistic Programming for Advanced Machine Learning (PPAML) program.

Probabilistic programming is a programming paradigm for dealing with uncertain information. In much the same way that high-level programming languages spared developers the need to deal with machine-level issues, DARPA’s focus on probabilistic programming sets the stage for a quantum leap forward in machine learning. More specifically, machine learning developers using new programming languages geared for probabilistic inference will be freed up to deliver more innovative, effective and efficient applications faster, while relying less on big data, as is common today.
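The core move of probabilistic programming, writing down a generative model and inferring backwards from observed data, can be sketched with the simplest possible inference method, rejection sampling. The coin-flip model here is purely illustrative:

```python
import random

def infer_bias(observed_heads, flips, samples=20000, seed=0):
    """Rejection sampling: draw a coin bias from a uniform prior,
    simulate the flips, and keep only the draws that reproduce the
    observed number of heads. The kept draws approximate the posterior."""
    rng = random.Random(seed)
    kept = []
    for _ in range(samples):
        bias = rng.random()                       # prior: bias ~ Uniform(0, 1)
        heads = sum(rng.random() < bias for _ in range(flips))
        if heads == observed_heads:               # condition on the data
            kept.append(bias)
    return sum(kept) / len(kept)                  # posterior mean estimate

# Saw 8 heads in 10 flips; the exact posterior mean is (8+1)/(10+2) = 0.75.
print(round(infer_bias(8, 10), 2))
```

A probabilistic programming language automates exactly this condition-and-infer step, with far more efficient algorithms than brute-force rejection.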
For details, see the DARPA Special Notice document describing the specific capabilities sought at http://go.usa.gov/2PhW.
Quantum supremacy is the experimental demonstration of a quantum computer's dominance and advantage over classical computers, achieved by performing calculations that were previously impossible at unmatched speeds. To confirm that quantum supremacy has been achieved, computer scientists must be able to show that a classical computer could never have solved the problem while also proving that the quantum computer can perform the calculation quickly.

Computer scientists hope that quantum supremacy will eventually make it practical to run Shor's algorithm -- factoring the very large numbers whose difficulty underpins most modern cryptography, a calculation currently infeasible for classical machines -- as well as bring advantages in drug development, weather forecasting, stock trading and material design.

Quantum computing is consistently evolving; quantum computers have not yet reached a point where they can show their supremacy over classical computers. This is mostly due to the huge number of quantum bits, or qubits, required to perform meaningful operations on quantum computers. As the number of necessary gates and qubits increases, so does the error rate, and if the error rate gets too high, the quantum computer loses any advantage it had over the classical computer. To successfully perform useful calculations -- such as determining the chemical properties of a substance -- a few million qubits would be necessary. Currently, the largest quantum computer design is Google's Bristlecone, a 72-qubit quantum processor released in March 2018.

Quantum computers vs. classical computers

The primary difference between quantum and classical computers is in the way they work. Classical computers process information as bits, with all computations performed in a binary language of 1s and 0s. The current in classical computers is either flowing through a transistor or not; there is no in between. Conversely, quantum computers use quantum theory as the basis of their systems.
Quantum theory focuses on the extraordinary interactions between particles on an invisible scale -- such as atoms, electrons and photons. Therefore, the binary states used in classical computers can no longer be applied to quantum computers. Qubits can theoretically outperform binary bits by orders of magnitude. This is mostly due to quantum superposition -- the ability of a subatomic particle to exist in two states at once. Superposition allows qubits to run specific computations on various possibilities simultaneously.

Trapped ions, photons and superconductors give quantum computers the ability to perform calculations at exceptionally fast speeds and take in massive amounts of data. However, the real value that quantum computers could provide is the ability to solve problems that are too complex for classical computers to address, or that would take classical computers billions of years to answer. Quantum computers should be able to create a series of samples from a random quantum circuit that follow a specific, correct distribution. While these advantages could lead to quantum supremacy, processors have not yet been built with all of these capabilities. Classical computers continue to surprise computer scientists with their computational power and their ability to solve certain types of problems. Until a quantum computer is built that solves a problem a classical computer has been proven unable to solve, it remains possible that a better classical algorithm exists and quantum supremacy will not be achieved.

Applications of quantum supremacy

Some people believe a quantum computer that achieves quantum supremacy could be the most disruptive new technology since the Intel 4004 microprocessor was invented in 1971. Certain professions and areas of business will be significantly impacted by quantum supremacy.
Examples include:
- The ability to perform more complex simulations on a larger scale will provide companies with improved efficiency, deeper insight and better forecasting, thus improving optimization processes.
- Enhanced simulations that model complex quantum systems, such as biological molecules, would be possible.
- Combining quantum computing with artificial intelligence (AI) could make AI immensely smarter than it is now.
- New customized drugs, chemicals and materials could be designed, modeled and modified to help cultivate new pharmaceutical, commercial or business products.
- The ability to factor extremely large numbers could break current, long-standing forms of encryption.

Overall, quantum supremacy could start a new market for devices that have the potential to boost AI, intricately model molecular interactions and financial systems, improve weather forecasts and crack previously impossible codes. While most of these applications appear to provide nothing but benefits, quantum supremacy also has the ability to destabilize the math that underlies most current data encryption. Therefore, once quantum supremacy is achieved, computer scientists will have to completely reevaluate computer security and how to protect information and data. Unfortunately, this will become extremely difficult given the high speeds and large amounts of data that quantum computers will be working with.

Examples of quantum supremacy

While the first problem to exemplify quantum supremacy could be whatever computer scientists choose, it is expected that they will use a problem known as random circuit sampling. This problem requires a computer to correctly sample from the possible outputs of a random quantum circuit -- essentially a series of actions performed on a set of qubits. Classical computers do not possess any fast algorithms to generate these samples; therefore, as the array of possible samples increases, classical computers become overwhelmed.
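The blow-up is easy to see in a brute-force state-vector simulation. This sketch (the single-qubit rotations stand in for a real random gate set) stores one complex amplitude per basis state, so memory doubles with every added qubit:

```python
import random, math

def random_circuit_state(n_qubits, depth, seed=0):
    """Simulate a random circuit by tracking the full state vector.
    Storage is 2**n_qubits amplitudes, which is why classical sampling
    of large random circuits becomes intractable."""
    rng = random.Random(seed)
    dim = 2 ** n_qubits
    state = [0j] * dim
    state[0] = 1 + 0j                      # start in |00...0>
    for _ in range(depth):
        q = rng.randrange(n_qubits)        # pick a qubit
        theta = rng.uniform(0, 2 * math.pi)
        new = state[:]
        # Apply a rotation on qubit q to every paired amplitude.
        for i in range(dim):
            if not (i >> q) & 1:
                j = i | (1 << q)
                a, b = state[i], state[j]
                new[i] = math.cos(theta) * a - math.sin(theta) * b
                new[j] = math.sin(theta) * a + math.cos(theta) * b
        state = new
    return state

state = random_circuit_state(n_qubits=10, depth=30)
print(len(state))                               # 1024 amplitudes for 10 qubits
print(round(sum(abs(a) ** 2 for a in state), 6))  # probabilities sum to 1
```

Ten qubits already need 1,024 amplitudes; fifty would need about 10^15, which is where classical simulation runs out of road.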
If a quantum computer can efficiently pull samples in this instance, then it will prove quantum supremacy.

Importance of quantum supremacy

The first quantum algorithms were developed in the 1990s and, while the problems they solved were themselves useless, the process provided the computer scientists who designed them with knowledge and insights they could use to develop more meaningful algorithms -- like Shor's algorithm -- that could have large practical consequences. Computer scientists hope that quantum supremacy will repeat this process and drive inventors to create a quantum computer capable of outperforming a classical computer -- even if it only solves a simple, useless problem -- because this work could be the key to building a beneficial and supreme quantum computer.

Some people also believe Moore's Law is ending soon. This would inhibit AI research, because smarter applications, such as fully autonomous cars, require huge amounts of processing power. Once quantum supremacy is reached, quantum computing should be able to resolve this problem as well as revolutionize machine learning (ML).

Finally, quantum supremacy would greatly affect the field of theoretical computer science. For decades, scientists in this field have believed in the extended Church-Turing thesis, which states that classical computers can efficiently solve any problem that any other type of computer can. Quantum supremacy would violate that assumption, and scientists would be forced to open their minds to a whole new world of computer science.

The future of quantum supremacy

The final goal of quantum computing is to create a fully functional, universal, fault-tolerant gate computer.
However, before this machine can be built, computer scientists need to develop:
- Refined error correction that doesn't require huge amounts of hardware
- Advanced algorithms that can support uniquely complex problems
- Qubits with less noise sensitivity, longer coherence times and increased reliability
- Quantum processors that possess thousands of qubits

The U.S. and China have been the most focused on investing in quantum projects, along with organizations and businesses such as Google, Microsoft, IBM, Lockheed Martin and Alibaba. Google has developed a 72-qubit quantum processor -- called Bristlecone -- which it claims will achieve quantum supremacy by the end of 2019. Once quantum supremacy is displayed, quantum computers will provide superior use for crunching large data sets, such as those used in cancer research, drug design, genetic engineering, particle physics and weather forecasting.

Unfortunately, due to superposition, programmers working on developing tools to code quantum computers are unable to view the paths that their data takes from input to output, making the debugging process highly complicated. Furthermore, while quantum supremacy can be extremely beneficial to various industries, the breakthrough could also lead to rogue states or actors using quantum computers for destructive purposes.
Building a better supercomputer is something many tech companies, research outfits, and government agencies have been trying to do over the decades. There’s one physical constraint they’ve been unable to avoid, though: conducting electricity for supercomputing is expensive. Not in an economic sense—although, yes, in an economic sense, too—but in terms of energy. The more electricity you conduct, the more resistance you create (electricians and physics majors, forgive me), which means more wasted energy in the form of heat and vibration. And you can’t let things get too hot, so you have to expend more energy to cool down your circuits.

Any gamer or regular laptop user is familiar with overheating problems. Supercomputing deals with the same issues on an exponential scale, with energy use similarly enlarged and thus a significant cost concern (there’s the economic bit). That’s why consumer supercomputers that try to control these issues will run you at least $6,000, and why supercomputing as a whole has been reserved for code-cracking spy agencies, really Big Data crunching at various companies and governments, and mega-funded research institutions. A new nanowire breakthrough could be the first in a wave of similar innovations set to change all that.

Superconductors Gone Miniature

One of the holy grails in supercomputing development—and in electronics/physics at large—has been developing a resource- and cost-effective superconductive material. Unlike regular electric conductors, superconductors transmit electrons (i.e. “electricity”) without any resistance. They output exactly the same amount of energy that was inputted and do not dissipate “waste” energy in terms of heat or sound. This means they require much less energy to run and no energy to cool…sort of.
While superconductors don’t generate any heat while conducting electricity, all of the superconductive materials yet discovered and developed have to be cooled below a critical temperature to gain their superconductive powers. This has made them more expensive and unwieldy in most applications, including supercomputers, than using more traditional materials and cooling systems. But researchers at the University of Illinois at Urbana-Champaign have just created a working superconductive nanowire memory cell that could prove to be a game changer.

“An SEM image of Device 7715s1. Two carbon nanotube templated Mo75Ge25 wires lay across a roughly 150 nm wide trench, 2.5 μm apart. The two wires have similar dimensions, but are not identical. (b) An SEM image of Device 82915s2. The Mo75Ge25 (dark) is patterned into two geometrically different nanowires sitting 150 nm apart. The right wire has a non-uniform width.”

Get all that? Good. The cell consists of a nanowire loop and a couple of electrodes and is programmed simply by subjecting it to an initial positive or negative charge, which sends electrons around the loop either clockwise or counterclockwise. This binary option equates to the 0s and 1s needed for computing, and the cell’s memory is preserved with no additional energy applied—that is, the electrons keep moving as long as the wires stay superconductive.

Keeping nanowires at the cool conditions needed for superconducting—especially when they don’t generate any heat of their own—could prove far more affordable and far less space-intensive than current supercomputing solutions demand. And when things get smaller and cheaper, they open up whole new markets and usher in a wave of further innovations. Putting a superconducting supercomputer onboard a self-driving car could become a reality, making them vastly more responsive, reliable and more affordable, both to build and to operate.
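Logically, the loop memory cell described above reduces to a two-state element. This toy model is purely illustrative; the write and read mechanics here are stand-ins, not the actual device physics:

```python
class NanowireCell:
    """Toy model of the superconducting loop memory cell: a persistent
    current circulating clockwise (0) or counterclockwise (1)."""

    def __init__(self):
        self.current = None                 # unprogrammed

    def write(self, bit):
        # A positive or negative charge pulse sets the circulation sense.
        self.current = "ccw" if bit else "cw"

    def read(self):
        # The circulation sense encodes the stored bit; once written,
        # the current persists with no further energy input.
        return 1 if self.current == "ccw" else 0

cell = NanowireCell()
cell.write(1)
print(cell.read())  # 1, retained as long as the loop stays superconductive
```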
Space-bound supercomputers could proliferate, with all kinds of academic and commercial applications. We’re not likely to see even a nano-sized cooling system in our smartphones anytime soon, but desktop supercomputers could be markedly reduced in upfront and long-term costs. Quantum computing, which uses relatively low energy to change the quantum states of electronics to preserve memory, is undoubtedly going to hit the market first, and we’ll see some major leaps in computing power, size, and efficiency. Nanotech is keeping supercomputers on the map for a variety of applications, though, and we’re eager to see just how small things can get. Daniel A. Guttenberg is an Atlanta-based writer who fell into the startup world by accident and has been gleefully treading water ever since. He will be survived by his beard and his legacy of procrastination.
Quantum computers have the potential to fundamentally change digital technology and therefore the ways in which we solve problems, interact and do business. In her series for Roman Road Journal, science writer Gemma Milne looks into how quantum computers work and how they might change our lives. In Part 1, she explores the why, what and when behind quantum computing and some of the potential applications for what could be the most significant technological development of the 21st century. In a world where we can send instant messages to anyone on the planet at the touch of a button; where there are warehouses full of robots which build millions of planes, trains and automobiles every year; and where ideas, discussions and revolutions spread digitally within seconds…you’d think we’d have reached the limit of our computational ability. But what if I told you that, in fact, an entirely different kind of computer is on the horizon? Not simply a faster, better computer, but one which can solve some of the world’s most intricate and complicated problems like nothing we’ve seen before. I’m talking about a quantum computer. Let me give you an example of the limitations of our standard computers. We use 4% of all the world’s energy making fertiliser, using a process created in the early 1900s called the Haber process. Essentially, you take nitrogen from the air and hydrogen from natural gas, and turn them into ammonia. In order to break the bonds between the nitrogen atoms to create the new substance, we use an iron catalyst, heated up to about 450°C and kept at an extremely high pressure. The energy required to create the ammonia – the key ingredient needed for fertilisation of crops – is therefore huge. When you think about how much food is needed by the entire world, the growth of which is powered by fertiliser, it’s easy to see why so much energy is eaten up creating ammonia. 
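The chemistry in both routes is the same net reaction (standard chemistry, not spelled out in the article):

```latex
% Haber process / biological nitrogen fixation, net reaction:
\[
  \mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3}
\]
% The Haber process drives it with an iron catalyst at about 450 degrees C
% and high pressure; soil bacteria accomplish the same net chemistry at
% ambient temperature and pressure.
```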
We know that there’s a better, more energy-efficient way, as microbes in soil manage to create ammonia from the nitrogen in the air with only tiny amounts of sunlight, in a process called Biological Nitrogen Fixation. It’s the exact same equation – nitrogen plus hydrogen to create ammonia – but instead of using a chamber with high pressure and temperature to break the bonds, bacteria living in soil and plant roots ‘fix’ the nitrogen naturally. We don’t yet understand how they do it – all we know is that these bacteria are capable of breaking those bonds and opening the nitrogen up to turn into ammonia. We can’t simply grab all the soil microbes and task them with creating fertiliser for us – it wouldn’t create enough and they’ve got their own work to do – but if we could understand how they do it and copy their genius, our man-made process could be revolutionised. But to do this – to create a man-made, scalable method that we could implement in fertiliser factories all over the world – would require simulating the chemistry of life at such a small scale, and with so many different variables, that our conventional computers simply cannot handle the numbers. Discovering a new drug, and bringing it to the people who need it most, takes ten years and costs $2.6 billion on average. Part of the reason it takes so long and costs so much is the arduous process of finding the right molecules to match together to create an effective drug. It’s a case of trial and error, testing so many different combinations that the problem is of a similar scale to Biological Nitrogen Fixation. At the moment, pharmaceutical companies will run billions of comparisons on their super-fast versions of conventional computers, but they are still severely limited to using only the small molecules which conventional computers can handle. Again, understanding the complex chemistry of life proves too difficult, with far too many combinations on the to-do list, for conventional computers to handle.
Quantum computers could solve these kinds of problems for breakfast. Now, there are two concepts worth getting to grips with in order to understand quantum computing: superposition, and the uncertainty principle. Superposition basically means that all particles – the tiniest building blocks that you, me and everything around us are made of – can be in more than one place at one time. Sounds mental, yes, but the idea is that instead of our particles being tied to one physical space, there’s a variety of places they can be, each with a different probability. As opposed to saying ‘the particle could be here or could be there’, we describe them as being in all those multiple places at the same time. The uncertainty principle states that we can never know both the position and the momentum of a particle at any one point – if we decide to measure one, we lose the other. In other words, if we want to locate the particle, we cannot know what speed it is going at, and if we want to know the speed, then we cannot locate it – we can only know one at a time. Taking both of these concepts together means that particles can be in many places at once, but only when we leave them to it and don’t actually try to work out where exactly they are. It’s worth mentioning that Niels Bohr, the Danish Nobel Prize-winning physicist, famously said: “Anyone not shocked by quantum mechanics has not yet understood it.” Even the most accomplished scientists cannot get their heads around quantum physics, but it’s exactly this kind of science which sits at the heart of what will become our solution to the world’s hardest problems. So we know that in the quantum world, particles can be in more than one place at one time. In the same way that multiple versions of yourself would result in a much more productive person, multiple particles in multiple places inside a quantum computer result in multiple calculations being able to be done at the same time.
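In standard notation (a textbook formulation, not from the article itself), the two ideas look like this:

```latex
% A qubit in superposition of the two basis states 0 and 1:
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1,
\]
% where |alpha|^2 and |beta|^2 are the probabilities of measuring 0 or 1.
% Heisenberg's uncertainty principle for position x and momentum p:
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}.
\]
% Pinning down x (small delta-x) forces delta-p to grow, and vice versa.
```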
Conventional computers can do long arduous calculations, but they essentially have to work through a big long line of 0s and 1s (like a to-do list) to get to the answer. A quantum computer can do all the tasks on the long to-do list all at once, meaning they can work on very complicated tasks – like simulating the chemistry of life. So how long until we can feed the world using less energy and get drugs to the people who need them most, faster? In short: we still have a way to go. We haven’t yet got to the point at which a quantum computer has been proven to be better than a conventional computer – known as ‘quantum supremacy’. Google, Microsoft, IBM and many universities and startups are working towards a working, useful, reliable quantum computer, but so far their efforts are still at an early stage. That’s not a reason to lose hope, however. Progress has been accelerating over the last 10 years, particularly with the interest of corporates matching that of academia, and it’s been estimated that quantum supremacy could be reached as soon as this year. As with all hyped-up tech trends though, it’s worth bearing in mind that real world applications are always far behind the first working model. A whole new way of writing algorithms for these computers has to be worked out, and translating real world problems which abide by the laws of classical physics into problems which make sense in the quantum world is a hefty task in itself. Quantum computing is coming, and with it come exciting prospects for solving complicated issues lying at the heart of huge social issues around the world. It’s not all rosy though: with any new technology, changes can disrupt the status quo in negative ways too. For our world to work, we need problems which cannot be solved – this idea is at the root of encryption, for example. Hype around new technologies can also prompt poor investment decisions and early hustlers out to make a quick buck off a buzzword. 
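The to-do-list analogy above can be made concrete: n qubits span 2ⁿ amplitudes at once, so putting every qubit into an equal superposition (a Hadamard gate on each) gives one amplitude per classical bit string. A small sketch in plain Python, purely illustrative:

```python
# Why n qubits give 2**n simultaneous amplitudes: the equal superposition
# H^n |0...0> assigns the same amplitude to every n-bit string.

import itertools
import math

def equal_superposition(n_qubits: int) -> dict:
    """Amplitudes of the state H^n |0...0>: every bit string, equally weighted."""
    amp = 1.0 / math.sqrt(2 ** n_qubits)
    return {
        "".join(bits): amp
        for bits in itertools.product("01", repeat=n_qubits)
    }

state = equal_superposition(3)
print(len(state))  # 8 amplitudes from only 3 qubits
print(round(sum(a * a for a in state.values()), 10))  # probabilities sum to 1.0
```

Note the flip side: a classical simulation has to store all 2ⁿ amplitudes explicitly, which is exactly why conventional computers choke on these problems.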
In the same way that particles can be more than one thing at one time, quantum computers can be thought of both as a blessing and a curse… This article continues in Part 2 here
Researchers Discover New Way To Split And Sum Photons With Silicon

A team of researchers at The University of Texas at Austin and the University of California, Riverside have found a way to produce a long-hypothesized phenomenon—the transfer of energy between silicon and organic, carbon-based molecules—in a breakthrough that has implications for information storage in quantum computing, solar energy conversion and medical imaging. The research is described in a paper out today in the journal Nature Chemistry. Silicon is one of the planet’s most abundant materials and a critical component in everything from the semiconductors that power our computers to the cells used in nearly all solar energy panels. For all of its abilities, however, silicon has some problems when it comes to converting light into electricity. Different colors of light are composed of photons, particles that carry light’s energy. Silicon can efficiently convert red photons into electricity, but with blue photons, which carry twice the energy of red photons, silicon loses most of the energy as heat. The new discovery provides scientists with a way to boost silicon’s efficiency by pairing it with a carbon-based material that converts blue photons into pairs of red photons that can be more efficiently used by silicon. This hybrid material can also be tweaked to operate in reverse, taking in red light and converting it into blue light, which has implications for medical treatments and quantum computing. “The organic molecule we’ve paired silicon with is a type of carbon ash called anthracene. It’s basically soot,” said Sean Roberts, a UT Austin assistant professor of chemistry. The paper describes a method for chemically connecting silicon to anthracene, creating a molecular power line that allows energy to transfer between the silicon and ash-like substance. “We now can finely tune this material to react to different wavelengths of light.
Imagine, for quantum computing, being able to tweak and optimize a material to turn one blue photon into two red photons or two red photons into one blue. It’s perfect for information storage.” For four decades, scientists have hypothesized that pairing silicon with a type of organic material that absorbs blue and green light more efficiently could be the key to improving silicon’s ability to convert light into electricity. But simply layering the two materials never brought about the anticipated “spin–triplet exciton transfer,” a particular type of energy transfer from the carbon-based material to silicon, needed to realize this goal. Roberts and materials scientists at UC Riverside describe how they broke through the impasse with tiny chemical wires that connect silicon nanocrystals to anthracene, producing the predicted energy transfer between them for the first time. “The challenge has been getting pairs of excited electrons out of these organic materials and into silicon. It can’t be done just by depositing one on top of the other,” Roberts said. “It takes building a new type of chemical interface between the silicon and this material to allow them to electronically communicate.” Roberts and his graduate student Emily Raulerson measured the effect in a specially designed molecule that attaches to a silicon nanocrystal, the innovation of collaborators Ming Lee Tang, Lorenzo Mangolini and Pan Xia of UC Riverside. Using an ultrafast laser, Roberts and Raulerson found that the new molecular wire between the two materials was not only fast, resilient and efficient; it could effectively transfer about 90% of the energy from the nanocrystal to the molecule. “We can use this chemistry to create materials that absorb and emit any color of light,” said Raulerson, who says that, with further fine tuning, similar silicon nanocrystals tethered to a molecule could generate a variety of applications, from battery-less night-vision goggles to new miniature electronics.
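The factor of two quoted above follows directly from the photon energy relation E = hc/λ. A quick check with typical wavelengths (chosen for illustration; the paper does not specify these values):

```python
# Blue photons carry roughly twice the energy of red/near-infrared photons,
# since photon energy E = h*c / wavelength.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy_ev(wavelength_nm: float) -> float:
    joules = H * C / (wavelength_nm * 1e-9)
    return joules / 1.602e-19  # convert J -> eV

blue = photon_energy_ev(450.0)   # ~2.76 eV
red = photon_energy_ev(900.0)    # ~1.38 eV
print(round(blue / red, 2))      # 2.0: one blue photon can fund two red ones
```

This is why splitting one blue photon into two red ones (or fusing two red into one blue) conserves energy, at least in principle; the 90% figure above is about how efficiently the real interface moves that energy.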
Other highly efficient processes of this sort, called photon up-conversion, previously relied on toxic materials. As the new approach uses exclusively nontoxic materials, it opens the door for applications in human medicine, bioimaging and environmentally sustainable technologies, something that Roberts and fellow UT Austin chemist Michael Rose are working towards. At UC Riverside, Tang’s lab pioneered how to attach the organic molecules to the silicon nanoparticles, and Mangolini’s group engineered the silicon nanocrystals. “The novelty is really how to get the two parts of this structure—the organic molecules and the quantum confined silicon nanocrystals—to work together,” said Mangolini, an associate professor of mechanical engineering. “We are the first group to really put the two together.” The paper’s other authors include Devin Coleman and Carter Gerke of UC Riverside. Funding for the research was provided by the National Science Foundation, the Robert A. Welch Foundation, the Research Corporation for Science Advancement, the Air Force Office of Scientific Research and the Department of Energy. Additionally, Raulerson holds the Leon O. Morgan Graduate Fellowship at UT Austin. Source: The University of Texas at Austin
Teleportation of electricity

This article summarizes some recent work from our group. For another perspective on the same research, see PhysRev Focus (6 February 2004). Teleportation is the transfer of a quantum mechanical state between two particles that can only communicate by classical means. Because the transfer takes place without exchange of matter, it is reminiscent of the well-known Beam me up! from the StarTrek television series. Teleportation of isolated particles was invented ten years ago and demonstrated for photons in free space. We have found a way to teleport electrical charge in the solid state. This discovery could be used to transfer quantum mechanical bits (qubits) in a quantum computer.

Electron meets hole

There exist two types of charge carriers in the solid state: electrons and holes. Because they are oppositely charged, they can only exist simultaneously if they are separated from each other by an insulating barrier. If the barrier still passes a small current and an electron meets a hole, then both are annihilated. We have discovered that this need not be the end of the story. Under special circumstances the electron can continue its existence at a distant location by teleportation. The meeting of an electron and a hole is illustrated in figure 1. Both particles live in the conduction band of a metal or semiconductor. At low temperature all energy levels are filled with electrons up to a maximal energy. This "sea" of electrons is called the Fermi sea and the maximal energy level is the Fermi level. The fully filled Fermi sea is in equilibrium and therefore carries no electrical current. To pass a current you need excitations. These are filled states (electrons) above the Fermi level or empty states (holes) below it.
The meeting takes place at an insulating barrier, which plays the role of a sluice: The Fermi level is a little higher on one side of the barrier than on the other, so that the hole is elevated to the same energy as the electron.

Figure 1: An electron meets a hole.

Usually the electron and the hole are reflected by the barrier, but each time they meet there is a small probability that the electron will tunnel through the barrier and fall into the hole at the other side. Then both particles disappear without leaving a trace. End of story. Unless ... the hole had been entangled in the past with another electron, at some distant location in the material. I will refer to this second, distant electron as the "heavenly" electron and to the first electron as the "earthly" electron.

Entangled electron-hole pair

Entanglement is a quantum mechanical correlation between the spins. The spin of the hole is, on the one hand, oriented completely isotropically and, on the other hand, fully correlated with the spin of the heavenly electron. In classical mechanics this would be impossible, but not so in quantum mechanics. The wave function that describes such an entangled state of hole and electron is (|↑⟩hole|↑⟩electron + |↓⟩hole|↓⟩electron)/√2. It is a superposition of two states: in the first state electron and hole both have spin up (↑), and in the second state they both have spin down (↓). Such an entangled state is created naturally when the heavenly electron tunnels through a barrier and leaves behind a hole with the same spin. See figure 2.

Figure 2: Creation of an entangled electron-hole pair. Both spins are isotropically distributed but perfectly correlated.

The hole continues its path and will eventually be annihilated by the electron from figure 1. Because of the entanglement of the spins, the remaining electron takes on the state of the annihilated electron.

Teleportation by electron-hole annihilation

Back to the earthly electron. It falls in the hole and disappears in the Fermi sea.
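The spin bookkeeping can be written out compactly (standard teleportation algebra, not taken verbatim from the paper; subscripts e, h, H denote the earthly electron, the hole, and the heavenly electron):

```latex
% Earthly electron in an unknown state; hole entangled with the heavenly electron:
\[
  \lvert \psi \rangle_{e} = \alpha \lvert \uparrow \rangle_{e}
                          + \beta  \lvert \downarrow \rangle_{e},
  \qquad
  \lvert \Phi \rangle_{hH} = \tfrac{1}{\sqrt{2}}
      \left( \lvert \uparrow \rangle_{h} \lvert \uparrow \rangle_{H}
           + \lvert \downarrow \rangle_{h} \lvert \downarrow \rangle_{H} \right).
\]
% Annihilation requires the earthly spin and the hole spin to line up,
% i.e. it projects onto the aligned component of the joint state; what
% survives the projection is
\[
  \left( \langle \uparrow_{e} \uparrow_{h} \rvert
       + \langle \downarrow_{e} \downarrow_{h} \rvert \right)
  \, \lvert \psi \rangle_{e} \lvert \Phi \rangle_{hH}
  \;\propto\;
  \alpha \lvert \uparrow \rangle_{H} + \beta \lvert \downarrow \rangle_{H},
\]
% so the heavenly electron inherits the earthly electron's state.
```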
Its quantum mechanical state was unknown and seems lost. The entanglement, however, acts like a "soul" that transfers the state from the earthly electron to the heavenly electron. Here is how it works: In order to fill the hole, the spin of the earthly electron and that of the hole have to line up. The entanglement ensures that this spin correlation is inherited by the heavenly electron. The spin of the heavenly electron is therefore no longer distributed isotropically, but has acquired the state of the earthly electron. This instantaneous transfer of a quantum mechanical state between two distant particles is what Bennett et al. have called teleportation, with a nod to StarTrek. Although teleportation is instantaneous, Einstein can rest assured: There is no instantaneous transfer of information. Since it is unpredictable when a tunneling attempt is successful, a message will need to be sent by regular (classical) means that the teleportation has happened. The distant electron cannot measure whether its state is still isotropic or not, because any measurement will destroy the quantum mechanical state itself. The instantaneous transfer of the state from one to the other electron is necessary to satisfy the no-cloning theorem from quantum mechanics: At no instant does there exist more than a single copy of the state. This state is the qubit in a quantum computer. Teleportation makes it possible in principle to transport that state from one part of the electrical circuit to the other, without having to disturb that state by a measurement. That is the long-term motivation of our research. In the short term, it would be a major breakthrough if the entanglement of the electron-hole pair could be measured. Teleportation over a distance of a few micrometers would then be the logical next step. A small step, perhaps, for Captain Kirk, but a giant step for science. A Dutch version of this article appeared in: Nederlands Tijdschrift voor Natuurkunde 70, 112 (2004). C.H.
Bennett, G. Brassard, C. Crépeau, R. Jozsa, A. Peres, W.K. Wootters, Phys.Rev.Lett. 70, 1895 (1993). D. Bouwmeester, J.-W. Pan, K. Mattle, M. Eibl, H. Weinfurter, A. Zeilinger, Nature 390, 575 (1997). C.W.J. Beenakker, M. Kindermann, Phys.Rev.Lett. 92, 056801 (2004). C.W.J. Beenakker, C. Emary, M. Kindermann, J.L. van Velsen, Phys.Rev.Lett. 91, 147901 (2003).
ORNL neutrons add advanced polarization capability for measuring magnetic materials

Understanding magnetism at its most fundamental level is vital to developing more powerful electronics, but materials with more complex magnetic structures require more complex tools for studying them--powerful tools simply referred to as "neutrons." Two of the world's most powerful sources for neutron scattering at the US Department of Energy's (DOE's) Oak Ridge National Laboratory (ORNL) are getting upgrades. Adding an advanced capability called spherical neutron polarimetry will enable researchers using ORNL's High Flux Isotope Reactor (HFIR) and Spallation Neutron Source (SNS) to make measurements of materials featuring exotic magnetic structures and quantum states that were previously inaccessible in the United States. "Neutrons are ideal for studying magnetic phenomena," said ORNL post-masters researcher Nicolas Silva. "They're electrically neutral, or have no charge, and exhibit magnetic moments, which sort of make them like tiny magnets themselves." When neutrons pass through a material and scatter off magnetic fields generated by a material's atoms, they paint an atomic portrait or even a 3D model of the material's atomic arrangement and reveal how the atoms within the system are behaving. Neutrons have a "spin," or orientation, like the north and south poles of refrigerator magnets. In a typical neutron beam, the neutrons within the beam have spins that are arranged randomly. Measuring certain highly dynamic or complex magnetic systems, however, requires more uniformity, which is provided by a polarized neutron beam in which each neutron spin is aligned in parallel and with the same orientation. "Neutron polarization filters allow us to see through the stuff we don't want to see that might be muddying up the signal we're interested in," said instrument scientist Barry Winn.
"Similar to how polarized lenses allow anglers to see fish swimming below that would be otherwise blocked by the water's reflection." Neutrons will change their spins in predictable ways when they scatter. Using a polarized beam enables researchers to better understand what's happening in a material by establishing the neutron spin before and measuring the neutron spin after the beam strikes the sample. For example, a neutron's spin could be flipped in the opposite direction during scattering. "In the US, most of the measurements we've been doing with polarized neutrons until now have been based on whether the neutron, after being scattered from the material or its magnetic field, gets rotated 180 degrees or preserves its orientation. We call that spin-flip and non-spin-flip," said Winn. "But there's a problem with that. If we get any scattering off the sample that's something other than a non-spin-flip or spin-flip--or something other than 0 and 180 degrees--then the strategy blows up in our face." The strategy works well for conventional magnetic materials such as ferromagnets and antiferromagnets, in which all the magnetic atoms are pointing either in the same direction or in alternate directions, but remain parallel to their neighbors. However, the strategy does not work for more complex magnetic structures. For example, the technique is limited when it comes to investigating exotic particles such as skyrmions--quasi-particles that exhibit chiral motion, or tangled vortices, or whirlpools of asymmetric field lines. Such particles provide exciting potential for materials used in advanced data storage and quantum computing applications. To tackle the problem, polarization scientist Peter Jiang is leading an ORNL team including Winn and Silva in a laboratory directed research and development project to develop spherical neutron polarimetry for multiple ORNL beamlines. 
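In matrix language, the point of spherical neutron polarimetry is to determine the full 3x3 relation between incident and scattered polarization vectors, P_out = T · P_in, whereas a spin-flip/non-spin-flip analysis only distinguishes the 0-degree and 180-degree cases. A toy illustration (the rotation used here is invented for the example, not real sample data):

```python
# Toy illustration of the idea behind spherical neutron polarimetry:
# the sample relates incident and scattered polarization vectors by a
# 3x3 matrix, so rotations other than 0 or 180 degrees become visible.

import math

def rotate_z(angle_deg: float):
    """3x3 rotation of the polarization vector about the z axis."""
    a = math.radians(angle_deg)
    return [[math.cos(a), -math.sin(a), 0.0],
            [math.sin(a),  math.cos(a), 0.0],
            [0.0,          0.0,         1.0]]

def apply(T, p):
    """Scattered polarization P_out = T @ P_in."""
    return [sum(T[i][j] * p[j] for j in range(3)) for i in range(3)]

p_in = [1.0, 0.0, 0.0]  # incident beam polarized along x

print([round(v, 6) for v in apply(rotate_z(180.0), p_in)])  # [-1.0, 0.0, 0.0]
print([round(v, 6) for v in apply(rotate_z(90.0), p_in)])   # [0.0, 1.0, 0.0]
```

The 180-degree case is the classic "spin-flip" signal; the 90-degree case is exactly the kind of in-between rotation that a spin-flip/non-spin-flip analysis cannot classify, but that a full polarimetry measurement resolves.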
The technology will enable neutron measurements of materials that don't conform to the traditional spin-flip and non-spin-flip domains, or, in other words, will enable researchers to see the dynamical magnetic behavior that exists in between. "The traditional techniques are not sophisticated enough to study certain complex magnetic systems," said Jiang. "Now, we're no longer restricted to spin-flips. This allows us to look at magnetic arrangements that we weren't able to figure out before." Spherical neutron polarimetry has been used in Europe, and now Jiang and the ORNL team are adapting the technology to instruments at SNS and HFIR. They're building the technology based on ongoing research conducted by Tianhao Wang, first as a graduate student at Indiana University, Bloomington, and later as a postdoctoral researcher on the ORNL team. The basic technology incorporates additional optical devices installed on both the incoming beam that hits the sample--the incident beam--and the outgoing beam that scatters off it, which enables measurements of scattered neutrons oriented in any direction. The ORNL technology builds on previous prototype designs and will offer several innovations. With the ORNL spherical neutron polarimetry devices, the scattered beam trajectory need not be in line with the incident beam but instead can be angled around the sample. "That means if the neutron doesn't experience a full flip, we can adjust the field on the other end, or move the apparatus to detect neutrons scattering in different directions," explained Silva. The team also developed two independent cooling systems to enable researchers to study how magnetic structures change as a function of temperature. The first system cools two spherical neutron polarization components located on either side of the sample to make them superconducting.
The second system introduces an extra cryostat with liquid helium auto-refilling capability that allows researchers to more easily explore materials under a range of temperatures without interfering with the temperatures required for superconductivity in the first system. Finally, the spherical neutron polarimetry devices are made with more efficient materials. Whereas previous designs use niobium for the superconducting sheets, the new design uses an yttrium-barium-copper-oxide (YBCO) that superconducts at 93 Kelvin (-292° F), a significantly higher temperature than its niobium predecessor. Additionally, the superconducting films are coupled with Mu-metal yokes that combine to shield all other magnetic fields and establish a zero field around the sample to study the materials' spins in their natural state. "Reaching superconductivity requires a significant amount of cooling power. Niobium needs to be cooled to below 10 K to maintain superconductivity, so the European designs required extensive cooling systems that had to be manually refilled with liquid helium often," said Jiang. "With the high-temperature YBCO films, we can use a single-stage closed-cycle refrigerator to cool the film to far below its critical temperature, so we're not worried about any loss in superconductivity. And, with the added liquid helium autofill system for the cryostat and the closed-cycle refrigeration system, the device will be easier to use and more efficient." What's more, the system is compact by comparison with previous systems--the high-temperature superconductors that negate the need for a large cooling system make it mobile. "If anything, there's a testament to how portable the device is. We've moved it to the nuclear reactor at the University of Missouri, then back to HFIR, and from HFIR to SNS," said Silva. 
"I've put it together and taken it apart multiple times, and each time I've found easier ways to connect the pieces--just little quality-of-life changes we're making to enhance its utility." The system has been successfully tested, wherein full polarization measurements were made using several known materials including silicon, manganese-oxide, and bismuth-iron-oxide. The team plans to implement the system at HFIR's PTAX triple axis spectrometer and the GP-SANS diffractometer, which will be optimized for the reactor's steady-state neutron beam, with full capabilities expected by the end of 2020. Subsequently, the team will develop a similar spherical neutron polarimetry device exclusively for the HYSPEC instrument at SNS which will make it the only instrument in the world that couples a super-mirror array and wide-angle capability. The device will also benefit from the unique capabilities enabled by the SNS pulsed-source accelerator. "In the meantime," said Winn, "we're going to have a workhorse in PTAX that's going to knock our socks off." HFIR and SNS are DOE Office of Science User Facilities. UT-Battelle LLC manages ORNL for the DOE Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.--by Jeremy Rumsey
Quantum computer emulated by a classical system

(Phys.org)—Quantum computers are inherently different from their classical counterparts because they involve quantum phenomena, such as superposition and entanglement, which do not exist in classical digital computers. But in a new paper, physicists have shown that a classical analog computer can be used to emulate a quantum computer, along with quantum superposition and entanglement, with the result that the fully classical system behaves like a true quantum computer. Physicist Brian La Cour and electrical engineer Granville Ott at Applied Research Laboratories, The University of Texas at Austin (ARL:UT), have published a paper on the classical emulation of a quantum computer in a recent issue of The New Journal of Physics. Besides having fundamental interest, using classical systems to emulate quantum computers could have practical advantages, since such quantum emulation devices would be easier to build and more robust to decoherence compared with true quantum computers. "We hope that this work removes some of the mystery and 'weirdness' associated with quantum computing by providing a concrete, classical analog," La Cour told Phys.org. "The insights gained should help develop exciting new technology in both classical analog computing and true quantum computing." As La Cour and Ott explain, quantum computers have been simulated in the past using software on a classical computer, but these simulations are merely numerical representations of the quantum computer's operations. In contrast, emulating a quantum computer involves physically representing the qubit structure and displaying actual quantum behavior. One key quantum behavior that can be emulated, but not simulated, is parallelism. Parallelism allows for multiple operations on the data to be performed simultaneously—a trait that arises from quantum superposition and entanglement, and enables quantum computers to operate at very fast speeds.
To emulate a quantum computer, the physicists' approach uses electronic signals to represent qubits, in which a qubit's state is encoded in the amplitudes and frequencies of the signals in a complex mathematical way. Although the scientists use electronic signals, they explain that any kind of signal, such as acoustic and electromagnetic waves, would also work. Even though this classical system emulates quantum phenomena and behaves like a quantum computer, the scientists emphasize that it is still considered to be classical and not quantum. "This is an important point," La Cour explained. "Superposition is a property of waves adding coherently, a phenomenon that is exhibited by many classical systems, including ours. "Entanglement is a more subtle issue," he continued, describing entanglement as a "purely mathematical property of waves." "Since our classical signals are described by the same mathematics as a true quantum system, they can exhibit these same properties." He added that this kind of entanglement does not violate Bell's inequality, which is a widely used way to test for entanglement. "Entanglement as a statistical phenomenon, as exhibited by such things as violations of Bell's inequality, is rather a different beast," La Cour explained. "We believe that, by adding an emulation of quantum noise to the signal, our device would be capable of exhibiting this type of entanglement as well, as described in another recent publication." In the current paper, La Cour and Ott describe how their system can be constructed using basic analog electronic components, and that the biggest challenge is to fit a large number of these components on a single integrated circuit in order to represent as many qubits as possible. Considering that today's best semiconductor technology can fit more than a billion transistors on an integrated circuit, the scientists estimate that this transistor density corresponds to about 30 qubits. 
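A toy version of the signal idea can be sketched in a few lines: the qubit amplitudes ride on two carrier tones, so "superposition" is literally two classical waves adding coherently. The carrier frequencies, sampling rate, and encoding below are illustrative assumptions, not the paper's actual scheme:

```python
import numpy as np

# Hypothetical encoding: the qubit amplitudes (alpha, beta) ride on two
# carrier tones f0 and f1, so the summed waveform is a classical analog
# of the state alpha|0> + beta|1>.
alpha, beta = 0.6, 0.8j              # normalised: |alpha|^2 + |beta|^2 = 1
f0, f1 = 1000.0, 2000.0              # carrier frequencies in Hz (arbitrary)
t = np.arange(100_000) / 100_000.0   # one second, sampled at 100 kHz

signal = alpha * np.exp(2j * np.pi * f0 * t) + beta * np.exp(2j * np.pi * f1 * t)

# Projecting the signal back onto each carrier (correlating over the
# full second) recovers the encoded amplitudes, as a receiver would.
alpha_rec = np.mean(signal * np.exp(-2j * np.pi * f0 * t))
beta_rec = np.mean(signal * np.exp(-2j * np.pi * f1 * t))
print(np.round(alpha_rec, 6), np.round(beta_rec, 6))
```

The recovery works because the two carriers are orthogonal over the integration window, which is the same reason the scientists can speak of the tones adding coherently without losing the state.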
An increase in transistor density of a factor of 1000, which according to Moore's law may be achieved in the next 20 to 30 years, would correspond to 40 qubits. This 40-qubit limit is also enforced by a second, more fundamental restriction, which arises from the bandwidth of the signal. The scientists estimate that a signal duration of a reasonable 10 seconds can accommodate 40 qubits; increasing the duration to 10 hours would only increase this to 50 qubits, and a one-year duration would only accommodate 60 qubits. Due to this scaling behavior, the physicists even calculated that a signal duration of the approximate age of the universe (13.77 billion years) could accommodate about 95 qubits, while that of the Planck time scale (10^-43 seconds) would correspond to 176 qubits.

Considering that thousands of qubits are needed for some complex quantum computing tasks, such as certain encryption techniques, this scheme clearly faces some insurmountable limits. Nevertheless, the scientists note that 40 qubits is still sufficient for some low-qubit applications, such as quantum simulations. Because the quantum emulation device offers practical advantages over quantum computers and performance advantages over most classical computers, it could one day prove very useful. For now, the next step will be building the device.

"Efforts are currently underway to build a two-qubit prototype device capable of demonstrating entanglement," La Cour said. "The enclosed photo [see above] shows the current quantum emulation device as a lovely assortment of breadboarded electronics put together by one of my students, Mr. Michael Starkey. We are hoping to get future funding to support the development of an actual chip. Leveraging quantum parallelism, we believe that a coprocessor with as few as 10 qubits could rival the performance of a modern Intel Core processor at certain computational tasks. Fault tolerance is another important issue that we are studying.
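The reported durations follow a simple doubling rule: each added qubit doubles the required signal length, because the waveform must carry 2^n amplitudes. Taking the article's 10 seconds for 40 qubits as a calibration point, a back-of-envelope check (my reading of the scaling, not the authors' formula) reproduces the age-of-the-universe figure:

```python
import math

# Assumed calibration from the article: a 10-second signal holds about
# 40 qubits, and each extra qubit doubles the required duration.
T0_SECONDS, N0_QUBITS = 10.0, 40
YEAR_SECONDS = 3.156e7

def qubit_capacity(duration_seconds):
    """Qubits representable in a signal of the given duration."""
    return N0_QUBITS + math.log2(duration_seconds / T0_SECONDS)

print(round(qubit_capacity(T0_SECONDS)))              # -> 40
print(round(qubit_capacity(10 * 3600)))               # ~52 (article: ~50)
print(round(qubit_capacity(13.77e9 * YEAR_SECONDS)))  # -> 95, age of the universe
```

The logarithm is why adding billions of years of signal buys only a few dozen extra qubits.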
Due to the similarities in mathematical structure, we believe the same quantum error correction algorithms used to make quantum computers fault tolerant could be used for our quantum emulation device as well." More information: Brian R. La Cour and Granville E. Ott. "Signal-based classical emulation of a universal quantum computer." New Journal of Physics. DOI: 10.1088/1367-2630/17/5/053017 Journal information: New Journal of Physics © 2015 Phys.org
Introduction to Quantum Computing

Guest post by Anita Ramanan, Software Development Engineer at Microsoft

Anita graduated from University College London in 2014 with an MSci in Natural Sciences: Atomic and Particle Physics and Physical Chemistry (TL;DR: Quantum Mechanics). Since then, she has been working at Microsoft and is now a Software Engineer focusing on the Internet of Things (particularly as it relates to healthcare), Xamarin, Power BI and now Quantum Computing.

The concept of quantum computing was famously discussed by Richard Feynman during his 1981 keynote delivery at the first ‘Physics of Computation’ conference (worth a read if you’re that way inclined) (Feynman, 1982). In his speech, he explored the difficulties of simulating complex quantum systems using classical computers and raised the suggestion that to accurately simulate quantum systems, we must strive to build quantum computers. Since then, the field of Quantum Computing has developed at a rapid pace, bringing us within touching distance of a true, physical realisation of a scalable quantum computer (more on this in future posts).

The most fundamental difference between a classical computer and a quantum one is the way in which the bit is realised. The bit (‘binary digit’) is the smallest possible unit of digital data. Classically, bits can only take one of two values at any one time: 0 or 1. A quantum bit (qubit), however, obeys the laws of quantum mechanics, and can therefore exist in a superposition of both states 0 and 1 simultaneously (that is, when measured, the qubit is certain to be found in one of these two states, and has a 0% chance of being found in any other state).
We are able to rewrite our wavefunction |ψ〉 like so: |ψ〉 = cos(θ/2)|0〉 + e^(iφ)sin(θ/2)|1〉. Now that we have it in this form, we can visualise this superposition of states |0〉 and |1〉 using the Bloch Sphere. Any unitary transformation we do on |ψ〉 can be visualised as simply moving the point (marked |ψ〉) around the surface of the sphere. For example, if our state were |ψ〉 = |0〉, the point would sit on the z-axis at the location marked |0〉. Sadly, this visualisation can only be used for single-qubit states, as there is no known (simple) generalisation that applies to multi-qubit systems. We will revisit the Bloch Sphere later on in this series.

A quantum computer can take advantage of superposition and entanglement* to perform certain calculations more efficiently than is believed to be possible for classical computers – for example, prime factorisation (Shor, 1997) and unstructured search problems (Grover, 1997). Furthermore, these unique properties of quantum physics offer unique new applications such as quantum cryptography (Bennett & Brassard, 1984). The next section describes the accepted requirements necessary to construct such a system.

*Superposition is the phenomenon where a quantum system exists as a probabilistic distribution of states; a single qubit can exist in a superposed state such as (|0〉 + |1〉)/√2. Entanglement requires two or more qubits (or degrees of freedom, more generally) and is what Einstein famously described as ‘spooky action at a distance’ – the concept that the perturbation of one particle can affect the state of another regardless of distance or physical separation from one another (despite not allowing faster-than-light communication). One example of an entangled state is the Bell State.

Five (Plus Two) Criteria for Quantum Computing

In 2008, David DiVincenzo published five requirements (refined from his original 1996 paper) which a system must fulfil in order to qualify as a scalable quantum computer.
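Both ideas in this section, the single qubit as a point on the Bloch Sphere and the Bell State as the canonical entangled pair, can be made concrete with a little linear algebra. A sketch (NumPy assumed; the Hadamard-then-CNOT construction is the standard recipe for a Bell state):

```python
import numpy as np

# Single-qubit state on the Bloch Sphere:
# |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>
def bloch_vector(theta, phi):
    psi = np.array([np.cos(theta / 2),
                    np.exp(1j * phi) * np.sin(theta / 2)])
    paulis = (np.array([[0, 1], [1, 0]]),     # X
              np.array([[0, -1j], [1j, 0]]),  # Y
              np.array([[1, 0], [0, -1]]))    # Z
    # Pauli expectation values give the point (x, y, z) on the sphere.
    return tuple(float(np.real(np.conj(psi) @ P @ psi)) for P in paulis)

print(bloch_vector(0.0, 0.0))  # |0> sits at the north pole: (0, 0, 1)

# A Bell state: Hadamard on one qubit, then CNOT, starting from |00>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(H, np.eye(2)) @ np.array([1, 0, 0, 0])
print(np.round(bell, 3))       # (|00> + |11>)/sqrt(2)
```

The Bell state's amplitudes cannot be factored into two independent single-qubit states, which is exactly what makes it entangled and why the Bloch Sphere picture stops at one qubit.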
These criteria will be used as a basis for discussion throughout this series of posts. I have provided a high-level summary below (for a proper discussion, please see the original paper):

1. The physical system must be scalable and the qubits must be well characterised

You must be able to ‘scale up’ the system from a single qubit to the many qubits required for complex computation. A “well characterised” qubit is one that has well-known properties and interactions with the rest of the system.

2. We must be able to repeatedly prepare the qubits in a simple starting state (such as |000…〉)

The system must be in a simple, accurately-known state at the start of computation. If you can’t repeatedly initialise the system in this simple starting state, it can’t really be considered a computer of any sort.

3. The system must survive long enough to perform operations on the qubits

For several reasons (such as interactions with external systems), it is difficult to maintain a system of qubits in a prepared state for long before they ‘decohere’ because of unwanted correlations emerging between the system and its unknown and uncontrolled surroundings. When a quantum system decoheres, the qubits, when measured, follow a classical statistical distribution of 0s and 1s rather than a quantum distribution. Once decohered, no quantum operations can be used to re-cohere the state. The time taken for our system to decohere must therefore be much longer than the time needed for gate operations.

4. We must be able to implement a ‘universal set’ of gates using our system

A ‘universal set’ contains all the gates needed to perform any quantum computation. At a minimum, we must be able to move single qubits to any position on the Bloch Sphere (using single-qubit gates), as well as introduce entanglement in the system (this requires a multi-qubit gate). For example, the Hadamard, phase, CNOT and π/8 (T) gates form a universal set, from which any quantum computation (on any number of qubits) can be generated.

5.
Measurement of specific qubits must be possible

One must be able to ‘read out’ the result of the computation by measuring the final state of specific qubits.

There are two additional requirements that refer to quantum communication – these requirements relate to quantum information processing:

1. The system must be able to reliably convert data stored in stationary (computational) qubits to networking (“flying”) qubits (e.g. photons) and back again.

2. The system must be able to reliably transmit flying qubits between specified points.

There are currently several different physical models for quantum computing in development, ranging from ion trap to photon-based to topological qubits and more. Any system developed to fulfil the role of a quantum computer must satisfy the five (plus two) criteria outlined above. A handful of these candidate systems will be explored in a later blog post, but first we must familiarise ourselves with quantum gates and circuit diagrams, which will be the topic of my next blog post. I look forward to seeing you there!

Anita’s GitHub: https://github.com/anraman
Anita’s Personal Blog: https://whywontitbuild.com/
Anita’s LinkedIn: https://www.linkedin.com/in/anitaramanan/
Microsoft Quantum: https://www.microsoft.com/quantum
Microsoft Quantum Development Kit: https://www.microsoft.com/en-us/quantum/development-kit
Microsoft Quantum Blog: https://cloudblogs.microsoft.com/quantum/
What are Quantum Computers

Truly strange and unexpected behaviors of subatomic particles have been discovered by science in the recent past – just within the last 50 years or so. Einstein’s work predicted some of this, but even Einstein couldn’t figure out if the theories were right or what it even meant IF it were true. While we don’t yet fully understand why these things happen the way they do, some brilliant people have figured out ways to leverage these behaviors for something useful. They’re building totally new kinds of supercomputers called “Quantum Computers”. These machines use these strange subatomic behaviors to conduct logical operations in a totally new way.

What’s the big deal?

When scaled up just a little, it becomes physically impossible for traditional electronic computers to compare. Google recently announced that its quantum computer just achieved something called “quantum supremacy.” It claims to have processed a calculation in just 200 seconds that would have required the world’s most powerful supercomputers over 10,000 years to crunch. While there’s a debate on the use of the “quantum supremacy” term and technicalities of the results, it is nonetheless incredibly impressive!

How do quantum computers work?

To appreciate quantum computers, we should really look at the strange stuff they work with to do the amazing feats of calculation they can accomplish. The first is called “entanglement” now, but Einstein labeled it “spooky action at a distance”. Since it is still only partially understood and even then only by a small group of physicists, let’s try to get a basic understanding from a metaphor using coins within boxes instead of subatomic bits: First, imagine you have two boxes, each with a coin inside. Opening box A, you will see the coin sitting flat in the bottom showing, let’s say “heads” up. Opening box B, you will see its coin also sitting flat in the bottom showing “heads”.
Now we’re going to put the lids on the boxes and shake them so that the coins wildly bounce around in the boxes. Set them down on the table. Open them up, and you’ll see box A now has “tails”, but box B still has “heads”. Repeat this over and over and over again any number of times and each time you open the boxes you’ll see either heads or tails with a 50/50 chance. Sometimes the coins in the boxes will match, sometimes they won’t. It’s a random probability. This is classical physics of the comfortable world we know and have learned to expect. These coins are not entangled. If they were entangled, it would go more like this: You look in box A and see heads, you look in box B and see tails. You close the boxes and shake them up as before. You open box A and see “heads”, and now open box B and see “tails”. You shake them up again. Box A shows “tails”, then box B shows “heads”. You repeat this over and over again any number of times and see that every single time the coin in box B is the opposite of box A. They NEVER show the same side. This is weird, right? So you test it. You leave box B on the table and only shake box A. When you open them, box A shows “heads” and box B shows “tails”. Close and shake box A again – leaving box B on the table. Box A shows “tails” and box B shows… “heads”! But you didn’t shake that box. It just sat there on the table. How did the coin flip?!? Maybe there’s some force field between the boxes, so you ship box B to your very good friend in Tokyo. While you’re on the phone, you both check your boxes. Yours (box A) shows “heads” and the one on the other side of the planet shows “tails”. You close and shake your box and check it again, while your friend leaves box B sitting still on the table. Now box A on your side of the planet shows “tails” and the one on the opposite side of the planet, undisturbed, shows “heads”. The thing is, you could move these boxes to opposite sides of the universe and the effect would be the same. 
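The behaviour of the entangled boxes can be mimicked with a toy sampler: quantum mechanics assigns the two-coin system probability 1/2 each to the outcomes (heads, tails) and (tails, heads), and probability 0 to the matching outcomes. A sketch (the box and coin labels are just the metaphor's):

```python
import random

# Born-rule probabilities for the anti-correlated pair, analogous to the
# singlet state (|HT> - |TH>)/sqrt(2):
#   P(H, T) = P(T, H) = 1/2,  P(H, H) = P(T, T) = 0.
def open_entangled_boxes(rng=random):
    return rng.choice([("H", "T"), ("T", "H")])

trials = [open_entangled_boxes() for _ in range(10_000)]

# Each box on its own looks like a fair coin...
heads_in_a = sum(a == "H" for a, _ in trials) / len(trials)
print(f"box A heads rate: {heads_in_a:.2f}")   # close to 0.50

# ...but the two boxes never agree.
assert all(a != b for a, b in trials)
```

Note what the sketch deliberately leaves out: a classical program needs shared randomness generated in one place, whereas the real particles show these correlations with nothing (that we know of) passing between them.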
Entangled particles behave like this, and there is no (known) physical connection between them, nor is there anything transferred between the two (again, not that we know of). The effect is instantaneous and not limited by the speed of light – which is a limit on absolutely everything else in the known universe.

The second strangeness is called “superposition”. This is where a particle can be in two opposing states simultaneously. In our box and coin analogy, it’s as if the coin is showing “heads” AND showing “tails” at the same time. So which one is it? The answer is “yes”. …until you measure it. Once you look into the box, the coin resolves into either “heads” or “tails”. The very act of measurement effects this result; however, the coins are actually both heads and tails at the same instant in time. The direct observation or measurement resolves the state of the coin into showing either “heads” or “tails”, but apart from that measurement, we can only understand how it interacts with other things. We can hear it moving inside the box, maybe even feel it moving around tapping the sides or spinning at the bottom. We know it’s in there and we know it could be “heads” or “tails”, but we don’t know which one until we look at it. Further, if the coins in box A and box B are entangled, we don’t know what either was until we look at one of them. Once we see just one, we know with certainty what both coins are showing.

If you have made it this far, congratulations – you’ve made it. There are more details to this strangeness in subatomic land, but this is enough to appreciate what is happening.

The classical type of computer we’ve used to build the internet, send a man to the moon and tweet our daily dissatisfaction with, uses electrons to change the charge of something to be on or off (“1” or “0”). We call these things bits. The original Atari you might have played Pong on was an 8-bit machine. The latest smartphone and laptop processors use 64 bits.
Each of the bits can be only one of two different options. Quantum computers use the properties of superposition and entanglement to manipulate and observe a special type of bit called a qubit (“kew-bit”), which is not just on or off – not just “1” or “0” anymore. Each qubit can exist in a weighted combination of “1” and “0” simultaneously. That gives quantum computers exponentially increased computing power.

This is tremendously useful for the types of calculations that require parallel usage of many of the bits in a computer for each step of a process. Since a quantum computer can manipulate each bit in far more than just two states (on or off, 1 or 0), it can do things in a single step that might take a classical computer many thousands or millions of steps. This means that complex problems that search a wide variety of possibilities – chemical molecular engineering, global logistics optimization, data security, cryptography, and artificial intelligence – can get a HUGE boost from this technology. Really, more accurately, it makes some of these things possible that were practically impossible even for the greatest supercomputers on the planet.

Unfortunately, this does nothing for streaming videos, sending emails, or playing video games – the things most of society uses computers for. They just don’t get any advantage from parallel processing of complex algorithms. Quantum computers are destined to have a big impact on humanity, but not by virtue of your download speeds.

What if I DO use computers to engineer molecules, encrypt data, or optimize logistics? What if I am constructing the next AI engine that will take over the world? You’re in luck! Researchers at IBM have built multiple 20-qubit quantum computer systems and made them available to the public – for free! These computers are some of the most advanced technology available to humanity.
These machines have taken millions of dollars to develop and build, and they have to be meticulously and cryogenically maintained by a highly skilled crew of technicians, and you can use them at absolutely no cost at all. It’s called the IBM Q Experience. With this program, IBM invites anyone interested to set up a free account, learn the new programming framework (called “Qiskit”) and run your very own programs on their quantum systems. Check out the IBM Q Experience here.

CDN Inc. is a product design and engineering firm that can adapt easily to your project needs; engineering, industrial design, prototyping & manufacturing.
Radio is made from atomic-scale defects in diamond

Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have made the world’s smallest radio receiver – built out of an assembly of atomic-scale defects in pink diamonds. This tiny radio — whose building blocks are the size of two atoms — can withstand extremely harsh environments and is biocompatible, meaning it could work anywhere from a probe on Venus to a pacemaker in a human heart.

The radio uses tiny imperfections in diamonds called nitrogen-vacancy (NV) centers. To make NV centers, researchers replace one carbon atom in a tiny diamond crystal with a nitrogen atom and remove a neighboring atom — creating a system that is essentially a nitrogen atom with a hole next to it. NV centers can be used to emit single photons or detect very weak magnetic fields. They have photoluminescent properties, meaning they can convert information into light, making them powerful and promising systems for quantum computing, photonics and sensing.

Radios have five basic components — a power source, a receiver, a transducer to convert the high-frequency electromagnetic signal in the air to a low-frequency current, a speaker or headphones to convert the current to sound, and a tuner. In the Harvard device, electrons in diamond NV centers are powered, or pumped, by green light emitted from a laser. These electrons are sensitive to electromagnetic fields, including the waves used in FM radio. When an NV center receives radio waves, it converts them and emits the audio signal as red light. A common photodiode converts that light into a current, which is then converted to sound through a simple speaker or headphone. An electromagnet creates a strong magnetic field around the diamond, which can be used to change the radio station, tuning the receiving frequency of the NV centers.
Shao and Loncar used billions of NV centers in order to boost the signal, but the radio works with a single NV center, emitting one photon at a time, rather than a stream of light. The radio is extremely resilient, thanks to the inherent strength of diamond. The team successfully played music at 350 degrees Celsius — about 660 Fahrenheit. “Diamonds have these unique properties,” said Loncar. “This radio would be able to operate in space, in harsh environments and even the human body, as diamonds are biocompatible.”
Try a quick experiment: Take two flashlights into a dark room and shine them so that their light beams cross. Notice anything peculiar? The rather anticlimactic answer is, probably not. That’s because the individual photons that make up light do not interact. Instead, they simply pass each other by, like indifferent spirits in the night. But what if light particles could be made to interact, attracting and repelling each other like atoms in ordinary matter? One tantalizing, albeit sci-fi possibility: light sabers — beams of light that can pull and push on each other, making for dazzling, epic confrontations. Or, in a more likely scenario, two beams of light could meet and merge into one single, luminous stream. It may seem like such optical behavior would require bending the rules of physics, but in fact, scientists at MIT, Harvard University, and elsewhere have now demonstrated that photons can indeed be made to interact — an accomplishment that could open a path toward using photons in quantum computing, if not in light sabers. In a paper published today in the journal Science, the team, led by Vladan Vuletic, the Lester Wolfe Professor of Physics at MIT, and Professor Mikhail Lukin from Harvard University, reports that it has observed groups of three photons interacting and, in effect, sticking together to form a completely new kind of photonic matter. In controlled experiments, the researchers found that when they shone a very weak laser beam through a dense cloud of ultracold rubidium atoms, rather than exiting the cloud as single, randomly spaced photons, the photons bound together in pairs or triplets, suggesting some kind of interaction — in this case, attraction — taking place among them. While photons normally have no mass and travel at 300,000 kilometers per second (the speed of light), the researchers found that the bound photons actually acquired a fraction of an electron’s mass. 
These newly weighed-down light particles were also relatively sluggish, traveling about 100,000 times slower than normal noninteracting photons. Vuletic says the results demonstrate that photons can indeed attract, or entangle each other. If they can be made to interact in other ways, photons may be harnessed to perform extremely fast, incredibly complex quantum computations. “The interaction of individual photons has been a very long dream for decades,” Vuletic says. Vuletic’s co-authors include Qi-Yung Liang, Sergio Cantu, and Travis Nicholson from MIT, Lukin and Aditya Venkatramani of Harvard, Michael Gullans and Alexey Gorshkov of the University of Maryland, Jeff Thompson from Princeton University, and Cheng Ching of the University of Chicago. Biggering and biggering Vuletic and Lukin lead the MIT-Harvard Center for Ultracold Atoms, and together they have been looking for ways, both theoretical and experimental, to encourage interactions between photons. In 2013, the effort paid off, as the team observed pairs of photons interacting and binding together for the first time, creating an entirely new state of matter. In their new work, the researchers wondered whether interactions could take place between not only two photons, but more. “For example, you can combine oxygen molecules to form O2 and O3 (ozone), but not O4, and for some molecules you can’t form even a three-particle molecule,” Vuletic says. “So it was an open question: Can you add more photons to a molecule to make bigger and bigger things?” To find out, the team used the same experimental approach they used to observe two-photon interactions. The process begins with cooling a cloud of rubidium atoms to ultracold temperatures, just a millionth of a degree above absolute zero. Cooling the atoms slows them to a near standstill. 
Through this cloud of immobilized atoms, the researchers then shine a very weak laser beam — so weak, in fact, that only a handful of photons travel through the cloud at any one time. The researchers then measure the photons as they come out the other side of the atom cloud. In the new experiment, they found that the photons streamed out as pairs and triplets, rather than exiting the cloud at random intervals, as single photons having nothing to do with each other. In addition to tracking the number and rate of photons, the team measured the phase of photons, before and after traveling through the atom cloud. A photon’s phase indicates its frequency of oscillation. “The phase tells you how strongly they’re interacting, and the larger the phase, the stronger they are bound together,” Venkatramani explains. The team observed that as three-photon particles exited the atom cloud simultaneously, their phase was shifted compared to what it was when the photons didn’t interact at all, and was three times larger than the phase shift of two-photon molecules. “This means these photons are not just each of them independently interacting, but they’re all together interacting strongly.” The researchers then developed a hypothesis to explain what might have caused the photons to interact in the first place. Their model, based on physical principles, puts forth the following scenario: As a single photon moves through the cloud of rubidium atoms, it briefly lands on a nearby atom before skipping to another atom, like a bee flitting between flowers, until it reaches the other end. If another photon is simultaneously traveling through the cloud, it can also spend some time on a rubidium atom, forming a polariton — a hybrid that is part photon, part atom. Then two polaritons can interact with each other via their atomic component. At the edge of the cloud, the atoms remain where they are, while the photons exit, still bound together. 
The researchers found that this same phenomenon can occur with three photons, forming an even stronger bond than the interactions between two photons. “What was interesting was that these triplets formed at all,” Vuletic says. “It was also not known whether they would be equally, less, or more strongly bound compared with photon pairs.” The entire interaction within the atom cloud occurs over a millionth of a second. And it is this interaction that triggers photons to remain bound together, even after they’ve left the cloud. “What’s neat about this is, when photons go through the medium, anything that happens in the medium, they ‘remember’ when they get out,” Cantu says. This means that photons that have interacted with each other, in this case through an attraction between them, can be thought of as strongly correlated, or entangled — a key property for any quantum computing bit. “Photons can travel very fast over long distances, and people have been using light to transmit information, such as in optical fibers,” Vuletic says. “If photons can influence one another, then if you can entangle these photons, and we’ve done that, you can use them to distribute quantum information in an interesting and useful way.” Going forward, the team will look for ways to coerce other interactions such as repulsion, where photons may scatter off each other like billiard balls. “It’s completely novel in the sense that we don’t even know sometimes qualitatively what to expect,” Vuletic says. “With repulsion of photons, can they be such that they form a regular pattern, like a crystal of light? Or will something else happen? It’s very uncharted territory.” This research was supported in part by the National Science Foundation.
Source: http://news.mit.edu/2018/physicists-create-new-form-light-0215
Nothing is more frustrating than watching that circle spinning in the centre of your screen, while you wait for your computer to load a programme or access the data you need. Now a team from the Universities of Sheffield and Leeds may have found the answer to faster computing: sound. The research – published in Applied Physics Letters – has shown that certain types of sound waves can move data quickly, using minimal power. The world's 2.7 zettabytes (2.7 × 10²¹ bytes) of data are mostly held on hard disk drives: magnetic disks that work like miniaturised record players, with the data read by sensors that scan over the disk's surface as it spins. But because this involves moving parts, there are limits on how fast it can operate. For computers to run faster, we need to create "solid-state" drives that eliminate the need for moving parts – essentially making the data move, not the device on which it's stored. Flash-based solid-state disk drives have achieved this, and store information electrically rather than magnetically. However, while they operate much faster than normal hard disks, they last much less time before becoming unreliable, are much more expensive and still run much slower than other parts of a modern computer – limiting total speed. Creating a magnetic solid-state drive could overcome all of these problems. One solution being developed is 'racetrack memory', which uses tiny magnetic wires, each one hundreds of times thinner than a human hair, down which magnetic "bits" of data run like racing cars around a track. Existing research into racetrack memory has focused on using magnetic fields or electric currents to move the data bits down the wires. However, both these options create heat and reduce power efficiency, limiting battery life and increasing energy bills and CO2 emissions.
Dr Tom Hayward from the University of Sheffield and Professor John Cunningham from the University of Leeds have together come up with a completely new solution: passing sound waves across the surface on which the wires are fixed. They also found that the direction of data flow depends on the pitch of the sound generated – in effect they "sang" to the data to move it. The sound used is in the form of surface acoustic waves – the same as the most destructive wave that can emanate from an earthquake. Although already harnessed for use in electronics and other areas of engineering, this is the first time surface acoustic waves have been applied to a data storage system.
Source: https://www.innovationtoronto.com/2015/11/the-solution-to-faster-computing-sing-to-your-data/?responsive=true
This is the era of Artificial Intelligence (AI). Every industry is trying to automate processes and predict the market and future demands using AI, Machine Learning, Deep Learning, and a clutch of new technologies. To be able to differentiate between all these much-hyped terms, let us understand what each of them stands for. So, what is Artificial Intelligence? Techopedia says that Artificial Intelligence is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans. Some of the activities AI is designed for are learning, planning, speech recognition, and problem-solving. Of course, machines can work and react like humans only if they have access to abundant information. AI should have access to categories, objects, properties, and the relationships between them to be able to initiate reasoning, make decisions, and plan to act. All the processes of rationalizing, categorizing, and training the machines to make human-like decisions and act accordingly are made possible by the combination of Machine Learning, Deep Learning, convolutional neural networks, and related algorithms. So, AI is a superset of all the other terms. Each of these terms refers to a specific application of AI. Each is equally important for AI to work with high efficiency and accuracy. Now, let us look at what the terms Machine Learning and Deep Learning mean. Machine Learning is an application of AI that uses data analytics techniques and computational methods to "learn" information directly from data without relying on a predetermined equation as a model. Machine Learning algorithms can automatically learn and improve from experience without being explicitly programmed. These algorithms learn from labeled data, then use what they have learned to produce outputs for new sets of data. Machine Learning develops computer programs that can access data and self-learn.
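As a minimal illustration of "learning directly from data without a predetermined equation," here is a from-scratch sketch of a one-nearest-neighbour classifier (not any particular library's API; the spam-filter feature values are invented for illustration). It labels a new point by copying the label of its closest training example:

```python
import math

def nearest_neighbor_predict(train, point):
    """Predict a label for `point` by copying the label of the closest
    training example: the model is the data itself, not an equation."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(train, key=lambda ex: dist(ex[0], point))[1]

# Labeled examples: (feature vector, label). Features here are a toy
# (message length, link count) pair for a spam filter; values are invented.
training_data = [
    ((120.0, 9.0), "spam"),
    ((150.0, 7.0), "spam"),
    ((40.0, 0.0), "ham"),
    ((55.0, 1.0), "ham"),
]

print(nearest_neighbor_predict(training_data, (130.0, 8.0)))  # spam
print(nearest_neighbor_predict(training_data, (50.0, 0.0)))   # ham
```

Adding more labeled examples changes the predictions without changing a line of code, which is the sense in which the program "improves from experience without being explicitly programmed."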
Some real-world examples of Machine Learning are virtual personal assistants, video surveillance, email spam and malware filtering, and online customer support. Deep Learning, on the other hand, is a subset of Machine Learning that is capable of learning from massive volumes of unsupervised data, which may be unstructured or unlabeled. It is also termed Deep Neural Learning or Deep Neural Networks. Some examples of Deep Learning at work are autonomous vehicles, image processing, etc. Deep Learning allows us to train an AI by giving it a set of inputs and predicting the output. The AI is trained using both supervised and unsupervised learning. Academic publications describe Deep Learning as using multiple layers to progressively extract higher-level features from the raw input. For example, in image processing, lower layers identify the dimensions of the image, while higher layers identify whether the object is a letter, a human face, or an animal. Deep Learning has been significantly successful for two reasons. One reason is that a deep neural network (DNN) has the capacity to store information from large data sets. The other reason is that many Machine Learning algorithms can suffer from bottlenecks when it comes to creating features. Features are the input parameters of the training examples that enable a particular Machine Learning algorithm to learn from the data. So, we can conclude that Machine Learning is a subset of AI and Deep Learning is a subset of Machine Learning. It is important to understand how Artificial Intelligence, Machine Learning, and Deep Learning relate to each other and simulate human intelligence. It is also key to know how they incrementally build on each other. Each of them has different data requirements, levels of complexity, transparency, and limitations. They also differ in the types of problems they can solve and in the skill required to get a specific model up and running.
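The "layers extracting progressively higher-level features" idea can be caricatured in a few lines. The sketch below is a two-layer network whose weights are hand-chosen rather than learned (real deep networks learn millions of weights from data): the hidden layer computes intermediate features (OR and AND of the inputs), and the output layer combines them into a higher-level decision, XOR.

```python
def step(x):
    # Binary threshold activation: fires (1) if the weighted sum is positive.
    return 1 if x > 0 else 0

def tiny_deep_net(x1, x2):
    """A two-layer network with hand-chosen weights that computes XOR.
    The hidden layer extracts intermediate features; the output layer
    combines them, a miniature of layered feature extraction."""
    h_or = step(x1 + x2 - 0.5)       # hidden feature 1: x1 OR x2
    h_and = step(x1 + x2 - 1.5)      # hidden feature 2: x1 AND x2
    return step(h_or - h_and - 0.5)  # output: OR but not AND, i.e. XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", tiny_deep_net(a, b))  # prints 0, 1, 1, 0
```

No single-layer network can compute XOR with a threshold unit, which is the classic small-scale argument for why depth buys expressive power.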
Even though they solve different business case problems, these three terms are tightly linked. Choosing which one to use for a particular scenario is driven by various factors. For example, the first parameter of interest will be the amount of data available for use and the performance of the model when that data is scaled up. With an increase in data volume, the parameters get tuned well and any bias in the model gets reduced. In another instance, suppose we want to analyze data on a day-to-day basis; say, the stock market for day traders. Machine Learning models will perform better than Deep Learning models in such scenarios, where the amount of data is smaller. So, there is no distinct line where one stops and the other takes over. The advances made by researchers at DeepMind, Google Brain, OpenAI, and various universities are startling. AI can now solve problems that humans can't. And AI is changing faster than can be imagined. The power of AI grows with the power of computational hardware and with advances in computational capacity like quantum computing or higher-quality chips. Interestingly, the simulation of human intelligence (sometimes called Machine Intelligence) is a combination of all three terms working together. When they come together, they enable machines to predict, classify, learn, plan, reason, and/or perceive like humans.
Source: https://blog.calsoftinc.com/2020/02/ai-machine-learning-and-deep-learning-whats-same-whats-different.html
Viewpoint: Unlocking the Hidden Information in Starlight A provocative new result by Mankei Tsang, Ranjith Nair, and Xiao-Ming Lu of the National University of Singapore suggests that a long-standing limitation to the precision of astronomical imaging, the Rayleigh criterion, proposed in 1879, is itself only an apparition. Using quantum metrology techniques, the researchers have shown that two uncorrelated point-like light sources, such as stars, can be discriminated to arbitrary precision even as their separation decreases to zero. Quantum metrology, a field that has existed since the late 1960s with the pioneering work of Carl Helstrom, is a peculiar hybrid of quantum mechanics and the classical estimation theory developed by statisticians in the 1940s. The methodology is a powerful one, quantifying resources needed for optimal estimation of elementary variables and fundamental constants. These resources include preparation of quantum systems in a characteristic (entangled) state, followed by judiciously chosen measurements, from which a desired parameter, itself not directly measurable, may be inferred. In the context of remote sensing, for example, in the imaging of objects in the night sky, the ability to prepare a physical system in an optimal state does not exist. In the case of starlight, the typical assumption is that the source is classical thermal light, the state of maximum entropy or "uninformativeness." Imaging such sources is plagued by the limits of diffraction when the objects are in close proximity. The wave-like nature of light causes it to spread as it moves through space, bending around obstacles, for example when traversing a telescope aperture. This results in a diffraction pattern described by a so-called point spread function (PSF) in the image plane.
The Rayleigh criterion states that two closely spaced objects are just resolvable—that is, discernable from one another—when the center of the diffraction pattern, or peak of the PSF, of one object is directly over the first minimum of the diffraction pattern of the other. Roughly, the PSF maxima must be farther apart than their widths (Fig. 1). Some astronomers say they are able to resolve objects that are slightly closer than the Rayleigh limit allows. Yet inevitably, as the angular separation between the objects decreases, the information that can be obtained about that separation using direct detection becomes negligible, and even the most optimistic astronomer, utilizing the most sophisticated signal-processing techniques, must admit defeat. Correspondingly, as the separation approaches zero, the minimum error on any unbiased estimation of the separation blows up to infinity, which has limited angular resolution in imaging since the time of Galileo. Typically, the mean-squared error on the estimation of a parameter scales with the number of repeated measurements or data points, N, as 1/N. Even for a large error per measurement, any desired precision is attained by taking multiple data points. When, however, the lower bound on direct estimation of the separation is divergent because of the Rayleigh limit, the 1/N factor makes no impact. This is what Tsang and collaborators call Rayleigh's curse. Using a quantum metrology formalism to minimize the estimation error, the initial achievement of their work has been to show that there is no fundamental obstacle to the estimation of the separation of two PSFs in one dimension (that is, for sources that sit on a line). As the separation of two PSFs decreases to zero, the amount of obtainable information stays constant.
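A toy numerical sketch of the criterion (not from the paper; it uses Gaussian stand-ins for the PSFs, a shape the authors also assume later for SPADE): two point sources are discernible by direct imaging only while a dip survives between their summed peaks, and for Gaussians of width sigma the midpoint dip vanishes once the separation drops below 2*sigma.

```python
import math

def combined_psf(x, separation, sigma=1.0):
    """Summed intensity of two Gaussian point-spread functions
    centered at +/-(separation/2), in arbitrary units."""
    def g(center):
        return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))
    return g(separation / 2) + g(-separation / 2)

def has_central_dip(separation, sigma=1.0):
    # Discernible in the Rayleigh sense: the midpoint intensity is
    # lower than the intensity at a peak position. For Gaussians this
    # flips from True to False as separation falls below 2*sigma.
    return combined_psf(0.0, separation, sigma) < combined_psf(separation / 2, separation, sigma)

print(has_central_dip(3.0))  # True: two peaks visible
print(has_central_dip(1.0))  # False: blended into a single blob
```

Below the threshold, no amount of staring at the blended blob recovers the separation directly, which is the regime where Rayleigh's curse bites.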
This discovery is nicely summed up by Tsang, who says we should apologize to the starlight "as it travels billions of light years to reach us, yet our current technology and even our space telescopes turn out to be wasting a lot of the information it carries." It could be suggested that this is merely a theoretical proof; the quantum metrology formalism indicates that there is always an optimal measurement, which minimizes the estimation error for the separation parameter. Paradoxically, this optimal measurement can, however, depend on the value of the parameter. To obviate such concerns, Tsang and his colleagues propose a strategy, based on state-of-the-art quantum optics technology, that produces a minimal error in the estimation of the separation variable—counterintuitively, this error remains constant for all separation values, under the assumption that the PSFs have a Gaussian shape. The method, which the authors call spatial mode demultiplexing (SPADE), splits the light from the two sources into optical waveguides that have a quadratic refractive-index lateral profile. Mathematically, this SPADE measurement decomposes the overlapping PSFs (a real function in one dimension) into the complete basis of Hermite functions, just as a Fourier transform provides a decomposition of a real function into a superposition of sine and cosine terms. A posteriori, one may be tempted to use intuition to explain why this Hermite basis measurement seems not to suffer Rayleigh's curse, but then again, were intuition forthcoming, the result may not have been hidden from view for so long. (This elusiveness relates to subtleties in the estimation of a single parameter extracted from the joint statistics of two incoherent light sources.) One minor caveat of the approach is that full imaging of two point sources at positions x₁ and x₂ requires estimation of both separation and centroid parameters. SPADE is only optimal when the centroid parameter is already known to high precision.
Centroid estimation, however, has no equivalent analog to the Rayleigh curse; it may be made via direct imaging. Errors can be reduced appropriately via the 1/N factor for N data points with N much greater than 1. A second detail worth pondering is that this result utilized techniques from the quantum domain to reveal a classical result. (All of the physical assumptions about starlight admit a classical model.) The quantum metrology formalism has been used to optimally estimate a parameter, but no quantum correlations exist in the system for any value of that parameter, that is, for any angular separation of two stars. When no quantum correlations are present, the formalism will still indicate the best possible measurement strategy and the smallest achievable estimation error. An added blessing of quantum metrology is that it allows the development of generalized uncertainty relationships, for example between temperature and energy for a system at equilibrium, or photon number and path-length difference between the two arms of an interferometer. The result of Tsang and his colleagues can be presented as another type of generalized uncertainty, between source separation and "momentum." The mean-squared error associated with separation estimation scales inversely with the momentum (Fourier) space variance of the overlapping PSFs. Regarding impact on the field, the authors' study produced a flurry of generalizations and other experimental proposals. During the past six months there have been four proof-of-principle experiments, first in Singapore by Tsang's colleague Alex Ling and collaborators, and then elsewhere in Canada and Europe [7–9]. A subsequent theory paper from researchers at the University of York extends Tsang and colleagues' theory result, which was for incoherent thermal sources such as starlight, to any general quantum state existing jointly between the two sources.
This work exploits the roles of squeezing (of quantum fluctuations) and of quantum entanglement to improve measurement precision, extending applicability to domains in which control of the source light is possible, such as microscopy. Tsang and his colleagues have provided a new perspective on the utility of quantum metrology, and they have reminded us that even in observational astronomy—one of the oldest branches of science—there are (sometimes) still new things to be learned, at the most basic level. This research is published in Physical Review X. - M. Tsang, R. Nair, and X.-M. Lu, “Quantum Theory of Superresolution for Two Incoherent Optical Point Sources,” Phys. Rev. X 6, 031033 (2016). - L. Rayleigh, “XXXI. Investigations in Optics, with Special Reference to the Spectroscope,” Philos. Mag. 8, 261 (1879). - C. W. Helstrom, “Resolution of Point Sources of Light as Analyzed by Quantum Detection Theory,” IEEE Trans. Inf. Theory 19, 389 (1973). - Private Communication. - B. Mandelbrot, “On the Derivation of Statistical Thermodynamics from Purely Phenomenological Principles,” J. Math. Phys. 5, 164 (1964). - T. Z. Sheng, K. Durak, and A. Ling, “Fault-Tolerant and Finite-Error Localization for Point Emitters Within the Diffraction Limit,” arXiv:1605.07297. - F. Yang, A. Taschilina, E. S. Moiseev, C. Simon, and A. I. Lvovsky, “Far-Field Linear Optical Superresolution via Heterodyne Detection in a Higher-Order Local Oscillator Mode,” arXiv:1606.02662. - W. K. Tham, H. Ferretti, and A. M. Steinberg, “Beating Rayleigh’s Curse by Imaging Using Phase Information,” arXiv:1606.02666. - M. Paur, B. Stoklasa, Z. Hradil, L. L. Sanchez-Soto, and J. Rehacek, “Achieving Quantum-Limited Optical Resolution,” arXiv:1606.08332. - C. Lupo and S. Pirandola, “Ultimate Precision Bound of Quantum and Sub-Wavelength Imaging,” arXiv:1604.07367.
Source: https://physics.aps.org/articles/v9/100
1: The Strangest Force Begin your exploration of gravity with Isaac Newton and the famous story of the apple. Why was it such a breakthrough to connect a falling apple with the faraway moon? Review the essential characteristics of gravity and learn why small asteroids and large planets have such different shapes. 2: Free Fall and Inertia Review three great discoveries by the "grandfather" of gravity research, Galileo Galilei. His most famous experiment may never have happened, but his principle of inertia, law of free fall, and principle of relativity are the basis for everything that comes later in the science of gravity-including key breakthroughs by Einstein. 3: Revolution in the Heavens Drawing on ideas and observations of Nicolaus Copernicus and Tycho Brahe, Johannes Kepler achieved a great insight about gravity by discovering three laws of planetary motion, relating to the mathematics of orbits. The cause of planetary motion, he determined, must lie in the sun. 4: Universal Gravitation See how Newton was able to finish Kepler's revolution by formulating the law of universal gravitation, which says that every object exerts an attractive force on every other object. Also explore Newton's related discovery of the three laws of motion, which underlie the science of mechanics. 5: The Art of Experiment Learn how distances in the solar system were first determined. Then chart Henry Cavendish's historic experiment that found the value of Newton's gravitational constant. Cavendish's work allows almost everything in the universe to be weighed. Then see a confirmation of the equivalence principle, which says that gravitational and inertial mass are identical. 6: Escape Velocity, Energy, and Rotation Begin the first of several lectures that dig deeper into Newton's laws than Newton himself was able to go. In this lecture, apply the key concepts of energy and angular momentum to study how gravity affects motion. 
As an example, use simple algebra to calculate the escape velocity from Earth. 7: Stars in Their Courses-Orbital Mechanics Newton was the first to realize that objects could, in theory, be sent into orbit around Earth. Explore how this works in practice, using the ideas of energy and angular momentum to study how satellites, moons, planets, and stars move through space. 8: What Are Tides? Earth and Beyond Trace the origin of tides to the simple fact that gravity varies from point to point in space. This leads not just to the rise and fall of the ocean, but to the gradual slowing of Earth's rotation, Saturn's spectacular ring system, volcanoes on Jupiter's moon Io, and many other phenomena. 9: Nudge-Perturbations of Orbits For the next three lectures, study the effects of gravity on the motions of more than two bodies. Here, see how even very small orbital changes-small perturbations-are significant. Such effects have revealed the presence of unknown planets, both in our own solar system and around other stars. 10: Resonance-Surprises in the Intricate Dance Resonance happens whenever a small periodic force produces a large effect on a periodic motion-for example, when you push a child on a swing. Learn how resonance due to gravitational interactions between three bodies can lead to amazing phenomena with planets, asteroids, and rings of planets. 11: The Million-Body Problem Consider the problem of gravitational interactions between millions of bodies, such as the countless stars in a galaxy. Amazingly, mathematics can reveal useful information even in these complicated cases. Discover how the analysis of the motions of galaxies led to the prediction of dark matter. 12: The Billion-Year Battle Explore the physics of stars, which are balls of gas in a billion-year battle between the inward pull of gravity and the outward pressure produced by nuclear fusion. 
Follow this story to its ultimate finish-the triumph of gravity in massive stars that end their lives as black holes. 13: From Forces to Fields For the rest of the course, focus on the revolutionary view of gravitation launched by Albert Einstein. Review new ideas about fields that allowed physics to extend beyond Newtonian mechanics. Then see how Einstein modified Newton's laws and created the special theory of relativity. 14: The Falling Laboratory Einstein focused on gravity in his general theory of relativity. Hear about his "happiest thought"-the realization that a man in free fall perceives gravity as zero. This simple insight resolved a mystery going all the way back to Newton and led Einstein to the startling discovery that gravity affects light and time. 15: Spacetime in Zero Gravity In an influential interpretation of relativity, Einstein's former mathematics professor Hermann Minkowski reformulated the theory in terms of four-dimensional geometry, which he called spacetime. Learn how to plot events in this coordinate system in cases where gravity is zero. 16: Spacetime Tells Matter How to Move See how gravity affects Minkowski's spacetime geometry, discovering that motion in a gravitational field follows the straightest path in curved spacetime. The curvature in spacetime is not caused by gravity; it is gravity. This startling idea is the essence of Einstein's general theory of relativity. 18: Light in Curved Spacetime See how Einstein's general theory of relativity predicts the bending of light in a gravitational field, famously confirmed in 1919 by the British scientist Arthur Eddington. Learn how this phenomenon creates natural gravitational lenses-and how the bending of light reveals invisible matter in deep space. 19: Gravitomagnetism and Gravitational Waves The general theory of relativity predicts new phenomena of gravity analogous to those of electromagnetism. 
Discover how ultra-sensitive experiments have detected the gravitomagnetism of the Earth, and follow the search for elusive gravitational waves that travel through space. 20: Gravity's Horizon-Anatomy of a Black Hole Plunge into the subject of black holes, which are massive objects that have collapsed completely under their own gravity. Learn how black holes distort spacetime and explore the supermassive black holes that lie at the hearts of galaxies. Then ask: Are there such things as micro-black holes? 21: Which Universe Is Ours? Investigate what Einstein called his "greatest mistake"-his rejection of his own theory's prediction that spacetime should be dynamic and evolving. Chart the work of a group of scientists, including Alexander Friedman, Georges Lemaître, and Edwin Hubble, who advanced the realization that our universe is expanding from an apparent big bang. 22: Cosmic Antigravity-Inflation and Dark Energy Using everything you've learned about gravity, investigate cosmic antigravity, starting with cosmic inflation, a phenomenon that exponentially increased the size of the universe during the big bang. Then, learn why dark matter cannot be made of ordinary protons and neutrons, and explore the recent discovery that the expansion of the universe is accelerating, powered by a mysterious dark energy inh... 23: The Force of Creation Use a black hole to test the laws of thermodynamics, taking a deeper look at the capacity of gravity to pull matter together and increase entropy at the same time. Probe Stephen Hawking's most surprising discovery, and then learn that the same force that pulls the apple down and steers the stars in their courses is also nature's ultimate source of order and complexity. 24: The Next Revolution Survey the greatest unsolved problem in theoretical physics: the search for a quantum theory of gravity. Examine string theory, loop quantum gravity, and also entropic gravity, which suggests a revolutionary link with thermodynamics. 
Close the course with a deepened appreciation for the connection between everyday features of gravity and the most exciting questions in contemporary physics and cosm... Gravity is about both phenomena near at hand at the human scale, everyday and intuitive, and phenomena far off at an astronomical scale. About Benjamin Schumacher Dr. Benjamin Schumacher is Professor of Physics at Kenyon College, where he has taught for 20 years. He received his Ph.D. in Theoretical Physics from The University of Texas at Austin in 1990. Professor Schumacher is the author of numerous scientific papers and two books, including Physics in Spacetime: An Introduction to Special Relativity. As one of the founders of quantum information theory, he introduced the term qubit, invented quantum data compression (also known as Schumacher compression), and established several fundamental results about the information capacity of quantum systems. For his contributions, he won the 2002 Quantum Communication Award, the premier international prize in the field, and was named a Fellow of the American Physical Society. Besides working on quantum information theory, he has done physics research on black holes, thermodynamics, and statistical mechanics. Professor Schumacher has spent sabbaticals working at Los Alamos National Laboratory and as a Moore Distinguished Scholar at the Institute for Quantum Information at California Institute of Technology. He has also done research at the Isaac Newton Institute of Cambridge University, the Santa Fe Institute, the Perimeter Institute, the University of New Mexico, the University of Montreal, the University of Innsbruck, and the University of Queensland.
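The escape-velocity example highlighted in Lecture 6 really does reduce to simple algebra: setting kinetic energy equal to gravitational binding energy, (1/2)mv² = GMm/R, gives v = sqrt(2GM/R). A quick sketch in Python, using standard rounded values for the gravitational constant and Earth's mass and radius:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of Earth, kg
R_EARTH = 6.371e6    # mean radius of Earth, m

def escape_velocity(mass, radius):
    """Speed needed to coast away from a body's surface to infinity:
    v = sqrt(2*G*M/R), from (1/2)mv^2 = G*M*m/R."""
    return math.sqrt(2 * G * mass / radius)

v = escape_velocity(M_EARTH, R_EARTH)
print(f"Escape velocity from Earth: {v / 1000:.1f} km/s")  # about 11.2 km/s
```

The mass of the escaping object cancels out, which is the equivalence principle from Lecture 5 showing up in one line of algebra.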
Source: https://www.thegreatcoursesplus.com/black-holes-tides-and-curved-spacetime-understanding-gravity
Modern computers needed just weeks to correctly solve models that took human theoretical physicists six years to figure out. OKINAWA, Japan — Machine learning, or the ability of AI systems and computers to learn and improve from experience, has made some incredible leaps and bounds in recent years and is already starting to make its way into various industries, oftentimes completely reinventing what would have been considered impossible a few decades ago. One such example would be the growing popularity and prevalence of self-driving cars. AI systems have also recently made headlines for besting the top-ranked human chess players in the world, or solving a Rubik's cube in an absurdly short amount of time. Now, an international study conducted at the Okinawa Institute of Science and Technology Graduate University finds that modern computers can also solve complex scientific problems just as accurately as human theoretical physicists — only much, much faster. It took such physicists six years to identify unusual magnetic phases within what's known as a pyrochlore model. But, with the help of a machine, scientists were able to accomplish the same feat in a matter of weeks! "This feels like a really significant step," says Professor Nic Shannon, leader of the Theory of Quantum Matter (TQM) Unit at OIST, in a release. "Computers are now able to carry out science in a very meaningful way and tackle problems that have long frustrated scientists." Every single atom within a magnet is associated with a tiny magnetic moment, usually called a "spin." In typical magnets, such as the ones that are in all likelihood stuck on your fridge right now, these spins are ordered so that they all point in a singular direction. It's this corresponding pattern that results in a strong magnetic field. This same phenomenon applies to solid materials as well: all the atoms in a particular object are ordered in one direction.
However, just as matter can exist as a solid, a liquid, or a gas, magnetic substances can take on different phases too. Researchers in the quantum matter unit focus on especially unusual magnetic phases called “spin liquids.” Within these spin liquids, there are competing, or “frustrated,” interactions between individual spins, so the spins constantly fluctuate in direction, a disorder similar to that seen in liquid phases of matter. In the future, these phases may prove incredibly useful in quantum computing. The research team wanted to discover which of these spin liquids could exist in frustrated pyrochlore magnets. To start, they built a phase diagram illustrating how different phases could occur as spins interacted in various ways while temperatures fluctuated. The diagram was completed in 2017, but actually reading the illustration and using it to identify the rules governing the interactions between spins proved an incredibly difficult and long task. “These magnets are quite literally frustrating,” quips Professor Shannon. “Even the simplest model on a pyrochlore lattice took our team years to solve.” So, the research team decided to see if computers could help. “To be honest, I was fairly sure that the machine would fail,” Professor Shannon says. “This is the first time I’ve been shocked by a result – I’ve been surprised, I’ve been happy, but never shocked.” The researchers collaborated with machine learning experts from the University of Munich who had already developed a way to represent spin configurations in a computer. This innovation was then combined with a machine capable of categorizing complex data into different groups. 
“The advantage of this type of machine is that unlike other support vector machines, it doesn’t require any prior training and it isn’t a black box – the results can be interpreted. The data are not only classified into groups; you can also interrogate the machine to see how it made its final decision and learn about the distinct properties of each group,” says Dr. Ludovic Jaubert, a CNRS researcher at the University of Bordeaux. The machine was provided with 250,000 spin configuration variations. Remarkably, without being given any information on which phases were present, the machine successfully created an identical replication of the phase diagram. Most importantly, when the research team looked into how the machine had classified the different types of spin liquid, they discovered that it had calculated the exact mathematical equations representing each phase. A remarkable achievement that would have taken a team of humans years to accomplish was completed by the machine within a matter of weeks. “Most of this time was human time, so further speed-ups are still possible,” said Prof. Pollet, one of the machine learning experts from the University of Munich. “Based on what we now know, the machine could solve the problem in a day.” “We are thrilled by the success of the machine, which could have huge implications for theoretical physics,” added Prof. Shannon. “The next step will be to give the machine an even more difficult problem, that humans haven’t managed to solve yet, and see whether the machine can do better.” The study is published in Physical Review B.
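The classification step described above can be illustrated with a toy sketch. This is a simplification for intuition only, not the interpretable support vector machine the researchers actually used: we generate spin configurations in an ordered, ferromagnet-like phase and a disordered, "liquid"-like phase, then learn a threshold on the average magnetization that separates them — the kind of distinguishing feature the real machine extracted as exact equations.

```python
import random
from statistics import mean

def spin_config(n, ordered):
    """Generate n Ising spins (+1/-1): mostly aligned if ordered, random if not."""
    if ordered:
        return [1 if random.random() > 0.1 else -1 for _ in range(n)]
    return [random.choice([1, -1]) for _ in range(n)]

def magnetization(spins):
    """Order parameter: |average spin| is near 1 when ordered, near 0 when not."""
    return abs(sum(spins)) / len(spins)

random.seed(0)
train = [(magnetization(spin_config(200, o)), o)
         for o in (True, False) for _ in range(100)]

# "Learn" the phase boundary: midpoint between the two class means.
threshold = (mean(m for m, o in train if o) +
             mean(m for m, o in train if not o)) / 2

def classify(spins):
    """True = ordered phase, False = disordered phase."""
    return magnetization(spins) > threshold
```

Unlike this hand-picked magnetization feature, the machine described in the article discovered the relevant order parameters for each phase on its own, which is what made the result so striking.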
What is Quantum Computing Quantum Computing Introduction Quantum computing is based on the principles of quantum physics. In physics, a quantum is the minimum amount of any physical entity involved in an interaction (source: Wikipedia). Quantum computing has intrigued and fascinated us for decades and still remains elusive. While technology has advanced by leaps and bounds, quantum computing is still in its infancy. However, at the rate we are advancing scientifically, quantum computers will likely become commercially viable and may eventually replace the classical computers of today, though it will take time. Quantum computers make use of the principles of quantum mechanics (superposition and quantum entanglement – more on this later) to achieve the speed and efficiency they are well known for. While classical computers use transistors as their building blocks, quantum computers are built on qubits, the fundamental storage units of quantum information. Quantum Computers versus classical computers The classical equivalent of the qubit is the bit, which can be either 0 or 1 at a time; a qubit, by contrast, can be 0, 1, or both at a time, giving a quantum computer additional processing capability and hence speed. In classical computers, a bit represents one of two possible discrete voltage levels (HIGH = 1 or LOW = 0) in a digital circuit, stored by a flip-flop, a binary storage device made of transistors. It’s the flip-flop that stores binary data in classical computers. In quantum computers, a qubit is the atomic unit of data, which stores information and exists in superposition: it can be 0, 1, or both at the same time. A qubit can be realized with an electron, photon, ion, or atom; together with their respective control devices working in tandem, qubits act as registers called qubit registers and quantum processors. 
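The bit-versus-qubit distinction can be made concrete with a small simulation. The sketch below is plain Python with illustrative names (not any real quantum SDK): it represents a qubit as two complex amplitudes and shows that measurement yields 0 or 1 with probabilities given by the squared amplitudes.

```python
import random

class Qubit:
    """Toy model of a single qubit as two complex amplitudes (alpha, beta)."""

    def __init__(self, alpha, beta):
        # Normalize so |alpha|^2 + |beta|^2 == 1, as quantum mechanics requires.
        norm = (abs(alpha) ** 2 + abs(beta) ** 2) ** 0.5
        self.alpha = alpha / norm
        self.beta = beta / norm

    def probabilities(self):
        """Probability of measuring 0 and of measuring 1."""
        return abs(self.alpha) ** 2, abs(self.beta) ** 2

    def measure(self):
        """Measurement collapses the superposition to a definite 0 or 1."""
        p0, _ = self.probabilities()
        outcome = 0 if random.random() < p0 else 1
        # After measurement the qubit is no longer "both at once".
        self.alpha, self.beta = (1, 0) if outcome == 0 else (0, 1)
        return outcome

# A classical bit is always 0 or 1; this qubit is an equal blend of both
# until it is measured.
q = Qubit(1, 1)
print([round(p, 3) for p in q.probabilities()])  # [0.5, 0.5]
```

Note that this classical simulation only mimics the bookkeeping; it cannot reproduce the speedups, which come from interference between many amplitudes at once.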
Since qubits can be in both states simultaneously, a quantum computer is far ahead of its traditional classical counterparts in terms of speed for certain problems, outpacing even the fastest supercomputers we have today. In a quantum computer, electron spin can be used to represent the state of a qubit, and control devices such as quantum dots, ion traps, and optical traps use electric fields to change the electron spin and hence control qubit states. These control devices are made of semiconducting materials. Newer techniques for creating qubits using superconducting materials are being discussed and researched by scientists and engineers. More on superposition and quantum entanglement Quantum superposition says that, much like waves, two or more quantum states (e.g. electron spins) can be superposed, resulting in another valid state that is a mix of the two – loosely, a blend of 0 and 1 in simple binary terms. Thus a qubit can effectively hold 0, 1, or a mixture of both. The more qubits there are, the more the possible combinations of states grow, and hence the more data can be stored and processed simultaneously. However, the superposed state remains valid only until we observe it. The moment we measure the quantum particles, their wave function collapses and we always find them in either the 1 or 0 state. This has been verified experimentally in the double-slit experiment. Entanglement, in simple terms, is a phenomenon in which the quantum states of two or more objects or particles have to be described with respect to each other, irrespective of the spatial distance between them. These states are intertwined: for example, if two particles are entangled and one spins down, the other should spin up, and vice versa. Due to this principle, the measurement of the state of one particle determines the state of the other. 
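The correlation just described — measure one particle and the partner's outcome is fixed — can be mimicked classically for intuition. This sketch is an illustrative toy, not real quantum dynamics: it samples joint measurements of the Bell state (|00⟩ + |11⟩)/√2, whose two outcomes are individually random but always agree.

```python
import random

def measure_bell_pair():
    """Sample one joint measurement of the entangled state (|00> + |11>)/sqrt(2).

    The joint state has only two possible outcomes, 00 or 11, each with
    probability 1/2 — so each qubit alone looks random, yet the pair is
    perfectly correlated no matter how far apart the particles are.
    """
    shared = random.choice([0, 1])
    return shared, shared  # measuring one fixes the other

results = [measure_bell_pair() for _ in range(1000)]
assert all(a == b for a, b in results)  # outcomes always agree
print("both-zero fraction:", sum(1 for a, _ in results if a == 0) / 1000)
```

A caveat on this toy: a shared classical value can mimic this simple correlation, but it cannot reproduce the full quantum statistics seen when entangled particles are measured along different axes, which is what Bell-test experiments exploit.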
This looks like an instantaneous information-sharing mechanism, which someone might perceive as faster-than-light transfer of information between quantum particles; however, this is not the case, as no usable information can be transmitted this way. Quantum entanglement has applications in the emerging technologies of quantum computing and quantum cryptography, and has been used to realize quantum teleportation experimentally, as shown by Chinese scientists last year. Limitations of Today’s Quantum Computers 1) While quantum computing is advancing, we still have not been able to build a full-fledged general-purpose quantum computer. Today’s quantum computers are geared toward solving specific complex computing problems, much as an ASIC chip is, and they can be far slower on other computing operations like gaming or video streaming. 2) They are expensive, and noise in the qubits makes them error-prone by destroying the data stored in the qubits. 3) They are not energy efficient and thus consume a lot of power, and they are also very expensive. 4) They must be maintained at cryogenic temperatures, which again entails huge maintenance costs. Threats posed by quantum computers to cyber security and cryptography If a quantum computer with a sufficient number of qubits can run without succumbing to noise, it can use Shor’s algorithm to break public-key encryption schemes such as RSA and DSS. This calls for more advanced cryptographic techniques and algorithms. Thus, if quantum computers go mainstream, fields like cryptography will need a drastic shift and advancement to remain resistant to quantum attacks. IBM Q Experience IBM has exposed a quantum computer on the cloud as the IBM Q Experience. It’s a good platform for quantum computing enthusiasts to create and run programs on a quantum computer, such as Shor’s algorithm, which runs in polynomial time on a quantum computer but has no known polynomial-time classical counterpart.
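The reason Shor's algorithm threatens RSA is that factoring reduces to order finding: find the period r of a^x mod N, and factors of N usually follow from gcd(a^(r/2) ± 1, N). A quantum computer finds r in polynomial time; the classical brute-force search below (a toy sketch for tiny N, nowhere near cryptographic sizes) shows the reduction itself.

```python
from math import gcd

def find_order(a, n):
    """Brute-force the smallest r > 0 with a**r % n == 1. The quantum speedup
    in Shor's algorithm replaces exactly this step."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical skeleton of Shor's reduction from factoring to order finding."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g  # lucky guess: a already shares a factor with n
    r = find_order(a, n)
    if r % 2 == 1:
        return None  # odd order: pick another a and retry
    f = gcd(pow(a, r // 2) - 1, n)
    if f in (1, n):
        return None  # trivial factor: retry with another a
    return f, n // f

print(shor_classical(15, 7))  # (3, 5)
```

For a 2048-bit RSA modulus, `find_order` would take longer than the age of the universe, which is why RSA is safe today and why a large, noise-tolerant quantum computer would change the picture.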
Understanding magnetism at its most fundamental level is vital for developing more powerful electronics, but materials with more complex magnetic structures require more sophisticated tools to study them – and one of the most powerful tools is neutron scattering. Two of the world’s most powerful neutron scattering sources, at the US Department of Energy’s (DOE) Oak Ridge National Laboratory (ORNL), are being upgraded. Adding an advanced capability called spherical neutron polarimetry will allow researchers using ORNL’s High Flux Isotope Reactor (HFIR) and Spallation Neutron Source (SNS) to measure materials with exotic magnetic structures and quantum states that were previously inaccessible in the United States. “Neutrons are ideal for studying magnetic phenomena,” said ORNL researcher Nicholas Silva. “They are electrically neutral, or chargeless, and have magnetic moments that make them look like tiny magnets.” When neutrons pass through a material and scatter off the magnetic fields created by the atoms of the material, they draw an atomic portrait, or even a three-dimensional model, of the material’s atomic arrangement and show how the atoms behave in the system. Neutrons have a “spin,” or orientation, similar to the north and south poles of fridge magnets. In a typical neutron beam, the spins of the neutrons are arranged randomly. Measuring some highly dynamic or complex magnetic systems, however, requires more uniformity, which is provided by a polarized neutron beam, in which each neutron spin is aligned in parallel with the same orientation. “Neutron polarization filters allow us to screen out what we don’t want to see, which could otherwise spoil the signal of interest,” said ORNL scientist Barry Wynn. “Just like polarized lenses allow anglers to see fish swimming below that would otherwise be hidden by reflections off the water.” Neutrons change their spins in a predictable way when they scatter. 
Using a polarized beam allows researchers to better understand what is happening in a material by setting the neutron spin before the beam hits the sample and measuring it afterward. For example, a neutron spin can be flipped in the opposite direction during scattering. “In the USA, most of the measurements we have done with polarized neutrons so far have been based on whether the spin of a neutron scattered by the material or its magnetic field rotates 180 degrees or maintains its orientation. We call it spin-flip versus non-spin-flip,” said Wynn. “But there is a problem with that. If we get any scattering from the sample other than a spin flip or a non-spin-flip – something other than 0 or 180 degrees – the strategy blows up in our face.” This strategy works well for conventional magnetic materials, such as ferromagnets and antiferromagnets, in which all magnetic atoms point either in the same direction or in alternating directions, remaining parallel or antiparallel to their neighbors. However, the strategy does not work for more complex magnetic structures. For example, the technique is limited when it comes to studying exotic spin textures such as skyrmions, quasiparticles that exhibit chiral motion, like entangled vortices or whirlpools of asymmetric lines of force. Such structures hold exciting potential for materials used in modern data storage and quantum computing applications. To solve this problem, polarization scientist Peter Jiang leads an ORNL team, including Wynn and Silva, in a research project aimed at developing spherical neutron polarimetry for several ORNL beamlines. The technology will allow neutron measurements of materials that do not fit the traditional spin-flip and non-spin-flip categories – in other words, it will let researchers see the dynamic magnetic behavior that exists in between. “Traditional methods are not sophisticated enough to study some complex magnetic systems,” Jiang said. 
“Now we are no longer limited to spin flips. This allows us to look at magnetic systems that we could not understand before.” Spherical neutron polarimetry has been used in Europe, and now Jiang and the ORNL team are adapting the technology to SNS and HFIR instruments. They are building on ongoing research by Tianhao Wang, first as a graduate student at Indiana University, Bloomington, and then as a postdoctoral researcher on the ORNL team. The basic technology includes additional polarizing optics installed on both the incoming (incident) beam that hits the sample and the outgoing (scattered) beam, which allows measurements of scattered neutrons oriented in any direction. ORNL’s technology builds on previous prototype designs and will offer several innovations. In ORNL’s spherical neutron polarimetry devices, the scattered beam path does not have to coincide with the incident beam; instead, it can be angled around the sample. “This means that if the neutron does not experience a complete flip, we can adjust the field at the other end or move the apparatus to detect neutron scattering in different directions,” Silva explained. The team also developed two independent cooling systems to let researchers study how magnetic structures change with temperature. The first system cools the two spherical neutron polarimetry components located on either side of the sample to make them superconducting. The second system adds a liquid-helium cryostat that allows researchers to study materials across a temperature range without affecting the temperatures needed for superconductivity in the first system. Finally, the spherical neutron polarimetry devices are made of more efficient materials. 
While niobium was used for the superconducting sheets in previous designs, the new design uses yttrium-barium-copper oxide (YBCO), which superconducts at a temperature of 93 kelvin (-292°F), significantly higher than its niobium predecessor. In addition, the superconducting films are bonded to mu-metal yokes, which together shield out stray magnetic fields and establish a zero field around the sample, so the spins of the materials can be studied in their natural state. “To achieve superconductivity, a significant amount of cooling power is required. To maintain superconductivity, niobium must be cooled below 10 K, so European designs required extensive cooling systems, which often had to be filled with liquid helium manually,” Jiang said. “With YBCO high-temperature films, we can use a single-stage closed-cycle refrigerator to cool the film to a temperature well below its critical temperature, so we don’t worry about any loss of superconductivity. And with an autofill system for the liquid-helium cryostat and a closed-loop cooling system, the device will be easier to use and more efficient.” Moreover, the system is compact compared to previous designs – the high-temperature superconductors, which eliminate the need for a large cooling system, make it mobile. “If anything, there is evidence of how portable the device is. We moved it to the nuclear reactor at the University of Missouri, then back to HFIR, and from HFIR to SNS,” said Silva. “I assembled it and disassembled it several times, and each time I found simpler ways to connect the parts – small quality-of-life changes that increase its usefulness.” The system was successfully tested, with full polarization measurements of several known materials, including silicon, manganese oxide, and bismuth iron oxide. 
The team plans to implement the system on the polarized triple-axis spectrometer (PTAX) at HFIR and on the GP-SANS diffractometer, where it will be optimized for the reactor’s steady-state neutron beam, with full functionality expected by the end of 2020. Subsequently, the team will develop a similar spherical neutron polarimetry device exclusively for the HYSPEC instrument at SNS, making it the only instrument in the world to combine a supermirror array with wide-angle capability. The device will also benefit from the unique capabilities provided by the SNS pulsed accelerator source. “In the meantime,” said Wynn, “we will have a workhorse at PTAX that will knock our socks off.” Oak Ridge National Laboratory. “ORNL neutrons add enhanced polarization capabilities for measuring magnetic materials” (2020, March 16), retrieved March 16, 2020. This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
September 30, 2019 feature A ten-qubit solid-state spin register with remarkable quantum memory In years to come, quantum computers and quantum networks might be able to tackle tasks that are inaccessible to traditional computer systems. For instance, they could be used to simulate complex matter or enable fundamentally secure communications. The elementary building blocks of quantum information systems are known as qubits. For quantum technology to become a tangible reality, researchers will need to identify strategies to control many qubits with very high precision. Spins of individual particles in solids, such as electrons and nuclei, have recently shown great promise for the development of quantum networks. While some researchers have demonstrated elementary control of these qubits, so far no one has reported entangled quantum states containing more than three spins. In order to reach the computational power necessary to complete complex tasks, quantum registers will need to be significantly larger than those realized so far. However, controlling individual spins within complex and strongly interacting quantum systems has so far proved very challenging. Recently, a team of researchers at TU Delft and Element Six successfully demonstrated a fully controllable ten-qubit spin register with a quantum memory of up to one minute. Their findings, presented in a paper published in Physical Review X, could pave the way for larger yet still controllable quantum registers, ultimately opening up exciting new possibilities for quantum computing. "The main objective of our study was to realize a precisely controlled system of a large number of qubits using the spins of atoms embedded in a diamond," Tim Taminiau, one of the researchers who carried out the study, told Phys.org via email. "These spins are promising quantum bits for quantum computation and quantum networks, but previous results were limited to just a few qubits. 
The key open challenge is that on the one hand, all the spins in the system must be coupled together to function as a single quantum processor, but on the other hand, this makes it difficult to selectively control them with high precision." Taminiau and his colleagues developed an entirely new method to control multiple qubits. This technique uses an electron spin qubit to selectively control many individual nuclear spin qubits, while simultaneously decoupling them and thus protecting them from unwanted interactions in the system. Using their method, the researchers were able to control a considerably larger number of spins than previous studies, with remarkably high precision. They applied their technique to a system composed of 10 spins associated with a nitrogen-vacancy (NV) center in diamond. The NV center has an electron spin providing a qubit that can be optically read out (i.e., its value can be determined) and controlled with microwave pulses. "This electron spin couples to nuclear spins in the environment," Conor Bradley, a Ph.D. student and lead author of the study, explained. "One such nuclear spin is the intrinsic nitrogen nuclear spin of the NV. The additional 8 qubits are carbon-13 nuclear spins surrounding the NV. Naturally, about 1.1 percent of the carbon atoms in diamond are carbon-13 and have a spin, i.e. they can be used as qubits; the other carbon atoms are carbon-12 and carry no spin." Although the researchers applied their method to a specific 10-qubit system, they believe it could also be implemented in other systems, including other defect centers in diamond and silicon carbide, quantum dots, and donors in silicon. The qubits hosted by these other systems each have their own strengths for completing a variety of complex tasks. "The main achievement of our study is a 10-spin-qubit quantum system that can store quantum information for very long times, up to 75 seconds," Taminiau said. 
"Although other researchers were able to attain similar results with ions trapped in vacuum, this combination of many qubits, precise control and long-lived quantum memory is unique for chip-based quantum bits." The system demonstrated by Taminiau and his colleagues could be a key building block for large quantum networks in which multiple NV centers, each providing several qubits, are connected together optically. This particular capability was already outlined and shown by the researchers in a previous study. "Besides the importance of this study as a demonstration towards larger quantum information systems, this work also provides new insights into the decoherence—the loss of quantum information—for spins in solids," Taminiau said. The findings gathered by this team of researchers highlight the feasibility of studying how entangled states of multiple spin qubits decohere, as well as how correlations in the noise environment can play a vital role in this process. The method they developed also opens up new possibilities for quantum sensing and atomic-scale imaging of individual spins, where the goal is not to control spins but rather to detect them, in order to gather insight into interesting samples for studies in chemistry, biology and material science. In their future research, Taminiau and his colleagues plan to demonstrate a technique called quantum error correction. This particular type of error correction could help to overcome all of the inevitable imperfections of existing quantum systems, ultimately enabling the creation of large-scale quantum systems. "This will require encoding quantum states over many qubits and performing careful measurements to detect and correct errors without disturbing the encoded information," Taminiau added. "This has so far been out of reach for any system, but our results now make it possible to pursue this using spins in diamond." David D. Awschalom et al. 
Quantum technologies with optically interfaced solid-state spins, Nature Photonics (2018). DOI: 10.1038/s41566-018-0232-2 J. Cramer et al. Repeated quantum error correction on a continuously encoded qubit by real-time feedback, Nature Communications (2016). DOI: 10.1038/ncomms11526 G. Waldherr et al. Quantum error correction in a solid-state hybrid spin register, Nature (2014). DOI: 10.1038/nature12919 B. Hensen et al. Loophole-free Bell inequality violation using electron spins separated by 1.3 kilometres, Nature (2015). DOI: 10.1038/nature15759 © 2019 Science X Network
The story of winged human flight begins without a tail--the Wright Brothers' first successful glider didn't have one. Soon, biplanes ushered in the now-standard tube-and-wing design for aircraft, but experiments with blended wings never really stopped. The planes are potentially more aerodynamic and consume less fuel, though they are harder to maneuver. Researchers hope that computerized, fly-by-wire systems will soon overcome the control challenges and spawn an era of fuel-efficient heavy lifters. One proposed design is the SAX-40, an airliner that could trim fuel use by more than 20 percent and fly quietly enough to take off and land during late-night hours that are currently restricted. According to Jim Hileman, a researcher at MIT and chief engineer on the project, expanding the hours of operation for airports could reduce air traffic congestion--and the fuel wasted by circling planes--while avoiding legal battles over new runway construction. The plane is just a thought experiment for now, created by Hileman and his colleagues in the Silent Aircraft Initiative, a U.K.-funded collaboration between Cambridge University and MIT. But the design's enthusiasts are encouraged by successful, ongoing tests of the X-48B, a blended-wing prototype built by Boeing in cooperation with NASA and the Air Force. The company is focusing purely on military applications, but Hileman points out that wresting civilian benefits from defense research is a grand old aviation tradition--the 707, Boeing's first commercial passenger jet, had a military lineage. "Maybe the U.S. Air Force will build a better tanker or bomber, which leads to a blended-wing airliner." Ultimately, Hileman says, aeronautical engineers will have to step up their game. "We haven't reached full maturity with our designs," he says. "We can still make a real impact on fuel use and aircraft noise." 
Like so many of the best things in life, the inspiration for cloaking technology comes from the Klingons, who used it on their starships. Researchers have had some success "cloaking" an object by redirecting light around it to render it invisible. But the principle might work even better to shield buildings from earthquake damage. The structures would incorporate "metamaterials" patterned with tiny circles whose size is proportional to the wavelength of seismic disturbances. The waves would travel along the material, missing the structure. Real-World Potential: The theory seems sound, but years of experimentation lie ahead. And engineers would then need to devise ways to build the technology into new buildings. (Retrofits would likely remain impossible.) Oil companies employ drones called "pigs" to crawl through pipelines, spotting corrosion. Fancier pipe bots are in development, destined for heroic feats such as shimmying through shattered plumbing to find earthquake survivors. But the most useful job for such robots could be patrolling thousands of miles of leaking municipal water lines. One design group took the inspiration for its bloblike robot from amoebas, but most of the new bots resemble snakes. A Canadian robot called Regina Pipe Crawler (RPC) is nearing commercialization. Controlled remotely, RPC can inspect a bending 6-inch-diameter pipe while the water is flowing at full strength. With enough pipe bots on the job, engineers could stop wasteful leaks and prevent catastrophic failures. Star Trek-style teleporters will never, ever be invented. And that's okay--after all, who would agree to be obliterated and then reconstituted by a guy named Scotty, trusting that no atom or eyeball was out of place? But scientists at the University of Maryland have teleported data, swapping the quantum states of two atoms positioned a meter apart. 
It was a step toward the creation of quantum computers, which could perform many simultaneous operations, crunching data exponentially faster than today's systems. Real-World Potential: On a rudimentary level, the technology works now, but practical (let alone world-changing) quantum computing is decades in the future. This past March, a 154-foot-wide asteroid came within 48,800 miles of Earth, just twice the altitude of some satellites. It was big enough to destroy a city. No one saw it coming. Not that it would have helped: There's no procedure in place for deflecting space rocks, just a list of concepts. But two astronauts--Rusty Schweickart, chairman of the Association of Space Explorers-Near Earth Object Committee, and Thomas D. Jones, a PM contributing editor--have a plan. (1) Build more asteroid-hunting telescopes. Projects that need more funding include a Canadian space-based telescope and a series of ground-based systems in Hawaii. (2) Assign asteroid-deflection authority to an international committee. With no one in charge, individual nations might launch Pyrrhic schemes: "In the past, there was a Russian proposal to have a 50-megaton nuclear missile in the silo, ready to launch at any asteroid that shows up," Jones says. Bad idea--it could create a lethal storm of fragments. (3) Run a rehearsal mission. One plan is to park an unmanned spacecraft alongside the offending asteroid, while kinetic impactors--guided missiles minus the warheads--slam into it. The first craft helps the impactors target the object and acts as a gravity tractor, using its mass to nudge the rock off course. If something really huge heads our way, we could always resort to that 50-megaton nuke tactic--with luck the bomb would ignite gases in the asteroid that would spew outwards and nudge the rock from its apocalyptic path.
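The gravity-tractor idea in step (3) can be sanity-checked with Newton's law of gravitation: the spacecraft's gravity pulls the asteroid with acceleration a = G·m/d², and over years of station-keeping that tiny tug integrates into a course change. The numbers below are illustrative assumptions (a 20-tonne craft hovering 200 m from the asteroid's center), not figures from any proposed mission.

```python
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
m_craft = 20_000.0     # spacecraft mass in kg (assumed)
d = 200.0              # hover distance from the asteroid's center in m (assumed)
years = 10.0           # how long the craft stays on station (assumed)

# Acceleration the craft's gravity imparts on the asteroid, independent of
# the asteroid's own mass.
accel = G * m_craft / d ** 2

# Velocity change accumulated over a decade of continuous towing.
dv = accel * years * 365.25 * 86400

print(f"acceleration: {accel:.3e} m/s^2")
print(f"delta-v after {years:.0f} years: {dv * 1000:.2f} mm/s")
```

The result is on the order of ten millimeters per second, which sounds hopeless until you multiply by the years of flight time remaining before the predicted impact: a few mm/s applied a decade in advance shifts the arrival point by thousands of kilometers, enough to turn a hit into a miss.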
If you have ever applied for a job before, you’ve likely encountered this requirement: critical thinking skills. Throughout our day-to-day lives, we are constantly presented with choices that we need to make. Should I hit the snooze button? Should I wear a tie or not? Should I ask for a raise at work? All these choices make us stop for a moment to evaluate our options. If I hit the snooze button, then I’ll get more sleep but might be late for work. If I don’t hit the snooze button I might be tired at work, but at least I’ll show up on time. This deconstruction of weighing the pros and cons is what critical thinking is all about. According to the University of Toronto, critical thinking is the practice of using a number of different advanced thinking techniques in a variety of complex ways. Obviously, this can sound like a fairly vague definition. In its most basic sense, critical thinking involves gathering massive amounts of information, systematically analyzing that information, and making rational and informed conclusions. To go into more detail, we can split critical thinking into three general components:
- it focuses on how things are proven or presented,
- it involves reflection on our decisions and the process,
- and it is discipline specific.
How is critical thinking different from regular thinking? To examine the difference between these two thinking techniques, we need to look at three things:
- what we are focusing on,
- how we do it,
- and what the ultimate goal is.
With regular thinking, we focus on the facts at hand. For example: it’s 7:30 am, I’m going to be late for work. Next, we attempt to construct relationships between different ideas and develop inferences based on those relationships. Finally, we form a plan of action for what we are thinking about. When it comes to critical thinking skills, the main idea is that the regular thinking process is undertaken in much more detail. 
We focus on different points of view or opinions and the merits of each. Next, we examine the relationships in depth. We must evaluate not only other people’s methods of thinking, but also our own. Finally, we use the material we have assessed to make an informed decision about what we have been thinking about, and how we thought about it. In a sense, we are thinking about thinking. Simple enough, right? Well, without further ado, here are 10 sure-fire ways to improve your critical thinking skills.

1. Know what question you want to ask
Before thinking about any idea critically, you want to know what question you are trying to ask. You must approach the question with an open mind and understand the reason why you want this particular problem solved. To improve your critical thinking skills, you must examine the question from a logical standpoint, not an emotional one.

2. Be self-aware
One of the most important characteristics of people who think critically is that they are self-aware. They know that they aren’t always right. Critical thinkers are open to the views and opinions of others and will give their input the same weight as their own.

3. Act with integrity
Again, we are trying to improve our thinking skills, not our ability to always be right. To be a productive thinker, one must act honestly and with integrity. It’s only by acting with integrity that we can eventually come to a rational and logical conclusion.

4. Ask simple questions
Going back to tip #1, the question you want to ask doesn’t need to be profoundly difficult. Does every earthly problem require a drawn-out and elaborate thinking process? Sometimes when we overthink things, the original question gets lost in the quagmire. To combat this, break the overall question into smaller ones: What do I currently know about the problem? How did I come to know this information? What am I trying to solve?

5. Don’t assume things
Assuming makes an *** out of you and me.
You know the old saying. Even if something is universally assumed, you should question it. Way back in the day, people assumed the Earth was flat. But because critical thinkers don’t assume things, they analyzed the data and came to know that the Earth is a sphere.

6. Swap relationships
For example, let’s say that excessive video game use causes us to smoke. Instead of looking at relationships from one point of view, try swapping them: does smoking cause excessive video game use? Although this example is merely hypothetical, switching the variables in a relationship allows us to deconstruct it and make more informed decisions.

7. Gather the information you’re presented with and evaluate it without bias
Tip #2 tells us that to be a critical thinker we must be self-aware, and aware that other people’s opinions are just as important as our own. Therefore, we need to take the information they present to us and evaluate it in the same way that we evaluate our own. For example, if someone told you about the relationship between video games and smoking, you should ask yourself how they got this information and why. This is the main concept behind the media reporting on a new scientific study. Every day the media tells us that some new study shows how X causes Y. But, as all scientists know, correlation does not prove causation. We need to examine who conducted the study, how they conducted it, and why they conducted it.

8. Don’t rely solely on others
Although critical thinking requires intense levels of research and analysis, don’t sell yourself short. Even if you are not an expert in the question you want answered, you should never discount your own views and ideas. Sure, you might not be an expert on quantum entanglement, but always include your own thoughts (however limited they may be) in the thinking process.

9. Combine all the information gathered from tips #1-#8
You’ve been open-minded, you sought others’ advice, you were unbiased, and you didn’t make assumptions. Now you need to combine all of this information to make a conclusion. You have all your deconstructed ideas and opinions and now need to weigh the implications of each decision. In other words, you’re examining the pros and cons of one decision versus the other. You’ve done your research on quantum entanglement, so now it’s time to decide whether you are for it or against it. Weigh the pros and the cons, examine the implications of your choice, and arrive at a logical conclusion.

10. Don’t try to think critically all the time
Critical thinking involves massive amounts of research, information processing, and analysis. Obviously, you can’t think this way all the time. You would never get anything done! Should you hit the snooze button? “Well, let’s examine my own rationale and the views of my co-workers, and then conduct extensive literature research on the relationship between sleep and work productivity.” By the time you had thought about this decision critically, you would already have missed a full day of work and the point would be moot. Save your critical thinking skills for the important decisions in life, like that honors thesis or your investment strategy.

There you have it: 10 sure-fire ways to improve your critical thinking skills. When it comes to improving thinking skills, the jargon can get fairly wordy and complicated. If this all seems confusing, the best course of action would be to think critically about critical thinking! Okay, maybe that didn’t lessen the confusion. Regardless, if you want to make informed and sound decisions in life, critical thinking is your friend. It is in your best interests to learn these tips, apply them, and get thinking about thinking!
Source: https://www.sciencelass.com/mind-and-brain/10-sure-fire-ways-to-improve-your-critical-thinking-skills/
Break RSA encryption with this one weird trick
Cryptographers HATE it!

Too much math; didn’t read — Shor’s algorithm doesn’t brute force the entire key by trying factors until it finds one, but instead uses the quantum computer to find the period of a function which contains the RSA key, and classically computes the greatest common divisor.

RSA encryption is strong because factoring is a one-way problem. It’s very easy to multiply two primes together, but very difficult to find the prime factors of a large number. That’s what the technology relies on. And the simplicity of RSA encryption made it very popular. However, one technology can render RSA useless. (Hint: it’s a quantum computer.) Shor’s algorithm can crack RSA. But how does it really work? It’s not about trying all prime factor possibilities simultaneously. In (relatively) simple language: we can crack RSA if we have a fast way of finding the period of the known periodic function f(x) = m^x (mod N).

Five Steps of Shor
So how does Shor’s algorithm work? Of the five steps of Shor’s algorithm, only ONE requires the use of a quantum computer. The other steps can be solved classically.

Step 1: use the classical greatest common divisor (gcd) algorithm on N and m, where N is the number you are trying to factor, and m is a random positive integer less than N. If gcd(m, N) = 1, continue. If not, you’ve found a non-trivial factor and are done.

Step 2: find the period P of the sequence m mod N, m^2 mod N, m^3 mod N, ... This is the quantum step.

Step 3: if the period P is odd, go back to Step 1 and choose another random integer. Otherwise, continue.

Step 4: check that m^(P/2) ≠ -1 (mod N). If that is true, go to Step 5. Otherwise, go back to Step 1.

Step 5: solve gcd(m^(P/2) + 1, N) (or gcd(m^(P/2) - 1, N)). The answer is a non-trivial prime factor of N, and you now have the key to break RSA.

How does Step 2 work?
But how does a quantum computer find the period of the function, as in Step 2? And why is this important?
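As an aside, the classical scaffolding around the single quantum step can be sketched in a few lines of ordinary Python. This is illustrative only: the quantum Step 2 is replaced here by a brute-force classical period search (the helper name is my own invention), so it works only for toy numbers.

```python
# Sketch of Shor's five steps. The quantum subroutine (Step 2) is
# replaced by a brute-force classical period search, so this only
# scales to tiny N -- it shows the classical scaffolding, not a speedup.
from math import gcd
from random import randrange

def find_period_classically(m, N):
    """Stand-in for the quantum step: smallest P with m^P = 1 (mod N)."""
    x, P = m % N, 1
    while x != 1:
        x = (x * m) % N
        P += 1
    return P

def shor(N, max_tries=50):
    for _ in range(max_tries):
        m = randrange(2, N)                    # Step 1: random m < N
        if gcd(m, N) != 1:
            return gcd(m, N)                   # lucky: already a factor
        P = find_period_classically(m, N)      # Step 2: (quantum) period
        if P % 2 == 1:
            continue                           # Step 3: odd period, retry
        if pow(m, P // 2, N) == N - 1:
            continue                           # Step 4: m^(P/2) = -1 (mod N), retry
        return gcd(pow(m, P // 2, N) + 1, N)   # Step 5: non-trivial factor
    return None                                # (vanishingly unlikely for small N)

print(shor(15))   # -> 3 or 5
```

Running it on N = 15 returns 3 or 5; on real RSA moduli the classical period search is hopeless, which is exactly the gap the quantum computer fills.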
We are looking for the period P of the sequence m mod N, m^2 mod N, m^3 mod N, ... (While this is an exponential function, a complex exponential can be rewritten in terms of sin and cos via Euler’s formula, which exposes the periodicity.) This period-finding step relies on quantum superposition. With a quantum computer and its ability to be in a superposition of states, we can find the period of the function. To do so, we:
1. Apply the Hadamard gate to create a quantum superposition.
2. Implement the function as a quantum transform.
3. Perform the quantum Fourier transform.
Like its classical analog, after these transformations a measurement will yield an approximation to the period of the function (you can read the ‘peak’, as in the classical Fourier transform, with high probability). Using the quantum Fourier transform, we can solve the order-finding problem and the factoring problem, which are equivalent. The quantum Fourier transform allows a quantum computer to perform phase estimation (the approximation of the eigenvalues of a unitary operator). As you exit the quantum portion (Step 2), you check the period for validity and use another classical greatest common divisor computation to get a prime factor of the key. Interestingly enough, since the technique is not about trying all the potential prime factors, just the potential periods, you do not have to try many random numbers to successfully find a prime factor of N. The probability that P is odd, forcing a return to Step 1, is at most 1/2^(k-1), where k is the number of distinct prime factors of N. So even if you double the key length, finding the factors does not become intractable. RSA is not secure against a quantum adversary, and doubling the key size will not achieve safety. The RSA-2048 Challenge Problem would take 1 billion years with a classical computer. A quantum computer could do it in 100 seconds –Dr.
Krysta Svore, Microsoft Research

The quantum Fourier transform is applied through a quantum circuit built out of just 1-qubit and 2-qubit gates, making the circuit for Shor’s algorithm comparatively simple to express on a quantum computer. The quantum Fourier transform is the key to many of these quantum algorithms. It doesn’t speed up finding classical Fourier transforms, but it can perform a Fourier transform on quantum amplitudes, and it does so exponentially faster than the classical computation. Though there are subtleties beyond directly mapping classical Fourier transform problems, a quantum computer can also, for example, solve the hidden subgroup problem, which covers the discrete logarithm problem, or count solutions, which cracks other forms of modern cryptography. More importantly, the quantum Fourier transform can be applied to machine learning, chemistry, materials science, and, obviously, simulating quantum systems.

At the core of Shor’s factoring algorithm is order finding, which can be reduced to the Abelian hidden subgroup problem, which is solved using the quantum Fourier transform. — NIST Quantum Zoo

Just one of the steps of Shor’s algorithm needs to be implemented on a quantum computer, while the rest can be done on a classical supercomputer. The quantum subroutine will be performed and fed back to the supercomputer to continue the calculation. A quantum computer will likely never be a standalone system, but together with a supercomputer, the time to break an RSA key will be quite reasonable. A lot of mathematical details have been glossed over, as well as the proofs of these steps, as they are beyond the scope of this article. If you’re curious about the mathematical explanations, with intense linear algebra, group theory, and higher-level mathematics, check out this source:

NIST Quantum Zoo — http://math.nist.gov/quantum/zoo/ — a list of all the quantum algorithms
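The ‘peak’ reading described above has a direct classical analogue: the sequence m^x mod N exposes its period as a sharp peak in an ordinary Fourier spectrum. A NumPy sketch (the sample count and variable names are my own choices):

```python
# Classical illustration of what the quantum Fourier transform exposes:
# f(x) = m^x mod N is periodic, and its Fourier spectrum has sharp
# peaks at multiples of (number of samples) / P.
import numpy as np

m, N, samples = 2, 15, 256
f = np.array([pow(m, x, N) for x in range(samples)], dtype=float)

spectrum = np.abs(np.fft.fft(f - f.mean()))    # remove DC so the peak stands out
peak = int(np.argmax(spectrum[1:samples // 2])) + 1

P = round(samples / peak)   # the dominant peak sits at samples / P
print(P)                    # -> 4 (indeed 2^4 = 16 = 1 mod 15)
```

The quantum advantage is that the QFT finds this peak over an exponentially long implicit sequence, which no classical FFT over explicit samples can do.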
Source: https://www.amarchenkova.com/2015/08/13/break-rsa-encryption-with-this-one-weird-trick/
Generation Z (or the iGeneration), who are in school and university today, have never witnessed the introduction of the Internet, smart phones and tablets, video games, on-demand TV and film, and social media. For them, this is all very much the norm and part of their everyday lives. The skills they are developing interacting with technology every day, will be essential as they step out into an adult world of emerging technology breakthroughs in AI, robotics, quantum computing, biotechnology and so on. Education has been late off the technology starting-blocks, but is now learning how to harness technology to support learning and teaching. Teachers are learning from each other how to successfully use technology as a teaching and learning tool, by sharing knowledge and outcomes with each other, both on a micro-level within their own schools and local communities, and on a macro-level through the Internet. As well as supporting teaching and learning, most educators across the globe will agree that assessment is another key aspect of schooling that technology has a phenomenal potential to enhance. The world is changing and changing fast, and the challenge for educators across the globe is to ensure student assessment remains fit for purpose and relevant in the digital age. To succeed in an increasingly connected world, students need the right attitudes, adaptable skills and cultural sensitivity – and assessment needs to be able to evaluate and help foster these skills in learners. By harnessing the digital tools around us, educators can better assess students’ abilities; identify their strongest and weakest skills; and encourage them to demonstrate their whole skillset – all the while moving away from traditional, memory-based, and often stressful, examination methods. Combining traditional academic merit and contemporary digital innovation, in 2016 the International Baccalaureate (IB) introduced eAssessment to its Middle Years Programme (MYP). 
MYP eAssessment is designed to focus on scenarios in which students must use knowledge and skills to analyse unfamiliar situations, thus challenging them to connect what they have learned with what they might learn next, and apply big ideas to solve real-world problems. This has traditionally been harder to achieve using paper-based examinations but, with technology, more is possible. Different types of tasks are used within the on-screen examinations to test specific skills, meaning that students’ achievement against all subject objectives is thoroughly tested. For example, writing a short static essay assesses writing capability, whilst creating an infographic assesses interactive communication and presentation skills. With the use of images, videos, animations and models, and through interactive tools, candidates can create, manipulate and make decisions about how to manage data. On-screen tools can help students who aren’t working in their first language too, and built-in adaptive technologies can ensure that the eAssessment is open to students with access and inclusion needs, providing all participants with the best opportunity possible to demonstrate their knowledge, skills and abilities. Commenting on the MYP eAssessment, IB Co-ordinator Jaya Kalsy from Victorious Kidss Educares in India, said: “Students are encouraged to utilize online learning tools, as well as exploring digital methods to present their work. Digital methods have become the window to the world. There is a sense of encouragement and advancement in the process of schooling as well as the assessment process.” Increasingly, it is recognised that technology can add real value for students, and MYP schools offering the eAssessment say that it has enabled them to participate in exciting innovation in education.
Feedback from schools, via an IB survey conducted with educators all over the globe, illustrates the natural connection between eAssessment and what is being taught and learnt in the classroom. Schools have said that, through digital assessment, they are able to assess skills, concepts and thinking, in context, rather than knowledge recall. From our research, it is rewarding to see that schools understand that eAssessment supports conceptual teaching and learning, and is not something that can be crammed for – only good MYP practice supports good preparation. Unlike other education environments, in the MYP educators do not teach to test, but their teaching does align with assessment requirements and this brings enrichment and focus to their students’ learning. MYP eAssessment is still relatively new but the results of the first four years clearly demonstrate an encouraging upward trend, which shows that teachers are able to connect what’s happening in their classrooms to what’s happening in the on-screen assessments, examining students’ higher thinking skills and pushing them well beyond the rote memorisation of subject-specific content. In 2018, the MYP eAssessment was successfully recognised in the ‘Best assessment solution’ category at the ScooNews Global Education Awards in Udaipur, India as well as winning the ‘Best use of summative assessment’ award in the eAssessment Awards in London. For more information about the IB and the MYP, please visit www.ibo.org. The above article is authored by Eleonore Kromhout, Senior Manager Assessment Development and Delivery, International Baccalaureate
Source: https://www.educationworld.in/introducing-eassessment-in-middle-years-programme/
Today’s computing systems, although they have improved significantly decade after decade, can only solve problems up to a certain size and complexity. More complex issues require advanced computational power, and quantum computing promises to deliver such power. Classical computers rely on individual bits to store and process information as binary 0 and 1 states. Quantum computers rely on quantum bits – qubits – to process information; in doing so, they use two key quantum mechanical properties: superposition and entanglement. Superposition is the ability of a quantum system to be in multiple states at the same time. Qubits still use the binary 0 and 1 system, but the superposition property allows them to represent a 0, a 1, or both at the same time. Instead of analysing 0s and 1s sequence by sequence, two qubits in superposition can represent four scenarios at the same time, thus reducing the time needed to process a data set. Entanglement is a strong correlation between quantum particles, allowing them to be inextricably linked in perfect unison, even if separated by great distances. When two qubits are entangled, there is a special connection between them: if the individual qubits are measured, the outcome of each measurement could be 0 or 1, but the outcome of the measurement on one qubit will always be correlated to the measurement on the other qubit. And this is always the case, even if the particles are separated from each other by a large distance. In essence, superposition allows quantum computers to solve some problems exponentially faster than classical computers, while entanglement makes quantum computers significantly more powerful. Qubits can be created through different methods, such as using superconductivity to create and maintain a quantum state. Superconductivity requires low temperatures, which is why quantum computers need to be kept cold to maintain their stability.
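Superposition and entanglement can both be seen in a tiny state-vector simulation. The NumPy sketch below builds the standard Bell state; it assumes the usual matrix forms of the Hadamard and CNOT gates and a two-qubit ordering of my choosing:

```python
# Minimal state-vector sketch of superposition and entanglement.
# Two qubits start in |00>; a Hadamard on qubit 0 creates superposition,
# and a CNOT then entangles them into the Bell state (|00> + |11>)/sqrt(2).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0.0, 0.0, 0.0])         # |00>
state = np.kron(H, I2) @ state                 # superposition on qubit 0
state = CNOT @ state                           # entangle the two qubits

probs = state ** 2                             # Born rule (amplitudes are real here)
print(probs)   # ~ [0.5, 0, 0, 0.5]: only 00 and 11 occur, perfectly correlated
```

Each qubit alone reads 0 or 1 with equal probability, yet the two readings always agree: that is exactly the correlation described above.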
One main problem with qubits is that they are very tricky to manipulate: any disturbance makes them fall out of their quantum state, or ‘decohere’. Significant research is being carried out on identifying ways to overcome this decoherence problem and make qubits co-operate. While quantum computers can work with classical algorithms, quantum algorithms are obviously more appropriate as they can solve some problems faster. One example of a quantum algorithm is Grover’s algorithm, which can search through an unstructured database or unordered list quadratically faster than any classical algorithm (on the order of √N steps instead of N). It is important to note that problems fundamentally unsolvable by classical algorithms (called undecidable class problems) cannot be solved by quantum algorithms either.

Applications of quantum computing
The unprecedented power of quantum computers makes them useful in many scenarios where classical computers would require an impractical amount of time to solve a problem. For example, they could simulate quantum systems, allowing scientists to study in detail the interactions between atoms and molecules. This, in turn, could help in the design of new materials (e.g. electronics, chemical materials) or new medicines. As they are significantly faster than classical computers, quantum computers will also be far more efficient at searching through a space of potential solutions for the best solution to a given problem. Quantum computers can thus pave the way for unparalleled innovations in medicine and healthcare, allowing for the discovery of new medications to save lives or of new AI methods to diagnose diseases. They can also support the discovery of new materials, the development of enhanced cybersecurity methods, the elaboration of much more efficient traffic control and weather forecasting systems, and more. Researchers around the world are working on and with quantum technology in various fields.
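Grover’s quadratic speedup, mentioned above, is easy to see in simulation. The sketch below is a minimal NumPy version for 8 items with an arbitrarily chosen marked index; a real device would implement the same two reflections with quantum gates rather than direct vector arithmetic:

```python
# Grover's search simulated for N = 8 items (3 qubits' worth of states).
# The oracle flips the sign of the marked item's amplitude; the diffusion
# step reflects every amplitude about the mean. About (pi/4)*sqrt(N)
# iterations concentrate the probability on the marked item.
import numpy as np

N, marked = 8, 5
state = np.full(N, 1 / np.sqrt(N))      # uniform superposition over all items

for _ in range(2):                      # floor((pi/4) * sqrt(8)) = 2 iterations
    state[marked] *= -1                 # oracle: phase flip on the marked item
    state = 2 * state.mean() - state    # diffusion: inversion about the mean

probs = state ** 2
print(int(np.argmax(probs)), round(probs[marked], 3))   # -> 5 0.945
```

Two iterations (versus an average of four classical probes out of eight) already find the marked item with probability about 94%; the gap widens as √N for larger N.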
Airbus has launched a quantum computing challenge to encourage the development of quantum solutions in aircraft climb and loading optimisation, as well as wingbox design optimisation. Daimler is working with Google on using quantum computing in the fields of materials science and quantum chemical simulation. The US Department of Energy is funding research projects that could lead to the development of very sensitive sensors (with applications in medicine, national security, and science) and provide insights into cosmic phenomena such as dark matter and black holes. Google, IBM, Intel, Microsoft, and other major tech companies are allocating significant resources to quantum computing research, in their efforts to pioneer breakthroughs in areas such as AI and machine learning, medicine, materials, chemistry, supply chains and logistics, financial services, astrophysics, and others.

Quantum communication and cryptography
Beyond powerful quantum computers, quantum technology has applications in other areas too, such as quantum cryptography and quantum communication, both of which are closely interlinked. Quantum cryptography is a method used for the secured, encrypted transfer of information. Unlike other forms of cryptography, it ensures security by the laws of physics; it does not depend on mathematical algorithms or insecure key exchanges. Quantum communication based on quantum cryptography is currently regarded as highly secure: any attempt to wiretap or intercept disturbs the quantum states and can therefore be detected. Here, the best-known application is quantum key distribution (QKD), which relies on the use of quantum mechanical effects to perform cryptographic tasks. One possible means of quantum communication is quantum teleportation. Although the name can be misleading, quantum teleportation is not a form of the transport of physical objects but a form of communication.
This teleportation is the process of transporting a qubit from one location to another without having to transport the physical particle to which that qubit is attached. Even quantum teleportation depends on a classical communication channel, so it cannot transmit information faster than light. Quantum computers already exist, but their power is still rather limited and several tech companies are continuously working on improving this power. For instance, in January 2019, IBM announced its first commercial quantum computer that can work outside the research lab, but with a power of only 20 qubits. Later on, in October 2019, the company's engineers announced the development of a 53-qubit computer. In another example, the startup Rigetti Computing developed a 32-qubit computer and is now working on a 128-qubit one too. In October 2019, Google claimed that it achieved ‘quantum supremacy’ with a 53-qubit quantum computing chip that took 200 seconds to carry out a specific calculation which would have taken a classical computer 10 000 years to complete. IBM soon challenged that claim, arguing that the problem solved by Google’s computer could also be solved in just 2.5 days through a different classical technique. We can expect these and other companies to discover further improvements in processing power, allowing quantum computers to solve problems that classical computers cannot. While this race is ongoing, the hype around this technology should also be looked at with a degree of caution. As the Massachusetts Institute of Technology (MIT) explains, quantum supremacy is an ‘elusive concept’. First of all, we are still far from quantum computers that can do significant work; Wired magazine estimates that at least thousands of qubits would be required for fully functional quantum computers to solve real-life problems (current quantum computers that operate with fewer than 100 qubits are far from such a reality).
In addition, quantum computers are prone to many more errors than classical computers and, as already explained, the risk of decoherence makes it very difficult to maintain the quantum nature of qubits. The more qubits a quantum computer has, the more difficult it is to overcome such challenges. Moreover, a quantum computer cannot simply speed up the process of solving any task given to it; scientists explain that, for certain calculations, a quantum computer can be even slower than a classical one. Plus, only a limited number of algorithms have been developed so far where a quantum computer would clearly have supremacy over a classical computer. Governmental initiatives and policy issues The promises that quantum computing holds also make it the subject of an ongoing ‘race for supremacy’ not only among tech companies, but among nations too. The USA and China are currently at the forefront, while the EU, Japan, and others are following closely. In the USA, the National Quantum Initiative Act was adopted in December 2018, setting up a ‘federal programme to accelerate quantum research and development for the economic and national security of the United States’. The Act enables the allocation of over US$1 billion to support the research and development (R&D) of quantum technologies, including quantum computing. In March 2019, the White House Office of Science and Technology Policy created a National Quantum Coordination Office to ‘work with federal agencies in developing and maintaining quantum programmes, connecting with stakeholders, [and] enabling access and use of R&D infrastructure’. And in August 2019, President Trump adopted an executive order establishing the National Quantum Initiative Advisory Committee. China, on the other hand, is allocating substantial financial resources to university-based quantum research centres and is planning to open a National Laboratory for Quantum Information Science in 2020 (with an investment of around US$1 billion). 
On the R&D side, researchers have built a satellite that can send quantum-encrypted messages between distant locations, and a terrestrial ultra-secure network between Beijing and Shanghai that allows for the transmission of sensitive data with the help of quantum-encrypted keys. Beyond this ‘race for supremacy’, progress in quantum computing is also paving the way to new policy issues. For example, one immediate concern is that quantum computers could be used to break encryption systems that are utilised nowadays to secure online banking and shopping, for example. While quantum processors do not yet have such power, the potential is real and governments and companies have started to look into this issue. It is also likely that regulatory and ethical issues will emerge related to the use of the technology: How to ensure that quantum computing will be used for social good? Similar to the ongoing discussions regarding ethics and AI, will there be a need to implement ethical principles in the development of applications based on quantum computing?
Source: https://dig.watch/trends/quantum-computing
Nothing is more frustrating than watching that circle spinning in the centre of your screen, while you wait for your computer to load a programme or access the data you need. Now a team from the Universities of Sheffield and Leeds may have found the answer to faster computing: sound. The research – published in Applied Physics Letters – has shown that certain types of sound waves can move data quickly, using minimal power. The world’s 2.7 zettabytes (2.7 × 10^21 bytes) of data are mostly held on hard disk drives: magnetic disks that work like miniaturised record players, with the data read by sensors that scan over the disk’s surface as it spins. But because this involves moving parts, there are limits on how fast it can operate. For computers to run faster, we need to create “solid-state” drives that eliminate the need for moving parts – essentially making the data move, not the device on which it’s stored. Flash-based solid-state disk drives have achieved this, and store information electrically rather than magnetically. However, while they operate much faster than normal hard disks, they last a much shorter time before becoming unreliable, are much more expensive, and still run much slower than other parts of a modern computer – limiting total speed. Creating a magnetic solid-state drive could overcome all of these problems. One solution being developed is ‘racetrack memory’, which uses tiny magnetic wires, each one hundreds of times thinner than a human hair, down which magnetic “bits” of data run like racing cars around a track. Existing research into racetrack memory has focused on using magnetic fields or electric currents to move the data bits down the wires. However, both these options create heat and reduce power efficiency, which limits battery life and increases energy bills and CO2 emissions.
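To make “the data moves, not the device” concrete: racetrack memory behaves roughly like a shift register in which bits march past a fixed read head. A toy Python model (purely illustrative, not the researchers’ actual design):

```python
# Toy model of racetrack memory. The magnetic bits shift along the
# nanowire past a fixed read head (here, position 0). Moving the data,
# not the head, is what removes the mechanical parts of a disk drive.
from collections import deque

track = deque([1, 0, 1, 1, 0, 0, 1, 0])   # magnetic domains on the wire

def read_all(track):
    """Shift the whole track past the read head once, collecting the bits."""
    out = []
    for _ in range(len(track)):
        out.append(track[0])   # the fixed read head sees position 0
        track.rotate(-1)       # a drive pulse shifts every bit one step
    return out

print(read_all(track))   # -> [1, 0, 1, 1, 0, 0, 1, 0]
```

In the physical device, the “rotate” step is what magnetic fields, currents, or (in this research) surface acoustic waves provide.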
Dr Tom Hayward from the University of Sheffield and Professor John Cunningham from the University of Leeds have together come up with a completely new solution: passing sound waves across the surface on which the wires are fixed. They also found that the direction of data flow depends on the pitch of the sound generated – in effect they “sang” to the data to move it. The sound used is in the form of surface acoustic waves – the same as the most destructive wave that can emanate from an earthquake. Although already harnessed for use in electronics and other areas of engineering, this is the first time surface acoustic waves have been applied to a data storage system.
Exposure to a virus can ... - Coronavirus And [email protected]; More On How Your Computer Helps Medical Researchon March 22, 2020 at 7:09 am Can I Make Sure My Computer Only Works on the COVID-19 Problem? No, but you don’t need to since the group is already prioritizing the coronavirus effort. Although the software does offer the option to ... - Chip-based device opens new doors for augmented reality and quantum computingon March 22, 2020 at 7:00 am Trapped ion quantum computers are among the most promising practical designs for quantum computing, an emerging technology expected to be significantly faster than traditional computing. The new ... - Apple's 2020 iPad Pro: Finally, the Laptop Is Dead? Not So Fast.on March 22, 2020 at 5:27 am Not So Fast. The launch of the 2020 iPad Pro coincides with the imminent release of iPadOS ... s ongoing attempts to fully realize the iPad’s potential as a productivity platform. The iPad Pro is now ... - Amazon Web Services offers $20 million worth of credits and technical support for customers working on faster COVID-19 testingon March 20, 2020 at 10:03 am Amazon Web Services will offer $20 million worth of credits and technical support to customers who are working to develop faster COVID-19 testing. - How Fast Does a Virus Spread? Let’s Do the Mathon March 20, 2020 at 6:00 am How far and how fast will the Covid-19 pandemic spread ... Here's what that looks like: So, how do you tell if something is exponential? You could use a computer to fit an exponential function to the ... via Bing News
https://www.innovationtoronto.com/2015/11/the-solution-to-faster-computing-sing-to-your-data/
The definition of a quantum computer is quite simple: it is a computer that exploits the laws of physics and quantum mechanics for data processing, using the qubit as its fundamental unit, whereas the electronic computers we have always known use the bit. In particular, quantum bits have several properties that derive from the laws of quantum physics: - Superposition of states: a qubit can be 0 and 1 at the same time, which allows calculations to be made in parallel rather than sequentially, as happens today with the computational capacity of "traditional" computers. - Entanglement: the correlation (the bond) that exists between one qubit and another, a very important aspect because it strongly accelerates the calculation process, since one qubit can influence another even at a distance. - Quantum interference: an effect of the first principle (the superposition of states), interference allows you to "control" the measurement of qubits based on the wave nature of particles. Interference is the superposition of two or more waves, and its result depends on how their crests and troughs (the higher and lower parts of the wave) line up: constructive interference occurs when crests coincide with crests and troughs with troughs, forming a wave that is the sum of the overlapping waves; destructive interference occurs when the crest of one wave overlaps the trough of another, and the two waves cancel each other out. To understand how we got to the quantum computer, we have to go back to the miniaturization of circuits and Moore's Law. From the 1960s onwards, there has been a progressive increase in the computing power of computers, an increase that has gone hand in hand with the miniaturization of electronic circuits, from which derives the famous Moore's Law.
According to this law, "the complexity of a microcircuit, measured by the number of transistors in a chip (processor), and the relative calculation speed double every 18 months". Following this law - which over time has become a real measurement parameter and a guide for processor manufacturers' objectives - we have come to have integrated microchips, i.e., processors that integrate a CPU, a GPU, and a digital signal processor, inside our smartphones. However, miniaturization has now reached the limits of quantum mechanics, making it very complex (almost impossible) to continue along the path of miniaturization and increasing transistor density. This limit has opened the way to a paradigm shift: exploiting the laws of physics and quantum mechanics to achieve computing power higher than that of computers based on electronic calculation, without necessarily relying on the miniaturization of circuits. Where classical information units encode the two states, open and closed (with values 1 and 0), of a switch, quantum computers exploit qubits: units of quantum information encoded not as 1 or 0 but by the quantum state in which a particle or atom is found, which can hold the value 1 and the value 0 at the same time, in a variety of combinations that produce different quantum states (a particle can be 70% in state 1 and 30% in state 0, or 40% and 60%, or 15% and 85%). This takes on an incredible meaning when you follow the mathematical progression: 2 qubits can be in a superposition of 4 states, 3 qubits in a superposition of 8 states (the eight strings of three bits: 000, 001, 010, 011, 100, 101, 110 and 111), 4 qubits of 16 states, 8 qubits of 256 states, and so on. In general, the n qubits of a quantum computer can be in any superposition of up to 2^n different states.
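The exponential growth described above is easy to verify numerically. The short Python sketch below is a plain illustration, not tied to any real quantum hardware: it normalizes the amplitudes of the article's "70% / 30%" qubit (probabilities become squared amplitudes) and prints how the number of basis states doubles with each added qubit.

```python
import math

# A single qubit in superposition holds amplitudes a|0> + b|1> with a^2 + b^2 = 1.
# The article's "70% in state 1 and 30% in state 0" are measurement probabilities,
# so the amplitudes are their square roots.
p0, p1 = 0.30, 0.70
a, b = math.sqrt(p0), math.sqrt(p1)
assert abs(a**2 + b**2 - 1.0) < 1e-12  # the state is normalized

# n qubits span a superposition of 2**n basis states, doubling per added qubit.
for n in (2, 3, 4, 8):
    print(n, "qubits ->", 2**n, "basis states")
```

Running it reproduces the progression in the text: 4, 8, 16 and 256 states.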
In fact, atomic and subatomic particles can exist in a superposition of quantum states, which greatly expands the possibilities for encoding information and opens the way to exploiting this processing capacity to solve extremely complex problems, such as those underlying artificial intelligence. The critical issues that have so far slowed the race to develop these systems concern the controlled manipulation of atoms and particles: it is possible with a few qubits, but complex processing requires hundreds or thousands of qubits, which must be interconnected and able to communicate, along with algorithms suitable for a quantum computer. The functioning of the quantum computer (as mentioned in the first paragraph of this article) is based on two laws of quantum mechanics: - The superposition principle, from which derives, as we have seen, the possibility for particles to be in several different states simultaneously. In quantum physics, the superposition of states represents the simultaneous existence of all possible states of a particle or physical entity before its measurement. Only with measurement is it possible to define the property of the qubit precisely, and this is one of the most critical aspects that has so far kept the quantum computer from large-scale availability. The particles are unstable and their measurement is very complex; in addition, the instability of the particles generates heat, which, to date, can only be controlled with advanced cooling systems. - Quantum correlation (entanglement), which expresses the constraint, the correlation, that exists between two particles or two qubits. According to this principle, it is possible to know the state of one particle (or qubit) by measuring the other with which it is bound.
According to Gartner analysts, applications for quantum computing will be narrow and targeted, as a general-purpose quantum computer will most likely never be economically accessible on a large scale (at least not in the short term). However, the technology has the potential to revolutionize certain sectors. Quantum calculation could enable discoveries and be applied in many fields: - Machine learning: improved machine learning thanks to faster model building (due to parallel calculation). Examples include quantum Boltzmann machines, semi-supervised learning, unsupervised learning, and deep learning. - Artificial intelligence: faster calculations could improve perception, understanding, and the diagnosis of circuit faults and binary classifiers. - Chemistry: new fertilizers, catalysts, and battery chemistries could bring enormous improvements in the use of resources. - Biochemistry: new drugs, customized drugs, personalized medicine. - Finance: quantum calculation could allow faster and more complex "Monte Carlo simulations", for example in trading, optimization of "trajectories", market instability, price optimization, and hedging strategies. - Medicine and health: DNA gene sequencing, and optimization of radiation therapy treatment or brain tumor detection, could be done in seconds rather than hours or weeks. - Materials: research into super-resistant materials, anti-corrosive paints, lubricants, and semiconductors could be greatly accelerated by super-fast calculations.
https://essay.biz/article/what-is-a-quantum-computer
For decades, scientists have used techniques such as X-ray crystallography and nuclear magnetic resonance (NMR) imaging to gain invaluable insight into the atomic structure of molecules. Such efforts have long been hampered by the fact that they demand large quantities of a specific molecule, often in ordered and crystalized form, to be effective — making it all but impossible to peer into the structure of most molecules. Harvard researchers say those problems may soon be a thing of the past. A team of scientists, led by Professor of Physics and of Applied Physics Amir Yacoby, has developed a magnetic resonance imaging (MRI) system that can produce nanoscale images, and may one day allow researchers to peer into the atomic structure of individual molecules. Their work is described in a March 23 paper in Nature Nanotechnology. “What we’ve demonstrated in this new paper is the ability to get very high spatial resolution, and a fully operational MRI technology,” Yacoby said. “This work is directed toward obtaining detailed information on molecular structure. If we can image a single molecule and identify that there is a hydrogen atom here and a carbon there … we can obtain information about the structure of many molecules that cannot be imaged by any other technique today.” Though not yet precise enough to capture atomic-scale images of a single molecule, the system already has been used to capture images of single electron spins. As the system is refined, Yacoby said he expects it eventually will be precise enough to peer into the structure of molecules. While the system designed by Yacoby and colleagues operates in much the same way conventional MRIs do, the similarities end there. “What we’ve done, essentially, is to take a conventional MRI and miniaturize it,” Yacoby said. 
“Functionally, it operates in the same way, but in doing that, we’ve had to change some of the components, and that has enabled us to achieve far greater resolution than conventional systems.” Yacoby said that while conventional systems can achieve resolutions of less than a millimeter, they are effectively limited by the magnetic field gradient they can produce. Since those gradients fade dramatically within just feet, conventional systems built around massive magnets are designed to create a field large enough to image an object — like a human — that may be a meter or more in length. The nanoscale system devised by Yacoby and colleagues, by comparison, uses a magnet that’s just 20 nanometers in diameter — about 300 times smaller than a red blood cell — but is able to generate a magnetic field gradient 100,000 times larger than even the most powerful conventional systems. The difference, Yacoby explained, is that the nanoscale magnet can be brought incredibly close, within a few billionths of a meter, to the object being imaged. “By doing that, we can achieve spatial resolution that’s far better than one nanometer,” he said. The departures from conventional MRI systems, however, didn’t end there. To construct a sensor that could read how molecules react to that magnetic field gradient, Yacoby and colleagues turned to a field that would appear to be unconnected to imaging — quantum computing. Using ultra-pure, lab-grown diamonds, the team milled tiny devices, each of which ended in a super-fine tip, and embedded an atomic-scale impurity, called a nitrogen-vacancy (NV) center in each tip, creating a single quantum bit, or qubit — the essential building block of all quantum computers. In experiments published last year, Yacoby and his collaborators showed that as the tip was scanned across the surface of a diamond crystal, the quantum bit interacted with electron spins near the crystal’s surface. 
Those interactions could then be used to create an image of individual electron spins. However, while the sensitivity of the quantum bit sensor is sufficient to detect individual electron spins and represents a quantum leap forward from earlier efforts, its spatial resolution is limited by its distance from the object that is being imaged. To create truly 3-D images, Yacoby and colleagues combined the quantum-bit sensing approach with the large-field gradient by bringing the nanomagnet in close proximity to both the sample of interest and the qubit sensor. By scanning the magnet in 3-D, but very close to the sample, they were able to detect individual electron spins as they reacted to the magnetic field. “This is really a game of bringing both the magnet very close to generate large gradients, and bringing the detector very close to get larger signals,” Yacoby said. “It’s that combination that gives us both the spatial resolution and the detectability. “Our current system is already capable of imaging individual electron spins with sub-nm [subnanometer] resolution,” he said. “The goal, eventually, is to put a molecule in proximity to our NV center to try to see the components within that molecule, namely the nuclear spins of the individual atoms composing it. This is by no means an easy task, since the nuclear spin generates a signal that is 1,000 times smaller than that of the electron spin … but that’s where we’re headed.”
https://news.harvard.edu/gazette/story/2014/04/mri-on-a-molecular-scale/
Diamonds have always been considered among the rarest of gems, standing for innocence and elegance. But did you know that diamond could one day be a good replacement for silicon? That is possible thanks to atomic impurities deliberately introduced into the diamond through a process known as “doping.” Silicon is a widely used semiconductor, but it struggles at high temperatures and high power densities, where diamond’s wide bandgap and excellent thermal conductivity give it an advantage. However, the uses of this gem are not restricted to that alone. A new application of diamond is being discovered in the field of semiconductor manufacturing – the diamond sensor, a material that can detect impurities and defects in the materials used in semiconductors. The quantum diamond sensor is the most innovative diamond sensor. It has numerous benefits in comparison with current diamond sensor technologies, and as a result it is used in many labs worldwide. Let’s check out what a quantum diamond sensor is. What Is a Quantum Diamond Sensor? A quantum diamond sensor is a sensor that works at the atomic level: a device that uses diamonds to measure the behaviour of electrons in a specific material at a particular time. The device works by placing the diamond sensor in a vacuum chamber and applying a static magnetic field. The name “quantum diamond sensor” derives from the atomic-scale defects in the diamond (such as nitrogen-vacancy centers) that give off light when their electrons transition from a higher energy level to a lower one. The sensor is also called a diamond nanocrystal, or DNC. The sensor is not only able to detect; it is also very sensitive to changes. It is a discovery that could revolutionize the electronics industry. The quantum diamond sensor can also identify the grade of a diamond and the proportion of its flaws, so with this technology you can purchase a higher-quality diamond at a lower cost rather than a lower-quality diamond at a higher price.
Quantum diamonds are created at colder temperatures than ordinary lab-grown gems, using a different process. These diamonds typically have fewer impurities and more vibrant colors than their lab-grown cousins. So what is a lab-grown diamond? Lab-grown diamonds are man-made diamonds produced by subjecting carbon to high pressure and heat until it takes on the diamond’s crystal structure. They are grown, not mined, in a controlled lab setting: a cost-effective, environmentally friendly and non-polluting alternative to mined diamonds. How Does a Quantum Diamond Sensor Work? One form of quantum sensor reads the vibrations on a pressure plate, completing a circuit that changes the color of a liquid crystal display. The entire device is about the size of your hand, and the pressure plate is about half the size of the machine. The quantum diamond sensor works by bouncing a laser beam off the diamond and measuring the time the light takes to bounce back, which reveals the size and shape of the diamond. The whole device is connected to the pressure plate. The device needs to be cooled to reduce the thermal energy of the electrons and atoms, which allows the spin of the electrons to remain unchanged when the magnetic field is applied. The device consists of a series of diamond layers. What makes this method so fascinating is that it can reveal the different impurities in the diamond, giving the owner a better idea of what type of diamond they are buying. What Are the Applications of Quantum Diamond Sensors? Quantum diamond sensors have a wide range of applications across many scientific disciplines. One use is detecting trace amounts of toxic gases, using a modified version of a technique known as Surface-Enhanced Raman Spectroscopy (SERS).
The technique detects trace amounts of gases without being affected by other environmental gases, such as oxygen: SERS uses the surface of a diamond to concentrate and amplify a signal, such as pressure or force, allowing users to detect atomic or molecular structures. Another application of quantum diamond sensors is the detection of radioactive material. Their atomic structure makes them an excellent material for this purpose: they can distinguish between different atoms and so detect the presence of these substances. Diamond sensors can also be used in many other fields, such as studying photovoltaic cells, magnetic field strength, magnetoresistance, and more. Quantum diamond sensors are also used to study diamond films, which are grown on a substrate to create different diamond structures - for example, a film of just a single diamond layer deposited on the substrate, or many layers of diamond film stacked on top of one another. The aim is to study how diamond films change under different amounts of pressure and heat. Quantum diamond sensors can measure that response and have already shown promise in predicting the properties of diamond films, which may be useful in developing practical diamond films. Quantum diamond sensors could also be used to create quantum computers, which would be far more advanced than their digital counterparts. They are used in astronomy to detect hidden planets, and can be placed underwater to monitor changes in the environment, which is extremely useful for detecting underwater objects or changes in the water. Overall, the applications of this sensor are vast, spanning medicine, transportation, and many more fields. The diamond sensor is one of the most innovative technologies in today’s industries, and quantum sensors are more precise than traditional photoelectric sensors.
They can detect objects smaller than the wavelength of light, and can sense various materials, including organic, inorganic, and semi-organic objects. The quantum diamond sensor is also becoming a new standard in the measurement of quantum entanglement, the mysterious connection between two quantum objects. When quantum diamonds are entangled, many possible uses open up: quantum teleportation, increased computing power, and even communication linked to objects that are far away. Quantum diamonds are also much smaller and can store more information, which will make their use in quantum computing a vital step in the future.
https://texillo.com/all-you-need-to-know-about-quantum-diamond-sensor/
The simple answer to this question is no — within present quantum technology we are unable to build one with sufficient power to replace ordinary computers. In order to build a quantum computer you need particles that can behave like qubits, that is, the quantum analogue of the bits used by a classical computer. Qubits must be able to represent the values 0 and 1, but crucially, they must also be able to exist in a superposition of the two (see this article to find out more about how quantum computers work). The idea is to embody those qubits in particles — photons, electrons, atoms; researchers are working on a host of possibilities. And they have already succeeded in building quantum devices that use only a small number of qubits. For an example, think of a beam splitter: a half-silvered mirror that will reflect half of a beam of light that's shone on it and let the other half pass through. Since you can think of light as consisting of particles called photons, you can ask what happens to an individual photon as it hits the surface of the half-mirror. A possible answer is that it's got a 50:50 chance of being transmitted through the mirror or reflected — either one or the other. But that's not actually what happens. When the photon hits the mirror it enters a superposition state of simultaneously being reflected and transmitted. A beam splitter. Think of a photon as representing a qubit in state 0 if it is travelling in the vertical direction and a qubit in state 1 if it is travelling in the horizontal direction. Similarly, write 0 if it continues along the vertical path after hitting the beam splitter and 1 if it continues along the horizontal path. The beam splitter therefore turns an input qubit that's in the state 0 or 1 into a qubit that is in superposition of 0 and 1.
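The beam splitter's action can be sketched numerically. The Python toy below (plain lists and `math`, not any real quantum library) applies the 2×2 matrix that models a 50:50 beam splitter — the same matrix as the Hadamard gate — to a photon in state 0, showing the equal superposition and how a second pass interferes the two paths back into the original state.

```python
import math

s = 1 / math.sqrt(2)
# The 50:50 beam splitter / Hadamard gate as a 2x2 matrix of amplitudes.
H = [[s, s], [s, -s]]

def apply(gate, state):
    """Multiply a 2x2 gate by a 2-component state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

ket0 = [1.0, 0.0]       # photon on the vertical path ("state 0")
out = apply(H, ket0)    # superposition of both paths
print([round(p**2, 3) for p in out])   # measurement probabilities: [0.5, 0.5]

# A second pass interferes the two paths back into the original state —
# a genuinely quantum effect, not a 50:50 coin flip repeated twice.
back = apply(H, out)
print([round(x, 9) for x in back])     # [1.0, 0.0]
```

The second print is the point: if the photon really chose one path at random, two passes would still leave a 50:50 outcome, but the amplitudes cancel and restore state 0.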
We won't go further into the details here, but using such beam-splitters and photons that impinge on them it's possible to build quantum gates such as the Hadamard gate we mentioned in this article. Using a combination of just a few quantum gates, contraptions that are able to process one or two qubits, you can implement any quantum algorithm you can come up with — people have been able to prove this fact mathematically. So why can't we just bang together a few of those "universal" quantum gates to create a quantum computer that can perform tasks involving many qubits? That's exactly what happens in ordinary computing, where combinations of individual logic gates can perform all sorts of computations (see this article for an example). There are many difficulties involved in getting particles to behave and interact in the qubit way, and especially in building systems of many qubits, which are needed to get the computing benefits. But the most important hurdle faced by quantum computing is the fact that superposition states are delicate: they can only survive for any length of time when the quantum system in which they occur is extremely well-isolated from its environment. If it isn't, then its quantum nature sort of "leaks out" and dissipates in a process called decoherence (the system becomes entangled with its environment). It happens very quickly, and all that's left are states we are used to seeing in real life. It's the difficulty in keeping quantum systems isolated that stops us from building quantum computers that can handle more than just a few qubits. So when can we expect to have a fully-fledged, uncontroversial and practically useful quantum computer? "There are all sorts of wonderful and exotic things that people try to build quantum computers out of: defects in diamonds, electrons floating on liquid helium, superconducting qubits, and all sorts of amazingly imaginative stuff." says Jozsa.
"There is very good progress in all of these things, it's getting better and better all the time. You can't say that [quantum computing] is far off because these developments don't occur incrementally. The transistor for classical computing didn't emerge gradually: one day there wasn't one, but a few weeks later it existed and computing exploded." says Jozsa. He quotes the physicist N. David Mermin, who in his lecture notes on quantum computation states that "Only a rash person would declare that there will be no useful quantum computers by the year 2050, but only a rash person would predict that there will be." To find out more about quantum computing, read the following articles: - How does quantum computing work? - Quantum computing: Some (not so) gruesome details - What can quantum computers do? About this article Marianne Freiberger is Editor of Plus. She would like to thank Richard Jozsa, Leigh Trapnell Professor of Quantum Physics at the University of Cambridge, for his extremely helpful, very patient and generally invaluable explanations.
https://plus.maths.org/content/do-quantum-computers-exist
There are some problems that are simply too complex for even the most powerful of today’s computers, and researchers are trying to overcome the limits of traditional computer designs to enable computationally difficult problems to be solved. The von Neumann architecture, which has defined the layout of computing for the past 75 years, is being pushed in directions that it was never designed to take. This is the classical computing architecture, which effectively defines the way a processor fetches program instructions from memory, runs them and stores values back into memory. But the stored program architecture described by John von Neumann is less efficient at solving certain complex problem areas, compared with entirely new approaches to computing. Quantum computing is one of the new approaches to computing that opens up the ability to run calculations that would be impossible to complete on a classical computing architecture. However, quantum computers are currently highly specialised devices. Only recently have scientists been able to demonstrate a device that does not need to be supercooled and kept at near absolute zero (-270°C). Peter Chapman, CEO and president of IonQ, which recently listed on NYSE, said that when running handwriting recognition, an 11-qubit quantum computer outperformed classical computing and was more accurate in its ability to handle noisy data. “Machine learning is the first application area that will go to quantum computing,” he said. “It is much faster at creating models, and models are better.” Unlike the classical approach, which needs to be programmed in a way that can compensate for noise in the dataset, “a little bit of noise actually helps”, he said. What is more, although Moore’s Law has held true for classical computer architectures, where processing power doubles every 18 months to two years, scalability in quantum computing grows exponentially.
“We are doubling the number of qubits every 10 months,” said Chapman. In a machine with n qubits, the computational power is expressed as 2^n. In effect, each additional qubit doubles the processing power. To put this into perspective, said Chapman, the number of simultaneous states that a 120-qubit system could handle would be equivalent to the number of atoms in the universe. According to Chapman, modelling certain chemical reactions would require the computational power that is only available in the realms of quantum computing. But even in the real world, certain types of optimisations are simply too complicated for classical computing. There are numerous reports about how the programmers who developed the route optimisation software for logistics firm UPS only used right turns in their calculations. Looking at route optimisation, Chapman said: “What we do today is far from the optimal route as there is a set of cheats that programmers have figured out.” If an individual driver makes 120 deliveries a day, the number of different permutations of routes is a 200-digit number, he said. Multiply that by the number of drivers and, from a calculations perspective, the problem space quickly becomes astronomical. “A quantum approach offers a different way to solve the problem,” said Chapman. IonQ is developing a quantum computer that does not need to be supercooled. According to its roadmap, the company plans to offer a rack-mounted quantum computer by 2023.
Such a system would avoid the latency associated with running quantum computing as a cloud resource, to support applications in high-performance computing that need low-latency connectivity to supercomputers and applications that rely on real-time processing. It is this idea of taking computing from the cloud towards the edge that is driving Intel’s new-generation Loihi chip architecture for neuromorphic computing. Loihi 2, unveiled at the end of September, is Intel’s second-generation neuromorphic research chip. The company has also released Lava, an open source software framework for developing neuro-inspired applications. Neuromorphic computing adapts the fundamental properties of neural architectures found in nature to build a new model of computer architecture. The paper Advanced neuromorphic computing with Loihi describes neuromorphic computing as classes of brain-inspired computation that challenge the von Neumann model. The paper’s authors said one of the most promising application areas of neuromorphic technology is in emulating how the biological brain has evolved to solve the challenges of interacting with dynamic and often unpredictable real-world environments. Mirroring the biological world, a neuromorphic chip has a neuron, synapses for neuron-to-neuron connectivity and dendrites, which enable the neuron to receive messages from multiple neurons. According to Intel’s specifications, each Loihi 2 chip consists of microprocessor cores and up to 128 fully asynchronous neuron cores connected by a network-on-chip (NoC). The neuron cores are optimised for neuromorphic workloads, each implementing a group of “spiking” neurons, including all synapses connecting to the neurons. All communication between neuron cores is in the form of spike messages, which mimics neural networks in a biological brain. Whereas the previous Loihi chip had three microprocessor cores, Intel said it has doubled the number of embedded microprocessor cores in Loihi 2 to six. 
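Intel's actual neuron model is configurable and not publicly specified in detail, but the "spiking" behaviour described above can be illustrated with a generic leaky integrate-and-fire (LIF) neuron. The Python sketch below is a toy under stated assumptions (constant input current, arbitrary threshold and leak values chosen for illustration), not Loihi's real dynamics.

```python
# A minimal leaky integrate-and-fire (LIF) neuron: membrane potential leaks
# each timestep, integrates incoming current, and emits a spike on threshold.
def lif_run(input_current, threshold=1.0, leak=0.9):
    """Return the timesteps at which the neuron spikes."""
    v, spikes = 0.0, []
    for t, i in enumerate(input_current):
        v = v * leak + i          # leaky integration of the incoming current
        if v >= threshold:
            spikes.append(t)      # spike message sent to downstream neurons
            v = 0.0               # reset the membrane potential
    return spikes

print(lif_run([0.3] * 10))        # constant drive produces a regular spike train
```

The spike times, rather than continuous activations, are what travel between neuron cores, which is why communication in such a chip can be sparse and event-driven.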
Garrick Orchard, a researcher at Intel Labs, said: “We are not trying to directly model biology, but taking some things we think are important.” On the Loihi chip, to model biological neuron behaviour, one part of the chip functions as the neuron’s core, he said. “We have a bit of code that describes the neuron,” he added. There are also neuromorphic computing versions of biological synapses and dendrites, all built using asynchronous digital complementary metal-oxide semiconductor (CMOS) technology.
Deep neural networks
Given that neuromorphic computing is inspired by biological systems, deep neural networks (DNNs) for machine learning are one of the application areas being targeted. Orchard added: “Using neuromorphic computing for a DNN is something people understand, but we need to differentiate. We are not trying to be a DNN accelerator. There’s more to AI than deep learning.” Where Loihi 2, and neuromorphic computing as a whole, seems to have a good fit is in the area of edge computing for processing sensor data at low latency. Orchard said it could be used within a microphone or camera to offer visual and tactile perception similar to biological systems, in systems such as robotic arm controllers that can adapt to the weight of an object they are trying to lift, or within a drone to provide very low-latency control. Within a datacentre environment, a neuromorphic computer could power a recommendation engine or be used in scientific computing to model how forces propagate through a physical structure, said Orchard. There is something of an overlap in application areas with quantum computing. Orchard said a neuromorphic computer can be applied to a certain class of hard optimisation problems, such as scheduling at train operator Deutsche Bahn, which is currently investigating its use. But although there may be an overlap in application areas, Orchard said that, unlike a quantum computer, a neuromorphic computer is much easier to scale up.
The Loihi 2 chip can scale simply by wiring chips together. “You can build very large systems,” he added. With Loihi 2 and Lava, neuromorphic computing is pushing closer to commercialisation, said Orchard. Both Intel and IonQ are looking at putting next-generation computing nearer to the edge. Intel’s approach with Loihi is effectively about designing a semiconductor chip to behave in a similar way to a brain neuron, and then using biologically inspired algorithms to run on this new architecture. Quantum computing is built on a foundation of quantum physics. Although they are very different, both approaches offer an insight into how computationally complex problems could be tackled in the future.
Have you ever seen weird tunnels running through space-time? Well, that just might be a wormhole. We’ve all heard about fancy portals that allow us to travel to far-flung locations in a matter of seconds: one sucks you in and spits you out somewhere else. Then you ask yourself, “How did that happen?” Someone suddenly produces a sheet of paper, folds it in half, and punches a hole through the fold. There you go: a wormhole! A fold that connects two far-flung places.
What are wormholes?
A wormhole is a specific solution to Einstein’s general relativity equations that creates a tunnel between two distant places in space or time. The length of this tunnel should ideally be less than the distance between those two places, effectively making the wormhole a shortcut. Wormholes are, as far as we know, only hypothetical. They are a staple of science fiction and have caught the popular imagination. Although they are valid solutions of general relativity, scientists have never been able to create a stable wormhole in the real world, and the stability problem may prohibit them from existing at all. So far, white holes have only been described in a vacuum: they vanish as soon as mass is added to the system, and as a result the wormhole becomes unstable. Another possibility is that a wormhole has two black holes at its endpoints. In any event, a wormhole cannot simply be ‘built’; in most proposals, quantum entanglement is required before a wormhole can be produced.
Types of wormholes
Lorentzian wormholes (general relativity) and Euclidean wormholes (particle physics) are the two main forms of wormholes studied by physicists.
⦁ Lorentzian wormholes
Traversable wormholes would allow rapid travel in both directions from one section of the universe to another inside the same universe, as well as transit across universes. These wormholes are essentially shortcuts through time and space. The good thing about these wormholes is that, even after ten years of research, no one has been able to rule them out.
The bad thing is that these odd objects would require a lot of negative mass to keep them open and prevent them from collapsing, if they exist at all. If Lorentzian wormholes exist, it appears to be quite simple to convert them into time machines.
⦁ Euclidean wormholes
Even stranger are Euclidean wormholes, which exist in “imaginary time” and are fundamentally virtual quantum mechanical processes. These objects cannot be neatly explained in terms of a well-behaved classical gravitational field; you need a great deal of quantum physics to understand even their most basic qualities.
How can you find wormholes?
Do you want to know how to spot a wormhole? Such paths could connect one part of our universe to another part at a different time and/or place, or even to another universe entirely. One strategy focuses on detecting a wormhole near Sagittarius A*, which is assumed to be a supermassive black hole at the galaxy’s center. While there’s no indication of a wormhole there, it’s a good spot to look, because wormholes are thought to require extreme gravitational conditions like those found around supermassive black holes.
Where to find wormholes?
⦁ Center of the Milky Way
In 2015, Italian astronomers proposed that a wormhole could exist 27,000 light-years away in the Milky Way’s center. Normally, exotic matter would be required to keep wormholes open; however, the scientists believe dark matter may be capable of doing so.
⦁ Quantum foam
Space isn’t empty: at the subatomic level, it’s a cauldron of seething energy that comes and goes. In this ‘quantum foam,’ temporary black holes are constantly being formed. However, if we wanted to make one permanent, we’d require a lot more energy.
⦁ Inside a black hole
Some researchers believe we’d find a wormhole instead of a singularity at the center, as predicted by general relativity. The jury is still out on whether it would be large enough for a human to pass through.
How to identify a wormhole?
⦁ Echo of gravitational waves
Gravitational waves from merging black holes fade fast, but two colliding wormholes would produce an echo that might be detected in future studies.
⦁ Microlensing
Microlensing occurs when a wormhole passes in front of a distant star, bending the star’s light slightly. This method has already been used to locate rogue planets.
⦁ Approaching one
Some physicists believe that wormholes are black holes in disguise. It’s a risky venture, but sending something into one would confirm whether or not wormholes exist.
Are wormholes dangerous for humans?
Moving faster than light is one way to travel around the cosmos within a single lifetime, yet we might achieve the same thing in a single second by using a physical wormhole to cross vast distances at once. It turns out that humans may be able to make the trek, but there is a catch. Such wormholes would be minuscule, meaning that even the most rigorous workout routine would not make humans slim enough for the journey; it’s not as if you can just shove them in. There is another catch: from one side to the other, the wormhole traveler would only take around a second, but anyone not accompanying them would witness hundreds of years pass by. From this post, we can conclude that wormholes may connect not only two different sections of the universe but perhaps two entire universes. Similarly, some scientists believe that time travel may be possible if one of the wormhole’s mouths is manipulated in a precise way. Even if wormholes could be discovered, today’s technology is insufficient to enlarge or stabilize them. However, scientists are continuing to research the concept as a means of space travel, in the hopes that technology may be able to use it in the future.
Scientists have reached a significant milestone in the advancement of quantum computing. Quantum computing is a new technology that promises a paradigm shift in computing and faster solutions to a wide range of problems. However, quantum devices are still in their infancy, with most having only a few qubits. This necessitates the use of simulation to develop quantum algorithms and test these devices. While there are many algorithms for simulating quantum circuits, there are (at the time of writing) no tools that use OpenCL to parallelize this simulation, allowing it to take advantage of devices such as GPUs while remaining portable. Quantum computers have the potential to revolutionize science by enabling computations that were previously thought to be impossible. However, there is a long way to go and many difficult tests to pass before quantum computers become a commonplace reality. One of the experiments involves using quantum computers to simulate material properties for next-generation quantum technologies. Quantum computing has the potential to solve some of our planet’s most pressing problems, including those in the environment, agriculture, health, energy, climate, materials science, and others we haven’t yet encountered. Classical computing is becoming increasingly difficult to apply to some of these problems as the systems grow in size. In a new study, researchers from the U.S. Department of Energy’s (DOE) Argonne National Laboratory and the University of Chicago conducted quantum simulations of spin defects, which are specific impurities in materials that could provide a promising foundation for new quantum technologies.
By correcting for noise introduced by quantum hardware, the researchers improved the accuracy of calculations on quantum computers. “We want to learn how to use new and emerging computational technologies. Developing robust strategies early in the history of quantum computing is an important first step toward understanding how to use these machines efficiently in the future,” said Giulia Galli of Argonne National Laboratory and the University of Chicago. The research was conducted as part of the Midwest Integrated Center for Computational Materials (MICCoM), a DOE computational materials science program headquartered at Argonne, as well as Q-NEXT, a DOE National Quantum Information Science Research Center. “We do these kinds of simulations to gain a fundamental understanding of material properties and also to tell experimentalists how to eventually better design materials for new technologies,” said Galli, a professor at the University of Chicago’s Pritzker School of Molecular Engineering and Department of Chemistry, senior scientist at Argonne National Laboratory, Q-NEXT collaborator, and director of MICCoM. “The experimental results for quantum systems are frequently complicated and difficult to interpret. A simulation is necessary to aid in the interpretation of experimental results and the formulation of new predictions.” While quantum simulations have long been done on traditional computers, quantum computers might be able to solve problems that even the most powerful traditional computers today cannot tackle. Whether that target can be reached remains to be seen, as researchers around the world continue the effort to build and use quantum computers. “We want to learn how to use new computational technologies that are emerging,” said Galli, the paper’s lead author.
“Developing robust strategies in the early days of quantum computing is a critical first step toward understanding how to use these machines efficiently in the future.” Examining spin defects provides a real-world system for validating quantum computer capabilities. “The vast majority of quantum computer calculations these days are on model systems,” Galli explained. “These models are interesting in theory, but simulating a real material of experimental interest is more valuable to the scientific community as a whole.” Calculating the properties of materials and molecules on quantum computers encounters a problem that classical computers do not: hardware noise. Noisy calculations produce slightly different results each time a calculation is performed; for example, a noisy addition operation might produce values slightly different from 4 each time the question “What is 2 plus 2?” is asked. “The uncertainty in the measurement is dependent on the quantum hardware,” said Argonne scientist Marco Govoni, co-lead author of the study. “One of our accomplishments was that we were able to correct our simulations to compensate for the noise that we encountered on the hardware.” Understanding how to handle noise in quantum computers for realistic simulations is a significant result, according to the study’s first author, University of Chicago graduate student Benchen Huang. “We can expect noiseless quantum computing in the future; learning how to eliminate or cancel noise in our simulation will also teach us whether quantum advantage will become a reality and for which problems in materials science.” Finally, the groundbreaking potential of quantum computers, according to Galli, will motivate more work in this area. “We’ve only just begun,” she explained. “The road ahead appears to be full of exciting challenges.”
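The correction idea described above, running a circuit at deliberately amplified noise and extrapolating back to the zero-noise limit, can be sketched as a toy two-point Richardson extrapolation. The linear noise model below is purely illustrative; the study's actual mitigation scheme is not specified here:

```python
def noisy_expectation(true_value, noise_scale, decay=0.05):
    # Toy noise model: the measured expectation value shrinks
    # linearly with the noise scale (illustrative only).
    return true_value * (1.0 - decay * noise_scale)

def zero_noise_extrapolate(measure, scales=(1.0, 2.0)):
    """Linear (two-point Richardson) extrapolation to zero noise."""
    s1, s2 = scales
    y1, y2 = measure(s1), measure(s2)
    slope = (y2 - y1) / (s2 - s1)
    return y1 - slope * s1   # extrapolated value at noise scale 0

true = 0.75
measure = lambda s: noisy_expectation(true, s)
print(round(zero_noise_extrapolate(measure), 6))  # recovers 0.75
```

On real hardware the measured points are themselves statistical estimates, so the extrapolation reduces bias at the cost of amplifying shot noise; this sketch ignores that trade-off.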
Cryptography is the art of encoding sensitive information so that only authorized users can decode it. Existing coding methods convert information using a shared key (a sequence of bits) that specifies the conversion details. When communicating partners wish to generate a key for secure communication, they exchange information over a public channel, but in a form from which an eavesdropper would find it difficult to extract the key. Current key generation protocols rely on mathematical complexity to achieve this capability. With the accelerating pace of development in quantum computing, such encryption methods are at risk -- quantum computers solve mathematically complex problems much faster than conventional computers. For example, the ubiquitous RSA encryption scheme is rendered insecure by employing a quantum-factorization algorithm. Consequently, data that requires long-term security needs to be encrypted in a quantum-secure manner, so that it cannot be intercepted today and decrypted tomorrow by a future quantum computer. Quantum Key Distribution and Post-Quantum Cryptography provide schemes that are resilient against this threat posed by quantum computers. Currently, a popular encryption method called the Advanced Encryption Standard-Galois Counter Mode (AES-GCM) is the standard proposed by NIST for two parties to code and decode messages using a shared secret key (i.e. the key is symmetric). To establish this key, the parties follow a key exchange protocol, e.g. the Transport Layer Security (TLS) handshake. This process uses an asymmetric key pair, consisting of mathematically linked private and public keys. One party “signs off” her transmission with her private key, while the other party mathematically verifies the signature using the public key. Security is based on the difficulty of solving mathematical problems, e.g. factorizing numbers that are products of large primes in the RSA protocol.
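The asymmetric scheme described above can be illustrated with a toy RSA example. The primes here are deliberately tiny and trivially factorable; real keys use primes hundreds of digits long:

```python
import math

p, q = 61, 53                 # tiny "secret" primes -- illustrative only
n = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime with phi
assert math.gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent: modular inverse of e

message = 1234
ciphertext = pow(message, e, n)     # encrypt with the public key
decrypted = pow(ciphertext, d, n)   # decrypt with the private key
assert decrypted == message

# Security rests entirely on the difficulty of recovering p and q
# from n -- exactly the problem Shor's algorithm would make easy.
```

The three-argument `pow` performs fast modular exponentiation, and `pow(e, -1, phi)` (Python 3.8+) computes the modular inverse.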
However, a quantum computer will break all existing public key exchange methods -- an adversary deploying Shor’s quantum-factorization algorithm will solve this type of mathematical problem exponentially faster than a classical computer. AES-GCM variants operating with key sizes of 128 bits or less will also be compromised by Grover’s quantum search algorithm, which provides a quadratic speed-up when searching through all possible keys for deciphering an encrypted message. Fortunately, this threat can be countered by extending the key length to 256 bits, increasing the search time to an impractical extent, even for a quantum computer. Similarly, hash functions producing 256-bit outputs, widely used for fingerprinting data, are not expected to be broken by this attack. However, one has to assume that a quantum attack more efficient than Grover’s search does not exist. In response to the quantum computing threat posed to existing cryptographic techniques, two approaches have been developed: Post-Quantum Cryptography (PQC) and Quantum Key Distribution (QKD). PQC comprises mathematically complex algorithms resistant to quantum computing attacks. A suitable PQC public key exchange standard has yet to be established; potential candidates are currently being reviewed by the National Institute of Standards and Technology (NIST). Quantum Key Distribution has now begun to see commercial adoption. The security of the key material is based on the laws of quantum physics, rather than mathematical complexity, and is therefore quantum-safe.
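The effect of Grover's quadratic speed-up on symmetric keys can be expressed in "security bits", assuming, as the text notes, that no attack better than Grover exists:

```python
import math

def grover_security_bits(key_bits: int) -> float:
    """Effective security of a k-bit key against Grover's search.

    A classical brute-force attack needs about 2**k trials; Grover
    needs about sqrt(2**k) = 2**(k/2), halving the security level.
    """
    return key_bits / 2

assert grover_security_bits(128) == 64    # marginal against a quantum attacker
assert grover_security_bits(256) == 128   # comfortable again

# Doubling the key length restores the original security margin:
classical_trials = 2 ** 128
grover_queries = math.isqrt(2 ** 256)
assert grover_queries == classical_trials
```

This is why the text recommends 256-bit AES keys and 256-bit hash outputs: the post-Grover security level then matches today's 128-bit classical level.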
| Cryptographic Algorithm | Type | Purpose | Quantum Safe? | Available Now? |
| --- | --- | --- | --- | --- |
| RSA, ECDSA | Asymmetric | Key Establishment, Signatures | No | Yes |
| AES-GCM | Symmetric | Encryption | Larger Key Sizes Needed | Yes |
| SHA-3 | - | Hash Function | Larger Output Needed | Yes |
| Post-Quantum Cryptography | Public | Encryption, Key Establishment, Signatures | Yes | No |
| Quantum Key Distribution | Symmetric | Key Generation | Yes | Yes |

Quantum Key Distribution is the generation and distribution of cryptographic keys secured by quantum physics. The information required to generate the keys is encoded in the properties of photons, which can be distributed over long distances via an optical link. Quantum Key Distribution security leverages quantum physics, which specifies that an unknown photon state cannot be measured or copied without altering the original state -- an eavesdropper inadvertently reveals her presence as she introduces a detectable, irreversible error. The S-Fifteen Instruments Quantum Key Distribution system implements the BBM92 protocol, which exhibits fewer vulnerabilities compared to systems running the more common BB84 protocol. We use entangled photon pairs for distributing quantum states, sending one photon of each pair to each party across an optical link. Although the two photons of a pair are correlated through quantum entanglement, their individual states are inherently random. This inherent randomness is achieved without the active optical components commonly found in prepare-and-measure protocols. The inclusion of active elements, e.g. phase modulators, has been shown to potentially leak information and requires countermeasures whose implementation increases system complexity and requires additional security verification. Our implementation uses exclusively passive components, which simplifies auditing our system for vulnerabilities.
Overall, our BBM92 system is intrinsically immune to attacks targeting the following security issues -- addressing these in a BB84 system typically requires additional countermeasures: Trojan-horse attacks, multi-photon emissions, and phase correlation between signal pulses. A notable aspect of the BBM92 protocol we have adopted is the direct use of quantum randomness. We do not need to rely on a separate random number generator for controlling active elements in our hardware -- such devices typically require their own security certification. We rely instead on the intrinsic unpredictability of the polarization of photons prepared in an entangled state, and of the path chosen when passing through a 50:50 beam-splitter, as sources of quantum randomness. Quantum randomness has the advantage of being intrinsically unpredictable and fundamentally inaccessible to any external party -- our system derives randomness directly from the photon source used for communication, rather than from an additional device. Any cryptographic system needs to prove its resilience against attacks. We actively investigate potential vulnerabilities in our implementation and develop countermeasures to improve security. In the past, we have looked into the timing information exchanged between communicating parties as a side channel from which an attacker could collect a large amount of information about the key. This vulnerability is neutralized in our current QKD implementation by randomizing photon emission times using a free-running entangled photon source. Currently, we are investigating detector-blinding attacks as part of a comprehensive vulnerability study. Work with us to make your organization quantum-safe.
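The basis-sifting step that turns shared entangled pairs into a shared key can be sketched as a toy simulation. This is an idealized, noise-free model of an entanglement-based protocol, not the actual BBM92 implementation, and it stands in for quantum randomness with a seeded pseudo-random generator:

```python
import random

random.seed(7)  # reproducible demo; real systems use quantum randomness

def sift_key(n_pairs: int):
    """Idealized entanglement-based sifting: each party measures its
    photon in a randomly chosen basis (Z or X); rounds where the bases
    match yield perfectly correlated bits, the rest are discarded."""
    alice_key, bob_key = [], []
    for _ in range(n_pairs):
        alice_basis = random.choice("ZX")
        bob_basis = random.choice("ZX")
        outcome = random.randint(0, 1)   # stand-in for quantum randomness
        if alice_basis == bob_basis:
            # Ideal entangled pair: matching bases give matching bits
            alice_key.append(outcome)
            bob_key.append(outcome)
    return alice_key, bob_key

alice, bob = sift_key(1000)
assert alice == bob       # identical sifted keys in the ideal case
print(len(alice))         # roughly half of the 1000 pairs survive sifting
```

In a real link, a small fraction of the sifted bits disagree; that error rate is exactly the eavesdropping signature the protocol monitors.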
Atom Arrays for Superresolution Imaging
A touchstone for superresolution optical imaging techniques for cold atomic gases is the precision with which they can resolve individual atoms, which are far smaller than the wavelength of the light used for imaging (see Viewpoint: Zooming in on Ultracold Matter). Now, a team based in the US and Germany has turned this problem on its head. Instead of using light to probe atoms, they use atoms to probe an electromagnetic field, taking advantage of the atoms’ tiny size to image the field with high resolution [1]. Using a one-dimensional array of rubidium atoms trapped in optical tweezers, Emma Deist, at the University of California, Berkeley, and colleagues scanned a light field in a cavity with a spatial resolution below the wavelengths of both the cavity field and of the tweezer light. The work may usher in a new generation of metrology and sensing schemes based on individually controlled neutral atoms. It also introduces, for the first time, large arrays of individual atoms into the cavity quantum-electrodynamics (cavity QED) toolbox. Techniques developed over the last few years have allowed researchers to create large, defect-free atomic arrays that can now approach 1000 atoms in two dimensions [2]. These advances have made atomic arrays a prominent platform for quantum information science, with applications ranging from quantum simulation to quantum computing to precision optical metrology [3]. Deist and colleagues extend the use of atom arrays in the latter category, employing a comparatively modest, one-dimensional array of up to ten atoms to probe the intensity of a standing-wave field inside an optical cavity. The row of atoms used by the researchers extends along the radial direction of the cavity mode (i.e., perpendicular to the cavity axis). They scan this array along the cavity axis, thereby encompassing the two-dimensional plane defined by the cavity’s radial and axial directions (Fig. 1).
The cavity field shifts the resonance frequency of an atomic transition, and, by measuring the rate of the fluorescence induced by a calibrated probe beam, the team determines the intensity of the field at any position in the wave. The result is a beautiful map of the profile of the field, revealing a Gaussian field distribution with a radius consistent with predictions based on the cavity geometry and on the cavity field’s wavelength of 1560 nm. Since the tweezer array localizing the atoms is positioned with electrically actuated adaptive optical elements, the positions of the atoms can be controlled with a precision much greater than the optical wavelength of the light used to generate the tweezers and detect the atoms. For example, the acousto-optic deflector (AOD) used to generate the tweezers provides a mapping between the radio frequency driving the AOD and the position of the tweezer in the focal plane [4]. Thus, radio-frequency precision at the kHz level can, in principle, position a tweezer with a spatial precision far finer than the size of the tweezer itself. Deist and colleagues employ this capability to probe the cavity field on length scales shorter than the field’s 1560-nm wavelength and the 780-nm wavelength of the fluoresced light used to perform the measurement. The ultimate limit on the spatial precision of their approach is set by the accuracy with which the atoms can be localized within their respective tweezers. For atoms cooled to the motional ground state of their optical tweezers, the atomic wave function can be much smaller than the optical wavelength. In their experiment, Deist and colleagues have slightly hotter atoms which, combined with other technical factors, limit the achievable spatial resolution. However, further cooling and other technical upgrades are readily available, offering significant improvements to the possible resolution.
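The frequency-to-position argument can be made concrete with a hypothetical AOD conversion factor; the paper's actual calibration value is omitted above, so 1 µm of deflection per MHz of drive frequency is assumed here purely for illustration:

```python
# Hypothetical AOD calibration: 1 MHz of radio-frequency change moves
# the tweezer by 1 um in the focal plane (illustrative value only).
UM_PER_MHZ = 1.0

def tweezer_shift_nm(rf_shift_khz: float) -> float:
    """Tweezer displacement for a given RF shift, in nanometres."""
    return rf_shift_khz * 1e-3 * UM_PER_MHZ * 1e3  # kHz -> MHz, um -> nm

# kHz-level RF precision then gives nanometre-level positioning --
# far below the 780-nm imaging wavelength or a ~1 um tweezer waist.
assert tweezer_shift_nm(1.0) == 1.0        # 1 kHz  -> 1 nm
assert tweezer_shift_nm(1000.0) == 1000.0  # 1 MHz  -> 1 um (1000 nm)
```

Whatever the real conversion factor, the point survives: the positioning precision is set by radio-frequency resolution, not by the optical wavelength.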
The researchers also introduce a second field into the cavity with a wavelength of 781 nm—slightly larger than half that of the first field. This short-wavelength field induces forces sufficiently large to displace the atoms in the tweezers, thereby distorting the measurements of the long-wavelength cavity field made using the method described above. In this way, the team uses the atom array as a field-sensitive force sensor—an atomic-force supermicroscope. The techniques demonstrated by Deist and colleagues could be helpful for diagnosing optical fields in myriad applications—not only in optical cavity systems such as that used in their experiment but also in optical lattice systems, where, for example, the investigation of Hubbard models requires ever-improving control of optical potentials. Beyond optical metrology, marrying a scalable array of single atoms to an optical cavity with strong atom-photon coupling is an enabling breakthrough for quantum information technologies. Systems with atoms coupled to optical cavities—described by cavity QED—offer the ability to entangle atomic spins with photons. Such entanglement could then be used to generate atom-atom entanglement within the cavity, perform nondemolition measurements of atomic spins, and generate remote entanglement between separated systems via a photonic quantum bus—essential operations for quantum computing and communication hardware built on this platform. Cavity QED with single atoms or with atomic ensembles has been studied for the past few decades, but only recently has it been applied to a pair of individual atoms [5] and to an ordered array of atomic ensembles [6]. Deist and colleagues are the first to demonstrate a cavity QED system combined with a scalable array of individual atoms. Another field of application involves schemes with highly excited “Rydberg” states [7], which have recently become the dominant approach to entanglement of neutral atoms.
Entanglement “fidelities” using this approach now exceed 0.99 [8]—beyond what can readily be achieved using photon-mediated interactions in a cavity. However, as Rydberg atoms interact with each other via their electric dipole moments, Rydberg interactions are inherently short range, which complicates the development of many-body entangled states such as logically encoded qubits. Photon-mediated interactions in optical cavities could solve this problem [9, 10] because they act over an infinitely long range. Unfortunately, stray electric fields generated by the dielectric surfaces of the mirrors cause a transient Stark shift in the Rydberg states that adversely affects the ability to reliably perform Rydberg-based entangling operations. Deist and colleagues avoid this problem by employing a near-concentric cavity with a large (1 cm) mirror spacing. Near-concentric cavities provide small mode volumes (and thus strong atom-photon coupling) even with such large mirror separation. Since the mirrors in this setup are sufficiently far away from the atoms, the marriage of cavity QED with strong coupling and Rydberg atom arrays is made possible for the first time.
[1] E. Deist et al., “Superresolution microscopy of optical fields using tweezer-trapped single atoms,” Phys. Rev. Lett. 128, 083201 (2022).
[2] S. Ebadi et al., “Quantum phases of matter on a 256-atom programmable quantum simulator,” Nature 595, 227 (2021).
[3] A. M. Kaufman and K.-K. Ni, “Quantum science with optical tweezer arrays of ultracold atoms and molecules,” Nat. Phys. 17, 1324 (2021).
[4] P. Zupancic et al., “Ultra-precise holographic beam shaping for microscopic quantum control,” Opt. Express 24, 13881 (2016).
[5] S. Welte et al., “Photon-mediated quantum gate between two neutral atoms in an optical cavity,” Phys. Rev. X 8, 011018 (2018).
[6] A. Periwal et al., “Programmable interactions and emergent geometry in an array of atom clouds,” Nature 600, 630 (2021).
[7] M. Saffman et al., “Quantum information with Rydberg atoms,” Rev. Mod. Phys. 82, 2313 (2010).
[8] I. S. Madjarov et al., “High-fidelity entanglement and detection of alkaline-earth Rydberg atoms,” Nat. Phys. 16, 857 (2020).
[9] W. Huie et al., “Multiplexed telecommunication-band quantum networking with atom arrays in optical cavities,” Phys. Rev. Res. 3, 043154 (2021).
[10] J. Ramette et al., “Any-to-any connected cavity-mediated architecture for quantum computing with trapped ions or Rydberg arrays,” arXiv:2109.11551.
Scientists are using quantum computing to help them discover signs of life on other planets
Quantum computers are assisting researchers in scouting the universe in search of life outside of our planet – and although it’s far from certain they’ll find actual aliens, the outcomes of the experiment could be almost as exciting. During the eight-week program, quantum resources will be combined with classical computing tools to resolve complex calculations with better accuracy, with the end goal of finding out whether quantum computing could provide a useful boost to the work of astrophysicists, despite the technology’s current limitations. Detecting life in space is as tricky a task as it sounds. It all comes down to finding evidence of molecules that have the potential to create and sustain life – and because scientists don’t have the means to go out and observe the molecules for themselves, they have to rely on alternative methods. Typically, astrophysicists pay attention to light, which can be analyzed through telescopes. This is because light – for example, infrared radiation generated by nearby stars – often interacts with molecules in outer space. When it does, the particles vibrate, rotate, and absorb some of the light, leaving a specific signature in the spectral data that can be picked up by scientists back on Earth. For researchers, therefore, all that is left to do is to detect those signatures and trace them back to the molecules they correspond to. The problem? MIT researchers have previously established that over 14,000 molecules could indicate signs of life in exoplanets’ atmospheres. In other words, there is still a long way to go before astrophysicists have drawn up a database of all the different ways that those molecules might interact with light – of all the signatures that they should be looking for when pointing their telescopes at other planets.
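The search procedure, comparing observed absorption features against a database of known molecular signatures, can be sketched in a few lines. The line database and matching rule below are invented for illustration; real line lists contain millions of measured transitions:

```python
# Hypothetical absorption-line database (wavenumbers in cm^-1);
# values are illustrative stand-ins for real catalogued transitions.
SIGNATURES = {
    "H2O": [1595.0, 3657.0, 3756.0],
    "CO2": [667.0, 1333.0, 2349.0],
    "CH4": [1306.0, 3019.0],
}

def identify(observed, tolerance=2.0):
    """Return molecules whose every catalogued line appears in the
    observed spectrum, within the instrument's tolerance."""
    hits = []
    for molecule, lines in SIGNATURES.items():
        if all(any(abs(line - o) <= tolerance for o in observed)
               for line in lines):
            hits.append(molecule)
    return hits

spectrum = [666.5, 1332.1, 1594.8, 2348.3, 3656.2, 3755.5]
print(identify(spectrum))  # ['H2O', 'CO2'] -- the CH4 lines are absent
```

The hard part is not this lookup but generating the signature database in the first place, which is exactly where the quantum calculation is aimed.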
That’s the challenge that the University of Hull has set for itself: the institution’s Centre for Astrophysics is effectively hoping to generate a database of detectable biological signatures. For over two decades, explains David Benoit, senior lecturer in molecular physics and astrochemistry at the University of Hull, researchers have been using classical means to try and predict those signatures; but the method is rapidly running out of steam. The calculations carried out by the researchers at the center in Hull involve describing exactly how electrons interact with each other within a molecule of interest – think hydrogen, oxygen, nitrogen and so on. “On classical computers, we can describe the interactions, but the problem is this is a factorial algorithm, meaning that the more electrons you have, the faster your problem is going to grow,” Benoit tells ZDNet. “We can do it with two hydrogen atoms for example, but by the time you have something much bigger, like CO2, you’re starting to lose your nerve a little bit because you’re using a supercomputer and even they don’t have enough memory or computing power to do that exactly.” Simulating these interactions with classical means, therefore, ultimately comes at the cost of accuracy. But as Benoit says, you don’t want to be the one claiming to have detected life on an exo-planet when it was actually something else. Unlike classical computers, however, quantum systems are built on the principles of quantum mechanics – those that govern the behavior of particles when they are taken at their smallest scale: the same principles as those that underlie the behavior of electrons and atoms in a molecule. This prompted Benoit to approach Zapata with a “crazy idea”: to use quantum computers to solve the quantum problem of life in space. 
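Benoit’s scaling point can be illustrated with a back-of-the-envelope count. The sketch below is a hypothetical illustration, not the Hull group’s actual code: it simply counts the number of electron configurations an exact calculation would have to track, under the illustrative assumption of roughly four orbitals per electron, to show how quickly the problem outgrows a molecule as small as H2.

```python
import math

def configurations(n_orbitals: int, n_electrons: int) -> int:
    """Ways to distribute electrons among orbitals: a rough proxy for the
    size of an exact (full configuration interaction) calculation."""
    return math.comb(n_orbitals, n_electrons)

# Hypothetical basis size: ~4 orbitals per electron (an illustrative assumption).
for name, electrons in [("H2", 2), ("CO2", 22)]:
    print(name, configurations(4 * electrons, electrons))
```

This combinatorial growth is the reason an exact treatment of even CO2 strains a supercomputer’s memory, as Benoit describes.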
“The system is quantum, so instead of taking a classical computer that has to simulate all of the quantum things, you can take a quantum thing and measure it instead to try and extract the quantum data we want,” explains Benoit. Quantum computers, by nature, could therefore allow for accurate calculations of the patterns that define the behavior of complex quantum systems like molecules, without calling for the huge compute power that a classical simulation would require. The data extracted from the quantum calculation about the behavior of electrons can then be combined with classical methods to simulate the signature of molecules of interest in space when they come into contact with light.

It remains true that the quantum computers currently available to carry out this type of calculation are limited: most systems don’t break the 100-qubit count, which is not enough to model very complex molecules. Benoit explains that this has not put off the center’s researchers. “We are going to take something small and extrapolate the quantum behavior from that small system to the real one,” says Benoit. “We can already use the data we get from a few qubits, because we know the data is exact. Then, we can extrapolate.”

That is not to say that the time has come to get rid of the center’s supercomputers, continues Benoit. The program is only starting, and over the course of the next eight weeks, the researchers will be finding out whether it is possible at all to extract that exact physics on a small scale, thanks to a quantum computer, in order to assist large-scale calculations.
“It’s trying to see how far we can push quantum computing,” says Benoit, “and see if it really works, if it’s really as good as we think it is.” If the project succeeds, it could constitute an early use case for quantum computers – one that could demonstrate the usefulness of the technology despite its current technical limitations. That in itself is a pretty good achievement; the next milestone could be the discovery of our exo-planet neighbors.
Today’s computers use pulses of electricity and flipping magnets to manipulate and store data. But information can be processed in many other, weirder ways.

1. Optical computing
There’s nothing weird about encoding data in light – global communications depend on optical fibre. But using light signals to actually process data and carry out computations is still not practical. Optical computers are a worthwhile goal because using light could increase a computer’s speed and the quantity of data it can handle. But trapping, storing and manipulating light is difficult. Research by people like Paul Braun, at the University of Illinois, Urbana-Champaign, US, is bringing us closer to this goal. He has created 3D optical waveguides out of photonic crystals that should make it possible to trap light, slow it down and bend it around sharp corners, without fear of it escaping. Meanwhile Mikhail Lukin at Harvard University has developed what is essentially an optical version of the transistor that underlies all today’s computing power. Lukin and colleagues have created a way to make a single photon from one light signal switch another light signal on and off.

2. Quantum computing
If you want to tear up all the rules of classical computing, look no further than quantum computers. Instead of using electronic bits of information that exist in either 1 or 0 states, they use quantum mechanical effects to create qubits that can be in both states at once. Calculations show that this ability allows many parallel computations to be carried out. As the number of qubits in a quantum computer increases, the data it can process increases exponentially. That would make possible things that are unfeasible with today’s computers – such as rapidly factoring extremely large numbers to crack cryptographic keys.

3. DNA computing
DNA may be the perfect material for carrying out computations.
In a sense that is precisely what it evolved to do: DNA processes data and runs programs stored in sequences of genomic base pairs, as well as coordinating proteins that process information themselves to keep organisms alive. The first person to co-opt these processes for computational problems was Leonard Adleman at the University of Southern California. In 1994, he used DNA to solve a well-known mathematical problem called the 7-point Hamiltonian Path problem. The basic principle is to use sequences of DNA to recognise shorter “input” strands, and to produce different “output” sequences. The results can then be read, for example, through the activation of fluorescent proteins. Recently DNA-computing enthusiasts have become interested in having their creations go to work inside biological systems like the human body. It makes sense, because that’s where they fit in best – and where conventional computers fit in least.

4. Reversible computing
Some people think we should be recycling our bits as well as our trash. Hardware companies have long tried to reduce the power consumption of computers. One unusual way to do this is by engineering chips that are “reversible”. Normally every computational operation that involves losing a bit of information also discards the energy used to represent it. Reversible computing aims to recover and reuse this energy. One way to do this, which is being developed by Michael Frank at the University of Florida, US, involves making versions of logic gates that can run in reverse. Every computing operation involves feeding inputs into logic gates, which produce output signals. Instead of discarding the energy of those signals, Frank’s gates run in reverse after every operation. That returns the energy of the output signal to the start of the circuit where it is used to carry a new input signal.
It may sound odd, but according to Frank, as computing power improves it won’t be long before chips’ wastefulness becomes a major limit to their performance.

5. Billiard Ball computing
Computing today involves chain reactions of electrons passing from molecule to molecule inside a circuit. So it makes sense to try and harness other kinds of chain reaction for computing – even dominoes or marbles. Logic gates have been made by carefully arranging dominoes or chutes for marbles to roll down. Basic computing circuits like half-adders can also be made. But making something as powerful as a microprocessor this way would require acres of space – unless your balls or dominoes are very small. Researchers at IBM have experimented with logic circuits that use cascades of atoms bouncing off each other like billiard balls to pass information along their length. Such gates can only be used once, but could be significantly smaller than even the tiniest existing transistors.

6. Neuronal computing
Why start from scratch when you can borrow already successful ideas? Some researchers hope to get ahead by copying nature’s very own computers. In one experiment, a robot was wired up to neurons taken from a lamprey’s brain. Output from light sensors on the robot was passed to the neurons, and their responses used to control the robot’s movement. The brain cells normally used by the lamprey to orientate itself proved capable of making the robot follow a light source. It’s not the first time a critter’s brain has been co-opted in this way. Claire Rind, a neurobiologist at the University of Newcastle, UK, used recordings of the neuronal activity of locusts watching manoeuvring “TIE-fighter” spacecraft from the movie Star Wars to develop extremely accurate obstacle avoidance systems.

7. Magnetic (NMR) computing
Every glass of water contains a computer, if you just know how to operate it. Susan Stepney and colleagues at the University of York, UK, use strong magnetic fields (nuclear magnetic resonance) to control and observe the way in which molecules interact.
This method can represent information in 3D and can also exploit the natural dynamics of how molecules interact. If successful it may prove possible to model something as complex as our atmosphere using just a thimble of water. So far, however, the group have only carried out a proof of principle by, somewhat ironically, simulating the water-based computer on a classical computer.

8. Glooper Computer
One of the weirdest computers ever built forsakes traditional hardware in favour of “gloopware”. Andrew Adamatzky at the University of the West of England, UK, can make interfering waves of propagating ions in a chemical goo behave like logic gates, the building blocks of computers. The waves are produced by a pulsing cyclic chemical reaction called the Belousov-Zhabotinsky reaction. Adamatzky has shown that his chemical logic gates can be used to make a robotic hand stir the mixture in which they exist. As the robot’s fingers stimulate the chemicals, further reactions are triggered that control the hand. The result is a sort of robotic existential paradox – did the chemical brain make the robot’s hand move, or the hand tell the brain what to think? Eventually Adamatzky aims to couple these chemical computers to an electroactive gel-based “skin” to create a complete “blob-bot”.

9. Mouldy computers
Even a primitive organism like slime mould can be used to solve problems that are tricky for classical computers. Toshiyuki Nakagaki at the Institute of Physical and Chemical Research in Nagoya, Japan, has shown that slime mould can work out the shortest route through a maze. In his experiments, the masses of independent amoeba-like cells that act as a single organism would initially spread out to explore all the possible paths of a maze. But when one train of cells found the shortest path to some food hidden at the maze’s exit the rest of the mass stopped exploring. The slime mould then withdrew from the dead end routes and followed the direct path to the food.
This is interesting for computer scientists because maze solving is similar to the travelling salesman problem, which asks for the shortest route between a number of points in space. The problem quickly scales in complexity as more points are added, making it a tough problem for classical computers.

10. Water wave computing
Perhaps the most unlikely place to see computing power is in the ripples in a tank of water. Using a ripple tank and an overhead camera, Chrisantha Fernando and Sampsa Sojakka at the University of Sussex used wave patterns to make a type of logic gate called an “exclusive OR gate”, or XOR gate. Perceptrons, a type of artificial neural network, can mimic some types of logic gates, but not an XOR. Only encoding the behaviour of an XOR gate into ripples made it possible for the perceptron to learn how that gate works.
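The XOR limitation mentioned above is easy to verify by brute force. A single-layer perceptron computes a linear threshold of its inputs, and no choice of weights reproduces XOR’s truth table; the quick grid search below is a minimal sketch of that fact, not the Sussex group’s actual setup.

```python
import itertools

# XOR truth table: output is 1 exactly when the inputs differ.
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def perceptron(w1, w2, bias):
    """A single linear threshold unit: fires when w1*x + w2*y + bias > 0."""
    return lambda x, y: int(w1 * x + w2 * y + bias > 0)

# Brute-force a grid of weights and biases: none matches XOR on all four inputs,
# because XOR is not linearly separable.
weights = [w / 2 for w in range(-4, 5)]  # -2.0 .. 2.0 in steps of 0.5
found = any(
    all(perceptron(w1, w2, b)(x, y) == out for (x, y), out in XOR.items())
    for w1, w2, b in itertools.product(weights, repeat=3)
)
print(found)  # False
```

This is why the ripple tank matters: the water itself supplies the nonlinearity that a lone perceptron lacks.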
The researchers, from the University of Cambridge, were able to inject a ‘needle’ of highly fragile quantum information into a ‘haystack’ of 100,000 nuclei.

In a recent study, researchers found a way to use light and a single electron to communicate with a cloud of quantum bits and sense their behaviour. This discovery will help make it possible to detect a single quantum bit in a dense cloud. Using lasers to control an electron, the researchers could then use that electron to control the behaviour of the haystack, making it easier to find the needle. They were able to detect the ‘needle’ with a precision of 1.9 parts per million: high enough to detect a single quantum bit in this large ensemble. The technique makes it possible to send highly fragile quantum information optically to a nuclear system for storage, and to verify its imprint with minimal disturbance, an important step in the development of a quantum internet based on quantum light sources. The results are reported in the journal Nature Physics. The first quantum computers – which will harness the strange behaviour of subatomic particles to far outperform even the most powerful supercomputers – are on the horizon. However, leveraging their full potential will require a way to network them: a quantum internet. Channels of light that transmit quantum information are promising candidates for a quantum internet, and currently, there is no better quantum light source than the semiconductor quantum dot: tiny crystals that are essentially artificial atoms. However, one thing stands in the way of quantum dots and a quantum internet: the ability to store quantum information temporarily at staging posts along the network.
“The solution to this problem is to store the fragile quantum information by hiding it in the cloud of 100,000 atomic nuclei that each quantum dot contains, like a needle in a haystack,” said Professor Mete Atature from Cambridge’s Cavendish Laboratory, who led the research. “But if we try to communicate with these nuclei like we communicate with bits, they tend to ‘flip’ randomly, creating a noisy system.” The cloud of quantum bits contained in a quantum dot don’t normally act in a collective state, making it a challenge to get information in or out of them. However, Atature and his colleagues showed in 2019 that when cooled to ultra-low temperatures also using light, these nuclei can be made to do ‘quantum dances’ in unison, significantly reducing the amount of noise in the system. Now, they have shown another fundamental step towards storing and retrieving quantum information in the nuclei. By controlling the collective state of the 100,000 nuclei, they were able to detect the existence of the quantum information as a ‘flipped quantum bit’ at an ultra-high precision of 1.9 parts per million: enough to see a single bit flip in the cloud of nuclei. “Technically this is extremely demanding,” said Atature, who is also a Fellow of St John’s College. “We don’t have a way of ‘talking’ to the cloud and the cloud doesn’t have a way of talking to us. But what we can talk to is an electron: we can communicate with it sort of like a dog that herds sheep.” Using the light from a laser, the researchers are able to communicate with an electron, which then communicates with the spins, or inherent angular momentum, of the nuclei. By talking to the electron, the chaotic ensemble of spins starts to cool down and rally around the shepherding electron; out of this more ordered state, the electron can create spin waves in the nuclei. “If we imagine our cloud of spins as a herd of 100,000 sheep moving randomly, one sheep suddenly changing direction is hard to see,” said Atature. 
“But if the entire herd is moving as a well-defined wave, then a single sheep changing direction becomes highly noticeable.” In other words, injecting a spin-wave made of a single nuclear spin-flip into the ensemble makes it easier to detect a single nuclear spin-flip among 100,000 nuclear spins. Using this technique, the researchers are able to send information to the quantum bit and ‘listen in’ on what the spins are saying with minimal disturbance, down to the fundamental limit set by quantum mechanics. “Having harnessed this control and sensing capability over this large ensemble of nuclei, our next step will be to demonstrate the storage and retrieval of an arbitrary quantum bit from the nuclear spin register,” said co-first author Daniel Jackson, a PhD student at the Cavendish Laboratory. “This step will complete a quantum memory connected to light – a major building block on the road to realising the quantum internet,” said co-first author Dorian Gangloff, a Research Fellow at St John’s College. Besides its potential usage for a future quantum internet, the technique could also be useful in the development of solid-state quantum computing.
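The sheep analogy can be made concrete with a toy calculation. The sketch below is an illustrative model, not the group’s actual measurement scheme: in a disordered cloud the total spin already fluctuates by roughly the square root of N between readings, swamping a single flip, while in a fully ordered cloud one flip changes the total by exactly 2.

```python
import random

N = 100_000  # nuclear spins in the quantum dot

# Disordered cloud: each spin points up (+1) or down (-1) at random, so the
# total spin wanders by about sqrt(N) ~ 316 from shot to shot.
noisy_total = sum(random.choice((-1, 1)) for _ in range(N))
print("one random-cloud reading:", noisy_total)
print("background fluctuation scale:", int(N ** 0.5))

# Ordered ('cooled') cloud: all spins aligned, so the total is exactly N,
# and flipping a single spin shifts it by exactly 2 (+1 -> -1).
ordered_total = N
flipped_total = N - 2
print("signal from one flip in ordered cloud:", ordered_total - flipped_total)
```

A shift of 2 against a background that fluctuates by ~316 is invisible; against an ordered background it stands out, which is the intuition behind the 1.9 parts-per-million sensitivity.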
Future of Information Systems
Today’s computers use bits as data units. A bit value can only be either 0 or 1, as we discussed in Chapter 2. Quantum computers use qubits, which can represent a combination of both 0 and 1 simultaneously, leveraging the principles of quantum physics. This is a game-changer for computing and will disrupt all aspects of information technology. The benefits include a significant speed increase in calculations that will enable solutions for problems that are unsolvable today. However, there are many technical problems to be solved yet, since all the IS elements will need to be re-imagined. Google announced the first real proof of a working quantum computer in 2019 (Menard, et al., 2020). Menard et al. also indicated that the industries that would benefit from this new computer type would be industries with complex problems to solve, such as pharmaceuticals, autonomous vehicles, cybersecurity, or intense mathematical modeling in finance and energy. For a full report, please visit McKinsey.com. A blockchain is a set of blocks, or a list of records linked using cryptography, used to record transactions and track assets in a network. Anything of value can be considered an asset and be tracked. Examples include a house, cash, patents, or a brand. Once a transaction is recorded, it cannot be changed retroactively. Hence, it is considered highly secure. Blockchain has many applications, but bitcoin is mostly associated with it because it was the first application using blockchain technology. Sometimes bitcoin and blockchain are mistakenly taken to be the same thing, but they are not. Bitcoin is digital money, or a cryptocurrency. It is an open-source application built using blockchain technology. It is meant to eliminate the need for a central bank since people can directly send bitcoins. Simply put, bitcoin keeps track of a list of who sends how many bitcoins to another person.
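That record-keeping idea is what a blockchain makes tamper-evident. The minimal sketch below uses hypothetical names and amounts, and real Bitcoin blocks add proof-of-work, Merkle trees, and much more; it only shows the core trick of chaining each block to the previous block’s hash.

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """A block commits to its transactions and to the previous block's hash,
    so altering any earlier record changes every later hash."""
    body = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return {"tx": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

genesis = make_block([{"from": "alice", "to": "bob", "amount": 5}], "0" * 64)
block2 = make_block([{"from": "bob", "to": "carol", "amount": 2}], genesis["hash"])

# Tampering with the first block's amount changes its hash, breaking the
# link that the second block recorded.
tampered = make_block([{"from": "alice", "to": "bob", "amount": 500}], "0" * 64)
print(block2["prev"] == genesis["hash"])   # True: chain links to the real block
print(block2["prev"] == tampered["hash"])  # False: the forgery no longer fits
```

This is why a recorded transaction "cannot be changed retroactively": rewriting history would require rebuilding every subsequent block.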
One difference with today’s money is that a bitcoin’s value fluctuates, since it works like a stock. Anyone can buy bitcoin or other cryptocurrencies on exchanges such as Coinbase. Bitcoin and other cryptocurrencies are accepted by a few organizations such as Wikimedia, Microsoft, and Whole Foods. However, bitcoin’s adoption is still uncertain. If adoption by major companies accelerates, then banking locally and globally will change significantly. Some early businesses have begun to use blockchain as part of their operations. Kroger uses IBM blockchain to trace food from the farms to its shelves to respond to food recalls quickly (IBM.com). Amazon Managed Blockchain is a fully managed service that makes it easy to create and manage scalable blockchain networks.

Artificial Intelligence (AI)
Artificial intelligence (AI) comprises many technologies to duplicate the functions of the human brain. It has been in research since the 1950s and has seen an ebb and flow of interest. To understand and duplicate a human brain, AI is a complex interdisciplinary effort that involves multiple fields such as computer science, linguistics, mathematics, neuroscience, biology, philosophy, and psychology. One approach is to organize the technologies as below, and commercial solutions have been introduced:
Expert systems: also known as decision support systems or knowledge management. These solutions have been widely deployed for decades, and we discussed them in earlier chapters: knowledge management, decision support, customer relationship management systems, financial modeling.
Robotics: this trend is more recent even though it has been in research for decades. Robots can come in different shapes, such as a familiar object, an animal, or a human. They can be tiny or as large as the design requires: a nanobot is a robot whose components are on the scale of about a nanometer. A robot with artificial skin made to look like a human is called a humanoid.
They are being deployed in limited situations, such as assistants to police or to senior citizens who need help. Two popular robots are Atlas from Boston Dynamics and the humanoid Sophia from Hanson Robotics. Consumer products such as the smart vacuum iRobot Roomba are now widely available. The adoption of certain types of robots has accelerated in some industries due to the pandemic: Spot, the dog-like robot from Boston Dynamics, is used to patrol for social distancing.
Natural language: voice as a form of communication with our smart devices is now the norm—for example, Apple’s Siri and Amazon’s Alexa.
Vision: advanced progress has been made in camera technologies and in solutions to store and manipulate visual images. Examples include advanced security systems, drones, face recognition, smart glasses, etc.
Learning systems: Learning systems allow a computer (i.e., a robot) to react to situations based on the immediate feedback it receives or the collection of feedback stored in its system. Simple forms of these learning systems can be found today in customers’ online-chat support, also known as an ‘AI bot.’ One such example is IBM’s Watson Assistant.
Neural networks: This is a collection of hardware and software technologies. The hardware includes wearable devices that allow humans to control machines using thoughts, such as Honda Motor’s Brain-Machine Interface. This is still in the research phase, but its results could impact many industries such as healthcare.
The goal of fully duplicating a human brain has not been achieved yet, since no AI system has passed the Turing Test, devised by Alan Turing to answer the question ‘Can a machine think?’ Turing is widely considered a founder of the AI field; his test assesses a machine’s ability to exhibit intelligent behavior equivalent to that of a human. The test does not look for correct answers, but rather for answers that closely resemble those a human would give.
Even though AI has not been able to duplicate a human brain yet, its advances have introduced many AI-based technologies, such as AI bots and robotics, into many industries. AI progress has contributed to producing many practical business information systems that we discussed throughout this book, such as voice recognition, cameras, robots, autonomous cars, etc. It has also raised concerns over the ethics of developing some AI technologies, as we discussed in previous chapters. Advances in artificial intelligence depend on the continuous effort to collect vast amounts of data, information, and knowledge; advances in hardware; and sophisticated methods to analyze both unconnected and connected large datasets to make inferences and create new knowledge, supported by secure, fast networks.

Boston Dynamics’ dog-like robot Spot is being used on coronavirus social distancing patrol (2020). Retrieved December 13, 2020, from https://www.cnbc.com/2020/05/15/boston-dynamics-dog-like-robot-spot-used-on-social-distancing-patrol.html.
Changing your idea of what robots can do. Retrieved December 13, 2020, from https://www.bostondynamics.com/.
Honda's Brain-Machine Interface: controlling robots by thoughts alone (2009). Retrieved December 11, 2020, from https://newatlas.com/honda-asimo-brain-machine-interface-mind-control/11379/#:~:text=Honda%20Research%20Institute%2C%20Japan%2C%20has,using%20nothing%20more%20than%20thought.&text=Then%2C%20the%20doors%20will%20be,and%20act%20directly%20upon%20them.
Kroger uses IBM Blockchain technology for farm to fork food traceability. Retrieved December 11, 2020, from https://mediacenter.ibm.com/media/Kroger+uses+IBM+Blockchain+technology+for+farm+to+fork+food+traceability/0_527q9xfy.
Menard A., Ostojic I., and Patel M. (2020, February 6). A game plan for quantum computing. Retrieved December 10, 2020, from https://www.mckinsey.com/business-functions/mckinsey-digital/our-insights/a-game-plan-for-quantum-computing.
The smarter AI assistant for business.
Retrieved December 11, 2020, from https://www.ibm.com/cloud/watson-assistant-2/
How a Student Photographed a Single Atom With a Store-Bought Camera

Look closely and you’ll see it: a pale, purple pixel hanging in a black field between two cylindrical needles. What looks like a shimmering speck of dust is actually something much, much smaller: a single atom of strontium, isolated in an ion-trap machine at the University of Oxford. That’s small. Really small. Each atom is roughly 0.25 nanometers (billionths of a meter) across; billions of the atoms would fit comfortably inside a single red blood cell. How do you capture a photo of something this infinitesimally small? One photographer, David Nadlinger, used a standard digital camera — but he had some help setting up the shot courtesy of Oxford’s Ion Trap Quantum Computing lab, where he is researching for his Ph.D. On Feb. 12, Nadlinger won first place in a national science photography competition organized by the Engineering and Physical Sciences Research Council for capturing this rare photo of a single illuminated atom. “I think what makes this picture particularly interesting to people is that you can see the surrounding apparatus,” Nadlinger told Live Science. “And I think people are also surprised by how big the atom looks here. … I hope I’m not undoing 100 years of science education with this photo — atoms actually are unbelievably small!” To be clear, Nadlinger said, the purple speck at the center of this photo is not the true size of the strontium atom itself; it’s the light from an array of surrounding lasers being re-emitted by the atom. When bathed in a specific wavelength of blue light, strontium creates a glow hundreds of times wider than the radius of the atom itself (which is about a quarter of a nanometer, or 2.5×10⁻¹⁰ meters, Nadlinger said). This glow would be barely perceptible with the naked eye but becomes apparent with a little camera manipulation. “The apparent size you see in the picture is what we’d call optical aberration,” Nadlinger said.
“The lens we’re seeing it through is not perfect — also it’s slightly out of focus and slightly overexposed. You could compare it to looking at the stars in the night sky, which appear bright but are actually much, much smaller than the size they seem to be, just because our eyes (or the camera) don’t have enough resolution to process them.” So, seeing a single atom with the naked eye is impossible. Trapping one in a lab, however, is a little more doable.

To catch an ion by the toe

To make a single atom camera-ready like this, researchers first need to turn it into an ion: an atom with an unequal number of protons and electrons, giving it a positive or negative net charge. “We can only ever trap charged particles,” Nadlinger said. “So, we take a stream of neutral strontium atoms, which come from an oven, and shine lasers at them to selectively photo-ionize them. This way, we can create single ions.” When placed in an ion-trap apparatus, single atoms are held in place by four blade-shaped electrodes like those seen above and below the strontium speck in Nadlinger’s photo (two additional electrodes are out of view). These electrodes create a current that keeps the atom fixed on the vertical axis; the two needle-shaped cylinders on either side of the atom keep it trapped horizontally. As the currents from these electrodes interact, they create what is called a rotating saddle potential. “You can see videos online where people literally take a saddle and rotate it and put a ball on it; because of the rotation, the ball actually stays in the center of the saddle. So that’s what these electrodes do to confine the ion,” Nadlinger said. Once an atom is confined, an array of lasers hits the atom, which scatters light in all directions; in Nadlinger’s photo, you can see traces of the blue laser throughout the background. Using this system, researchers can potentially trap strings of hundreds of ions between the little electrodes, resulting in some stunning imagery.
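The rotating-saddle picture Nadlinger describes can be reproduced numerically. The toy integration below uses arbitrary units and is an illustrative sketch, not a model of the Oxford trap: a particle released slightly off-centre slides off a static saddle, but stays confined when the same saddle rotates fast enough.

```python
import math

def simulate(omega_rot, k=1.0, dt=1e-3, steps=20_000):
    """Track a unit mass in the saddle potential
    V = (k/2) * [(x^2 - y^2)cos(2*omega_rot*t) + 2xy*sin(2*omega_rot*t)],
    which rotates at omega_rot (omega_rot = 0 is a static saddle).
    Semi-implicit Euler integration; returns the largest excursion from centre."""
    x, y, vx, vy = 0.1, 0.1, 0.0, 0.0
    max_r = 0.0
    for i in range(steps):
        t = i * dt
        c = math.cos(2 * omega_rot * t)
        s = math.sin(2 * omega_rot * t)
        ax = -k * (x * c + y * s)   # -dV/dx
        ay = -k * (x * s - y * c)   # -dV/dy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        max_r = max(max_r, math.hypot(x, y))
    return max_r

print("static saddle, max excursion:  ", simulate(0.0))  # runs away
print("rotating saddle, max excursion:", simulate(3.0))  # stays near the centre
```

With the rotation rate well above the saddle’s natural frequency (here 3 vs. 1 in these units), the time-averaged effect of the spinning potential is confining, which is the same trick a Paul-type ion trap plays with oscillating electric fields.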
“On our website, we have a picture of nine ions trapped in a string,” Nadlinger said. “In terms of the science, that’s actually more interesting than having a single bright pixel surrounded by the ion trap. But to illustrate the concept, this might be more appealing.” Nadlinger does not believe he is the first researcher to take such a photo, but he may well be the most successful at capturing the public’s attention with one. “A group led by Hans Dehmelt, a pioneer of ion trapping and a Nobel laureate [in 1989], once took a picture of a single barium atom in their lab,” Nadlinger said. “It was a single bright speck on a dark background, apart from some laser scatter. There’s this story that they submitted this image to some conference proceedings — and the image editor just stamped out the ion because he thought it was a speck of dust.” Source: Live Science
If information cannot be destroyed, what happens when a black hole that has swallowed a bellyful of information disappears? A seemingly unsolvable black hole paradox first proposed by physicist Stephen Hawking may finally be resolved, by wormholes through space-time.

The “black hole information paradox” refers to the fact that information cannot be destroyed in the universe, and yet when a black hole eventually evaporates, any information sucked up by this cosmic vacuum cleaner should be long gone. The new study proposes that the paradox could be resolved by nature’s latest cheat code: wormholes, or passages through space-time.

“A wormhole connects the inside of the black hole and the outside radiation, like a bridge,” said Kanato Goto, a theoretical physicist at RIKEN’s Interdisciplinary Program for Theoretical and Mathematical Sciences in Japan, in a press release.

According to Goto’s theory, a second surface appears within a black hole’s event horizon, the boundary beyond which nothing can escape. Threads from a wormhole connect this surface to the outside world, entangling information between the interior of the black hole and the radiation that seeps through its edges.

Black hole information paradox

In the 1970s, Hawking discovered that black holes aren’t exactly black, but at first he didn’t realize what a big problem he had created. Before his discovery, physicists had assumed that black holes were extremely simple. Sure, all sorts of complicated stuff fell into them, but the black holes locked away all that information, never to be seen again. Hawking discovered, however, that black holes do release radiation, and can eventually evaporate completely, in a process now known as Hawking radiation. But this radiation itself carried no information. In fact, it couldn’t: by definition, the event horizon of a black hole prevents information from getting out.
So when a black hole evaporates and eventually disappears from the universe, where did all its locked-up information go? This is the black hole information paradox.

One possibility is that the information is destroyed, which seems to violate everything we know about physics. (For example, if information can be lost, the past cannot be reconstructed from present events, and future events cannot be predicted.) Instead, most physicists try to resolve the paradox by finding a way, any way, for the information inside the black hole to escape through Hawking radiation. Thus, when the black hole disappears, the information is still present in the universe.

Either way, describing this process requires new physics. “This suggests that general relativity and quantum mechanics as they currently stand are incompatible with each other,” said Goto. “We need to find a unified framework for quantum gravity.”

A tale of two entropies

In 1992, physicist Don Page, a former Hawking graduate student, saw the problem of the information paradox differently. He started by looking at quantum entanglement, the phenomenon in which distant particles have their fates linked. This entanglement acts as the quantum mechanical connection between the Hawking radiation and the black hole itself. Page measured the amount of entanglement by calculating “entanglement entropy,” a measure of the amount of information contained in the entangled Hawking radiation. In Hawking’s original calculation, no information leaks out, and the entanglement entropy always increases until the black hole finally disappears. But Page found that if black holes release information, the entanglement entropy initially increases; then, halfway through the black hole’s lifetime, it decreases before finally reaching zero when the black hole evaporates (i.e., all the information inside the black hole has eventually escaped).
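Page's prediction can be illustrated with a deliberately crude toy model, which is my own illustration rather than his actual calculation: take the radiation's entanglement entropy to be the smaller of Hawking's ever-growing curve and the black hole's shrinking remaining entropy, both normalized to 1 and taken as straight lines for simplicity.

```python
def toy_page_curve(n_steps=100):
    """Toy Page curve: entanglement entropy of the radiation is the
    minimum of a rising 'Hawking' curve and the hole's falling entropy."""
    curve = []
    for i in range(n_steps + 1):
        f = i / n_steps          # fraction of the black hole's lifetime elapsed
        s_hawking = f            # naive ever-growing entanglement entropy
        s_black_hole = 1.0 - f   # remaining entropy of the shrinking hole
        curve.append(min(s_hawking, s_black_hole))
    return curve

curve = toy_page_curve()
peak = curve.index(max(curve))   # the turnover lands at the halfway point
```

The turnover at the midpoint is the "mid-life crisis" referred to below: the entropy rises for the first half of the evaporation, then falls back to zero by the end.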
If Page’s calculations are correct, they suggest that if black holes allow information to leak out, then something special must happen halfway through their lives. Although Page’s work didn’t solve the information paradox, it gave physicists something juicy to work on: if they could give black holes a mid-life crisis, they might resolve the paradox.

More recently, various teams of theorists have applied mathematical techniques borrowed from string theory, an approach to unifying Einstein’s relativity with quantum mechanics, to examine this problem. They examined how spacetime near an event horizon might be more complex than scientists initially thought. How complex? As intricate as possible, allowing for all kinds of bends and curves on a microscopic scale.

This work led to two surprising features. One was the appearance of a “quantum extremal surface” just below the event horizon. This inner surface moderates the amount of information that comes out of the black hole. At first glance, it is not very useful. But when the black hole is in the middle of its life, it begins to dominate the entanglement, reducing the amount of information released, so that the entanglement entropy follows Page’s predictions.

Second, the calculations revealed the presence of wormholes, lots of them. These wormholes seemed to connect the quantum extremal surface to the outside of the black hole, allowing information to bypass the event horizon and be released as Hawking radiation.

But this earlier work only applied to highly simplified “toy” models (such as one-dimensional versions of black holes). With Goto’s work, this same result has now been applied to more realistic scenarios, a major advance that brings the approach closer to explaining reality. Many questions remain, however. For one thing, it is not yet clear whether the wormholes that appear in the math are the same wormholes that we think of as shortcuts in time and space.
They are so deeply buried in the mathematics that it is difficult to determine their physical meaning. The result could literally mean wormholes threading in and out of an evaporating black hole. Or it could simply be a sign that spacetime near a black hole is not local, a hallmark of entanglement: two entangled particles don’t need to be in causal contact to influence each other. Originally published on Live Science.
Prior to the mid-18th century, it was tough to be a sailor. If your voyage required east-west travel, you couldn't set out to a specific destination and have any real hope of finding it efficiently. At the time, sailors had no reliable method for measuring longitude, the coordinate that measures a point's east-west position on the globe. To find longitude, you need to know the time in two places--the ship you're on, and the port you departed from. By calculating the difference between those times, sailors got a rough estimate of their position. The problem: the clocks back then just couldn't keep time that well. They lost their home port's time almost immediately after departing. Today, time is just as important to navigation, only instead of calculating position with margins of error measured in miles and leagues, we have GPS systems that are accurate within meters. And instead of springs and gears, our best timepieces rely on cesium atoms and lasers. But given the history, it's fitting that a scientist like Clayton Simien, a National Science Foundation (NSF)-funded physicist at the University of Alabama at Birmingham who works on atomic clocks, was inspired by the story of John Harrison, an English watchmaker who toiled in the 1700s to come up with the first compact marine chronometer. This device marked the beginning of the end for the "longitude problem" that had plagued sailors for centuries. "If you want to measure distances well, you really need an accurate clock," Simien said. Despite the massive leaps navigation technology has made since Harrison's time, scientists--many NSF-funded--are looking for new ways to make clocks more accurate, diminishing any variables that might distort precise timekeeping.
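The sailors' two-clock arithmetic is simple enough to sketch. Earth turns 360 degrees in 24 hours, i.e. 15 degrees per hour, so the offset between the home-port clock and local solar time maps directly to longitude; the function name and examples here are my own illustration of the idea.

```python
def longitude_from_clocks(home_port_time_at_local_noon_hours):
    """Longitude in degrees (east positive, west negative), from what the
    home-port chronometer reads at the moment of local noon.
    Earth rotates 360 deg / 24 h = 15 deg per hour."""
    return (12.0 - home_port_time_at_local_noon_hours) * 15.0

# Local noon while the chronometer reads 15:00 -> you are 3 h behind: 45 deg west.
west = longitude_from_clocks(15.0)   # -45.0
# Local noon while it reads 09:00 -> 3 h ahead: 45 deg east.
east = longitude_from_clocks(9.0)    # 45.0
```

A pendulum clock drifting by even a few minutes a day would therefore put a ship tens of miles off within a week, which is exactly why Harrison's chronometer mattered.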
Some, for example, are looking for ways to better synchronize atomic clocks on Earth with GPS satellites in orbit, where atmospheric distortion can limit signal accuracy to degrees that seem minute, but are profound for the precise computer systems that govern modern navigation. The National Institute of Standards and Technology and the Department of Defense join NSF in the search for even better atomic clocks. But today's research isn't just about building a more accurate timepiece. It's about foundational science that has other ramifications. 'One Mississippi,' or ~9 billion atom oscillations Atomic clocks precisely measure the ticks of atoms, essentially tossing cesium atoms upward, much like a fountain. Laser-beam photons "cool down" the atoms to very low temperatures, so the atoms can transition back and forth between a ground state and an excited state. The trick to this process is finding just the right frequency to move directly between the two states and overcome Doppler shifts that distort the rhythm. (Doppler shifts are increases or decreases in wave frequency as the waves move closer or farther away--much like the way a siren's sound changes depending on its distance.) Laser improvements have helped scientists control atoms better and address the Doppler issue. In fact, lasers helped to facilitate something known as an optical lattice, which can layer atoms into "egg cartons" to immobilize them, helping to eliminate Doppler shifts altogether. That shift between ground state and excited state (better known as the atomic transition frequency) yields the official definition of a second: 9,192,631,770 cycles of the radiation that gets a cesium atom to vibrate between those two energy states. Today's atomic clocks mostly still use cesium.
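The cesium number quoted above is a defined constant, so the arithmetic of "counting ticks" can be written down directly; the helper function is my own illustration, not an API from any timekeeping library.

```python
CESIUM_HZ = 9_192_631_770  # defined cycles per second of the cesium-133 transition

def seconds_from_cycles(n_cycles):
    """Elapsed time implied by counting cesium oscillation cycles."""
    return n_cycles / CESIUM_HZ

# Counting exactly 9,192,631,770 cycles marks one SI second by definition...
one_second = seconds_from_cycles(CESIUM_HZ)
# ...and a miscount of a single cycle corresponds to an error of ~0.1 ns.
single_cycle_error = seconds_from_cycles(1)
```

Optical-frequency clocks, discussed next, tick hundreds of trillions of times per second instead of ~9 billion, which is the intuition behind Gibble's remark that a higher frequency makes it easier to be accurate: each tick slices time more finely.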
NSF-funded physicist Kurt Gibble, of Pennsylvania State University, has an international reputation for assessing the accuracy of and improving atomic clocks, including some of the most accurate ones in the world: the cesium clocks at the United Kingdom's National Physical Laboratory and the Observatory of Paris in France. But accurate as those are, Gibble says the biggest advance in atomic clocks will be a move from current-generation microwave frequency clocks--the only kind currently in operation--to optical frequency clocks. The difference between the two types of clocks lies in the frequencies they use to measure the signals their atoms' electrons emit when they change energy levels. The microwave technology keeps reliable time, but optical clocks offer significant improvements. According to Gibble, they're so accurate they would lose less than a second over the lifetime of the universe, or 13.8 billion years. Despite that promise of more accurate performance, optical frequency clocks don't currently keep time. "So far, optical standards don't run for long enough to keep time," Gibble said. "But they will soon." Optical frequency clocks operate at a significantly higher frequency than microwave ones, which is why many researchers are exploring their potential with alkaline earth and rare earth elements, such as ytterbium, strontium and gadolinium. "The higher frequency makes it a lot easier to be more accurate," Gibble said. Gibble is starting work on another promising elemental candidate: cadmium. Simien, whose research employs gadolinium, has focused on minimizing--or eliminating if possible--key issues that limit accuracy. "Nowadays, the biggest obstacle, in my opinion, is the black body radiation shift," Simien said. "The black body radiation shift is a systematic effect. We live in a thermal environment, meaning its temperature fluctuates. Even back in the day, a mechanical clock had pieces that would heat up and expand or cool down and contract.
"A clock's accuracy varied with its environment. Today's system is no longer mechanical and has better technology, but it is still susceptible to a thermal environment's effects. Gadolinium is predicted to have a significantly reduced black body radiation shift compared to other elements implemented and being proposed as new frequency standards." While Simien and Gibble agree that optical frequency research represents the next generation of atomic clocks, they recognize that most people don't really care whether the Big Bang happened 13 billion years ago or 13 billion years ago plus one second. "It's important to understand that one more digit of accuracy is not always just fine-tuning something that is probably already good enough," said John Gillaspy, an NSF program director who reviews funding for atomic clock research for the agency's physics division. "Extremely high accuracy can sometimes mean a qualitative breakthrough which provides the first insight into an entirely new realm of understanding--a revolution in science." Gillaspy cited the example of American physicist Willis Lamb, who in the middle of the last century measured a tiny frequency shift that led theorists to reformulate physics as we know it, and earned him a Nobel Prize. While research to improve atomic clocks is sometimes dismissed as trying to make ultra-precise clocks even more precise, the scientists working in the field know their work could potentially change the world in profound, unexpected ways. "Who knows when the next breakthrough will come, and whether it will be in the first digit or the 10th?" Gillaspy continued. "Unfortunately, most people cannot appreciate why more accuracy matters." From Wall Street to 'Interstellar' Atomic clock researchers point to GPS as the most visible application of the basic science they study, but it's only one of this foundational work's potential benefits.
Many physicists expect it to provide insight that will illuminate our understanding of fundamental physics and general relativity. They say new discoveries will also advance quantum computing, sensor development and other sensitive instrumentation that requires clever design to resist natural forces like gravity, magnetic and electric fields, temperature and motion. The research also has implications beyond the scientific world. Financial analysts worry that worldwide markets could lose millions due to ill-synchronized clocks. On June 30th at 7:59:59 p.m. EDT, the world adds what is known as a "leap second" to keep solar time within one second of atomic time. History has shown, however, that this adjustment to clocks around the world is often done incorrectly. Many major financial markets are taking steps ranging from advising firms on how to deal with the adjustment to curtailing after-hours trading that would occur when the change takes place. Gibble says the goal of moving to ever more accurate clocks isn't to measure time more precisely over a long period. "It's the importance of being able to measure small time differences." GPS technology, for example, looks at the differences in the propagation of light from multiple satellites. To provide location information, several GPS satellites send out signals at the speed of light--or one foot per nanosecond--saying where they are and what time they made their transmissions. "Your GPS receiver gets the signals and looks at the time differences of the signals--when they arrive compared to when they said they left," Gibble said. "If you want to know where you are to a couple of feet, you need to have timing to a nanosecond--a billionth of a second." In fact, he said, if you want that system to continue to operate accurately for a day, or for weeks, you need timing significantly better than that.
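Gibble's one-foot-per-nanosecond rule follows directly from the speed of light; the little helper below is my own illustration of the ranging arithmetic, not GPS receiver code.

```python
C = 299_792_458.0  # speed of light in m/s (exact by definition)

def position_error_m(clock_error_s):
    """Ranging error introduced by a clock offset: light covers this
    much extra (or missing) distance during the offset."""
    return C * clock_error_s

# A 1-nanosecond clock error shifts the computed range by ~0.3 m (about a foot).
err_ns = position_error_m(1e-9)
# A mere microsecond of error would put you ~300 m off.
err_us = position_error_m(1e-6)
```

This is why a clock that drifts by nanoseconds per day is still not good enough for a navigation system that must keep working for weeks between corrections.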
To get GPS to guide us in deserts, tropical forests, oceans and other areas where roads aren't around to serve as markers along the way, the satellites need clocks with nanosecond precision to keep us from getting lost. And if you're not traveling to those locales, then there's still the future to think about. "Remember the movie 'Interstellar,'" Simien said. "There is someone on a spaceship far away, and Matthew McConaughey is on a planet in a strong gravitational field. He experiences reality in terms of hours, but the other individual back on the spacecraft experiences years. That's general relativity. Atomic clocks can test this kind of fundamental theory and its various applications that make for fascinating science, and as you can see, they also expand our lives."
It was the eminent French philosopher and mathematician René Descartes who first suggested that the human mind may operate outside of the physical realm. He called it his mind-matter duality theory. The idea was that the human brain stands above the physical world yet can use its power to influence it. The "father of modern philosophy" may have been more prescient than he could ever have realized. Currently, a theoretical physicist is gearing up to test this theory in modern form. Lucien Hardy of the Perimeter Institute in Canada will use an EEG machine to see if the mind operates on the quantum level or outside of it. The results could have vast implications for our understanding of consciousness and free will. The experiment centers on the concept of quantum entanglement, in which particles influence each other even when far apart. Photons are light particles. Say, using a laser, you shoot them through a crystal: two photons suddenly become entangled. Afterwards, they move quite a distance apart. If you interact with one photon, it affects the other instantaneously, no matter their distance from one another. In the 1930s, Einstein, puzzled by this, called it "spooky action at a distance." One problem is that acting upon one particle causes changes in the other faster than the speed of light, something relativity states is impossible. Another weird effect: when we measure the spin of one entangled particle, the other always has the opposite spin, be it just around the corner from its partner or across the galaxy. It is as if measuring one influences the spin of the other at a rate faster than the speed of light. Is that true, or is something else going on? This is one of the greatest mysteries of quantum physics.
In 1964, famed physicist John Bell developed an experiment, now known as the Bell test, to evaluate the spin of entangled particles and find out whether they held some kind of hidden information, as Einstein thought, or whether the particles actually communicated with each other at a rate faster than the speed of light. Here, entangled particles are separated: one goes to location A and the other to location B, and the spin of each is evaluated at its station. Since the angle of each measurement is chosen at random, it isn't possible to know the settings at any location beforehand. Each time particles are measured like this, when one registers a certain spin, say clockwise, the other always comes up its opposite. According to Dr. Hardy, an experiment based on the Bell test should be able to tell us whether the human brain operates within quantum mechanics or outside of it. He's recruiting 100 participants. Each will have their brain attached to an EEG machine through a skull cap covered with sensors, which record brainwaves. Hardy wrote, "The radical possibility we wish to investigate is that, when humans are used to decide the settings (rather than various types of random number generators), we might then expect to see a violation of Quantum Theory in agreement with the relevant Bell inequality." Participants will be 100 km (approx. 62 mi) apart. The signals from the caps will be used to change the settings on a measuring device. If the measurements don't match up as expected, it could challenge our current understanding of physics. "[If] you only saw a violation of quantum theory when you had systems that might be regarded as conscious, humans or other animals," Hardy writes, it could mean that consciousness is able to supersede natural law. This would give a tremendous boost to the notion of free will, as a person's will would literally defy the laws of physics. Yet, "It wouldn't settle the question," according to Hardy.
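What a Bell-type test actually computes can be shown with the standard CHSH combination of correlations. This is textbook quantum mechanics, not Hardy's specific EEG protocol: for a singlet pair, quantum theory predicts the correlation E(a, b) = -cos(a - b) between measurements at angles a and b, and at suitably chosen angles the combined quantity exceeds the bound of 2 that any local hidden-variable account must obey.

```python
import math

def E(a, b):
    """Quantum-predicted correlation of spin results at angles a, b
    on a maximally entangled (singlet) pair."""
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination of four correlation measurements."""
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

# Optimal angles give S = 2*sqrt(2) ~ 2.83, beating the classical bound of 2.
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
```

Hardy's twist is only in where the four angle settings come from: human brain signals instead of random number generators. If conscious choosers produced statistics that broke this quantum prediction while machines did not, that would be the anomaly he is looking for.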
Prevailing physics and neuroscience theories have favored predeterminism in recent decades. This experiment may also offer insight into human consciousness: where in the brain it stems from, and even what it might be. What are the implications if we find out the human mind operates outside of quantum physics? The study fits into the fledgling field of quantum biology, which is shaking up our understanding of traditional biology in quite a number of ways. For instance, researchers at the University of California, Berkeley and at Washington University in St. Louis have found quantum effects operating within photosynthesis. Biophysicist Luca Turin has a theory, based on quantum physics, to explain how our sense of smell works. Others in quantum biology theorize about how antioxidants and enzymes work, among other processes. Splintering off of this is quantum neuroscience, whose researchers are looking at how quantum mechanics might explain the processes of the brain. Stuart Hameroff is a practicing anesthesiologist and the director of the Center for Consciousness Studies at the University of Arizona. He has offered a theory using quantum mechanics to explain how anesthesia works. According to Dr. Hameroff, consciousness may also be born on the quantum level. Physicist Matthew Fisher at the University of California, Santa Barbara, has proposed a way in which the brain might operate as a quantum computer. Hardy's experiment could support Hameroff's and even Fisher's conclusions. Others doubt the claim. Since a quantum computer is a very volatile system, any interference can cause decoherence, in which the fragile quantum states collapse and the system can no longer perform calculations. Critics argue that the human brain is awash in a host of different biochemicals and processes, so how could a quantum computer-like system operate there?
Present simple — passive voice

There are several reasons why we use the passive voice in English. In these notes, we are going to focus on the present simple in the passive voice. Generally, we use the passive voice when the focus is on the action and NOT on WHO or WHAT is performing the action.

Present Simple passive construction: am/is/are + past participle

Example verb: draw

|I am drawn|We are drawn|
|You are drawn|You (guys) are drawn|
|He/she/it is drawn|They are drawn|

We use the passive when the agent is unknown.
- The man who is believed to have stolen the goods must be brought to justice. (We don’t know who the man is.)

We use the passive to emphasise the subject.
- Paris and London are visited by many people each year. (The emphasis is on Paris and London.)

We use the passive to talk about general truths.
- Certain animals are known to attack humans.

We can use the passive if we want to be unclear or vague about the subject.
- Mistakes are committed.

We use the passive when the subject is irrelevant (we don’t care who or what has caused the action).
- English classes are taught here every day. (WHO teaches the classes is not important within the given situation.)

We use the passive in more formal registers, such as a thesis or an important piece of writing, especially scientific writing.
- The water is thus poured into the dish to form the desired product.
- The whole scientific process is done over three years.

Lesson #29: Present simple – passive voice

Construction: am/is/are + past participle (helped, known, found)

Example verb: make

|I am made|We are made|
|You are made|You (guys) are made|
|He/she/it is made|They are made|

- Which industries do you think will dominate the future, Sarah?
- Well, we’re living in a very technological era,1 and I think we’re set2 to see the birth of technologies such as blockchain, cloud computers, electric cars and quantum computing.
- It sounds incredible, doesn’t it?3
- It sure does.
It is argued that cloud computing and quantum computers are the main innovations so far.4
- So, what is known about cloud computing thus far?5
- I only know from what I’ve read, but cloud computing is used by most of us already.6
- Oh really? How so?
- The cloud is used for such things like7 our email accounts, documents and photos with Google etc., things like that, I guess.
- Moreover, I’ve read that it’s expected8 we’ll see much more cloud computing in the future.
- I sure hope so!

1. Well, we’re living in a very technological era. Here, the present continuous (we’re living) is used to talk about a present state, the state being ‘living in a very technological era’. The present simple could also be used here.
2. I think we’re set. The passive voice in the present simple is used here (we are set). The past participle is ‘set’ (set – set – set), and it’s being used to emphasise the subject ‘we’.
3. It sounds incredible, doesn’t it? ‘Doesn’t it’ is a question tag. The verb ‘do’ is used to form the question tag because ‘sounds’ is a normal verb. We always use ‘do’ as the default verb to make question tags with normal, non-auxiliary verbs.
4. It is argued that cloud computing and quantum computers are the main innovations so far. ‘It is argued’ is a passive construction in the present simple tense: the verb ‘to be’ in the third person singular (is) plus the past participle of ‘argue’ (‘argued’).
5. What is known about cloud computing thus far? The present simple passive construction ‘is known’ is used because we don’t know anything about the subject.
6. Cloud computing is used by most of us already. ‘Is used’ is the passive voice in the present simple; emphasis is put on ‘cloud computing’.
7. The cloud is used for such things like… ‘Is used’ is another use of the passive voice in the present simple.
8. I’ve read that it’s expected. ‘It’s expected’ is the passive voice in the present simple.
The passive is used here to be unclear or vague about ‘what is expected’.
RICHMOND, Va. (March 19, 2007) – Researchers have made an important advance in the emerging field of ‘spintronics’ that may one day usher in a new generation of smaller, smarter, faster computers, sensors and other devices, according to findings reported in today's issue of the journal Nature Nanotechnology. The research field of ‘spintronics’ is concerned with using the ‘spin’ of an electron for storing, processing and communicating information. The research team of electrical and computer engineers from the Virginia Commonwealth University’s School of Engineering and the University of Cincinnati examined the ‘spin’ of electrons in organic nanowires, which are ultra-small structures made from organic materials. These structures have a diameter of 50 nanometers, which is 2,000 times smaller than the width of a human hair. The spin of an electron is a property that makes the electron act like a tiny magnet. This property can be used to encode information in electronic circuits, computers, and virtually every other electronic gadget. “In order to store and process information, the spin of an electron must be relatively robust. The most important property that determines the robustness of spin is the so-called ‘spin relaxation time,’ which is the time it takes for the spin to ‘relax.’ When spin relaxes, the information encoded in it is lost. Therefore, we want the spin relaxation time to be as long as possible,” said corresponding author Supriyo Bandyopadhyay, Ph.D., a professor in the Department of Electrical and Computer Engineering at the VCU School of Engineering. “Typically, the spin relaxation time in most materials is a few nanoseconds to a few microseconds. We are the first to study spin relaxation time in organic nanostructures and found that it can be as long as a second. This is at least 1000 times longer than what has been reported in any other system,” Bandyopadhyay said. 
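Why a one-second relaxation time matters can be seen from a simple exponential-decay picture. This is my own idealization (real spin relaxation need not be single-exponential, and the one-millisecond window and one-microsecond comparison value are illustrative, chosen from the nanosecond-to-microsecond range the article quotes):

```python
import math

def remaining_polarization(t_s, relaxation_time_s):
    """Fraction of spin polarization left after time t, assuming
    simple exponential relaxation P(t) = exp(-t / T)."""
    return math.exp(-t_s / relaxation_time_s)

# After a 1-millisecond computation window:
organic = remaining_polarization(1e-3, 1.0)   # ~1 s relaxation: almost all spin survives
typical = remaining_polarization(1e-3, 1e-6)  # ~1 us relaxation: the spin is long gone
```

A device that must read its stored spin milliseconds (or seconds) after writing it therefore has essentially no hope with microsecond relaxation times, but plenty of margin with second-scale ones.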
The team fabricated their nanostructures from organic molecules that typically contain carbon and hydrogen atoms. In these materials, spin tends to remain relatively isolated from perturbations that cause it to relax. That makes the spin relaxation time very long. The VCU-Cincinnati team was also able to pin down the primary spin relaxation mechanism in organic materials, which was not previously known. Specifically, they found that the spin relaxes when the electron collides with another electron, or any other obstacle it encounters when moving through the organic material. This knowledge can allow researchers to find means to make the spin relaxation time even longer. “The organic spin valves we developed are based on self-assembled structures grown on flexible substrates, which could have a tremendous impact on the rapidly developing field of plastic electronics, such as flexible panel displays,” said Marc Cahay, Ph.D., a professor in the Department of Electrical and Computer Engineering at the University of Cincinnati. “If the organic compounds can be replaced by biomaterials, this would also open new areas of research for biomedical and bioengineering applications, such as ultra-sensitive sensors for early detection of various diseases.” “These are very exciting times to form interdisciplinary research teams and bring back the excitement about science and engineering in students at a very young age, raising them to become the future generations of nanopioneers,” Cahay said. The fact that the spin relaxation time in organic materials is exceptionally long makes them ideal host materials for spintronic devices. Organic materials are also inexpensive, and therefore very desirable for making electronic devices. The VCU-Cincinnati research advances nanotechnology, a rapidly growing field in which engineers are developing techniques to create technical tools small enough to work at the atomic level.
Additionally, by using nanoscale components researchers have the ability to pack a large number of devices within a very small area. The devices themselves are just billionths of a meter across, and trillions of them can be packed into an area the size of a postage stamp. Furthermore, they consume very little energy when they process data. In 1994, Bandyopadhyay and colleagues were the first group to propose the use of spin in classical computing. Two years later, they were among the first researchers to propose the use of spin in quantum computing. The recent work goes a long way toward implementing some of these ideas. The work is supported by the U.S. Air Force Office of Scientific Research and the National Science Foundation. Sandipan Pramanik, a graduate student in the VCU School of Engineering’s Department of Electrical and Computer Engineering, was first author of the study. The research team also included Carmen Stefanita, Ph.D., and graduate student Sridhar Patibandla, both in the VCU Department of Electrical and Computer Engineering, and graduate students Kalyan Garre and Nick Harth from the University of Cincinnati’s Department of Electrical and Computer Engineering.
Source: https://www.eurekalert.org/news-releases/816758
Flat solar panels still face big limitations when it comes to making the most of the available sunlight each day. A new spherical solar cell design aims to boost solar power harvesting potential from nearly every angle without requiring expensive moving parts to keep tracking the sun’s apparent movement across the sky. The spherical solar cell prototype designed by Saudi researchers is a tiny blue sphere that a person can easily hold in one hand like a ping pong ball. Indoor experiments with a solar simulator lamp have already shown that it can achieve between 15 percent and 100 percent more power output compared with a flat solar cell with the same ground area, depending on the background materials reflecting sunlight into the spherical solar cell. The research group hopes its nature-inspired design can fare similarly well in future field tests in many different locations around the world. “The placement and shape of the housefly’s eyes increase their angular field of view so they can see roughly 270 degrees around them in the horizontal field,” says Nazek El-Atab, a postdoctoral researcher in microsystems engineering at the King Abdullah University of Science and Technology (KAUST). “Similarly, the spherical architecture increases the ‘angular field of view’ of the solar cell, which means it can harvest sunlight from more directions.” To create the spherical solar cell design, El-Atab and her colleagues built upon their previous work, which demonstrated how to create thinner and more flexible solar cell designs based on a corrugated groove technique. The new work is detailed in a paper that has been submitted for review to the journal MRS Communications. 
Measurement setup of the spherical solar cell under a solar simulator in air, using regular white paper as the reflective background material. (Photo: Nazek El-Atab/KAUST)

Testing with the solar simulator lamp showed that the spherical solar cell provided 24 percent more power output than a traditional flat solar cell upon immediate exposure to sunlight. That power advantage jumped to 39 percent after both types of solar cells had begun to heat up and suffered some loss in power efficiency—an indication that the spherical shape may have some advantages in dissipating heat. The spherical solar cell also delivered about 60 percent more power output than its flat counterpart when both could collect only scattered sunlight under a simulated roof rather than receiving direct sunlight. Additional experiments with different reflective backgrounds—including an aluminum cup, aluminum paper, white paper, and sand—showed that the hexagonal aluminum cup background helped the spherical solar cell outperform the flat solar cell by 100 percent in terms of power output. The Saudi team created the spherical solar cell using the monocrystalline silicon solar cells that currently account for almost 90 percent of the world’s solar power production. That choice sprang from the goal of helping to maximize the light-harvesting potential of such solar cells, along with the aim of potentially making it easier to scale up production if the design proves cost efficient. “What surprises me is the authors have demonstrated the ultra-flexibility that can be achieved with rigid silicon solar cells using the corrugation technique in a series of articles,” says Zhe Liu, a postdoctoral researcher in solar engineering at MIT, who was not involved in the study.
“I’m more excited about the ability to make spherical cells, which means you can have industrial IBC-type (interdigitated back contact) silicon solar cells cover any shapes and ‘solarize’ everywhere.” Previous solar cell designs have fabricated tiny microscale spherical cells—sometimes made with nanowires or quantum dot cells—on top of a flat surface to help better collect both direct and scattered sunlight, says Rabab Bahabry, an assistant professor of physics at the University of Jeddah in Saudi Arabia. But the larger spherical solar cell may offer improved efficiency and coverage compared with the microsphere arrays when it comes to collecting sunlight reflected from background surfaces. Creating the large spherical solar cell required the researchers to etch alternating grooves in 15 percent of a flat solar cell to make a pattern resembling a band of elliptical shapes connected at the middle. A CO2 laser created the appropriate pattern in a polymeric hard mask covering the solar cell and allowed a deep reactive ion etching tool to create grooves in the exposed areas of the silicon solar cell. The flex and bend in those groove areas allowed the researchers to subsequently fold the solar cell into a spherical shape.

Dust accumulation on a spherical solar cell is limited to the silicon area with a small tilt angle. (Image: Rabab Bahabry/University of Jeddah and KAUST)

The loss of solar cell material in the areas that have been etched out reduces the overall potential solar power output. But the researchers see cost over time favoring spherical solar cells over flat solar cells in certain parts of the world because the spherical design is less prone to dust accumulation and may help dissipate heat that might otherwise reduce the solar cell’s efficiency. In addition, the spherical solar cells don’t require additional costly moving parts to continually track the sun.
Still, the spherical solar cells may not replace traditional solar cell technology at utility-scale solar power plants, says Liu at MIT. In his view, this particular spherical solar cell design could find use in more niche market applications. He noted that one of his colleagues is currently searching for a solar cell design to cover a golf ball so that it can power a tracker inside the ball. But Liu sees much promise in such ultra-flexible solar cell designs being installed in buildings, cars, or even mobile devices. “The application of spherical design may seem very limited, but the ability to make commercial silicon solar cells into any shapes would enable broad adaption of photovoltaic in autonomous devices, such as IoT (Internet of Things) sensors, and autonomous vehicles,” Liu says. “If we can fully power these autonomous devices with shaped photovoltaic panels, this could be a game changer.” For future testing, Liu says he would like to see how the spherical solar cell performs in a wide array of both outdoor and indoor lighting environments at different times of day. He also wants to see how well the spherical solar cells can be integrated into certain applications that they might power. And he is curious about seeing a “quantified cost” summary of all the processing steps required to make such spherical solar cells in order to better understand the technology’s commercialization potential. The Saudi researchers had to manually fold and form their spherical solar cells in their latest demonstration, but they have already begun designing and developing ways to automate the process using “robotic hands” to mimic the manual folding, says Muhammad Mustafa Hussain, a professor of electrical and computer engineering at KAUST who was one of the study’s coauthors. Eventually, Hussain and his colleagues envision building and testing large arrays of the spherical solar cells. 
And they’re already working on new shapes that resemble tents or umbrellas to see if those offer any advantages. They are also integrating solar cells with the surfaces of drones that have unusual shapes. The COVID-19 pandemic that forced the closure of research labs has delayed the Saudi group’s initial plans for outdoor testing. But Hussain says the group still plans to move forward with field trials before the end of 2020. He expects help from the KAUST alumni network in eventually testing the spherical solar cells in California, along with countries such as Bangladesh, China, India, South Korea, Germany, Spain, Brazil, Colombia, Mexico, South Africa, Australia, and New Zealand. “We will be creating arrays of spherical cells for 100-square-foot to 1,000-square-foot areas, and will compare functionality over cost benefit with that of traditional cells,” Hussain says. “Next, we will deploy it in different geographic locations throughout the year to understand its performance and reliability.” Editor’s note: A correction to this article was made on 16 June 2020. The sentence on indoor experiments was revised to correct an inaccurate interpretation of the power output comparison between the spherical solar cell and flat solar cell in the submitted paper. Jeremy Hsu has been working as a science and technology journalist in New York City since 2008. He has written on subjects as diverse as supercomputing and wearable electronics for IEEE Spectrum. When he’s not trying to wrap his head around the latest quantum computing news for Spectrum, he also contributes to a variety of publications such as Scientific American, Discover, Popular Science, and others. He is a graduate of New York University’s Science, Health & Environmental Reporting Program.
Source: https://spectrum.ieee.org/spherical-solar-cells-soak-up-scattered-sunlight
“SCIENTISTS FACTOR THE NUMBER 15.” Hardly a headline to grab the popular imagination. But when it’s done by a quantum computer – and one that’s scalable – it’s time to take notice. A paper published today in Science describes a five-atom quantum computer that can factor numbers – that is, start with a number and find numbers that, when multiplied, equal that first number. For instance, 15 factors into three times five. It’s also a striking illustration of how quantum computers will smash today’s internet encryption – when they arrive, that is. Computerised factoring is not new – quantum computers have factored numbers before (and those much bigger than 15). The key point here, though, is the new design can be upscaled to much more powerful versions simply by adding atoms. Many of the world’s public key security systems, which encrypt online banking transactions and the like, operate on a simple principle: that it’s easy to multiply two large prime numbers to generate a gigantic number. But given the gigantic number, it’s next to impossible to work out its factors, even using a computer. In March 1991 the encryption company RSA set a challenge – they published a list of very large numbers and announced cash awards for whoever could factor them. The prizes went from $1,000 for factoring a 100-digit number, up to $200,000 for a 617-digit number. A quarter of a century later, most of those numbers remain uncracked. But with a large enough quantum computer, factoring huge numbers – even those 600 digits long – would be child’s play. In classical computing, numbers are represented by either 0s or 1s called “bits”, which the computer manipulates in a series of linear, plodding logic operations, trying every possible combination until it hits the right one.
Factoring a 232-digit monster (the largest RSA number broken), for example, took two years with hundreds of classical computers running in parallel – and ended up being solved too late to claim the $50,000 prize. In contrast, quantum computing relies on atomic-scale units, or “qubits”, that can be 0, 1 or – weirdly – both, in a state known as a superposition. This allows quantum computers to weigh multiple solutions at once, making some computations, such as factoring, far more efficient than on a classical computer. The problem has been building these qubits into a large-enough assembly to make meaningful calculations. The more atoms, the more they jostle together and the harder it is to control each one. And as superposition is a very delicate state, a small bump will easily cause an atom to flip to 0 or 1. The new design, devised by physicists at the Massachusetts Institute of Technology and constructed at the University of Innsbruck in Austria, uses five calcium ions (atoms stripped of an electron) suspended in mid-air by electric and magnetic fields. The ions are close enough to one another – about a hundredth the width of a human hair – to still interact. The researchers use laser pulses to flip them between 0, 1 and superposition to perform faster, more efficient logic operations. Without any prior knowledge of the answers, the system returned the correct factors (15 = 5 x 3), with a confidence of more than 99%. Previous quantum computers achieved the same result with 12 ions. And this system is “straightforwardly scalable”, according to Isaac Chuang, a physicist at MIT whose team designed the computer. A truly practical quantum computer would likely require thousands of atoms manipulated by thousands of laser pulses. Meanwhile, other researchers are working on scalable computer systems using more conventional technology such as silicon.
“It might still cost an enormous amount of money to build – you won’t be building a quantum computer and putting it on your desktop anytime soon – but now it’s much more an engineering effort, and not a basic physics question,” says Chuang. Whatever the cost, the ability to crack internet security would make a large-scale quantum computer, literally, invaluable.
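The asymmetry the article rests on – multiplying two primes is instant, recovering them from the product is not – can be sketched with a toy classical factorer. This is illustrative only; `trial_factor` is a made-up helper, not the algorithm run on the five-atom machine:

```python
# Naive classical factoring by trial division. For a number like 15 this
# finishes instantly; for a 600-digit RSA-style modulus the same loop
# would need astronomically many steps, which is exactly the asymmetry
# public-key encryption relies on.

def trial_factor(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n, found by plodding trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

print(trial_factor(15))  # (3, 5) -- the same answer the ion-trap computer found
```

The quantum approach described in the article sidesteps this exhaustive search entirely; the sketch above only shows what classical hardware is stuck doing.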
Source: https://cosmosmagazine.com/science/physics/will-this-quantum-computer-take-down-internet-banking/
We live in a time when the phrase “artificial intelligence” (AI for short) is trendy and appears in the marketing descriptions of many products and services. But what, precisely, is AI? Broadly speaking, AI originated as an idea to create artificial “thinking” along the lines of the human brain. As of today, however, we can only make assumptions about how the human brain works, primarily based on medical research and observation. From a medical point of view, we know that the brain looks like a complex network of connections in which neurons are the main element, and that our thoughts, memory, and creativity are a flow of electrical impulses. This knowledge has given hope that an analogous brain can be constructed in electronic form, either hardware or software, with neurons replaced by electronics or code. However, since we are not 100% sure exactly how the brain works, all current AI models are mathematical approximations and simplifications, each serving only certain specific uses. Nevertheless, we know from observation that it is possible to create solutions that mimic the mind quite well – they can recognize writing, images (objects), music, and emotions, and even create art based on previously acquired experience. The results of the latter, however, are sometimes controversial. In what else does AI resemble the human brain? Well… it has to learn! AI solutions differ from classical algorithms in one fundamental way: the initial product is a philosophical “tabula rasa”, or “pure mind”, which must first be taught. In complex living organisms, knowledge emerges with development: the ability to speak, to move independently, to name objects – and in the case of humans and some animal species, there are elements of organized learning in kindergartens, schools, and universities, and during work and independent development.
It is analogous in most artificial intelligence solutions: the AI model must first receive specific knowledge, most often in the form of examples, to be able to function effectively later as an “adult” algorithm. Some solutions learn once, while others improve their knowledge while functioning (Online Learning, or Reinforcement Learning). This vividly resembles the human community: some people finish their education and work for the rest of their lives in one company doing one task, while others have to train throughout their lives as their work environment changes dynamically. Is AI already “smarter” than humans? As an interesting aside, we can compare the “computing power” of the brain with the computing power of computers. This will, of course, be a simplification, because the nature of the two is quite different. First, how many neurons does the average human brain have? It was initially estimated at around 100 billion neurons. However, according to recent research (https://www.verywellmind.com/how-many-neurons-are-in-the-brain-2794889), the number of neurons in the “average” human brain is “slightly” less – by about 14 billion – at 86 billion neuronal cells. For comparison, the brain of a fruit fly has about 100 thousand neurons, a mouse 75 million, a cat 250 million, and a chimpanzee 7 billion. An interesting case is the elephant’s brain (much larger than a human’s in size), which has… 257 billion neurons – definitely more than the brain of a human. From medical research, we know that each neuron has about 1000 connections, or synapses, with neighboring neurons, so in the case of humans the total number of connections is around 86 trillion (86 billion neurons × 1000 connections). In simplified terms, we can therefore assume that each synapse performs one “operation”, analogous to one instruction in a processor. At what speed does the brain work? In total… not much.
We can determine it based on BCI (Brain-Computer Interface) devices, which appeared not so long ago as a result of the development of medical devices for electroencephalography (EEG), such as the headsets produced by Emotiv, thanks to which we can control a computer using brain waves. Of course, they do not integrate directly with the cerebral cortex but measure activity by analyzing electrical signals. Based on this, we can say that the brain works at a variable speed (analogous to a processor’s Turbo mode) of between 0.5 Hz for the so-called delta state (complete rest) and about 100 Hz for the gamma state (stress, full tension). Thus, we can estimate the maximum computational power of the brain at about 8.6 quadrillion operations per second (8.6×10^15), or 8.6 petaflops! Despite the brain’s relatively slow clock rate, this is a colossal number, thanks to the parallelization of operations. From Wikipedia (https://en.wikipedia.org/wiki/Supercomputer), we learn that supercomputers did not break this limit until the first decade of the 21st century. The situation will change with the advent of quantum computers, which inherently work in parallel, just like the human brain. As of today, however, quantum computing technology is still in its infancy. In conclusion, AI has not yet overtaken the human brain, but it probably will someday. However, we are only talking about raw speed here, leaving aside the whole issue of creativity, “coming up with” ideas, emotions, etc.

AI and mobile devices

Artificial intelligence applications require substantial computational power, especially at the so-called learning stage, and this poses a significant challenge in integrating them with AR and VR solutions. Unfortunately, AR and VR devices mostly have very limited resources, as they are effectively ARM processor-based mobile platforms comparable in performance to smartphones.
As a result, most artificial intelligence models are so computationally (mathematically) complex that they cannot be trained directly on mobile devices. Well, you can, but it would take an unacceptably long time. So in most cases, to train models, we use powerful PCs (clusters) and GPU accelerators, mainly Nvidia CUDA. This knowledge is then “exported” into a simplified model “implanted” into AR and VR software or mobile hardware. In our next blog post, you’ll learn how we integrated AI into VR and AR, how we dealt with the limited performance of mobile devices, and what we use AI for in AR and VR.
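The back-of-envelope brain estimate from this piece can be reproduced in a few lines. All inputs are the article's own round numbers, not measurements:

```python
# Reproduce the rough brain-capacity estimate described above.
neurons = 86e9             # ~86 billion neurons in an average human brain
synapses_per_neuron = 1e3  # ~1000 connections per neuron
max_rate_hz = 100          # "gamma state" upper bound on brain-wave frequency

synapses = neurons * synapses_per_neuron   # ~86 trillion connections
ops_per_second = synapses * max_rate_hz    # ~8.6e15 "operations" per second

print(f"{ops_per_second:.1e} ops/s, about {ops_per_second / 1e15:.1f} petaflops")
```

Treating one synapse event as one "instruction" is, as the article stresses, a drastic simplification, but it lands on the same 8.6-petaflop figure that supercomputers only passed in the 2000s.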
Source: https://itsilesia.com/a-brief-overview-of-what-artificial-intelligence-is/
When it comes to programming languages, the first name that comes to mind typically is C. Dating back to the ‘70s, when it was developed at AT&T Bell Laboratories by Dennis Ritchie and Ken Thompson, it was easy to learn for those who wanted to work on computer coding. Most existing computer programs were written in assembly language, which communicates directly with hardware but is complex, long, and hard to debug. C offered ease and intuitiveness of use and brought a totally new audience to computer programming. Quantum computing represents the next stage in the development from classical computing. The latter encodes information as a series of 0s and 1s, while a qubit (in a quantum computer) can be 0, 1, or both at once. Quantum computers use entangled quantum states with overlapping information bits to perform their calculations. This could make them much, much faster than classical computers at calculations and data-crunching, which is why they are also looked to as the future of computing and as a source of programming languages for AI. Working with the full potential of quantum computing requires two things:
- The most current technology
- A quantum programming language to describe quantum algorithms

Essentially, while the algorithm explains how to solve a problem, the programming language helps the computer perform the necessary calculations by describing the algorithm. Present approaches to quantum computation look to adapt and use existing tools and technologies, as this would allow them to run on devices that will be available over the next few years. Current quantum languages are somewhat similar to assembly languages in their expressiveness: the programmer must specify every operation the computer is to perform. The former are also at a lower level than the latter in some respects – chiefly, in describing operations on individual quantum bits, more like what low-level hardware description languages do.
Another shortcoming is how closely they are tied to specific hardware, describing the behavior of the underlying circuits precisely and thereby requiring highly detailed programming instructions covering the required minutiae. Given the complexity of current programming languages for quantum computers, a new language is needed. This is how Silq came about. It was created by researchers at ETH Zurich, Switzerland, and is claimed to be the first high-level quantum language in the world. Classical and quantum languages are currently quite far apart in their conceptual bases, and Silq looks to bridge that gap, offering an approach that is far more intuitive than previously imagined. And given how quantum computing could revolutionize AI, Silq could be a very useful programming language for AI. Silq offers several advantages, some of which are detailed below:
- A level of abstraction close to that of C
- Better use of the potential of quantum computers than existing languages
- Code that is more compact, faster, more intuitive, and easier for programmers to understand
- Direct support for subexpressions such as (a+b) + c, which existing quantum languages make difficult to express
- Expression of the high-level intent of programmers through a descriptive view of quantum algorithms; a specialized compiler can take care of compiling these algorithms to low-level quantum circuits
- Programs that are less focused on low-level details, which makes analyzing them easier than programs written in existing quantum languages
- The potential to facilitate the development of analysis tools to support developers

What keeps Silq ahead of other languages is its design. It is the first programming language for quantum computing whose design does not limit its focus to the construction and functionality of the underlying hardware.
The design instead pays due consideration to the mindset of a programmer when a problem is to be solved, and helps in finding a solution that does not require understanding every detail of the computer’s architecture and implementation. Silq falls into the category of high-level programming languages, as it abstracts from the technical details of a particular type of computer. It is the first such language for quantum computers, and it is more expressive in that it can describe complex algorithms and tasks with much less code. This is why programmers find it easier to comprehend and use, and it also works with different computer architectures. Possibly the most important innovation of Silq is how it deals with one particularly common source of errors. Computing a task involves more than one intermediary step, in the course of which intermediate results, or temporary values, are created. Classical computers automatically get rid of these values in a process known as “garbage collection”. This is tricky for quantum computers, however, because previously calculated values can interfere with correct calculations through interactions with current values (an effect of quantum entanglement). Disposing of them requires a more advanced technique known as uncomputation, and Silq performs this identification and erasure automatically. Silq is definitely a step ahead and is attracting attention from computer scientists working on usable ideas. Given that it is easier to use, it could stimulate the development of further languages and algorithms for quantum computers.
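A toy model can make the bit-versus-qubit contrast from earlier in this piece concrete. This is not Silq code, and not how any real quantum runtime stores state; it only shows that a qubit is described by two amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import math

# A classical bit is definitely 0 or 1; a qubit carries two amplitudes.
zero = (1.0, 0.0)                    # behaves like a classical 0
superposition = (1 / math.sqrt(2),   # equal weight on 0 and 1:
                 1 / math.sqrt(2))   # "both at once"

def measure_probs(state):
    """Probabilities of reading 0 or 1: squared magnitudes of the amplitudes."""
    a0, a1 = state
    return abs(a0) ** 2, abs(a1) ** 2

print(measure_probs(zero))           # (1.0, 0.0): always reads 0
print(measure_probs(superposition))  # ~(0.5, 0.5): either outcome, half the time
```

Silq's uncomputation problem arises precisely because such amplitudes on temporary values stay entangled with the rest of the program's state and cannot simply be discarded the way a classical garbage collector discards memory.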
Source: https://thequantuminsider.com/2020/08/08/silq-a-new-high-level-programming-language-for-quantum-computing/
Google researchers claim to have achieved a major milestone in computer science known as "quantum supremacy." Google scientists explain their breakthrough in a research paper, a copy of which was obtained by Fortune, that was briefly posted to a NASA website earlier this week before subsequently being taken down. NASA has been working with Google on one aspect of their quantum computing research. News of the paper's existence was first reported by The Financial Times on Friday. Google has declined to comment on the report. If the technology company has indeed achieved the milestone, it is a significant step towards the day when quantum computers, which use the powerful properties of quantum physics to perform their calculations, will be able to solve a vast array of complex problems that lie beyond the abilities of today's most advanced supercomputers. Among the most anticipated uses of quantum computers is the ability to create new chemicals, like catalysts for producing nitrogen-based fertilizers or for use in cells in higher-powered batteries. Quantum computing could also be used to crack most commonly used forms of digital encryption. It may one day also be used to streamline logistics and delivery operations, as well as speeding up machine learning applications. But "quantum supremacy" does not mean quantum computers have yet arrived in the sense that they will soon replace the conventional computers that power our lives.

What is quantum supremacy?

Quantum supremacy means only that researchers have been able to use a quantum computer to perform a single calculation that no conventional computer, even the biggest supercomputer, can perform in a reasonable amount of time. In the case of Google, this calculation involved checking whether the output of an algorithm for generating random numbers was truly random. The researchers were able to use a quantum computer to perform this complex mathematical calculation in three minutes and 20 seconds, according to the paper.
They say it would have taken Summit—an IBM-built machine that is the world's most powerful commercially-available conventional computer—about 10,000 years to perform the same task.

How do quantum computers work?

Quantum computers work by harnessing the properties of quantum mechanics. Classical computers process information in a binary format, called bits, which can represent either a 0 or 1. Quantum computers, in contrast, use logical units called quantum bits, or qubits for short, that can be put into a quantum state where they can simultaneously represent both 0 and 1. What's more, while the bits in a classical computer all operate independently from one another, in a quantum computer the status of one qubit affects the status of all the other qubits in the system, so they can all work together to achieve a solution. These two properties are what give quantum computers so much more potential power than conventional computers. But while a conventional computer outputs the same answer to a problem every time you run a calculation, the outputs of a quantum computer are probabilistic. That means it does not always produce the same answer. So to use a quantum computer, you have to run a calculation through the system thousands or even millions of times, and the array of outputs converges around the answer that is most likely to be correct. In the case of Google's research, the company used a new quantum processor, which it named Sycamore, that has 54 qubits (although one did not function properly, the researchers said, so only 53 were actually used in the experiment), which sampled the random-number-generating circuit it was testing some 1 million times.

What's so special about Sycamore?

Sycamore is not the world's largest quantum processor. Google itself had produced a 72-qubit system last year. And Rigetti, a California startup working on quantum computers, has said it plans to have a 128-qubit system ready soon.
But Google's researchers said they made major advances in how long its qubits can remain in a quantum state and in how each qubit interacts with the qubits next to it. That's important because when qubits fall out of a quantum state, they introduce errors into the calculations the quantum computer is performing. Those errors then have to be corrected by using additional qubits. These error rates are the reason that your laptop can beat today's quantum computers in getting a correct answer to most mathematical problems.

Does quantum supremacy make quantum computers better than conventional computers?

No. Google's achievement only means its quantum computer could outperform a classical supercomputer on this one complex calculation. The Google researchers say in their paper that their quantum computer may also have uses in optimization problems, machine learning, materials science and chemistry. But it is unclear how much of an advantage or increase in speed Google's new quantum computing hardware, which it used to achieve quantum supremacy, will have in these other applications. And Google's machine is not yet powerful enough to tackle other difficult mathematical problems, such as breaking current encryption systems, a task which involves factoring very large numbers, according to the research paper. For many business applications, in fact, today's quantum computers are no match for the power and accuracy of today's conventional laptops.

Could hackers armed with quantum computers steal my bitcoin?

For the moment, the public-private key encryption techniques on which bitcoin and other cryptocurrencies are based cannot be broken by a quantum computer. But Google's researchers, in their paper, predict that quantum computing power will continue to advance at a "double exponential rate," so those bitcoins may not be safe for all that much longer. The fear of quantum computers being capable of breaking most common encryption techniques has led the U.S.
National Security Agency to call for the adoption of new techniques that use different kinds of math that are not susceptible to attack from a quantum computer. Although the U.S. has not yet settled on which class of new algorithms should be used, a number of startups are currently helping financial firms and governments prepare their systems to use such "post-quantum" encryption methods.

When can I have a quantum computer on my desk? Not any time soon. While almost any material that can be put into a quantum state can be used to form a qubit, the most advanced quantum systems today tend to use tiny bits of superconducting materials, often bonded together using fairly exotic elements. The qubits in Google's Sycamore processor used aluminum loops bonded with indium, an element that is about as rare as silver. To put those materials into a quantum state, and to shield the qubits from interference from outside energy sources, the quantum processors have to be carefully suspended in large dilution refrigerators at temperatures colder than those found in deep space. Ultimately, the companies racing to commercialize quantum computers (which, besides Google and Rigetti, include IBM, Microsoft, Intel, D-Wave and a host of others) plan to offer customers the ability to run calculations on a quantum computer through the cloud. So it's likely that one will never grace your desk at all.
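The factoring task mentioned above is the classical bottleneck that quantum algorithms such as Shor's would bypass. A minimal trial-division sketch (the function name and structure are illustrative, not from any source) shows the approach that scales poorly: the work grows roughly with the square root of n, i.e. exponentially in the number of digits, which is why large RSA-style moduli resist it.

```python
def factor(n):
    """Return the prime factors of n in ascending order by trial division."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:  # whatever remains is itself prime
        factors.append(n)
    return factors

print(factor(15))    # -> [3, 5]
print(factor(3233))  # -> [53, 61], a toy "RSA modulus"
```

For a real 2048-bit modulus the loop bound d*d <= n makes this hopeless, which is exactly the asymmetry public-key cryptography relies on.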
Researchers have fashioned ultrathin silicon nanoantennas that trap and redirect light, for applications in quantum computing, LIDAR and even the detection of viruses. Light is notoriously fast. Its speed is crucial for rapid information exchange, but as light zips through materials, its chances of interacting with and exciting atoms and molecules can become very small. If scientists can put the brakes on light particles, or photons, it would open the door to a host of new technology applications. Now, in a paper published Aug. 17 in Nature Nanotechnology, Stanford scientists demonstrate a new approach to slow light significantly, much as an echo chamber holds onto sound, and to direct it at will. Researchers in the lab of Jennifer Dionne, associate professor of materials science and engineering at Stanford, structured ultrathin silicon chips into nanoscale bars to resonantly trap light and then release or redirect it later. These "high-quality-factor" or "high-Q" resonators could lead to novel ways of manipulating and using light, including new applications for quantum computing, virtual reality and augmented reality; light-based WiFi; and even the detection of viruses like SARS-CoV-2. "We're essentially trying to trap light in a tiny box that still allows the light to come and go from many different directions," said postdoctoral fellow Mark Lawrence, who is also lead author of the paper. "It's easy to trap light in a box with many sides, but not so easy if the sides are transparent - as is the case with many silicon-based applications."

Make and manufacture
Before they can manipulate light, the resonators need to be fabricated, and that poses a number of challenges. A central component of the device is an extremely thin layer of silicon, which traps light very efficiently and has low absorption in the near-infrared, the spectrum of light the scientists want to control.
The silicon rests atop a wafer of transparent material (sapphire, in this case) into which the researchers direct an electron microscope "pen" to etch their nanoantenna pattern. The pattern must be drawn as smoothly as possible, as these antennas serve as the walls in the echo-chamber analogy, and imperfections inhibit the light-trapping ability. "High-Q resonances require the creation of extremely smooth sidewalls that don't allow the light to leak out," said Dionne, who is also Senior Associate Vice Provost of Research Platforms/Shared Facilities. "That can be achieved fairly routinely with larger micron-scale structures, but is very challenging with nanostructures, which scatter light more." Pattern design plays a key role in creating the high-Q nanostructures. "On a computer, I can draw ultra-smooth lines and blocks of any given geometry, but the fabrication is limited," said Lawrence. "Ultimately, we had to find a design that gave good light-trapping performance but was within the realm of existing fabrication methods."

High quality (factor) applications
Tinkering with the design has resulted in what Dionne and Lawrence describe as an important platform technology with numerous practical applications. The devices demonstrated so-called quality factors up to 2,500, which is two orders of magnitude (100 times) higher than any similar devices have previously achieved. Quality factors are a measure describing resonance behavior, which in this case is proportional to the lifetime of the light. "By achieving quality factors in the thousands, we're already in a nice sweet spot for some very exciting technological applications," said Dionne. Take biosensing, for example. A single biomolecule is so small that it is essentially invisible. But passing light over a molecule hundreds or thousands of times can greatly increase the chance of creating a detectable scattering effect.
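The proportionality between quality factor and photon lifetime can be made concrete with a short calculation; the 1550 nm wavelength below is an assumed near-infrared value chosen for illustration, not a figure reported in the paper.

```python
import math

# Photon lifetime in a resonator: tau = Q / omega, with omega = 2*pi*f.
# Q = 2500 is the reported quality factor; 1550 nm is an assumed
# near-infrared wavelength used only for illustration.
c = 3.0e8             # speed of light, m/s
wavelength = 1550e-9  # assumed wavelength, m
Q = 2500

f = c / wavelength           # optical frequency, Hz (~1.9e14)
tau = Q / (2 * math.pi * f)  # photon lifetime, s (~2 ps)
cycles = Q / (2 * math.pi)   # optical cycles before decay (~400)

print(f"lifetime ~ {tau * 1e12:.1f} ps, ~{cycles:.0f} optical cycles")
```

Roughly speaking, light circulates for a few hundred optical cycles before leaking out, which is the "echo chamber" effect that multiplies light-molecule interactions.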
Dionne's lab is working on applying this technique to detecting COVID-19 antigens - molecules that trigger an immune response - and antibodies - proteins produced by the immune system in response. "Our technology would give an optical readout like the doctors and clinicians are used to seeing," said Dionne. "But we have the opportunity to detect a single virus or very low concentrations of a multitude of antibodies owing to the strong light-molecule interactions." The design of the high-Q nanoresonators also allows each antenna to operate independently to detect different types of antibodies simultaneously. Though the pandemic spurred her interest in viral detection, Dionne is also excited about other applications that this new technology could contribute to, such as LIDAR (Light Detection and Ranging, a laser-based distance-measuring technology often used in self-driving vehicles). "A few years ago I couldn't have imagined the immense application spaces that this work would touch upon," said Dionne. "For me, this project has reinforced the importance of fundamental research - you can't always predict where fundamental science is going to go or what it's going to lead to, but it can provide critical solutions for future challenges." This innovation could also be useful in quantum science. For example, splitting photons to create entangled pairs that remain connected on a quantum level even when far apart would typically require large tabletop optical experiments with big, expensive, precisely polished crystals. "If we can do that, but use our nanostructures to control and shape that entangled light, maybe one day we will have an entanglement generator that you can hold in your hand," Lawrence said. "With our results, we are excited to look at the new science that's achievable now, but also trying to push the limits of what's possible."
Additional Stanford co-authors include graduate students David Russell Barton III and Jefferson Dixon, research associate Jung-Hwan Song, former research scientist Jorik van de Groep, and Mark Brongersma, professor of materials science and engineering. Jen is also an associate professor, by courtesy, of radiology and a member of the Wu Tsai Neurosciences Institute and Bio-X.
By 2025, the collective sum of the world's data is projected to grow from 33 zettabytes today to 175ZB. The security and privacy of such sensitive data remain a big concern. Emerging quantum communication and computation technologies offer a promising solution. However, they require powerful quantum optical circuits that can securely process the massive amounts of information we generate every day. To help enable this technology, scientists in USC's Mork Family Department of Chemical Engineering and Materials Science have made a breakthrough in quantum photonics. A quantum optical circuit uses light sources that generate photons on demand in real time. The photons act as information-carrying bits (qubits). These light sources are nano-sized semiconductor "quantum dots": tiny manufactured collections of tens of thousands to a million atoms, packed within a volume whose linear size is less than a thousandth of the thickness of a typical human hair, buried in a matrix of another suitable semiconductor. They have so far been demonstrated to be the most flexible on-demand single-photon generators. The optical circuit requires these single-photon sources to be arranged on a semiconductor chip, and photons of nearly identical wavelength must then be released from the sources in a guided way. This permits them to be manipulated to form interactions with other photons and particles, in order to transmit and process information. Until now, there has been a significant barrier to the development of such circuits: the dots' differing sizes and shapes mean that the photons they release do not have uniform wavelengths. This, and the lack of positional order, make them unsuitable for use in the development of optical circuits. In this study, scientists showed that single photons could be emitted uniformly from precisely arranged quantum dots.
The scientists used a method of aligning quantum dots precisely while preserving their remarkable single-photon emission characteristics. The ability to align uniformly emitting quantum dots precisely is expected to enable the production of optical circuits, potentially leading to novel advancements in quantum computing and communications technologies. Jiefei Zhang, currently a research assistant professor in the Mork Family Department of Chemical Engineering and Materials Science, said, "The breakthrough paves the way to the next steps required to move from lab demonstration of single-photon physics to chip-scale fabrication of quantum photonic circuits. This has potential applications in quantum (secure) communication, imaging, sensing, and quantum simulations and computation." The corresponding author Anupam Madhukar said, "It is essential that quantum dots be ordered in a precise way so that photons released from any two or more dots can be manipulated to connect on the chip. This will form the basic building unit for quantum optical circuits." "If the source where the photons come from is randomly located, this can't be made to happen." "The current technology that allows us to communicate online, for instance using a technological platform such as Zoom, is based on the silicon integrated electronic chip. If the transistors on that chip are not placed in exact designed locations, there would be no integrated electrical circuit. It is the same requirement for photon sources such as quantum dots to create quantum optical circuits." Evan Runnerstrom, program manager, Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, said, "This advance is an important example of how solving fundamental materials science challenges, like how to create quantum dots with precise position and composition, can have big downstream implications for technologies like quantum computing.
This shows how ARO's targeted investments in basic research support the Army's enduring modernization efforts in areas like networking." Using a method called SESRE (substrate-encoded size-reducing epitaxy), the scientists created a precise layout of quantum dots for the circuits. They fabricated regular arrays of nanometer-sized mesas with a defined edge orientation, shape, and depth on a flat semiconductor substrate composed of gallium arsenide (GaAs). Quantum dots were then created on top of the mesas by depositing the appropriate atoms. Zhang said, "This work also sets a new world record for ordered and scalable quantum dots, in terms of a simultaneous single-photon emission purity greater than 99.5% and a uniformity of emitted-photon wavelength as narrow as 1.8 nm, which is a factor of 20 to 40 better than typical quantum dots." "With this uniformity, it becomes feasible to apply established methods such as local heating or electric fields to fine-tune the photon wavelengths of the quantum dots to exactly match each other, which is necessary for creating the required interconnections between different quantum dots for circuits." "We now have an approach and a material platform to provide scalable and ordered sources generating potentially indistinguishable single photons for quantum information applications. The approach is general and can be used for other suitable material combinations to create quantum dots emitting over a wide range of wavelengths preferred for different applications, for example, fiber-based optical communication or the mid-infrared regime, suited for environmental monitoring and medical diagnostics." - Jiefei Zhang, Qi Huang, Lucas Jordao, Swarnabha Chattaraj, Siyuan Lu, Anupam Madhukar. Planarized spatially-regular arrays of spectrally uniform single quantum dots as on-chip single-photon sources for quantum optical circuits.
APL Photonics, 2020, 5(11): 116106. DOI: 10.1063/5.0018422
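To get a feel for the reported 1.8 nm wavelength uniformity, one can convert it into the frequency spread that tuning methods such as local heating or electric fields would need to bridge. The 920 nm center wavelength below is an assumption made for illustration (typical of GaAs-based dots), not a number from the paper.

```python
# Convert the reported 1.8 nm wavelength uniformity into a frequency
# spread via df = c * d_lambda / lambda**2. The 920 nm center wavelength
# is an illustrative assumption, not a figure from the paper.
c = 3.0e8        # speed of light, m/s
center = 920e-9  # assumed emission wavelength, m
spread = 1.8e-9  # reported wavelength uniformity, m

df = c * spread / center**2   # frequency spread, Hz (~640 GHz)
fractional = spread / center  # ~0.2% fractional uniformity

print(f"spread ~ {df / 1e9:.0f} GHz ({fractional:.2%} of center)")
```

Even at this record uniformity the residual spread is hundreds of GHz, which is why the authors emphasise fine-tuning individual dots to make their photons exactly match.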
Many scientists like to trace the start of nanoscience as a field of study back to "There's Plenty of Room at the Bottom," Richard Feynman's address to the American Physical Society in 1959. Feynman envisioned building molecules and devices from the bottom up, much like putting together a Lego castle. He did not think this approach would tell us much about fundamental physics, but it would deliver many new technological applications. While researchers have come a long way in using nanoscience to probe physics and especially quantum physics, this month we focus on applications that would certainly have shocked Feynman. These range from origami robots smaller than a grain of sand and reconstructing how cells duplicate DNA to quantum computing, nonlinear optics, and even mind control.

Origami robot smaller than a grain of sand
Paul McEuen and Itai Cohen of the Kavli Institute at Cornell for Nanoscale Science are known for using origami to create nanobots. Their new creation takes this to the extreme. One-twentieth the size of a grain of sand (60 microns wide), their new origami bird snaps into shape in just 100 milliseconds. All it takes is a single jolt of electricity and the platinum-titanium-titanium dioxide bird will fold itself up and hold that shape indefinitely. Applying a negative voltage returns the 30-atom-thick bird (3,300 times thinner than a sheet of paper) back into its original shape. Next on the agenda, the researchers want to implant semiconductor circuitry, sensors, and a source of power — they have used solar cells in the past — so the robot can perform some basic tasks.

We copy a light-year's worth of DNA over our lifetimes. Scientists know a lot about the process, but we are far from understanding how the proteins that orchestrate this process place the right molecules in the right place at the right time to do it.
Nynke Dekker, a member of Kavli Institute of Nanoscience Delft, and a team of researchers at the Francis Crick Institute have unscrambled at least one part of the mystery. They started by attaching fluorescent labels to the origin recognition complex (ORC) protein, which begins the construction of the cell's DNA replication machinery. They then watched as ORC diffused along strands of DNA—and came to a halt when it reached certain DNA sequences to kick off the process. They also discovered that a key "motor" for the replication machinery, known as MCM, exists in a previously unknown mobile form when not attached to ORC. By building this knowledge, Dekker hopes to one day construct a living synthetic cell from the ground up.

Counting to millions of qubits
Researchers have used several technologies to create circuits with up to 53 qubits (the quantum equivalents of transistors). Yet there is no clear way to scale these systems into the equivalent of digital chips packed with hundreds of millions or even billions of transistors. Now, Menno Veldhorst, a member of the Kavli Institute of Nanoscience Delft, thinks he may have found a technique that would scale up to millions of quantum elements. It is based on quantum dots, a technology often associated with high-definition televisions. In quantum computing, these artificial nanocrystals are used to trap an electron that researchers then entangle. So far, researchers have been unable to entangle more than two such qubits. Veldhorst, however, switched to a system that works with holes (missing electrons) in germanium. Using simple circuitry, he has created a four-qubit grid that entangles easily and with very good control. Moreover, germanium is a well-characterized semiconductor that interfaces well with other materials (like superconductors) being considered for quantum devices.

A simpler way to control nonlinear devices
Nonlinear devices are systems where one plus one does not equal two.
Take, for example, nonlinear optical devices, which change light from one frequency to another. When you add strong nonlinearity to resonators, which trap and circulate light for lasers and other devices, the result is something Kavli Nanoscience Institute at Caltech member Alireza Marandi calls a "rich physics regime." That is another way of saying "really, really complicated," which is the opposite of what engineers need if they want to design useful nonlinear resonators. Yet Marandi may have a solution for them. While testing a nonlinear resonator made of optical fiber and a nonlinear waveguide, he found that at certain lengths, the light entering the system made abrupt transitions to other frequencies (colors). This finding would enable engineers to tune a nonlinear resonator using just one variable. One day, Marandi speculates, this could lead to optical computers that would count colors the way digital computers count electric charges.

For years, researchers have pursued technologies that would improve the ability of people who are paralyzed to communicate and control their surroundings by operating a keyboard or robotic arm with their minds alone. The problem has always been reading the mind with enough precision to do something useful. Implanted electrodes in the brain can do this, but they are invasive and potentially dangerous. Functional MRI works but requires costly machinery. Electroencephalography (EEG) is neither bulky nor expensive, but it lacks resolution. Now, a collaborative team that includes Kavli Nanoscience Institute at Caltech member Mikhail Shapiro has developed a way to use ultrasound to read the part of the brain that controls motion planning. It works by measuring the movement of blood flowing through the brain. While still in its infancy, the technology can predict the arm and eye movements of two monkeys with good accuracy. The researchers hope to develop the technology so that it is one day practical.
What is 15
15 (fifteen) is the natural number following 14 and preceding 16. 15 is:
- A lucky number.
- A triangular number.
- A hexagonal number.
- A pentatope number.
Along with 13, it is one of the two numbers within the teen range (13-19) not to use a single-digit number in its name prefix (the first syllable before the "-teen" suffix); instead, it uses the adjective form of five ("fif-") as a prefix. 15 is the fifth Bell number (i.e. the number of partitions of a set of size 4). It is a composite number; its proper divisors are 1, 3, and 5. It is the number of supersingular primes. It is a repdigit in binary (1111) and quaternary (33). In hexadecimal, as in all higher bases, 15 is represented as F. It is the smallest number to have been experimentally factored using Shor's quantum algorithm. It is the magic constant of the unique order-3 normal magic square. There are 15 perfect matchings of the complete graph K6 and 15 rooted binary trees with four labelled leaves, both among the object types counted by double factorials. With only two exceptions, all prime quadruplets enclose a multiple of 15, and 15 itself is surrounded by the quadruplet (11, 13, 17, 19). Since 15 is the product of the distinct Fermat primes 3 and 5, a regular 15-sided polygon can be constructed with an unmarked straightedge and compass, and the cosine of its central angle is expressible in square roots. If a positive definite quadratic form with an integer matrix represents all positive integers up to 15, it represents all positive integers; this is the 15 theorem, and the related 290 theorem covers forms with integer coefficients. 15 contains the decimal digits 1 and 5 and is the sum of the whole numbers from 1 to 5 (1 + 2 + 3 + 4 + 5 = 15).

NCERT Solutions for Class 10 Maths Chapter 15
The branch of mathematics that numerically describes how likely an event is to occur is called probability.
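Several of the arithmetic facts above are easy to verify programmatically; a quick sketch in Python (the Lo Shu square below is the classic order-3 magic square):

```python
# Quick programmatic checks of several properties of 15 listed above.
n = 15

# Triangular: the sum of the whole numbers from 1 to 5.
assert sum(range(1, 6)) == n

# Proper divisors are 1, 3 and 5.
assert [d for d in range(1, n) if n % d == 0] == [1, 3, 5]

# Repdigit in binary; F in hexadecimal.
assert bin(n) == "0b1111"
assert hex(n) == "0xf"

# Magic constant of an order-k magic square is k*(k**2 + 1)/2; for k = 3
# that gives 15, and every line of the Lo Shu square sums to it.
assert 3 * (3**2 + 1) // 2 == n
square = [[2, 7, 6], [9, 5, 1], [4, 3, 8]]
assert all(sum(row) == n for row in square)
assert all(sum(col) == n for col in zip(*square))
assert sum(square[i][i] for i in range(3)) == n
assert sum(square[i][2 - i] for i in range(3)) == n

print("all properties check out")
```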
Vedantu provides accurate NCERT solutions for Class 10 Maths probability, covering the different types of problems you can expect in exams. This is a scoring but tricky chapter of Class 10 Maths, so you should know the tips and tricks needed to solve problems quickly. The NCERT Maths Class 10 Chapter 15 solution guide is prepared according to the latest CBSE Board guidelines. Each topic is solved with precision to clarify your concepts. It is available in PDF format on the Vedantu website, where you can also download it for free, along with NCERT Class 10 Science solutions to use in your preparation.

Access NCERT Solutions for Class 10 Maths Chapter 15 – Probability
1. Complete the following statements:
(i) Probability of event E + Probability of event "not E" = 1. If the probability of an event is p, then the probability of "not E" is 1 − p, and the sum is p + (1 − p) = 1.
(ii) The probability of an event that cannot happen is 0. Such an event is called an impossible event.
(iii) The probability of an event that is certain to happen is 1. Such an event is called a sure event.
(iv) The sum of the probabilities of all the elementary events of an experiment is 1.
(v) The probability of an event is greater than or equal to 0 and less than or equal to 1.
2. Which of the following experiments have equally likely outcomes? Explain.
(i) A driver attempts to start a car. The car starts or does not start. Outcomes are equally likely when each result is as likely to occur as any other; here, whether the car starts depends on factors such as its condition, so the outcomes are not equally likely.
(ii) A player attempts to shoot a basketball. He or she shoots or misses the shot.
The outcomes are not equally likely, since the result depends on the player's skill.
(iii) An attempt is made to answer a true/false question. The answer is either correct or incorrect. These are equally likely outcomes.
(iv) A baby is born. It is either a boy or a girl. The outcomes are equally likely.
3. Why is flipping a coin considered a fair way to decide which team should receive the ball at the start of a soccer game?
A coin has only two sides, heads and tails. When we toss a coin, we get either heads or tails; there is practically no chance of the coin landing on its edge, and heads and tails are equally likely. From this, it can be concluded that tossing a coin is a fair way to decide the outcome, as it cannot be biased and both teams have an equal chance of winning.
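The fairness argument in answer 3 can also be illustrated numerically; the simulation below is a simple illustration (seeded for reproducibility), not part of the NCERT solutions.

```python
import random

# Simulate many coin tosses to illustrate that heads and tails are
# equally likely; the seed makes the run reproducible.
random.seed(42)
tosses = 100_000
heads = sum(random.random() < 0.5 for _ in range(tosses))
freq = heads / tosses
print(f"heads frequency: {freq:.3f}")  # close to 0.5

# Complement rule from statement (i): P(E) + P(not E) = 1.
assert abs(freq + (1 - freq) - 1.0) < 1e-12
```

As the number of tosses grows, the observed frequency of heads settles ever closer to 0.5, which is exactly why a single toss gives both teams an equal chance.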
Professor Philip Walther – Indefinite Causal Order: Faster Computers and Fundamental Questions

Quantum mechanics has greatly improved the speeds at which computers make calculations, but new research shows that quantum computers can be made to run even faster. Professor Philip Walther and his team at the University of Vienna have shown that the very orders in which quantum computers carry out operations can be superimposed, essentially meaning that two or more operations can be carried out at the same time. This work could give rise to even more efficient quantum computers in the near future, but also leaves some baffling questions about our physical understanding of the Universe.

A Computational Revolution
Over recent decades, research into quantum computing has laid the foundation for devices that will greatly improve the efficiency of classical computers. In regular digital computers, data is encoded into binary digits in two definite states – 0 and 1. These 'bits' of data are processed by logic gates, which output further bits whose states depend on those of the input bits. When sequences of these logic gates are arranged into circuits, they output information that acts as instructions to the computer, telling it what to do next. In quantum computers, however, quantum properties can be exploited to superimpose multiple states onto individual particles. These particles, known as quantum bits, or 'qubits', can essentially carry multiple 0s and 1s at the same time. As they pass through a quantum logic gate, all 0s and 1s are processed simultaneously. Quantum algorithms are designed so that the co-existence of 0s and 1s is affected by destructive and constructive interference, until only the 0s and 1s that are the sole output of the calculation are left.
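The interference mechanism described above can be sketched with a two-level toy model; the Hadamard gate below is a standard textbook example of superposition and interference, not the specific photonic setup used by the Vienna team.

```python
import numpy as np

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])  # the |0> state

superposed = H @ zero
print(superposed)  # amplitudes ~[0.707, 0.707]: both 0 and 1 at once

# Applying H again makes the two amplitudes interfere: the |1> component
# cancels destructively and only |0> survives.
back = H @ superposed
print(np.round(back, 10))  # -> [1. 0.]
```

Quantum algorithms orchestrate exactly this kind of cancellation at scale, so that the wrong answers interfere away and only the result survives measurement.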
This approach offers huge advantages over regular computers, as instead of every bit of data corresponding to a single input state, many input states can be encoded onto just a few qubits, enabling the co-existence or 'superposition' of many different input states. This massive parallelism of input data, together with the possibility of quantum circuits, allows for quantum algorithms that require significantly fewer steps than classical algorithms in conventional computers. Overall, this means that quantum computers allow operations to be carried out far more efficiently, which not only increases computational speed, but also reduces energy consumption, and therefore costs.

'It is truly remarkable that quantum physics keeps surprising us about possible concepts and applications for which quantum mechanical features can be exploited – and I am sure that we are still at the beginning of this journey'

However, Professor Philip Walther and his team at the University of Vienna believe that more complex quantum mechanical processes can be exploited to improve the efficiency of quantum computers even further.

Even Faster Speeds for Quantum Computers
In a 2015 study, Professor Walther and his colleagues showed that quantum mechanics allows for the superposition of not just quantum states on a single particle, but of entire circuits of quantum gates. This means that the order in which operations are carried out on sequences of quantum gates is indefinite. In other words, multiple operations could essentially be carried out at the same time. The team demonstrated that if a gate can be used multiple times, fewer gates need to be used overall, increasing the efficiency of the computer. By superimposing multiple circuits, the researchers could control which circuit was applied to an input qubit.
Therefore, they could test whether the superposition of multiple circuits really improved computation speed by calculating the reduction in 'query complexity' (the smallest number of queries required to compute a function) compared with conventional quantum computers. To implement their ideas experimentally, Professor Walther's team created a simple quantum circuit, consisting of two logic gates they named Alice and Bob. Typically, an input qubit would either be sent from Alice to Bob or from Bob to Alice, resulting in two possible paths. However, the researchers added a layer of complexity to the scenario by encoding two qubits into the same photon (light particle), using its path and polarisation as the variable parameters. The two qubits were named the 'control qubit', which would be acted upon by the scientists, and the 'target qubit', which would itself pass through the logic gates. The control qubit acts on the target qubit by defining the order of gate operations through which the target (input) qubit will propagate. When the control qubit is in one state, the target qubit will first pass through Alice and then through Bob, while when the control qubit is in the other state, the target qubit will pass through Bob first, and then Alice. Now, when the control qubit is prepared in a superposition of both states, the target qubit will experience superimposed, or indefinite, gate orders: both Alice to Bob, and Bob to Alice. Therefore, the path taken by the target qubit depends entirely on the preparation of the control qubit.

How Much Faster?
When Alice and Bob are quantum gates, this superposition of quantum gate orders is indefinite and does not allow us to know, even in principle, whether one operation occurred before the other, or the other way around. This means that two quantum logic gates A (for Alice) and B (for Bob) can be applied in both orders at the same time. In other words, gate A acts before B and B acts before A.
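The control-qubit mechanism just described is often formalised as a "quantum switch"; the sketch below uses Pauli matrices as stand-ins for Alice and Bob (an illustrative choice, not the gates used in the experiment) to show both gate orders being applied coherently.

```python
import numpy as np

# Two non-commuting single-qubit gates: the order of application matters.
A = np.array([[0, 1], [1, 0]])   # Pauli-X, standing in for "Alice"
B = np.array([[1, 0], [0, -1]])  # Pauli-Z, standing in for "Bob"
assert not np.allclose(A @ B, B @ A)  # A-then-B differs from B-then-A

# Quantum switch: control |0> routes the target through B then A
# (matrix product A @ B), control |1> through A then B (B @ A).
P0 = np.array([[1, 0], [0, 0]])  # |0><0|
P1 = np.array([[0, 0], [0, 1]])  # |1><1|
S = np.kron(P0, A @ B) + np.kron(P1, B @ A)

# Control in superposition: both gate orders are applied "at once".
plus = np.array([1.0, 1.0]) / np.sqrt(2)  # (|0> + |1>)/sqrt(2)
psi = np.array([1.0, 0.0])                # target |0>
out = S @ np.kron(plus, psi)
print(out)  # the two branches carry (A@B)|psi> and (B@A)|psi> coherently
```

Because the two branches differ (here even by a sign), a final measurement on the control qubit can reveal interference between the two causal orders, which is the resource the Vienna experiments exploit.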
Professor Walther's team designed an experiment in which the two quantum logic gates were applied to single photons in both orders. The results of their experiment confirm that it is impossible to determine which gate acted first – but the experiment was not simply a curiosity. In fact, the team was able to run a quantum algorithm to characterise the gates more efficiently than any previously known algorithm. From a single measurement on the photon, they probed a specific property of the two quantum gates, thereby confirming that the gates were applied in both orders at once. As more gates are added to the task, this new method becomes even more efficient compared with previous techniques.

The idea of 'causality' is fundamental to our understanding of how the Universe works. It defines the link between physical events that follow each other chronologically. If one event happens before a second event it's linked to, it seems logical to us that the first event was the cause of the second – in other words, a 'definite causal order'. However, in their exploration of the properties of quantum mechanics that allowed them to achieve faster computational speeds, Professor Walther's team realised that their experiment appeared to utilise 'indefinite causal order'. In their initial experiment, Professor Walther's team could not observe indefinite causal order directly. The researchers had confirmed and quantified its apparent consequences through the faster computational speeds achieved, but they hadn't yet measured the quantum mechanical properties that would confirm whether the causal order of the use of Alice and Bob was truly indefinite. To do this, they had to go significantly beyond the previous experiment by experimentally superimposing more complex processes for A and B. These processes included quantum measurements acting on the target qubit when passing through Alice.
Importantly, for enabling this in a circuit, the order of multiple quantum operations can be superimposed, and both possible outcomes of Alice were processed into Bob, or vice versa. From then on, there was no way to ever read the outcome of the initial gate – a measurement could only be made at the very end of the process, meaning it could never be determined which path was actually taken. This allowed the team to characterise the indefinite causal order by acquiring information from inside (where the superposition of causal orders takes place) and outside, where the result after processing through the circuit can be measured.

As Professor Walther’s team mention in their paper, ‘this can lead to disconcerting consequences, forcing one to question concepts that are commonly viewed as the main ingredients of our physical description of the world. But these effects can be exploited to achieve improvements in computational complexity and quantum communications.’

It’s a somewhat startling idea. On a quantum scale, the comfortable notion that an outcome can be directly attributed to distinct previous events does not always hold, and yet this mysterious property can be exploited for our benefit. The work of Professor Walther and his colleagues has opened up a wide avenue of possibilities in quantum computing, promising both increased speeds and reduced costs for quantum computers in the near future – a significant step towards making them widely commercially available.

Meet the researcher

Professor Philip Walther
Faculty of Physics
University of Vienna

Professor Philip Walther completed his PhD in Physics at the University of Vienna in 2005, after which he took a post as a postdoctoral researcher at Harvard University. He returned to Vienna in 2009, and is now a tenured Professor at the Faculty of Physics.
His areas of research include various fields in the development of quantum computing, and the interface between quantum physics and gravity. Professor Walther co-founded the TURIS research platform in 2017. He has received a variety of prestigious awards for his work, and has been elected a member of the Young Academy at the Austrian Academy of Sciences and a fellow of the American Physical Society.

G Rubino, LA Rozema, A Feix, M Araújo, JM Zeuner, LM Procopio, Č Brukner, P Walther, Experimental verification of an indefinite causal order, Science Advances, 2017, 3, e1602589.

LM Procopio, A Moqanaki, M Araújo, F Costa, I Alonso Calafell, EG Dowd, DR Hamel, LA Rozema, Č Brukner, P Walther, Experimental superposition of orders of quantum gates, Nature Communications, 2015, 6, 7913.
Quantum computing promises to harness the strange properties of quantum mechanics in machines that will outperform even the most powerful supercomputers of today. But the extent of their application, it turns out, isn’t entirely clear. To fully realize the potential of quantum computing, scientists must start with the basics: developing step-by-step procedures, or algorithms, for quantum computers to perform simple tasks, like the factoring of a number. These simple algorithms can then be used as building blocks for more complicated calculations. Prasanth Shyamsundar, a postdoctoral research associate at the Department of Energy’s Fermilab Quantum Institute, has done just that. In a preprint paper released in February, he announced two new algorithms that build upon existing work in the field to further diversify the types of problems quantum computers can solve. “There are specific tasks that can be done faster using quantum computers, and I’m interested in understanding what those are,” Shyamsundar said. “These new algorithms perform generic tasks, and I am hoping they will inspire people to design even more algorithms around them.” Shyamsundar’s quantum algorithms, in particular, are useful when searching for a specific entry in an unsorted collection of data. Consider a toy example: Suppose we have a stack of 100 vinyl records, and we task a computer with finding the one jazz album in the stack. Classically, a computer would need to examine each individual record and make a yes-or-no decision about whether it is the album we are searching for, based on a given set of search criteria. “You have a query, and the computer gives you an output,” Shyamsundar said. “In this case, the query is: Does this record satisfy my set of criteria? And the output is yes or no.” Finding the record in question could take only a few queries if it is near the top of the stack, or closer to 100 queries if the record is near the bottom. 
On average, a classical computer would locate the correct record with 50 queries, or half the total number in the stack. A quantum computer, on the other hand, would locate the jazz album much faster. This is because it has the ability to analyze all of the records at once, using a quantum effect called superposition. With this property, the number of queries needed to locate the jazz album is only about 10, the square root of the number of records in the stack. This phenomenon is known as quantum speedup and is a result of the unique way quantum computers store information.

The quantum advantage

Classical computers use units of storage called bits to save and analyze data. A bit can be assigned one of two values: 0 or 1. The quantum version of this is called a qubit. Qubits can be either 0 or 1 as well, but unlike their classical counterparts, they can also be a combination of both values at the same time. This is known as superposition, and allows quantum computers to assess multiple records, or states, simultaneously. “If a single qubit can be in a superposition of 0 and 1, that means two qubits can be in a superposition of four possible states,” Shyamsundar said. The number of accessible states grows exponentially with the number of qubits used. Seems powerful, right? It’s a huge advantage when approaching problems that require extensive computing power. The downside, however, is that superpositions are probabilistic in nature — meaning they won’t yield definite outputs about the individual states themselves.

Think of it like a coin flip. When in the air, the state of the coin is indeterminate; it has a 50% probability of landing either heads or tails. Only when the coin reaches the ground does it settle into a value that can be determined precisely. Quantum superpositions work in a similar way. They’re a combination of individual states, each with their own probability of showing up when measured.
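The coin-flip picture can be sketched directly as a toy simulation (an illustration of the statistics, not a real quantum computation): a uniform superposition of two qubits spans four states, and a simulated measurement returns just one of them at random.

```python
import numpy as np

# Two qubits -> 2**2 = 4 basis states, each with an equal amplitude in a
# uniform superposition. Squared amplitudes give measurement probabilities.
n = 2
amplitudes = np.full(2 ** n, 1 / np.sqrt(2 ** n))
probabilities = amplitudes ** 2

# A simulated measurement: the superposition "collapses" to one random state,
# so a single shot gives no certainty about any individual state.
rng = np.random.default_rng(0)
outcome = rng.choice(2 ** n, p=probabilities)

print(len(amplitudes))   # 4 states for two qubits, as Shyamsundar notes
print(probabilities)     # [0.25 0.25 0.25 0.25]
print(outcome)           # one state chosen at random
```

The number of amplitudes doubles with each added qubit, which is the exponential growth described above; the random single-shot outcome is the downside the next paragraph takes up.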
But the process of measuring won’t necessarily collapse the superposition into the value we are looking for. That depends on the probability associated with the correct state. “If we create a superposition of records and measure it, we’re not necessarily going to get the right answer,” Shyamsundar said. “It’s just going to give us one of the records.” To fully capitalize on the speedup quantum computers provide, then, scientists must somehow be able to extract the correct record they are looking for. If they cannot, the advantage over classical computers is lost.

Amplifying the probabilities of correct states

Luckily, scientists developed an algorithm nearly 25 years ago that will perform a series of operations on a superposition to amplify the probabilities of certain individual states and suppress others, depending on a given set of search criteria. That means when it comes time to measure, the superposition will most likely collapse into the state they are searching for. But the limitation of this algorithm is that it can be applied only to Boolean situations, or ones that can be queried with a yes or no output, like searching for a jazz album in a stack of several records.

Scenarios with non-Boolean outputs present a challenge. Music genres aren’t precisely defined, so a better approach to the jazz record problem might be to ask the computer to rate the albums by how “jazzy” they are. This could look like assigning each record a score on a scale from 1 to 10. Previously, scientists would have to convert non-Boolean problems such as this into ones with Boolean outputs. “You’d set a threshold and say any state below this threshold is bad, and any state above this threshold is good,” Shyamsundar said. In our jazz record example, that would be the equivalent of saying anything rated between 1 and 5 isn’t jazz, while anything between 5 and 10 is. But Shyamsundar has extended this computation such that a Boolean conversion is no longer necessary.
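The roughly 25-year-old Boolean routine is widely known as Grover's amplitude amplification. A minimal state-vector sketch (a toy simulation, not Shyamsundar's extension) shows the marked state's probability being boosted in about (π/4)·√N oracle queries:

```python
import numpy as np

# Toy Boolean amplitude amplification: start in a uniform superposition over
# N records, then repeatedly (1) flip the sign of the marked record (the
# oracle) and (2) reflect every amplitude about the mean (the diffusion step).
N = 16
marked = 7                               # index of the "jazz album"
state = np.full(N, 1 / np.sqrt(N))       # uniform superposition

for _ in range(int(np.pi / 4 * np.sqrt(N))):   # ~3 iterations for N = 16
    state[marked] *= -1                        # oracle marks the target
    state = 2 * state.mean() - state           # inversion about the mean

# Probability of measuring the marked state after amplification:
print(round(float(state[marked] ** 2), 3))     # 0.961, up from 1/16
```

After just three oracle calls the marked record dominates the measurement statistics, which is the speedup the article describes; the classical search would have needed about N/2 = 8 queries on average even for this tiny stack, and the gap widens as √N versus N/2 for larger N.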
He calls this new technique the non-Boolean quantum amplitude amplification algorithm. “If a problem requires a yes-or-no answer, the new algorithm is identical to the previous one,” Shyamsundar said. “But this now becomes open to more tasks; there are a lot of problems that can be solved more naturally in terms of a score rather than a yes-or-no output.” A second algorithm introduced in the paper, dubbed the quantum mean estimation algorithm, allows scientists to estimate the average rating of all the records. In other words, it can assess how “jazzy” the stack is as a whole. Both algorithms do away with having to reduce scenarios into computations with only two types of output, and instead allow for a range of outputs to more accurately characterize information with a quantum speedup over classical computing methods.

Procedures like these may seem primitive and abstract, but they build an essential foundation for more complex and useful tasks in the quantum future. Within physics, the newly introduced algorithms may eventually allow scientists to reach target sensitivities faster in certain experiments. Shyamsundar is also planning to leverage these algorithms for use in quantum machine learning. And outside the realm of science? The possibilities are yet to be discovered. “We’re still in the early days of quantum computing,” Shyamsundar said, noting that curiosity often drives innovation. “These algorithms are going to have an impact on how we use quantum computers in the future.”

This work is supported by the Department of Energy’s Office of Science Office of High Energy Physics QuantISED program. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov. Source: Fermilab
In what may best be described as a quantum leap, a group of researchers from the University of Sussex have unveiled what they claim is the first realistic blueprint for the construction of a large scale quantum computer. As detailed in a paper published Wednesday in Science Advances, the quantum computer designed by the Sussex team leverages a new device they've created that allows quantum information to pass from one microchip of the quantum computer to another using electric fields instead of fiber optic cables. This would allow for connection speeds between the microchips that are up to 100,000 times faster than those currently achievable using fiber optics.

"There have been many studies where people made certain innovations to put us one step closer to a quantum computer," Winfried Hensinger, the head of the Ion Quantum Technology Group at the University of Sussex, told me. "But what we have done is quite a bit different: we've developed a nuts and bolts construction plan to build a large scale quantum computer." In other words, the Sussex group has taken a number of separate innovations in the field of quantum computing and brought them together to create a fine-grained blueprint for what is needed to build the first large scale quantum computer, covering everything from the back-of-house electronics to the power requirements for the machine.

A large scale quantum computer would revolutionize the world of computing due to its ability to perform calculations that are impossible for a classical, binary computer to solve. This is a result of the way that quantum computers process information—unlike a classical computer, which stores information in bits (either a 1 or a 0), a quantum computer traffics in qubits, which can either be a 0, 1, or a combination of these two states at the same time (a property known as superposition).
Experts worry that quantum computers will be able to easily break some of our most widely used forms of encryption today, and are preparing for that eventuality now. There are a number of proposals for how to actually go about building a quantum computer, but the most promising—and the kind described by the Sussex blueprint—is known as a trapped ion quantum computer. As its name suggests, this type of quantum computer makes use of ions (an atom or molecule with an electric charge) that are 'trapped' in electromagnetic fields. By changing the state of these ions—using microwaves to move an atom's electrons from one energy level to another—researchers make them function as qubits, or vessels of quantum information. To create a quantum computer, these qubits must interact with one another either by being physically moved from one location to another with lasers, or by emitting photons which are then transported through a fiber optic cable. In the Sussex blueprint, the quantum computer consists of a collection of hand-sized microchip modules, each of which, according to Hensinger, will be capable of trapping around 2,500 ions. When voltages are applied to these microchips, they create the electric field which traps the ions and levitates them above the microchip. Yet rather than having ions from one microchip interact with ions on another microchip using a complicated setup involving lasers or fiber optic cables, Hensinger and his colleagues have invented a device which uses the electric field itself to transport the ions from one microchip to an adjacent microchip. "They're trying to make the mechanisms for controlling and manipulating the qubits a lot easier," said Michele Mosca, a co-founder of the Institute for Quantum Computing at the University of Waterloo, who was not involved with the blueprint. "So instead of having countless lasers addressing individual ions, they want to use this [electric field] approach. It's impressive work." 
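The module figures above support a quick back-of-envelope estimate. The "millions of qubits" target is vague in the article, so the 2 million used below is an assumption for illustration:

```python
# Back-of-envelope sketch: the article says each hand-sized microchip module
# traps around 2,500 ions, and that a code-breaking machine would need
# millions of ions. Assume 2 million as a round illustrative target.
ions_per_module = 2_500
target_ions = 2_000_000   # assumed figure, not from the blueprint itself

modules_needed = -(-target_ions // ions_per_module)   # ceiling division
print(modules_needed)  # 800 modules for this assumed target
```

Hundreds of interconnected modules is what drives the football-field scale and the nine-figure cost estimate discussed below.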
Aside from a much faster connection speed than using fiber optics to connect the microchip modules, the Sussex team's device offers another key improvement: much simpler and cheaper technology. Lasers work great when you're talking about a quantum computer that is only manipulating a handful of ions, but a large scale quantum computer capable of, say, breaking the encryption standards used today would consist of millions of ions. This in turn would require millions of lasers, making it impractical with current technologies. Indeed, so far researchers have struggled to build trapped ion devices capable of manipulating more than about a dozen qubits. The Sussex blueprint, on the other hand, should be achievable with currently available technologies and able to manipulate far more qubits.

Hensinger hopes that he and his colleagues will be able to build a small prototype over the next two years to prove the feasibility of their design. The prototype would consist of only two microchips, but if it works, it could be the basis of a large scale quantum computer consisting of millions of ions and occupying a space the size of a football field (not to mention costing upwards of $120 million). How long it will take to get to a large scale quantum computer is anybody's guess. Just as a classical computer wasn't made in a day, the quantum computer will emerge in increments—the point, according to Hensinger, is that it's time to start building it.

"This is not something we can do overnight, but our blueprint specifies what needs to be done," said Hensinger. "It won't be cheap or easy, but I think we're at a place now where we can think about the engineering we need to do to build this machine."
Most experts agree that quantum computing is still in an experimental era. The current state of quantum technology has been compared to the stage that classical computing was in during the late 1930s. Quantum computing uses various computation technologies, such as superconducting, trapped ion, photonics, silicon-based, and others. It will likely be a decade or more before a useful fault-tolerant quantum machine is possible. However, a team of researchers at MIT Lincoln Laboratory has taken a vital step to advance the evolution of trapped-ion quantum computers and quantum sensors.

Most everyone knows that classical computers perform calculations using bits (binary digits) to represent either a one or zero. In quantum computers, a qubit (quantum bit) is the fundamental unit of information. Like classical bits, it can represent a one or zero. Still, a qubit can also be a superposition of both values when in a quantum state. Superconducting qubits, used by IBM and several others, are the most commonly used technology. Even so, trapped-ion qubits are the most mature qubit technology. It dates back to the 1990s and its first use in atomic clocks. Honeywell and IonQ are the most prominent commercial users of trapped-ion qubits.

Trapped-ion quantum computers

Honeywell and IonQ both create trapped-ion qubits using an isotope of a rare-earth metal called ytterbium. In its chip using integrated photonics, MIT used an alkaline-earth metal called strontium. The process to create ions is essentially the same. Precision lasers remove an outer electron from an atom to form a positively charged ion. Then, lasers are used like tweezers to move ions into position. Once in position, oscillating voltage fields hold the ions in place. One main advantage of ions lies in the fact that they are natural instead of fabricated. All trapped-ion qubits are identical. A trapped-ion qubit created on Earth would be the perfect twin of one created on another planet. Dr.
Robert Niffenegger, a member of the Trapped Ion and Photonics Group at MIT Lincoln Laboratory, led the experiments and is first author on the Nature paper. He explained why strontium was used for the MIT chip instead of ytterbium, the ion of choice for Honeywell and IonQ. “The photonics developed for the ion trap are the first to be compatible with violet and blue wavelengths,” he said. “Traditional photonics materials have very high loss in the blue, violet and UV. Strontium ions were used instead of ytterbium because strontium ions do not need UV light for optical control.” All the manipulation of ions takes place inside a vacuum chamber containing a trapped-ion quantum processor chip. The chamber protects the ions from the environment and prevents collisions with air molecules. In addition to creating ions and moving them into position, lasers perform necessary quantum operations on each qubit. Because lasers and optical components are large, it is by necessity located outside the vacuum chamber. Mirrors and other optical equipment steer and focus external laser beams through the vacuum chamber windows and onto the ions. The largest number of trapped-ion qubits being used in a quantum computer today is 32. For quantum computers to be truly useful, millions of qubits are needed. Of course, that means many thousands of lasers will also be required to control and measure the millions of ion qubits. The problem becomes even larger when two types of ions are used, such as ytterbium and barium in Honeywell’s machine. The current method of controlling lasers makes it challenging to build trapped-ion quantum computers beyond a few hundred qubits. Rather than resorting to optics and bouncing lasers off mirrors to aim beams into the vacuum chamber, MIT researchers have developed another method. They have figured out how to use optical fibers and photonics to carry laser pulses directly into the chamber and focus them on individual ions on the chip. 
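The scaling gap described above can be put in rough numbers. The million-qubit target below is an assumed round figure, not one quoted by the MIT team:

```python
# Rough sketch of the control problem: today's largest trapped-ion machine
# uses 32 qubits (per the article), while a truly useful machine needs
# millions, each requiring its own laser control and readout.
qubits_today = 32
target_qubits = 1_000_000   # assumed round figure for "millions"

print(target_qubits // qubits_today)  # 31250: the scale-up factor remaining
```

A four-to-five-order-of-magnitude jump in individually addressed beams is why routing light through on-chip waveguides, rather than free-space mirrors, matters.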
A trapped-ion strontium quantum computer needs lasers of six different frequencies. Each frequency corresponds to a different color that ranges from near-ultraviolet to near-infrared. Each color performs a different operation on an ion qubit. The MIT press release describes the new development this way, “Lincoln Laboratory researchers have developed a compact way to deliver laser light to trapped ions. In the Nature paper, the researchers describe a fiber-optic block that plugs into the ion-trap chip, coupling light to optical waveguides fabricated in the chip itself. Through these waveguides, multiple wavelengths [colors] of light can be routed through the chip and released to hit the ions above it.” In other words, rather than using external mirrors to shine lasers into the vacuum chamber, MIT researchers used multiple optical fibers and photonic waveguides instead. A block equipped with four optic fibers delivering a range of colors was mounted on the quantum chip’s underside. According to Niffenegger, “Getting the fiber block array aligned to the waveguides on the chip and applying the epoxy felt like performing surgery. It was a very delicate process. We had about half a micron of tolerance, and it needed to survive cool down to 4 Kelvin.” I asked Dr. Niffenegger his thoughts about the long-term implications of his team’s development. His reply was interesting. “I think many people in the quantum computing field think that the board is set and all of the leading technologies at play are well defined. I think our demonstration, together with other work integrating control of trapped ion qubits, could tip the game on its head and surprise some people that maybe the rules aren’t what they thought. 
But really I just hope that it spurs more out-of-the-box ideas that could enable quantum computing technologies to break through towards practical applications.”

- Integrating optical waveguides into ion traps represents a step forward toward the goal of building a useful quantum computer with thousands to millions of qubits.
- MIT’s technique also provides a development path for portable trapped-ion quantum sensors and clocks.
- Integrated photonics is inherently resistant to vibrations. With external lasers, vibrations cause pulses to miss the ion. Integrated optics should eliminate most effects of vibrations.
- The stability offered by integrated photonics will help qubits maintain quantum states longer, so that deeper and more complex computations can be performed.
- Initially, I had some concerns about loss of optical power due to compromises that may have been made in the grating coupler to accommodate different wavelengths. Keep in mind there are four fibers and six colors. The shortest of the six laser wavelengths is 405 nm and the longest is 1092 nm. Dr. Niffenegger pointed out that there are separate gratings for the shortest and longest wavelengths. He also said there are some power losses, but they are in the path from where light enters the optical waveguide to where it exits the coupler grating. Despite this minor optical power loss, the tighter focus provided by the existing diffraction gratings provides enough power for operations on the ions.
- Dr. Niffenegger and the MIT research team will focus future research on reducing two-qubit gate errors caused by heating of the motional state of ion qubits. The rate at which ions heat up is much higher in traps with integrated photonic chips than in traditional surface traps without photonics.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.
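As a footnote to the wavelength range quoted above (405 nm violet to 1092 nm near-infrared), the corresponding photon energies follow from E = hc/λ. This is a generic physics calculation, not part of the MIT work:

```python
# Photon energies for the shortest and longest control wavelengths,
# via E = h * c / wavelength, converted to electronvolts.
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
J_PER_EV = 1.602e-19

for wavelength_nm in (405, 1092):
    energy_ev = h * c / (wavelength_nm * 1e-9) / J_PER_EV
    print(wavelength_nm, round(energy_ev, 2))  # 405 -> 3.06 eV, 1092 -> 1.14 eV
```

A nearly threefold spread in photon energy across one chip is what makes a waveguide material transparent at all six colors (and gratings matched to the extremes) a nontrivial requirement.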
Quantum computing is based on physical systems operating at very low temperatures, close to absolute zero. Today, the most important challenge in increasing the abilities of quantum computers, and in making them more convenient, is handling this temperature problem. Semiconducting materials are among the best candidates for solving the low-temperature problem by approaching room-temperature conditions. Yet, since many semiconducting materials have many quantum degrees of freedom, the qubits may interact and decohere quickly. Thanks to advances in atomic engineering and semiconductor fabrication technologies, these effects are being reduced day by day. Hence, in this presentation, I am going to talk about the role of semiconductors in quantum computing (Q.C.) and approaches to solving the problem of unwanted qubit interactions.

Today, compared to classical computing, quantum computing is the most effective way to store and manipulate information. For instance, instead of the capacitors in classical computers, where we store information as empty (0) or filled (1) states, in quantum computing we use quantum states (quantum bits – qubits) with quantum mechanical properties. Hence, we do not only use zeros and ones as binary states; we also use quantum states that represent zeros and ones at the same time.

(a) Quantum Mechanical Properties of a Qubit

In quantum computing, we owe the ability to store and manipulate information so effectively to quantum mechanical properties such as superposition, entanglement and interference.

✦ Quantum superposition: If we add two or more quantum states, the result is also a valid quantum state.

✦ Quantum entanglement: If two particles are entangled and then separated, even by a very large distance, measurements on one are correlated with measurements on the other.
✦ Quantum interference: A particle cannot be detected in more than one place at the same time, but it can cross its own trajectory and interfere with its own path.

(b) Structure of a Solid State Q.C.

✦ (1) & (5) are the amplifiers that capture and process read-out signals.
✦ (2) & (3) transmit the input and output qubits, respectively.
✦ (4) enables qubit signals to go forward while preventing noise from compromising qubit quality.
✦ (6) the quantum processor sits inside a shield that protects it from electromagnetic radiation.
✦ (7) provides the necessary cooling power.

(c) Heat in a Solid State Q.C.

As I mentioned in part (b), we are dealing with very low temperatures, close to absolute zero (~0 K). Since we are dealing with qubits, we do not want them to decohere quickly. As the temperature is lowered, the number of thermally active degrees of freedom is reduced. If we consider that roughly 75% of a Q.C. is a refrigerator, it seems impossible to operate a Q.C. at room temperature before qubit decoherence occurs.

How to Run a Q.C. in Room Temperature Conditions?

In 2013, Canadian researchers stored a qubit at room temperature for 39 minutes by storing quantum information in the nuclear spins of phosphorus-31 atoms in a silicon-28 crystal. Since phosphorus atoms in silicon at room temperature tend to give up their electrons and become positive ions, they first cooled the crystal to 4.2 K and used laser and radio frequency (RF) pulses to put neutral phosphorus atoms into specific quantum states. A laser pulse then ionized the atoms before the crystal was warmed up to room temperature (~298 K). [The original article is cited at the bottom.] As a result, RF pulses were used to perform a ‘spin echo’ (refocusing of spin magnetisation by a pulse of resonant electromagnetic radiation) measurement of the coherence time, which was found to be 39 minutes. Thus, imagine: what if we could shrink the 75% of the quantum computer devoted to cooling and optimize it for our daily life?
With that much computational power, our classical computers would look like today's calculators. Imagine that simulations that take months on classical computers could be done in hours at your home. It is obviously the future. However, besides the advantages of using nuclear spin qubits, there are also disadvantages.

Challenges of Using Semiconductor Qubits

In semiconductors, many quantum degrees of freedom are present, and all tend to interact with each other. Thus, semiconductor qubits may decohere rapidly, and in order to store and manipulate information, quantum logic operations must be performed on a qubit before decoherence occurs. To avoid decoherence, devices must be engineered at or near the atomic level with respect to the spin-orbit interaction. The most effective semiconductor fabrication techniques for avoiding spin-orbit interaction problems are SRT-embedded heterostructures and quantum dot arrays.

Heterostructures are semiconductor structures whose chemical composition changes with position. In our case, it is beneficial to use SRT (spin resonance transistor) embedded heterostructures, since they enable quantum entanglement between qubits. Hence, we can use them for quantum logic gates (e.g. CNOT gates) on the surface of the semiconductor heterostructure. The other remarkable technique is the quantum dot array, placed on top of a semiconductor heterostructure, which can lower the electron tunneling barrier when two qubits couple. We can therefore use quantum dots as entanglement switches: when the electric field is turned off, the quantum dot qubits entangle.

To sum up, quantum computing is the most efficient way to perform computational operations, today and in the future.
However, since a quantum computer operates at temperatures close to absolute zero, about 75% of its structure consists of cooling mechanisms, and the resulting cost keeps quantum computers from being the number one choice. In 2013, however, the Canadian researchers stored a quantum state (a qubit) at room temperature, showing that we can optimize a solid state quantum computer by solving the low-temperature problem. If the problem is solved, quantum computers will become smaller (like classical computers), and having that much computational power at home will be inevitable. Yet, to optimize the computer structure and deal with spin-orbit interactions, the semiconductor structure must be fabricated at the atomic level. The most effective fabrication approaches are SRT-embedded heterostructures and quantum dot arrays, which provide the quantum mechanical properties of qubits and meet the computational needs (storage and manipulation) of the computer. Thus, in the future we will be using our own quantum computers, given how quickly this research is accelerating.

References:
- Semiconductor Qubits for Quantum Computation, presentation by Matthias Fehr (TU Munich), JASS (2005), St. Petersburg, Russia
- Semiconductor Devices for Quantum Computing, presentation by Bruce Kane (University of Maryland), 2004
- http://www.ibm.com/quantum-computing/
- http://www.semiengineering.com/quantum-computing-becoming-real
- www.eetimes.com/purdue-builds-quantum-computing-semiconductor-chip/#
- Room-Temperature Quantum Bit Storage Exceeding 39 Minutes Using Ionized Donors in Silicon-28, Saeedi, Simmons, Salvail, et al., Science, 2013, 342 (6160), 830–833
Quantum computing is based on quantum mechanics, which governs how nature works at the smallest scales. The smallest classical computing element is a bit, which can be either 0 or 1. The quantum equivalent is a qubit, which can also be 0 or 1, or in what's called a superposition: any combination of 0 and 1. Evaluating a function on every state of two classical bits (00, 01, 10 and 11) requires four separate calculations; a quantum computer can perform the calculation on all four states simultaneously. This scales exponentially: 1,000 qubits would, in some respects, be more powerful than the world's most powerful supercomputer.

In this digital-oriented world, hackers are evolving in parallel with technological advancements. Fortunately, engineers, mathematicians and physicists are simultaneously working on innovative concepts that push classical encryption methods forward. New devices are applying principles of quantum physics and deploying sophisticated, powerful algorithms for safe communication.

What is cryptography?

Cryptography is a means of securing data and information against malicious hackers. Thanks to cryptographic methods, everything from web conferences to individual browsing histories remains private and safe. Data are protected using algorithms that require a unique key for encryption and decryption. Using the same private key, i.e. a specific string of bits, for both encryption and decryption is called symmetric cryptography. Using a public key for encryption and a private key for decryption, each created by algorithm-fuelled random number generators, is called asymmetric cryptography. Genuine randomness is considered unachievable by purely classical means, but it can be accomplished with the added application of quantum physics.

Quantum key distribution

There are two methods by which large-scale quantum and classical computers can compromise private information.
• Method #1: Recover the key generated during the key agreement phase.
• Method #2: Break the encryption algorithm itself.

Quantum key distribution (QKD) is a quantum cryptographic primitive designed to generate unbreakable keys. QKD handles the key agreement phase, and includes the well-known BB84 and E91 protocols. In 2017, a Chinese team successfully demonstrated that satellites can perform safe and secure communications with the help of symmetric cryptography and QKD. Still, it's clear that QKD alone can't satisfy all protection requirements; there are also other mechanisms for security enhancement that use "quantum-safe" encryption algorithms based on hard mathematical problems rather than the laws of quantum physics.

An optimistic view of quantum-computing obstacles

The most immediate challenge is achieving a sufficient number of fault-tolerant qubits to deliver on quantum computing's computational promise. Tech giants such as Google, Amazon, IBM and Honeywell are taking this problem seriously and investing in it to come up with a solid solution. Currently, quantum computers are programmed at the level of individual quantum logic gates. This might be acceptable for small-scale quantum computers, but it becomes unmanageable once we reach large numbers of qubits. Organizations such as IBM and Classiq are developing more abstract layers in the programming stack, allowing developers to build powerful quantum applications that solve real-world problems. For the implementation of complex schemes, including error correction, organizations need to prove that they can control large numbers of qubits. This control must have low latency, and it must come from adaptive-feedback control circuits based on CMOS. Ultimately, the issue of "fan-out" must be addressed: how do we scale up the number of qubits within a quantum chip?
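The key agreement that BB84 performs can be sketched classically. The toy simulation below is a simplified illustration (function names are my own, and it omits eavesdropper detection and real quantum measurement): Alice encodes random bits in randomly chosen bases, Bob measures in his own random bases, and the two keep only the positions where their bases happened to match.

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84 key agreement without an eavesdropper: Alice sends
    random bits in random bases; Bob measures in random bases; they
    publicly compare bases and keep only the matching positions."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]  # + rectilinear, x diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]

    sifted = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            sifted.append(bit)  # matching basis: Bob reads the bit correctly
        # mismatched basis: Bob's result is random, so the position is discarded
    return sifted

key = bb84_sift(32)
print(len(key), key)  # roughly half the positions survive sifting
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the photons, so Alice and Bob detect interception by comparing a sample of the sifted key; that quantum guarantee is what the classical sketch above cannot capture.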
Multiple lasers or control wires are currently required, and it's hard to see how we can build multi-qubit chips with millions of wires connected to the circuit board or coming out of the cryogenic measurement chamber.

Applying quantum computing to cybersecurity

In recent years, researchers and analysts have been striving to develop quantum-safe encryption. According to American Scientist, the United States National Institute of Standards and Technology is presently evaluating 69 new methods known as "post-quantum cryptography," or PQC. Quantum computing offers an eminent potential solution to cybersecurity and encryption threats, and any security-forward organization ought to develop an understanding of crypto agility. The timing of the quantum revolution is uncertain: while the full impact of large fault-tolerant quantum computers may be far off, near-term quantum computers already present enormous advantages in enhancing communication privacy and security. All organizations should develop strategies around the long-term benefits and risks of quantum technology and computing, and be ready for the forthcoming quantum revolution.

Today's classical computers use two primary classes of algorithms for encryption: symmetric and asymmetric.

• In symmetric encryption, the same key is used to encrypt and decrypt a given piece of data. The Advanced Encryption Standard (AES) is an example of a symmetric algorithm. Adopted by the US government, the AES algorithm supports three key sizes: 128 bits, 192 bits, and 256 bits. Symmetric algorithms are typically used for bulk encryption tasks, such as enciphering large databases, file systems, and object storage.
• In asymmetric encryption, data is encrypted using one key (usually referred to as the public key) and decrypted using another key (usually referred to as the private key). Although the private key and public key are different, they are mathematically related.
The widely employed Rivest, Shamir, Adleman (RSA) algorithm is an example of an asymmetric algorithm. Even though it is slower than symmetric encryption, asymmetric encryption solves the problem of key distribution, which is an important issue in cryptography.

Quantum risks to cybersecurity

The advent of quantum computing will force changes to encryption methods. Currently, the most widely used asymmetric algorithms are based on difficult mathematical problems, such as factoring large numbers, which could take thousands of years on today's most powerful supercomputers. However, research conducted by Peter Shor at MIT more than 20 years ago demonstrated that the same problem could theoretically be solved in days or hours on a large-scale quantum computer. Future quantum computers may therefore be able to break asymmetric encryption schemes whose security rests on integer factorization or discrete logarithms. Although symmetric algorithms are not affected by Shor's algorithm, the power of quantum computing still necessitates larger key sizes. For example, large quantum computers running Grover's algorithm, which uses quantum concepts to search unsorted databases very quickly, could provide a quadratic improvement in brute-force attacks on symmetric encryption algorithms such as AES. To withstand such attacks, key sizes should be doubled to support the same level of protection; for AES, this means using 256-bit keys to maintain today's 128-bit security strength.

Even though large-scale quantum computers are not yet commercially available, initiating quantum cybersecurity efforts now has significant advantages. For example, a malicious entity can capture secure communications of interest today; then, when large-scale quantum computers become available, that vast computing power can be used to break the encryption and learn about those communications.
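Shor's threat to RSA can be made concrete with deliberately tiny numbers. The sketch below (pure Python, with illustrative textbook primes, not a real implementation) builds an RSA key pair, then shows that anyone who can factor the public modulus, which is exactly what Shor's algorithm would do efficiently at full scale, recovers the private key outright:

```python
# Toy RSA with deliberately tiny primes (real keys use 2048+ bit moduli).
p, q = 61, 53
n = p * q                 # public modulus
phi = (p - 1) * (q - 1)
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)   # encrypt with the public key
plain  = pow(cipher, d, n)  # decrypt with the private key
assert plain == msg

# An attacker who factors n rebuilds the private key. Trial division
# works here only because n is tiny; Shor's algorithm would make this
# step efficient for full-size moduli.
def factor(n):
    for f in range(2, int(n ** 0.5) + 1):
        if n % f == 0:
            return f, n // f

fp, fq = factor(n)
d_recovered = pow(e, -1, (fp - 1) * (fq - 1))
assert pow(cipher, d_recovered, n) == msg
```

Symmetric AES is not broken this way; Grover's quadratic speedup merely halves the effective key length (a 2^128 search becomes roughly 2^64 quantum queries), which is why doubling the key size restores the security margin.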
Beyond its potential risks, quantum cybersecurity can provide more robust and compelling opportunities to safeguard critical and personal data than are currently possible, particularly in quantum machine learning and quantum random number generation.

Why create a quantum computer?

The reasons go beyond simply improving processing capacity and solving problems that traditional computers cannot. Over the last 20 years, the complexity and number of transistors in a single CPU have increased exponentially, and it seems we have found the limits of transistor technology in the integrated circuit. With the extreme miniaturization of electronic gates, effects such as electromigration and subthreshold leakage become much more significant. These obstacles, among other factors, are pushing researchers to study new computing methods, such as the quantum computer.

Preparing for the quantum future

The quantum revolution is upon us. Although the profound impact of large-scale fault-tolerant quantum computers may be a decade off, near-term quantum computers will still yield tremendous benefits. We are seeing substantial investment in solving the core problems around scaling qubit count, error correction and algorithms. From a cybersecurity perspective, while quantum computing may render some existing encryption protocols obsolete, it promises to enable a substantially enhanced level of communication security and privacy. Organizations must think strategically about the longer-term risks and benefits of quantum computing and technology, and engage in a serious way today to be ready for the quantum revolution of tomorrow.

If you want more updates on the latest technologies, please follow deeptechknowledge.com, where we post about upcoming technologies and their uses.